While mobile visualizations are becoming more common, many still rely on design patterns created for larger screens—overlooking the unique contexts and interactions of mobile use.
This section, drawn from my contribution to the Mobile Visualization book, introduces an ideation method grounded in human-centered design. Through in-situ activities, contributors developed concepts shaped by their real-world situations and needs.
The following three examples illustrate how visualization can support mobile decision-making in everyday contexts—ranging from food choices to neighborhood exploration and local event discovery.
People often pass through cities without realizing what's happening around them: local events, small gatherings, and nearby opportunities go unnoticed. Existing event discovery apps are static, generic, or not context-aware, which creates a disconnect between what's happening nearby and what users might genuinely care about, especially when they're already on the move. This app surfaces nearby events in real time, based on the user's location, interests, and timing. The most powerful version of this app might barely need interaction: it surfaces what's nearby when it's meaningful and fades into the background when it's not.
Real-Time, Location-Aware Map: The app adapts to users' current location, displaying nearby events dynamically.
Walk-Time Visuals: Estimated walk times are shown for each event, helping users gauge effort vs. reward at a glance.
Smart Filters: Users can quickly filter by interest, energy level (quiet vs. lively), price, or time, supporting more tailored discovery with minimal friction (see the sketch after this list).
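To make the behavior above concrete, here is a minimal sketch of how the location-aware filtering and walk-time estimates might be wired up. The Event shape, the filter options, the haversine distance helper, and the roughly 80 m/min walking pace are illustrative assumptions, not part of the concept itself.

```typescript
// Minimal sketch of location-aware event filtering with walk-time estimates.
// The Event shape, filter fields, and 80 m/min walking pace are assumptions
// made for illustration.

interface Event {
  name: string;
  lat: number;
  lon: number;
  start: Date;
  tags: string[]; // e.g. "music", "quiet", "free"
  price: number;  // 0 for free events
}

interface Filters {
  interests: string[];
  maxPrice: number;
  maxWalkMinutes: number;
  withinHours: number; // only surface events starting soon
}

// Great-circle distance in metres (haversine formula).
function distanceMeters(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const R = 6371e3;
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Assume ~80 m per minute as an average walking speed.
function walkMinutes(meters: number): number {
  return Math.round(meters / 80);
}

// Keep only events matching the user's interests, budget, time window, and
// walking range, sorted so the closest opportunities surface first.
function nearbyEvents(events: Event[], lat: number, lon: number, f: Filters): (Event & { walk: number })[] {
  const horizon = Date.now() + f.withinHours * 3_600_000;
  return events
    .map(e => ({ ...e, walk: walkMinutes(distanceMeters(lat, lon, e.lat, e.lon)) }))
    .filter(e =>
      e.start.getTime() <= horizon &&
      e.walk <= f.maxWalkMinutes &&
      e.price <= f.maxPrice &&
      e.tags.some(t => f.interests.includes(t)))
    .sort((a, b) => a.walk - b.walk);
}
```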
We're surrounded by nutrition data, yet most people still struggle to make healthy food choices. The numbers (calories, macros, serving sizes) don't mean much on their own. They're hard to relate to daily life, and they rarely help people understand how a meal will actually affect their body, energy, or goals.
This mobile app tackles that disconnect by turning food pictures into something people can see, feel, and interact with. It puts the user in control, using simple visuals and personalized feedback to make every food decision more informed and more human.
Left: Exercise Equivalent. Shows how much running or lifting it would take to burn off the meal, making calories more tangible. Users can customize the activities based on their routines or preferred workouts (see the conversion sketch after these panels).
Center: Meal & Nutrients. A photo of the selected food is paired with a radial nutrient chart, breaking down protein, carbs, and fats in an at-a-glance format. Users can retake or upload a new photo to explore the nutrition of their own meals or food options.
Right: Healthier Alternatives. Compares the meal to options like salads, along with a satiety icon showing how filling each one is. Users can explore and modify the alternatives to reflect their dietary needs, preferences, or conditions.
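One plausible way to compute the exercise-equivalent view is with standard MET (metabolic equivalent of task) values, where calories burned per hour is roughly MET × body weight in kilograms. The sketch below assumes a small MET table and a default 70 kg body weight; in the concept, both would come from the user's profile and chosen activities.

```typescript
// Sketch of the "Exercise Equivalent" conversion using MET values
// (kcal/hour ≈ MET × body weight in kg). The activity list and the
// default 70 kg body weight are illustrative assumptions.

const MET: Record<string, number> = {
  running: 9.8, // ~6 mph
  lifting: 6.0, // vigorous weight training
  walking: 3.5,
};

// Minutes of a given activity needed to burn `calories` kcal.
function minutesToBurn(calories: number, activity: string, weightKg = 70): number {
  const kcalPerMinute = (MET[activity] * weightKg) / 60;
  return Math.round(calories / kcalPerMinute);
}

// e.g. a 650 kcal meal: ≈57 min of running or ≈93 min of lifting at 70 kg.
console.log(minutesToBurn(650, "running"), minutesToBurn(650, "lifting"));
```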
When people look for a place to live, building a clear picture of the neighborhood (walkability, green space, safety, transit access) is often fragmented and time-consuming. Current tools fail to support decision-making around complex, personal priorities: users have to bounce between tabs, rely on scattered reviews, or make assumptions. It's a disjointed experience during one of the most important user journeys.
This mobile app integrates a visualization that compares neighborhoods: an interactive heatmap that responds to the user's priorities and criteria.
Live Heatmap: The map updates in real time based on the selected criteria. Color-coded heat spots make it easy to spot high-performing areas at a glance.
Personalized Weighting: Users can check off the factors that matter to them and adjust their weights. For instance, someone might prioritize green space over nightlife, or safety over convenience (see the scoring sketch after this list).
Drill-Down Insights: Tapping on any neighborhood reveals a deeper layer of data—crime rates, school scores, noise levels, commute times—supporting more informed comparisons.
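The personalized weighting maps naturally onto a weighted average of normalized factor scores, which is what the heatmap would render. The factor names, the 0–1 score scale, and the example data below are illustrative assumptions.

```typescript
// Sketch of the weighted scoring behind the live heatmap. Each neighborhood
// carries factor scores normalized to 0–1; the user's checked factors and
// slider weights combine into a single heat value. All names are illustrative.

type Factor = "greenSpace" | "safety" | "nightlife" | "transit";

interface Neighborhood {
  name: string;
  scores: Record<Factor, number>; // each normalized to the 0–1 range
}

// `weights` holds only the factors the user checked, with their slider values.
function heatValue(n: Neighborhood, weights: Partial<Record<Factor, number>>): number {
  let total = 0;
  let weightSum = 0;
  for (const [factor, w] of Object.entries(weights) as [Factor, number][]) {
    total += n.scores[factor] * w;
    weightSum += w;
  }
  // Divide by the weight sum so the heat value stays in 0–1 no matter
  // how many factors are selected.
  return weightSum > 0 ? total / weightSum : 0;
}

// Someone who prioritizes green space over nightlife:
const example: Neighborhood = {
  name: "Example Neighborhood",
  scores: { greenSpace: 0.9, safety: 0.8, nightlife: 0.4, transit: 0.7 },
};
console.log(heatValue(example, { greenSpace: 3, safety: 2, nightlife: 1 }));
// → (0.9*3 + 0.8*2 + 0.4*1) / 6 ≈ 0.78
```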