
Roles

Product Designer, Team Leader, AR Developer

Tools

React Native, Figma, Unity

Time

48 Hours

Platform

iOS

FoodPrint

FoodPrint is a food diary app that lets users automatically log the carbon footprint of their meals with computer vision, and promotes awareness of food sustainability through an AR experience.

With global warming and climate change becoming a global crisis, more and more people are aware of their impact on the environment and opt for a more sustainable lifestyle.

 

According to an analysis by Joseph Poore and Thomas Nemecek (2018), published in Science, food is responsible for approximately 26% of global greenhouse gas emissions. When we try to reduce our carbon footprint in daily life, food is one of the most important factors to look into.

While most people have the intention to make changes, it is difficult to stay consistent and motivated. We want to see our progress in reducing our carbon footprint, but when it comes to food, that progress is difficult to quantify and visualize.

How do we motivate people to maintain a sustainable food lifestyle?

With that in mind, we decided to create a tool to make that process easier. We aimed to develop a mobile app using React Native.

The Challenge

When we try to track the carbon footprint of our food, several obstacles come up:

  1. We eat every few hours, and it is tedious to write down every single thing we eat.

  2. After writing everything down, we need to look up the carbon footprint data for each food.

  3. Even with the data, raw carbon footprint numbers can be too abstract to make sense of.

  4. Once we understand our carbon footprint, we also need to track our own progress.

Looking these up online, we found solutions developed for each of these obstacles:

  1. Use a food journal app.

  2. Look it up in a carbon footprint database.

  3. Use a GHG calculator and compare the result to average emission data.

  4. Use a spreadsheet to track the progress.

 

However, putting these together is a very long and complicated process, especially given that we have to do it multiple times a day.

Defining the Problem

How do we streamline the process of logging our carbon footprint?

Ideation

We can use food pictures as an entry point.

Our early adopters are people who like taking pictures of their meals.

To streamline this process, we want to connect all four of the steps. Our app consists of the four components above, connected using tools and APIs.

After you add a meal to your food journal, the app automates the rest of the process and directly shows your impact and GHG emissions.
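To make that flow concrete, here is a minimal sketch in TypeScript of how the four steps could chain together. The service boundaries (recognizeFoods, lookupFootprintKg, saveEntry) are hypothetical stand-ins for the computer vision API, the carbon footprint database, and the app's storage, not our actual implementation.

```typescript
// Hypothetical service boundaries: a vision API, a footprint
// database, and the app's diary storage.
declare function recognizeFoods(photoUri: string): Promise<string[]>;
declare function lookupFootprintKg(foodName: string): Promise<number>;
declare function saveEntry(entry: DiaryEntry): Promise<void>;

interface FoodItem {
  name: string;
  footprintKgCO2e: number; // kg of CO2-equivalent for this item
}

interface DiaryEntry {
  takenAt: Date;
  items: FoodItem[];
  totalKgCO2e: number;
}

// Steps 1-2: recognize the foods in a photo, look up each footprint.
// Steps 3-4: aggregate so the UI can show impact and progress directly.
async function logMealFromPhoto(photoUri: string, takenAt: Date): Promise<DiaryEntry> {
  const foodNames = await recognizeFoods(photoUri);
  const items = await Promise.all(
    foodNames.map(async (name) => ({
      name,
      footprintKgCO2e: await lookupFootprintKg(name),
    })),
  );
  const entry: DiaryEntry = {
    takenAt,
    items,
    totalKgCO2e: items.reduce((sum, item) => sum + item.footprintKgCO2e, 0),
  };
  await saveEntry(entry);
  return entry;
}
```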

 

However, there are still two problems left unsolved:

  1. How can users easily log everything they eat in a day?

  2. How do we mark their progress?

 

The first step is the biggest point of resistance a user might face in the experience. Developing a new habit is very difficult. However, we noticed that many people already have a habit of taking pictures of their food.

We identified our core users as people who already have a habit of logging their food activities.

The key interactions occur in the input and output of our app. To convert users' existing habits into our product, we decided to explore computer vision. Using computer vision APIs, users can seamlessly identify foods with their phone. This also allows users to use photos from their album and update their food journal once a day.



Gamify the Experience - AR Island

The last remaining part of our experience is how we show and mark our users' progress. Beyond tables and graphs, we want to establish a deeper connection with the users.

We decided to make an "island" that represents our world and the impact climate change has on it.

Using categories from our data, the carbon footprint impact is translated into the conditions of the island: sea level, animals, trees, and air quality. Using AR, we blur the boundary between the virtual island and the real world.

Users can select a place and pin the virtual island there. The island gets "flooded" when the user exceeds the maximum carbon emission, and the number of animals and the air transparency reflect the carbon emissions from land use, farming, and food processing.
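To make the mapping concrete, here is a minimal sketch, in TypeScript, of how a day's emissions could drive the island's condition. The daily budget, the animal cap, and the field names are illustrative assumptions, not the tuning we used.

```typescript
// Illustrative mapping from a day's footprint to the island's state.

interface EmissionBreakdown {
  landUseKg: number;     // kg CO2e from land use
  farmKg: number;        // kg CO2e from farming
  processingKg: number;  // kg CO2e from food processing
}

interface IslandState {
  flooded: boolean;         // sea level rises past the shoreline
  animalCount: number;      // fewer animals as emissions grow
  airTransparency: number;  // 0 = hazy, 1 = clear
}

const DAILY_BUDGET_KG = 5; // hypothetical per-day carbon budget
const MAX_ANIMALS = 12;    // hypothetical animal cap

function islandStateFor(day: EmissionBreakdown): IslandState {
  const total = day.landUseKg + day.farmKg + day.processingKg;
  const usage = Math.min(total / DAILY_BUDGET_KG, 1); // clamp to [0, 1]
  return {
    flooded: total > DAILY_BUDGET_KG, // exceeding the budget floods the island
    animalCount: Math.round(MAX_ANIMALS * (1 - usage)),
    airTransparency: 1 - usage,
  };
}
```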


User Flow

Wireframe

We sketched out wireframes for our experience to figure out the layout. In the first iteration, we put the diary, the data, and the insights all in a flat card structure.

Iteration

Distinguish the Homepage Structure

 

The flat structure makes it difficult to distinguish between different functions. We chose to structure the diary and the data into two layers. This keeps the user focused on one layer at a time and makes it efficient to navigate between them.
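As a rough sketch of that two-layer structure, assuming React Navigation in our React Native app (the screen names and contents are placeholders):

```tsx
import * as React from 'react';
import { Button, Text, View } from 'react-native';
import { NavigationContainer } from '@react-navigation/native';
import { createNativeStackNavigator } from '@react-navigation/native-stack';

const Stack = createNativeStackNavigator();

// Layer 1: the diary, where users spend most of their time.
function DiaryScreen({ navigation }: { navigation: any }) {
  return (
    <View>
      <Text>Food Diary</Text>
      <Button title="See my impact" onPress={() => navigation.navigate('Data')} />
    </View>
  );
}

// Layer 2: the carbon footprint data and insights.
function DataScreen() {
  return <Text>Carbon Footprint Data</Text>;
}

export default function App() {
  return (
    <NavigationContainer>
      <Stack.Navigator initialRouteName="Diary">
        <Stack.Screen name="Diary" component={DiaryScreen} />
        <Stack.Screen name="Data" component={DataScreen} />
      </Stack.Navigator>
    </NavigationContainer>
  );
}
```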

Our Goal

1. Increase the efficiency of logging meals and calculating the carbon footprint.

2. Serve as a food journal to attract users.

3. Solve the problems caused by failed picture recognition.

4. Decrease the burden of logging every meal.


Improving the Object Detection experience

Instead of letting the computer vision API guess all the foods, we show every detected food and let the user choose what they actually had. To handle error cases, the user can easily pick the correct result.

In addition, users can search for what they ate and input it manually.
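A small sketch of that selection flow in TypeScript; the confidence threshold is an assumption, and searchFoodDatabase is a hypothetical manual-search fallback.

```typescript
interface Candidate {
  name: string;
  confidence: number; // 0..1 score from the vision API
}

// Hypothetical manual-search fallback against the food database.
declare function searchFoodDatabase(query: string): Promise<string[]>;

// Show every plausible candidate instead of auto-picking the top
// guess; the user taps the ones that match their meal.
function candidatesToShow(detected: Candidate[], minConfidence = 0.2): Candidate[] {
  return detected
    .filter((c) => c.confidence >= minConfidence)
    .sort((a, b) => b.confidence - a.confidence);
}

// If nothing useful was detected, fall back to manual search.
async function resolveMeal(detected: Candidate[], manualQuery?: string): Promise<string[]> {
  const shown = candidatesToShow(detected);
  if (shown.length > 0) return shown.map((c) => c.name); // user confirms these
  return manualQuery ? searchFoodDatabase(manualQuery) : [];
}
```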

Forget about the App

 

We want the experience to take as few steps as possible.

Opening the app every time we eat is a burden, so we allow users to open the app once a day, or even once a week, and add the meals from their photo album all at once.

We also designed an onboarding test, and we fill in placeholder data when users forget to log.
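A minimal sketch of that batch import in TypeScript, assuming album photos carry timestamps; days without photos are flagged so placeholder (average) data can be shown instead.

```typescript
interface AlbumPhoto {
  uri: string;
  takenAt: Date;
}

interface DayLog {
  date: string;         // YYYY-MM-DD
  photoUris: string[];  // empty when the day was never logged
  placeholder: boolean; // true when we fall back to average data
}

function dayKey(d: Date): string {
  return d.toISOString().slice(0, 10);
}

// Build one log per day between `from` and `to` (inclusive),
// grouping album photos by the day they were taken.
function buildDayLogs(photos: AlbumPhoto[], from: Date, to: Date): DayLog[] {
  const byDay = new Map<string, string[]>();
  for (const p of photos) {
    const key = dayKey(p.takenAt);
    byDay.set(key, [...(byDay.get(key) ?? []), p.uri]);
  }

  const logs: DayLog[] = [];
  for (let d = new Date(from); d <= to; d.setDate(d.getDate() + 1)) {
    const key = dayKey(d);
    const uris = byDay.get(key);
    logs.push({
      date: key,
      photoUris: uris ?? [],
      placeholder: uris === undefined, // show average emission data here
    });
  }
  return logs;
}
```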

The Solution

Our final solution is a food diary app that also tracks your carbon footprint.

 

Using a recipe API and a computer vision API, the app can automatically recognize the foods in a picture. We designed the experience to help users work with computer vision smoothly and intuitively.

 

As a fun part, we also developed an AR mini game inside the app to illustrate our impact on the environment.

Next Steps

We need to answer several key questions for our UX design:

  • What is the average time to log a meal?

  • How many people would keep using the app in the long term?

  • How easy do users find logging their meals?

If we want to move forward with the product, we need a more rigorous approach to developing and designing it:

  • Invite people to use the minimum viable product.

  • Test our hypotheses on the pain points by interviewing users.

  • Analyze the priority of the key features related to our problems.

  • Understand the capacity and limits of the technology we use.

Takeaway

Working on this hackathon project was a very exciting experience. We started from a concept and ended up with a working MVP in two days. I am very proud that we used our knowledge and skill sets to create a unique experience for environmental awareness, and I hope a better UX will lead more people to shift to a lifestyle with lower carbon emissions. We also learned a lot from this project.

How to prioritize the pain points

With or without a time constraint, it is important to prioritize which UX problems to work on first. If we had taken some time to think through all the needs, we would have implemented the album idea earlier.

Motion design is very helpful

The motion design really helped both the developer and the designer adopt an intuitive user perspective. Problems and new challenges often emerged naturally from the motion prototype.

Always test with users

We first thought testing with users would take too much time for a hackathon, but it turned out to save time, because users give honest reactions when we are too caught up in our own thinking. Even if we could not find more general testers, it was still worth testing with anyone we could find.