
Labelbox | February 15, 2021

American University of Beirut takes on dietary behavior research with ML

Researchers at the American University of Beirut have embarked on a project funded by the International Development Research Centre (IDRC) to build two machine learning models that classify images of food taken by wearable cameras. The models are being trained to identify whether an image shows the camera-wearer consuming food, other people consuming food, a food outlet, or an advertisement depicting food.

A sample of the images used to train ML models for dietary analysis.

The project will pave the way for machine learning to be used for dietary analysis, which is typically performed by nutritionists. Currently, dietary analysis is a time-consuming process that’s often impacted by poor recall and bias, but it provides important insights into food intake patterns, quality, and quantity in an individual or population. Relying on an ML model can speed up this process and make it much easier to scale.

The team designed the models around convolutional neural network (CNN) architectures and is using transfer learning to accelerate training. The models are trained with supervised deep learning methods, which depend on high-quality training data, so the researchers turned to Labelbox for its intuitive annotation interface and labeling operations team.
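The article only mentions that the models are CNNs trained with transfer learning and supervised labels, so the sketch below illustrates the general approach rather than the team's actual implementation. It assumes a PyTorch/torchvision setup with an ImageNet-pretrained ResNet-18 backbone; the choice of backbone, the layer-freezing strategy, and the hyperparameters are illustrative assumptions, while the four output classes match the categories described above.

```python
import torch
import torch.nn as nn
from torchvision import models

# Four categories described in the article: camera-wearer eating,
# other people eating, food outlets, and food advertisements.
NUM_CLASSES = 4

# Start from an ImageNet-pretrained CNN (ResNet-18 chosen here as an
# example backbone; the article does not specify the architecture used).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Transfer learning: freeze the pretrained convolutional layers so only
# the new classification head is trained on the labeled camera images.
for param in model.parameters():
    param.requires_grad = False

# Replace the final fully connected layer with a 4-way classifier.
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Standard supervised setup: cross-entropy loss over the four labels,
# optimizing only the parameters of the new head.
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)

def train_step(images, labels):
    """One supervised update on a batch of labeled, preprocessed images."""
    optimizer.zero_grad()
    outputs = model(images)            # logits, shape (batch, 4)
    loss = criterion(outputs, labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Freezing the backbone and training only the classification head is one common form of transfer learning; in practice the team may also fine-tune some or all of the pretrained layers once the head has converged.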

Within Labelbox, the researchers made use of the clear pipeline from raw data to production, as well as QA tools to identify mislabeled images. They also leveraged Labelbox Workforce to accelerate the labeling process. With Labelbox, the research team reduced labeling time from thirteen seconds per image to seven, a 46% reduction that also lowered costs for the research team.

“The labeling team did a great job...the overall experience with Labelbox was awesome!” said Zoulfikar Shmayssani, a researcher at the American University of Beirut.

Learn how the Labelbox Workforce could help label training data for your project.