Using Crowdsourced Food Image Data for Assessing Restaurant Nutrition Environment: A Validation Study

Bibliographic Details
Published in: Nutrients, Vol. 15, No. 19, p. 4287
Main Authors: Lyu, Weixuan; Seok, Nina; Chen, Xiang; Xu, Ran
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 08.10.2023
ISSN: 2072-6643
DOI: 10.3390/nu15194287

More Information
Summary: Crowdsourced online food images, when combined with food image recognition technologies, have the potential to offer a cost-effective and scalable solution for the assessment of the restaurant nutrition environment. While previous research has explored this approach and validated the accuracy of food image recognition technologies, much remains unknown about the validity of crowdsourced food images as the primary data source for large-scale assessments. In this paper, we collect data from multiple sources and comprehensively examine the validity of using crowdsourced food images for assessing the restaurant nutrition environment in the Greater Hartford region. Our results indicate that while crowdsourced food images are useful in terms of the initial assessment of restaurant nutrition quality and the identification of popular food items, they are subject to selection bias on multiple levels and do not fully represent the restaurant nutrition quality or customers’ dietary behaviors. If employed, the food image data must be supplemented with alternative data sources, such as field surveys, store audits, and commercial data, to offer a more representative assessment of the restaurant nutrition environment.