In Frontiers in Nutrition
Image-based food recognition and weight estimation have long been hotspots in computer vision and medical nutrition, with promising applications in digital nutrition therapy and health monitoring. With the development of deep learning, image-based recognition has rapidly extended to many fields, such as agricultural pest and disease identification, tumor marker recognition, wound severity assessment, road wear recognition, and food safety detection. This article proposes a non-wearable food recognition and weight estimation system (nWFWS) that identifies the food type and food weight in the target recognition area via smartphone, so as to assist clinical patients and physicians in monitoring diet-related health conditions. The system is designed mainly for mobile terminals and can be installed on phones running either Android or iOS; this lowers the cost and burden of additional wearable health monitoring equipment while greatly simplifying automatic estimation of food intake through mobile phone photography and image collection. The system accurately identified 1,455 food pictures with an accuracy of 89.60%. Using a deep convolutional neural network and a visual-inertial system, we collected the pixel areas of 612 high-resolution food images with different traits; after systematic training, a preliminary relationship model between food pixel area and measured weight was obtained, and the weights of untested food images were successfully estimated. The predicted values were highly correlated with the actual values. In summary, the system is feasible and relatively accurate for automated dietary monitoring and nutritional assessment.
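The abstract describes a relationship model mapping food pixel area to measured weight. As a rough illustration of that idea (not the authors' code), the following minimal Python sketch fits a simple linear model on hypothetical area-weight pairs and uses it to estimate the weight of a new image; the data values, the linear form, and the function names are all assumptions for demonstration only.

```python
# Minimal sketch (hypothetical data, not the authors' method): fit a
# pixel-area-to-weight relationship with ordinary least squares.
import numpy as np

# Hypothetical training data: segmented food pixel areas (in pixels)
# and the corresponding measured weights (in grams).
pixel_areas = np.array([12_500, 23_400, 31_800, 45_100, 58_900], dtype=float)
weights_g = np.array([85.0, 150.0, 210.0, 295.0, 380.0])

# Fit a simple linear model: weight ~ a * area + b.
a, b = np.polyfit(pixel_areas, weights_g, deg=1)

def estimate_weight(area_px: float) -> float:
    """Predict food weight (g) from its segmented pixel area."""
    return a * area_px + b

# Example: estimate the weight of an unseen food image with 40,000 px of food area.
print(f"Estimated weight: {estimate_weight(40_000):.1f} g")
```

In practice, the mapping from pixel area to weight depends on food type and imaging geometry, so a per-category or nonlinear model trained on calibrated images would be needed rather than a single global line.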
Zhang Qinqiu, He Chengyuan, Qin Wen, Liu Decai, Yin Jun, Long Zhiwen, He Huimin, Sun Ho Ching, Xu Huilin
2022
elimination hardware, food recognition, machine learning, nutrition monitoring, weight estimation