Household Food Waste Monitoring System Using Convolution Neural Networks

2023 IEEE 15th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment, and Management (HNICEM)
(2023), pp. 1-6
Reylwin N. Caña a, Rozanna Dixie D. Berbano a, Prince O. Sarmiento a, Jocelyn M. Amor a, Rex Paolo C. Gamara a
a Electronics Engineering Department, FEU Institute of Technology, Manila, Philippines
Abstract: Despite awareness of the environmental, societal, and financial consequences of food waste, no behavioral change toward food waste reduction is seen in the majority of Filipino households. This study focuses on the creation of a food waste monitoring system that provides accurate figures to incite constructive behavioral change. Through the use of Convolutional Neural Networks coupled with image processing techniques, the system identifies and classifies food waste items, measures weight, and collects data for the consumers' viewing in a manner that is both informative and user-friendly. The device developed by the proponents is built around two Raspberry Pi 4 boards and one Arduino Uno board, which communicate simultaneously to process the given food waste. Two RPi cameras record in HD, detect food waste, and send the resulting data for processing. The weight is then estimated using a height input from the user, and the EfficientDet algorithm is used to detect the different classes of food waste. Using 2D images captured by the RPi cameras, the proponents gathered images to build data sets and trained the device to determine weight and food class. After these steps, the food waste is thrown away automatically via the moving platform. After conducting tests, the proponents produced a device with a detection accuracy rate of 94.0952%, an error rate of 5.9075%, and a precision level of 96.42156%. For weight detection, the device achieves an overall mean absolute percentage error of 9.9898%. Additionally, the tilting platform demonstrates a relative accuracy of 98.8889% with a success rate of 100%.
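As a rough illustration of the weight-error figure reported in the abstract, the sketch below computes mean absolute percentage error (MAPE) over paired weight readings. The sample values are hypothetical and are not drawn from the paper's data; this is a minimal sketch of the standard metric, not the authors' evaluation code.

```python
def mape(actual, predicted):
    """Mean absolute percentage error (%) over paired measurements.

    actual:    true weights (e.g., grams from a reference scale)
    predicted: weights estimated by the device
    """
    if len(actual) != len(predicted) or not actual:
        raise ValueError("inputs must be non-empty and the same length")
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual) * 100


# Hypothetical readings for illustration only (grams).
reference = [100.0, 250.0, 80.0]
estimated = [92.0, 260.0, 85.0]
print(f"MAPE: {mape(reference, estimated):.4f}%")
```

A MAPE near 10% would correspond to the 9.9898% overall error the study reports for weight detection.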