
We were inspired to do this project as two team members have relatives with diabetes who have to manually calculate how much insulin to take after eating food. We wanted to simplify this process for them and make their lives easier.
This application takes the food consumed as input, either as a photo or through selection from a UCSC dining hall menu. The application then pulls the food item's carbohydrate content from a database built on official UCSC nutritional data and calculates a suggested number of insulin units based on the user's insulin-to-carbohydrate ratio.
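To illustrate the dose calculation, here is a minimal sketch assuming the insulin-to-carbohydrate ratio is stored as grams of carbohydrate per insulin unit; the function and field names are illustrative, not the app's actual code.

```typescript
// Illustrative sketch of the carb-to-insulin calculation.
interface FoodItem {
  name: string;
  carbsGrams: number; // carbohydrates per serving, from the nutritional database
}

function suggestedInsulinUnits(items: FoodItem[], carbsPerUnit: number): number {
  // Sum the carbohydrates across everything the user selected or photographed.
  const totalCarbs = items.reduce((sum, item) => sum + item.carbsGrams, 0);
  // Units = total carbs / insulin-to-carb ratio, rounded to the nearest half unit.
  return Math.round((totalCarbs / carbsPerUnit) * 2) / 2;
}

// Example: 60 g of carbs with a 1:15 ratio suggests 4 units.
console.log(suggestedInsulinUnits([{ name: "Pasta", carbsGrams: 60 }], 15));
```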
We built the website using React JS and trained the food recognition model using COCO. For authentication and the database, we are using Firebase.
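As a rough sketch of how the Firebase pieces could fit together (using the modular v9 SDK), something like the following would sign a user in and look up a menu item's carbohydrates; the config values, credentials, collection name, and field name here are placeholders, not our actual setup.

```typescript
import { initializeApp } from "firebase/app";
import { getAuth, signInWithEmailAndPassword } from "firebase/auth";
import { getFirestore, doc, getDoc } from "firebase/firestore";

// Placeholder project configuration.
const app = initializeApp({ apiKey: "...", projectId: "..." });
const auth = getAuth(app);
const db = getFirestore(app);

// Sign the user in, then fetch the carbohydrate count for a dining hall menu item.
async function fetchCarbs(itemId: string): Promise<number | undefined> {
  await signInWithEmailAndPassword(auth, "user@example.com", "password"); // illustrative credentials
  const snapshot = await getDoc(doc(db, "menuItems", itemId)); // "menuItems" is an assumed collection name
  return snapshot.exists() ? (snapshot.data().carbsGrams as number) : undefined;
}
```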
All of us had minimal to no experience with React JS, so it took many painful hours of debugging and experimentation to get the website up and running.
Sankritya was able to train the food recognition model in just four hours! I (Sierra) am proud of how much we accomplished on the website despite having essentially zero React experience.
We learned Firebase, React JS, and how to train an image recognition model using images sourced through web scraping.
Next, we plan to expand the AI model to recognize more foods, expand the menu to include foods beyond the UCSC Dining Hall, and update the menu to list options depending on the time of day and dining hall.