
# snapdiet




This is a food recognition app I built by running a TensorFlow model inside a Swift iOS app. A basic tutorial is included, but I'm happy to add more if there are enough requests!

Live Demo


*Notice the recognition confidence shown at the bottom left.

Technology in the project

Donate: donations will be put back into tutorials (but please don't feel like it's necessary).

Background

This app imports a TensorFlow image recognition model into a Swift app and runs the model every third of a second. The training dataset used to create the model can be found in the Food 101 Keras dataset or the FoodCam Japanese dataset.
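Running a model on every single camera frame would waste battery, so the app only runs inference about every third of a second. A minimal sketch of that throttling logic (the type and names here are illustrative, not taken from the repo):

```swift
import Foundation

/// Decides whether enough time has passed to run the model again.
/// Sketch only: the real app wires this into the camera callback.
struct InferenceThrottle {
    let interval: TimeInterval            // minimum seconds between runs
    private(set) var lastRun: TimeInterval = -.infinity

    init(interval: TimeInterval = 1.0 / 3.0) {
        self.interval = interval
    }

    /// Returns true (and records the run) if `now` is at least
    /// `interval` seconds after the previous run.
    mutating func shouldRun(now: TimeInterval) -> Bool {
        guard now - lastRun >= interval else { return false }
        lastRun = now
        return true
    }
}
```

In the video-capture delegate you would call `shouldRun(now:)` with the current media time and only invoke the model when it returns true.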

Since robust training sets are essential to creating accurate models, I also built a script that pulls images from Flickr and adds them to the dataset (please feel free to reach out if you would like it). This project was built in conjunction with Morten Just's Trainer Mac.
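The image-pulling script itself isn't included in the repo, but the core of such a script is just building a Flickr search request per food label. A hedged sketch of that step (the helper name is mine; you would supply your own API key):

```swift
import Foundation

/// Builds a Flickr `flickr.photos.search` request URL for one food label.
/// Illustrative only — the actual dataset script is not in this repo.
func flickrSearchURL(apiKey: String, tag: String, perPage: Int = 100) -> URL? {
    var components = URLComponents(string: "https://api.flickr.com/services/rest/")
    components?.queryItems = [
        URLQueryItem(name: "method", value: "flickr.photos.search"),
        URLQueryItem(name: "api_key", value: apiKey),
        URLQueryItem(name: "tags", value: tag),              // e.g. "sushi"
        URLQueryItem(name: "per_page", value: String(perPage)),
        URLQueryItem(name: "format", value: "json"),
        URLQueryItem(name: "nojsoncallback", value: "1"),
    ]
    return components?.url
}
```

From there the script would fetch the returned photo URLs and save them into the per-label training folders that Trainer Mac expects.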

Key Files

This view contains the bulk of the code linking the video stream to the calorie data for the identified food.

            if confidence > 0.10 {
                // Show the top two guesses with their confidence percentages
                machineGuess2.text = "\(outPut[0].key): \(Int(outPut[0].value))%"
                machineGuess3.text = "\(outPut[1].key): \(Int(outPut[1].value))%"
                label = outPut[0].key
                secondLabel = outPut[1].key
            }
            // Change the trigger confidence in the Config file
            if confidence > Config.confidence {
                presentSeenObject(label: label)
            }
        }

Allows you to set the confidence threshold that determines when a food is confirmed (`static var confidence = 0.9`).

This is a collection view embedded in a table view, which creates a very clean interface.


### Last Points

This is a fairly large project with a backend written in Node.js (not included), so it may not work as expected without some dev time. Please let me know your thoughts!