An application that uses the CoreML library and various mlmodels to identify surrounding objects in real time and reports identified objects through speech. This app is intended as a visual aid for the visually impaired, under stationary conditions ONLY.
The application uses the CoreML model VGG16. Download the mlmodel here and add it to the Xcode project. After the mlmodel is added, check 3rd Eye under Target Membership.
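Once the model is part of the target, classification and speech output can be wired together with the Vision and AVFoundation frameworks. The sketch below is a minimal illustration, assuming the auto-generated `VGG16` class that Xcode produces from the mlmodel; the actual app's pipeline may differ.

```swift
import Vision
import CoreML
import AVFoundation

// Shared synthesizer so speech isn't cut off when the function returns.
let synthesizer = AVSpeechSynthesizer()

// Classify a camera frame with VGG16 and speak the top-ranked label.
// `pixelBuffer` would come from an AVCaptureSession's video output.
func identifyAndSpeak(pixelBuffer: CVPixelBuffer) {
    // Wrap the auto-generated VGG16 model for use with Vision.
    guard let model = try? VNCoreMLModel(for: VGG16().model) else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // Take the highest-confidence classification and announce it.
        guard let top = (request.results as? [VNClassificationObservation])?.first
        else { return }
        let utterance = AVSpeechUtterance(string: top.identifier)
        synthesizer.speak(utterance)
    }

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```

Because the app targets stationary use, classifying frames at a throttled interval (rather than on every camera frame) keeps the device responsive and avoids overlapping speech.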