AliceSpeaks™

The AliceSpeaks™ application is a native iOS build that uses emerging technologies, including the Apple Vision and Core ML frameworks, to identify items. Once an item is identified, the user can tap the speech button to have it spoken aloud in English and in a translation language of their choice.

How It Works

- Once the application is installed on the device, the user simply opens it to see the UI.

- In the bottom center of the app is the 'Speech' button. When tapped, the device attempts to identify the item in view and presents a textual and audio representation of it in English and in the selected translation language (see the sketch after this list).

- In the bottom right corner, users can tap the 'Language Selection' button to open a selection menu.
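The audio portion of this flow can be sketched with AVFoundation's speech synthesizer. The snippet below is a minimal illustration rather than the app's actual code; the `speakIdentifiedItem` function, its parameters, and the language codes are assumptions made for the example.

```swift
import AVFoundation

// Minimal sketch of the speech step: speak the identified item in English,
// then speak its translation in the user-selected language.
// The function name and parameters are hypothetical, not AliceSpeaks' actual API.
let synthesizer = AVSpeechSynthesizer()

func speakIdentifiedItem(english: String,
                         translated: String,
                         languageCode: String) {
    // Speak the English label first.
    let englishUtterance = AVSpeechUtterance(string: english)
    englishUtterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    synthesizer.speak(englishUtterance)

    // Then speak the translated label in the selected language (e.g. "es-ES").
    let translatedUtterance = AVSpeechUtterance(string: translated)
    translatedUtterance.voice = AVSpeechSynthesisVoice(language: languageCode)
    synthesizer.speak(translatedUtterance)
}

// Example usage, assuming Spanish was selected:
speakIdentifiedItem(english: "apple", translated: "manzana", languageCode: "es-ES")
```

AVSpeechSynthesizer queues utterances, so the English and translated phrases are spoken one after the other.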

Technologies Used

Apple Vision Framework

The Vision framework performs face and face landmark detection, text detection, barcode recognition, image registration, and general feature tracking. Vision also allows the use of custom Core ML models for tasks like classification or object detection.
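As a rough illustration of how Vision can drive a custom Core ML classifier, the sketch below wraps a model in a VNCoreMLRequest and runs it against an image. `AliceClassifier` is a placeholder for whatever .mlmodel the app actually bundles; the real model and function names are not specified in this README.

```swift
import UIKit
import Vision
import CoreML

// Sketch of running a custom Core ML classifier through Vision.
// `AliceClassifier` is a placeholder; Xcode generates a Swift class with the
// model's name when a .mlmodel file is added to the project.
func identifyItem(in image: UIImage, completion: @escaping (String?) -> Void) {
    guard let cgImage = image.cgImage,
          let coreMLModel = try? AliceClassifier(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
        completion(nil)
        return
    }

    // Vision request that feeds the image to the Core ML model.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Classification results come back sorted by confidence.
        let best = (request.results as? [VNClassificationObservation])?.first
        completion(best?.identifier)
    }

    // Run the request off the main thread.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```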

Apple Core ML Framework

Core ML 2 lets you integrate a broad variety of machine learning model types into your app. In addition to supporting extensive deep learning with over 30 layer types, it also supports standard models such as tree ensembles, SVMs, and generalized linear models. Because it’s built on top of low-level technologies like Metal and Accelerate, Core ML seamlessly takes advantage of the CPU and GPU to provide maximum performance and efficiency. Models run entirely on the device, so data never needs to leave it to be analyzed.
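A Core ML model can also be invoked directly, without Vision, through the Swift class Xcode generates from the bundled .mlmodel file. The sketch below assumes a hypothetical `AliceClassifier` model whose image input is named `image`; both names are illustrative only.

```swift
import CoreML
import CoreVideo

// Direct, on-device Core ML prediction.
// `AliceClassifier` and its `image` input are placeholder assumptions;
// Xcode generates the real class and input names from the bundled .mlmodel.
func classify(pixelBuffer: CVPixelBuffer) -> String? {
    let config = MLModelConfiguration()
    config.computeUnits = .all   // let Core ML choose CPU, GPU, or Neural Engine

    guard let model = try? AliceClassifier(configuration: config),
          let output = try? model.prediction(image: pixelBuffer) else {
        return nil
    }
    // Generated classifier classes expose the top prediction as `classLabel`.
    return output.classLabel
}
```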

Google Translate API

The Google Cloud Translation API can dynamically translate text between thousands of language pairs. The Cloud Translation API lets websites and programs integrate with the translation service programmatically. The Google Translation API is part of the larger Cloud Machine Learning API family.
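A call to the Cloud Translation API's v2 REST endpoint can be sketched as a simple HTTPS request. The API key placeholder, parameter choices, and response handling below are illustrative assumptions; consult Google's documentation for the exact contract.

```swift
import Foundation

// Sketch of translating a word with the Cloud Translation API (v2 REST endpoint).
// Replace YOUR_API_KEY with a real key; error handling is kept minimal here.
func translate(_ text: String, to target: String,
               completion: @escaping (String?) -> Void) {
    var components = URLComponents(string: "https://translation.googleapis.com/language/translate/v2")!
    components.queryItems = [
        URLQueryItem(name: "key", value: "YOUR_API_KEY"),
        URLQueryItem(name: "q", value: text),
        URLQueryItem(name: "target", value: target)   // e.g. "es" for Spanish
    ]

    var request = URLRequest(url: components.url!)
    request.httpMethod = "POST"

    URLSession.shared.dataTask(with: request) { data, _, _ in
        // Expected shape: {"data": {"translations": [{"translatedText": "..."}]}}
        guard let data = data,
              let object = try? JSONSerialization.jsonObject(with: data),
              let json = object as? [String: Any],
              let payload = json["data"] as? [String: Any],
              let translations = payload["translations"] as? [[String: Any]],
              let translated = translations.first?["translatedText"] as? String else {
            completion(nil)
            return
        }
        completion(translated)
    }.resume()
}

// Example usage:
translate("apple", to: "es") { print($0 ?? "translation failed") }
```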

Where Can You Find Us?

Copyright © 2019 NeoWare Inc.
