An augmented reality experiment in React Native
Try the beta release: https://play.google.com/apps/testing/com.livetranslator
A small, simple app that translates, in real time, the text you see through your phone's camera.
Augmented reality is an overlay of content on the real world; it adds enhancements to the existing reality in order to make it more meaningful.
With the help of Google Cloud services such as Cloud Vision, the information in your surrounding environment can become interactive.
For a thorough discussion, please read the accompanying blog post.
Selecting a language
I created the language selector on Expo Snack, using Animated and PanResponder from React Native.
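The core of such a selector is the snap-to-item math wired into PanResponder's release handler. A minimal sketch of that logic in plain JavaScript, assuming a horizontal row of fixed-width language entries (the item width and function name here are illustrative, not taken from the actual component):

```javascript
// Sketch of the selector's snapping logic. Assumption: the real component
// tracks a horizontal drag with PanResponder and springs an Animated value
// to the result of a function like this when the gesture is released.
const ITEM_WIDTH = 80; // hypothetical width of one language entry, in px

// Given the offset at gesture start and the drag distance (gestureState.dx),
// return the offset that centers the nearest language entry.
function snapOffset(startOffset, dx, itemCount) {
  const raw = startOffset + dx;
  const index = Math.round(raw / ITEM_WIDTH);
  // Clamp so the selector cannot snap past the first or last language.
  const clamped = Math.max(0, Math.min(itemCount - 1, index));
  return clamped * ITEM_WIDTH;
}
```

In the component, the returned offset would be the `toValue` of an `Animated.spring`, which gives the selector its snappy feel.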
Rotating the dialog
The app also uses the device accelerometer to drive an animated transition when the phone is rotated, which improves the user experience.
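The idea can be sketched as a pure function that turns an accelerometer reading into a target roll angle for the animation (the function name and degree convention are assumptions; the real component would feed a value like this into Animated):

```javascript
// Sketch: derive a roll angle, in degrees, from the accelerometer's
// gravity vector. 0 means the phone is upright (portrait); ±90 means
// it has been rotated into landscape.
function rollDegrees(ax, ay) {
  // atan2 of the x and y gravity components gives the rotation of the
  // device around the axis pointing out of the screen.
  return Math.atan2(ax, ay) * (180 / Math.PI);
}
```

Animating toward this target angle (rather than jumping) is what makes the dialog rotation feel smooth.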
Clone the repository
```sh
git clone https://github.com/agrcrobles/react-native-live-translator.git
cd react-native-live-translator
```
Run the app
```sh
react-native run-android  # Android only
react-native run-ios      # iOS only
```
Prerequisites: Node version 6.0 or higher and npm version 3.0 or higher, as well as a running Android or iOS environment (device or emulator).
Configure the Google APIs yourself
Enable the Cloud Vision API; to do so, take a look at this quickstart guide.
Enable the Translate API as explained here.
For further information, look into the getting started docs.
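Both services are consumed over REST. As a sketch, the body of a Cloud Vision text-detection request can be built like this (the request shape follows the public Vision v1 `images:annotate` API; the helper name is made up for illustration):

```javascript
// Build the JSON body for POST https://vision.googleapis.com/v1/images:annotate
// (field names per the public Cloud Vision v1 REST API).
function visionTextRequest(base64Image) {
  return {
    requests: [
      {
        image: { content: base64Image },        // base64-encoded camera frame
        features: [{ type: 'TEXT_DETECTION' }], // OCR the visible text
      },
    ],
  };
}
```

The detected text from the response would then be forwarded to the Translate API with the language pair chosen in the selector.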
Reminder: after enabling the Google Cloud APIs, click the Go to Credentials button to set up your Cloud Translation API credentials.
Finally, add the API key to the project's package.json file.
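For example, the key could live in a top-level field like this (the field name `googleApiKey` is an assumption for illustration; check the project's source for the exact key the app reads):

```json
{
  "name": "react-native-live-translator",
  "googleApiKey": "YOUR_API_KEY_HERE"
}
```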
Supported languages:

- en: English
- es: Spanish
- de: German
The following improvements and PRs would be welcome:

- Improve error handling and encoding
- Move from raw REST calls to the Google Cloud Translation API client library. https://cloud.google.com/translate/docs/apis?hl=es-419
- Stabilize the translated text by computing the Hamming distance between successive OCR results
- An image diff algorithm could also be useful, to avoid the overkill of capturing and processing every frame.
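As a sketch of the Hamming idea above: compare successive OCR results character by character, and skip re-translation when the distance stays small (the length-mismatch handling and names are illustrative):

```javascript
// Count the positions at which two strings differ. Hamming distance is
// only defined for equal lengths, so a length mismatch is treated as
// "completely different" for this use case.
function hammingDistance(a, b) {
  if (a.length !== b.length) {
    return Math.max(a.length, b.length);
  }
  let dist = 0;
  for (let i = 0; i < a.length; i++) {
    if (a[i] !== b[i]) dist++;
  }
  return dist;
}
```

If the distance between the current and previous OCR frames falls under a small threshold, the previous translation can be reused instead of issuing another Translate API call.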