An app that translates what it's looking at in AR. HackNYU 2018 Winner πŸ†


  • πŸ† HackNYU 2018 Winner


As an international student, I wanted to create a tool that makes learning another language easier and more fun. This app combines the immediacy of pointing your phone at an object with the convenience of instantly seeing the word for it in another language.

What it does

This app recognises, names, and translates in real time the objects it is looking at. The target word is first extracted from the live camera feed, then translated into any language through an external API. Finally, both the original (English) and translated words are displayed in augmented reality, which allows them to "stick" to the object they identify.

How it's made

The app was built entirely in Swift. It is based heavily on the work of GitHub user hanleywang. Specifically, the image recognition part of this project was accomplished with the InceptionV3 machine learning model, integrated into the iOS app via Apple's CoreML APIs. The translation component was carried out through the Google Cloud Translate API. Finally, the augmented reality component was made possible by Apple's ARKit framework.
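To illustrate the translation step, here is a minimal sketch of how a request to the Google Cloud Translate v2 REST endpoint could be assembled. The function name and parameters are illustrative, not taken from this repository; the app's actual networking code may be structured differently.

```swift
import Foundation

// Hedged sketch: builds a GET request URL for the Google Cloud Translate v2
// REST API. `apiKey` is a placeholder for a real Google Cloud API key.
func makeTranslateRequestURL(word: String, target: String, apiKey: String) -> URL? {
    var components = URLComponents(string: "https://translation.googleapis.com/language/translate/v2")
    components?.queryItems = [
        URLQueryItem(name: "q", value: word),        // text to translate
        URLQueryItem(name: "source", value: "en"),   // recognized labels are English
        URLQueryItem(name: "target", value: target), // user-chosen language code
        URLQueryItem(name: "key", value: apiKey)
    ]
    return components?.url
}
```

In the app, the word returned by the CoreML classifier would be passed as `word`, and the JSON response parsed for the translated string before rendering it as an AR label.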


1: Install CocoaPods for dependency management (requires Xcode developer tools).

sudo gem install cocoapods

2: While in the repository's root directory, install dependencies.

pod install
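`pod install` reads the `Podfile` at the repository root. As a point of reference, a minimal Podfile for an ARKit project might look like the sketch below; the target name and settings here are illustrative, and the repository's own Podfile is the source of truth.

```ruby
# Illustrative Podfile sketch — not the repository's actual file.
platform :ios, '11.0'   # ARKit requires iOS 11 or later

target 'ARTranslator' do  # hypothetical target name
  use_frameworks!         # required for Swift pods
  # pods declared by the project go here
end
```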