Lost in Translation
NYU ITP 2019 Thesis
An interactive experience that explores how a machine interprets the same thing differently from a human.
Presentation Video in ITP Thesis Week 2019
The project is a recursive process in which human and machine interpret each other's results. In each round, a human writes a sentence describing an image generated by the machine, and the machine runs a chain of machine learning translations on that description: first into a sketch, then into a new image.
An example of multiple translations
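The round-trip described above can be sketched in a few lines of Python. This is a minimal, illustrative skeleton only: the functions below are hypothetical stand-ins, and the real models (im2txt, spaCy, SketchRNN, AttnGAN) are not called here.

```python
# A minimal sketch of the machine half of one round (hypothetical stand-ins,
# not the real im2txt / spaCy / SketchRNN / AttnGAN calls).

def describe_image(image):
    # Placeholder for im2txt: image -> caption
    return "a cat sitting on a chair"

def extract_nouns(sentence):
    # Placeholder for spaCy POS tagging: sentence -> nouns
    return [w for w in sentence.split() if w in {"cat", "chair"}]

def draw_sketch(noun):
    # Placeholder for SketchRNN: noun -> doodle strokes
    return f"<doodle of {noun}>"

def generate_image(sketch):
    # Placeholder for AttnGAN: sketch/description -> image
    return f"<image from {sketch}>"

def one_round(human_sentence):
    """One machine turn: description -> nouns -> sketch -> image."""
    nouns = extract_nouns(human_sentence)
    sketch = draw_sketch(nouns[0]) if nouns else None
    image = generate_image(sketch)
    # The human then describes `image`, starting the next round.
    return image

print(one_round("a cat sitting on a chair"))
```

The human closes the loop by describing the returned image, which becomes the input to the next round.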
Drawception - Picture Telephone Drawing Game
A project uses machine learning to do feedback loop on images and texts.
Jake Elwes - Closed Loop
- Python Server with Flask
- Generate a sentence from an image with im2txt
- Find word tags and extract nouns with spaCy
- Compute word-vector similarity with spaCy
- Draw doodles with SketchRNN
- Generate new images with AttnGAN
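The word-vector similarity step is what lets an arbitrary noun be mapped to the nearest sketch category. Under the hood this is cosine similarity between word vectors; the toy 3-d vectors below are made-up values for illustration, not real spaCy vectors.

```python
import math

# Toy 3-d word vectors (illustrative values, not real spaCy vectors).
vectors = {
    "cat": [0.9, 0.1, 0.3],
    "dog": [0.8, 0.2, 0.3],
    "car": [0.1, 0.9, 0.5],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors, as spaCy's similarity does."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def closest_category(noun, categories):
    """Pick the sketch category whose vector is most similar to the noun."""
    return max(categories, key=lambda c: cosine_similarity(vectors[noun], vectors[c]))

print(closest_category("dog", ["cat", "car"]))  # "cat" is closer to "dog" than "car"
```

With real spaCy vectors the same idea applies, using `token.similarity(other)` on tokens from a model with word vectors loaded.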
Coordinates and processes most of the data.
Uses HTTP to communicate with Runway and the client.
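A minimal Flask endpoint for this server role might look like the following. The route name and payload shape are assumptions for illustration; in the real project the handler would forward the sentence through the Runway model chain.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/describe", methods=["POST"])
def describe():
    # Hypothetical endpoint: receives the human's sentence from the client.
    # In the real server, the im2txt / spaCy / SketchRNN / AttnGAN calls
    # (via Runway) would happen here before responding.
    sentence = request.get_json().get("sentence", "")
    return jsonify({"received": sentence})

if __name__ == "__main__":
    app.run(port=5000)
```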
Presents the result and collects user input.
A JSON file that stores all sketch categories.
Functions to draw sketches.
A test function to draw a sketch.
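The category file gates which nouns can actually be drawn, since SketchRNN only has models for specific categories. A small sketch of that check, with the file's structure assumed to be a flat JSON array (inlined here to stay self-contained):

```python
import json

# Assumed structure of the sketch-category file (inlined for illustration;
# the real project loads it from the JSON file on disk).
categories_json = '["cat", "dog", "bicycle", "tree"]'
categories = json.loads(categories_json)

def sketchable(noun):
    """Check whether SketchRNN has a doodle model for this noun."""
    return noun in categories

print(sketchable("cat"), sketchable("cloud"))
```

Nouns that fail this check are the ones that need the word-vector similarity step to fall back to the closest available category.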
A machine learning model that can generate doodles in specific categories.
The doodle data comes from Quick, Draw! The Data, and the model details are from Magenta's SketchRNN.
The model is downloaded from Google Cloud Platform.