Imotion


Imotion is a project created during UTokyo Research Hackathon 2021.
It understands emotions an external viewer might have while observing an image, and then moves multiple agents in a way that conveys a similar emotion to the viewer.

The original code submitted to the Hackathon can be found in the upstream repository, leavez529/Imotion.
This repository extends the original code with Docker support, includes bug fixes, and slightly modifies the visuals.
Most of the credit for the original source code goes to:
- adata111
- leavez529
- Tanvisn


Inspiration for this idea comes from the following paper:

Santos, M., Egerstedt, M. From Motions to Emotions: Can the Fundamental Emotions be Expressed in a Robot Swarm?. Int J of Soc Robotics 13, 751–764 (2021). https://doi.org/10.1007/s12369-020-00665-6

The name Imotion is a combination of Image and Motion.

Getting started 🚀

There are three ways to start the Imotion webserver:

After starting the Imotion webserver, you can access the webservice at: http://localhost:5000
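
To quickly confirm the server is reachable, you can send a plain HTTP request to that address (a generic check, not specific to Imotion):

curl http://localhost:5000   # should return the landing page HTML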

Pulling Docker image 🐳

Simply pull & run the Docker image:

docker run -dp 5000:5000 makokaz/imotion

That's all! 🎉

Building Docker image 🐳

  1. First, clone this project:

    git clone --recurse-submodules https://github.com/makokaz/imotion.git
  2. Build the Docker image:

    docker build -t imotion .
  3. Run the Docker image:

    docker run -dp 5000:5000 imotion
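
Whichever Docker method you use, you can check that the container is up and inspect its output with the standard Docker commands (the container ID below is a placeholder; copy it from the docker ps output):

docker ps                   # the container should be listed with port 5000 mapped
docker logs <container-id>  # view the container's log output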

Running from source

  1. First, clone this project:

    git clone --recurse-submodules https://github.com/makokaz/imotion.git
  2. Download the trained artemis model from Google Drive and place it at ./server/checkpoints/best_model.pt.

    Note: If the model is no longer available in the Google Drive folder, it must first be trained as explained in the artemis repo.

  3. In the root folder, run

    pip install -e ./artemis/ && pip install -e . && python -m textblob.download_corpora
  4. To start the webserver, run

    flask run
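
    Note: flask run binds to 127.0.0.1:5000 by default. To listen on other interfaces or use a different port, you can pass the standard Flask CLI flags, for example:

    flask run --host=0.0.0.0 --port=5000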

Structure

├── app/      # the website interface files, FLASK
├── artemis/  # artemis image captioning package
├── server/   # back-end, image to emotion functionality
└── app.py    # main file that serves the imotion webserver
