
Metahuman in AR

Animation of a metahuman in an augmented reality application

Table of Contents
  1. About the project
  2. Getting Started
  3. Animation of the metahuman
  4. AR implementation
  5. Usage
  6. Demo
  7. Further work
  8. License
  9. Contact
  10. Disclaimer
  11. Acknowledgments

About the project

The project aims to animate a metahuman in an augmented reality application.

This repository hosts the project for the Computer Vision 2021-2022 course @ UniTN, developed by Francesco Laiti and Davide Lobba.

The game engine used is Unity, version 2021.3, and the application was tested on both iOS and Android. The 3D humanoid model used in this project is SMPL-X.
To animate the metahuman, we used the OptiTrack system available at the Multisensory Interactions Lab to track body movements.

We propose a scenario where the metahuman acts as a personal trainer. There are four exercises: warm-up, first training phase, second training phase, and stretching.

Workflow

(workflow diagram)

Getting Started 👨‍💻

  1. Install Unity version 2021.3

    NOTE: a different version is not guaranteed to work properly

  2. Clone this repository

  3. Open the scene cv_2022

  4. Ensure that you have installed the ARCore or ARKit packages in Unity

    Window > Package Manager > install AR Foundation, ARCore XR Plugin, ARKit XR Plugin
  5. Go to

    Edit > Project Settings > XR Plug-in Management

    check the ARKit or ARCore box for your target platform

  6. Go to

    File > Build Settings > choose your platform > switch platform
  7. Build and run the project using the menu entry

    File > Build And Run
  8. Now you are ready to deploy the application on your device! 🚀

Alternatively, if you would like to test the scene without a physical device, you will need to adapt the code and the scene to work with the Unity simulator.

NOTE:

  • On iOS, ARKit requires iOS 11.0 or later and a device with an A9 processor or newer. You also need Xcode, which is only available on macOS 😢
  • On Android, check whether your device supports ARCore at https://developers.google.com/ar/devices (a runtime availability check is sketched below)
  • The application was tested on an iPhone XS running iOS 15.5
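
Besides checking the device list, AR Foundation can also query AR support at runtime. Below is a minimal sketch of such a check using AR Foundation's ARSession.CheckAvailability(); the script itself is our own illustration and is not part of this repository:

using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative runtime check for AR support using AR Foundation's
// ARSession.CheckAvailability(). Not part of this repository.
public class ARSupportCheck : MonoBehaviour
{
    private IEnumerator Start()
    {
        if (ARSession.state == ARSessionState.None ||
            ARSession.state == ARSessionState.CheckingAvailability)
        {
            // Asynchronously query whether the device supports AR.
            yield return ARSession.CheckAvailability();
        }

        if (ARSession.state == ARSessionState.Unsupported)
            Debug.Log("AR is not supported on this device.");
        else
            Debug.Log("AR is supported; the session can start.");
    }
}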

Animation of the metahuman 🕺

For the animation of the metahuman, we used a JSON file structured as follows:

.
├── Frame                # Number of the frame
│   ├── Trans            # Root translation
│   ├── Fullpose         # List of coordinates x,y,z for each joint
│         └── Data       # Coordinates of one joint
└── ...
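
To make this structure concrete, below is a minimal C# sketch of how such a file could be mapped to types and loaded in Unity. All class and field names are our own assumptions for illustration, not the repository's actual code; if the frames are keyed by frame number rather than stored as a plain array, a parser such as Json.NET would be needed instead of Unity's JsonUtility.

using System;
using System.Collections.Generic;
using System.IO;
using UnityEngine;

// Assumed data layout mirroring the JSON structure above; the real
// field names in the repository's SMPL-X script may differ.
[Serializable]
public class JointPose
{
    public float[] Data;          // coordinates of one joint (x, y, z)
}

[Serializable]
public class MotionFrame
{
    public float[] Trans;         // root translation
    public JointPose[] Fullpose;  // one entry per SMPL-X joint
}

public static class MotionLoader
{
    // JsonUtility cannot parse a top-level JSON array, so we wrap the
    // file content in an object before deserializing.
    [Serializable]
    private class Wrapper { public MotionFrame[] Frames; }

    public static List<MotionFrame> Load(string path)
    {
        string json = File.ReadAllText(path);
        Wrapper wrapper = JsonUtility.FromJson<Wrapper>("{\"Frames\":" + json + "}");
        return new List<MotionFrame>(wrapper.Frames);
    }
}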

We recorded the motion as a c3d file using the OptiTrack wearable suit together with a system of cameras and markers; in particular, we used the Motive application to record the movements.
Then, to convert the c3d file into a JSON file, we used SOMA and MoSh++.

We then read the JSON file in the SMPL-X script and animated the metahuman.
The animation starts when the user gets close to the model: we chose a threshold of 4 meters, but you can change it in the SMPL-X script. A sketch of this distance check follows.
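
A minimal sketch of the distance check, assuming a MonoBehaviour attached to the metahuman. The field names and the use of an Animator are illustrative assumptions; the repository's SMPL-X script drives the pose from the JSON data and may be organized differently:

using UnityEngine;

// Illustrative proximity trigger: plays the animation only while the
// AR camera is within triggerDistance of this object. Names and the
// Animator usage are assumptions, not the repository's actual code.
public class ProximityAnimationTrigger : MonoBehaviour
{
    [SerializeField] private float triggerDistance = 4f; // meters, as in the README
    private Animator animator;
    private Camera arCamera;

    private void Start()
    {
        animator = GetComponent<Animator>();
        arCamera = Camera.main; // the AR camera is tagged MainCamera by default
    }

    private void Update()
    {
        float distance = Vector3.Distance(arCamera.transform.position, transform.position);
        // Freeze the animation when the user moves beyond the threshold.
        animator.enabled = distance <= triggerDistance;
    }
}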

AR Implementation 🥽

We tested the whole project in the Unity simulator to detect any errors or bugs before switching to AR.

For the AR implementation, we used the AR Foundation Kit provided by Unity. In particular, we created an AR session and an AR camera. When you start the application, the main camera of the scene will be the camera of your smartphone.

Later, we implemented the ground plane. When you start the application, the camera detects planes in the room and you can choose where to instantiate the metahuman. Note that the metahuman is a child of the ground plane, so its position is defined relative to the plane; a typical tap-to-place pattern is sketched below.
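
For reference, a common AR Foundation tap-to-place pattern looks roughly like this sketch. The prefab and script names are placeholders, and the project's actual placement code may differ; the key point is that the instantiated model is parented to the detected plane:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative tap-to-place: raycasts from the touch point against the
// detected planes and instantiates the metahuman as a child of the hit
// plane. Both managers are assumed to live on the AR Session Origin.
[RequireComponent(typeof(ARRaycastManager))]
public class TapToPlace : MonoBehaviour
{
    [SerializeField] private GameObject metahumanPrefab; // placeholder name
    private ARRaycastManager raycastManager;
    private ARPlaneManager planeManager;
    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    private void Awake()
    {
        raycastManager = GetComponent<ARRaycastManager>();
        planeManager = GetComponent<ARPlaneManager>();
    }

    private void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        if (raycastManager.Raycast(Input.GetTouch(0).position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            ARPlane plane = planeManager.GetPlane(hits[0].trackableId);
            // Parenting to the plane makes the model's position follow it.
            Instantiate(metahumanPrefab, hitPose.position, hitPose.rotation, plane.transform);
        }
    }
}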

Usage 📱

  1. Open the application. The device scans your environment and tries to find a ground plane where the metahuman will be instantiated
  2. Select one of the four modes available
  3. Place the metahuman wherever you want by tapping on the display 👆
  4. The animation is triggered when the camera is near the metahuman, so if you are too far away, move closer to the object. The distance to the metahuman is shown at the top of the screen.
  5. Enjoy the animation! ⚡

Demo

We provide two GIFs to show what the app looks like:

Animation GIF | Stop-and-go GIF

In the first GIF, we instantiate the SMPL-X model at a chosen position and animate it when we get close to it.

In the second GIF, we follow the same steps, but when we move more than 4 meters away from the object, the animation correctly stops.

Further work

  • Implement more animations
  • Add voice to the SMPL-X model
  • Consider a more realistic texture
  • Migrate the project to Unreal Engine

License

Distributed under the MIT License. See LICENSE for more information.

Contact

Francesco Laiti - Github - Linkedin - UniTN email | Davide Lobba - Github - Linkedin - UniTN email

Disclaimer

This project is for educational purposes only.

Acknowledgments

We thank the MMLab and the University of Trento for the opportunity to use the OptiTrack system available at the Multisensory Interactions Lab.

We used the SMPL-X 3D human model, available at https://smpl-x.is.tue.mpg.de/
