
M-MOVE-IT is a multi-type data labeling and annotation tool with a standardized output format. This tool simplifies the process of multimodal annotation and synchronizes sensors. This repo is forked from Label Studio.

M-MOVE-IT: Multimodal Machine Observation and Video-Enhanced Integration Tool for Data Annotation

M-MOVE-IT is built on the open-source data labeling tool Label Studio (LS). Version 1.4.1 of Label Studio was forked and extra features were added to it. The workflow overview, project management, and sensor data parsing were then improved.

The LS version matters. We use an older version because it still supports a ‘hack’ that allows time series to be synchronized with videos during annotation. How this feature works is explained later in this documentation. What is important for now is that upgrading the fork of LS to a version higher than v1.4.1 disables this functionality.

Watch this seminar presentation that explains the motivation for this project and gives a live demo of this tool.

For those interested in contributing, please refer to CONTRIBUTING.md.

Install M-MOVE-IT

# Set up a virtual environment (Python 3.10)
pip install virtualenv  # only if virtualenv is not yet installed
python -m venv <venv-name>
<venv-name>/Scripts/activate       # Windows
# source <venv-name>/bin/activate  # Linux/macOS
# Clone the repository
git clone https://github.com/AI-Sensus/label-studio
# Go to the directory label-studio
cd label-studio
# Install all package dependencies
pip install -e .
# Run database migrations. This creates the database for Django
python label_studio/manage.py migrate
# Configure the static files from the React project
python label_studio/manage.py collectstatic
# Run the server locally at http://localhost:8080
python label_studio/manage.py runserver

After installing M-MOVE-IT, sign up for Label Studio on the first page shown when running the app.

Exiftool

M-MOVE-IT uses Exiftool for metadata handling. To ensure proper functionality, verify that Exiftool is installed by running exiftool -ver in the terminal. If Exiftool is not installed, visit the Exiftool installation page for instructions. After installation, confirm that Exiftool has been added to the system path by running exiftool -ver again.
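A quick programmatic check along these lines can confirm Exiftool is reachable before starting the server. This is a convenience sketch, not part of M-MOVE-IT itself:

```python
import shutil
import subprocess

def exiftool_available():
    """Return True if exiftool is on the system PATH."""
    return shutil.which("exiftool") is not None

def exiftool_version():
    """Return the installed Exiftool version string, or None if missing."""
    if not exiftool_available():
        return None
    result = subprocess.run(["exiftool", "-ver"], capture_output=True, text=True)
    return result.stdout.strip()
```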

Landingpage

The ‘landingpage’ app provides the overview of all projects, called the dashboard, as well as the project pages of all projects. On a project page one can find all functionality, explained as the steps one should take.

When one creates a project in the dashboard, four LS subprojects are created for it in the backend. An LS project is what one would normally use in LS to upload, annotate, and export data. LS always shows all the data in a project and allows one annotation setup per project. However, since we cut up the files and use annotation for several use cases, we chose to use the LS project structure for our data management and create several LS projects per M-MOVE-IT project. The M-MOVE-IT projects are stored as Django models. The following LS subprojects are created: data import, subject annotation, activity annotation, and offset annotation. The app also handles the export of a finished project by exporting all annotations as a JSON file together with the corresponding data files.
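The export step described above can be sketched as follows. This is a hypothetical illustration, not the repository's actual export code; the function name and archive layout are assumptions:

```python
import json
import zipfile

def export_project(annotations, data_files, out_path):
    """Bundle a project's annotations (a list of dicts) and its data
    files into a single zip archive, annotations serialized as JSON."""
    with zipfile.ZipFile(out_path, "w") as zf:
        zf.writestr("annotations.json", json.dumps(annotations, indent=2))
        for path in data_files:
            zf.write(path)  # include the corresponding data file
```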

The ‘landingpage’ app also contains the static styling used by the M-MOVE-IT HTML pages, stored in ‘landingpage/static/main.css’.

Sensormodel

The ‘sensormodel’ app handles the sensors, the subjects, and the relationships between them per project. It stores them as objects and uses a ForeignKey relationship with the project to ensure all sensors and subjects are tied to a specific project. It contains the following models: Sensor, SensorType, Subject, and Deployment. The view methods handle adding, adjusting, and deleting sensors, subjects, and deployments. A SensorType contains the sensor-specific information needed to parse that sensor's data. SensorTypes differ from the other models in that they are stored in label-studio/sensortypes as .yaml files and are parsed into Django models by the method sync_sensor_parser_templates(). This method checks for updates to the files in the label-studio/sensortypes folder and adds or updates the SensorType objects accordingly.
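The upsert logic of sync_sensor_parser_templates() can be sketched like this. The real method works on Django models and YAML files; here a plain dict stands in for the database, and the template fields shown are assumptions:

```python
def sync_sensor_types(store, parsed_templates):
    """Add new or update changed sensor-type definitions.

    `store` maps a sensor-type name to its current definition;
    `parsed_templates` is a list of dicts parsed from the .yaml files
    in label-studio/sensortypes."""
    for tpl in parsed_templates:
        if store.get(tpl["name"]) != tpl:
            store[tpl["name"]] = tpl  # create or refresh the SensorType
    return store
```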

Sensordata

The ‘sensordata’ app handles all imported sensor data and stores it in the ‘dataimport’ subproject. It extracts all data files from an uploaded zip file and parses the sensor information from the data. It then creates a SensorData object for each file. The uploaded zip file should contain only files from the same sensor.
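The extraction step can be sketched as below. This is a simplified illustration; the actual app also parses sensor metadata from each file using the project's SensorType information:

```python
import pathlib
import zipfile

def extract_sensor_zip(zip_path, dest_dir):
    """Extract every file from an uploaded zip and return the extracted
    file paths, ready to be parsed into SensorData objects."""
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest_dir)
    return sorted(p for p in pathlib.Path(dest_dir).rglob("*") if p.is_file())
```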

The ‘sensordata’ app also handles the offsets between sensors. When a project is created, a subproject ‘offsetannotation’ is created. This subproject is used to annotate the offset between overlapping data from different sensors; these offsets are stored as Django objects. For more information on offset annotation, see ‘Efficient Synchronization of Video and IMU Data for Activity Recognition’.

Subjectannotation

The ‘subjectannotation’ app handles the ‘subjectannotation’ subproject and creates annotation labels from all project subjects. The MP4 data is converted into annotation tasks inside this subproject. The subject presence annotations are automatically parsed into SubjectPresence objects when activity annotation tasks are generated in the ‘taskgeneration’ app; the parsing method itself lives in the ‘subjectannotation’ app.

Taskgeneration

The ‘taskgeneration’ app generates tasks by selecting all data in which the input subject is present and slicing it into segments of a chosen length. The MP4 and CSV data are synchronized using the SensorOffset objects. Using the offset and the datetimes parsed when the sensor data was uploaded, the actual overlap between two SensorData objects can be determined; these overlaps are stored as SensorOverlap objects. Using these objects and a user-chosen segment length, segments can be cut, uploaded, and turned into tasks.
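The overlap computation can be sketched as follows. This is a hypothetical illustration of the idea, assuming the annotated offset expresses how far sensor B's clock lags behind sensor A's; the repository's actual SensorOverlap logic may differ in details:

```python
from datetime import datetime, timedelta

def compute_overlap(a_start, a_end, b_start, b_end, offset):
    """Return the (start, end) interval covered by both recordings on
    sensor A's clock, or None if they do not overlap."""
    # Shift B's timeline by the annotated offset so both recordings
    # are expressed on the same clock.
    b_start, b_end = b_start + offset, b_end + offset
    start, end = max(a_start, b_start), min(a_end, b_end)
    return (start, end) if start < end else None
```

Once an overlap interval is known, it can be divided into fixed-length segments for task generation.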

Time series and video annotation: Synchronization trick

The power of the M-MOVE-IT annotation tool is the ability to annotate several sensor modalities at the same time, synchronized. LS does not offer this functionality out of the box. An old post on the LS site, which has since been deleted (it was replaced on the site on 2024-03-12), describes a way to synchronize a time series with a video. As mentioned earlier, this functionality was removed from LS in versions later than 1.4.1, so sticking to this version is important.

The synchronization works one way: changing the timestamp of the time series moves the timestamp of the video, but changing the timestamp of the video does not move the timestamp of the time series.

Click here for more explanation

LABEL-STUDIO

If you're interested in exploring the original README for Label Studio, you can click here. It contains detailed information about the original Label Studio software and its features.
