Repository contents:

- Examples
- .gitignore
- README.md
- SkogstadCoefficients.txt
- mo.acceleration.maxpat
- mo.boundingBox.maxpat
- mo.centerOfMass.maxpat
- mo.centroid.maxpat
- mo.contractionIndex.maxpat
- mo.distance.maxpat
- mo.fluidity.maxpat
- mo.group.maxpat
- mo.jerk.maxpat
- mo.newStaticPoint.maxpat
- mo.positionLPF.maxpat
- mo.qom.maxpat
- mo.qtm3D.maxpat
- mo.qtm6Deuler.maxpat
- mo.qtmSig.maxpat
- mo.setWeights.maxpat
- mo.tkeo.maxpat
- mo.velocity.maxpat
- mo.zcr.maxpat
- modosc_NIME_2018.pdf

modosc

modosc is a set of Max abstractions designed for computing motion descriptors from raw motion capture data in real time. The library contains methods for extracting descriptors useful for expressive movement analysis and sonic interaction design.
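
As a rough illustration of what such descriptors compute, the sketch below implements two of them, per-marker velocity and a weighted quantity of motion, in plain Python. This is a conceptual example only, not modosc's implementation: the frame layout, function names, and weighting scheme are assumptions made for the sake of the sketch.

```python
# Conceptual sketch only -- not modosc's implementation.
# Frame layout, names, and weighting are illustrative assumptions.
import numpy as np

def velocity(frames, rate_hz):
    """First-order finite-difference velocity.

    frames: (n_frames, n_markers, 3) array of raw 3D marker positions.
    Returns a (n_frames - 1, n_markers, 3) array.
    """
    return np.diff(frames, axis=0) * rate_hz

def quantity_of_motion(frames, rate_hz, weights=None):
    """Weighted sum of marker speeds per frame, one common QoM formulation."""
    speeds = np.linalg.norm(velocity(frames, rate_hz), axis=2)
    if weights is None:
        weights = np.ones(speeds.shape[1])  # equal weight per marker
    return speeds @ weights

# Toy usage: 100 frames of 3 markers captured at 120 Hz
rng = np.random.default_rng(0)
frames = rng.normal(size=(100, 3, 3)).cumsum(axis=0)  # a random walk
print(quantity_of_motion(frames, rate_hz=120.0))
```

In modosc itself the corresponding building blocks are abstractions such as mo.velocity, mo.qom, and mo.setWeights, which operate on real-time data streams inside Max rather than on stored arrays.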

VIDEO TUTORIALS: https://www.youtube.com/playlist?list=PLMrDazzs9wCQET95Mel3v_Ujmq0uP7XCT

Using modosc requires the o.dot externals for Max. The official o.dot releases can be found at http://cnmat.berkeley.edu/downloads. However, if you are running Windows in 64-bit mode, you will need a more recent beta release from the o.dot GitHub page: https://github.com/CNMAT/CNMAT-odot/releases
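
Since the data exchange happens over OSC (which is what the o.dot externals process inside Max), a quick way to exercise a patch is to stream fake marker frames at it from outside. The snippet below does this with the python-osc package; the port number and the address pattern are hypothetical placeholders, not modosc's actual input format, which is documented in the wiki linked below.

```python
# Hypothetical test sender -- the port and OSC address pattern below are
# placeholders, not modosc's actual input format (see the wiki).
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 7400)  # assumed host/port of the Max patch

# Stream ten fake frames of a single 3D marker at roughly 120 Hz
for i in range(10):
    x, y, z = 0.01 * i, 0.0, 1.2
    client.send_message("/mocap/marker/1/xyz", [x, y, z])
    time.sleep(1.0 / 120.0)
```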

More details can be found in the wiki: https://github.com/motiondescriptors/modosc/wiki

For more information, see the following two papers on the initial release of modosc:

F. Visi and L. Dahl, "Real-Time Motion Capture Analysis and Music Interaction with the Modosc Descriptor Library," in Proceedings of the International Conference on New Interfaces for Musical Expression (NIME'18), 2018. (Included in this repository as modosc_NIME_2018.pdf.)

L. Dahl and F. Visi, "Modosc: A Library of Real-Time Movement Descriptors for Marker-Based Motion Capture," in Proceedings of the 5th International Conference on Movement and Computing (MOCO'18), 2018. (Available at https://dl.acm.org/citation.cfm?id=3212842.)