Immersive Multimodal Meditation, a mudra learning & meditation experience tool
View the latest version in your browser
Immersive Multimodal Meditation (IMM) is a multimodal application that lets users control the meditation process with voice and gestures and customize their own experience. It is also an educational tool that introduces mudras to anyone who wants a more authentic and playful meditation with them. The interface is built with p5.js, and voice control is driven by p5.speech.js, so users can switch between meditation modes with natural language. Webcam-based hand detection lets users pose mudras, each triggering a customized chime sound for a personalized meditation experience.
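As an illustration of the voice layer, here is a minimal sketch of continuous voice-command handling with p5.speech.js; the mode keywords ("rain", "forest") are hypothetical placeholders, not the project's actual command set.

```javascript
// Minimal sketch: switching meditation modes with p5.speech.js.
// "rain" and "forest" are hypothetical placeholder commands.
let speechRec;
let mode = 'default';

function setup() {
  createCanvas(640, 480);
  // Recognize speech continuously; gotSpeech fires for each recognized phrase.
  speechRec = new p5.SpeechRec('en-US', gotSpeech);
  speechRec.start(true, false); // continuous = true, interimResults = false
}

function gotSpeech() {
  if (!speechRec.resultValue) return;
  const said = speechRec.resultString.toLowerCase();
  // Map recognized keywords to meditation modes.
  if (said.includes('rain')) mode = 'rain';
  else if (said.includes('forest')) mode = 'forest';
}

function draw() {
  background(mode === 'rain' ? 40 : 70);
  fill(255);
  text('mode: ' + mode, 20, 30);
}
```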
- supports a sitting-on-the-ground setup with voice commands and camera recognition of customized gestures (sketched below);
- offers meditation options, each with its own visuals and sounds;
- includes detailed learning instructions that introduce mudras to meditation and mudra beginners;
- provides a mudra instructions pop-up that can be shown and dismissed at any time with a gesture.
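The README does not name the hand-tracking library, so the sketch below is an assumption-laden illustration: it uses ml5.js handPose plus p5.sound with a hypothetical `chime.mp3` asset, and it plays a chime when the thumb and index fingertips touch (a Gyan-mudra-like pose).

```javascript
// Minimal sketch: webcam mudra detection triggering a chime.
// Assumes ml5.js handPose and p5.sound; "chime.mp3" is a hypothetical asset.
let handPose, video, chime;
let hands = [];
let lastChime = 0;

function preload() {
  handPose = ml5.handPose();
  chime = loadSound('chime.mp3'); // hypothetical asset path
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  // Continuously detect hands in the webcam feed.
  handPose.detectStart(video, (results) => (hands = results));
}

function draw() {
  image(video, 0, 0);
  if (hands.length > 0) {
    const kp = hands[0].keypoints;
    const thumb = kp.find((k) => k.name === 'thumb_tip');
    const index = kp.find((k) => k.name === 'index_finger_tip');
    // Gyan-mudra-like pose: thumb tip and index tip touch.
    // Rate-limit the chime so it doesn't retrigger every frame.
    if (dist(thumb.x, thumb.y, index.x, index.y) < 25 &&
        millis() - lastChime > 2000) {
      chime.play();
      lastChime = millis();
    }
  }
}
```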
To run the project locally, serve the folder with a simple static web server and open http://localhost:8000 in your browser:

    python -m http.server
MediGesture - Immersive Multimodal Meditation is a project by Ziyuan Zhu and Kat Labrou.