
MediGesture - immersive meditation with multimodal interaction

Immersive multimodal meditation: a tool for learning and experiencing meditation mudras.

View the latest version in your own browser

Abstract

Immersive Multimodal Meditation (IMM) is a multimodal application that lets users control the meditation process with voice and gesture and customize their own experience. It is also an educational tool that introduces mudras to anyone who wants a more original and fun meditation practice with them. The interface is built with p5.js, and voice control is driven by p5.speech.js, so users can switch between meditation modes using natural language. Webcam-based hand detection lets users pose mudras, each of which triggers a customized chime sound for a personalized meditation experience.
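As a rough illustration of the voice-control loop, here is a minimal sketch of natural-language mode switching with p5.speech.js. The mode names ('ocean', 'forest') and the handleSpeech helper are hypothetical examples, not IMM's actual command set.

  // Minimal sketch: continuous speech recognition with p5.speech.js.
  // The mode names below are illustrative, not IMM's real commands.
  let speechRec;
  let mode = 'default';

  function setup() {
    createCanvas(640, 480);
    speechRec = new p5.SpeechRec('en-US');
    speechRec.continuous = true;      // keep listening between phrases
    speechRec.interimResults = false; // only react to final results
    speechRec.onResult = handleSpeech;
    speechRec.start();
  }

  function handleSpeech() {
    const said = speechRec.resultString.toLowerCase();
    if (said.includes('ocean')) mode = 'ocean';
    else if (said.includes('forest')) mode = 'forest';
  }

  function draw() {
    // Each mode would swap in its own visuals and sounds.
    background(mode === 'ocean' ? 'navy' : mode === 'forest' ? 'darkgreen' : 30);
  }

Because p5.speech.js wraps the browser's Web Speech API, the page must be served over HTTP(S) and the user must grant microphone access.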

System Architecture

Features

  • supports a sitting-on-the-ground setup, using voice commands and camera recognition of customized gestures;
  • offers meditation options, each with its own visuals and sounds;
  • includes detailed learning instructions that introduce mudras to meditation or mudra beginners;
  • provides an instruction pop-up for the mudras that can be called up and dismissed at any time through gesture (see the sketch below).
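The README does not name the hand-detection library, so the following is only a sketch, assuming ml5.js handPose running on a p5.js webcam capture. The raised-index-finger gesture, the 150-pixel threshold, and the pop-up drawing are hypothetical stand-ins for IMM's real mudra recognition.

  // Minimal sketch, assuming ml5.js handPose (an assumption; the README
  // does not name the hand-detection library). A raised index finger
  // toggles a hypothetical instruction pop-up.
  let handPose, video;
  let hands = [];
  let showPopup = false;
  let lastToggle = 0;

  function preload() {
    handPose = ml5.handPose();
  }

  function setup() {
    createCanvas(640, 480);
    video = createCapture(VIDEO);
    video.size(640, 480);
    video.hide();
    handPose.detectStart(video, (results) => { hands = results; });
  }

  function draw() {
    image(video, 0, 0);
    if (hands.length > 0) {
      const wrist = hands[0].keypoints.find((k) => k.name === 'wrist');
      const tip = hands[0].keypoints.find((k) => k.name === 'index_finger_tip');
      // Hypothetical gesture: index fingertip well above the wrist.
      if (tip && wrist && wrist.y - tip.y > 150 && millis() - lastToggle > 1000) {
        showPopup = !showPopup; // call up or dismiss the pop-up
        lastToggle = millis();  // debounce so one gesture toggles once
      }
    }
    if (showPopup) {
      fill(0, 180);
      rect(80, 80, 480, 320);
      fill(255);
      text('Mudra instructions...', 100, 120);
    }
  }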

IMM uses p5.js for the interface, p5.speech.js for voice control, and webcam-based hand detection for mudra recognition.

Run the app

Serve the project root from a local HTTP server (browsers treat http://localhost as a secure context, which webcam and microphone access requires), then open http://localhost:8000:

python -m http.server
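Any static file server works as well; for example, with Node.js installed (the http-server package is an alternative, not part of this repo):

  npx http-server -p 8000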

Contributors

MediGesture - Immersive Multimodal Meditation is a project by Ziyuan Zhu and Kat Labrou
