AI Exercise Assistant
AI Yoga Assistant
Explore the docs »
Report Bug
·
Request Feature
- About the Project
- Getting Started
- Features and Functionality
- Concept Process
- Development Process
- Final Outcome
- Roadmap
- Contributing
- License
- Contact
- Acknowledgements
This is an app that uses AI to detect whether you are performing a chosen yoga pose correctly.
The following instructions will get you a copy of the project up and running on your local machine for development and testing purposes.
Ensure that you have the latest version of Android Studio installed on your machine.
Here are a couple of ways to clone this repo:

- Clone Repository: run the following in the command-line to clone the project:

  `git clone https://github.com/userHP200104/Navi_AI.git`

- Open in Android Studio: open Android Studio and select `File | Open...` from the menu. Select the cloned directory and press the `Open` button.
- Run the following in the command-line to check that everything needed to run Flutter is installed:

  `flutter doctor`

- Run the following in the command-line to get all the required dependencies:

  `flutter pub get`

- Run the following in the command-line to run the app:

  `flutter run`
On the home screen you will find the number of poses completed weekly, your overall accuracy across all your poses, a theme changer, and a dark mode toggle.
The dark mode toggle will switch the theme between dark mode and light mode.
The theme changer allows you to set the theme of the app using a dynamic colour palette.
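The weekly count and overall accuracy shown on the home screen come down to a simple aggregation over recorded sessions. A minimal sketch in Python (the names `PoseSession`, `weekly_count`, and `overall_accuracy` are illustrative, not from the app's codebase):

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class PoseSession:
    day: date        # when the pose was completed
    accuracy: float  # per-session accuracy in [0, 1]

def weekly_count(sessions, today):
    """Number of poses completed in the last 7 days."""
    cutoff = today - timedelta(days=7)
    return sum(1 for s in sessions if s.day > cutoff)

def overall_accuracy(sessions):
    """Mean accuracy across all recorded sessions (0.0 if none)."""
    if not sessions:
        return 0.0
    return sum(s.accuracy for s in sessions) / len(sessions)
```

In the app itself, the equivalent aggregation would run in Dart over the sessions fetched from Firebase.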
This screen allows you to add and delete yoga poses you want to do. It is also an access point for the pose checker.
A bottom sheet comes up that allows you to select the pose you want to add.
This uses ML to track your body's position and check whether you are performing the chosen pose correctly.
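One common way to score a pose is to compare joint angles computed from the detected body keypoints against a reference pose, within some tolerance. This is a sketch of that idea, not the app's actual algorithm; the joint names and tolerance value are assumptions:

```python
import math

def joint_angle(a, b, c):
    """Angle at vertex b (in degrees) formed by 2-D keypoints a-b-c."""
    ang = math.degrees(
        math.atan2(c[1] - b[1], c[0] - b[0])
        - math.atan2(a[1] - b[1], a[0] - b[0])
    )
    ang = abs(ang)
    return 360 - ang if ang > 180 else ang

def pose_matches(detected_angles, reference_angles, tolerance=15.0):
    """True if every joint angle is within `tolerance` degrees of the
    reference pose's angle for that joint."""
    return all(
        abs(detected_angles[j] - reference_angles[j]) <= tolerance
        for j in reference_angles
    )
```

A trained classifier (as used here via TensorFlow Lite) replaces the hand-written reference angles with learned pose classes, but the keypoint-comparison view is a useful mental model for what "doing the pose correctly" means.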
The Development Process covers the technical implementation and functionality of the frontend and backend of the application.
- Project created and Frontend implemented.
- Firebase set up and linked to the project.
- Components made compatible with Material 3.
- Body Detection added to the live camera.
- Poses fetched from the database.
- Yoga Pose Model trained.
- Yoga Pose Model added to the live camera.
- Functionality added to display whether the pose is correct or not.
- I really enjoyed how simple Flutter is to use, considering this was my first time using it. Even though the structure of the code was different, everything works together really seamlessly.
- The biggest challenge that I faced was implementing the Machine Learning Model in the app, since I first had to train my own model on over 1,000 images; but the real problem came when I needed to get the camera to recognise the yoga poses. After a lot of research I figured out how to get the model to work through TensorFlow Lite.
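Getting a trained model to run on-device usually means converting it to the TensorFlow Lite format. A minimal conversion sketch, assuming the pose classifier was trained in Keras (the file names are illustrative, not the project's actual paths):

```python
import tensorflow as tf  # assumes TensorFlow 2.x is installed

# Load the trained pose-classification model (path is illustrative).
model = tf.keras.models.load_model("yoga_pose_classifier.h5")

# Convert to TensorFlow Lite so the model can run on-device.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # optional size/latency optimisation
tflite_model = converter.convert()

# Write the .tflite file, which the Flutter app bundles as an asset.
with open("yoga_pose_classifier.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting `.tflite` file is then loaded in the Flutter app and fed frames from the live camera.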
- Having tutorials to teach you how to do the poses.
- Training a model of 82 yoga poses.
- Customising the look of the app.
To see a run through of the application, click below:
See the open issues for a list of proposed features (and known issues).
Contributions are what make the open-source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
- Hansin Prema - userHP200104 (https://github.com/userHP200104)
Distributed under the MIT License. See LICENSE for more information.
- Hansin Prema - hansinprema@gmail.com
- Project Link - https://github.com/userHP200104/Navi_AI