This project is a simple image classification task using a pre-trained model to predict whether an uploaded image represents a rock, paper, or scissors hand gesture.
- Google Colab account (optional if running locally)
- Python environment with necessary libraries (e.g., NumPy, Keras)
Clone the repository to your local machine:
git clone https://github.com/KevinJonathan30/ML-RockPaperScissors.git
cd ML-RockPaperScissors
If using Google Colab, upload the Jupyter notebook to your Colab environment.
Open the Jupyter notebook (the `.ipynb` file) and execute the cells in order. When prompted, upload an image of a hand gesture (rock, paper, or scissors); images with a green background are recommended. The model will predict the gesture and display the classification result.
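The prediction step above can be sketched in plain Python. This is a minimal, self-contained illustration, not the notebook's actual code: the tiny untrained CNN, the 150x150 input size, and the alphabetical class order are all assumptions standing in for the notebook's trained model.

```python
import numpy as np
from tensorflow.keras import layers, models

# Hypothetical stand-in: the notebook would use its trained model here.
# A tiny untrained CNN keeps this sketch self-contained and runnable.
model = models.Sequential([
    layers.Input(shape=(150, 150, 3)),       # assumed input size
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(3, activation="softmax"),   # rock, paper, scissors
])

CLASS_NAMES = ["paper", "rock", "scissors"]  # assumed alphabetical label order

def classify(image_array):
    """Classify a (150, 150, 3) RGB image array with pixel values in [0, 255]."""
    batch = np.expand_dims(image_array / 255.0, axis=0)  # normalize, add batch dim
    probs = model.predict(batch, verbose=0)[0]           # per-class probabilities
    return CLASS_NAMES[int(np.argmax(probs))]

# Random dummy image in place of a real uploaded photo
print(classify(np.random.randint(0, 256, (150, 150, 3)).astype("float32")))
```

With an untrained model the printed label is arbitrary; in the notebook, the trained weights make the output meaningful.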