ML-RockPaperScissors

This project is a simple image classification task using a pre-trained model to predict whether an uploaded image represents a rock, paper, or scissors hand gesture.

Getting Started

Prerequisites

  • Google Colab account (optional if running locally)
  • Python environment with necessary libraries (e.g., NumPy, Keras)
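The libraries above can be installed with pip. A minimal setup might look like the following (package names are assumed from the prerequisites; the repository does not pin versions):

```shell
# Install the assumed dependencies; TensorFlow bundles the Keras API.
pip install numpy tensorflow
```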

Installation

Clone the repository to your local machine:

git clone https://github.com/KevinJonathan30/ML-RockPaperScissors.git
cd ML-RockPaperScissors

If using Google Colab, upload the Jupyter notebook to your Colab environment.

Usage

Open the Jupyter notebook (`.ipynb`) and execute the cells in order. When prompted, upload an image of a hand gesture (rock, paper, or scissors); images with a green background are recommended. The model will predict and display the classification result.
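The prediction step in the notebook can be sketched as follows. This is a minimal, hedged example: the input size (150x150 RGB), the class order, and the untrained stand-in model are all assumptions, since the actual trained model lives in the notebook and is not described here.

```python
import numpy as np
from tensorflow import keras

# Assumed class order (alphabetical); the notebook's label mapping may differ.
CLASS_NAMES = ["paper", "rock", "scissors"]

# Untrained stand-in model; in the notebook you would use the trained model.
model = keras.Sequential([
    keras.layers.Input(shape=(150, 150, 3)),
    keras.layers.Conv2D(16, 3, activation="relu"),
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dense(3, activation="softmax"),
])

def predict_gesture(image_array):
    """Classify a single HxWx3 image array with values scaled to [0, 1]."""
    batch = np.expand_dims(image_array, axis=0)   # add a batch dimension
    probs = model.predict(batch, verbose=0)[0]    # softmax class probabilities
    return CLASS_NAMES[int(np.argmax(probs))]

# Example with a random array standing in for an uploaded photo:
dummy = np.random.rand(150, 150, 3).astype("float32")
print(predict_gesture(dummy))
```

In Colab, the uploaded file would first be decoded into such an array (for example with `keras.utils.load_img` followed by `keras.utils.img_to_array` and scaling by 1/255) before being passed to the model.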
