
Dino Run Tutorial

A Deep Convolutional Neural Network that plays Google Chrome's offline Dino Run game by learning action patterns from visual input, using a model-free Reinforcement Learning algorithm.

Accompanying code for Paperspace tutorial "Build an AI to play Dino Run"
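As a rough illustration of what such a network looks like, the Keras sketch below maps a stack of preprocessed grayscale game frames to one value per action (jump or do nothing), DQN-style. The input shape (80x80x4), layer sizes, and two-action assumption are illustrative defaults, not the notebook's exact architecture.

from keras.models import Sequential
from keras.layers import Conv2D, Dense, Flatten
from keras.optimizers import Adam

def build_q_network(input_shape=(80, 80, 4), num_actions=2):
    # ConvNet: stacked grayscale frames in, one value per action out
    model = Sequential([
        Conv2D(32, (8, 8), strides=4, activation='relu', padding='same',
               input_shape=input_shape),
        Conv2D(64, (4, 4), strides=2, activation='relu', padding='same'),
        Conv2D(64, (3, 3), strides=1, activation='relu', padding='same'),
        Flatten(),
        Dense(512, activation='relu'),
        Dense(num_actions),           # linear action-value estimates
    ])
    model.compile(loss='mse', optimizer=Adam(1e-4))
    return model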



Video Sample

Installation

Start by cloning the repository

$ git clone https://github.com/Paperspace/DinoRunTutorial.git
You need to initialize the file system used to save progress and resume from the last step.
Do this by invoking init_cache() once, on the first run (a sketch of what this involves is shown below).
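init_cache() is defined in the repository's notebook. The sketch below only illustrates the kind of first-time bookkeeping such a function performs; the objects/ directory, the file names, and the initial values here are assumptions for illustration, not the notebook's exact code.

import os
import pickle
from collections import deque

INITIAL_EPSILON = 0.1   # assumed starting exploration rate

def save_obj(obj, name):
    # Pickle an object under ./objects/ so training can pause and resume
    with open('objects/' + name + '.pkl', 'wb') as f:
        pickle.dump(obj, f, pickle.HIGHEST_PROTOCOL)

def init_cache():
    # One-time setup of the files that store training progress
    os.makedirs('objects', exist_ok=True)
    save_obj(INITIAL_EPSILON, 'epsilon')   # current exploration rate
    save_obj(0, 'time')                    # training step counter
    save_obj(deque(), 'D')                 # experience replay memory

init_cache()   # run once on a fresh clone; later runs resume from the saved files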

Dependencies can be installed with pip install, or with conda install in an Anaconda environment (see the example command after this list):

  • Python 3.6 environment with ML libraries installed (numpy, pandas, keras, tensorflow, etc.)
  • Selenium
  • OpenCV
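For example, a typical pip command covering the list above (using the standard PyPI package names, e.g. opencv-python for OpenCV) would be:

$ pip install numpy pandas keras tensorflow selenium opencv-python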
