
vhuni/Data_Project


Stock Prediction

Creates a model to predict daily stock prices of Amazon.

Note: Project in early development stage.
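Since the model is still in flux, here is a toy illustration of the prediction task only, not the repository's actual model (which lives in stock_prediction.ipynb): a naive baseline that forecasts tomorrow's close as the moving average of the last k closes. All numbers are made up.

```python
# Toy baseline for illustration only -- the real model is in
# stock_prediction.ipynb. Forecasts the next close as the mean
# of the last k closing prices (a simple moving average).

def moving_average_forecast(closes, k=3):
    """Return a naive next-day forecast from the last k closes."""
    if len(closes) < k:
        raise ValueError(f"need at least {k} prices")
    window = closes[-k:]
    return sum(window) / k

# Example with made-up prices:
prices = [130.0, 131.5, 129.0, 132.5, 133.0]
print(moving_average_forecast(prices))  # -> 131.5
```

Any real model would be measured against a baseline like this one.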

Requirements

  • Amazon Web Services account: an EC2 instance and an S3 bucket
  • Amazon IAM key: provides the .pem file, access_key, and secret_key
  • Docker: pull the TensorFlow image from Docker Hub to run the modelling in a container
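The access_key and secret_key from the IAM credentials are what boto3 uses to reach the S3 bucket. One common place to put them on the EC2 instance (an assumption about this project's setup, not a step the repository prescribes) is the standard AWS credentials file:

```ini
; ~/.aws/credentials -- standard location read by boto3
[default]
aws_access_key_id = <access_key from the IAM credentials>
aws_secret_access_key = <secret_key from the IAM credentials>
```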

Usage

Clone or fork this repository.

  1. Run Docker and pull the TensorFlow image (the latest-jupyter tag bundles TensorFlow with Jupyter Notebook).
docker pull tensorflow/tensorflow:latest-jupyter
  2. (From any terminal) Create a container from the TensorFlow image. Publishing port 8888 lets you reach the notebook server; the container prints the notebook URL (with token) on startup.
docker run -it --name tf -p 8888:8888 tensorflow/tensorflow:latest-jupyter
  3. Clone or fork the repository (leave the directory open).
  4. Open Jupyter Notebook in your browser (the notebook link is printed by the container from step 2).
  5. Copy stock_prediction.ipynb from the Data_project folder into the notebook (leave the browser open).
  6. Create an AWS EC2 instance and an S3 bucket (open the AWS console in a new browser tab). Skip to step 6c if you already have an EC2 instance and S3 bucket.
    6a. Download the .pem key when creating the EC2 instance.
    6b. Download the CSV file when creating the S3 bucket.
    6c. Select the instance.
    6d. Under the Connect tab, select SSH and copy the command.
    example:
ssh -i "new_ssh.pem" ubuntu@ec2-3-135-190-123.us-east-2.compute.amazonaws.com
  7. Open Windows PowerShell and paste the SSH command, as in the example (you will be logged in as ubuntu or ec2-user, depending on the AMI).
  8. Copy scraper.py to the EC2 instance.
    example:
scp -i "new_ssh.pem" "scraper.py" ubuntu@ec2-3-135-190-123.us-east-2.compute.amazonaws.com:/home/ubuntu/
  9. Create a virtual environment and install the requirements with pip.
    example:
python3 -m venv my_app/env          # create the venv
source ~/my_app/env/bin/activate    # activate it (prompt shows (env))
pip3 install pandas boto3 requests beautifulsoup4

9a. (Optional) Test scraper.py. Always check that the instance can reach the S3 bucket, otherwise the scraped data will not be stored.

python3 scraper.py
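The real code is scraper.py in this repository; as a hedged sketch only (the function names, regex, and CSV layout here are illustrative assumptions, not the repo's actual code), the core of such a scraper is: parse a price out of the fetched page, append it to a dated CSV row, then upload with boto3.

```python
# Hypothetical sketch of a price scraper's core logic -- see
# scraper.py in this repository for the real implementation.
import csv
import datetime
import re

def parse_price(html: str) -> float:
    """Pull the first $-prefixed price out of a page snippet."""
    match = re.search(r"\$([0-9,]+\.[0-9]{2})", html)
    if match is None:
        raise ValueError("no price found in page")
    return float(match.group(1).replace(",", ""))

def append_row(path: str, price: float) -> None:
    """Append today's date and the price to a local CSV."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([datetime.date.today().isoformat(), price])

# Uploading the CSV to S3 would then be a single boto3 call, e.g.:
#   boto3.client("s3").upload_file(path, "<your-bucket>", "amzn.csv")
```

If the upload step fails silently, nothing reaches the bucket, which is why the test run above matters.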
  10. Create a cron task for scraping Amazon stock prices (Windows PowerShell can be closed once this step is finished).
    Open the crontab:
sudo crontab -e

Add a schedule entry on the last line of the crontab
example:

0 9 * * * /home/ubuntu/my_app/env/bin/python /home/ubuntu/scraper.py 2>&1 | logger -t mycmd
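The five leading crontab fields are minute, hour, day-of-month, month, and day-of-week, so `0 9 * * *` fires once a day at 09:00 server time. A small sanity check of that reading (plain Python for illustration, not part of the project):

```python
from datetime import datetime

def matches_0_9_star_star_star(ts: datetime) -> bool:
    """True when ts falls on the cron schedule `0 9 * * *`:
    minute 0, hour 9, any day, any month, any weekday."""
    return ts.minute == 0 and ts.hour == 9

print(matches_0_9_star_star_star(datetime(2023, 5, 1, 9, 0)))    # True
print(matches_0_9_star_star_star(datetime(2023, 5, 1, 14, 30)))  # False
```

The trailing `2>&1 | logger -t mycmd` sends the scraper's output to syslog under the tag mycmd, so you can check the runs there.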

Great! You created a scheduled Amazon stock price scraper!

  11. (Return to the Jupyter notebook) Run all cells in stock_prediction.ipynb.

  12. If successful, you will see images similar to these in your notebook:

Actual stock prices

Prediction result
