Feedforward and LSTM neural networks forecast our favourite financial markets. 🌊

ShrutiAppiah/crypto-forecasting-with-neuralnetworks

Forecasting with feedforward and LSTM neural networks

Paper & associated research

Read the research paper.

Highlights

Noise can sometimes be good!

  • Noise helps optimizers escape saddle points and poor local minima
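A minimal sketch of that idea (the cubic objective, step size, and noise scale here are illustrative, not from the paper): plain gradient descent started exactly at a saddle point never moves, because the gradient there is zero, while a small Gaussian perturbation of the gradient pushes the iterate off the plateau:

```python
import numpy as np

def grad(w):
    # f(w) = w**3 has a flat saddle (inflection) at w = 0, where the gradient vanishes
    return 3 * w**2

rng = np.random.default_rng(0)
w_plain = w_noisy = 0.0    # start exactly at the saddle
lr, sigma = 0.01, 0.1      # illustrative learning rate and noise scale

for _ in range(100):
    w_plain -= lr * grad(w_plain)                           # gradient is 0: stuck forever
    w_noisy -= lr * (grad(w_noisy) + sigma * rng.normal())  # noise breaks the tie
```

After 100 steps, `w_plain` is still exactly 0.0 while `w_noisy` has drifted away from the saddle.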

Vanishing gradients are a problem. LSTM units save the day.

  • LSTM (Long Short-Term Memory) neural networks are mindful of long-term dependencies: they carry information from the distant past forward through a gated cell state. Read more about gradient descent.
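A back-of-the-envelope illustration of why (the numeric factors are assumed for illustration): backpropagating through T time steps of a vanilla tanh recurrence multiplies T Jacobian factors that are typically below 1, so the product vanishes; the LSTM cell state is updated additively, and along that path the gradient is a product of forget-gate activations that the network can learn to keep near 1:

```python
# Vanilla RNN: each backprop step multiplies by (recurrent weight) * tanh'(.),
# both typically below 1 -- the product shrinks geometrically.
T = 50
factor = 0.9 * 0.25        # assumed |w| < 1 times a typical tanh derivative
vanilla_grad = factor ** T  # astronomically small: early steps get no signal

# LSTM cell-state path: the gradient is a product of forget-gate values,
# which can sit close to 1 when long-term memory is useful.
f = 0.99                    # assumed learned forget-gate activation
lstm_grad = f ** T          # still a usable gradient after 50 steps
```

The vanilla product is on the order of 1e-33 here, while the LSTM path retains roughly 60% of the signal.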

Adam optimizers recalculate neuron weights using both first- and second-order moment estimates of the gradient

  • Adam optimizers combine ideas from Adaptive Gradient (AdaGrad) and Root Mean Square Propagation (RMSProp).
  • In a distribution, the first-order moment is the mean and the second-order (central) moment is the variance. Adam keeps running estimates of the gradient's mean and of its squared magnitude.
  • AdaGrad is great at handling sparse gradients: it accumulates the sum of squared past gradients and scales each parameter's learning rate by it.
  • RMSProp replaces that ever-growing sum with an exponentially decaying average of squared gradients, i.e. a second-moment estimate. Adam adds a decaying average of the gradients themselves, a first-moment estimate.
  • Combined, the Adam optimizer produces more sensible per-parameter learning rates in each iteration.
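The bullets above can be sketched as a single Adam update in NumPy, following the standard formulation (the hyperparameter defaults are the usual published ones, not values taken from this repo):

```python
import numpy as np

def adam_step(w, g, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for weights w given gradient g at step t (t >= 1)."""
    m = beta1 * m + (1 - beta1) * g       # first moment: decaying mean of gradients
    v = beta2 * v + (1 - beta2) * g**2    # second moment: decaying mean of squared gradients
    m_hat = m / (1 - beta1**t)            # bias correction for zero initialisation
    v_hat = v / (1 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)  # per-parameter adaptive step
    return w, m, v
```

For example, iterating `adam_step` on the gradient of f(w) = w² drives w toward the minimum at 0, with the effective step size adapting as the two moment estimates evolve.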

Overview

Research Summary

License

License: MIT

Copyright (c) 2018 Shruti Appiah
