Slight refactor of TensorFlow's date-conversion-attention example

This is a slight refactor of TensorFlow's date-conversion-attention example. All the credit goes to the TF team and the people who built the model.

I've just refactored things (in a way that makes more sense to me) while learning the attention model.
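For context, the core idea of the attention model is to score each encoder state against the current decoder query and take a softmax-weighted sum of the values. A minimal plain-JavaScript sketch of dot-product attention (illustrative only; the actual model works on TensorFlow.js tensors, and these function names are mine):

```javascript
// Numerically stable softmax over a plain array.
function softmax(xs) {
  const m = Math.max(...xs);
  const exps = xs.map(x => Math.exp(x - m));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map(e => e / sum);
}

// Dot-product attention: score the query against each key,
// then return the softmax-weighted sum of the value vectors.
// query: number[d], keys/values: number[n][d]
function attend(query, keys, values) {
  const scores = keys.map(k =>
    k.reduce((s, ki, i) => s + ki * query[i], 0));
  const weights = softmax(scores);
  return values[0].map((_, j) =>
    weights.reduce((s, w, i) => s + w * values[i][j], 0));
}
```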

Changes

  • I've reorganized the file structure
  • I've dropped the frontend part as I'm only interested in the model
  • I'm using Jest instead of Jasmine

Running

  • npm run train - Train the model
  • npm run test - Run unit tests
  • npm run flow - Run the model once (via apply()) on an actual input.
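For reference, the upstream example trains a sequence-to-sequence model to convert date strings into ISO format. A hand-written sketch of the target mapping the model learns (the input formats and function name here are my own illustration, not the project's API):

```javascript
// Illustrative reference for the date-conversion task: map a
// DD.MM.YYYY or DD/MM/YYYY string to ISO "YYYY-MM-DD". The trained
// model learns this mapping from examples instead of using a regex.
function toIsoDate(input) {
  const m = input.match(/^(\d{2})[./](\d{2})[./](\d{4})$/);
  if (!m) throw new Error(`Unrecognized date format: ${input}`);
  const [, day, month, year] = m;
  return `${year}-${month}-${day}`;
}
```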

Useful Links
