IonZhao/Transformer_Implement


Packages required

pip install torch

pip install numpy

Implementation

Implemented with NumPy and PyTorch, including:

  • Multi-head self-attention
  • Basic positional embedding (increasing linear values)
  • Feed-forward layer
  • Add & norm (residual connections)
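
The repository's source isn't shown here, so as a rough illustration only, the first bullet could look like the following minimal PyTorch sketch of multi-head self-attention (class and parameter names are assumptions, not the repo's actual code):

```python
import torch
import torch.nn as nn

class MultiHeadSelfAttention(nn.Module):
    # Hypothetical sketch: d_model must be divisible by n_heads.
    def __init__(self, d_model, n_heads):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)  # fused Q, K, V projection
        self.out = nn.Linear(d_model, d_model)      # output projection

    def forward(self, x):
        # x: (batch, seq_len, d_model)
        b, t, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # split into heads: (batch, n_heads, seq_len, d_head)
        q = q.view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        k = k.view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        v = v.view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        # scaled dot-product attention
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5
        attn = scores.softmax(dim=-1)
        # merge heads back to (batch, seq_len, d_model)
        out = (attn @ v).transpose(1, 2).reshape(b, t, d)
        return self.out(out)
```

In a full block, this layer would be wrapped with the residual add & norm and followed by the feed-forward layer from the remaining bullets.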

Reference: “Attention and Transformer Models. ‘Attention Is All You Need’ was a…” by Helene_k, Towards Data Science

Analysis

The model is used for content generation. Since it was trained on financial discourse, the generated text stays in that domain.

We evaluate the model's performance with an intrinsic metric: perplexity.
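
Perplexity is the exponential of the mean per-token cross-entropy, so (assuming the model outputs raw logits) it can be computed with a short helper like this sketch; the function name and shapes here are illustrative, not taken from the repository:

```python
import torch
import torch.nn.functional as F

def perplexity(logits, targets):
    # logits: (n_tokens, vocab_size) raw model outputs
    # targets: (n_tokens,) ground-truth token ids
    # perplexity = exp(mean cross-entropy over tokens)
    return torch.exp(F.cross_entropy(logits, targets)).item()
```

As a sanity check, a model that is uniformly uncertain over a vocabulary of size V has perplexity exactly V; lower values mean the model assigns higher probability to the observed text.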
