Neura

A transformer-based neural network built with the PyTorch deep learning library. This code defines a transformer-based neural network called TransformerModel that takes in a sequence of integers and predicts the next integer in the sequence. The network is composed of an embedding layer, a positional encoding layer, a transformer encoder, and a fully connected output layer.

The PositionalEncoding layer calculates the positional encoding vectors, which are added to the input embeddings to give the neural network information about the position of each token in the sequence.
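
The sketch below shows one way such a PositionalEncoding module is commonly written in PyTorch, using the standard sinusoidal scheme. It is a minimal illustration consistent with the description above, not the repository's exact code; the hyperparameters (max_len, dropout) and the batch-first tensor layout are assumptions.

```python
import math
import torch
import torch.nn as nn

class PositionalEncoding(nn.Module):
    """Adds sinusoidal positional encodings to the input embeddings."""

    def __init__(self, d_model, max_len=5000, dropout=0.1):
        super().__init__()
        self.dropout = nn.Dropout(p=dropout)
        position = torch.arange(max_len).unsqueeze(1)                  # (max_len, 1)
        div_term = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(position * div_term)                   # even dimensions
        pe[:, 1::2] = torch.cos(position * div_term)                   # odd dimensions
        self.register_buffer("pe", pe.unsqueeze(0))                    # (1, max_len, d_model)

    def forward(self, x):
        # x: (batch, seq_len, d_model); add the encoding for each position, then dropout
        x = x + self.pe[:, : x.size(1)]
        return self.dropout(x)
```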

The TransformerModel module applies the embedding layer, the positional encoding layer, and the transformer encoder to the input sequence to produce a set of hidden states. The fully connected layer is then applied to the last hidden state to produce the output predictions, as in the sketch below.
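
The following is a minimal sketch of a model matching that description, built on the PositionalEncoding module above and PyTorch's nn.TransformerEncoder. The class name TransformerModel follows the README, but the specific hyperparameter values and the batch_first layout are illustrative assumptions rather than the repository's actual configuration.

```python
class TransformerModel(nn.Module):
    """Embeds an integer sequence, adds positional encodings, runs a transformer
    encoder, and predicts the next integer from the last hidden state."""

    def __init__(self, vocab_size, d_model=128, nhead=4, num_layers=2,
                 dim_feedforward=256, dropout=0.1):
        super().__init__()
        self.d_model = d_model
        self.embedding = nn.Embedding(vocab_size, d_model)
        self.pos_encoder = PositionalEncoding(d_model, dropout=dropout)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, dim_feedforward=dim_feedforward,
            dropout=dropout, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.fc = nn.Linear(d_model, vocab_size)

    def forward(self, src):
        # src: (batch, seq_len) of integer token ids
        x = self.embedding(src) * math.sqrt(self.d_model)   # scale embeddings
        x = self.pos_encoder(x)                              # add positional information
        hidden = self.encoder(x)                              # (batch, seq_len, d_model)
        return self.fc(hidden[:, -1, :])                      # logits for the next integer
```

A quick usage example with made-up shapes: a batch of 8 sequences of length 16 over a vocabulary of 100 integers yields an (8, 100) tensor of next-token logits.

```python
model = TransformerModel(vocab_size=100)
seq = torch.randint(0, 100, (8, 16))   # (batch, seq_len)
logits = model(seq)                    # (8, 100)
```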
