# Transformer-Conversational-Bot

Conversational Bot using Transformer Model

The core idea behind the Transformer model is self-attention—the ability to attend to different positions of the input sequence to compute a representation of that sequence.

The Transformer builds stacks of self-attention layers; these are explained in the sections Scaled dot product attention and Multi-head attention, and sketched in code below.
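As a concrete illustration, here is a minimal sketch of scaled dot-product attention, softmax(QKᵀ / √d_k)·V, together with a multi-head wrapper that runs several attention heads in parallel and concatenates their outputs. This follows the standard TensorFlow formulation of these mechanisms; the function and class names are illustrative and are not taken from this repository's code.

```python
import tensorflow as tf

def scaled_dot_product_attention(q, k, v, mask=None):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    matmul_qk = tf.matmul(q, k, transpose_b=True)     # (..., seq_len_q, seq_len_k)
    dk = tf.cast(tf.shape(k)[-1], tf.float32)
    logits = matmul_qk / tf.math.sqrt(dk)             # scale to keep softmax well-behaved
    if mask is not None:
        logits += mask * -1e9                         # drive masked positions to ~0 weight
    weights = tf.nn.softmax(logits, axis=-1)          # distribution over key positions
    return tf.matmul(weights, v)                      # weighted sum of values

class MultiHeadAttention(tf.keras.layers.Layer):
    def __init__(self, d_model, num_heads):
        super().__init__()
        assert d_model % num_heads == 0
        self.num_heads = num_heads
        self.depth = d_model // num_heads             # per-head dimensionality
        self.wq = tf.keras.layers.Dense(d_model)
        self.wk = tf.keras.layers.Dense(d_model)
        self.wv = tf.keras.layers.Dense(d_model)
        self.dense = tf.keras.layers.Dense(d_model)

    def split_heads(self, x, batch_size):
        # (batch, seq_len, d_model) -> (batch, num_heads, seq_len, depth)
        x = tf.reshape(x, (batch_size, -1, self.num_heads, self.depth))
        return tf.transpose(x, perm=[0, 2, 1, 3])

    def call(self, q, k, v, mask=None):
        batch_size = tf.shape(q)[0]
        q = self.split_heads(self.wq(q), batch_size)
        k = self.split_heads(self.wk(k), batch_size)
        v = self.split_heads(self.wv(v), batch_size)
        attn = scaled_dot_product_attention(q, k, v, mask)
        attn = tf.transpose(attn, perm=[0, 2, 1, 3])  # (batch, seq_len, heads, depth)
        concat = tf.reshape(attn, (batch_size, -1, self.num_heads * self.depth))
        return self.dense(concat)                     # final linear projection
```

Because each head attends over the full sequence with its own learned projections, stacking these layers lets every position incorporate information from every other position in the input.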
