# Transformers-Text-Classification

This repository proposes a neural network architecture for text classification and recognition. A pre-trained BERT model provides the transformer backbone, and its output is combined with an LSTM layer followed by global pooling layers to produce a model capable of analyzing texts.
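The head that sits on top of BERT can be sketched as follows. This is a minimal illustration, not the repository's exact code: the layer sizes, sequence length, and number of classes are assumptions, and in the full model `token_embeddings` would be the last hidden state produced by a pre-trained BERT encoder (e.g. `transformers.TFBertModel`, hidden size 768).

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# Illustrative values (assumptions, not taken from the repository):
SEQ_LEN, HIDDEN, NUM_CLASSES = 128, 768, 2

# Stand-in for BERT's per-token output, shape (batch, seq_len, hidden).
token_embeddings = layers.Input(
    shape=(SEQ_LEN, HIDDEN), name="bert_last_hidden_state"
)

# LSTM over BERT's token embeddings; return_sequences=True keeps the full
# sequence so the global pooling layers can summarize it afterwards.
x = layers.LSTM(128, return_sequences=True)(token_embeddings)

# Global pooling layers condense the variable-length sequence into fixed
# vectors; max and average pooling are concatenated here as one option.
max_pool = layers.GlobalMaxPooling1D()(x)
avg_pool = layers.GlobalAveragePooling1D()(x)
x = layers.Concatenate()([max_pool, avg_pool])

# Classification head.
x = layers.Dense(64, activation="relu")(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = Model(token_embeddings, outputs)
model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
```

Feeding BERT's last hidden state through this head yields one softmax distribution per input text; the concatenated max/average pooling is one common way to summarize the LSTM sequence before the dense layers.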

## Results

### Neural Network Architecture

*(figure: model architecture diagram)*

### Metrics: Accuracy, Recall, Precision

*(figure: accuracy, recall, and precision during training)*

### Loss While Training

*(figure: training loss)*

### Evaluation on Validation Data

*(figure: evaluation results on the validation set)*
