Question-Answering-System

QA system employing DistilBERT

What is DistilBERT?

DistilBERT is a small, fast, cheap and light Transformer model trained by distilling BERT base. It has 40% fewer parameters than bert-base-uncased and runs 60% faster, while preserving over 95% of BERT's performance.
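As a rough illustration (not necessarily how this repository is wired up), an extractive QA system around DistilBERT can be built with the Hugging Face transformers pipeline. The checkpoint name below is an assumption, chosen because it is a standard DistilBERT model fine-tuned on SQuAD:

```python
from transformers import pipeline

# Load a DistilBERT checkpoint fine-tuned for extractive question answering.
# "distilbert-base-cased-distilled-squad" is an assumed, commonly used model;
# the repository may use a different checkpoint.
qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")

context = (
    "DistilBERT is a small, fast, cheap and light Transformer model trained by "
    "distilling BERT base. It has 40% fewer parameters than bert-base-uncased "
    "and runs 60% faster, while preserving over 95% of BERT's performance."
)

# The pipeline returns the answer span extracted from the context,
# along with a confidence score and character offsets.
result = qa(question="How much faster is DistilBERT than BERT?", context=context)
print(result["answer"], result["score"])
```

Given a question and a passage of context, the model predicts start and end positions of the answer span inside the passage rather than generating free-form text.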
