A project to predict the type of webpage from its text.
Updated Apr 25, 2022 · HTML
This project is built on transfer learning: a BERT-large (24 encoder layers) Question Answering model was fine-tuned on a specific Question Answering task. For demo purposes it is currently hosted on the free Heroku platform. Please take a moment to explore the idea.
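At inference time, an extractive QA head of this kind scores each token as a potential answer start and end, and the answer is the highest-scoring valid span. A minimal sketch of that span-selection step, using hypothetical logits (the function name and scores are illustrative, not from this repository):

```python
def best_span(start_scores, end_scores, max_len=15):
    """Pick the (start, end) token span maximizing start+end score,
    subject to start <= end and span length <= max_len."""
    best, best_score = (0, 0), float("-inf")
    for s, s_score in enumerate(start_scores):
        # Only consider ends at or after the start, within max_len tokens.
        for e in range(s, min(s + max_len, len(end_scores))):
            score = s_score + end_scores[e]
            if score > best_score:
                best, best_score = (s, e), score
    return best

# Hypothetical per-token logits from a fine-tuned QA head:
start = [0.1, 2.5, 0.3, 0.2, 0.0]
end   = [0.0, 0.4, 1.9, 3.1, 0.2]
print(best_span(start, end))  # tokens 1..3 form the predicted answer span
```

Libraries such as Hugging Face Transformers wrap this logic (plus tokenization and no-answer handling) in their QA pipelines; the sketch shows only the core argmax-over-valid-spans idea.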
Pro/Anti-vaxxers in Brazil: a temporal analysis of COVID vaccination stance on Twitter
Analyzing French census data (1836-1936) for demographic insights: an application to household head prediction.
A Space Model framework that maintains generalizability and enhances downstream-task performance through task-specific context attribution. It is an external LLM layer that improves classification accuracy across multiple datasets, such as HateXplain and IMDB movie reviews.