A bot to send tech-related polls in your channel/group (Python, updated Feb 23, 2024)
Fine-tuning a Transformer ALBERT model for Question Answering on SQuAD 2.0
BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based method of learning language representations. It is a bidirectional transformer pre-trained on a large corpus with two objectives: masked language modeling and next-sentence prediction.
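The masked language modeling objective mentioned above can be illustrated with a minimal sketch of BERT's token-corruption scheme: 15% of tokens are selected for prediction, and of those, 80% are replaced with `[MASK]`, 10% with a random vocabulary token, and 10% left unchanged. Function and variable names here are illustrative, not from any of the listed projects.

```python
import random

MASK_TOKEN = "[MASK]"

def mask_tokens(tokens, vocab, mask_prob=0.15, rng=None):
    """Corrupt a token sequence for MLM pre-training.

    Returns (corrupted, labels), where labels holds the original token
    at positions the model must predict, and None elsewhere.
    """
    rng = rng or random.Random()
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)  # the model must recover the original token
            r = rng.random()
            if r < 0.8:
                corrupted.append(MASK_TOKEN)        # 80%: replace with [MASK]
            elif r < 0.9:
                corrupted.append(rng.choice(vocab)) # 10%: random token
            else:
                corrupted.append(tok)               # 10%: keep unchanged
        else:
            labels.append(None)   # no loss computed at this position
            corrupted.append(tok)
    return corrupted, labels
```

During pre-training the model is trained to predict the original token at every non-`None` label position; next-sentence prediction is a separate binary classification over sentence pairs (ALBERT replaces it with sentence-order prediction).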
Assessing Humor in Edited News Headlines with ALBERT
A very basic TODO list for the Albert launcher
The project trains Google T5 on the ANLG and COS-E datasets and uses these pretrained models to generate explanations for ReColr contexts, which, along with the context and question, are passed to ALBERT for prediction.
A loading framework for BERT and its variants' pre-trained models, implemented with TensorFlow 2.x
Sentiment Analysis On Stanford Dataset using State-of-the-Art models (Contextualized Embedding)
Snip screenshot extension for Albert Launcher
Albert extension for quickly and easily searching the Laravel documentation 🕵️
All-in-One PyTorch-based BERT Module
An exploration of Deep Learning models on the Amazon Reviews dataset.
🈂️ Converts English vowels into accented vowels.
Some experiments comparing the performance of several pre-trained transformer models on a basic sentiment regression task
Albert launcher extension for converting between timezones