Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
MSc thesis project: classification of Italian certified electronic mails using SOTA machine learning, fine-tuning of pre-trained deep learning models, and data augmentation techniques
Official implementation of "Using Pre-Trained Language Models in an End-to-End Pipeline for Antithesis Detection" accepted in LREC-2024
Turkish BERT/DistilBERT, ELECTRA and ConvBERT models
AI and Memory Wall
Pretrained ELECTRA Model for Korean
ML and Natural Language Processing
Factuality check of the SemRep Predications
Simple NER Pipeline Using KoCharELECTRA
Pretrain and fine-tune ELECTRA with fastai and huggingface. (Results of the paper replicated!)
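For context on what ELECTRA's replaced-token-detection objective looks like, here is a minimal sketch using the Hugging Face transformers API and the public google/electra-small-discriminator checkpoint. This is a generic illustration, not this repository's training code.

```python
import torch
from transformers import ElectraForPreTraining, ElectraTokenizerFast

# Public ELECTRA discriminator checkpoint released by Google.
tokenizer = ElectraTokenizerFast.from_pretrained("google/electra-small-discriminator")
model = ElectraForPreTraining.from_pretrained("google/electra-small-discriminator")

# "fake" stands in for a token a generator might have substituted.
sentence = "the quick brown fox fake over the lazy dog"
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # one replaced-vs-original score per token

# A positive logit means the discriminator flags the token as replaced.
flags = (logits > 0).long().squeeze().tolist()
print(list(zip(tokenizer.tokenize(sentence), flags[1:-1])))  # skip [CLS]/[SEP]
```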
randomnerd_esp32_wifi_manager
Shanghai Jiao Tong University, Natural Language Understanding, Spring 2020: CoLA Task
Baseline code for Korean open-domain question answering (ODQA)
2023 AI Online Competition - Jeonbuk National University SW-Centered University Program
Pre-training Language Models for Japanese
Solving Math Word Problems Using Language Models and Contrastive Loss
Transformers Pipeline with KoELECTRA
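As a rough sketch of what such a pipeline can look like, the snippet below loads the monologg/koelectra-base-v3-discriminator checkpoint from the Hugging Face Hub. Since that is a base model without a task head, feature extraction is shown; a task-specific fine-tuned checkpoint would be needed for pipelines such as "token-classification" or "text-classification".

```python
from transformers import pipeline

# KoELECTRA discriminator from the Hub: a base model without a task head,
# so the feature-extraction pipeline is used here as an illustration.
extractor = pipeline(
    "feature-extraction",
    model="monologg/koelectra-base-v3-discriminator",
)
features = extractor("한국어 문장 임베딩 예시입니다.")
print(len(features[0]), len(features[0][0]))  # tokens x hidden size
```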