A framework for large-scale recommendation algorithms. (Python, updated Jun 7, 2024)
Repository for Project Insight: NLP as a Service
[NeurIPS 2023] Michelangelo: Conditional 3D Shape Generation based on Shape-Image-Text Aligned Latent Representation
Federated Learning Utilities and Tools for Experimentation
Retrieval-based Voice Conversion (RVC) implemented with Hugging Face Transformers.
[TMI 2023] XBound-Former: Toward Cross-scale Boundary Modeling in Transformers
Symbolic music generation taking inspiration from NLP and the human composition process
Public repo for the paper: "Modeling Intensification for Sign Language Generation: A Computational Approach" by Mert Inan*, Yang Zhong*, Sabit Hassan*, Lorna Quandt, Malihe Alikhani
CHARacter-awaRE Diffusion: Multilingual Character-Aware Encoders for Font-Aware Diffusers That Can Actually Spell
✨ Solve the multi-dimensional multiple knapsack problem using state-of-the-art Reinforcement Learning algorithms and transformers
A Transformer Implementation that is easy to understand and customizable.
Sentence-Level Text Simplification for Dutch
Zeta implementation of "Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers"
The official fork of THoR Chain-of-Thought framework, enhanced and adapted for Emotion Cause Analysis (ECAC-2024)
Public repo for the paper: "COSMic: A Coherence-Aware Generation Metric for Image Descriptions" by Mert İnan, Piyush Sharma, Baber Khalid, Radu Soricut, Matthew Stone, Malihe Alikhani
Contains work done on the fintech patent classification project. The goal of this project is to build a model that detects whether a patent is fintech based on its text content and, if it is, which of our defined fintech categories it belongs to.
Python package for Romanian diacritics restoration
This repository contains a number of experiments with multilingual Transformer models (Multilingual BERT, DistilBERT, XLM-RoBERTa, mT5, and ByT5) focused on the Dutch language.
Comparing the residual stream and the highway stream in transformers (BERT).
Transformer-based neural network model for harbor logistics prediction