Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
Updated May 21, 2024 - Rust
AICI: Prompts as (Wasm) Programs
Rust-tokenizer offers high-performance tokenizers for modern language models, including WordPiece, Byte-Pair Encoding (BPE), and Unigram (SentencePiece) models
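As a rough illustration of one of the algorithms named above, here is a toy sketch of byte-pair encoding (BPE) in plain Rust. It shows the technique only; it is not the rust-tokenizer crate's actual API, and the `bpe` function and its tie-breaking rule are this sketch's own assumptions.

```rust
use std::collections::HashMap;

/// Apply `merges` rounds of BPE to the characters of `word`, each round
/// merging the most frequent adjacent symbol pair (ties broken
/// lexicographically so the result is deterministic).
fn bpe(word: &str, merges: usize) -> Vec<String> {
    let mut toks: Vec<String> = word.chars().map(|c| c.to_string()).collect();
    for _ in 0..merges {
        // Count adjacent symbol pairs.
        let mut counts: HashMap<(String, String), usize> = HashMap::new();
        for w in toks.windows(2) {
            *counts.entry((w[0].clone(), w[1].clone())).or_insert(0) += 1;
        }
        // Pick the most frequent pair, preferring the smaller pair on ties.
        let best = counts
            .into_iter()
            .max_by(|a, b| a.1.cmp(&b.1).then_with(|| b.0.cmp(&a.0)));
        let Some((pair, _)) = best else { break };
        // Merge every occurrence of that pair into one symbol.
        let mut merged = Vec::new();
        let mut i = 0;
        while i < toks.len() {
            if i + 1 < toks.len() && toks[i] == pair.0 && toks[i + 1] == pair.1 {
                merged.push(format!("{}{}", pair.0, pair.1));
                i += 2;
            } else {
                merged.push(toks[i].clone());
                i += 1;
            }
        }
        toks = merged;
    }
    toks
}

fn main() {
    // Two merge rounds turn the characters of "lowlow" into ["low", "low"].
    println!("{:?}", bpe("lowlow", 2));
}
```

Real tokenizers learn the merge table from a corpus once and then replay it; this sketch recomputes pair counts on a single word purely to keep the merge step visible.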
Implementation of the GPT architecture in Rust 🦀 + Burn 🔥
Succeeded by SyntaxDot: https://github.com/tensordot/syntaxdot
Succeeded by syntaxdot-transformers: https://github.com/tensordot/syntaxdot/tree/main/syntaxdot-transformers
Rust implementation of the paper "Attention Is All You Need" (https://arxiv.org/abs/1706.03762); code ported from http://nlp.seas.harvard.edu/2018/04/03/attention.html
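The core operation of that paper is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. A minimal self-contained Rust sketch on plain nested `Vec`s (illustrative helper names, no relation to the linked repo's code):

```rust
type Mat = Vec<Vec<f64>>;

/// Naive dense matrix multiply: (n x m) * (m x p) -> (n x p).
fn matmul(a: &Mat, b: &Mat) -> Mat {
    let (n, m, p) = (a.len(), b.len(), b[0].len());
    let mut out = vec![vec![0.0; p]; n];
    for i in 0..n {
        for k in 0..m {
            for j in 0..p {
                out[i][j] += a[i][k] * b[k][j];
            }
        }
    }
    out
}

fn transpose(m: &Mat) -> Mat {
    let (r, c) = (m.len(), m[0].len());
    let mut out = vec![vec![0.0; r]; c];
    for i in 0..r {
        for j in 0..c {
            out[j][i] = m[i][j];
        }
    }
    out
}

/// Numerically stable softmax over one row.
fn softmax_row(row: &mut [f64]) {
    let max = row.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let mut sum = 0.0;
    for x in row.iter_mut() {
        *x = (*x - max).exp();
        sum += *x;
    }
    for x in row.iter_mut() {
        *x /= sum;
    }
}

/// Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.
fn attention(q: &Mat, k: &Mat, v: &Mat) -> Mat {
    let scale = (k[0].len() as f64).sqrt();
    let mut scores = matmul(q, &transpose(k));
    for row in scores.iter_mut() {
        for x in row.iter_mut() {
            *x /= scale;
        }
        softmax_row(row);
    }
    matmul(&scores, v)
}

fn main() {
    // Sharp queries make each position attend almost entirely to itself,
    // so the output approximates V.
    let q = vec![vec![10.0, 0.0], vec![0.0, 10.0]];
    let k = vec![vec![1.0, 0.0], vec![0.0, 1.0]];
    let v = vec![vec![1.0, 2.0], vec![3.0, 4.0]];
    println!("{:?}", attention(&q, &k, &v));
}
```

A real implementation adds multi-head projection, masking, and batching on top of this kernel; the scaling by sqrt(d_k) keeps the dot products from saturating the softmax as the key dimension grows.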
AthenaOS is a next-generation, AI-native operating system managed by swarms of AI agents
A transformer built from scratch, configured for deanonymisation of textual data.
Extremely fast HTML-to-React transformer
Transform DSL files to JSON in Rust. Formatting options available for custom output.
A Rust project for offline loading of small LLM models, specifically for RAG (Retrieval-Augmented Generation) on mobile devices.
Efficiently translating Latin to English using a sequence-to-sequence transformer augmented with learnable morphologically-derived grammatical embeddings. 🚀
Morphologically biased byte-pair encoding
Python bindings to the openfga-dsl-parser library
Further developed as SyntaxDot: https://github.com/tensordot/syntaxdot
Transformer translator website with multithreaded web server in Rust
Parsing and JSON transformer library for the OpenFGA authorization DSL