A transformer built from scratch configured for deanonymisation of textual data.
Updated Nov 24, 2023 - Rust
Morphologically biased byte-pair encoding
Transformer translator website with multithreaded web server in Rust
Transform DSL files to JSON in Rust. Formatting options available for custom output.
A Rust project for loading small LLM models offline, specifically for RAG (Retrieval-Augmented Generation) on mobile devices.
Python bindings to the openfga-dsl-parser library
Efficiently translating Latin to English using a sequence-to-sequence transformer augmented with learnable morphologically-derived grammatical embeddings. 🚀
Extremely fast HTML-to-React transformer
Rust implementation of the paper "Attention Is All You Need" (https://arxiv.org/abs/1706.03762); code ported from http://nlp.seas.harvard.edu/2018/04/03/attention.html
Parsing and JSON transformer library for the OpenFGA authorization DSL
Further developed as SyntaxDot: https://github.com/tensordot/syntaxdot
AthenaOS is a next-generation AI-native operating system managed by swarms of AI agents
Succeeded by syntaxdot-transformers: https://github.com/tensordot/syntaxdot/tree/main/syntaxdot-transformers
Succeeded by SyntaxDot: https://github.com/tensordot/syntaxdot
Implementation of the GPT architecture in Rust 🦀 + Burn 🔥
Rust-tokenizer offers high-performance tokenizers for modern language models, including WordPiece, Byte-Pair Encoding (BPE), and Unigram (SentencePiece) models
AICI: Prompts as (Wasm) Programs
Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
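Several entries above (the morphologically biased BPE project and rust-tokenizer) center on byte-pair encoding. A minimal sketch of one BPE merge step in Rust, on a toy token sequence; the helper names `most_frequent_pair` and `merge_pair` are illustrative assumptions, not the API of any library listed here:

```rust
use std::collections::HashMap;

// Count adjacent token pairs and return the most frequent one.
fn most_frequent_pair(tokens: &[String]) -> Option<(String, String)> {
    let mut counts: HashMap<(String, String), usize> = HashMap::new();
    for w in tokens.windows(2) {
        *counts.entry((w[0].clone(), w[1].clone())).or_insert(0) += 1;
    }
    counts.into_iter().max_by_key(|&(_, c)| c).map(|(pair, _)| pair)
}

// Replace every occurrence of `pair` with its concatenation.
fn merge_pair(tokens: &[String], pair: &(String, String)) -> Vec<String> {
    let mut out = Vec::new();
    let mut i = 0;
    while i < tokens.len() {
        if i + 1 < tokens.len() && tokens[i] == pair.0 && tokens[i + 1] == pair.1 {
            out.push(format!("{}{}", pair.0, pair.1));
            i += 2;
        } else {
            out.push(tokens[i].clone());
            i += 1;
        }
    }
    out
}

fn main() {
    // Toy sequence: ("a", "b") occurs twice, all other pairs once.
    let mut toks: Vec<String> = "a b c a b".split(' ').map(String::from).collect();
    if let Some(pair) = most_frequent_pair(&toks) {
        toks = merge_pair(&toks, &pair);
    }
    println!("{:?}", toks); // prints ["ab", "c", "ab"]
}
```

A full BPE trainer repeats this merge step until a target vocabulary size is reached, recording the merge order so the same merges can be replayed at tokenization time.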