🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch and Flax.
EasyLM makes large language models (LLMs) easy: a one-stop solution for pre-training, fine-tuning, evaluating, and serving LLMs in JAX/Flax.
A Library for Uncertainty Quantification.
Long Range Arena for Benchmarking Efficient Transformers
Original implementation of Prompt Tuning from Lester et al., 2021
Tevatron - A flexible toolkit for neural retrieval research and development.
Run effective large-batch contrastive learning beyond GPU/TPU memory constraints
A Jax-based library for designing and training transformer models from scratch.
Orbax provides common utility libraries for JAX users.
Pretrained deep learning models for Jax/Flax: StyleGAN2, GPT2, VGG, ResNet, etc.
Train very large language models in Jax.
EasyDeL is an open-source library that makes your training faster and more optimized, with options for training and serving in both Python and Mojo🔥
Unofficial JAX implementations of deep learning research papers
KoCLIP: Korean port of OpenAI CLIP, in Flax
EfficientNet, MobileNetV3, MobileNetV2, MixNet, etc in JAX w/ Flax Linen and Objax
Official code for "Maximum Likelihood Training of Score-Based Diffusion Models", NeurIPS 2021 (spotlight)
Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax).
JAX implementation of deep RL agents with resets from the paper "The Primacy Bias in Deep Reinforcement Learning"
Multimodal Masked Autoencoders (M3AE): A JAX/Flax Implementation