Zero-copy MPI communication of JAX arrays, for turbo-charged HPC applications in Python ⚡
Updated Jun 20, 2024 - Python
ALBERT model pretraining and fine-tuning using TF 2.0
Simple and efficient RevNet library for PyTorch, with XLA and DeepSpeed support and parameter offload
Pretrained-model invocation based on TensorFlow 1.x, supporting single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision. Flexible training, validation, and prediction.
TensorFlow 2 training code with JIT compilation on multiple GPUs
Provides code to serialize the different models involved in Stable Diffusion as SavedModels and to compile them with XLA.
Fast and easy distributed model training examples.
Versatile Data Ingestion Pipelines for JAX
Classification of a multilingual dataset using pre-trained models trained only on English data. The model is trained on TPUs using PyTorch and the torch_xla library.
A strong choice for the 🛰 satellite-image classification problem
Deep learning inference performance analysis
As the quality of large language models increases, so do our expectations of what they can do. Since the release of OpenAI's GPT-2, text-generation capabilities have received attention, and for good reason: these models can be used for summarization, translation, and even real-time learning in some language tasks.
A practical method for training energy-based language models.
Access the Xspec models and corresponding JAX/XLA ops.
Modern graph-mode TensorFlow implementation of the Super-Resolution GAN (SRGAN)
A Multivariate Gaussian Bayes classifier written using JAX
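The multivariate Gaussian Bayes classifier mentioned above fits a full-covariance Gaussian per class and predicts the class with the highest log-posterior. A minimal NumPy sketch of that math follows (the repository itself uses JAX, where `jax.numpy` would be a near drop-in replacement; all function names here are illustrative, not the repository's API):

```python
import numpy as np

def fit_gaussian_bayes(X, y):
    """Estimate per-class mean, regularized covariance, and prior P(c)."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (
            Xc.mean(axis=0),                                        # class mean
            np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1]),   # covariance (ridge for stability)
            len(Xc) / len(X),                                       # class prior
        )
    return params

def predict(params, X):
    """Pick the class maximizing log P(x | c) + log P(c)."""
    def log_posterior(x, mu, cov, prior):
        diff = x - mu
        _, logdet = np.linalg.slogdet(cov)          # log |Sigma|, numerically stable
        maha = diff @ np.linalg.solve(cov, diff)    # Mahalanobis distance term
        return -0.5 * (logdet + maha) + np.log(prior)
    classes = list(params)
    scores = np.array([[log_posterior(x, *params[c]) for c in classes] for x in X])
    return np.array(classes)[scores.argmax(axis=1)]

# Toy data: two well-separated 2-D Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(5.0, 1.0, (50, 2))])
y = np.repeat([0, 1], 50)
model = fit_gaussian_bayes(X, y)
print(predict(model, np.array([[0.0, 0.0], [5.0, 5.0]])))  # → [0 1]
```

Keeping the per-class parameters in a flat dict makes the predict step a pure function of arrays, which is exactly the shape of code that JIT-compiles well under XLA.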