- Preferred Networks
- Tokyo, Japan
- http://www.hiroyukivincentyamazaki.com
Starred repositories
Collective communications library with various primitives for multi-machine training.
The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
The registry of the OptunaHub packages
DiscoGrad - automatically differentiate across conditional branches in C++ programs
A JAX research toolkit for building, editing, and visualizing neural networks.
Development repository for the Triton language and compiler
A curated list for Efficient Large Language Models
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
TensorRT-LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and build TensorRT engines that contain state-of-the-art optimizations to perform inference efficie…
Google Research
Pax is a Jax-based machine learning framework for training large scale models. Pax allows for advanced and fully configurable experimentation and parallelization, and has demonstrated industry lead…
Extended functionalities for Optuna in combination with third-party libraries.
A curated list of awesome neural radiance fields papers
Hackable and optimized Transformers building blocks, supporting a composable construction.
1st place solution for Kaggle "Happywhale - Whale and Dolphin Identification"
PyTorch repository for ICLR 2022 paper (GSAM) which improves generalization (e.g. +3.8% top-1 accuracy on ImageNet with ViT-B/32)
The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights -- ResNet, ResNeXT, EfficientNet, NFNet, Vision Transformer (V…