Benchmarks for Multi-GPU Communication with MVAPICH2 (C, updated Jan 4, 2017)
PyTorch original implementation of Cross-lingual Language Model Pretraining.
Leveraging Structural Indexes for High-Performance JSON Data Processing on GPUs
Keras lightweight model for sketch image classification using the Quick, Draw! dataset
Helps you submit multi-node, multi-GPU jobs on Slurm with torchrun
Custom Iterable Dataset Class for Large-Scale Data Loading
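PyTorch's `torch.utils.data.IterableDataset` is the usual base class for this kind of large-scale streaming loader. As a framework-free sketch of the underlying pattern (class and source names here are hypothetical, not taken from the repository): records are pulled lazily from each shard, so the full dataset never has to fit in memory.

```python
from typing import Iterable, Iterator

class ChunkedIterableDataset:
    """Streams samples lazily, one source shard at a time, instead of
    materializing the whole dataset up front."""

    def __init__(self, sources: Iterable[str], chunk_size: int = 2):
        self.sources = list(sources)
        self.chunk_size = chunk_size

    def _read_source(self, source: str) -> Iterator[str]:
        # Stand-in for reading a large file; a real loader would open
        # `source` and yield parsed lines or records.
        for i in range(self.chunk_size):
            yield f"{source}:record{i}"

    def __iter__(self) -> Iterator[str]:
        for source in self.sources:
            yield from self._read_source(source)

samples = list(ChunkedIterableDataset(["shard0", "shard1"]))
# Each shard contributes chunk_size records, produced lazily in order.
```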
Very minimal PyTorch boilerplate with wandb logging and multi-GPU support
Recommendation Engine powered by Matrix Factorization.
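The core of a matrix-factorization recommender is learning low-rank user and item factor matrices whose dot products approximate observed ratings. A minimal SGD sketch (function names and hyperparameters are illustrative, not the repository's API):

```python
import random

def factorize(ratings, n_users, n_items, k=2, lr=0.05, reg=0.02,
              epochs=200, seed=0):
    """Learn factors P (n_users x k) and Q (n_items x k) so that
    P[u] . Q[i] approximates each observed rating r, via SGD with
    L2 regularization."""
    rnd = random.Random(seed)
    P = [[rnd.uniform(-0.1, 0.1) for _ in range(k)] for _ in range(n_users)]
    Q = [[rnd.uniform(-0.1, 0.1) for _ in range(k)] for _ in range(n_items)]
    for _ in range(epochs):
        for u, i, r in ratings:
            pred = sum(P[u][f] * Q[i][f] for f in range(k))
            err = r - pred
            for f in range(k):
                pu, qi = P[u][f], Q[i][f]
                P[u][f] += lr * (err * qi - reg * pu)
                Q[i][f] += lr * (err * pu - reg * qi)
    return P, Q

ratings = [(0, 0, 5.0), (0, 1, 1.0), (1, 0, 4.0)]
P, Q = factorize(ratings, n_users=2, n_items=2)
pred = sum(P[0][f] * Q[0][f] for f in range(2))  # close to the observed 5.0
```

Unobserved (user, item) pairs are then scored the same way, and the highest-scoring items become the recommendations.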
Engineering thesis project tackling ray tracing in a remote multi-GPU environment
Asynchronous learning example running on localhost
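A common shape for such localhost examples is a parameter server: workers connect over the loopback interface and push updates to shared state. A hedged sketch using `asyncio` streams (the protocol and names are assumptions, not the repository's code):

```python
import asyncio

async def handle_update(reader, writer, params):
    # Each worker sends one numeric delta; the server applies it to a
    # shared parameter and echoes the new value back.
    data = await reader.readline()
    params["w"] += float(data.decode().strip())
    writer.write(f"{params['w']}\n".encode())
    await writer.drain()
    writer.close()
    await writer.wait_closed()

async def worker(port, delta):
    reader, writer = await asyncio.open_connection("127.0.0.1", port)
    writer.write(f"{delta}\n".encode())
    await writer.drain()
    reply = float((await reader.readline()).decode())
    writer.close()
    await writer.wait_closed()
    return reply

async def main():
    params = {"w": 0.0}
    server = await asyncio.start_server(
        lambda r, w: handle_update(r, w, params), "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]  # ephemeral port
    # Three workers push updates concurrently over localhost.
    await asyncio.gather(*(worker(port, d) for d in (0.1, 0.2, 0.3)))
    server.close()
    await server.wait_closed()
    return params["w"]

final_w = asyncio.run(main())
# final_w == 0.6 (within float rounding) once all updates land
```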
Distributed_compy is a distributed computing library that offers multi-threading, heterogeneous (CPU + multi-GPU), and multi-node support
multi_gpu_infer: multi-GPU inference using multiprocessing or subprocesses
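Whether the workers are processes or subprocess invocations, the first step in multi-GPU inference is partitioning inputs so each device rank gets a disjoint shard. A minimal sketch of that sharding step (the function name is hypothetical; each shard would then be handed to a worker pinned to one GPU, e.g. via `CUDA_VISIBLE_DEVICES`):

```python
def shard_inputs(samples, num_gpus):
    """Round-robin assignment of samples to GPU ranks; every sample
    lands in exactly one shard."""
    shards = [[] for _ in range(num_gpus)]
    for idx, sample in enumerate(samples):
        shards[idx % num_gpus].append(sample)
    return shards

shards = shard_inputs(list(range(10)), num_gpus=4)
# shards[0] == [0, 4, 8]; the union of all shards is the original input
```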
Training Using Multiple GPUs
AI core library
⚡ LLaMA-2 model experiment
CRNN (Convolutional Recurrent Neural Network), with optional STN (Spatial Transformer Network), in TensorFlow, with multi-GPU support
TOmographic MOdel-BAsed Reconstruction (ToMoBAR) software