Benchmarks for Multi-GPU Communication with MVAPICH2 (C, updated Jan 4, 2017)
Original PyTorch implementation of Cross-lingual Language Model Pretraining.
Training Using Multiple GPUs
AI core library
Very minimal PyTorch boilerplate with wandb logging and multi-GPU support
Engineering thesis project tackling ray tracing in a remote multi-GPU environment
Leveraging Structural Indexes for High-Performance JSON Data Processing on GPUs
This project aims to develop a deep learning model for the detection of skin cancer from dermoscopic images. The model utilizes convolutional neural networks (CNNs), specifically the ResNet50 architecture, to classify images into two classes: benign and malignant.
Distributed_compy is a distributed computing library that offers multi-threading, heterogeneous (CPU + multi-GPU), and multi-node support
Lightweight Keras model for sketch-image classification using the Quick, Draw! dataset
Asynchronous learning example running on localhost
Recommendation Engine powered by Matrix Factorization.
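The repository above does not publish its exact algorithm, but the classic matrix-factorization recipe behind such engines can be sketched in a few lines: learn low-rank user and item factors by stochastic gradient descent on the observed ratings. Everything below (function name `factorize`, the toy rating matrix, hyperparameters) is a hypothetical illustration, not the repo's code.

```python
import numpy as np

def factorize(R, k=2, steps=2000, lr=0.02, reg=0.02, seed=0):
    """Factor a ratings matrix R (users x items) into P @ Q.T via SGD.
    Zero entries in R are treated as missing (unobserved) ratings."""
    rng = np.random.default_rng(seed)
    n_users, n_items = R.shape
    P = rng.normal(scale=0.1, size=(n_users, k))  # user factors
    Q = rng.normal(scale=0.1, size=(n_items, k))  # item factors
    users, items = np.nonzero(R)                  # observed entries only
    for _ in range(steps):
        for u, i in zip(users, items):
            err = R[u, i] - P[u] @ Q[i]
            # gradient step with L2 regularization
            P[u] += lr * (err * Q[i] - reg * P[u])
            Q[i] += lr * (err * P[u] - reg * Q[i])
    return P, Q

# toy example: 3 users x 3 items, 0 = unrated
R = np.array([[5, 3, 0],
              [4, 0, 1],
              [1, 1, 5]], dtype=float)
P, Q = factorize(R)
pred = P @ Q.T  # dense prediction matrix, including the unrated cells
```

The unobserved cells of `pred` are the recommendations: they are filled in by the learned low-rank structure rather than trained on directly.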
Helps you submit multi-node, multi-GPU jobs to Slurm with torchrun
Custom Iterable Dataset Class for Large-Scale Data Loading
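The core pattern behind a large-scale iterable dataset (as in PyTorch's `IterableDataset`) can be shown in plain Python: instead of random indexing, each worker streams a disjoint, interleaved shard of the source, so no worker ever materializes the full dataset. The class name `ShardedIterable` and its arguments are illustrative assumptions, not the repository's actual API.

```python
class ShardedIterable:
    """Stream records lazily, giving each worker a disjoint shard.

    Worker w out of n yields records at indices w, w + n, w + 2n, ...
    so the shards interleave and together cover the whole source
    without any worker holding it in memory.
    """
    def __init__(self, source, worker_id=0, num_workers=1):
        self.source = source          # any iterable of records
        self.worker_id = worker_id
        self.num_workers = num_workers

    def __iter__(self):
        for idx, record in enumerate(self.source):
            if idx % self.num_workers == self.worker_id:
                yield record

records = range(10)                   # stand-in for a huge record stream
shard0 = list(ShardedIterable(records, worker_id=0, num_workers=2))
shard1 = list(ShardedIterable(records, worker_id=1, num_workers=2))
```

In a real `IterableDataset`, `worker_id` and `num_workers` would come from `torch.utils.data.get_worker_info()` inside `__iter__`.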
multi_gpu_infer: multi-GPU inference using multiprocessing or subprocess
⚡ LLaMA-2 model experiment
CRNN (Convolutional Recurrent Neural Network) with optional STN (Spatial Transformer Network) in TensorFlow; multi-GPU supported.