
Dockerfile containing LightGBM, PyTorch, and TorchText with GPU CUDA acceleration

How to Use

Docker

Download all files into one folder.

Add your SSH public key to authorized_keys.
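For example, assuming your public key is at ~/.ssh/id_rsa.pub on your local machine, you can append it to the authorized_keys file in this folder:

cat ~/.ssh/id_rsa.pub >> authorized_keys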

Build the image in the folder with the following command.

docker build -t datawhale/lightgbm_pytorch_torchtext:12.3 .
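You can confirm the image exists afterwards (the repository name below assumes the build command above):

docker images datawhale/lightgbm_pytorch_torchtext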

After building the image, create the container.

docker run -itd \
    -e PUID=1000 -e PGID=1000 \
    --gpus all \
    --name=datawhale \
    --restart=on-failure \
    -v /your/work/folder/:/home \
    -p 1234:22 \
    datawhale/lightgbm_pytorch_torchtext:12.3
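Once the container is running, you can check that it sees the GPU. This assumes the container name datawhale from the command above and that nvidia-smi is available inside the container (it normally is when --gpus all is used with the NVIDIA container runtime):

docker exec -it datawhale nvidia-smi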

Local Computer

Set up the SSH config file with the following:

Host DataWhale
    HostName 0.0.0.0 # your Docker server IP
    User root # password is winchellwang
    Port 1234 # same as the forwarded port in the container deployment
    IdentityFile ~/.ssh/id_rsa # id_rsa should match your key in authorized_keys
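You can test the connection from a terminal before using VS Code (assuming the Host alias DataWhale defined above):

ssh DataWhale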

Open a remote connection in VS Code.

What it has inside

Built-in conda environments

conda 24.3.0
# conda environments:
#
base                  *  /opt/miniforge
LightGBM                 /opt/miniforge/envs/LightGBM
Pytorch                  /opt/miniforge/envs/Pytorch
TorchText                /opt/miniforge/envs/TorchText

LightGBM, PyTorch, and TorchText are already CUDA-compatible.
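A quick sketch for confirming GPU access inside the container, assuming the environment names listed above:

conda activate Pytorch
python -c "import torch; print(torch.cuda.is_available())"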

Requirement

Ubuntu 22.04 with an NVIDIA GPU and a driver supporting CUDA 12.3 or later.
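On the host, nvidia-smi reports the installed driver version and the highest CUDA version it supports, which should be 12.3 or later:

nvidia-smi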
