Dockerfile containing LightGBM, PyTorch, and TorchText with CUDA GPU acceleration
Download all files into one folder.
Add your SSH public key to the authorized_keys file.
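If you do not have a key pair yet, something like the following generates one and appends the public half to the authorized_keys file in the build folder. The file name datawhale_key is just an example; any key works as long as its private half matches the IdentityFile referenced in the SSH config later.

```shell
# Generate an example RSA key pair with no passphrase (path is illustrative)
ssh-keygen -t rsa -b 4096 -N "" -f ./datawhale_key
# Append the public key to the authorized_keys file next to the Dockerfile
cat ./datawhale_key.pub >> ./authorized_keys
```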
Build the image in the folder with the following command.
docker build -t datawhale/lightgbm_pytorch_torchtext:12.3 .
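If the build completes, the tagged image should appear in the local image list (name and tag taken from the build command above):

```shell
# List the freshly built image
docker image ls datawhale/lightgbm_pytorch_torchtext:12.3
```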
After building the image, create the container.
docker run -itd \
-e PUID=1000 -e PGID=1000 \
--gpus all \
--name=datawhale \
--restart=on-failure \
-v /your/work/folder/:/home \
-p 1234:22 \
datawhale/lightgbm_pytorch_torchtext:12.3
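Once the container is created, you can check that it is running and that the GPU is visible inside it. This assumes the NVIDIA Container Toolkit is installed on the host; otherwise the --gpus flag above will not work.

```shell
# Confirm the container is up
docker ps --filter "name=datawhale"
# Run nvidia-smi inside the container to verify GPU passthrough
docker exec datawhale nvidia-smi
```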
Set up your SSH config file (~/.ssh/config) with the following entry.
Host DataWhale
HostName 0.0.0.0 # your docker server IP
User root # password is winchellwang
Port 1234 # same with the forward port in container deployment
IdentityFile ~/.ssh/id_rsa # id_rsa should match the key you added to authorized_keys.
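With this entry saved, the connection can be checked from a terminal before involving VS Code, using the DataWhale host alias defined above:

```shell
# Should print the container's hostname over the SSH tunnel
ssh DataWhale hostname
```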
Open a remote connection in VS Code.
Built-in conda environments
conda 24.3.0
# conda environments:
#
base * /opt/miniforge
LightGBM /opt/miniforge/envs/LightGBM
Pytorch /opt/miniforge/envs/Pytorch
TorchText /opt/miniforge/envs/TorchText
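Inside the container, switch between the environments with conda activate. In a non-interactive shell you may need to source the conda setup script first; the /opt/miniforge prefix matches the base environment listed above.

```shell
# Make conda available in the current shell, then activate an environment
source /opt/miniforge/etc/profile.d/conda.sh
conda activate Pytorch
# The interpreter should now resolve inside /opt/miniforge/envs/Pytorch
which python
```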
LightGBM, PyTorch, and TorchText are already CUDA-compatible.
The base image is Ubuntu 22.04; the host needs an NVIDIA GPU with a driver that supports CUDA 12.3 or later.
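As a final sanity check that PyTorch actually sees the GPU, run something like the following inside the container (environment name taken from the listing above):

```shell
# conda run executes a command inside the named environment
conda run -n Pytorch python -c "import torch; print(torch.cuda.is_available())"
```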