
Multi-gpu training is not effective on specific cases #272

Open
dnjsdos opened this issue Feb 24, 2023 · 0 comments
dnjsdos commented Feb 24, 2023

When training a RESCAL model on the wn18 dataset, I see no training speed improvement with multi-GPU training compared to single-GPU training. In fact, it is much slower than the single-GPU run.

Below is the command line I used. Also, in the benchmark table in the documentation (https://dglke.dgl.ai/doc/benchmarks.html), RESCAL does not appear in the 8-GPU results for the wn18 dataset. Is there a special reason for this, or did I miss something? Please let me know.

dglke_train --model_name RESCAL --dataset wn18 --batch_size 1024 --log_interval 1000 \
--neg_sample_size 256 --hidden_dim 250 --gamma 24.0 --lr 0.03 --batch_size_eval 16 \
--test -adv --gpu 0 1 --max_step 10000 --mix_cpu_gpu --num_proc 2 --async_update --force_sync_interval 1000
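For comparison, the single-GPU baseline I benchmarked against would look roughly like this (a sketch assuming the same hyperparameters as above, with the multi-process flags `--gpu 0 1`, `--num_proc`, `--async_update`, and `--force_sync_interval` dropped):

```shell
# Hypothetical single-GPU baseline: identical hyperparameters to the
# multi-GPU run above, but one GPU and a single training process.
dglke_train --model_name RESCAL --dataset wn18 --batch_size 1024 --log_interval 1000 \
--neg_sample_size 256 --hidden_dim 250 --gamma 24.0 --lr 0.03 --batch_size_eval 16 \
--test -adv --gpu 0 --max_step 10000 --mix_cpu_gpu
```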