Difference between micro_train_batch_size & micro_rollout_batch_size #210
pratikkumar018 started this conversation in General

Replies: 1 comment · 3 replies
- micro_train_batch_size is the training batch size per GPU, while micro_rollout_batch_size is the batch size per GPU used during the rollout (experience generation) phase. The two settings are independent: generation is inference-only, so the rollout micro batch can often be larger than the training micro batch.
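To illustrate why the two values need not match, here is a minimal sketch of a PPO step, assuming a toy policy object; `DummyPolicy` and `ppo_step` are hypothetical names, not OpenRLHF's actual API. Experience is generated in chunks of `micro_rollout_batch_size`, then the same samples are re-batched into chunks of `micro_train_batch_size` for the forward/backward passes.

```python
class DummyPolicy:
    """Stand-in for an actor model; records the batch sizes it sees."""

    def __init__(self):
        self.rollout_sizes = []
        self.train_sizes = []

    def generate(self, prompt_chunk):
        # Inference-only: no activations are kept for backprop here,
        # which is why this chunk can usually be larger.
        self.rollout_sizes.append(len(prompt_chunk))
        return [f"sample for {p}" for p in prompt_chunk]

    def train_on(self, batch):
        # Forward + backward pass on one training micro batch.
        self.train_sizes.append(len(batch))


def ppo_step(prompts, policy, micro_rollout_batch_size, micro_train_batch_size):
    # Rollout phase: generate experience in rollout-sized chunks.
    replay_buffer = []
    for i in range(0, len(prompts), micro_rollout_batch_size):
        chunk = prompts[i:i + micro_rollout_batch_size]
        replay_buffer.extend(policy.generate(chunk))

    # Training phase: the same samples are re-batched into
    # training-sized chunks, independently of how they were generated.
    for i in range(0, len(replay_buffer), micro_train_batch_size):
        batch = replay_buffer[i:i + micro_train_batch_size]
        policy.train_on(batch)
```

With 8 prompts, `micro_rollout_batch_size=4`, and `micro_train_batch_size=2`, the policy sees two generation chunks of 4 and four training batches of 2, showing the two knobs operate on different phases of the same data.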
- Can you please explain the difference between micro_train_batch_size & micro_rollout_batch_size? I tried going through the code and found micro_train_batch_size used here:
  https://github.com/OpenLLMAI/OpenRLHF/blob/main/openrlhf/trainer/ppo_utils/replay_buffer.py#L152
  I am not sure whether these two values should always be the same.