In https://github.com/pacman100/DHS-LLM-Workshop/blob/main/chat_assistant/training/utils.py#L182C9-L182C19, what is the reason for setting `device_map='auto'`?

When I run the script with Accelerate (using FSDP), I get the following error:

ValueError: You can't train a model that has been loaded with `device_map='auto'` in any distributed mode. Please rerun your script specifying `--num_processes=1` or by launching with `python {{myscript.py}}`.

Also, huggingface/accelerate#1840 states that `device_map='auto'` is not compatible with DistributedDataParallel.
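For reference, here is a minimal sketch (my assumption, not the workshop's actual code) of how the model could be loaded so that the distributed launcher manages device placement instead of `device_map='auto'`. As far as I understand, `device_map='auto'` asks Accelerate's big-model-inference machinery to split the model's layers across the available devices, while DDP/FSDP expect each rank to start from the full model (or its FSDP shard) on its own device, hence the ValueError. The model id and dtype below are placeholders.

```python
# Sketch only: loading the model for distributed training without device_map="auto".
# The checkpoint name and dtype are placeholders, not the workshop's settings.
import torch
from transformers import AutoModelForCausalLM

model_name = "meta-llama/Llama-2-7b-hf"  # placeholder checkpoint

# For FSDP (launched via `accelerate launch` with an FSDP config), load the model
# without any device_map; FSDP shards the parameters and moves each shard to the
# corresponding rank's GPU itself.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,
)

# For plain DDP, one workaround discussed in Accelerate issues is to pin the whole
# model to the local rank's GPU instead of letting device_map="auto" split it:
# from accelerate import PartialState
# model = AutoModelForCausalLM.from_pretrained(
#     model_name,
#     torch_dtype=torch.bfloat16,
#     device_map={"": PartialState().process_index},
# )
```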