
Why use FSDP instead of Deepspeed? #76

Closed
nrailg opened this issue Sep 28, 2023 · 2 comments


nrailg commented Sep 28, 2023

We found that the overhead of FSDP is very high compared to DeepSpeed, especially after upgrading transformers to >= 4.29 (huggingface/transformers#24724).

In Phase 3 PPO, it seems I can only use FSDP?

Can anyone please explain why FSDP was adopted? Thank you.

lxuechen (Collaborator) commented

This can be transformers-version-dependent, but we generally find the same. See this note I wrote a long time ago. https://github.com/tatsu-lab/stanford_alpaca#addressing-oom

DeepSpeed model saving can be a little annoying at times.
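A minimal sketch of the contrast being referred to, assuming a DeepSpeed ZeRO-3 checkpoint directory and an FSDP-wrapped model; the paths and the `model` variable are placeholders, not code from this repo. With ZeRO-3 the on-disk checkpoint is sharded and needs a separate consolidation step, whereas FSDP can gather a full state dict on rank 0 at save time:

```python
# Sketch only; `model` is assumed to be an FSDP-wrapped nn.Module,
# and "checkpoints/ds_run" a ZeRO-3 checkpoint directory (both placeholders).
import torch
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
from torch.distributed.fsdp import StateDictType, FullStateDictConfig

# DeepSpeed ZeRO-3: the saved checkpoint is sharded per rank, so a post-hoc
# consolidation step is needed to recover a plain fp32 state dict.
from deepspeed.utils.zero_to_fp32 import get_fp32_state_dict_from_zero_checkpoint

ds_state_dict = get_fp32_state_dict_from_zero_checkpoint("checkpoints/ds_run")
torch.save(ds_state_dict, "pytorch_model_from_deepspeed.bin")

# FSDP: a full (unsharded) state dict can be gathered on rank 0 at save time,
# without a separate conversion script.
save_cfg = FullStateDictConfig(offload_to_cpu=True, rank0_only=True)
with FSDP.state_dict_type(model, StateDictType.FULL_STATE_DICT, save_cfg):
    fsdp_state_dict = model.state_dict()
if torch.distributed.get_rank() == 0:
    torch.save(fsdp_state_dict, "pytorch_model_from_fsdp.bin")
```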


nrailg commented Nov 28, 2023

> This can be transformers-version-dependent, but we generally find the same. See this note I wrote a long time ago. https://github.com/tatsu-lab/stanford_alpaca#addressing-oom
>
> DeepSpeed model saving can be a little annoying at times.

Thanks for your reply.
