
Add LIMA dropout #21

Merged · 3 commits · Aug 12, 2023
Conversation

andreaskoepf (Contributor)

When --lima_dropout is specified, use a layer-dependent dropout probability, starting at p_d = 0.0 at the bottom layer and linearly increasing the rate up to the value specified by --hidden_dropout at the last layer.

See: "LIMA: Less Is More for Alignment", Zhou et al., 2023, https://arxiv.org/abs/2305.11206

andreaskoepf marked this pull request as ready for review on August 10, 2023 at 19:34.
AleHD merged commit 5ac93aa into epfLLM:main on Aug 12, 2023.