
Fix baichuan2 lora #1042

Merged: 7 commits into InternLM:main on Jan 26, 2024

Conversation

grimoire (Collaborator)

  • Fix alibi attention with a small blockN (a sketch of the block-wise bias bookkeeping follows this list)
  • Fix the tensor-parallel (TP) LoRA linear layers in attention
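
For background on the first bullet: ALiBi adds a per-head linear bias, proportional to the query-key distance, to the attention scores. When keys are processed block by block, the bias for each block must be built from absolute key positions (block start plus in-block offset). The sketch below illustrates that bookkeeping in plain PyTorch; it is not the PR's kernel, and the names (`alibi_slopes`, `alibi_bias_for_block`, `block_n`) are illustrative assumptions.

```python
import torch


def alibi_slopes(num_heads: int) -> torch.Tensor:
    """ALiBi slopes 2^(-8*i/num_heads), i = 1..num_heads (exact for
    power-of-two head counts; other counts use the paper's interpolation)."""
    start = 2.0 ** (-8.0 / num_heads)
    return torch.tensor([start ** (i + 1) for i in range(num_heads)])


def alibi_bias_for_block(slopes: torch.Tensor, q_pos: torch.Tensor,
                         block_start: int, block_n: int) -> torch.Tensor:
    """Causal ALiBi bias for one key block of size block_n.

    Key positions must be absolute (block_start + local offset); using
    block-local offsets would repeat the same bias in every block.
    """
    k_pos = block_start + torch.arange(block_n)    # absolute key positions
    rel = q_pos[:, None] - k_pos[None, :]          # (num_q, block_n) distances
    bias = -slopes[:, None, None] * rel            # (heads, num_q, block_n)
    # Mask future keys (rel < 0) so the bias stays causal.
    return bias.masked_fill(rel[None, :, :] < 0, float('-inf'))
```

The full bias for a sequence is the concatenation of these per-block biases along the key axis.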

grimoire changed the title from "Fix baichuan2 lora" to "[WIP] Fix baichuan2 lora" on Jan 25, 2024
grimoire changed the title from "[WIP] Fix baichuan2 lora" to "Fix baichuan2 lora" on Jan 25, 2024
lvhan028 (Collaborator)

@zhulinJulia24 Please help test baichuan2 inference with the pytorch engine.
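
A minimal smoke test of that might look like the following; the model id and config fields are assumptions based on lmdeploy's pipeline API, not taken from this PR:

```python
from lmdeploy import pipeline, PytorchEngineConfig

# Hypothetical check: run baichuan2 through the pytorch engine backend.
pipe = pipeline('baichuan-inc/Baichuan2-7B-Chat',
                backend_config=PytorchEngineConfig(tp=1))
print(pipe(['Hello, who are you?']))
```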

RunningLeon (Collaborator) left a comment

LGTM

lvhan028 merged commit 5c47148 into InternLM:main on Jan 26, 2024
4 of 6 checks passed
zhulinJulia24 (Collaborator)

Fixed.

We still need to add adapter usage and a dummy adapter to the tutorial.
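
For reference, adapter usage with the pytorch engine looks roughly like this; a hedged sketch based on lmdeploy's S-LoRA support, with the adapter name and weight path as placeholders:

```python
from lmdeploy import pipeline, PytorchEngineConfig

# Hypothetical dummy adapter setup: `adapters` maps an adapter name to a
# directory of LoRA weights; 'mylora' and the path are placeholders.
backend_config = PytorchEngineConfig(
    adapters={'mylora': '/path/to/lora/weights'})
pipe = pipeline('baichuan-inc/Baichuan2-7B-Chat',
                backend_config=backend_config)

# Select the adapter per request by name (assumed keyword argument).
print(pipe(['Hello'], adapter_name='mylora'))
```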

Successfully merging this pull request may close these issues.

[Bug] Deploying S-LoRA raises ValueError: not enough values to unpack (expected 2, got 1)
4 participants