
[Docs] Add Sequence Parallel documents #505

Merged · 24 commits into InternLM:docs · Apr 9, 2024
Conversation

@HIT-cwh (Collaborator) commented Mar 21, 2024

No description provided.

docs/zh_cn/training/training_extreme_long_sequence.md — review thread (outdated, resolved)
Additionally, to further extend the model's long-context capability, you also need to modify the `max_position_embeddings` field in the config. For example, to extend the model's context length to 64K, make the following change:

```diff
+ max_position_embeddings = 65536
```
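The value 65536 is simply 64 * 1024; a quick sanity check of the arithmetic (hypothetical snippet, not part of the PR):

```python
# "64K" context means 64 * 1024 token positions, so the config value is 65536.
context_length_k = 64
max_position_embeddings = context_length_k * 1024
assert max_position_embeddings == 65536
```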
Collaborator review comment:

Each of these parameters is explained in isolation, so in practice users are unsure whether a given value still needs to be set once a particular feature is enabled:

  • pack_to_max_length
  • variable-length (varlen) attention
  • sequence parallel (sp)

The requirements these three place on the other parameters, whether used individually or in combination, feel ambiguous to me.
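To illustrate the kind of interaction being asked about: sequence parallel shards each (packed) sequence evenly across sp ranks, which is one reason the packed length must divide evenly by the sequence-parallel size. A minimal sketch, with hypothetical names (`sp_size`, `sp_rank` are illustrative conventions, not necessarily XTuner's actual API):

```python
def split_for_sequence_parallel(tokens, sp_size, sp_rank):
    """Give each sequence-parallel rank an equal, contiguous shard of the sequence."""
    # The packed length must divide evenly, or some rank would get a ragged shard.
    assert len(tokens) % sp_size == 0, "packed length must be divisible by sp_size"
    shard = len(tokens) // sp_size
    return tokens[sp_rank * shard:(sp_rank + 1) * shard]

# With sp_size=4, an 8-token packed sequence is split into four 2-token shards.
seq = list(range(8))
shards = [split_for_sequence_parallel(seq, 4, r) for r in range(4)]
```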

docs/zh_cn/training/training_extreme_long_sequence.md — 8 additional review threads (outdated, resolved)
@pppppM pppppM merged commit f711a19 into InternLM:docs Apr 9, 2024
0 of 3 checks passed

4 participants