Fix passing options to TransformerEngine layers and MCore #9231

Open · janEbert wants to merge 6 commits into main

Conversation

@janEbert (Contributor) commented May 17, 2024

This affects the NeMo TransformerEngine layers (AutocastTransformerLayer) and the MCore TransformerConfig, which previously did not correctly receive the settings for the following options (a short sketch of how they land on the TE layer follows this list):

  • activation: str: custom activation, including the GLU property,
  • normalization: str: custom normalization layers,
  • bias: bool: optional bias,
  • (MCore only) bias_dropout_add_fusion: bool: whether to fuse bias add + dropout + residual add via JIT compilation (we could even enable this for TransformerEngine by setting os.environ["NVTE_BIAS_DROPOUT_FUSION"] = "1"; note that Megatron-LM also does not do this), and
  • (MCore only) bias_activation_fusion: bool: whether to fuse bias add + activation via JIT compilation.
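For orientation, here is a minimal sketch (not this PR's actual diff) of how these options land on transformer_engine.pytorch.TransformerLayer. The layer sizes are made up and the keyword names follow TE's public API, so treat the exact values as illustrative assumptions.

```python
import os

# The bullet above suggests TE's bias-dropout-add fusion can be enabled via
# this environment variable; setting it before importing TE is an assumption
# made here to be safe.
os.environ["NVTE_BIAS_DROPOUT_FUSION"] = "1"

import transformer_engine.pytorch as te

# Hypothetical sizes; only the last three kwargs correspond to the options
# this PR forwards from the NeMo config.
layer = te.TransformerLayer(
    hidden_size=1024,
    ffn_hidden_size=4096,
    num_attention_heads=16,
    activation="swiglu",      # NeMo `activation`, including the GLU property
    normalization="RMSNorm",  # NeMo `normalization`
    bias=False,               # NeMo `bias`
)
```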

What does this PR do?

Fix passing activation, normalization, and bias options to TransformerEngine layers (and the corresponding options to the MCore TransformerConfig). Previously, with mcore=False, transformer_engine=True, the mentioned model settings would not be reproduced correctly.
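To make the failure mode concrete, here is a hedged illustration; the key names are assumptions modelled on the Megatron GPT model config and are not a verbatim excerpt from the repository.

```python
# Illustrative model-config fragment (key names are assumptions, not a
# verbatim excerpt). Before this fix, the TE path fell back to TE's defaults
# for the last three entries instead of honoring the configured values.
model_cfg = {
    "mcore": False,              # use the non-MCore model path
    "transformer_engine": True,  # build AutocastTransformerLayer on top of TE
    "normalization": "rmsnorm",  # was ignored -> TE default "LayerNorm"
    "activation": "swiglu",      # was ignored -> TE default "gelu"
    "bias": False,               # was ignored -> TE default bias=True
}
```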

Collection: NLP

Changelog

  • We refactor the functions that convert the NeMo normalization and activation config options to their TransformerEngine/Megatron-LM core equivalents, saving on duplicate code and improving maintainability.
  • We refactor the query for whether an activation function uses a GLU layer.
  • These refactored functions are now used for creating the TransformerEngine and MCore configs.
  • We pass three additional values to the AutocastTransformerLayer and its superclass constructor, namely activation, normalization, and bias.
  • We pass five additional values to the MCore TransformerConfig, namely activation_func, gated_linear_unit, add_bias_linear, bias_activation_fusion, and bias_dropout_fusion (see the sketches after this list).
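As a companion to the changelog, here is a hedged sketch of the two pieces. First, the GLU query; the real helper's name and the accepted activation strings may differ from what is shown.

```python
def activation_uses_glu(activation: str) -> bool:
    """Hypothetical helper: does the NeMo activation name denote a GLU variant?"""
    return activation in {"geglu", "reglu", "swiglu", "fast-geglu", "fast-reglu", "fast-swiglu"}
```

Second, how the five forwarded values could look on the MCore side; the field names follow megatron.core's TransformerConfig, while the sizes and concrete values are placeholders.

```python
import torch.nn.functional as F
from megatron.core.transformer.transformer_config import TransformerConfig

# Placeholder sizes; the last five fields are the values this PR now forwards.
config = TransformerConfig(
    num_layers=24,
    hidden_size=1024,
    num_attention_heads=16,
    activation_func=F.silu,        # passed as a callable (see commit note below)
    gated_linear_unit=True,        # derived from the GLU query above
    add_bias_linear=False,         # NeMo `bias`
    bias_activation_fusion=False,  # NeMo `bias_activation_fusion`
    bias_dropout_fusion=True,      # NeMo `bias_dropout_add_fusion`
)
```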

GitHub Actions CI

The Jenkins CI system has been replaced by GitHub Actions self-hosted runners.

The GitHub Actions CI will run automatically when the "Run CICD" label is added to the PR.
To re-run CI, remove and add the label again.
To run CI on an untrusted fork, a NeMo user with write access must first click "Approve and run".

Before your PR is "Ready for review"

Pre checks:

  • Make sure you read and followed Contributor guidelines
  • Did you write any new necessary tests? – There were no existing tests for AutocastTransformerLayer, so I did not bother.
  • Did you add or update any necessary documentation? – I do not think this is necessary; the PR just fixes incorrect behavior.
  • Does the PR affect components that are optional to install? (Ex: Numba, Pynini, Apex etc) – Yes, it implicitly affects behavior in TransformerEngine and Megatron-LM.
    • Reviewer: Does the PR have correct import guards for all optional libraries?

PR Type:

  • New Feature
  • Bugfix
  • Documentation

Who can review?

Anyone in the community is free to review the PR once the checks have passed.

@github-actions github-actions bot added the NLP label May 17, 2024
@janEbert janEbert changed the title Fix passing normalization and bias options to TransformerEngine layers Fix passing options to TransformerEngine layers and MCore May 17, 2024
This affects the NeMo TransformerEngine layers, which previously did not
correctly receive the settings for
- `normalization: str`: custom normalization layers and
- `bias: bool`: optional bias.

We additionally refactor the function to convert the NeMo normalization
config option to its TransformerEngine/Megatron-LM core equivalent,
saving on duplicate code and improving maintainability.

Signed-off-by: janEbert <janpublicebert@posteo.net>
This affects both TransformerEngine and Megatron-LM core, neither of
which correctly received the specified activation function. This also
handles GLU activation functions.

Signed-off-by: janEbert <janpublicebert@posteo.net>
Signed-off-by: janEbert <janpublicebert@posteo.net>
Activation functions are passed as callables to Megatron-LM core, so
mentioning it here is incorrect.

Signed-off-by: janEbert <janpublicebert@posteo.net>
Signed-off-by: janEbert <janpublicebert@posteo.net>
Signed-off-by: janEbert <janEbert@users.noreply.github.com>

This PR is stale because it has been open for 14 days with no activity. Remove stale label or comment or update or this will be closed in 7 days.

@github-actions github-actions bot added the stale label Jun 13, 2024
@janEbert
Contributor Author

PTAL.

@github-actions github-actions bot removed the stale label Jul 26, 2024