
Tensor-parallel communication overlap with userbuffer backend #6362

Merged
merged 6 commits into NVIDIA:r1.17.0_pt_23.04 from slym/tp_overlap
Apr 18, 2023

Conversation

erhoo82
Collaborator

@erhoo82 erhoo82 commented Apr 4, 2023

What does this PR do?

Add (1) interfaces to TE and (2) initialize the process-group settings needed to support tensor-parallel communication overlap with the userbuffer backend.

Changelog

  • Add specific line-by-line info of high-level changes in this PR.

Usage

Set `ub_tp_comm_overlap` to `True` in the model configuration; a minimal config sketch follows.
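
A minimal sketch of the relevant config entries: the `ub_tp_comm_overlap*` keys come from this PR, while the `model:` nesting, the `sequence_parallel` key, and the parallelism values are assumptions for illustration, not taken from the merged config.

model:
  tensor_model_parallel_size: 2   # assumed example value
  sequence_parallel: True         # assumed key; the overlap is only used with sequence-parallelism per the log message below
  ub_tp_comm_overlap: True        # enable TP communication overlap with the userbuffer backend
  ub_tp_comm_overlap_cfg: null    # optional yaml file with custom userbuffer communicator settings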

Before your PR is "Ready for review"

Pre checks:

  • Make sure you read and followed Contributor guidelines
  • Did you write any new necessary tests?
  • Did you add or update any necessary documentation?
  • Does the PR affect components that are optional to install? (Ex: Numba, Pynini, Apex etc)
    • Reviewer: Does the PR have correct import guards for all optional libraries?

PR Type:

  • New Feature
  • Bugfix
  • Documentation

If you haven't finished some of the above items, you can still open a "Draft" PR.

Who can review?

Anyone in the community is free to review the PR once the checks have passed.
The Contributor guidelines contain specific people who can review PRs in various areas.

Additional Information

  • Related to # (issue)

@github-actions github-actions bot added the NLP label Apr 4, 2023
@erhoo82 erhoo82 requested a review from ericharper April 6, 2023 17:48
@erhoo82 erhoo82 force-pushed the slym/tp_overlap branch 2 times, most recently from b81c170 to 609d0b1 on April 6, 2023 17:55
@erhoo82 erhoo82 changed the base branch from main to r1.17.0_pt_23.04 April 6, 2023 17:58
@okuchaiev okuchaiev requested a review from ksivaman April 10, 2023 23:05
Comment on lines +171 to +174
ub_tp_comm_overlap_cfg: null
# A yaml file with userbuffer communicator configurations. This file should provide `method`, `dtype`, `num_sm`,
# `num_splits`, `cga_size`, `set_sm_margin`, and `aggregate` for the communicators to use custom settings.
# If the configuration file is not provided, a default setting is used for all communicators.
Collaborator

Why is this a yaml file? Should it just be

ub_tp_comm_overlap_cfg:
  method: blah
  dtype: blah
  ...

?

Member

The two choices were either a config object or a config file, since there are too many UB-related args.
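
For reference, a hedged sketch of what such a config file might contain, using only the keys listed in the diff comment above; the per-communicator nesting, the communicator name, and all values are illustrative assumptions, not the merged format.

# hypothetical contents of the file passed via ub_tp_comm_overlap_cfg
qkv_fprop:                 # hypothetical communicator name
  method: ring_exchange    # placeholder value
  dtype: bf16
  num_sm: 2
  num_splits: 4
  cga_size: 2
  set_sm_margin: 0
  aggregate: 0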

Comment on lines +518 to +520
logging.info(
    "Userbuffer tensor-parallel communication overlap is available with both Transformer Engine and sequence-parallelism."
)
Collaborator

This should be .warning, and maybe add ".. only available with ... Setting ub_tp_comm_overlap to True".

Comment on lines +333 to +357
# Initialize userbuffer communicators. Initialization is done only once at the
# beginning of the first training step.
if self.initialize_ub:
    input_shape = [
        self.cfg.get('encoder_seq_length') * self.cfg.get('micro_batch_size'),
        self.cfg.get('hidden_size'),
    ]
    ub_cfg_file_name = self.cfg.get('ub_tp_comm_overlap_cfg', None)
    if ub_cfg_file_name is not None:
        try:
            import yaml

            with open(ub_cfg_file_name, 'r') as ub_cfg_file:
                ub_cfgs = yaml.safe_load(ub_cfg_file)
        except (ImportError, TypeError):
            print("Fail to read ub_tp_comm_overlap config file.")
    else:
        ub_cfgs = None
    te_module.initialize_ub(
        shape=input_shape,
        tp_size=self.cfg.get('tensor_model_parallel_size'),
        use_fp8=self.cfg.get('fp8'),
        ub_cfgs=ub_cfgs,
    )
    self.initialize_ub = False
Collaborator

@ericharper ericharper Apr 11, 2023

Can this go in the .setup method then? (since it is only called once at the beginning of training)

Collaborator

Also, can we make it a private method and then call it?

Member

To add: I'm not a fan of how we're importing the whole file from TE (which isn't part of the API).
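
For illustration, a minimal sketch of the refactor the reviewers are suggesting, reusing the calls from the quoted diff; the helper name, the setup() signature, and the final config-flag check are assumptions, not the merged implementation, and te_module refers to the TE import used in the diff above.

def _initialize_ub_communicators(self):
    # Flattened (sequence length * micro-batch size, hidden size) shape passed to TE.
    input_shape = [
        self.cfg.get('encoder_seq_length') * self.cfg.get('micro_batch_size'),
        self.cfg.get('hidden_size'),
    ]
    ub_cfgs = None
    ub_cfg_file_name = self.cfg.get('ub_tp_comm_overlap_cfg', None)
    if ub_cfg_file_name is not None:
        import yaml

        with open(ub_cfg_file_name, 'r') as ub_cfg_file:
            ub_cfgs = yaml.safe_load(ub_cfg_file)
    te_module.initialize_ub(
        shape=input_shape,
        tp_size=self.cfg.get('tensor_model_parallel_size'),
        use_fp8=self.cfg.get('fp8'),
        ub_cfgs=ub_cfgs,
    )

def setup(self, stage=None):
    ...
    # Runs once before training starts, so the per-step self.initialize_ub flag is no longer needed.
    if self.cfg.get('ub_tp_comm_overlap', False):
        self._initialize_ub_communicators()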

Collaborator

@ericharper ericharper left a comment

Overall looks good, just a few minor comments.

Also, do we need new apex and te commits for this PR?

Signed-off-by: Sangkug Lym <slym@nvidia.com>
@github-actions github-actions bot added the CI label Apr 15, 2023
@erhoo82 erhoo82 changed the title from "Draft: Tensor-parallel communication overlap with userbuffer backend" to "Tensor-parallel communication overlap with userbuffer backend" Apr 15, 2023
Collaborator

@ericharper ericharper left a comment

LGTM. Thanks!

@ericharper ericharper merged commit 68dadb9 into NVIDIA:r1.17.0_pt_23.04 Apr 18, 2023
github-actions bot pushed a commit that referenced this pull request Apr 18, 2023
* add interfaces for tp_communication overlap

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Interface to provide custom userbuffer communicator settings by yaml file

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Jenkinsfile

Signed-off-by: Sangkug Lym <slym@nvidia.com>

---------

Signed-off-by: Sangkug Lym <slym@nvidia.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Eric Harper <complex451@gmail.com>
@erhoo82 erhoo82 deleted the slym/tp_overlap branch December 9, 2023 05:49