
[Op] Fuse bias+dropout in FusedMLP #73

Merged
comaniac merged 11 commits into awslabs:main from comaniac:fused_bias_dropout on Mar 3, 2023

Conversation

@comaniac
Contributor

Description

  1. Fuse bias and dropout in op.FusedMLP (a sketch of the pattern follows this list).
  2. Add unit tests for op.FusedMLP.
  3. Correct random states in GPT schedules when sequence parallelism is enabled.
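
For context, here is a minimal sketch of the pattern being fused (illustrative names and signatures, not the PR's actual code): the second linear's bias add is deferred out of `F.linear` so that the bias add and the following dropout form a single elementwise pattern a fuser can compile into one kernel.

```python
import torch
import torch.nn.functional as F

def fused_mlp_forward(x, w1, b1, w2, b2, p=0.1, training=True):
    # First linear keeps its bias; the activation follows as usual.
    h = F.gelu(F.linear(x, w1, b1))
    # The second matmul runs without its bias epilogue...
    out = F.linear(h, w2)
    # ...so the bias add and dropout form one fusible elementwise pattern.
    return F.dropout(out + b2, p=p, training=training)
```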

Checklist

  • PR's title starts with a category (e.g. [Bugfix], [Model], [Tutorial], etc)
  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage
  • Code is well-documented

cc @chhzh123 @szhengac

@comaniac
Contributor Author

comaniac commented Mar 1, 2023

Based on the comments, I removed the fused bias modules and instead created LinearWithAct and LinearWithDropout, both deriving from nn.Linear. In addition, I enabled memory_efficient_fusion for bias+dropout when it is available, and added a flag to switch to TorchScript in the unit tests.
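
A rough sketch of that design (illustrative only, not the PR's actual code; the constructor flag and helper names here are assumptions, and the `functools.partial` trick for baking in non-tensor arguments is one way to keep the traced function tensor-only):

```python
import functools
import torch
import torch.nn as nn
import torch.nn.functional as F

try:
    from functorch.compile import memory_efficient_fusion
except ImportError:
    memory_efficient_fusion = None

def _bias_dropout(x: torch.Tensor, bias: torch.Tensor, p: float, training: bool) -> torch.Tensor:
    # The two elementwise ops to fuse: bias add followed by dropout.
    return F.dropout(x + bias, p=p, training=training)

class LinearWithDropout(nn.Linear):
    """nn.Linear that defers its bias add so it can be fused with dropout."""

    def __init__(self, in_features, out_features, p=0.1, use_torchscript=False):
        super().__init__(in_features, out_features, bias=True)
        self.p = p
        if memory_efficient_fusion is not None and not use_torchscript:
            # Bake the non-tensor args in so the traced function sees only tensors.
            self._fused = memory_efficient_fusion(
                functools.partial(_bias_dropout, p=p, training=True)
            )
        else:
            # TorchScript fallback, selected by a flag as in the unit tests.
            scripted = torch.jit.script(_bias_dropout)
            self._fused = lambda out, bias: scripted(out, bias, p, True)

    def forward(self, x):
        out = F.linear(x, self.weight)  # matmul without the bias epilogue
        if self.training:
            return self._fused(out, self.bias)
        # Dropout is the identity in eval mode, so just add the bias.
        return out + self.bias
```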

@comaniac comaniac merged commit 87be1d7 into awslabs:main Mar 3, 2023
@comaniac
Contributor Author

comaniac commented Mar 3, 2023

Thanks @chhzh123 @szhengac

@comaniac comaniac deleted the fused_bias_dropout branch March 3, 2023 07:43
