
Add tensor methods in flops counting and separate macs and flops#1591

Merged
jeffra merged 12 commits into master from profiler-add-tensor-ops on Dec 14, 2021

Conversation

@cli99 (Contributor) commented Nov 24, 2021

This PR adds support to count flops for the following operations:

  1. PyTorch tensor operations such as torch.matmul, torch.mm, torch.addmm, torch.bmm, etc., and the corresponding torch.Tensor.xxx methods
  2. other activations, e.g. sigmoid, silu, and gelu
  3. other norms

This PR also separates MACs and flops counting: a multiply-accumulate (MAC) operation is counted as 2 flops, while other operations are counted as 1. FLOPS (floating-point operations per second) is then computed from the flops count.
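As a minimal sketch of the counting convention described above (the function name and shapes are illustrative, not DeepSpeed API): for an m×k by k×n matrix multiplication, each output element requires k multiply-accumulates, and each MAC counts as 2 flops.

```python
def matmul_macs_flops(m, k, n):
    """MACs and flops for an (m x k) @ (k x n) matrix multiplication.

    Each of the m*n output elements needs k multiply-accumulate
    operations; under the convention in this PR, one MAC = 2 flops.
    """
    macs = m * k * n
    flops = 2 * macs
    return macs, flops

# Example: a 4x8 matrix times an 8x16 matrix
macs, flops = matmul_macs_flops(4, 8, 16)
print(macs, flops)  # 512 1024
```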

@jeffra jeffra enabled auto-merge (squash) December 14, 2021 16:16
@jeffra jeffra merged commit 082f392 into master Dec 14, 2021
B06901052 pushed a commit to B06901052/DeepSpeed that referenced this pull request Apr 14, 2022
@mrwyattii mrwyattii deleted the profiler-add-tensor-ops branch July 7, 2023 02:40
@BitCalSaul commented:

x @ y is still not counted as flops. I went through each line of the Transformer code and found that the attention operation was not counted in the flops total.
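One possible explanation for the comment above, sketched with a stand-in class rather than real DeepSpeed or PyTorch code: patching only a module-level function (analogous to torch.matmul) does not intercept the @ operator, because Python dispatches @ directly to the type's __matmul__ slot.

```python
class FakeTensor:
    """Stand-in for a tensor type; not part of DeepSpeed or PyTorch."""
    def __init__(self, value):
        self.value = value

    def __matmul__(self, other):
        # "x @ y" resolves here directly, at the type level.
        return FakeTensor(self.value * other.value)

def matmul(a, b):
    # Module-level functional API, analogous to torch.matmul.
    return a @ b

calls = {"matmul": 0}
_orig_matmul = matmul

def counted_matmul(a, b):
    calls["matmul"] += 1
    return _orig_matmul(a, b)

matmul = counted_matmul  # patch only the functional entry point

x, y = FakeTensor(2), FakeTensor(3)
result = x @ y           # never goes through the patched function
print(calls["matmul"])   # 0: operator calls are missed by this patch
```

Calling the functional form, matmul(x, y), would increment the counter; the operator form bypasses it entirely, which is consistent with x @ y going uncounted.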

