Fix a bug in DeepSpeedMLP (microsoft#4389)
Co-authored-by: Logan Adams <114770087+loadams@users.noreply.github.com>
2 people authored and amaurya committed Oct 9, 2023
1 parent 471dc0c commit 4e9a045
Showing 1 changed file with 2 additions and 2 deletions.

deepspeed/ops/transformer/inference/ds_mlp.py
@@ -20,8 +20,8 @@ def __init__(self, config, mp_group=None, q_scales=None, q_groups=1, merge_count

         self.config = config

-        data_type = torch.half if self.config.dtype == torch.int8 else self.config.dtype
-        data_type_fp = data_type
+        data_type = torch.int8 if self.config.dtype == torch.int8 else self.config.dtype
+        data_type_fp = torch.half if self.config.dtype == torch.int8 else self.config.dtype
         device = get_accelerator().current_device_name()

         proj_factor = 2 if self.config.mlp_act_func_type in GATED_ACTIVATION_TYPES else 1
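The fix changes which dtypes the MLP derives from its config when int8 quantization is active: previously both `data_type` and `data_type_fp` collapsed to `torch.half`, whereas after the fix the quantized tensors keep `torch.int8` and only the floating-point companion dtype falls back to `torch.half`. A minimal standalone sketch of the corrected selection logic (the helper name `select_mlp_dtypes` is hypothetical, not part of DeepSpeed):

```python
import torch

def select_mlp_dtypes(config_dtype):
    # Hypothetical helper mirroring the corrected lines in DeepSpeedMLP.__init__.
    # With int8 quantization, weights stay int8 while the fp-side dtype
    # (used for non-quantized parameters) falls back to half precision;
    # for any other configured dtype, both follow the config unchanged.
    data_type = torch.int8 if config_dtype == torch.int8 else config_dtype
    data_type_fp = torch.half if config_dtype == torch.int8 else config_dtype
    return data_type, data_type_fp
```

With `torch.int8` this yields `(torch.int8, torch.half)`; with `torch.float16` or `torch.float32` both returned dtypes equal the configured one, matching the pre-fix behavior for non-quantized paths.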
