Commit a66ae1e

fix incorrect sharding without zero

Edenzzzz committed Apr 2, 2024 · 1 parent e614aa3

Showing 1 changed file with 3 additions and 1 deletion.
colossalai/shardformer/shard/shard_config.py

@@ -74,7 +74,9 @@ def _turn_on_all_optimization(self):
         self.enable_fused_normalization = True
         self.enable_flash_attention = True
         self.enable_jit_fused = True
-        self.enable_sequence_parallelism = True
+        # This can cause non-in-place param sharding when used without ZeRO.
+        # It may also slow down training when the sequence length is small. Please enable it manually.
+        # self.enable_sequence_parallelism = True
         self.enable_sequence_overlap = True

     def _infer(self):
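
For reference, a minimal sketch of how sequence parallelism would now be enabled manually, since _turn_on_all_optimization no longer sets it. This assumes ShardConfig can be constructed as a dataclass with the fields named in the diff (enable_sequence_parallelism, enable_sequence_overlap) together with an enable_all_optimization flag; the exact constructor signature is an assumption, not part of this commit.

from colossalai.shardformer import ShardConfig

# Sketch under the assumptions above: sequence parallelism is no longer
# switched on by enable_all_optimization, so it is requested explicitly.
# Pairing it with ZeRO avoids the non-in-place param sharding noted above.
shard_config = ShardConfig(
    enable_all_optimization=True,      # still enables fused norm, flash attention, JIT fusion
    enable_sequence_parallelism=True,  # manual opt-in after this change
    enable_sequence_overlap=True,
)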
