add skip if no cuda decorator to pg wrapper test (#649)
Summary:
Pull Request resolved: #649

This test needs a GPU.

Reviewed By: JKSenthil

Differential Revision: D51966203

fbshipit-source-id: 9f99711667f9265e72b85cba81159a34e220edd1
galrotem authored and facebook-github-bot committed Dec 8, 2023
1 parent c5d913c commit 6fd0945
Showing 1 changed file with 5 additions and 0 deletions.
tests/utils/test_distributed_gpu.py (5 additions, 0 deletions)
```diff
@@ -43,6 +43,11 @@ def _test_ddp_gather_uneven_tensors_multidim_nccl() -> None:
             assert val.shape == (idx + 1, 4 - idx)
             assert (val == 1).all()
+
+    @unittest.skipUnless(
+        condition=cuda_available,
+        reason="This test should only run on a GPU host.",
+    )
     @unittest.skipUnless(dist_available, reason="Torch distributed is needed to run")
     def test_pg_wrapper_scatter_object_list_nccl(self) -> None:
         spawn_multi_process(
             2,
```
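For illustration, the skip behavior added by this diff can be sketched with a minimal, self-contained test case. The flag values and the `PGWrapperGPUTest` class name below are hypothetical stand-ins: in the real file, `cuda_available` and `dist_available` are presumably derived from `torch.cuda` / `torch.distributed` availability checks, whereas here they are hard-coded to simulate a CPU-only host so the skip path is visible without a GPU.

```python
import unittest

# Hypothetical stand-ins for the repo's availability flags; hard-coded here to
# simulate a CPU-only host so the skip behavior can be observed anywhere.
cuda_available = False
dist_available = True


class PGWrapperGPUTest(unittest.TestCase):  # hypothetical class name
    @unittest.skipUnless(
        condition=cuda_available,
        reason="This test should only run on a GPU host.",
    )
    @unittest.skipUnless(dist_available, reason="Torch distributed is needed to run")
    def test_pg_wrapper_scatter_object_list_nccl(self) -> None:
        # On a CPU-only host this body never executes: unittest records the
        # test as skipped instead of letting it fail on a missing GPU.
        raise AssertionError("should have been skipped on a CPU-only host")


# Run the suite programmatically to show the test is skipped, not failed.
suite = unittest.TestLoader().loadTestsFromTestCase(PGWrapperGPUTest)
outcome = unittest.TextTestRunner(verbosity=0).run(suite)
```

With `cuda_available = False`, the first `skipUnless` condition is falsy, so the runner reports one skipped test and the suite still passes; this is exactly why the decorator prevents spurious failures on non-GPU CI hosts.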
