Make numpy optional dependency for torch.cuda.amp (#48154)
Summary: Pull Request resolved: #48154

Test Plan:
Uninstall `numpy` and try importing `torch`

Discovered while working on #48145

Reviewed By: walterddr

Differential Revision: D25046307

Pulled By: malfet

fbshipit-source-id: c1171a49e03bdc40e8dc1d65928c6c12626e33db
malfet authored and facebook-github-bot committed Nov 18, 2020
1 parent e2b4c63 commit 1454cbf
Showing 1 changed file: torch/cuda/amp/autocast_mode.py (5 additions, 2 deletions)
@@ -1,7 +1,10 @@
 import torch
 import functools
 import warnings
-import numpy as np
+try:
+    import numpy as np
+except ModuleNotFoundError:
+    np = None
 from torch._six import container_abcs, string_classes


@@ -144,7 +147,7 @@ def _cast(value, dtype):
         return value.to(dtype) if is_eligible else value
     elif isinstance(value, string_classes):
         return value
-    elif isinstance(value, np.ndarray):
+    elif np is not None and isinstance(value, np.ndarray):
         return value
     elif isinstance(value, container_abcs.Mapping):
         return {_cast(k, dtype): _cast(v, dtype) for k, v in value.items()}
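
For reference, the guarded-import pattern introduced by this diff, shown in isolation as a minimal sketch. The `describe` helper below is illustrative only and is not part of torch.cuda.amp:

# Minimal sketch of the optional-import guard used in this commit.
# The describe() helper is hypothetical; only the try/except and the
# `np is not None` check mirror the actual change.
try:
    import numpy as np
except ModuleNotFoundError:
    np = None  # the importing module must still load when numpy is absent

def describe(value):
    # Only reference numpy types when numpy is actually available,
    # mirroring the `np is not None and isinstance(...)` check in _cast.
    if np is not None and isinstance(value, np.ndarray):
        return "ndarray of shape {}".format(value.shape)
    return "object of type {}".format(type(value).__name__)

With this guard, the ndarray branch simply evaluates to False when numpy is missing, so `_cast` falls through to the remaining isinstance checks instead of raising at import time.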
