I have an older NVIDIA GPU that is unfortunately not supported by PyTorch. As a result, I was unable to use tweetNLP: `torch.cuda.is_available()` returns True on my machine, so the library tries to use the GPU. One way I was able to work around this was by modifying `model.py` as follows:
```python
# GPU setup (https://github.com/cardiffnlp/tweetnlp/issues/15)
if use_gpu:
    if torch.cuda.is_available() and torch.cuda.device_count() > 0:
        self.device = torch.device('cuda')
        print('Note: Using GPU')
    elif hasattr(torch.backends, "mps") and torch.backends.mps.is_available() and torch.backends.mps.is_built():
        self.device = torch.device("mps")
        print('Note: Using MPS')
    else:
        self.device = torch.device('cpu')
        print('Note: Using CPU')
else:
    self.device = torch.device('cpu')
    print('Note: Using CPU')
```
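A possible alternative that avoids editing `model.py` at all (a sketch, not something tweetNLP documents): hide the CUDA devices from PyTorch by setting `CUDA_VISIBLE_DEVICES` to an empty string before `torch` is first imported, so that `torch.cuda.is_available()` returns False and the library falls back to CPU on its own.

```python
import os

# Hide all CUDA devices from PyTorch. This must run before torch is
# imported anywhere in the process, because torch reads the variable
# during CUDA initialization.
os.environ["CUDA_VISIBLE_DEVICES"] = ""

import torch

# With no visible CUDA devices, is_available() reports False, so any
# library that checks it (including tweetNLP) should select the CPU.
print(torch.cuda.is_available())
```

The same effect can be had from the shell (`CUDA_VISIBLE_DEVICES="" python script.py`), which may be more convenient when the import order is hard to control.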
Sharing this in case it helps someone else in a similar situation.
sbs