
AttributeError: module 'torch.distributed' has no attribute 'is_initialized' #37

Closed
Mohd-Misran opened this issue Jul 30, 2020 · 6 comments



Mohd-Misran commented Jul 30, 2020

I loaded the crf-con-en model into the Parser and then called the predict() function on a list of English tokens, but I get the error in the title. Any help on how to debug this would be appreciated!

Here's my code:

from supar import Parser
import nltk

parser = Parser.load("crf-con-en")

text = nltk.word_tokenize("John, who is the CEO of a company, played golf.")

output = parser.predict(data=[text], verbose=False)

Complete error log:

AttributeError Traceback (most recent call last)
in
----> 1 output = parser.predict(data=[text], verbose=False)

~\anaconda3\lib\site-packages\supar\parsers\crf_constituency.py in predict(self, data, pred, buckets, batch_size, prob, mbr, verbose, **kwargs)
121 """
122
--> 123 return super().predict(**Config().update(locals()))
124
125 def _train(self, loader):

~\anaconda3\lib\site-packages\supar\parsers\parser.py in predict(self, data, pred, buckets, batch_size, prob, **kwargs)
120 def predict(self, data, pred=None, buckets=8, batch_size=5000, prob=False, **kwargs):
121 args = self.args.update(locals())
--> 122 init_logger(logger, verbose=args.verbose)
123
124 self.transform.eval()

~\anaconda3\lib\site-packages\supar\utils\logging.py in init_logger(logger, path, mode, level, handlers, verbose)
28 level=level,
29 handlers=handlers)
---> 30 logger.setLevel(logging.INFO if is_master() and verbose else logging.WARNING)
31
32

~\anaconda3\lib\site-packages\supar\utils\parallel.py in is_master()
33
34 def is_master():
---> 35 return not dist.is_initialized() or dist.get_rank() == 0

AttributeError: module 'torch.distributed' has no attribute 'is_initialized'

yzhangcs (Owner) commented Jul 30, 2020 via email

Mohd-Misran (Author) commented:

@yzhangcs

Torch version 1.5.1
Windows 10 Pro

yzhangcs (Owner) commented Jul 31, 2020

@Mohd-Misran You could check if torch.distributed.is_available() is True (facebookresearch/maskrcnn-benchmark#280).
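To illustrate the suggested check (a minimal sketch, not from the thread itself): `is_available()` is present on every torch build and reports whether the build was compiled with distributed support. When it returns False, other attributes of `torch.distributed` such as `is_initialized()` may not exist at all, which is exactly the AttributeError above.

```python
import torch.distributed as dist

# is_available() exists on all torch builds; it reports whether this
# build was compiled with distributed support. When it is False, other
# attributes (is_initialized, get_rank, ...) may be missing entirely,
# so any code that calls them unconditionally will crash.
if dist.is_available():
    print("distributed support: yes")
else:
    print("distributed support: no (supar's is_master() will raise AttributeError)")
```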

Mohd-Misran (Author) commented:

@yzhangcs
torch.distributed.is_available() is False

yzhangcs (Owner) commented:

@Mohd-Misran It seems that distributed training is not supported on Windows.
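Until an upgrade is possible, one workaround (my own sketch, not part of supar) is to guard the `dist.is_initialized()` call in `supar/utils/parallel.py` with a `dist.is_available()` check, so that single-process use on a build without distributed support falls back to treating the process as master:

```python
import torch.distributed as dist

def is_master():
    # Guarded variant of supar's is_master() from supar/utils/parallel.py.
    # The original calls dist.is_initialized() unconditionally, which raises
    # AttributeError on builds where dist.is_available() is False (e.g. the
    # pre-1.7 Windows wheels). Checking availability first makes any
    # non-distributed run behave as the single master process.
    if not dist.is_available() or not dist.is_initialized():
        return True
    return dist.get_rank() == 0
```

In a plain single-process session (no process group initialized) this returns True, matching the original function's behaviour on Linux.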

yzhangcs closed this as completed Aug 6, 2020
yzhangcs (Owner) commented Oct 27, 2020

@Mohd-Misran DDP (DistributedDataParallel) is supported on Windows since torch 1.7.
