RuntimeError: inconsistent tensor size #1
A couple of follow-up questions:
I encountered the same problem as maydaygmail.
I debugged the code, and the length returned from this line "start_idx, end_idx, length = self.sentence_id[seq_id]" is 0.
I pushed a fix for the issue. The size of the SID tensor should be the number of sentences in the corpus; it was mistakenly set to the number of words in the corpus.
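To make the fix concrete, here is a minimal sketch of building such a per-sentence index: one row of (start_idx, end_idx, length) per sentence, so the index has as many rows as the corpus has sentences, not words. The function name and the use of plain Python lists are illustrative, not the repository's actual code.

```python
def build_sentence_index(sentence_lengths):
    """Return one (start_idx, end_idx, length) row per sentence.

    start_idx/end_idx are offsets into the flat token tensor; the number
    of rows equals the number of sentences, not the number of tokens.
    """
    index = []
    start = 0
    for length in sentence_lengths:
        end = start + length
        index.append((start, end, length))
        start = end
    return index

# Example: a corpus of 3 sentences with 4, 2, and 5 tokens.
rows = build_sentence_index([4, 2, 5])
# rows == [(0, 4, 4), (4, 6, 2), (6, 11, 5)] -- 3 rows for 3 sentences
```

With a correctly sized and filled index, `self.sentence_id[seq_id]` returns a valid (start, end, length) triple for every sentence id.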
This issue was fixed after I pulled the newest code.
The settings for the projection matrix didn't work well. I ran a quick test to check everything, and the model reaches 63.44 perplexity after the first epoch.
The size of the projection matrix alone doesn't seem to explain the PPL dropping so quickly. Can you give more details on the mistake in the process_gbw script?
I forgot to fill the sid tensor, so the start_idx and length values are random. |
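A small sketch (with hypothetical stand-in values, using plain lists rather than torch tensors) of why an unfilled sid row leads to the copy failing: the destination window has a batch-determined length, while the corpus slice taken with zeroed (or random) indices has a different length, so the slice assignment cannot match up.

```python
corpus = list(range(100))               # stand-in for the flat token tensor
start_idx, end_idx, length = 0, 0, 0    # sid row that was never filled in
dest_len = 19                           # length the batch window expects

src = corpus[start_idx:end_idx]         # empty slice, length 0
assert len(src) != dest_len             # mismatched lengths: in PyTorch this
                                        # slice assignment raises RuntimeError
```

This matches the debugging report above, where the length read from `self.sentence_id[seq_id]` was 0.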
I have a problem:
load word frequency mapping - complete
loaded tensor torch.Size([798949912])
loaded tensor torch.Size([798949912, 3])
#sentences 798949912
load train data - complete
#sentences 6073
load test data - complete
Traceback (most recent call last):
File "main.py", line 195, in <module>
train()
File "main.py", line 157, in train
for batch, item in enumerate(train_loader):
File "/home/xxxx/PyTorch_LM/lm/fast_gbw.py", line 89, in batch_generator
tracker_list[idx] = self.add(seq_length, source, target, idx, tracker)
File "/home/xxxx/lm/PyTorch_LM/lm/fast_gbw.py", line 124, in add
source[curr:batch_end, batch_idx] = self.corpus[seq_start:seq_end]
RuntimeError: inconsistent tensor size, expected tensor [19] and src [798949911] to have the same number of elements, but got 19 and 798949911 elements respectively at /pytorch/torch/lib/TH/generic/THTensorCopy.c:86
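The error reports a source slice of 798,949,911 elements, i.e. the indices read from the sid tensor span nearly the entire corpus. A defensive check before the failing copy would surface the bad indices with a clearer message; the sketch below is a hypothetical helper modeled on the assignment in `add()`, shown with plain lists rather than the actual tensors.

```python
def checked_copy(source_col, curr, batch_end, corpus, seq_start, seq_end):
    """Copy corpus[seq_start:seq_end] into source_col[curr:batch_end],
    failing early with a clear message on a length mismatch."""
    dest_len = batch_end - curr
    src_len = seq_end - seq_start
    if dest_len != src_len:
        raise ValueError(
            f"length mismatch: dest window {dest_len} vs corpus slice {src_len}"
        )
    source_col[curr:batch_end] = corpus[seq_start:seq_end]
    return source_col

# Well-formed indices copy cleanly:
col = checked_copy([0] * 5, 1, 4, list(range(10)), 2, 5)
# col == [0, 2, 3, 4, 0]
```

With garbage indices from an unfilled sid tensor, the helper raises ValueError instead of letting torch fail deep inside the slice assignment.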