
Quick start - basic CMD fails #2

Closed
yonatanbitton opened this issue Nov 24, 2020 · 1 comment

Comments


yonatanbitton commented Nov 24, 2020

Hey.
I created a sample.txt file containing the short text "I burst through the cabin doors" and ran the basic command from the quick start:

(venv) (base) [ssmba]$ cat sample.txt 
I burst through the cabin doors
(venv) (base) [ssmba]$ python ssmba.py --model bert-base-uncased --in-file sample.txt --output-prefix ssmba_out --noise-prob 0.25 --num-samples 8 --topk 10
/../ssmba/venv/lib/python3.6/site-packages/transformers/modeling_auto.py:837: FutureWarning: The class `AutoModelWithLMHead` is deprecated and will be removed in a future version. Please use `AutoModelForCausalLM` for causal language models, `AutoModelForMaskedLM` for masked language models and `AutoModelForSeq2SeqLM` for encoder-decoder models.
  FutureWarning,
Some weights of the model checkpoint at bert-base-uncased were not used when initializing BertForMaskedLM: ['cls.seq_relationship.weight', 'cls.seq_relationship.bias']
- This IS expected if you are initializing BertForMaskedLM from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing BertForMaskedLM from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
/data/users/yonatab/ssmba/venv/lib/python3.6/site-packages/transformers/modeling_bert.py:1152: FutureWarning: The `masked_lm_labels` argument is deprecated and will be removed in a future version, use `labels` instead.
  FutureWarning,
Traceback (most recent call last):
  File "ssmba.py", line 259, in <module>
    gen_neighborhood(args)
  File "ssmba.py", line 155, in gen_neighborhood
    rec, rec_masks = hf_reconstruction_prob_tok(toks, masks, tokenizer, r_model, softmax_mask, reconstruct=True, topk=args.topk)
  File "/../ssmba/utils.py", line 105, in hf_reconstruction_prob_tok
    l[softmax_mask] = float('-inf')
RuntimeError: Output 0 of UnbindBackward is a view and is being modified inplace. This view is the output of a function that returns multiple views. Such functions do not allow the output views to be modified inplace. You should replace the inplace operation by an out-of-place one.

I've installed the latest torch and transformers for cuda 10.1:

>>> torch.__version__
'1.7.0+cu101'
>>> transformers.__version__
'3.5.1'
>>> 

What am I missing? Thanks.

nng555 (Owner) commented Nov 24, 2020

Looks like I hadn't updated the code for compatibility with the latest torch and transformers versions in a while. I've just pushed a fix for this in-place modification issue and will clean up the deprecation warnings later. The sample command you ran should work now.
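For context, the error comes from assigning into a view: recent torch versions forbid in-place writes to the views returned by multi-view ops like `unbind()` when autograd is tracking them. A minimal sketch of the kind of out-of-place replacement involved (illustrative only, using dummy tensors; not the exact ssmba patch):

```python
import torch

# Dummy (batch, seq, vocab)-shaped logits and a vocab-level mask,
# standing in for the model outputs and softmax_mask in utils.py.
logits = torch.randn(2, 5, 10)
softmax_mask = torch.zeros(10, dtype=torch.bool)
softmax_mask[:3] = True  # positions to exclude from the softmax

# In-place assignment into an unbind() view is what raises
# "Output 0 of UnbindBackward is a view and is being modified inplace"
# when gradients are being tracked:
#   for l in torch.unbind(logits):
#       l[:, softmax_mask] = float('-inf')
#
# Out-of-place alternative: build a new tensor with masked_fill,
# which broadcasts the mask over the leading dimensions.
masked = logits.masked_fill(softmax_mask, float('-inf'))
```

Masked positions become -inf (so they vanish under softmax) while all other entries are left untouched, and no view is ever written in place.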

nng555 closed this as completed Nov 24, 2020