
Issue when running interact script #109

Closed
harrystuart opened this issue Jun 27, 2021 · 1 comment

@harrystuart

Hi there,

I am trying to run the interact script and am getting the following behaviour:

(logenv) C:\Users\Harry\source\repos\HuggingFace>python interact.py
2021-06-27 23:37:29.687306: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library cudart64_101.dll
INFO:interact.py:Namespace(dataset_cache='./dataset_cache', dataset_path='', device='cpu', max_history=2, max_length=20, min_length=1, model='openai-gpt', model_checkpoint='', no_sample=False, seed=0, temperature=0.7, top_k=0, top_p=0.9)
INFO:C:\Users\Harry\source\repos\HuggingFace\utils.py:extracting archive file C:\Users\Harry/.cache\huggingface\transformers\2f5114b5eb72f9515802779c42c1b289bebdb1cfc8ce94c653237518eb530b34.75f2a4fe69178ff43138117a977e107a5fc7d402603a0825a296b531f246b3f2 to temp dir C:\Users\Harry\AppData\Local\Temp\tmpd181v50y
INFO:interact.py:Get pretrained model and tokenizer
ftfy or spacy is not installed using BERT BasicTokenizer instead of SpaCy & ftfy.
Some weights of the model checkpoint at C:\Users\Harry\AppData\Local\Temp\tmpd181v50y were not used when initializing OpenAIGPTLMHeadModel: ['multiple_choice_head.summary.bias', 'multiple_choice_head.summary.weight']
- This IS expected if you are initializing OpenAIGPTLMHeadModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing OpenAIGPTLMHeadModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
INFO:interact.py:Sample a personality
INFO:C:\Users\Harry\source\repos\HuggingFace\utils.py:Load tokenized dataset from cache at ./dataset_cache_OpenAIGPTTokenizer
INFO:interact.py:Selected personality: i never have had alcohol in my life. i have a girlfriend of 7 years. i watch every football game at alabama. i currently suffer from social anxiety. i'm a geology major at alabama university.
>>> hey there
Traceback (most recent call last):
  File "interact.py", line 154, in <module>
    run()
  File "interact.py", line 146, in run
    out_ids = sample_sequence(personality, history, tokenizer, model, args)
  File "interact.py", line 72, in sample_sequence
    logits = logits[0, -1, :] / args.temperature
  File "C:\Users\Harry\Anaconda3\envs\logenv\lib\site-packages\transformers\file_utils.py", line 1808, in __getitem__
    return self.to_tuple()[k]
TypeError: tuple indices must be integers or slices, not tuple

(logenv) C:\Users\Harry\source\repos\HuggingFace>

Any ideas on a fix?
Thanks

@harrystuart (Author)

The fix was to downgrade the transformers version to the one indicated in the README.
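
For anyone who would rather not downgrade: from transformers v4 onward the model forward pass returns a ModelOutput object instead of a plain tuple, so tuple-style indexing such as logits[0, -1, :] in sample_sequence raises the TypeError shown above. A minimal sketch of a code-side workaround (assuming the model call in interact.py looks roughly like the line referenced in the traceback; this is not the project's official fix):

# inside sample_sequence() in interact.py (hypothetical patch)
outputs = model(input_ids, token_type_ids=token_type_ids)
# Newer transformers versions return a ModelOutput with a .logits attribute;
# older versions return a tuple whose first element holds the prediction scores.
logits = outputs.logits if hasattr(outputs, "logits") else outputs[0]
logits = logits[0, -1, :] / args.temperature

Passing return_dict=False to the forward call should also restore the old tuple behaviour on recent versions, but pinning transformers to the version in the README, as above, is the simplest route.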
