
Describe a bug #3

Closed
telxt opened this issue Feb 25, 2022 · 2 comments
telxt commented Feb 25, 2022

# Describe the bug
When I try to run example_scripts/finetune_boolq.sh, the following error is raised:

Traceback (most recent call last):
  File "/mnt/sfs_turbo/lxt/CrossFit/cli_singletask.py", line 147, in <module>
    main()
  File "/mnt/sfs_turbo/lxt/CrossFit/cli_singletask.py", line 144, in main
    run(args, logger)
  File "/mnt/sfs_turbo/lxt/CrossFit/run_singletask.py", line 84, in run
    best_dev_performance, best_model_state_dict = train(args, logger, model, train_data, dev_data, optimizer, scheduler)
  File "/mnt/sfs_turbo/lxt/CrossFit/run_singletask.py", line 175, in train
    curr_performance = inference(model if args.n_gpu==1 else model.module, dev_data)
  File "/mnt/sfs_turbo/lxt/CrossFit/run_singletask.py", line 219, in inference
    outputs = model.generate(input_ids=batch[0],
  File "/mnt/sfs_turbo/zhangshudan/anaconda3/envs/opendelta_dev/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 28, in decorate_context
    return func(*args, **kwargs)
  File "/mnt/sfs_turbo/zhangshudan/anaconda3/envs/opendelta_dev/lib/python3.8/site-packages/transformers/generation_utils.py", line 1053, in generate
    return self.beam_search(
  File "/mnt/sfs_turbo/zhangshudan/anaconda3/envs/opendelta_dev/lib/python3.8/site-packages/transformers/generation_utils.py", line 1787, in beam_search
    outputs = self(
  File "/mnt/sfs_turbo/zhangshudan/anaconda3/envs/opendelta_dev/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1051, in _call_impl
    return forward_call(*input, **kwargs)
TypeError: forward() got an unexpected keyword argument 'past_key_values'

# System information

  • Python version: 3.8.2
  • Transformers version: 4.10.0
cherry979988 (Member) commented
Thank you for raising this issue. I am looking into this problem!

cherry979988 (Member) commented
This is probably due to a transformers library upgrade: the newer version passes input variables (such as past_key_values) to forward() that the old signature does not accept.

Solution 1: Use the transformers version stated in the README file. (But it's a pretty outdated version 😢)
Solution 2: Replace the bart.py file with this new one (https://github.com/INK-USC/CrossFit/blob/676f801d7cc2c431ddd0e21b9593183d8e95f580/bart.py). The **model_kwargs parameter will automatically absorb new input variables introduced by newer transformers versions. I have tested this with Python 3.6.9, transformers 4.10.0, and torch 1.7.1. I hope it solves your issue.
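For anyone curious why the `**model_kwargs` change fixes the `TypeError`, here is a minimal sketch (the class names `OldForward` and `PatchedForward` are illustrative, not from the repository): a fixed keyword list rejects any argument a newer generation loop starts passing, while a `**model_kwargs` catch-all silently absorbs it.

```python
class OldForward:
    """Mimics the old bart.py: forward() has a fixed keyword list,
    so an unexpected keyword argument raises a TypeError."""
    def forward(self, input_ids, attention_mask=None):
        return input_ids

class PatchedForward:
    """Mimics the patched bart.py: **model_kwargs absorbs any extra
    keyword arguments that a newer transformers version passes in."""
    def forward(self, input_ids, attention_mask=None, **model_kwargs):
        return input_ids

# Newer generation code passes past_key_values into forward():
try:
    OldForward().forward(input_ids=[0], past_key_values=None)
except TypeError as e:
    print("old signature fails:", e)

# The patched signature accepts (and ignores) the new argument:
print("patched signature works:",
      PatchedForward().forward(input_ids=[0], past_key_values=None))
```

The same idea applies to any library whose caller may grow new keyword arguments across versions: a `**kwargs` catch-all trades strict argument checking for forward compatibility.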
