# Describe the bug
When I try to run `example_scripts/finetune_boolq.sh`, the following error is raised:
```
Traceback (most recent call last):
  File "/mnt/sfs_turbo/lxt/CrossFit/cli_singletask.py", line 147, in <module>
    main()
  File "/mnt/sfs_turbo/lxt/CrossFit/cli_singletask.py", line 144, in main
    run(args, logger)
  File "/mnt/sfs_turbo/lxt/CrossFit/run_singletask.py", line 84, in run
    best_dev_performance, best_model_state_dict = train(args, logger, model, train_data, dev_data, optimizer, scheduler)
  File "/mnt/sfs_turbo/lxt/CrossFit/run_singletask.py", line 175, in train
    curr_performance = inference(model if args.n_gpu==1 else model.module, dev_data)
  File "/mnt/sfs_turbo/lxt/CrossFit/run_singletask.py", line 219, in inference
    outputs = model.generate(input_ids=batch[0],
  File "/mnt/sfs_turbo/zhangshudan/anaconda3/envs/opendelta_dev/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 28, in decorate_context
    return func(*args, **kwargs)
  File "/mnt/sfs_turbo/zhangshudan/anaconda3/envs/opendelta_dev/lib/python3.8/site-packages/transformers/generation_utils.py", line 1053, in generate
    return self.beam_search(
  File "/mnt/sfs_turbo/zhangshudan/anaconda3/envs/opendelta_dev/lib/python3.8/site-packages/transformers/generation_utils.py", line 1787, in beam_search
    outputs = self(
  File "/mnt/sfs_turbo/zhangshudan/anaconda3/envs/opendelta_dev/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1051, in _call_impl
    return forward_call(*input, **kwargs)
TypeError: forward() got an unexpected keyword argument 'past_key_values'
```
# System information
Python version: 3.8.2
Transformers version: 4.10.0
This is probably caused by an upgrade of the transformers library: newer versions pass additional input variables to the model's `forward()`.
Solution 1: Use the transformers version stated in the README file. (But it's a pretty outdated version 😢) Solution 2: Replace the `bart.py` file with this new one (https://github.com/INK-USC/CrossFit/blob/676f801d7cc2c431ddd0e21b9593183d8e95f580/bart.py). Its `**model_kwargs` will automatically absorb input variables introduced by newer transformers versions. I have tested this with Python 3.6.9, transformers 4.10.0, and torch 1.7.1. I hope it solves your issue.
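To illustrate the failure mode and why `**model_kwargs` fixes it, here is a minimal sketch with hypothetical wrapper classes (these are not the actual CrossFit or transformers code): a `forward()` that enumerates its arguments explicitly raises the `TypeError` above as soon as the library starts passing a new keyword such as `past_key_values`, while a `**model_kwargs` catch-all forwards it untouched.

```python
class RigidWrapper:
    """Hypothetical wrapper: forward() lists its arguments explicitly,
    so any keyword added by a newer library version raises TypeError."""
    def forward(self, input_ids, attention_mask=None):
        return {"input_ids": input_ids}

class FlexibleWrapper:
    """Hypothetical wrapper: **model_kwargs absorbs keywords introduced
    by newer library versions and passes them through unchanged."""
    def forward(self, input_ids, attention_mask=None, **model_kwargs):
        return {"input_ids": input_ids, **model_kwargs}

# Simulate beam_search() calling the model with a new keyword argument.
try:
    RigidWrapper().forward(input_ids=[1], past_key_values=None)
except TypeError as e:
    print(e)  # unexpected keyword argument 'past_key_values'

out = FlexibleWrapper().forward(input_ids=[1], past_key_values=None)
print("past_key_values" in out)  # True
```

This is the same pattern the updated `bart.py` relies on, so the wrapper keeps working as transformers adds generation-time inputs.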