
AttributeError: 'BartEncoder' object has no attribute 'main_input_name' #4

Open
zeke-john opened this issue Aug 16, 2022 · 1 comment


zeke-john commented Aug 16, 2022

Whenever I run the model (facebook/bart-large-cnn), or even the bart-base model, to summarize text with the code below, I keep running into this error. I'm honestly stuck, so any help would be greatly appreciated.

Code:

from fastBart import export_and_get_onnx_model, OnnxBart, quantize
from fastBart.onnx_exporter import generate_onnx_representation
from fastBart.ort_settings import get_onnx_runtime_sessions
from transformers import AutoTokenizer
from pathlib import Path
import os

model_name = 'facebook/bart-large-cnn'

# export the model to ONNX
model_paths = tuple(generate_onnx_representation(model_name))

# to reuse an already exported model instead:
# name = 'bart-large-cnn'
# custom_output_path = './models-bart/'
# model_paths = tuple(os.path.join(custom_output_path, name + "-" + x)
#                     for x in ['encoder.onnx', 'decoder.onnx', 'init-decoder.onnx'])

model_sessions = get_onnx_runtime_sessions(model_paths)
model = OnnxBart(model_name, model_sessions)

tokenizer = AutoTokenizer.from_pretrained(model_name)

input = """Since being shared on Reddit, the 21-second video has garnered more than 35,000 upvotes. People also filled the comment section with adorable comments.
A Reddit user said that he liked the little cat staring at the mama dog for a moment. Another liked the calmness and patience of the mama dog.
“This is soooo cute,” commented a third Reddit user. Many users also added that humans should learn kindness from these animals.
Well, this is not the first time that a kitten was feeding on a nursing dog. Last year, a similar heartwarming video had surfaced 
from a remote village in Nigeria that showed a little feline feeding on milk from a mama dog. The clip, which had gone viral, has 
so far accumulated over 1 million views. Twitter users showered love for the dog and the little kitten. The 32-second clip was shared 
by Reuters on its official Twitter handle with the caption, “It's a most unusual sight: a kitten was spotted feeding on milk from a nursing dog in a remote village in Nigeria.”
"""

token = tokenizer(input, return_tensors='pt',  truncation=True)
tokens = model.generate(input_ids=token['input_ids'],attention_mask=token['attention_mask'],num_beams=4)

output = tokenizer.decode(tokens.squeeze(), skip_special_tokens=True)
print(output)

Error:

Traceback (most recent call last):
File "/Users/zeke/Documents/Github/SumanyWeb/sumany/summarize/fast-Bart/bartoptimize.py", line 34, in <module>
    tokens = model.generate(input_ids=token['input_ids'],attention_mask=token['attention_mask'],num_beams=4)
File "/usr/local/lib/python3.9/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/transformers/generation_utils.py", line 1162, in generate
    inputs_tensor, model_input_name, model_kwargs = self._prepare_model_inputs(inputs, bos_token_id, model_kwargs)
File "/usr/local/lib/python3.9/site-packages/transformers/generation_utils.py", line 412, in _prepare_model_inputs
    and self.encoder.main_input_name != self.main_input_name
File "/usr/local/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1207, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'BartEncoder' object has no attribute 'main_input_name'
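
For context, the failing check in generation_utils.py (last frames of the traceback) compares self.encoder.main_input_name against self.main_input_name, and the exported BartEncoder wrapper never defines that attribute. A minimal, untested workaround sketch, assuming the OnnxBart instance exposes the wrapped encoder as model.encoder (as the traceback suggests), would be to set the attribute by hand before calling generate():

# Untested workaround sketch, not confirmed in this thread: newer transformers
# releases expect the encoder module to expose `main_input_name`, which the
# exported BartEncoder wrapper does not set. Assigning it manually lets the
# input-preparation check in generate() pass.
if not hasattr(model.encoder, "main_input_name"):
    model.encoder.main_input_name = "input_ids"  # BART's usual text input name

tokens = model.generate(
    input_ids=token['input_ids'],
    attention_mask=token['attention_mask'],
    num_beams=4,
)

Even with the attribute set, the only fix actually reported in this thread is the version downgrade suggested in the comment below.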

lang101 commented Sep 28, 2022

Reducing the transformers version to 4.10.0 fixes this.
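
For anyone trying that, a quick sanity check that the downgrade actually took effect in the environment running the script (the pip command is an assumption about how the environment is managed):

# Pin the suggested version first, e.g.:  pip install "transformers==4.10.0"
# (assumes a pip-managed environment), then confirm the interpreter sees it:
import transformers
print(transformers.__version__)  # expected: 4.10.0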
