Not able to trace GPT2DoubleHeadsModel #36812

Open

levindabhi opened this issue Mar 19, 2025 · 1 comment


levindabhi commented Mar 19, 2025

System Info

Hi, I'm trying to create a trace of GPT2DoubleHeadsModel, but I'm running into an issue. Here is my code:

from transformers.utils import fx
from transformers import GPT2Config, GPT2DoubleHeadsModel  # explicit imports instead of a wildcard

gpt2_config = GPT2Config()
model = GPT2DoubleHeadsModel(gpt2_config)
input_names = list(model.dummy_inputs.keys())
traced = fx.symbolic_trace(model, input_names)

I'm getting the error below:

File "~/venv/lib/python3.12/site-packages/torch/fx/proxy.py", line 327, in __iter__
    raise TraceError('Proxy object cannot be iterated. This can be '
torch.fx.proxy.TraceError: Proxy object cannot be iterated. This can be attempted when the Proxy is used in a loop or as a *args or **kwargs function argument. See the torch.fx docs on pytorch.org for a more detailed explanation of what types of control flow can be traced, and check out the Proxy docstring for help troubleshooting Proxy iteration errors

Any help would be appreciated, thanks!

Who can help?

@ArthurZucker @michaelbenayoun

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

from transformers.utils import fx
from transformers import GPT2Config, GPT2DoubleHeadsModel  # explicit imports instead of a wildcard

gpt2_config = GPT2Config()
model = GPT2DoubleHeadsModel(gpt2_config)
input_names = list(model.dummy_inputs.keys())
traced = fx.symbolic_trace(model, input_names)

Expected behavior

Expected the trace to be created without error.

@levindabhi levindabhi added the bug label Mar 19, 2025
@Rocketknight1 (Member) commented:

Hi @levindabhi, I don't know if we're strongly supporting torch.fx right now - we're refocusing on making sure everything works with torch.compile instead!
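
Following up on that suggestion: the analogous call for the model in this issue would presumably be torch.compile(GPT2DoubleHeadsModel(GPT2Config())), though that is untested here. Below is a minimal sketch of the torch.compile pattern on a toy module; the Toy module and the backend="eager" choice are illustrative assumptions, not from this thread.

```python
import torch

class Toy(torch.nn.Module):
    # Stand-in for a real model; torch.compile wraps the module and
    # compiles its forward on first call.
    def forward(self, x):
        return torch.relu(x) + 1

# backend="eager" skips code generation, which makes this a cheap
# smoke test that the module is compilable at all.
compiled = torch.compile(Toy(), backend="eager")
out = compiled(torch.ones(2))
print(out)  # tensor([2., 2.])
```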
