CI test-phi-3-mini-runner-linux failing as a result of transformers v4.54.0 #12867

@Conarnar

Description

CI test-phi-3-mini-runner-linux fails with the following error:

Traceback (most recent call last):
  File "/opt/conda/envs/py_3.10/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/opt/conda/envs/py_3.10/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/executorch/examples/models/phi-3-mini/export_phi-3-mini.py", line 168, in <module>
    main()
  File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/executorch/examples/models/phi-3-mini/export_phi-3-mini.py", line 164, in main
    export(parser.parse_args())
  File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/executorch/examples/models/phi-3-mini/export_phi-3-mini.py", line 79, in export
    exportable_module = TorchExportableModuleForDecoderOnlyLM(
  File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/integrations/executorch.py", line 67, in __init__
    self.model = TorchExportableModuleWithStaticCache(model)
  File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/integrations/executorch.py", line 293, in __init__
    max_batch_size=self.model.generation_config.cache_config.get("batch_size"),
AttributeError: 'StaticCacheConfig' object has no attribute 'get'

This appears to be caused by the release of transformers v4.54.0, specifically the changes to src/transformers/integrations/executorch.py in that release: `generation_config.cache_config` is now a `StaticCacheConfig` object rather than a plain dict, so the dict-style `.get("batch_size")` call fails.
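One way to make the call site robust to both transformers versions is to branch on the config's type. This is a minimal sketch, not the actual fix in either repo; the helper name `cache_config_value` and the demo `StaticCacheConfig` stand-in are hypothetical:

```python
def cache_config_value(cache_config, key, default=None):
    """Read a field from cache_config, which may be a plain dict
    (older transformers) or an attribute-style config object
    (transformers >= 4.54.0)."""
    if isinstance(cache_config, dict):
        return cache_config.get(key, default)
    # Attribute-style object: fall back to getattr
    return getattr(cache_config, key, default)


# Hypothetical stand-in for transformers' StaticCacheConfig,
# used here only to demonstrate both code paths.
class FakeStaticCacheConfig:
    def __init__(self, batch_size):
        self.batch_size = batch_size


print(cache_config_value({"batch_size": 1}, "batch_size"))          # dict path
print(cache_config_value(FakeStaticCacheConfig(1), "batch_size"))   # object path
```

With a shim like this, `max_batch_size=cache_config_value(self.model.generation_config.cache_config, "batch_size")` would avoid the `AttributeError` regardless of which transformers version is installed.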

Metadata

Labels

module: ci — Issues related to continuous integration
triaged — This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
