Conversation

@xadupre xadupre commented Apr 10, 2025

Description

Replaces #24291.

transformers>=4.51 makes DynamicCache exportable.
The modifications were tested with a tiny LLM:

python -m onnxruntime.transformers.models.llama.convert_to_onnx -m arnir0/Tiny-LLM --output Tiny-LLM --precision fp16 --execution_provider cuda --small_gp --use_dynamo_export


import numpy as np
import torch
import transformers

Check notice — Code scanning / CodeQL: Module is imported with 'import' and 'import from' (Note)

Module 'transformers' is imported with both 'import' and 'import from'.
Module 'onnxruntime.test.python.transformers' is imported with both 'import' and 'import from'.
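For context, a minimal illustration of the pattern this CodeQL query flags, using a stdlib module as a stand-in (the PR itself mixes the two styles for `transformers`): mixing `import m` with `from m import name` in one module triggers the notice, and the usual fix is to settle on a single style.

```python
# Hypothetical illustration of the CodeQL notice (stdlib stand-in module):
import json                 # style 1: import the module
from json import dumps      # style 2: import a name from it (flagged when combined)

# Consolidated fix: keep only style 1 and qualify uses through the module.
print(json.dumps({"a": 1}) == dumps({"a": 1}))  # both names refer to the same function
```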
if isinstance(value, set):  # restored guard implied by the set comprehension below
    return {torch_deepcopy(v) for v in value}
if isinstance(value, dict):
    return {k: torch_deepcopy(v) for k, v in value.items()}
if isinstance(value, np.ndarray):
    return value.copy()  # likely completion; the original fragment is truncated here

Check failure — Code scanning / lintrunner: RUFF/F821 (undefined name) Error
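Ruff F821 flags a name that is referenced but never defined or imported in scope. A hedged sketch of what a full `torch_deepcopy` helper around the fragment above might look like; the function name and branch set are taken from the fragment, while the primitive/list/tensor branches and the fallback are assumptions, not the PR's actual code.

```python
# Sketch only: branches beyond set/dict/ndarray are assumed, not from the PR.
import numpy as np
import torch


def torch_deepcopy(value):
    """Recursively copy nested containers, cloning tensors and arrays."""
    if isinstance(value, (int, float, str, bool, type(None))):
        return value  # immutable scalars need no copy
    if isinstance(value, (list, tuple)):
        return type(value)(torch_deepcopy(v) for v in value)
    if isinstance(value, set):
        return {torch_deepcopy(v) for v in value}
    if isinstance(value, dict):
        return {k: torch_deepcopy(v) for k, v in value.items()}
    if isinstance(value, np.ndarray):
        return value.copy()
    if isinstance(value, torch.Tensor):
        return value.clone()
    raise TypeError(f"Unsupported type for torch_deepcopy: {type(value)}")
```

Importing `np` and `torch` at module level is also what avoids the F821 "undefined name" failure inside the `isinstance` branches.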

@xadupre xadupre closed this Apr 16, 2025