Gemma ONNX support #1728
Comments
Answered on the other thread.
@fxmarty can you share your versions of optimum and transformers? I am running the latest versions and still get the error `ValueError: Trying to export a gemma model, that is a custom or unsupported architecture, but no custom onnx configuration was passed`, even though I am using the same command you are.
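As a quick way to answer the version question above, here is a minimal sketch for printing the installed versions of the two packages being discussed (this snippet is not from the thread; it only assumes Python >= 3.8, where `importlib.metadata` is in the standard library):

```python
# Sketch: report the installed versions of the packages discussed above.
from importlib.metadata import version, PackageNotFoundError

def report(package: str) -> str:
    """Return 'name==version', or a note that the package is missing."""
    try:
        return f"{package}=={version(package)}"
    except PackageNotFoundError:
        return f"{package} is not installed"

for package in ("optimum", "transformers"):
    print(report(package))
```

Pasting the two printed lines into the issue gives the maintainers exactly what they need to reproduce the problem.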
@fxmarty I encountered the same issue in this project as well while adding 'google/gemma-2b-it' as an LLM option. cc @tchen48
You'll need to install from main.
Closing as supported on main. Please install from source until the next release.
System Info
- Python: 3.8
- optimum-intel: optimum-1.18.0.dev0
Who can help?
@JingyaHuang @echarlaix
Information
Tasks
An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
Reproduction (minimal, reproducible, runnable)
I am following the provided examples on #1714 but am running into some issues.
When I run
optimum-cli export onnx -m google/gemma-2b gemma_onnx
I get the following error
(Screenshot of the error traceback; the original image link has expired.)
When I execute the provided Python script, I get the following error:
Expected behavior
The expected behavior is for the model to be exported and saved in ONNX format.
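One way to confirm the export actually produced a model is to inspect the output directory passed to `optimum-cli` (here `gemma_onnx`). A minimal sketch; the helper name `looks_like_onnx_export` is made up for illustration, and it assumes the exporter writes at least one `.onnx` file plus a `config.json`, which is the usual layout:

```python
# Sketch: sanity-check the output directory of an optimum-cli ONNX export.
from pathlib import Path

def looks_like_onnx_export(out_dir: str) -> bool:
    """True if the directory holds an ONNX graph and a model config."""
    d = Path(out_dir)
    return d.is_dir() and any(d.glob("*.onnx")) and (d / "config.json").is_file()

# e.g. looks_like_onnx_export("gemma_onnx") after a successful export
```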