
Unable to generate a question-answering ONNX model for Llama, and there is no list of the models supported for question-answering #1876

Open
customautosys opened this issue May 26, 2024 · 0 comments


Feature request

Hi, I received this error:

ValueError: Asked to export a llama model for the task question-answering, but the Optimum ONNX exporter only supports the tasks feature-extraction, feature-extraction-with-past, text-generation, text-generation-with-past, text-classification for llama. Please use a supported task. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the task question-answering to be supported in the ONNX export for llama.

I was trying to generate an ONNX model for QuanAI/llama-2-7b-question-answering.

I also tried to look up the supported question-answering models in https://huggingface.co/docs/optimum/exporters/onnx/usage_guides/export_a_model, but that page has a broken link to https://huggingface.co/exporters/task_manager (it returns a 404). I am happy to use a question-answering model other than Llama if there is a list of what is available.

Motivation

I am unable to export a Llama question-answering model to ONNX.

Your contribution

Not sure how to contribute; I am a new user.
