chat_sample.exe with Llama-2-7b-chat-hf always has a built-in question

When running `chat_sample.exe` with Llama-2-7b-chat-hf, it seems that the sample always has a built-in question.

Steps to reproduce:

Export the Llama-2-7b-chat-hf model with this command:

```
optimum-cli export openvino --model "meta-llama/Llama-2-7b-chat-hf" --trust-remote-code "meta-llama/Llama-2-7b-chat-hf"
```

Run the sample:

```
chat_sample.exe C:\models\Llama-2-7b-chat-hf
```

By the way, I also tested `chat_sample.exe` with TinyLlama-1.1B-Chat-v1.0, and it works well.
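For context, the chat loop that `chat_sample` runs can be sketched as below. This is a hypothetical stand-in, not the actual sample source: the real sample uses OpenVINO GenAI's `LLMPipeline`, which is replaced here by a stub class so the sketch runs without a converted model; the `start_chat`/`generate`/`finish_chat` method names mirror that API but should be treated as assumptions.

```python
from typing import Callable, List


class StubPipeline:
    """Hypothetical stand-in for openvino_genai.LLMPipeline(models_path, device)."""

    def start_chat(self) -> None:
        # The real pipeline starts accumulating chat history here.
        pass

    def finish_chat(self) -> None:
        # The real pipeline clears chat history here.
        pass

    def generate(self, prompt: str, streamer: Callable[[str], None]) -> str:
        # Stubbed generation: echo a canned reply instead of running a model.
        reply = f"(model reply to: {prompt})"
        streamer(reply)
        return reply


def chat_loop(pipe, prompts: List[str]) -> List[str]:
    """Drive the pipeline over a sequence of user prompts, as the
    interactive sample does with input() in a while loop."""
    replies = []
    pipe.start_chat()
    for prompt in prompts:
        replies.append(pipe.generate(prompt, streamer=lambda s: None))
    pipe.finish_chat()
    return replies


print(chat_loop(StubPipeline(), ["Hello"]))
```

In the real sample, each user prompt is read from stdin and the reply is streamed token by token; nothing in this loop itself injects a question, which is why a prompt appearing unasked points at the sample or model configuration rather than user input.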
This issue is fixed in #549
olpipi
Error detail

![Screenshot: chat_sample.exe with Llama-2-7b-chat-hf shows the built-in question](https://private-user-images.githubusercontent.com/103162767/343627734-93e8d38f-e209-46bc-b4f9-08b143780ba6.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MjEyMzA3NDAsIm5iZiI6MTcyMTIzMDQ0MCwicGF0aCI6Ii8xMDMxNjI3NjcvMzQzNjI3NzM0LTkzZThkMzhmLWUyMDktNDZiYy1iNGY5LTA4YjE0Mzc4MGJhNi5wbmc_WC1BbXotQWxnb3JpdGhtPUFXUzQtSE1BQy1TSEEyNTYmWC1BbXotQ3JlZGVudGlhbD1BS0lBVkNPRFlMU0E1M1BRSzRaQSUyRjIwMjQwNzE3JTJGdXMtZWFzdC0xJTJGczMlMkZhd3M0X3JlcXVlc3QmWC1BbXotRGF0ZT0yMDI0MDcxN1QxNTM0MDBaJlgtQW16LUV4cGlyZXM9MzAwJlgtQW16LVNpZ25hdHVyZT1mNGJjNmFiMDU4YTRkOTc5NmMwYjk5NjFlOWI0NmU2ZTM5MWM3NGNkM2RjOTczNGIwMzE1ZTI4MzFmZTdjNGJiJlgtQW16LVNpZ25lZEhlYWRlcnM9aG9zdCZhY3Rvcl9pZD0wJmtleV9pZD0wJnJlcG9faWQ9MCJ9.dgBNT7lVPi7Dbu3AyFosgfnZF4faUHcGQZiAarxsfrc)

![Screenshot: chat_sample.exe with TinyLlama-1.1B-Chat-v1.0 works well](https://private-user-images.githubusercontent.com/103162767/343628382-74cb3e71-7bf4-43bd-9154-1881ade13af2.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MjEyMzA3NDAsIm5iZiI6MTcyMTIzMDQ0MCwicGF0aCI6Ii8xMDMxNjI3NjcvMzQzNjI4MzgyLTc0Y2IzZTcxLTdiZjQtNDNiZC05MTU0LTE4ODFhZGUxM2FmMi5wbmc_WC1BbXotQWxnb3JpdGhtPUFXUzQtSE1BQy1TSEEyNTYmWC1BbXotQ3JlZGVudGlhbD1BS0lBVkNPRFlMU0E1M1BRSzRaQSUyRjIwMjQwNzE3JTJGdXMtZWFzdC0xJTJGczMlMkZhd3M0X3JlcXVlc3QmWC1BbXotRGF0ZT0yMDI0MDcxN1QxNTM0MDBaJlgtQW16LUV4cGlyZXM9MzAwJlgtQW16LVNpZ25hdHVyZT1jZWNmZmFhNjQ4YzMzYWI2MDAzM2M3NTFmZmNjYWNmMjcwZjJmMjFjN2I2MTNjYWJlMzAyODY3YWY4ZmMyMGM1JlgtQW16LVNpZ25lZEhlYWRlcnM9aG9zdCZhY3Rvcl9pZD0wJmtleV9pZD0wJnJlcG9faWQ9MCJ9.jrg0wwsMCvRvvsUMZieu_E3PxYwLCAyqhmE2BEJlkzs)