I want to use mlc-llm on Windows with CUDA. I have compiled mlc_chat_cli.exe with CUDA enabled, but I still need this dll to run llama.
mlc_chat_cli is probably no longer supported; see the description in issue #89.
Thank you