Hi! I saw that langchain-ChatGLM is also a deployment built on a local-model knowledge base. I've been thinking about plugging this vicuna-13b model into that project, but I'm too inexperienced to know where to start. Can this vicuna model be connected to that project? I think the answers might be better that way.
Yes, that works — you just need to write a new loader method.
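A minimal sketch of what such a loader might look like. This assumes langchain-ChatGLM expects an object exposing a chat-style interface; the class name `VicunaLLM`, the helper `build_vicuna_prompt`, the checkpoint name, and the generation parameters are all illustrative, not the project's actual API.

```python
# Hypothetical loader sketch for plugging Vicuna into a local
# knowledge-base project. Names and the checkpoint path are
# assumptions, not the real langchain-ChatGLM interface.
from dataclasses import dataclass
from typing import List, Tuple


def build_vicuna_prompt(query: str, history: List[Tuple[str, str]]) -> str:
    """Format a query plus chat history into a Vicuna-style template
    (USER: ... ASSISTANT: ...)."""
    parts = ["A chat between a curious user and an AI assistant."]
    for user_turn, assistant_turn in history:
        parts.append(f"USER: {user_turn} ASSISTANT: {assistant_turn}</s>")
    parts.append(f"USER: {query} ASSISTANT:")
    return "\n".join(parts)


@dataclass
class VicunaLLM:
    # Illustrative checkpoint name; substitute your local weights path.
    model_path: str = "lmsys/vicuna-13b-v1.5"

    def load_model(self) -> None:
        # Deferred import so the class can be defined without GPU deps.
        from transformers import AutoModelForCausalLM, AutoTokenizer
        self.tokenizer = AutoTokenizer.from_pretrained(
            self.model_path, use_fast=False)
        self.model = AutoModelForCausalLM.from_pretrained(
            self.model_path, device_map="auto")

    def chat(self, query: str, history: List[Tuple[str, str]]):
        """Generate one reply and return (response, updated_history)."""
        prompt = build_vicuna_prompt(query, history)
        inputs = self.tokenizer(prompt, return_tensors="pt").to(
            self.model.device)
        output_ids = self.model.generate(**inputs, max_new_tokens=512)
        # Decode only the newly generated tokens, not the prompt.
        response = self.tokenizer.decode(
            output_ids[0][inputs["input_ids"].shape[1]:],
            skip_special_tokens=True)
        return response.strip(), history + [(query, response)]
```

The idea is simply to mirror the interface the project already calls for its default model (a `chat(query, history)`-style method), so the rest of the pipeline does not need to change.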
@zp2459 The dev_llm branch of langchain-ChatGLM already supports vicuna.