Trying to create an llm via HuggingFaceHub() but get "coroutine 'AsyncRunManager.on_text' was never awaited" #113
Comments
Try setting |
Thanks: tried it but got the same error. Is this (specifying a |
The "error" you're showing is just a warning. Are you sure there is not another exception? |
Ah, good point: I may have jumped the gun! I had assumed something had gone fatally wrong because the process hangs after emitting the warning. I've started it again and left it running; will see if it's completed or errored out or still hanging in the morning... |
OK here's the final exception (actually two exceptions?):
|
Looks like the model timed out @maspotts - the open source models are a lot slower and paper-qa requires a lot of calls. I would recommend Claude or HuggingFace with a GPU instance if you're looking to replace the OpenAI models. Llama can work too, but it's extremely slow |
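For illustration, here is a minimal standard-library sketch of what "timed out" means in practice. The names (`slow_model_call`, `query_with_timeout`) are hypothetical, not paper-qa or LangChain APIs: wrapping a slow call in `asyncio.wait_for` makes it fail fast with a `TimeoutError` instead of appearing to hang, which is what a slow open-source model behind the same client timeout looks like.

```python
import asyncio

async def slow_model_call(delay: float) -> str:
    # Stand-in for a slow open-source model that responds after `delay` seconds.
    await asyncio.sleep(delay)
    return "answer"

async def query_with_timeout(delay: float, timeout: float) -> str:
    # Bound the call: a model slower than `timeout` raises TimeoutError
    # rather than blocking indefinitely.
    try:
        return await asyncio.wait_for(slow_model_call(delay), timeout=timeout)
    except asyncio.TimeoutError:
        return "timed out"
```

A fast "model" returns its answer; one slower than the timeout surfaces a clean failure you can see and handle, rather than a silent hang.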
Thanks: (off-topic but) do I need to upgrade to a paid huggingface tier to get sufficient performance? |
Not sure; I have a paid HuggingFace plan, and I don't know what is available on the free tier
Hi: I'm trying to swap out:

```python
llm = ChatOpenAI(temperature = temperature, model_name = 'gpt-3.5-turbo', max_tokens = max_output_size)
```

with:

```python
llm = HuggingFaceHub(repo_id = 'mosaicml/mpt-7b-chat', model_kwargs = { "temperature": temperature, "max_length": max_output_size })
```

and then creating my index as usual via `index = Docs(llm = llm)`. When I call `index.query()` with the index built with the `ChatOpenAI()` model it works, but when I call `index.query()` with the `HuggingFaceHub()` model I get this error (although I'm not calling `index.aquery()`). Am I making an obvious mistake? I'm hoping there's a nice simple way to make this work, so I can continue my successful implementation using paperqa! Thanks in advance... Mike
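For context on the warning in the title: a self-contained sketch of how it arises in general. The `on_text` below is a stand-in of my own, not LangChain's `AsyncRunManager.on_text`. Calling an `async def` function from synchronous code creates a coroutine object; if nothing ever awaits it, Python emits a "coroutine ... was never awaited" `RuntimeWarning` when the object is garbage-collected, and the coroutine's body never runs.

```python
import asyncio
import warnings

async def on_text(text: str) -> str:
    # Stand-in for an async callback; NOT the real AsyncRunManager.on_text.
    return f"handled: {text}"

def buggy_sync_call() -> list:
    # Calling the coroutine function without awaiting it: the coroutine
    # object is created, discarded, and warns on garbage collection.
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        on_text("hello")  # never awaited -> RuntimeWarning
        return [str(w.message) for w in caught]

def correct_call() -> str:
    # The fix in synchronous code: actually drive the coroutine on a loop.
    return asyncio.run(on_text("hello"))
```

The warning alone doesn't crash anything (the real failure in this thread turned out to be a timeout), but it does mean some async callback silently never executed.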