Running asyncio.get_event_loop().run_until_complete(query(args)) more than once deadlocks program #36

Closed
gamendez98 opened this issue Apr 20, 2023 · 5 comments
Labels: bug (Something isn't working)

@gamendez98

For the following query:

@lmql.query
async def s(output, material_text: str):
    '''
sample(temperature=0.5)
    """Write 2 one-sentence true summaries and 4 one-sentence false summaries in the style of a first-grade reading comprehension test. Replace "I" with a third-person word. Only use information from the paragraph.
    Paragraph:
    {material_text}
    True summaries:{nl}"""
    for i in range(2):
        "- [correct_summaries]"
        output.append(correct_summaries)
    """False summaries:{nl}"""
    for i in range(4):
        "- [incorrect_summaries]"
        output.append(incorrect_summaries)
from
    "openai/text-davinci-003"
where
    STOPS_AT(correct_summaries, "{nl}") and STOPS_AT(incorrect_summaries, "{nl}")'''

If I run

l=[]
asyncio.get_event_loop().run_until_complete(s(l, "some text"))

it executes just fine the first time, but the second call seems to produce a deadlock.

@lbeurerkellner
Collaborator

lbeurerkellner commented Apr 20, 2023

I think there may be an issue with worker threads being shut down after the first completion. I will investigate a bit and report here.

For now, when running multiple queries in sequence, I would suggest a pattern like this:

async def main():
    l = []
    await s(l, "some text")
    await s(l, "some text")

asyncio.run(main())

@lbeurerkellner lbeurerkellner added the bug Something isn't working label Apr 20, 2023
@gamendez98
Author

Thank you for the quick answer. Sadly, I am calling the function from within a RabbitMQ callback that I cannot make async, and since the function gets called for each message, I need a way to transition from non-async to async more than once.
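
Roughly, the setup looks like this (just a sketch; the pika-style callback name and signature are illustrative, not my exact code):

import asyncio

def on_message(channel, method, properties, body):
    # synchronous RabbitMQ callback, invoked once per message, so this
    # non-async -> async transition happens repeatedly
    output = []
    asyncio.get_event_loop().run_until_complete(s(output, body.decode()))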

@lbeurerkellner
Collaborator

lbeurerkellner commented Apr 20, 2023

I see. For this you may want to have a look at how we bridge async/non-async for the langchain integration, see https://github.com/eth-sri/lmql/blob/main/src/lmql/runtime/langchain.py

Basically, if possible, try re-using the event loop. Still, I will see what I can do to support your original usage as well.
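
For example, something along these lines (a minimal sketch of the general pattern, not the exact code from langchain.py; run_query_sync is a hypothetical helper name):

import asyncio
import threading

# create one long-lived loop and keep it running in a background thread
_loop = asyncio.new_event_loop()
threading.Thread(target=_loop.run_forever, daemon=True).start()

def run_query_sync(material_text):
    # safe to call from a synchronous callback: submit the coroutine to the
    # persistent loop and block until it finishes
    output = []
    future = asyncio.run_coroutine_threadsafe(s(output, material_text), _loop)
    future.result()
    return output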

@gamendez98
Author

gamendez98 commented Apr 20, 2023

Thanks a lot!! I am trying to see exactly where it gets locked, to get a better idea of what's happening.
So far it gets locked at lmql/runtime/bopenai/batched_openai.py:675.
PS: I am using version 0.0.5.1.

@gamendez98
Author

It worked, thanks!!
