
Python: Resolve hanging issue in Jupyter notebooks when re-running async functions #4237

Merged: 3 commits merged into microsoft:main on Dec 13, 2023
Conversation

moonbox3 (Contributor) opened this pull request:

Motivation and Context

Resolves #4137. The issue: if you re-run the sample joke function in the 00-getting-started notebook, the subsequent call hangs.

Description

There were issues when re-running asynchronous functions (_invoke_semantic_async and _invoke_native_async) due to improper handling of asyncio event loops in a multi-threaded environment. Specifically, the code misbehaved when an event loop was already running, leading to hanging or blocking behavior on subsequent runs.

The change replaces the old code with Python's ThreadPoolExecutor, which simplifies thread management and handles the creation, execution, and cleanup of threads more efficiently and safely.

The new run_async_in_executor method ensures that each async function runs in its own new event loop, avoiding conflicts with the main thread's event loop or other threads.
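
For illustration, here is a minimal sketch of that pattern. The helper name run_async_in_executor comes from this PR; the body below is an assumption, not the exact Semantic Kernel implementation.

import asyncio
from concurrent.futures import ThreadPoolExecutor

def run_async_in_executor(coro):
    # Run the coroutine on a worker thread with its own, freshly created
    # event loop, so it never collides with a loop that is already running
    # on the main thread (e.g. the one Jupyter keeps alive).
    def _run(c):
        loop = asyncio.new_event_loop()
        asyncio.set_event_loop(loop)
        try:
            return loop.run_until_complete(c)
        finally:
            asyncio.set_event_loop(None)
            loop.close()

    # ThreadPoolExecutor takes care of thread creation, execution, and cleanup.
    with ThreadPoolExecutor(max_workers=1) as executor:
        return executor.submit(_run, coro).result()

A synchronous caller could then use run_async_in_executor(self._invoke_semantic_async(...)) instead of driving the main thread's event loop directly; as the comments below show, the surrounding synchronous code still has to be careful about how it inspects the main thread's loop.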

Contribution Checklist

@moonbox3 moonbox3 requested a review from a team as a code owner December 13, 2023 17:46
@shawncal shawncal added the python Pull requests for the Python Semantic Kernel label Dec 13, 2023
@eavanvalkenburg (Member) left a comment:

Looks good, thanks for fixing this!

@moonbox3 moonbox3 added this pull request to the merge queue Dec 13, 2023
Merged via the queue into microsoft:main with commit be89d6b Dec 13, 2023
26 checks passed
@moonbox3 moonbox3 deleted the evmattso/fix-sk-function-hang branch December 13, 2023 18:39
@BeeShall commented Dec 24, 2023:

I think this fix has caused a new problem. When trying to run the semantic functions with input parameters from this notebook [https://github.com/microsoft/semantic-kernel/blob/main/python/README.md], my event loop now hangs.

In addition, when trying to run this code as a standalone python file, I get RuntimeError: There is no current event loop in thread 'MainThread'.

I'm trying to find the exact problem and fix associated with this but reporting here so the community can also look into it.

@moonbox3 (Contributor, Author) commented Jan 3, 2024:

Hi, @BeeShall. Are you referring to this code? Can you send us the exact code you're trying to run as a standalone python file? Thanks.

@NatanMish commented:
I am seeing the same issue as @BeeShall .
And yes, @moonbox3, the code is very similar to that example. Let me write the code and I'll send it over.

@NatanMish commented Jan 4, 2024:

import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion


kernel = sk.Kernel()

deployment = "gpt35turbo"
api_key = ""
endpoint = ""

azure_text_service = AzureChatCompletion(deployment_name=deployment, endpoint=endpoint, api_key=api_key)    # set the deployment name to the value of your text model
kernel.add_chat_service("dv", azure_text_service)

prompt = """{{$input}}\n\nEXPLAIN THESE SYSTEM LOGS IN A HUMAN READABLE FORMAT."""
# prompt = "{{$input}}/n/nSummarize this"
log_explainer = kernel.create_semantic_function(prompt_template=prompt, max_tokens=2000, temperature=0.2, top_p=0.5)

for i in range(10):
    print(log_explainer(f"ERROR: {i}"))

The error I get is:

I'm sorry, but without any specific system logs provided, I am unable to explain them in a human-readable format. Could you please provide the system logs you would like me to explain?
Traceback (most recent call last):
  File "/Users/natanmish/Projects/zbai-system/scratch.py", line 19, in <module>
    print(log_explainer(f"ERROR: {i}"))
  File "/Users/natanmish/Library/Caches/pypoetry/virtualenvs/zbai-system-8akujXPy-py3.10/lib/python3.10/site-packages/semantic_kernel/orchestration/sk_function.py", line 400, in __call__
    return self.invoke(
  File "/Users/natanmish/Library/Caches/pypoetry/virtualenvs/zbai-system-8akujXPy-py3.10/lib/python3.10/site-packages/semantic_kernel/orchestration/sk_function.py", line 441, in invoke
    if asyncio.get_event_loop().is_running()
  File "/Users/natanmish/.pyenv/versions/3.10.13/lib/python3.10/asyncio/events.py", line 656, in get_event_loop
    raise RuntimeError('There is no current event loop in thread %r.'
RuntimeError: There is no current event loop in thread 'MainThread'.

It succeeds on the first call, but then fails on the second one.
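
For what it's worth, the RuntimeError itself can be reproduced with plain asyncio, independent of Semantic Kernel. The sketch below assumes the first call runs the coroutine via something like asyncio.run (or otherwise sets and later clears the thread's event loop); what sk_function does internally may differ.

import asyncio

async def dummy():
    # Stand-in for the first semantic function call.
    return "ok"

# asyncio.run() creates a loop, runs the coroutine, then closes the loop
# and clears the thread's current event loop.
print(asyncio.run(dummy()))

# With no current loop set (and set_event_loop() having been called earlier),
# this raises on Python 3.10:
# RuntimeError: There is no current event loop in thread 'MainThread'.
asyncio.get_event_loop().is_running()

That is the same check that appears at sk_function.py line 441 in the traceback above.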

@BeeShall commented Jan 4, 2024:

@moonbox3 Yes, that's the one. It keeps hanging in a Jupyter notebook in a thread-lock situation, and I get that error when I try to run it standalone.

@NatanMish yep that's exactly how I see it too.

@moonbox3 (Contributor, Author) commented Jan 4, 2024:

@BeeShall, thanks for confirming. And @NatanMish, thank you for the code. Appreciate you pointing this out. This is fixed in #4485.

Labels: python (Pull requests for the Python Semantic Kernel)
Linked issue closed by this PR: Python: semantic function hangs if I call more than once.