Description
Hi, thanks for the great project — I'm trying to run it locally and ran into a couple of issues.
First, in the README it would be helpful to mention that you need to activate the virtual environment before running the server. Something like this before the “Run the MCP server” step:
.\.venv\Scripts\activate
That way, when we get to:
uv run src/notebookllama/server.py
streamlit run src/notebookllama/Home.py
everything works smoothly.
Second issue: I'm having trouble with the OTLPSpanExporter and tracing setup.
The default configuration gives me this error:
requests.exceptions.ConnectionError: HTTPConnectionPool(host='0.0.0.0', port=4318): Max retries exceeded with url: /v1/traces (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x0000014DF9A107D0>: Failed to establish a new connection: [WinError 10049] The requested address is not valid in its context'))
Since 0.0.0.0 is a bind address rather than one a client can actually connect to on Windows (hence the WinError 10049), I tried editing Home.py line 23 to:
span_exporter = OTLPSpanExporter("http://127.0.0.1:4318/v1/traces")
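For context, here is roughly how I understand the tracing wiring after my edit (a minimal sketch assuming the standard OpenTelemetry SDK; Home.py's actual setup may differ, and the explicit endpoint is the only part I changed):

```python
# Minimal sketch only, assuming the standard OpenTelemetry SDK wiring.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Point the exporter at a loopback address the client can actually reach,
# instead of 0.0.0.0, which Windows rejects as a destination (WinError 10049).
span_exporter = OTLPSpanExporter(endpoint="http://127.0.0.1:4318/v1/traces")

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(span_exporter))
trace.set_tracer_provider(provider)
```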
But unfortunately that still doesn't solve it. The app starts up and I can upload a file, but as soon as it tries to process the upload, it times out with this error (see my note on a possible cause after the full log below):
mcp.shared.exceptions.McpError: Timed out while waiting for response to ClientRequest. Waited 30.0 seconds.
This happens after the file is uploaded and parsing starts. From the logs:
Error processing document: Error in step 'extract_file_data': unhandled errors in a TaskGroup (1 sub-exception)
mcp.shared.exceptions.McpError: Timed out while waiting for response to ClientRequest. Waited 30.0 seconds.
INFO: Started server process [33452]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:8000/ (Press CTRL+C to quit)
INFO: 127.0.0.1:57413 - "POST /mcp HTTP/1.1" 307 Temporary Redirect
INFO: 127.0.0.1:57413 - "POST /mcp/ HTTP/1.1" 200 OK
INFO: 127.0.0.1:57417 - "GET /mcp HTTP/1.1" 307 Temporary Redirect
INFO: 127.0.0.1:57416 - "POST /mcp HTTP/1.1" 307 Temporary Redirect
INFO: 127.0.0.1:57417 - "GET /mcp/ HTTP/1.1" 200 OK
INFO: 127.0.0.1:57416 - "POST /mcp/ HTTP/1.1" 202 Accepted
INFO: 127.0.0.1:57422 - "POST /mcp HTTP/1.1" 307 Temporary Redirect
INFO: 127.0.0.1:57422 - "POST /mcp/ HTTP/1.1" 200 OK
Uploading file...
Parsing file...
Started parsing the file under job_id e0c4a8ea-d856-49e2-9b93-0238a5e47f1a
Getting markdown...
Extracting data...
Uploading files: 100%|███████████████████████████████████████████████████████████████████| 1/1 [00:01<00:00, 1.08s/it]
Creating extraction jobs: 100%|██████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 2.34it/s]
Extracting files: 0%| | 0/1 [00:00<?, ?it/s]INFO: 127.0.0.1:57440 - "DELETE /mcp HTTP/1.1" 307 Temporary Redirect
INFO: 127.0.0.1:57440 - "DELETE /mcp/ HTTP/1.1" 200 OK
Extracting files: 100%|██████████████████████████████████████████████████████████████████| 1/1 [00:47<00:00, 47.39s/it]
Done.
ERROR: + Exception Group Traceback (most recent call last):
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\starlette\routing.py", line 694, in lifespan
| async with self.lifespan_context(app) as maybe_state:
| ~~~~~~~~~~~~~~~~~~~~~^^^^^
| File "C:\Users..\AppData\Roaming\uv\python\cpython-3.13.5-windows-x86_64-none\Lib\contextlib.py", line 235, in aexit
| await self.gen.athrow(value)
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\fastmcp\server\http.py", line 353, in lifespan
| async with session_manager.run():
| ~~~~~~~~~~~~~~~~~~~^^
| File "C:\Users..\AppData\Roaming\uv\python\cpython-3.13.5-windows-x86_64-none\Lib\contextlib.py", line 235, in aexit
| await self.gen.athrow(value)
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\mcp\server\streamable_http_manager.py", line 106, in run
| async with anyio.create_task_group() as tg:
| ~~~~~~~~~~~~~~~~~~~~~~~^^
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\anyio_backends_asyncio.py", line 772, in aexit
| raise BaseExceptionGroup(
| "unhandled errors in a TaskGroup", self._exceptions
| ) from None
| ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
+-+---------------- 1 ----------------
| Exception Group Traceback (most recent call last):
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\mcp\server\streamable_http_manager.py", line 228, in run_server
| async with http_transport.connect() as streams:
| ~~~~~~~~~~~~~~~~~~~~~~^^
| File "C:\Users..\AppData\Roaming\uv\python\cpython-3.13.5-windows-x86_64-none\Lib\contextlib.py", line 235, in aexit
| await self.gen.athrow(value)
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\mcp\server\streamable_http.py", line 784, in connect
| async with anyio.create_task_group() as tg:
| ~~~~~~~~~~~~~~~~~~~~~~~^^
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\anyio_backends_asyncio.py", line 772, in aexit
| raise BaseExceptionGroup(
| "unhandled errors in a TaskGroup", self._exceptions
| ) from None
| ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
+-+---------------- 1 ----------------
| Exception Group Traceback (most recent call last):
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\mcp\server\streamable_http.py", line 846, in connect
| yield read_stream, write_stream
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\mcp\server\streamable_http_manager.py", line 231, in run_server
| await self.app.run(
| ...<4 lines>...
| )
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\mcp\server\lowlevel\server.py", line 473, in run
| async with AsyncExitStack() as stack:
| ~~~~~~~~~~~~~~^^
| File "C:\Users..\AppData\Roaming\uv\python\cpython-3.13.5-windows-x86_64-none\Lib\contextlib.py", line 768, in aexit
| raise exc
| File "C:\Users..\AppData\Roaming\uv\python\cpython-3.13.5-windows-x86_64-none\Lib\contextlib.py", line 751, in aexit
| cb_suppress = await cb(*exc_details)
| ^^^^^^^^^^^^^^^^^^^^^^
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\mcp\shared\session.py", line 218, in aexit
| return await self._task_group.aexit(exc_type, exc_val, exc_tb)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\anyio_backends_asyncio.py", line 772, in aexit
| raise BaseExceptionGroup(
| "unhandled errors in a TaskGroup", self._exceptions
| ) from None
| ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
+-+---------------- 1 ----------------
| Exception Group Traceback (most recent call last):
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\mcp\server\lowlevel\server.py", line 484, in run
| async with anyio.create_task_group() as tg:
| ~~~~~~~~~~~~~~~~~~~~~~~^^
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\anyio_backends_asyncio.py", line 772, in aexit
| raise BaseExceptionGroup(
| "unhandled errors in a TaskGroup", self._exceptions
| ) from None
| ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
+-+---------------- 1 ----------------
| Traceback (most recent call last):
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\mcp\server\lowlevel\server.py", line 508, in _handle_message
| await self._handle_request(message, req, session, lifespan_context, raise_exceptions)
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\mcp\server\lowlevel\server.py", line 557, in _handle_request
| await message.respond(response)
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\mcp\shared\session.py", line 131, in respond
| await self._session._send_response( # type: ignore[reportPrivateUsage]
| request_id=self.request_id, response=response
| )
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\mcp\shared\session.py", line 329, in _send_response
| await self._write_stream.send(session_message)
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\anyio\streams\memory.py", line 242, in send
| self.send_nowait(item)
| ~~~~~~~~~~~~~~~~^^^^^^
| File "C:\Users..\src\notebookllama.venv\Lib\site-packages\anyio\streams\memory.py", line 211, in send_nowait
| raise ClosedResourceError
| anyio.ClosedResourceError
+------------------------------------
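One last observation: in the log above, the "Extracting files" step takes about 47 s, while the client only waits 30 s for a response, so the timeout may simply fire while the server is still busy extracting. If that's the case, raising the client-side read timeout (wherever the 30-second limit is configured) might be a workaround. Here is a minimal sketch of the idea, assuming the official mcp Python SDK streamable-HTTP client; the tool name and arguments are placeholders rather than the project's real ones, and the URL just mirrors the Uvicorn address from the log:

```python
# Hedged sketch only: assumes the official `mcp` Python SDK streamable-HTTP client.
# "process_file" and its arguments are placeholders, not the project's real tool names.
import asyncio
from datetime import timedelta

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main() -> None:
    # URL taken from the Uvicorn log above (POST /mcp/ on 127.0.0.1:8000).
    async with streamablehttp_client("http://127.0.0.1:8000/mcp") as (read, write, _):
        # Allow more than the ~47 s the extraction step needs, instead of 30 s.
        async with ClientSession(
            read, write, read_timeout_seconds=timedelta(seconds=120)
        ) as session:
            await session.initialize()
            result = await session.call_tool("process_file", {"filename": "example.pdf"})
            print(result)


asyncio.run(main())
```

I'd be happy to test a proper fix if you can point me to where the client session and its timeout are actually created in the repo.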