Anthropic RAG (Sonnet & Opus) not working. ValueError: System message must be at beginning of message list. #21

Open
ryaneggz opened this issue May 16, 2024 · 1 comment


ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/responses.py", line 265, in __call__
    await wrap(partial(self.listen_for_disconnect, receive))
  File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/responses.py", line 261, in wrap
    await func()
  File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/responses.py", line 238, in listen_for_disconnect
    message = await receive()
  File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 568, in receive
    await self.message_event.wait()
  File "/usr/lib/python3.10/asyncio/locks.py", line 214, in wait
    await fut
asyncio.exceptions.CancelledError: Cancelled by cancel scope 7f0dc4518940

During handling of the above exception, another exception occurred:

  + Exception Group Traceback (most recent call last):
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 411, in run_asgi
  |     result = await app(  # type: ignore[func-returns-value]
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
  |     return await self.app(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
  |     await super().__call__(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
  |     await self.middleware_stack(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
  |     raise exc
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
  |     await self.app(scope, receive, _send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/middleware/cors.py", line 93, in __call__
  |     await self.simple_response(scope, receive, send, request_headers=headers)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/middleware/cors.py", line 148, in simple_response
  |     await self.app(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
  |     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
  |     raise exc
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
  |     await app(scope, receive, sender)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/routing.py", line 756, in __call__
  |     await self.middleware_stack(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/routing.py", line 776, in app
  |     await route.handle(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
  |     await self.app(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
  |     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
  |     raise exc
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
  |     await app(scope, receive, sender)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/routing.py", line 75, in app
  |     await response(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/responses.py", line 258, in __call__
  |     async with anyio.create_task_group() as task_group:
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 678, in __aexit__
  |     raise BaseExceptionGroup(
  | exceptiongroup.ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/responses.py", line 261, in wrap
    |     await func()
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/responses.py", line 250, in stream_response
    |     async for chunk in self.body_iterator:
    |   File "/home/ryaneggz/promptengineers/llm-server/backend/src/utils/__init__.py", line 9, in chain_stream
    |     async for event in runnable:
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 4712, in astream
    |     async for item in self.bound.astream(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 4712, in astream
    |     async for item in self.bound.astream(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2900, in astream
    |     async for chunk in self.atransform(input_aiter(), config, **kwargs):
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2883, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1985, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2853, in _atransform
    |     async for output in final_pipeline:
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 4748, in atransform
    |     async for item in self.bound.atransform(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2883, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1985, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2853, in _atransform
    |     async for output in final_pipeline:
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/passthrough.py", line 601, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1985, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/passthrough.py", line 580, in _atransform
    |     async for chunk in for_passthrough:
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/utils/aiter.py", line 97, in tee_peer
    |     item = await iterator.__anext__()
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/utils/aiter.py", line 97, in tee_peer
    |     item = await iterator.__anext__()
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/passthrough.py", line 601, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1985, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/passthrough.py", line 591, in _atransform
    |     yield await first_map_chunk_task
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/utils/aiter.py", line 62, in anext_impl
    |     return await __anext__(iterator)
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3315, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1985, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3302, in _atransform
    |     chunk = AddableDict({step_name: task.result()})
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3285, in get_next_chunk
    |     return await py_anext(generator)
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 4748, in atransform
    |     async for item in self.bound.atransform(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1334, in atransform
    |     async for output in self.astream(final, config, **kwargs):
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/branch.py", line 400, in astream
    |     async for chunk in self.default.astream(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2900, in astream
    |     async for chunk in self.atransform(input_aiter(), config, **kwargs):
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2883, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1985, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2853, in _atransform
    |     async for output in final_pipeline:
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1316, in atransform
    |     async for ichunk in input:
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/output_parsers/transform.py", line 60, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1944, in _atransform_stream_with_config
    |     final_input: Optional[Input] = await py_anext(input_for_tracing, None)
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/utils/aiter.py", line 62, in anext_impl
    |     return await __anext__(iterator)
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/utils/aiter.py", line 97, in tee_peer
    |     item = await iterator.__anext__()
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1334, in atransform
    |     async for output in self.astream(final, config, **kwargs):
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 319, in astream
    |     raise e
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 297, in astream
    |     async for chunk in self._astream(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_anthropic/chat_models.py", line 440, in _astream
    |     params = self._format_params(messages=messages, stop=stop, **kwargs)
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_anthropic/chat_models.py", line 378, in _format_params
    |     system, formatted_messages = _format_messages(messages)
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_anthropic/chat_models.py", line 152, in _format_messages
    |     raise ValueError("System message must be at beginning of message list.")
    | ValueError: System message must be at beginning of message list.
    +------------------------------------
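For reference, the check that raises this lives in langchain_anthropic's _format_messages, which only accepts a system message at index 0 of the message list. A minimal sketch that should reproduce the error outside the RAG chain (the model name is a placeholder and an ANTHROPIC_API_KEY is assumed to be set):

```python
from langchain_core.messages import HumanMessage, SystemMessage
from langchain_anthropic import ChatAnthropic

llm = ChatAnthropic(model="claude-3-sonnet-20240229")  # placeholder model name

# Any SystemMessage that is not the first message trips the same check
# raised by _format_messages in the traceback above.
llm.invoke([
    HumanMessage(content="What does the retrieved context say?"),
    SystemMessage(content="You are a helpful assistant."),
])
# ValueError: System message must be at beginning of message list.
```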
@ryaneggz ryaneggz changed the title Anthropic RAG (Sonnet) not working. ValueError: System message must be at beginning of message list. Anthropic RAG (Sonnet & Opus) not working. ValueError: System message must be at beginning of message list. May 16, 2024

ryaneggz commented May 18, 2024

Getting this as well. When I took a look, it appears that RAG is doubling the system message sent to Anthropic instead of rewriting the question. This is the only provider currently having this issue, so it seems specific to the LangChain Anthropic integration.

langchain-ai/langchain#18909
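
For anyone hitting this in the meantime, one possible workaround is to collapse the duplicated system messages into a single one at the front of the list before they reach ChatAnthropic. A minimal sketch, assuming the prompt step emits a ChatPromptValue (as ChatPromptTemplate does) and plain-string message content; the helper and the rag_prompt name below are illustrative, not this repo's actual chain:

```python
from langchain_core.messages import SystemMessage
from langchain_core.prompt_values import ChatPromptValue
from langchain_core.runnables import RunnableLambda
from langchain_anthropic import ChatAnthropic

def merge_system_messages(value: ChatPromptValue):
    """Collapse all SystemMessages into a single one at index 0."""
    messages = value.to_messages()
    system = [m for m in messages if isinstance(m, SystemMessage)]
    rest = [m for m in messages if not isinstance(m, SystemMessage)]
    if not system:
        return rest
    # Assumes string content; multimodal content blocks would need extra handling.
    merged = SystemMessage(content="\n\n".join(m.content for m in system))
    return [merged] + rest

llm = ChatAnthropic(model="claude-3-sonnet-20240229")  # placeholder model name
# chain = rag_prompt | RunnableLambda(merge_system_messages) | llm
```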
