
IndexError: list index out of range #177

Closed · duob-ai opened this issue Apr 22, 2024 · 3 comments

duob-ai commented Apr 22, 2024

I'm using a simple RAG chain (sample code below). With ChatVertexAI as my LLM, I get the error below only for certain question prompts: some prompts work fine, while others consistently throw the IndexError (the same prompt always fails in the same way). Switching the LLM to AzureAI makes the problem disappear.

Error Log:
IndexError('list index out of range')

Traceback (most recent call last):

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1979, in _atransform_stream_with_config
chunk: Output = await asyncio.create_task( # type: ignore[call-arg]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "/usr/local/lib/python3.11/site-packages/langchain_core/tracers/log_stream.py", line 237, in tap_output_aiter
async for chunk in output:

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2853, in _atransform
async for output in final_pipeline:

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 4734, in atransform
async for item in self.bound.atransform(

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2883, in atransform
async for chunk in self._atransform_stream_with_config(

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1979, in _atransform_stream_with_config
chunk: Output = await asyncio.create_task( # type: ignore[call-arg]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "/usr/local/lib/python3.11/site-packages/langchain_core/tracers/log_stream.py", line 237, in tap_output_aiter
async for chunk in output:

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2853, in _atransform
async for output in final_pipeline:

File "/usr/local/lib/python3.11/site-packages/langchain_core/output_parsers/transform.py", line 60, in atransform
async for chunk in self._atransform_stream_with_config(

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1979, in _atransform_stream_with_config
chunk: Output = await asyncio.create_task( # type: ignore[call-arg]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "/usr/local/lib/python3.11/site-packages/langchain_core/tracers/log_stream.py", line 237, in tap_output_aiter
async for chunk in output:

File "/usr/local/lib/python3.11/site-packages/langchain_core/output_parsers/transform.py", line 38, in _atransform
async for chunk in input:

File "/usr/local/lib/python3.11/site-packages/langchain_core/utils/aiter.py", line 97, in tee_peer
item = await iterator.__anext__()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1333, in atransform
async for output in self.astream(final, config, **kwargs):

File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 315, in astream
raise e

File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 293, in astream
async for chunk in self._astream(

File "/usr/local/lib/python3.11/site-packages/langchain_google_vertexai/chat_models.py", line 768, in _astream
message = _parse_response_candidate(chunk.candidates[0], streaming=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "/usr/local/lib/python3.11/site-packages/langchain_google_vertexai/chat_models.py", line 325, in _parse_response_candidate
first_part = response_candidate.content.parts[0]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^
IndexError: list index out of range

Sample Code

# Vertex AI (get_retriever and create_chain are my own helpers)
from langchain_google_vertexai import ChatVertexAI

def build_answer_chain():
    llm = ChatVertexAI(model_name="gemini-1.5-pro-preview-0409")
    retriever = get_retriever()
    answer_chain = create_chain(
        llm,
        retriever,
    )
    return answer_chain
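
For completeness, a minimal sketch of how the streaming path in the traceback gets triggered (the prompt string is a placeholder for one of the failing questions):

# Repro sketch: stream the chain built above with an affected prompt.
import asyncio

async def main() -> None:
    chain = build_answer_chain()  # defined in the sample code above
    async for chunk in chain.astream("<one of the failing prompts>"):
        print(chunk, end="", flush=True)

asyncio.run(main())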

Library Versions
langchain 0.1.16
langchain-google-vertexai 1.0.1

Possibly related: langchain-ai/langchain#17800

lkuligin (Collaborator) commented:

I added retries that also check for empty generations; please take a look and see whether that solves the problem.

For streaming, there's also a new flag that checks for empty generations, but it essentially breaks streaming: it waits until the full response has been generated.
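
For illustration only (this is not the new library flag, and invoke_with_retry is a hypothetical helper), a user-side retry on empty generations could look roughly like this:

# Hypothetical workaround sketch: re-issue the request when Gemini returns a
# candidate with no parts, which is what surfaces as the IndexError above.
from langchain_core.messages import BaseMessage

def invoke_with_retry(llm, prompt, max_attempts: int = 3) -> BaseMessage:
    last_error: Exception | None = None
    for _ in range(max_attempts):
        try:
            return llm.invoke(prompt)
        except IndexError as exc:  # empty candidates/parts from the model
            last_error = exc
    raise last_error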

lkuligin (Collaborator) commented:

This should be fixed with the recent release. Closing, but please feel free to re-open if you still observe the issue.

duob-ai (Author) commented Apr 27, 2024

@lkuligin I updated my project to the latest release. It now throws a new error for the same prompts that were failing before:

Tracing with LangSmith:

TypeError("Additional kwargs key is_blocked already exists in left dict and value has unsupported type <class 'bool'>.")Traceback (most recent call last):

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1980, in _atransform_stream_with_config
chunk: Output = await asyncio.create_task( # type: ignore[call-arg]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "/usr/local/lib/python3.11/site-packages/langchain_core/tracers/log_stream.py", line 237, in tap_output_aiter
async for chunk in output:

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2853, in _atransform
async for output in final_pipeline:

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 4748, in atransform
async for item in self.bound.atransform(

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2883, in atransform
async for chunk in self._atransform_stream_with_config(

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1980, in _atransform_stream_with_config
chunk: Output = await asyncio.create_task( # type: ignore[call-arg]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "/usr/local/lib/python3.11/site-packages/langchain_core/tracers/log_stream.py", line 237, in tap_output_aiter
async for chunk in output:

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2853, in _atransform
async for output in final_pipeline:

File "/usr/local/lib/python3.11/site-packages/langchain_core/output_parsers/transform.py", line 60, in atransform
async for chunk in self._atransform_stream_with_config(

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1980, in _atransform_stream_with_config
chunk: Output = await asyncio.create_task( # type: ignore[call-arg]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "/usr/local/lib/python3.11/site-packages/langchain_core/tracers/log_stream.py", line 237, in tap_output_aiter
async for chunk in output:

File "/usr/local/lib/python3.11/site-packages/langchain_core/output_parsers/transform.py", line 38, in _atransform
async for chunk in input:

File "/usr/local/lib/python3.11/site-packages/langchain_core/utils/aiter.py", line 97, in tee_peer
item = await iterator.__anext__()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1334, in atransform
async for output in self.astream(final, config, **kwargs):

File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 319, in astream
raise e

File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 312, in astream
generation += chunk

File "/usr/local/lib/python3.11/site-packages/langchain_core/outputs/chat_generation.py", line 74, in add
generation_info = merge_dicts(
^^^^^^^^^^^^

File "/usr/local/lib/python3.11/site-packages/langchain_core/utils/_merge.py", line 40, in merge_dicts
raise TypeError(

TypeError: Additional kwargs key is_blocked already exists in left dict and value has unsupported type <class 'bool'>.
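
For context, the merge failure can be reproduced outside the chain with two streamed chunks that both carry is_blocked in their generation_info; a minimal sketch, assuming langchain-core ~0.1.x merge behavior (differing boolean values are used to force the conflict):

# Minimal sketch of the merge that fails when streamed chunks are accumulated
# with `generation += chunk` (see the traceback above).
from langchain_core.messages import AIMessageChunk
from langchain_core.outputs import ChatGenerationChunk

left = ChatGenerationChunk(
    message=AIMessageChunk(content="Hello"),
    generation_info={"is_blocked": False},
)
right = ChatGenerationChunk(
    message=AIMessageChunk(content=" world"),
    generation_info={"is_blocked": True},
)

# ChatGenerationChunk.__add__ merges generation_info via merge_dicts, which
# cannot combine two conflicting boolean values for the same key and raises
# the TypeError shown above.
merged = left + right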
