Query engine empty output #8756
Replies: 3 comments 1 reply
-
🤖 Based on the information you've provided and the similar issues I found in the LlamaIndex repository, there are a few things you could try to resolve the issue of receiving empty outputs from your query engine.
I hope these suggestions help you resolve the issue. If you have any further questions or if the issue persists, please provide more details about your setup and the steps you've taken so far.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
-
@trigun117 maybe try lowering the context window in the service context a bit? Usually empty responses are because the prompt went over the token limit 🤔
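With the ServiceContext API the llama.cpp tutorial uses, that would look roughly like the sketch below; the model path and both window sizes are placeholders, not values from this thread:

```python
from llama_index import ServiceContext
from llama_index.llms import LlamaCPP

# Placeholder model path -- substitute your own GGUF weights.
llm = LlamaCPP(
    model_path="./models/llama-2-13b-chat.gguf",
    context_window=3900,   # what the model itself supports
    max_new_tokens=256,
)

# Budget prompts against a smaller window than the model's real limit,
# leaving headroom for the retrieved context plus the generated tokens.
service_context = ServiceContext.from_defaults(
    llm=llm,
    context_window=3000,
    num_output=256,
)
```

If the prompt LlamaIndex assembles exceeds what the model can actually fit, generation can come back empty without raising an error, so keeping a margin here is the usual first thing to check.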
-
It seems like there was a model issue. I replaced the model.
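(For anyone hitting the same thing: with LlamaCPP, swapping models is just a matter of pointing model_path at different local weights, or model_url at a file to auto-download; the path below is a placeholder.)

```python
from llama_index.llms import LlamaCPP

# Placeholder weights -- substitute whichever model works for you.
llm = LlamaCPP(
    model_path="./models/another-model.gguf",
    context_window=3900,
    max_new_tokens=256,
)
```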
-
I followed the tutorial at https://docs.llamaindex.ai/en/stable/examples/llm/llama_2_llama_cpp.html to build a solution for querying my documents. The initial prompts were successful and I received the expected responses, but all subsequent requests with the same configuration and data produce empty outputs. I don't see any errors or anything else that would give me context about what's wrong. I'm using llama-cpp-python, and I faced the same problem when trying to use langchain.

Code itself:
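The setup follows the tutorial closely; roughly this, with placeholders for the model path, data directory, and query:

```python
from llama_index import SimpleDirectoryReader, VectorStoreIndex, ServiceContext
from llama_index.llms import LlamaCPP
from llama_index.llms.llama_utils import messages_to_prompt, completion_to_prompt

llm = LlamaCPP(
    model_path="./models/llama-2-13b-chat.gguf",  # placeholder path
    temperature=0.1,
    max_new_tokens=256,
    context_window=3900,
    model_kwargs={"n_gpu_layers": 1},
    # Wrap queries in Llama 2's chat template, as the tutorial does.
    messages_to_prompt=messages_to_prompt,
    completion_to_prompt=completion_to_prompt,
    verbose=True,
)

service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")
documents = SimpleDirectoryReader("./data").load_data()  # placeholder directory
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
query_engine = index.as_query_engine()

response = query_engine.query("What is the document about?")  # placeholder query
print(response)
```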
Empty response:
Non-empty response: