"ModuleNotFoundError: No module named 'llama_cpp'" when using Palm API. #7898
🤖 Hello,

The error you're encountering is due to the LlamaIndex framework trying to import the `llama_cpp` module, which is not found in your environment. This module is a Python interface to the Llama C++ library and is used in the LlamaIndex framework for generating responses in a chat or completion context. However, as you're using the PaLM API, you shouldn't need the `llama_cpp` module.

The issue you're facing might be related to a known bug in the LlamaIndex framework where the embedding model defaults to OpenAI, even when the PaLM LLM is set in the `ServiceContext`. This is because the PaLM embedding model is not yet supported in LlamaIndex. You can find more details about this issue here.

A possible workaround for this issue is to extend the base class to include the PaLMEmbeddings class for the query engine. However, please note that there seems to be an issue with NaN values being generated when reading empty items from a CSV document. This issue needs to be addressed, possibly by checking the data handling in the pandas query engine.

If you're not using the `llama_cpp` module in your code, you can try to remove or comment out the import statement for `llama_cpp` in the LlamaIndex framework. However, please note that this might lead to other issues if the `llama_cpp` module is used elsewhere in the framework.

I hope this helps. If you have any other questions or need further clarification, please let me know.
@hsmsci you did not pass the service context in to `VectorStoreIndex.from_documents`.

Or, you could set it globally.
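Both options look roughly like this; a sketch against the 0.8.x-era `llama_index` API, where `llm` is assumed to be the PaLM LLM object built earlier from the tutorial and `march_2022` the loaded documents:

```python
from llama_index import (
    ServiceContext,
    VectorStoreIndex,
    set_global_service_context,
)

# `llm` is assumed to be the PaLM LLM instance created earlier
# (e.g. from llama_index.llms import PaLM; llm = PaLM(api_key=...))
service_context = ServiceContext.from_defaults(llm=llm)

# Option 1: pass the service context explicitly, so the index does not
# fall back to the default LLM resolution (OpenAI, then local llama_cpp)
march_index = VectorStoreIndex.from_documents(
    march_2022, service_context=service_context
)

# Option 2: register it globally once, so every subsequently created
# index and query engine picks it up without an explicit argument
set_global_service_context(service_context)
march_index = VectorStoreIndex.from_documents(march_2022)
```

Note that the embedding model still defaults to OpenAI unless you also pass `embed_model=...` to `ServiceContext.from_defaults`, since PaLM embeddings were not supported at the time.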
Thank you @logan-markewich, this helps.
Question Validation
Question
I am following the code from https://gpt-index.readthedocs.io/en/stable/examples/llm/palm.html to use the PaLM API for inference, and it is working fine. I am now trying to extend the code to do 10-Q analysis, using this code:
```python
# 10-Q analysis
import nest_asyncio
nest_asyncio.apply()

from llama_index import SimpleDirectoryReader, ServiceContext, VectorStoreIndex
from llama_index.response.pprint_utils import pprint_response
from llama_index.llms import OpenAI
from llama_index.tools import QueryEngineTool, ToolMetadata
from llama_index.query_engine import SubQuestionQueryEngine

service_context = ServiceContext.from_defaults(llm=llm)

march_2022 = SimpleDirectoryReader(
    input_files=["10q1.pdf"]
).load_data()

march_index = VectorStoreIndex.from_documents(march_2022)
```
I am getting this error when initializing the vector store index:
```
ModuleNotFoundError                       Traceback (most recent call last)
File /opt/conda/lib/python3.10/site-packages/llama_index/llms/llama_cpp.py:82, in LlamaCPP.__init__(self, model_url, model_path, temperature, max_new_tokens, context_window, messages_to_prompt, completion_to_prompt, callback_manager, generate_kwargs, model_kwargs, verbose)
     81 try:
---> 82     from llama_cpp import Llama
     83 except ImportError:

ModuleNotFoundError: No module named 'llama_cpp'

During handling of the above exception, another exception occurred:

ImportError                               Traceback (most recent call last)
Cell In[11], line 2
      1 llm
----> 2 march_index = VectorStoreIndex.from_documents(march_2022)
```
Why is it looking for `llama_cpp` when I am using the PaLM API? How do I fix this?