
"ModuleNotFoundError: No module named 'llama_cpp'" when using Palm API. #7898

Closed
1 task done
hpsc3 opened this issue Sep 29, 2023 · 3 comments
Labels
question Further information is requested

Comments


hpsc3 commented Sep 29, 2023

Question Validation

  • I have searched both the documentation and discord for an answer.

Question

I am following the code from https://gpt-index.readthedocs.io/en/stable/examples/llm/palm.html to use the PaLM API for inference, and it works fine. I am now trying to extend that code to do 10-Q analysis, using this code:

```python
# 10-Q analysis
import nest_asyncio

nest_asyncio.apply()

from llama_index import SimpleDirectoryReader, ServiceContext, VectorStoreIndex
from llama_index.response.pprint_utils import pprint_response
from llama_index.llms import OpenAI
from llama_index.tools import QueryEngineTool, ToolMetadata
from llama_index.query_engine import SubQuestionQueryEngine

# llm is the PaLM LLM created earlier, following the linked example
service_context = ServiceContext.from_defaults(llm=llm)

march_2022 = SimpleDirectoryReader(
    input_files=["10q1.pdf"]
).load_data()

march_index = VectorStoreIndex.from_documents(march_2022)
```

I am getting this error when initializing the vector store index:


```
ModuleNotFoundError                       Traceback (most recent call last)
File /opt/conda/lib/python3.10/site-packages/llama_index/llms/llama_cpp.py:82, in LlamaCPP.__init__(self, model_url, model_path, temperature, max_new_tokens, context_window, messages_to_prompt, completion_to_prompt, callback_manager, generate_kwargs, model_kwargs, verbose)
     81 try:
---> 82     from llama_cpp import Llama
     83 except ImportError:

ModuleNotFoundError: No module named 'llama_cpp'

During handling of the above exception, another exception occurred:

ImportError                               Traceback (most recent call last)
Cell In[11], line 2
      1 llm
----> 2 march_index = VectorStoreIndex.from_documents(march_2022)
```

Why is it looking for llama_cpp when I am using the PaLM API? How do I fix this?

@hpsc3 hpsc3 added the question Further information is requested label Sep 29, 2023

dosubot bot commented Sep 29, 2023

🤖

Hello,

The error you're encountering is due to the LlamaIndex framework trying to import the 'llama_cpp' module, which is not found in your environment. This module is a Python interface to the Llama C++ library and is used in the LlamaIndex framework for generating responses in a chat or completion context.

However, as you're using the Palm API, you shouldn't need the 'llama_cpp' module. The issue you're facing might be related to a known bug in the LlamaIndex framework where the embedding model defaults to OpenAI, even when the Palm LLM is set in the ServiceContext. This is because the PaLM embedding model is not yet supported in LlamaIndex. You can find more details about this issue here.

A possible workaround for this issue is to extend the base class to include the PaLMEmbeddings class for the query_engine. However, please note that there seems to be an issue with NaN values being generated when reading empty items from a CSV document. This issue needs to be addressed, possibly by checking the data handling in the pandas query engine.

If you're not using the 'llama_cpp' module in your code, you can try to remove or comment out the import statement for 'llama_cpp' in the LlamaIndex framework. However, please note that this might lead to other issues if the 'llama_cpp' module is used elsewhere in the framework.

I hope this helps. If you have any other questions or need further clarification, please let me know.

Best,
Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

@logan-markewich
Collaborator

@hsmsci you did not pass in the service context:

```python
march_index = VectorStoreIndex.from_documents(march_2022, service_context=service_context)
```

Or, you could set it globally:

```python
from llama_index import set_global_service_context

set_global_service_context(service_context)
```
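For anyone puzzled by why *omitting* an argument triggers a llama_cpp import, here is a toy sketch of the mechanism. This is not the real llama_index API; the names `from_documents`, `DEFAULT_LLM`, and the string values are hypothetical stand-ins for the default-fallback pattern: when no `service_context` is passed, the library falls back to a global default, and resolving that default is what attempts the `llama_cpp` import.

```python
# Toy illustration (NOT the real llama_index API) of a default fallback:
# the index builder takes an optional service_context, and when none is
# given it reaches for a default backend -- analogous to how the real
# VectorStoreIndex.from_documents ends up importing llama_cpp.

DEFAULT_LLM = "llama_cpp"  # hypothetical default backend name

def from_documents(docs, service_context=None):
    # No service_context supplied -> fall back to the global default,
    # which is where the unexpected import (and error) comes from.
    llm = service_context if service_context is not None else DEFAULT_LLM
    return f"index built with {llm}"

print(from_documents(["10q1.pdf"]))                          # default path
print(from_documents(["10q1.pdf"], service_context="palm"))  # explicit PaLM
```

Passing the context explicitly (or setting it globally, as above) is what routes the call away from the default backend.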

hpsc3 (Author) commented Sep 29, 2023

Thank you @logan-markewich this helps.
