
[Bug]: ModuleNotFoundError: No module named 'llama_index.llms.anthropic_utils' #91

Closed

saadenr opened this issue Mar 15, 2024 · 5 comments

Labels: auto:bug, bug (Something isn't working), triage
saadenr commented Mar 15, 2024

Contact Details

saadennigro@gmail.com

What happened?

I hit this ModuleNotFoundError while trying to use Portkey-AI integrated with LlamaIndex.
If anyone has run into the same issue, please chime in.

I have the latest versions of LlamaIndex (0.10.0) and Portkey-AI (1.1.7).

Here's the code:

# Setup a custom service context by passing in the Portkey LLM
from llama_index.core import ServiceContext
from portkey_ai.llms.llama_index import PortkeyLLM

portkey = PortkeyLLM(api_key="PORTKEY_API_KEY", virtual_key="VIRTUAL_KEY")
service_context = ServiceContext.from_defaults(llm=portkey)

It's taken straight from Portkey's official documentation.
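For readers hitting the same error: llama-index 0.10 restructured the package into namespaced sub-packages, so a 0.9-era module path such as llama_index.llms.anthropic_utils may simply no longer exist even though llama-index itself is installed. A minimal, standard-library-only sketch to check whether a dotted module path resolves in the current environment:

```python
import importlib.util

def module_exists(dotted_path: str) -> bool:
    """Return True if the dotted module path resolves, without importing it."""
    try:
        return importlib.util.find_spec(dotted_path) is not None
    except ModuleNotFoundError:
        # A missing parent package also means the module is unavailable.
        return False

# Probe the legacy 0.9.x path that portkey-ai 1.1.7 tries to import:
print(module_exists("llama_index.llms.anthropic_utils"))
```

If this prints False while llama-index 0.10.x is installed, the problem is the moved module, not a missing installation.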

Version

0.1.xx (Default)

Relevant log output

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
File c:\Users\saad\OneDrive\Bureau\LlamaChat-RAG-S2B\.conda\lib\site-packages\portkey_ai\llms\llama_index\utils.py:20, in all_available_models()
     19 try:
---> 20     from llama_index.llms.anthropic_utils import CLAUDE_MODELS
     21     from llama_index.llms.openai_utils import (
     22         AZURE_TURBO_MODELS,
     23         GPT3_5_MODELS,
   (...)
     26         TURBO_MODELS,
     27     )

ModuleNotFoundError: No module named 'llama_index.llms.anthropic_utils'

The above exception was the direct cause of the following exception:

Exception                                 Traceback (most recent call last)
Cell In[18], line 3
      1 # Setup a custom service context by passing in the Portkey LLM
      2 from llama_index.core import ServiceContext
----> 3 from portkey_ai.llms.llama_index import PortkeyLLM
      5 portkey = PortkeyLLM(api_key="PORTKEY_API_KEY", virtual_key="VIRTUAL_KEY")
      6 service_context = ServiceContext.from_defaults(llm=portkey)

File c:\Users\saad\OneDrive\Bureau\LlamaChat-RAG-S2B\.conda\lib\site-packages\portkey_ai\llms\llama_index\__init__.py:1
...
     36     }
     37 except ImportError as exc:
---> 38     raise Exception(IMPORT_ERROR_MESSAGE) from exc

Exception: Llama-Index is not installed.Please install it with `pip install llama-index`.

saadenr added the bug and triage labels on Mar 15, 2024
dosubot bot commented Mar 15, 2024

Hello @saadenr! Great to see you reaching out again. Hope you're doing well!

The error you're encountering, "ModuleNotFoundError: No module named 'llama_index.llms.anthropic_utils'", means the Python interpreter cannot locate the module llama_index.llms.anthropic_utils. This can happen for several reasons:

  1. Llama-Index Package Not Installed: The most common cause for this error is that the Llama-Index package, which should contain the anthropic_utils module, is not installed in your environment. The error handling in your code suggests installing Llama-Index using pip install llama-index. Please ensure that this package is installed in the same Python environment where you're running your script.

  2. Incorrect Package Version: If the Llama-Index package is installed but the error persists, it's possible that the version of Llama-Index you have does not include the anthropic_utils module. This could happen if the module was added in a later version than the one you have installed. Check the documentation or release notes of Llama-Index to verify if a specific version is required and update the package accordingly.

  3. Incorrect Module Path: The error could also result from an incorrect import statement if the anthropic_utils module has been moved to a different location within the Llama-Index package or if there's a typo in the module path. Verify the correct path by consulting the latest documentation of Llama-Index.

  4. Python Environment Issues: If you have multiple Python environments (e.g., created with virtualenv, conda), ensure that you're installing the Llama-Index package in the correct environment. Sometimes, packages are installed in one environment while the script is executed in another, leading to such errors.

To resolve the issue, you should:

  • Ensure that Llama-Index is installed in the correct environment with pip install llama-index.
  • Verify that you have the correct version of Llama-Index that includes the anthropic_utils module.
  • Check the import path for typos or changes in the module structure.
  • Confirm that you're working in the correct Python environment.
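The environment and version checks above can be run programmatically. A hedged, standard-library-only sketch (the distribution names below are the common PyPI names and may differ in your setup):

```python
import sys
from importlib import metadata

def installed_version(package: str) -> str:
    """Return the installed version of a distribution, or 'not installed'."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return "not installed"

# Confirm the interpreter running your notebook is the one you installed into.
print("interpreter:", sys.executable)
for pkg in ("llama-index", "llama-index-core", "portkey-ai"):
    print(pkg, "->", installed_version(pkg))
```

If the interpreter path points at a different environment than the one you ran `pip install` in, that alone explains the error.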

Regarding your question about the role of the 'llama_index.llms.anthropic_utils' module in the Portkey Python SDK, it is used to import the CLAUDE_MODELS dictionary. This dictionary presumably contains information or configurations related to models developed by Anthropic, specifically Claude models, as indicated by the naming convention. These models are then included in the aggregation of all available models within the SDK, as seen in the all_available_models function. This function compiles a comprehensive dictionary of models from various sources, including those from OpenAI (e.g., GPT-3, GPT-4, Azure Turbo models) and the Claude models from Anthropic, making them accessible for further operations within the SDK.
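The guard that produces the misleading final message can be sketched from the traceback above. This is a simplified reconstruction, not Portkey's actual implementation; IMPORT_ERROR_MESSAGE and the import path are taken from the log:

```python
# Sketch of the import guard in portkey_ai/llms/llama_index/utils.py,
# reconstructed from the traceback -- not the SDK's actual code.
IMPORT_ERROR_MESSAGE = (
    "Llama-Index is not installed.Please install it with `pip install llama-index`."
)

def all_available_models() -> dict:
    try:
        # 0.9.x-era import path; removed by the llama-index 0.10 restructuring.
        from llama_index.llms.anthropic_utils import CLAUDE_MODELS
        return dict(CLAUDE_MODELS)
    except ImportError as exc:
        # Any ImportError is reported as "not installed", so a *moved*
        # module produces the same message as a genuinely missing package.
        raise Exception(IMPORT_ERROR_MESSAGE) from exc
```

This is why the final message says llama-index is not installed even when it is: the broad `except ImportError` cannot distinguish a missing package from a relocated module.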

You can find more details in the portkey_ai/llms/llama_index/utils.py file in the Portkey Python SDK repository.

Please let me know if you need further assistance.


dosubot added the auto:bug label on Mar 15, 2024
vrushankportkey (Collaborator) commented

@saadenr thank you for reporting this bug. Tagging @VisargD @csgulati09 who can look into this

vrushankportkey (Collaborator) commented

@saadenr quick update - we are working on updating our Llamaindex integration and this will be fixed upstream.

paul-tharun commented

@vrushankportkey is this fixed? We are facing the same issue on the latest versions of both packages

vrushankportkey (Collaborator) commented

@paul-tharun @saadenr - folks, sorry for the delay! We have updated our Llamaindex documentation that fixes all Portkey integration issues.

Please check it out here - https://portkey.ai/docs/welcome/integration-guides/llama-index-python

I'll close this thread now, please feel free to reopen it if you have any more questions!
