
openai.error.AuthenticationError: No API key provided. You can set your API key in code using 'openai.api_key = <API-KEY>', or you can set the environment variable OPENAI_API_KEY=<API-KEY>). If your API key is stored in a file, you can point the openai module at it with 'openai.api_key_path = <PATH>'. You can generate API keys in the OpenAI web interface. #6756

Closed
pradeepdev-1995 opened this issue Jul 6, 2023 · 3 comments
Labels: bug (Something isn't working), triage (Issue needs to be triaged/prioritized)

Comments

@pradeepdev-1995

pradeepdev-1995 commented Jul 6, 2023

Bug Description

I am using the AzureOpenAI LLM (via langchain), not the OpenAI LLM directly.
I have provided all of the Azure OpenAI credentials in the code, including the Azure OpenAI key, but it still raises the error:

openai.error.AuthenticationError: No API key provided. You can set your API key in code using 'openai.api_key = <API-KEY>', or you can set the environment variable OPENAI_API_KEY=<API-KEY>). If your API key is stored in a file, you can point the openai module at it with 'openai.api_key_path = <PATH>'. You can generate API keys in the OpenAI web interface. See https://platform.openai.com/account/api-keys for details.

and

tenacity.RetryError: RetryError[<Future at 0x7f99f9ea6ca0 state=finished raised AuthenticationError>] 

Version

llama-index-0.7.1

Steps to Reproduce

from llama_index import VectorStoreIndex, SimpleDirectoryReader, LLMPredictor, ServiceContext
from langchain.llms import AzureOpenAI
import os
import openai

os.environ["OPENAI_API_TYPE"] = "type"
os.environ["OPENAI_API_VERSION"] = "version"
os.environ["OPENAI_API_BASE"] = "api_base"
os.environ["OPENAI_API_KEY"] = "azure_open_ai_key"


llm_predictor = LLMPredictor(llm=AzureOpenAI(temperature=0, model_name="model_name"))
service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor)


documents = SimpleDirectoryReader('Data/').load_data()
custom_llm_index = VectorStoreIndex.from_documents(documents, service_context=service_context)
custom_llm_query_engine = custom_llm_index.as_query_engine()
response = custom_llm_query_engine.query("who is this text about?")
print(response)

Relevant Logs/Tracebacks

No response

@pradeepdev-1995 added the bug and triage labels on Jul 6, 2023
@avi0gaur

avi0gaur commented Jul 6, 2023

Run the code below; it should work.

I faced the same issue, and after digging into the llama-index code, I realized the key is read at file-level (import) scope rather than inside a function. That creates a problem if you import the library first and set the environment variables afterwards.

This should be corrected in the library; in the meantime you can use the workaround below.

import os

os.environ["OPENAI_API_TYPE"] = "type"
os.environ["OPENAI_API_VERSION"] = "version"
os.environ["OPENAI_API_BASE"] = "api_base"
os.environ["OPENAI_API_KEY"] = "azure_open_ai_key"

from llama_index import VectorStoreIndex, SimpleDirectoryReader, LLMPredictor, ServiceContext
from langchain.llms import AzureOpenAI
import openai

llm_predictor = LLMPredictor(llm=AzureOpenAI(temperature=0, model_name="model_name"))
service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor)


documents = SimpleDirectoryReader('Data/').load_data()
custom_llm_index = VectorStoreIndex.from_documents(documents, service_context=service_context)
custom_llm_query_engine = custom_llm_index.as_query_engine()
response = custom_llm_query_engine.query("who is this text about?")
print(response)
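The import-order pitfall described above can be shown with a small, self-contained sketch (this is an illustration, not llama-index itself; `demo_client` and `DEMO_API_KEY` are hypothetical names). A value read at module (file-level) scope is captured once at import time, while a value read inside a function sees changes made after import:

```python
import os
import sys
import tempfile
import textwrap

# A throwaway module that reads the key both at import (file-level)
# scope and lazily inside a function.
module_src = textwrap.dedent("""
    import os

    # File-level read: captured once, when the module is imported.
    API_KEY = os.environ.get("DEMO_API_KEY", "")

    def get_key_lazily():
        # Function-level read: sees changes made after import.
        return os.environ.get("DEMO_API_KEY", "")
""")

tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "demo_client.py"), "w") as f:
    f.write(module_src)
sys.path.insert(0, tmp)

os.environ.pop("DEMO_API_KEY", None)
import demo_client  # imported BEFORE the variable is set

os.environ["DEMO_API_KEY"] = "set-after-import"

print(repr(demo_client.API_KEY))     # '' — the file-level read missed it
print(demo_client.get_key_lazily())  # set-after-import
```

This is why the workaround sets the environment variables before the llama-index import: the file-level read then sees them.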

@pradeepdev-1995
Author

@avi0gaur
Thanks for the reply.
I tried another method and it also worked. Instead of storing the credentials in environment variables, set them as openai module attributes, for example:

import os
import openai

openai.api_type = "type"
openai.api_base = "api_base"
openai.api_version = "version"
os.environ["OPENAI_API_KEY"] = "azure_open_ai_key"
openai.api_key = os.getenv("OPENAI_API_KEY")

It worked for me.
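A minimal sketch of why assigning the key in code sidesteps the import-order issue (`FakeClient` is a hypothetical stand-in, not the real openai package; the fallback order shown is an assumption for illustration). An attribute assigned after import is read directly at request time, so it does not matter when the environment variable was set:

```python
import os

# Hypothetical stand-in for a client module: an attribute assigned in
# code is preferred; the environment variable is only a fallback.
class FakeClient:
    api_key = None  # analogous to openai.api_key

    @classmethod
    def resolve_key(cls):
        # Attribute set in code wins; otherwise consult the env var.
        return cls.api_key or os.environ.get("OPENAI_API_KEY")

os.environ.pop("OPENAI_API_KEY", None)
print(FakeClient.resolve_key())  # None — nothing configured yet

FakeClient.api_key = "azure_open_ai_key"
print(FakeClient.resolve_key())  # azure_open_ai_key
```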

@logan-markewich
Collaborator

I will also note that we just released our own Azure LLM implementation (in v0.7.2), which includes clearer error messages to help with the setup.

See details here: https://gpt-index.readthedocs.io/en/latest/examples/customization/llms/AzureOpenAI.html
