openai.error.AuthenticationError: No API key provided. You can set your API key in code using 'openai.api_key = <API-KEY>', or you can set the environment variable OPENAI_API_KEY=<API-KEY>). If your API key is stored in a file, you can point the openai module at it with 'openai.api_key_path = <PATH>'. You can generate API keys in the OpenAI web interface.
#6756
Bug Description
I am using the AzureOpenAI LLM, not the OpenAI LLM directly.
I have given all the Azure OpenAI credentials in the code, including the Azure OpenAI key, but it still shows the error:
openai.error.AuthenticationError: No API key provided. You can set your API key in code using 'openai.api_key = <API-KEY>', or you can set the environment variable OPENAI_API_KEY=<API-KEY>). If your API key is stored in a file, you can point the openai module at it with 'openai.api_key_path = <PATH>'. You can generate API keys in the OpenAI web interface. See https://platform.openai.com/account/api-keys for details.
and
tenacity.RetryError: RetryError[<Future at 0x7f99f9ea6ca0 state=finished raised AuthenticationError>]
Version
llama-index-0.7.1
Steps to Reproduce
from llama_index import VectorStoreIndex, SimpleDirectoryReader, LLMPredictor, ServiceContext
from langchain.llms import AzureOpenAI
import os
import openai
os.environ["OPENAI_API_TYPE"] = "type"
os.environ["OPENAI_API_VERSION"] = "version"
os.environ["OPENAI_API_BASE"] = "api_base"
os.environ["OPENAI_API_KEY"] = "azure_open_ai_key"
llm_predictor = LLMPredictor(llm=AzureOpenAI(temperature=0, model_name="model_name"))
service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor)
documents = SimpleDirectoryReader('Data/').load_data()
custom_llm_index = VectorStoreIndex.from_documents(documents, service_context=service_context)
custom_llm_query_engine = custom_llm_index.as_query_engine()
response = custom_llm_query_engine.query("who is this text about?")
print(response)
Relevant Logs/Tracebacks
No response
I faced the same issue. After digging into the llama-index code, I realized the key access happens at file (module) scope rather than function scope, which causes a problem if you import llama_index first and only then set the environment variables.
This should be corrected in the library itself, but in the meantime you can use the workaround below: set the environment variables before importing llama_index.
import os
import openai
# Set the credentials BEFORE importing llama_index
os.environ["OPENAI_API_TYPE"] = "type"
os.environ["OPENAI_API_VERSION"] = "version"
os.environ["OPENAI_API_BASE"] = "api_base"
os.environ["OPENAI_API_KEY"] = "azure_open_ai_key"
from llama_index import VectorStoreIndex, SimpleDirectoryReader, LLMPredictor, ServiceContext
from langchain.llms import AzureOpenAI
llm_predictor = LLMPredictor(llm=AzureOpenAI(temperature=0, model_name="model_name"))
service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor)
documents = SimpleDirectoryReader('Data/').load_data()
custom_llm_index = VectorStoreIndex.from_documents(documents, service_context=service_context)
custom_llm_query_engine = custom_llm_index.as_query_engine()
response = custom_llm_query_engine.query("who is this text about?")
print(response)
@avi0gaur
Thanks for the reply.
I tried another method and it also worked: instead of storing the credentials in environment variables, set them directly as attributes on the openai module.
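For reference, a minimal sketch of that module-attribute approach with the openai 0.x library (the placeholder values below are hypothetical; substitute your own Azure resource endpoint, API version, and key):

```python
import openai

# Hypothetical placeholder values -- replace with your real Azure settings.
openai.api_type = "azure"
openai.api_version = "2023-05-15"  # example version string; yours may differ
openai.api_base = "https://<your-resource>.openai.azure.com/"
openai.api_key = "azure_open_ai_key"
```

Because these are set on the openai module itself, they take effect regardless of when llama_index is imported, which sidesteps the import-order problem described above.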