
open-api key needed when Ollama specified. #2

Closed

iplayfast opened this issue Dec 28, 2023 · 6 comments

Comments

@iplayfast

Doing the stock example, and specified Ollama for both agents.
Looked like it was working until it started telling me the results, and then...

 Entering new AgentExecutor chain...
Traceback (most recent call last):
  File "/home/chris/ai/aiprojects/crewAIPlayground/crewAI-examples/stock_analysis/main.py", line 54, in <module>
    result = financial_crew.run()
             ^^^^^^^^^^^^^^^^^^^^
  File "/home/chris/ai/aiprojects/crewAIPlayground/crewAI-examples/stock_analysis/main.py", line 42, in run
    result = crew.kickoff()
             ^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/crewai/crew.py", line 63, in kickoff
    return self.__sequential_loop()
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/crewai/crew.py", line 81, in __sequential_loop
    task_outcome = task.execute(task_outcome)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/crewai/task.py", line 36, in execute
    return self.agent.execute_task(
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/crewai/agent.py", line 102, in execute_task
    return self.agent_executor.invoke({
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/langchain/chains/base.py", line 89, in invoke
    return self(
           ^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/langchain/chains/base.py", line 312, in __call__
    raise e
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/langchain/chains/base.py", line 306, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/langchain/agents/agent.py", line 1312, in _call
    next_step_output = self._take_next_step(
                       ^^^^^^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/langchain/agents/agent.py", line 1038, in _take_next_step
    [
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/langchain/agents/agent.py", line 1038, in <listcomp>
    [
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/langchain/agents/agent.py", line 1066, in _iter_next_step
    output = self.agent.plan(
             ^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/langchain/agents/agent.py", line 385, in plan
    output = self.runnable.invoke(inputs, config={"callbacks": callbacks})
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1514, in invoke
    input = step.invoke(
            ^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2937, in invoke
    return self.bound.invoke(
           ^^^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 164, in invoke
    self.generate_prompt(
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 495, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 382, in generate
    raise e
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 372, in generate
    self._generate_with_cache(
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 528, in _generate_with_cache
    return self._generate(
           ^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/langchain_community/chat_models/openai.py", line 435, in _generate
    response = self.completion_with_retry(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/langchain_community/chat_models/openai.py", line 360, in completion_with_retry
    return _completion_with_retry(**kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/tenacity/__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
           ^^^^^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/tenacity/__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/tenacity/__init__.py", line 314, in iter
    return fut.result()
           ^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/tenacity/__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/langchain_community/chat_models/openai.py", line 358, in _completion_with_retry
    return self.client.create(**kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 155, in create
    response, _, api_key = requestor.request(
                           ^^^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/openai/api_requestor.py", line 299, in request
    resp, got_stream = self._interpret_response(result, stream)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/openai/api_requestor.py", line 710, in _interpret_response
    self._interpret_response_line(
  File "/home/chris/anaconda3/envs/crewAI/lib/python3.11/site-packages/openai/api_requestor.py", line 775, in _interpret_response_line
    raise self.handle_error_response(
openai.error.AuthenticationError: Incorrect API key provided: shit. You can find your API key at https://platform.openai.com/account/api-keys.
(crewAI) chris@FORGE:~/ai/aiprojects/crewAIPlayground/crewAI-examples/stock_analysis$ 
@fxtoofaan

fxtoofaan commented Dec 29, 2023

I got the same error as well now, on every run. I am using Ollama locally, testing the trip_planner script. When I run main.py, after it asks me the initial questions it spits out a long error, as seen in the above message from @iplayfast.

I noticed that the file crewai/agent.py has this in the code...

from langchain.chat_models import ChatOpenAI as OpenAI
...
@root_validator(pre=True)
def check_llm(_cls, values):
    if not values.get('llm'):
        values['llm'] = OpenAI(
            temperature=0.7,
            model_name="gpt-4"
        )
    return values

It seems like main.py is invoking the generic agent, and it's hard-coded to gpt-4. Since I removed the OpenAI API key from the .env file, it's now giving me this error even though I am using Ollama locally...

openai.error.AuthenticationError: Incorrect API key provided: KEY. You can find your API key at https://platform.openai.com/account/api-keys.

Any way to fix this? If we are using a local LLM like Ollama, it should not bother with OpenAI at all.
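
For reference, a minimal sketch of what the quoted validator means in practice and of the workaround (the role/goal/backstory strings and the openhermes model below are placeholders, not the example's real values):

from crewai import Agent
from langchain.llms import Ollama

# With no llm= argument, crewai's check_llm validator silently injects
# ChatOpenAI(temperature=0.7, model_name="gpt-4"), which is what raises the
# AuthenticationError above unless a valid OPENAI_API_KEY is set:
#     Agent(role="Analyst", goal="...", backstory="...")
#
# Passing any LangChain LLM explicitly bypasses that fallback entirely:
local_agent = Agent(
    role="Analyst",                                # placeholder role
    goal="Summarize the collected market data",    # placeholder goal
    backstory="Placeholder backstory",             # placeholder backstory
    llm=Ollama(model="openhermes"),  # any model pulled with `ollama pull` works here
)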

@dukex

dukex commented Jan 2, 2024

@iplayfast and @fxtoofaan, did you check that all Agents in the example have the llm= config? Besides the agents in /trip_agents.py, there are agents in /trip_planner/tools/browser_tools.py; can you check that these files are correctly set up?

EDIT: Same for /stock_analysis/tools/browser_tools.py
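
Since the examples build agents in several files (trip_agents.py / the stock analysis agents plus the tool files above), one low-effort pattern is to define the local model in a single module and import it everywhere an Agent is created, so no file is missed. A sketch under that assumption (the file name llms.py is purely illustrative):

# llms.py -- one place that decides which LLM the whole example uses
from langchain.llms import Ollama

ollama_llm = Ollama(model="openhermes")

# then, in trip_agents.py and in trip_planner/tools/browser_tools.py:
#     from llms import ollama_llm
#     ...
#     Agent(..., llm=ollama_llm, ...)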

@fxtoofaan

Is there an example of using a local LLM in the browser_tools.py file?

@scenaristeur

@fxtoofaan
add:

from langchain.llms import Ollama
ollama_openhermes = Ollama(model="openhermes")

and in the agent add llm=ollama_openhermes, same as in trip_agents.py, right after backstory. For example:

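(A sketch of roughly what the screenshot shows; the role/goal/backstory strings are illustrative, and the same llm= line also applies to the agent created inside browser_tools.py.)

from crewai import Agent
from langchain.llms import Ollama

ollama_openhermes = Ollama(model="openhermes")

city_selection_agent = Agent(
    role="City Selection Expert",
    goal="Select the best city based on weather, season and prices",
    backstory="An expert in analyzing travel data to pick ideal destinations",
    llm=ollama_openhermes,  # the line being described above, placed right after backstory
    verbose=True,
)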

@Biancamazzi

If you haven't solved it yet, could you please create an issue at https://github.com/joaomdmoura/crewAI/issues? Let's focus on addressing issues there.

@Biancamazzi

crewAIInc/crewAI#21

theCyberTech pushed a commit that referenced this issue Aug 24, 2024