OpenAI API key needed when Ollama specified. #2
I get the same error now, on every run. I am using Ollama locally, testing the trip_planner script. When I run main.py, after it asks the initial questions it prints a long error like the one in the message above from @iplayfast. I noticed that crewai/agent.py contains `from langchain.chat_models import ChatOpenAI as OpenAI`, so it seems main.py invokes the generic agent, which is hard-coded to GPT-4. Since I removed the OpenAI API key from the .env file, it now fails with: `openai.error.AuthenticationError: Incorrect API key provided: KEY. You can find your API key at https://platform.openai.com/account/api-keys.` Is there any way to fix this? If we are using a local LLM like Ollama, it should not involve OpenAI at all.
@iplayfast and @fxtoofaan, did you check that all Agents in the example have the … EDIT: Same to …
Is there an example of using a local LLM in the browser_tools.py file?
@fxtoofaan: use `from langchain.llms import Ollama` and pass it in the agent.
If you haven't solved it yet, could you please open an issue at https://github.com/joaomdmoura/crewAI/issues? Let's focus on addressing issues there.
Doing the stock example, I specified Ollama for both agents.
It looked like it was working until it started telling me the results, and then....