Works fine with OpenAI, but local Ollama and LM-Studio fail with co-worker errors #602
I can confirm this. Running with llama3 locally gives me the same error, "Co-worker mentioned not found...".
For the error "Co-worker mentioned not found" I think it's a regression. I could reproduce the error with:

```python
from crewai import Agent, Task, Crew, Process
from crewai_tools import tool
from langchain_openai import ChatOpenAI
import os

os.environ["OPENAI_API_KEY"] = "NA"

llm = ChatOpenAI(model="llama3:8b", base_url="http://localhost:11434/v1")

# Agents
luke = Agent(
    role="pilot",
    goal="Destroy the Death Star",
    backstory="The young pilot destined to become a Jedi, summoned to attack the Death Star",
    llm=llm,
)
leia = Agent(
    role="strategist",
    goal="Coordinate the attack on the Death Star",
    backstory="The princess leading the Rebellion, essential for strategy and communication.",
    llm=llm,
)

# Tasks
coordenar_ataque = Task(
    description="""Leia must coordinate the mission,
    maintaining communication and providing strategic support.
    Leia must ensure everything is in order, clearing a safe path for Luke.""",
    expected_output="""Attack coordinated successfully, Death Star destroyed. All units informed and aligned.""",
    agent=leia,
    allow_delegation=True,
)
destruir_estrela_morte = Task(
    description="""Luke must pilot his X-Wing and shoot the Death Star's weak point to destroy it.""",
    expected_output="""Death Star destroyed, mission accomplished.""",
    agent=luke,
)

# Crew
alianca_rebelde = Crew(
    agents=[leia, luke],
    tasks=[coordenar_ataque, destruir_estrela_morte],
    verbose=2,
    manager_llm=llm,
)

alianca_rebelde.kickoff()
```

I did a bisect and the bad commit is 0b78106. Before that, it works without error.
@italovieira is correct... reverting the one line changed DOES help. It's not perfect, but it helps greatly.
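For context on how a "Co-worker mentioned not found" error can arise at all, here is a hypothetical sketch (not CrewAI's actual code) of why an exact-match lookup of co-worker role names is fragile with local models, which often vary casing, whitespace, or quoting when filling in tool arguments:

```python
# Hypothetical illustration: strict vs. lenient matching of a co-worker
# role name supplied by the LLM against the crew's registered agent roles.

def find_agent(roles, coworker):
    # Strict lookup: fails on "Strategist", " strategist\n", '"strategist"'.
    return next((r for r in roles if r == coworker), None)

def find_agent_lenient(roles, coworker):
    # Normalizing both sides tolerates the variations local models produce.
    norm = lambda s: s.strip().strip('"\'').casefold()
    return next((r for r in roles if norm(r) == norm(coworker)), None)

roles = ["pilot", "strategist"]
print(find_agent(roles, " Strategist"))          # → None ("not found")
print(find_agent_lenient(roles, " Strategist"))  # → strategist
```

This would also be consistent with the observation that OpenAI models, which follow tool-argument formats more strictly, do not trigger the error.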
When running locally, I can't get past co-worker errors. When I run the exact same workflow on OpenAI (using the last Exa example, below), it works fine.
On local Ollama or LM-Studio, it seems to be sensitive to the search tool.
I get errors like this with each of the following configurations:

- `search_tool = DuckDuckGoSearchRun()`
- `search_tool = SerperDevTool()`
- `search_tool = ExaSearchTool.search`
- `search_tool = ExaSearchTool.search` AND `find_similar_tool = ExaSearchTool.find_similar` AND `search_and_contents_tool = ExaSearchTool.search_and_contents`
'internet research' definitely exists...
CrewAI does not seem to play nice with anything local, and I have tried dozens of different configs :/
Presumably the tools behave the same either way, so perhaps the OpenAI-compatible APIs of Ollama and LM Studio are not as compatible as claimed? Or is it something in CrewAI itself?
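One way to isolate the endpoint from CrewAI is to post a raw chat-completions request directly to Ollama's OpenAI-compatible API. A minimal sketch that builds such a request and the corresponding `curl` command (the model name and default port are assumptions taken from the repro above):

```python
import json

# The minimal OpenAI-style chat-completions payload that any server
# claiming OpenAI compatibility (Ollama at /v1, LM Studio) must accept.
payload = {
    "model": "llama3:8b",  # assumed: a model already pulled into Ollama
    "messages": [
        {"role": "user", "content": "Reply with the single word: pong"},
    ],
}

# The equivalent curl command; running it bypasses CrewAI entirely.
curl = (
    "curl http://localhost:11434/v1/chat/completions "
    "-H 'Content-Type: application/json' "
    f"-d '{json.dumps(payload)}'"
)
print(curl)
```

If this direct request succeeds while the CrewAI run fails, the problem is more likely in how CrewAI drives the model than in the endpoint's compatibility.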
The very simple code I am using to test is at https://github.com/tholonia/crewai-blogger/blob/main/blogger_v0.py
The log outputs are:
https://github.com/tholonia/crewai-blogger/blob/main/output_ddgs.log
https://github.com/tholonia/crewai-blogger/blob/main/output_exa.log
https://github.com/tholonia/crewai-blogger/blob/main/output_openai.md
https://github.com/tholonia/crewai-blogger/blob/main/output_serper.log