Error "Co-worker mentioned not found..." when using with local llama3 #620
Comments
There are multiple issues with your code.
If I haven't missed anything, the co-worker not being found is due either to points 1-3 or to llama3 being "too dumb".
Actually, this might be related to #602.
I've updated the code in the description with what you indicated. For point 3, I used … Either way, the error still occurs.
This might be a problem in how Ollama or LangChain outputs the steps for the agents. But I did a bisect and found out crewAI was able to cope with that before 0b78106.
Hey @italovieira, have you been able to fix it? I'm getting the same issue :(
I've opened an MR to fix this issue, but it's not merged yet.
Without logs it's hard to figure out the reason. I've had the same problem when the LLM (Mistral 0.3 in my case) returned the action input key as …
Sorry to ask, maybe I missed something. Did you solve this issue with a workaround? If yes, can you please let me know how?
@psyq0 Hope it helps. |
For me, the fix from @madmag77 didn't work. The comparison of available_agents to agent in agent_tools.py was still failing:
So as a dirty fix I replaced this line with the following one:
And now it works with llama3:8b without issue. A bit dirty, but I hope it helps while waiting for a proper fix. For reference, this is how I initialize the agent:
With the example below I get the following error: