
Removed hyphen in co-workers #712

Closed
wants to merge 1 commit

Conversation

madmag77
Contributor

This PR is intended to fix a problem with open-source LLMs that try to use the name co-worker instead of coworker in Actions, because throughout the prompts the co-workers spelling contains a hyphen. This prevents delegation from working with open-source models - I tried Mistral 0.3 and Phi Medium, and they both try to use coworker with a hyphen. The same problem is mentioned here.

There was an attempt to fix it using kwargs parsing here; however, Langchain's StructuredTool, which is used under the hood, uses this function, where only fields from the schema are passed further, and since co-worker is not part of the schema it gets ignored. And we can't include it in the schema, because field names can't contain hyphens.
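To illustrate the point (with a hypothetical, simplified schema - not CrewAI's or Langchain's actual code): field names must be valid Python identifiers, and keys that aren't in the schema get dropped, so the required coworker value ends up missing:

```python
# Hypothetical, simplified schema for the delegation tool - for illustration
# only, not CrewAI's or Langchain's actual code.
from pydantic import BaseModel, ValidationError


class DelegateWorkInput(BaseModel):
    coworker: str  # a field literally named "co-worker" would be a SyntaxError
    task: str
    context: str


# Roughly what Mistral 0.3 / Phi Medium emit as the Action Input:
llm_action_input = {
    "co-worker": "Senior Research Analyst",
    "task": "Research the most interesting AI applications in Interior Design.",
    "context": "Detailed information for a tech-savvy audience.",
}

try:
    DelegateWorkInput(**llm_action_input)
except ValidationError as err:
    # The unknown "co-worker" key is ignored, so the required `coworker`
    # field is reported as missing and the delegation tool never gets a name.
    print(err)
```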

Attaching the error log from the default example about writing an article with two agents (the error is a bit misleading, as it complains about the agent name, while the actual problem is with the key):

 Entering new CrewAgentExecutor chain...
 Action: Delegate work to co-worker
Action Input: {
    "co-worker": "Senior Research Analyst",
    "task": "Research the most interesting AI applications in Interior Design.",
    "context": "I need detailed information about the AI applications in Interior Design, their functions, benefits, and how they are changing the industry. The information should be presented in a way that is easy to understand for a tech-savvy audience."
}


Error executing tool. Co-worker mentioned not found, it must to be one of the following options:
- senior research analyst

After removing the hyphen, open-source LLMs started using the proper input key name coworker and everything works fine.

@pranitl

pranitl commented May 31, 2024

Thank you for fixing it @madmag77! I ran your fork and it worked. Hope this gets merged, because the entire crewai collaboration framework doesn't work at all for Ollama local models.

@madmag77
Contributor Author

madmag77 commented Jun 1, 2024

Thank you for fixing it @madmag77! I ran your fork and it worked. Hope this gets merged, because the entire crewai collaboration framework doesn't work at all for Ollama local models.

I made another PR in my fork that makes working with LM Studio and Ollama with Llama models as stable as with Mistral 0.3, at least on my examples. The PR is super simple - I just noticed that the Observation stop word didn't work, so I removed the \n and that resolved the problem.
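To illustrate the stop-word point with a rough sketch (hypothetical helper, not CrewAI's actual code): if a local model emits "Observation:" without a preceding newline, a stop word of "\nObservation" never matches, while plain "Observation" does:

```python
def truncate_at_stop(generation: str, stop_words: list[str]) -> str:
    """Cut the generation at the earliest occurrence of any stop word."""
    cut = len(generation)
    for stop in stop_words:
        idx = generation.find(stop)
        if idx != -1:
            cut = min(cut, idx)
    return generation[:cut]


# Some local models put "Observation:" on the same line as the Action Input,
# so a stop word that starts with "\n" never matches.
generation = 'Action: Delegate work to coworker\nAction Input: {...} Observation: (hallucinated)'

print(truncate_at_stop(generation, ["\nObservation"]))  # nothing trimmed
print(truncate_at_stop(generation, ["Observation"]))    # trimmed before the fake Observation
```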

@joaomdmoura
Collaborator

Nice! Great PR, looking into it!

@greg80303

greg80303 commented Jun 4, 2024

Love this PR -- I think it will fix #668

I wonder, @madmag77, do we still need the co_worker form as well? If we consistently use the form coworker throughout all prompting, can we simplify the code even more in agent_tools.py?

@madmag77
Contributor Author

madmag77 commented Jun 4, 2024

Nice! Great PR, looking into it!

Thanks @joaomdmoura! Looking forward to having it merged. I have another PR ready for afterwards.

Really like the framework! Thanks for working on it...

greg80303 added a commit to greg80303/crewAI that referenced this pull request Jun 7, 2024
@JavierCCC

I'm working with llama3 running locally and I'm really waiting for this...

@pranitl

pranitl commented Jun 14, 2024

Agreed, crewai is non-functional without this fix.

@incidentallyalthoughcoolly

@madmag77, how do I test and use your fork? I am having these issues too and need crews to work via Ollama, and until this gets merged to main, I need to have it work.

@greg80303

It has been merged

@pranitl

pranitl commented Jun 15, 2024

It has, @greg80303? I see this PR is still open, and the last release was 0.30.11 with nothing else since. Have you got it working without an official release?

@greg80303

@rasterize-art Yes, the PR was merged to main, but there hasn't been a release yet.

I have a fork of the CrewAI repo that I just update to the latest main branch commits. I use pip install -e to install CrewAI in my project's venv from my forked repo. So, I'm always running with the latest commits.

@madmag77
Contributor Author

madmag77 commented Jun 16, 2024

@madmag77, how do I test and use your fork? I am having these issues too and need crews to work via Ollama, and until this gets merged to main, I need to have it work.

@incidentallyalthoughcoolly:
Here is an example I've made that uses my fork: https://github.com/madmag77/crewai-article-example - you can check the pyproject.toml file to see how to add CrewAI from my fork as a dependency in your project.

@incidentallyalthoughcoolly

Sorry, I am just learning to code now. So I have a venv set up; do I git clone yours, or how do I do what @greg80303 did? I'm really confused: do I do pip install -r requirements, or do I use the poetry thing in your repository, @madmag77? I'm really overwhelmed and would really appreciate some instructions, because it's all so new to me.

@greg80303

The fix by @madmag77 has been merged to main in the CrewAI repo. You can install a Python library from a local clone of its repo using pip install -e. These are the steps:

  1. Clone the CrewAI repo
  2. CD to the repo root directory
  3. Activate the venv for your project
  4. Run pip install -e .

This will uninstall any released version of CrewAI from your virtual environment and "link" your local CrewAI repository code to it. You can make changes to the CrewAI source files if you want, and the changes will be immediately reflected in the execution of your application. But for your purposes, a fresh checkout of the CrewAI repo will already contain the fix you're looking for.

@incidentallyalthoughcoolly

incidentallyalthoughcoolly commented Jun 17, 2024

Thank you very much @greg80303!!

So in the second step above, do I cd into the cloned repo or into my project's repo?

@greg80303

You cd into the crewAI repo.

joaomdmoura pushed a commit that referenced this pull request Jun 18, 2024
* removed hyphen in co-workers

* Fix issue with AgentTool agent selection. The LLM included double quotes in the agent name which messed up the string comparison. Added additional types. Cleaned up error messaging.

* Remove duplicate import

* Improve explanation

* Revert poetry.lock changes

* Fix missing line in poetry.lock

---------

Co-authored-by: madmag77 <goncharov.artemv@gmail.com>
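
For context on the second commit above, here is a minimal sketch (hypothetical names, not the actual agent_tools.py code) of the kind of normalization that fixes the quoted agent name comparison:

```python
def normalize_agent_name(name: str) -> str:
    """Strip surrounding quotes/whitespace and lowercase, so that
    '"Senior Research Analyst"' matches 'senior research analyst'."""
    return name.strip().strip('"').strip("'").strip().casefold()


available_agents = ["Senior Research Analyst", "Tech Content Strategist"]
requested = '"Senior Research Analyst"'  # the LLM wrapped the name in double quotes

match = next(
    (agent for agent in available_agents
     if normalize_agent_name(agent) == normalize_agent_name(requested)),
    None,
)
print(match)  # Senior Research Analyst
```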
@joaomdmoura
Collaborator

Yay! Thanks folks, @bhancockio fixed the conflicts on #786 and I merged it, so this is fixed in master and will be in the new version we cut :D
🔥 You rock @madmag77 and @bhancockio
