
[Question] How do I use Devika? #532

Open
KorDum opened this issue May 4, 2024 · 11 comments

Comments

@KorDum

KorDum commented May 4, 2024

I don't understand how to use it, so I'll try to describe exactly what doesn't work.

I downloaded an Ollama model (I tried several different ones, including llama3), installed everything, and it all runs. Then I tried to create a project from scratch by asking for a docker-compose.yml with one container. Devika did fine: I got the docker-compose.yml file and a Dockerfile as expected.

Then I noticed that the generated docker-compose.yml specifies the top-level version key, which is no longer needed. I understand where it came from, so I asked Devika to remove the version from the file. Devika said it understood me and removed the version, but the file did not change. I tried to reword it and asked Devika to delete the first line of the file. Devika destroyed the file and hung in an endless loop, re-checking its actions until it ran out of attempts. That was the first situation I ran into.
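
For reference, the end state I was aiming for is tiny: something like the sketch below, with the obsolete top-level version key simply omitted (the service name and volume mount are just my illustration, not Devika's actual output):

```yaml
# docker-compose.yml without the deprecated top-level "version:" key
# (service name and volume mount are illustrative, not Devika's exact output)
services:
  php-cli:
    image: php:8.3.6-cli-alpine
    working_dir: /app
    volumes:
      - ./:/app
```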

The second situation is even more interesting. I asked Devika to move the Dockerfile to a subdirectory. Devika ended up moving the file to the correct directory, but it created "mkdir", "mv", and "&&" folders next to it and hung in an endless loop until it ran out of attempts.
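
For context, what I was asking for amounts to a single shell command (the target path is the one from my later comment below); it looks as if the model emitted something like this and Devika treated each token as a directory name:

```sh
# The intended operation, as a plain shell one-liner
mkdir -p docker/dev/php-cli && mv Dockerfile docker/dev/php-cli/
```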

How is this supposed to work, and does it? What tasks is Devika meant to cover in general? I assume I'm supposed to act as the AI operator and guide Devika's actions the way a mentor guides a junior. But every attempt to use it in a real work scenario ends with me realizing that it's easier to do everything by hand.

@hqnicolas

hqnicolas commented May 11, 2024

@KorDum did you try Devika for Ollama?

@KorDum
Author

KorDum commented May 12, 2024

@hqnicolas Thanks, I'll give it a try and let you know my experience using it!

@hqnicolas

Test llama3:instruct:
https://ollama.com/library/llama3:instruct
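
If you haven't pulled it yet:

```sh
ollama pull llama3:instruct
```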

@KorDum
Author

KorDum commented May 12, 2024

Exactly the same behavior. Which model should I use?
I tried these models:

  • openchat:7b-v3.5-1210-q5_K_M
  • mistral-openorca:7b-q5_K_M
  • qwen:14b-chat-v1.5-q4_K_M
  • codellama
  • deepseek-coder
  • llama3

In all cases it either gets stuck in an infinite loop because the response can't be parsed as JSON, or it says "I understood you and did everything" but does nothing.

@KorDum
Author

KorDum commented May 12, 2024

Maybe I'm giving the instructions wrong somehow?

  • "Create docker-compose file with one container with php-8.3.6-cli-alpine image"
  • devica creates the file successfully
  • "Move Dockerfile into docker/dev/php-cli directory"
  • devica says "I got it and did it", but doesn't move anything
  • "Remove first line from docker-compose.yml file with version"
  • devica says "I've understood and done" but nothing is removed

Yeah, this behavior with llama3:instruct, too.

@hqnicolas

hqnicolas commented May 12, 2024

> Maybe I'm giving the instructions wrong somehow?
>
> • Devika says "I've understood and done it", but nothing is removed
>
> Yes, I see the same behavior with llama3:instruct, too.

@KorDum I believe you need to create a custom Ollama model to use with Devika.
This video shows the answer.
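
Something along these lines, as a minimal Modelfile sketch (not the exact one from the video; the base model, parameter, and system prompt are placeholders to tune for Devika's prompt format):

```
# Minimal Ollama Modelfile sketch (placeholders, not the exact setup from the video)
FROM llama3

# Lower temperature so the model sticks to the structured output Devika tries to parse
PARAMETER temperature 0.2

# Placeholder system prompt; adjust to the response format Devika expects
SYSTEM """You are a coding agent. Respond only in the exact format requested, with nothing else."""
```

Then build it with `ollama create devika-llama3 -f Modelfile` (the model name here is just an example) and select it as the model in Devika.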

@dagelf

dagelf commented May 26, 2024

That would be a good solution, but it absolutely shouldn't be necessary if all interactions are visible and tweakable from the UI.

@SyedMuqtasidAli

@dagelf @KorDum @hqnicolas @jonatas @skaramicke Which one is better, Devika or ChatDev: performance-wise, in compatibility with Hugging Face models, or UI-wise?

@jonatas
Contributor

jonatas commented Jun 6, 2024

I don't have a strong opinion, as I gave up after a few hours of trying. I saw it doing some work, but it seemed a bit lost.

@KorDum
Author

KorDum commented Jun 7, 2024

@SyedMuqtasidAli Does ChatDev support local Ollama models?

@SyedMuqtasidAli
