
add option to preprocess the prompt payload #74

Closed
wants to merge 2 commits

Conversation

albohlabs

Hi, I could not get my Neovim to talk to my codellama model because the format the plugin produces is not the one the model expects:

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "codellama",
  "prompt": "Write me a function that outputs the fibonacci sequence"
}'

For this reason I added a preprocess_body option, so that users of the plugin can easily adjust the format of the curl POST body.
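As a rough sketch of how the option could be used (assuming the plugin's `setup` function accepts it, which is not shown explicitly in this PR), a user could map the payload to the flat shape from the curl example above:

```lua
-- Hypothetical usage sketch: override preprocess_body so the serialized
-- POST body matches the /api/generate format shown above, i.e.
-- { "model": ..., "prompt": "..." } instead of a chat-style "messages" list.
require("gen").setup({
    preprocess_body = function(prompt)
        -- prompt: user input, optionally including the highlighted code
        return { prompt = prompt }
    end,
})
```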

Aaaaaand there is a second commit with some code style changes produced by the lua lsp. Sorry about that. 👼

@Mte90
Contributor

Mte90 commented Mar 7, 2024

For codellama, is something else required in this plugin in the end (apart from this PR)?

@albohlabs
Author

Sorry, I don't think I understand what you mean.

But I saw in your dotfiles that you're also using codellama. What curl POST body does gen.nvim produce for you? I'm confused. 🤔

@Mte90
Contributor

Mte90 commented Mar 8, 2024

I was just trying to understand whether, once this PR is approved, anything else needs to be added to the plugin settings for codellama.

@albohlabs
Author

albohlabs commented Mar 11, 2024

The preprocessor function has a default implementation that contains the current prompt-creation logic. You can find it at https://github.com/albohlabs/gen.nvim/blob/feat/preprocess-body/lua/gen/init.lua#L35

local default_options = {
    ...
    --- Create the curl prompt that will be substituted for $body
    ---
    ---@param prompt string user input and optionally the highlighted code
    ---@return table
    preprocess_body = function(prompt)
        local messages = {}
        table.insert(messages, { role = "user", content = prompt })
        return { messages = messages }
    end,
    ...
}

This means that when this PR is merged, there is nothing for you to do. Everything should work out of the box.

@David-Kunz
Owner

Hi @albohlabs ,

Sorry for the late reply, is this still needed? I'm a bit confused why codellama has a different shape, I thought Ollama had an OpenAI-compatible API?

@David-Kunz David-Kunz closed this Oct 3, 2024