This repository has been archived by the owner on Apr 3, 2024. It is now read-only.

version3 bug #31

Closed
iplayfast opened this issue Jan 31, 2024 · 0 comments

Comments

@iplayfast

It looks like (at least with my Ollama setup) the response coming back from Ollama contains \n\n between each line, which breaks the CSV parsing later on.
A simple fix in core.py, line 179:

        try:
            if self.llm_endpoint == 'ollama' and self.ollama:
                response = self.ollama.invoke(instruction)
+               response = response.replace("\n\n", "\n")
                # Log the raw LLM output
                logging.debug(f"Raw LLM output (Ollama):\n{response}")
                logging.debug(f"Number of tokens in the response: {count_tokens(response)}")

Also, I noticed that your editor saves lines terminated with 0x0D 0x0A (CR+LF).
I've found things go easier if you save with just the newline (LF).
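
If it helps, line endings can also be normalized in code rather than in the editor. A minimal sketch, assuming plain UTF-8 text files (the filename is only a placeholder):

    # Reading in text mode uses universal newlines: \r\n (and lone \r) become \n.
    with open("somefile.py", encoding="utf-8") as f:
        text = f.read()

    # newline="\n" writes bare LF, so Python won't translate \n back to \r\n on Windows.
    with open("somefile.py", "w", encoding="utf-8", newline="\n") as f:
        f.write(text)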
