It looks like (at least on my Ollama instance) the response coming back from Ollama contains \n\n between each line, which messes up the CSV parsing later on.
A simple fix in core.py at line 179:
try:
    if self.llm_endpoint == 'ollama' and self.ollama:
        response = self.ollama.invoke(instruction)
+       response = response.replace("\n\n", "\n")
        # Log the raw LLM output
        logging.debug(f"Raw LLM output (Ollama):\n{response}")
        logging.debug(f"Number of tokens in the response: {count_tokens(response)}")
Also, I noticed that your editor saves lines with 0x0d,0x0a (CR and LF).
I've found things go easier if files are saved with only the newline (LF).
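If you want to handle both issues defensively in one place, a hypothetical helper along these lines could work (normalize_llm_output is my own name, not something in core.py):

    def normalize_llm_output(text: str) -> str:
        """Normalize line endings in the LLM response before CSV parsing."""
        text = text.replace("\r\n", "\n")   # CR+LF -> LF
        text = text.replace("\r", "\n")     # stray CR -> LF
        return text.replace("\n\n", "\n")   # drop the doubled newlines Ollama adds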