Closed as not planned
Description
First thanks for the excellent framework! It is very easy to use and promising. I have built several workflows with it using ChatGPT and am currently switching to LocalAI and open source models.
Describe the bug
With a LocalAI backend, Flowise currently supports single-shot completions but not chat-related workflows. I have observed the following:
What works:
- Using a normal Prompt Template node (with a given template) connected to an LLM Chain.
What does not work:
- Using a ChatPromptTemplate node connected to an LLM Chain: on chatting, the AI responds with "}".
- The same settings, but with the OpenAI node pointed at the LocalAI endpoint: on chatting, the AI reports undefined property "text".
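A possible explanation (my assumption, not confirmed against the Flowise source): the chain may parse the completion-style response shape (`choices[0].text`), while an OpenAI-compatible chat endpoint such as LocalAI's `/v1/chat/completions` returns `choices[0].message.content` instead, which would produce exactly an undefined "text" property. A minimal sketch of the two response shapes:

```python
import json

# Example completion-style response (shape: choices[0].text).
completion_resp = json.loads("""
{"choices": [{"text": "Hello!", "index": 0, "finish_reason": "stop"}]}
""")

# Example chat-style response, as returned by OpenAI-compatible
# chat endpoints (shape: choices[0].message.content).
chat_resp = json.loads("""
{"choices": [{"message": {"role": "assistant", "content": "Hello!"},
              "index": 0, "finish_reason": "stop"}]}
""")

def extract_text(resp):
    """Read the answer from either response shape."""
    choice = resp["choices"][0]
    if "text" in choice:                 # completion endpoint
        return choice["text"]
    return choice["message"]["content"]  # chat endpoint

print(extract_text(completion_resp))  # Hello!
print(extract_text(chat_resp))        # Hello!
```

Reading `choice["text"]` directly on the chat-style response would raise a `KeyError` (the Python analogue of the undefined-property error seen in the chat flow).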
To Reproduce
Steps to reproduce the behavior:
See the picture above.
Expected behavior
The AI responds with actual answers.
Screenshots
See the above.
Flow
See the above.
Setup
- Installation: docker compose
- Flowise Version: latest
- OS: Linux
- Browser: Firefox
Additional context