Step one is #16. Then we add support for a local Postgres/Supabase. Ideally we can have a Docker Compose setup that runs in a completely isolated manner, with OpenAI calls being the only managed dependency. Then we can introduce a local model (even if only for testing purposes).
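To make the idea concrete, here is a minimal sketch of what that Compose setup could look like. All service names and images are hypothetical choices for illustration (e.g. `ankane/pgvector` as a Postgres image with the pgvector extension); the only external dependency is the `OPENAI_API_KEY` passed through from the host:

```yaml
# Hypothetical sketch of an isolated local stack.
# Only OPENAI_API_KEY reaches out to a managed service.
version: "3.8"
services:
  db:
    image: ankane/pgvector   # Postgres + pgvector; supabase/postgres would also work
    environment:
      POSTGRES_PASSWORD: postgres
    ports:
      - "5432:5432"
  server:
    build: .                 # the server image, built locally
    environment:
      DATABASE_URL: postgres://postgres:postgres@db:5432/postgres
      OPENAI_API_KEY: ${OPENAI_API_KEY}
    depends_on:
      - db
```

Swapping the OpenAI dependency for a local model would then just mean pointing the server at another container instead of setting the API key.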
We should consider LangChain as an abstraction over vector databases, because it already supports multiple stores.
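For anyone unfamiliar with what that abstraction buys us: LangChain exposes vector stores behind a common interface (add documents, run a similarity search), so the backing store can be swapped without touching calling code. A toy in-memory sketch of that interface shape (all names here are hypothetical, not LangChain's actual API):

```python
import math

class InMemoryVectorStore:
    """Toy stand-in for the vector-store interface a library like
    LangChain abstracts over: add embeddings, query by similarity."""

    def __init__(self):
        self._items = []  # list of (vector, text) pairs

    def add(self, vector, text):
        self._items.append((vector, text))

    @staticmethod
    def _cosine(a, b):
        # Cosine similarity between two equal-length vectors.
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb) if na and nb else 0.0

    def similarity_search(self, query_vector, k=1):
        # Return the k texts whose vectors are closest to the query.
        ranked = sorted(self._items,
                        key=lambda it: self._cosine(it[0], query_vector),
                        reverse=True)
        return [text for _, text in ranked[:k]]

store = InMemoryVectorStore()
store.add([1.0, 0.0], "about postgres")
store.add([0.0, 1.0], "about chroma")
print(store.similarity_search([0.9, 0.1], k=1))  # → ['about postgres']
```

A Postgres/pgvector or Chroma backend would implement the same two operations, which is exactly why an abstraction layer keeps the store pluggable.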
Commenting to say that the high-level goal here, enabling local development, is a high-priority task. We're going to move the server itself out so the Python SDK and TypeScript SDK can be used locally with a vector DB (probably Chroma) and OpenAI calls or a local LLM.
That would be a great capability. We are working in an isolated environment with only Azure OpenAI ports open, and having the server run on-premises would help us a lot.