
v1.4.0

@mitya52 released this 09 Feb 10:21
· 259 commits to main since this release

What's New

  • WebGUI Chat: We now ship a chat UI with our Docker image!
  • Embeddings: Our Docker image now starts the embeddings model by default, which is required for VecDB support.
  • Shared Memory Issue Resolved: A critical performance issue related to shared memory has been fixed. For details, see the GitHub issue.
  • Anthropic Integration: You can now add API keys to use third-party models!
  • stable-code-3b: The list of available models is growing! This time, we added stabilityai/stable-code-3b!
  • Optional API Key for OSS: The Refact.ai Self-hosted version can now require an optional API key for added security when deployed in the cloud.
  • Build Information: The settings now include an About page listing the packages in use, their versions, and commit hashes.
  • LoRA Switch Fix: An issue where switching between LoRAs showed no information in the logs is now fixed!
  • vLLM Out-of-Memory (OOM) Fix: We've fixed an out-of-memory issue with vLLM in Refact.ai Enterprise!