
v0.3.0

Released by @github-actions on 16 Feb 2024, 16:42 · commit 066ea5b

0.3.0 (2024-02-16)

Features

  • add mistral + chatml prompts (#1426) (e326126)
  • Add stream information to generate SDKs (#1569) (24fae66)
  • API: Ingest plain text (#1417) (6eeb95e)
  • bulk-ingest: Add --ignored Flag to Exclude Specific Files and Directories During Ingestion (#1432) (b178b51)
  • llm: Add openailike llm mode (#1447) (2d27a9f), closes #1424
  • llm: Add support for Ollama LLM (#1526) (6bbec79)
  • settings: Configurable context_window and tokenizer (#1437) (4780540)
  • settings: Update default model to TheBloke/Mistral-7B-Instruct-v0.2-GGUF (#1415) (8ec7cf4)
  • ui: make chat area stretch to fill the screen (#1397) (c71ae7c)
  • UI: Select file to Query or Delete + Delete ALL (#1612) (aa13afd)
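Several of the features above (the configurable context_window and tokenizer, the openailike mode, and Ollama support) are driven from the project's YAML settings. A minimal sketch of what such a configuration might look like follows; the key names and file layout here are assumptions for illustration, not taken from this release, so check the project's settings reference for the real schema.

```yaml
# Hypothetical settings.yaml sketch; key names are assumptions,
# not verified against the actual project schema.
llm:
  # LLM backend mode; openailike (#1447) and ollama (#1526)
  # were added in this release
  mode: ollama
  # context_window and tokenizer became configurable in 0.3.0 (#1437)
  context_window: 3900
  # default model updated to Mistral-7B-Instruct-v0.2 (#1415)
  tokenizer: mistralai/Mistral-7B-Instruct-v0.2
```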

Bug Fixes

  • Add an LLM param to fix the broken generator from llamacpp (#1519) (869233f)
  • deploy: fix local and external dockerfiles (fde2b94)
  • docker: fix broken copy in the Docker build (#1419) (059f358)
  • docs: Update quickstart doc and set version in pyproject.toml to 0.2.0 (0a89d76)
  • Fix minor bug in chat stream output where a Python error was being serialized (#1449) (6191bcd)
  • settings: correct yaml multiline string (#1403) (2564f8d)
  • tests: load the test settings only when running tests (d3acd85)
  • UI: Update ui.py so the CPU is no longer a bottleneck (24fb80c)