# Changelog

## 0.3.0 (2024-02-16)

### Features

- add mistral + chatml prompts (#1426) (e326126)
- Add stream information to generate SDKs (#1569) (24fae66)
- **API:** Ingest plain text (#1417) (6eeb95e)
- **bulk-ingest:** Add --ignored Flag to Exclude Specific Files and Directories During Ingestion (#1432) (b178b51)
- **llm:** Add openailike llm mode (#1447) (2d27a9f), closes #1424
- **llm:** Add support for Ollama LLM (#1526) (6bbec79)
- **settings:** Configurable context_window and tokenizer (#1437) (4780540)
- **settings:** Update default model to TheBloke/Mistral-7B-Instruct-v0.2-GGUF (#1415) (8ec7cf4)
- **ui:** make chat area stretch to fill the screen (#1397) (c71ae7c)
- **UI:** Select file to Query or Delete + Delete ALL (#1612) (aa13afd)

### Bug Fixes

- Adding an LLM param to fix broken generator from llamacpp (#1519) (869233f)
- **deploy:** fix local and external dockerfiles (fde2b94)
- **docker:** docker broken copy (#1419) (059f358)
- **docs:** Update quickstart doc and set version in pyproject.toml to 0.2.0 (0a89d76)
- minor bug in chat stream output - python error being serialized (#1449) (6191bcd)
- **settings:** correct yaml multiline string (#1403) (2564f8d)
- **tests:** load the test settings only when running tests (d3acd85)
- **UI:** Updated ui.py. Frees up the CPU to not be bottlenecked. (24fb80c)

## 0.2.0 (2023-12-10)

### Features

- **llm:** drop default_system_prompt (#1385) (a3ed14c)
- **ui:** Allows User to Set System Prompt via "Additional Options" in Chat Interface (#1353) (145f3ec)

## 0.1.0 (2023-11-30)

### Features

### Bug Fixes

## 0.0.2 (2023-10-20)

### Bug Fixes

## 0.0.1 (2023-10-20)

### Miscellaneous Chores