Releases: LlamaEdge/rag-api-server
LlamaEdge-RAG 0.9.2
Major change:
- Update to llama-core v0.16.1, chat-prompts v0.13.0, and endpoints v0.13.1
LlamaEdge-RAG 0.9.1
Major change:
- Update to llama-core v0.16.0, chat-prompts v0.12.0, and endpoints v0.13.1
LlamaEdge-RAG 0.9.0
Major changes:
- (BREAKING) Migrate to WasmEdge v0.14
- New CLI options: --threads, --grammar, and --json-schema (see the launch sketch below)
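As a rough illustration of the new options, a server launch might look like the sketch below. Only the --threads, --grammar, and --json-schema flags come from this release; the wasmedge preload aliases, model file names, and values are assumptions based on typical LlamaEdge setups, not something stated in these notes.

```bash
# Hedged sketch of launching rag-api-server with the new --threads option.
# Preload aliases and model files are illustrative assumptions only.
wasmedge --dir .:. \
  --nn-preload default:GGML:AUTO:chat-model.gguf \
  --nn-preload embedding:GGML:AUTO:embedding-model.gguf \
  rag-api-server.wasm \
  --threads 8
```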
NOTICE
For developers on macOS, it is strongly recommended to read TLS on MacOS before building llama-api-server.wasm and llama-chat.wasm from source; in addition, prefix RUSTFLAGS="--cfg wasmedge --cfg tokio_unstable" to the cargo build --target wasm32-wasip1 --release command (see the example below).
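Concretely, the build step described in the notice would be invoked as follows; only the RUSTFLAGS prefix and the cargo command come from the notice, and the working directory is assumed to be the crate root.

```bash
# Build the wasm binary with the flags required on macOS.
RUSTFLAGS="--cfg wasmedge --cfg tokio_unstable" cargo build --target wasm32-wasip1 --release
```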
LlamaEdge-RAG 0.8.2
Major changes:
- Add code for handling the HTTP OPTIONS method in handlers
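For example, a CORS-style preflight request can now be answered by the handlers; the port and route below are assumptions used only for illustration.

```bash
# Send an HTTP OPTIONS (preflight) request to a handler route.
curl -i -X OPTIONS http://localhost:8080/v1/chat/completions
```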
LlamaEdge-RAG 0.8.1
Major changes:
- Improve RagPromptBuilder
- Remove the binding between the --main-gpu and --tensor-split CLI options
- Update to llama-core v0.14.1, chat-prompts v0.11.1, and endpoints v0.12.0
LlamaEdge-RAG 0.8.0
Major changes:
- Support tool use with Meta-Llama-3.1-Instruct and internlm2.5-7b-chat (see the request example after this list)
- New prompt templates: llama-3-tool and internlm-2-tool
- Add --main-gpu and --tensor-split CLI options
- Update to llama-core v0.14.0, chat-prompts v0.11.0, and endpoints v0.11.1
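A tool-use request against the server might look like the sketch below, assuming the OpenAI-compatible chat completions schema; the port, model name, and tool definition are illustrative assumptions rather than values taken from these notes.

```bash
# Hedged example of a chat completion request that offers a tool to the model.
curl http://localhost:8080/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{
    "model": "Meta-Llama-3.1-8B-Instruct",
    "messages": [{"role": "user", "content": "What is the weather in Singapore?"}],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
          "type": "object",
          "properties": {"city": {"type": "string"}},
          "required": ["city"]
        }
      }
    }]
  }'
```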
LlamaEdge-RAG 0.7.5
Major changes:
- Update to llama-core v0.13.1, chat-prompts v0.10.2, and endpoints v0.11.0
LlamaEdge-RAG 0.7.4
Major changes:
- Update to llama-core v0.13.0, chat-prompts v0.10.1, and endpoints v0.10.2
LlamaEdge-RAG 0.7.3
Major changes:
- Extend the /v1/files endpoint (see the examples after this list):
  - List files on the server
  - Retrieve a specific file
  - Delete a specific file
- Update to llama-core v0.12.1, chat-prompts v0.10.0, and endpoints v0.10.1
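The three file operations could be exercised as below; the OpenAI-style sub-routes, port, and file id are assumptions, while the /v1/files path and the list/retrieve/delete capabilities come from the item above.

```bash
# List files stored on the server.
curl http://localhost:8080/v1/files
# Retrieve a specific file (the file id is hypothetical).
curl http://localhost:8080/v1/files/file-abc123
# Delete a specific file.
curl -X DELETE http://localhost:8080/v1/files/file-abc123
```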
LlamaEdge-RAG 0.7.2
Major changes:
- Improve rag_query_handler
- Update dependencies: llama-core v0.12.0, chat-prompts v0.9.0, and endpoints v0.10.0