Merge dev branch (#129)
* [Example] ggml: add grammar example (#126)

* [Example] ggml: add grammar example

Signed-off-by: dm4 <[email protected]>

* [CI] add grammar test

Signed-off-by: dm4 <[email protected]>

---------

Signed-off-by: dm4 <[email protected]>

* [Example] ggml: update command-r example to use command-r-plus with tool use prompt

Signed-off-by: dm4 <[email protected]>

* [CI] llama: add llama2 embedding test

Signed-off-by: dm4 <[email protected]>

* [CI] llama: add llama-stream test

Signed-off-by: dm4 <[email protected]>

---------

Signed-off-by: dm4 <[email protected]>
dm4 authored Apr 16, 2024
1 parent 653daf0 commit 6e81ae3
Showing 10 changed files with 430 additions and 19 deletions.
40 changes: 39 additions & 1 deletion .github/workflows/llama.yml
@@ -101,6 +101,19 @@ jobs:
default \
$'[INST] <<SYS>>\nYou are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature. If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you do not know the answer to a question, please do not share false information.\n<</SYS>>\nWhat is the capital of Japan?[/INST]'
- name: Llama2 7B (Streaming)
run: |
test -f ~/.wasmedge/env && source ~/.wasmedge/env
cd wasmedge-ggml/llama-stream
curl -LO https://huggingface.co/TheBloke/Llama-2-7b-Chat-GGUF/resolve/main/llama-2-7b-chat.Q5_K_M.gguf
cargo build --target wasm32-wasi --release
time wasmedge --dir .:. \
--env n_gpu_layers="$NGL" \
--nn-preload default:GGML:AUTO:llama-2-7b-chat.Q5_K_M.gguf \
target/wasm32-wasi/release/wasmedge-ggml-llama-stream.wasm \
default \
$'[INST] <<SYS>>\nYou are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature. If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you do not know the answer to a question, please do not share false information.\n<</SYS>>\nWhat is the capital of Japan?[/INST]'
- name: StarCoder 2 7B
run: |
test -f ~/.wasmedge/env && source ~/.wasmedge/env
@@ -133,7 +146,7 @@ jobs:
target/wasm32-wasi/release/wasmedge-ggml-multimodel.wasm \
'describe this picture please'
- name: Embedding Example
- name: Embedding Example (All-MiniLM)
run: |
test -f ~/.wasmedge/env && source ~/.wasmedge/env
cd wasmedge-ggml/embedding
@@ -145,6 +158,18 @@ jobs:
default \
'hello world'
- name: Embedding Example (Llama-2)
run: |
test -f ~/.wasmedge/env && source ~/.wasmedge/env
cd wasmedge-ggml/embedding
curl -LO https://huggingface.co/TheBloke/Llama-2-7b-Chat-GGUF/resolve/main/llama-2-7b-chat.Q5_K_M.gguf
cargo build --target wasm32-wasi --release
time wasmedge --dir .:. \
--nn-preload default:GGML:AUTO:llama-2-7b-chat.Q5_K_M.gguf \
target/wasm32-wasi/release/wasmedge-ggml-llama-embedding.wasm \
default \
'hello world'
- name: RPC Example
run: |
test -f ~/.wasmedge/env && source ~/.wasmedge/env
@@ -171,6 +196,19 @@ jobs:
default \
'<start_of_turn>user Where is the capital of Japan? <end_of_turn><start_of_turn>model'
- name: Grammar Example
run: |
test -f ~/.wasmedge/env && source ~/.wasmedge/env
cd wasmedge-ggml/grammar
curl -LO https://huggingface.co/TheBloke/Llama-2-7b-GGUF/resolve/main/llama-2-7b.Q5_K_M.gguf
cargo build --target wasm32-wasi --release
time wasmedge --dir .:. \
--env n_gpu_layers="$NGL" \
--nn-preload default:GGML:AUTO:llama-2-7b.Q5_K_M.gguf \
target/wasm32-wasi/release/wasmedge-ggml-grammar.wasm \
default \
'JSON object with 5 country names as keys and their capitals as values: '
- name: Build llama-stream
run: |
cd wasmedge-ggml/llama-stream
30 changes: 20 additions & 10 deletions wasmedge-ggml/command-r/README.md
@@ -10,23 +10,33 @@
## Get Model

Here we use the `c4ai-command-r-plus-GGUF` model as an example. You can download the model from the Hugging Face model hub.

```bash
curl -LO https://huggingface.co/andrewcanis/c4ai-command-r-v01-GGUF/resolve/main/c4ai-command-r-v01-Q5_K_M.gguf
curl -LO https://huggingface.co/pmysl/c4ai-command-r-plus-GGUF/resolve/main/command-r-plus-Q5_K_M-00001-of-00002.gguf
curl -LO https://huggingface.co/pmysl/c4ai-command-r-plus-GGUF/resolve/main/command-r-plus-Q5_K_M-00002-of-00002.gguf
```

## Execute

```console
In this example, we use the system prompt with the definitions of the available tools from the [Example Rendered Tool Use Prompt](https://huggingface.co/CohereForAI/c4ai-command-r-plus).

````console
$ wasmedge --dir .:. \
--nn-preload default:GGML:AUTO:c4ai-command-r-v01-Q5_K_M.gguf \
--nn-preload default:GGML:AUTO:command-r-plus-Q5_K_M-00001-of-00002.gguf \
./wasmedge-ggml-command-r.wasm default

USER:
What's the capital of the United States?
Whats the biggest penguin in the world?
ASSISTANT:
The capital of the United States is Washington, D.C.
USER:
How about Japan?
ASSISTANT:
Tokyo is the capital of Japan.
```
Action: ```json
[
{
"tool_name": "internet_search",
"parameters": {
"query": "biggest penguin species"
}
}
]
```
````
53 changes: 50 additions & 3 deletions wasmedge-ggml/command-r/src/main.rs
@@ -141,16 +141,63 @@ fn main() {
}

let mut saved_prompt = String::new();
let system_prompt = String::from("You are a helpful, respectful and honest assistant. Always answer as short as possible, while being safe." );
let system_tool_prompt = r#"
# Safety Preamble
The instructions in this section override those in the task description and style guide sections. Don't answer questions that are harmful or immoral.
# System Preamble
## Basic Rules
You are a powerful conversational AI trained by Cohere to help people. You are augmented by a number of tools, and your job is to use and consume the output of these tools to best help the user. You will see a conversation history between yourself and a user, ending with an utterance from the user. You will then see a specific instruction instructing you what kind of response to generate. When you answer the user's requests, you cite your sources in your answers, according to those instructions.
# User Preamble
## Task and Context
You help people answer their questions and other requests interactively. You will be asked a very wide array of requests on all kinds of topics. You will be equipped with a wide range of search engines or similar tools to help you, which you use to research your answer. You should focus on serving the user's needs as best you can, which will be wide-ranging.
## Style Guide
Unless the user asks for a different style of answer, you should answer in full sentences, using proper grammar and spelling.
## Available Tools
Here is a list of tools that you have available to you:
```python
def internet_search(query: str) -> List[Dict]:
"""Returns a list of relevant document snippets for a textual query retrieved from the internet
Args:
query (str): Query to search the internet with
"""
pass
```
```python
def directly_answer() -> List[Dict]:
"""Calls a standard (un-augmented) AI chatbot to generate a response given the conversation history
"""
pass
```
"#;
let system_instruction_prompt = r#"
Write 'Action:' followed by a json-formatted list of actions that you want to perform in order to produce a good response to the user's last input. You can use any of the supplied tools any number of times, but you should aim to execute the minimum number of necessary actions for the input. You should use the `directly-answer` tool if calling the other tools is unnecessary. The list of actions you want to call should be formatted as a list of json objects, for example:
```json
[
{
"tool_name": title of the tool in the specification,
"parameters": a dict of parameters to input into the tool as they are defined in the specs, or {} if it takes no parameters
}
]```
"#;

loop {
println!("USER:");
let input = read_input();
//
if saved_prompt.is_empty() {
saved_prompt = format!(
"<|START_OF_TURN_TOKEN|><|USER_TOKEN|>{} {}<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>",
system_prompt, input
"<|START_OF_TURN_TOKEN|><|SYSTEM_TOKEN|>{}<|END_OF_TURN_TOKEN|>\
<|USER_TOKEN|>{}<|END_OF_TURN_TOKEN|>\
<|START_OF_TURN_TOKEN|><|SYSTEM_TOKEN|>{}<|END_OF_TURN_TOKEN|>\
<|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>",
system_tool_prompt, input, system_instruction_prompt
);
} else {
saved_prompt = format!("{} <|START_OF_TURN_TOKEN|><|USER_TOKEN|>{}<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>", saved_prompt, input);
Binary file modified wasmedge-ggml/command-r/wasmedge-ggml-command-r.wasm
Binary file not shown.
8 changes: 8 additions & 0 deletions wasmedge-ggml/grammar/Cargo.toml
@@ -0,0 +1,8 @@
[package]
name = "wasmedge-ggml-grammar"
version = "0.1.0"
edition = "2021"

[dependencies]
serde_json = "1.0"
wasmedge-wasi-nn = "0.7.0"
37 changes: 37 additions & 0 deletions wasmedge-ggml/grammar/README.md
@@ -0,0 +1,37 @@
# Grammar Example For WASI-NN with GGML Backend

> [!NOTE]
> Please refer to the [wasmedge-ggml/README.md](../README.md) for the general introduction and the setup of the WASI-NN plugin with GGML backend. This document will focus on the specific example of using grammar in ggml.
## Get the Model

In this example, we are going to use the [llama-2-7b](https://huggingface.co/TheBloke/Llama-2-7B-GGUF) model. Please note that we are not using a fine-tuned chat model.

```bash
curl -LO https://huggingface.co/TheBloke/Llama-2-7B-GGUF/resolve/main/llama-2-7b.Q5_K_M.gguf
```

## Parameters

> [!NOTE]
> Please check the parameters section of [wasmedge-ggml/README.md](https://github.com/second-state/WasmEdge-WASINN-examples/tree/master/wasmedge-ggml#parameters) first.
In this example, we are going to use the `grammar` option to constrain the model to generate JSON output in a specific format.

You can check [the documents at llama.cpp](https://github.com/ggerganov/llama.cpp/tree/master/grammars) for more details about grammars.
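
To make the constraint concrete, here is a minimal sketch of what the grammar plumbing could look like: a small GBNF-style grammar embedded as a Rust string and bundled into an options JSON next to `n_predict`. The option key names (`grammar`, `n_predict`) and the grammar rules below are illustrative assumptions — refer to this example's `main.rs` and the llama.cpp grammar documentation for the authoritative option names and syntax.

```rust
// A minimal sketch, assuming the GGML backend accepts a "grammar" option in
// its metadata JSON (key names are illustrative, not authoritative).
use serde_json::json;

fn main() {
    // A tiny GBNF-style grammar (loosely adapted from the llama.cpp grammar
    // docs) that restricts output to a flat JSON object of string pairs.
    let grammar = r#"
root   ::= "{" ws pair ("," ws pair)* ws "}"
pair   ::= ws string ws ":" ws string
string ::= "\"" [A-Za-z0-9 ]* "\""
ws     ::= [ \t\n]*
"#;

    // Options that might be passed to the model as wasi-nn metadata.
    let options = json!({
        "n_predict": 99,
        "grammar": grammar
    });

    println!("{}", options);
}
```

With a grammar like this in place, the sampler can only pick tokens that keep the output inside the defined JSON shape, which is why the assistant's answer in the next section comes back as a single well-formed object.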

## Execute

In this example, we are going to use the `n_predict` option to prevent the model from generating too much output.

```console
$ wasmedge --dir .:. \
--env n_predict=99 \
--nn-preload default:GGML:AUTO:llama-2-7b.Q5_K_M.gguf \
wasmedge-ggml-grammar.wasm default

USER:
JSON object with 5 country names as keys and their capitals as values:
ASSISTANT:
{"US": "Washington", "UK": "London", "Germany": "Berlin", "France": "Paris", "Italy": "Rome"}
```