From 440d0dd6c63f817a15d7175d42d67aa7d7a97d5b Mon Sep 17 00:00:00 2001
From: Jeremy Fowers <80718789+jeremyfowers@users.noreply.github.com>
Date: Thu, 29 Aug 2024 11:41:21 -0400
Subject: [PATCH] Fix installation link

Signed-off-by: Jeremy Fowers <80718789+jeremyfowers@users.noreply.github.com>
---
 src/turnkeyml/llm/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/src/turnkeyml/llm/README.md b/src/turnkeyml/llm/README.md
index 92308ea..14f5ea1 100644
--- a/src/turnkeyml/llm/README.md
+++ b/src/turnkeyml/llm/README.md
@@ -102,7 +102,7 @@ You can also try Phi-3-Mini-128k-Instruct with the following commands:
 
 > Note: no other models or devices are officially supported by `lemonade` on OGA at this time. Contributions appreciated!
 
-## Install Ryzen AI NPU
+## Install RyzenAI NPU
 
 To run your LLMs on Ryzen AI NPU, first install and set up the `ryzenai-transformers` conda environment (see instructions [here](https://github.com/amd/RyzenAI-SW/tree/main/example/transformers)). Then, install `lemonade` into `ryzenai-transformers`. The `ryzenai-npu-load` Tool will become available in that environment.
 
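
For context, the setup flow described in the patched README section might look roughly like the sketch below. The exact environment-creation steps live in the linked RyzenAI-SW instructions, and the precise `pip` package/extras spec for `lemonade` is an assumption here, so treat these as placeholders rather than verified commands.

```bash
# Rough sketch of the flow the README describes -- NOT verified commands.
# 1) Set up the ryzenai-transformers conda environment by following the
#    instructions in the linked RyzenAI-SW repository.
conda activate ryzenai-transformers   # environment name taken from the README

# 2) Install lemonade (shipped as part of turnkeyml) into that same
#    environment; the exact package/extras spec below is an assumption.
pip install turnkeyml[llm]

# 3) The ryzenai-npu-load Tool should then be available when running
#    lemonade inside this environment.
```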