
Node.js OpenVINO LLM C++ addon

This is a Node.js addon for OpenVINO GenAI LLM inference. It has been tested with the TinyLlama 1.1B Chat OpenVINO int4 model on Windows 11 (Intel Core i7 CPU).

Watch the YouTube video below for a demo:

(YouTube demo video thumbnail)

Build

Run the following commands to build:

npm install
node-gyp configure
node-gyp build
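
node-gyp drives the build from a binding.gyp file in the repository root. The sketch below is only an illustration of what such a file can look like, assuming node-addon-api is used and a hypothetical target name ovllm with a source file src/addon.cpp; the repository's actual binding.gyp will list its own sources plus the OpenVINO GenAI include and library paths.

{
  "targets": [
    {
      "target_name": "ovllm",
      "sources": [ "src/addon.cpp" ],
      "include_dirs": [
        "<!@(node -p \"require('node-addon-api').include\")"
      ]
    }
  ]
}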

Run

To test the Node.js OpenVINO LLM addon, run the index.js script with the path to an OpenVINO model:

node index.js D:/demo/TinyLlama-1.1B-Chat-v1.0-openvino-int4

To disable streaming, pass nostream as the second argument:

node index.js D:/demo/TinyLlama-1.1B-Chat-v1.0-openvino-int4 nostream
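
The index.js driver is a thin wrapper around the compiled addon. The sketch below illustrates how such a script could call it; it is not the repository's actual index.js, and it assumes the compiled addon is loaded from build/Release/ovllm.node and exports a hypothetical generate(modelPath, prompt, onToken) function. The real exported names may differ.

// Minimal sketch, assuming a hypothetical generate(modelPath, prompt, onToken) export.
const addon = require('./build/Release/ovllm.node');

const modelPath = process.argv[2];                 // e.g. path to the TinyLlama int4 model folder
const streaming = process.argv[3] !== 'nostream';  // "nostream" disables token streaming

const prompt = 'What is OpenVINO?';

if (streaming) {
  // Print tokens as they are generated.
  addon.generate(modelPath, prompt, (token) => process.stdout.write(token));
} else {
  // Wait for the full completion and print it once.
  const answer = addon.generate(modelPath, prompt, null);
  console.log(answer);
}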

Supported models

The supported models are listed here.
