rupeshs / ovllm_node_addon

OpenVINO LLM Node.js C++ addon

Repository: https://github.com/rupeshs/ovllm_node_addon

Node.js OpenVINO LLM C++ addon

This is a Node.js addon for OpenVINO GenAI LLMs. Tested with the TinyLlama 1.1B Chat OpenVINO int4 model on Windows 11 (Intel Core i7 CPU).

A demo video is available on YouTube.

Build

Run the following commands to build:

npm install
node-gyp configure
node-gyp build
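node-gyp reads its build configuration from a binding.gyp file in the repository root. A minimal sketch of what such a file looks like for an addon like this one (the target name, source file name, and library flags here are assumptions for illustration, not the repository's actual configuration):

```json
{
  "targets": [
    {
      "target_name": "ovllm_node_addon",
      "sources": ["ovllm_node_addon.cc"],
      "include_dirs": [
        "<!(node -p \"require('node-addon-api').include\")"
      ],
      "defines": ["NAPI_DISABLE_CPP_EXCEPTIONS"],
      "libraries": ["-lopenvino", "-lopenvino_genai"]
    }
  ]
}
```

Check the repository's own binding.gyp for the actual target name and the OpenVINO include/library paths required on your system.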

Run

To test the Node.js OpenVINO LLM addon, run the index.js script with the path to an OpenVINO LLM model:

node index.js D:/demo/TinyLlama-1.1B-Chat-v1.0-openvino-int4

To disable streaming, pass the nostream flag:

node index.js D:/demo/TinyLlama-1.1B-Chat-v1.0-openvino-int4 nostream
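The two commands above differ only in the trailing nostream argument. A sketch of how index.js might parse that command line (the parseArgs helper and the generate() call shown in comments are illustrative assumptions, not the repository's actual code):

```javascript
// Parse the CLI: a required model directory, plus an optional
// "nostream" flag that turns token streaming off.
function parseArgs(argv) {
  const [modelDir, mode] = argv;
  if (!modelDir) {
    throw new Error('Usage: node index.js <model_dir> [nostream]');
  }
  return { modelDir, streaming: mode !== 'nostream' };
}

// Hypothetical use with the compiled addon (names assumed):
// const ovllm = require('./build/Release/ovllm_node_addon');
// const { modelDir, streaming } = parseArgs(process.argv.slice(2));
// ovllm.generate(modelDir, 'Hello!', streaming);
```

See the repository's index.js for the addon's real exported functions and their signatures.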

Supported models

See the OpenVINO GenAI documentation for the list of supported models.
