Search npm for packages to use with Deno
Add them to your projects with a simple command that will already feel familiar.
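As a sketch of that workflow (using the first package from this list as an illustrative choice), a package can be added with Deno's npm support, or imported directly with an `npm:` specifier:

```shell
# Add an npm package to a Deno project (records it in the project config).
deno add npm:node-llama-cpp
```

Alternatively, an `npm:` specifier such as `import { ... } from "npm:node-llama-cpp";` works without a separate install step.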
node-llama-cpp
Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
@huggingface/gguf
a GGUF parser that works on remotely hosted files
@huggingface/ollama-utils
Various utilities for maintaining Ollama compatibility with models on the Hugging Face Hub
hyllama
llama.cpp GGUF file parser for JavaScript
catai
Chat UI and Local API for the Llama models
gufflabs
Lightweight JavaScript package for running GGUF language models
@duck4i/llama
Native Node.js plugin to run LLaMA inference directly on your machine with no other dependencies.
custom-koya-node-llama-cpp
Run AI models locally on your machine with Node.js bindings for llama.cpp. Force a JSON schema on the model output at the generation level.
@ngxson/gguf-test
a GGUF parser that works on remotely hosted files
browser-llm-engine
A browser-friendly library for running LLM inference using Wllama with preset and dynamic model loading, caching, and download capabilities.