A native Capacitor plugin that embeds llama.cpp directly into mobile apps, enabling offline AI inference with a chat-first API design. Complete iOS and Android support: text generation, chat, multimodal, TTS, LoRA, embeddings, and more.
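As a rough illustration of what a "chat-first" Capacitor plugin API can look like, the sketch below registers a plugin and sends structured chat messages instead of a raw prompt string. The interface, method names (`initContext`, `completion`), option shapes, and model path are assumptions made for illustration, not llama-cpp-capacitor's documented surface; only `registerPlugin` from `@capacitor/core` is a real API.

```typescript
// Hypothetical usage sketch — the plugin interface, method names, and option
// shapes below are assumptions for illustration, not the plugin's documented API.
import { registerPlugin } from '@capacitor/core';

// Assumed shape: a context handle plus a chat-style completion call.
interface LlamaCppPlugin {
  initContext(options: { modelPath: string; nCtx?: number }): Promise<{ contextId: number }>;
  completion(options: {
    contextId: number;
    messages: { role: 'system' | 'user' | 'assistant'; content: string }[];
    nPredict?: number;
  }): Promise<{ text: string }>;
}

const LlamaCpp = registerPlugin<LlamaCppPlugin>('LlamaCpp');

async function chat(): Promise<string> {
  // Load a GGUF model bundled with the app (path is illustrative).
  const { contextId } = await LlamaCpp.initContext({
    modelPath: 'models/llama-3.2-1b-q4.gguf',
    nCtx: 2048,
  });
  // Chat-first: pass structured messages rather than a single prompt string.
  const { text } = await LlamaCpp.completion({
    contextId,
    messages: [
      { role: 'system', content: 'You are a helpful assistant.' },
      { role: 'user', content: 'Summarize llama.cpp in one sentence.' },
    ],
    nPredict: 128,
  });
  return text;
}
```

Because all inference runs in the embedded llama.cpp, calls like these work fully offline once the model file ships with (or is downloaded by) the app.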
llama-cpp-capacitor — low health score (61/100); consider alternatives
Get this data programmatically — free, no authentication:

curl https://depscope.dev/api/check/npm/llama-cpp-capacitor

First published · 2025-08-29T23:51:14.268Z
Last updated · 2026-02-18T23:10:03.234Z
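For callers who prefer not to shell out to curl, the same endpoint can be fetched directly. A minimal sketch, assuming Node 18+ (global `fetch`); the response field names (`score`, `firstPublished`, `lastUpdated`) are guesses inferred from this page, not a documented schema:

```typescript
// Sketch of querying the depscope check endpoint programmatically.
// Response field names are assumptions inferred from this page, not a
// documented schema — inspect a real response and adjust.
interface DepscopeCheck {
  score?: number;          // e.g. 61
  firstPublished?: string; // ISO timestamp
  lastUpdated?: string;    // ISO timestamp
}

async function checkHealth(pkg: string): Promise<DepscopeCheck> {
  const res = await fetch(`https://depscope.dev/api/check/npm/${encodeURIComponent(pkg)}`);
  if (!res.ok) {
    throw new Error(`depscope check failed: ${res.status} ${res.statusText}`);
  }
  return (await res.json()) as DepscopeCheck;
}

checkHealth('llama-cpp-capacitor')
  .then((data) => console.log(data))
  .catch((err) => console.error(err));
```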