O-Lang resolver for local LLM inference via Ollama. Zero data leaves your infrastructure.
@o-lang/[email protected] low health (55/100) — consider alternatives
Get this data programmatically — free, no authentication.
curl https://depscope.dev/api/check/npm/@o-lang/llm-ollama
First published · 2026-04-25T20:43:41.416Z
Last updated · 2026-04-28T04:05:03.209Z
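The unauthenticated endpoint above can be wrapped in a few lines. This is a minimal sketch: the URL pattern comes from the curl example on this page, but the JSON field names (`score`, `status`) are assumptions, since the response schema is not shown here.

```python
API_BASE = "https://depscope.dev/api/check"

def check_url(registry: str, package: str) -> str:
    """Build the depscope check URL for a package.
    Scoped npm names keep their '/' in the path, matching
    the curl example on this page."""
    return f"{API_BASE}/{registry}/{package}"

def health_summary(payload: dict) -> str:
    """Format a health line from an assumed response shape;
    the 'score' and 'status' keys are hypothetical."""
    score = payload.get("score")
    status = payload.get("status", "unknown")
    return f"{status} ({score}/100)"

# Package from this page:
url = check_url("npm", "@o-lang/llm-ollama")
```

Fetching `url` with any HTTP client (curl, `urllib.request`, etc.) and passing the parsed JSON to `health_summary` would reproduce a line like the "low health (55/100)" shown above, assuming those field names.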