LLM inference in C/C++
llama.cpp@8990 is safe to use (health: 67/100)
Get this data programmatically (free, no authentication):
curl https://depscope.dev/api/check/homebrew/llama.cpp
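The same endpoint can be called from code. A minimal Python sketch, assuming the API returns JSON and that a `health` field carries the score shown above (the field name and response schema are assumptions, not confirmed by depscope.dev):

```python
import json
from urllib.parse import quote

BASE = "https://depscope.dev/api/check"

def check_url(ecosystem: str, package: str) -> str:
    # Build the check URL for a package, e.g. homebrew/llama.cpp.
    return f"{BASE}/{quote(ecosystem)}/{quote(package)}"

def parse_health(payload: str) -> int:
    # Extract the health score from a JSON response body.
    # NOTE: the "health" field name is an assumed schema detail.
    return int(json.loads(payload)["health"])

# Hypothetical response body, for illustration only:
sample = '{"package": "llama.cpp", "health": 67}'
print(check_url("homebrew", "llama.cpp"))
print(parse_health(sample))
```

Fetching the URL with `urllib.request.urlopen(check_url(...))` and passing the body to `parse_health` would complete the round trip, once the real schema is known.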
Last updated · 2026-05-01T02:01:48Z