A persistent, structured memory for LLM coding agents. It addresses the context-window problem by retrieving the decisive context for each edit under a small token budget, and measurably outperforms flat recency, LLM compaction, and classic RAG.
bellamem is safe to use (health: 56/100)
Get this data programmatically — free, no authentication.
curl https://depscope.dev/api/check/pypi/bellamem

Last updated · 2026-04-14T20:26:30.488453Z
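The same endpoint can be queried from Python instead of curl. The sketch below is a minimal example using only the standard library; it assumes the endpoint returns JSON, and the shape of that JSON (field names and so on) is not documented here, so the response is printed raw for inspection.

```python
import json
import urllib.request

BASE = "https://depscope.dev/api/check"  # endpoint shown on this page


def check_url(ecosystem: str, package: str) -> str:
    """Build the check URL for a package in a given ecosystem (e.g. pypi)."""
    return f"{BASE}/{ecosystem}/{package}"


def fetch_report(ecosystem: str, package: str) -> dict:
    """Fetch the health report; the API is free and needs no authentication.

    Assumes the response body is JSON -- the field layout is not specified
    here, so callers should inspect the parsed dict before relying on keys.
    """
    with urllib.request.urlopen(check_url(ecosystem, package)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Print the raw report rather than guessing at field names.
    print(fetch_report("pypi", "bellamem"))
```

Keeping the URL construction in its own function makes it easy to check other ecosystems or packages against the same endpoint.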