ollama-ai-provider

npm · v1.2.0

Vercel AI Provider for running LLMs locally using Ollama
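As a Vercel AI SDK provider, the package exposes an `ollama` factory whose models plug into SDK helpers such as `generateText`. A minimal sketch; the model tag `llama3.2` is an assumption (it must already be pulled, with Ollama listening on its default `localhost:11434`):

```typescript
import { generateText } from 'ai';
import { ollama } from 'ollama-ai-provider';

// Any locally pulled model tag works here; llama3.2 is just an example.
const { text } = await generateText({
  model: ollama('llama3.2'),
  prompt: 'Why is the sky blue?',
});

console.log(text);
```

When Ollama runs somewhere other than the default address, the package's `createOllama({ baseURL })` factory can be used instead of the preconfigured `ollama` export.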

License: Apache-2.0 (permissive) · 25 versions · 1 maintainer · 3 deps
sgomez/ollama-ai-provider
Health

44/100 · safe to use

ollama-ai-provider@1.2.0 is safe to use (health: 44/100)

Health breakdown (0 – 100)

- Maintenance: 5/25
- Popularity: 0/20
- Security: 25/25
- Maturity: 12/15
- Community: 2/15
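The overall 44/100 appears to be a straight sum of the five category scores; a quick check in TypeScript (the additive model is inferred from the numbers above, not documented by depscope):

```typescript
// Category scores from the health breakdown above: [score, max].
const breakdown: Record<string, [number, number]> = {
  maintenance: [5, 25],
  popularity: [0, 20],
  security: [25, 25],
  maturity: [12, 15],
  community: [2, 15],
};

// Summing scores and maxima reproduces the reported 44/100.
const total = Object.values(breakdown).reduce((sum, [score]) => sum + score, 0);
const max = Object.values(breakdown).reduce((sum, [, m]) => sum + m, 0);

console.log(`${total}/${max}`); // → 44/100
```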
Vulnerabilities: 0 (none known)

Bundle & TypeScript

📦 Bundle Size
39.0 KB minified · 11.1 KB gzipped
3 direct dependencies · ESM · side effects

🌟 TypeScript
10/10 typed · bundled


API access

Get this data programmatically — free, no authentication.

curl https://depscope.dev/api/check/npm/ollama-ai-provider
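The same endpoint is easy to reach from code. A tiny helper that builds the URL; the `/api/check/<registry>/<name>` path shape is generalized from the single curl example above and is an assumption for registries other than npm:

```typescript
// Build a depscope check URL following the curl example above.
function checkUrl(registry: string, name: string): string {
  // encodeURIComponent would also escape the "/" in scoped npm names
  // (e.g. @scope/pkg); whether the API expects that form is untested.
  return `https://depscope.dev/api/check/${registry}/${encodeURIComponent(name)}`;
}

console.log(checkUrl('npm', 'ollama-ai-provider'));
// → https://depscope.dev/api/check/npm/ollama-ai-provider
```

The returned URL can be passed to `fetch` (or curl, as above) with no authentication.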

First published · 2024-05-05T09:34:41.490Z

Last updated · 2025-01-17T16:54:36.767Z