vllm

pypi v0.19.0

A high-throughput and memory-efficient inference and serving engine for LLMs

81 versions · 1 maintainer · 88 deps · 3,004,441 weekly downloads
Repository: vllm-project/vllm

Health Score: 76/100

Recommendation

[email protected] is safe to use (health: 76/100)

Health Breakdown

maintenance: 25/25
popularity: 17/20
security: 17/25
maturity: 15/15
community: 2/15

Vulnerabilities (4, all medium severity)

CVE-2024-11041: vLLM Deserialization of Untrusted Data vulnerability

CVE-2024-9053: vLLM allows Remote Code Execution by Pickle Deserialization via AsyncEngineRPCServer() RPC server entrypoints

CVE-2024-9052: vLLM deserialization vulnerability in vllm.distributed.GroupCoordinator.recv_object

CVE-2024-8939: vLLM Denial of Service via the best_of parameter

API Access

Get this data programmatically — free, no authentication required:

curl https://depscope.dev/api/check/pypi/vllm
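The same endpoint can be queried from Python using only the standard library. This is a minimal sketch: the URL pattern is taken from the curl example above, but the shape of the JSON response (its field names) is not documented here, so the code only fetches and decodes it without assuming a schema.

```python
import json
import urllib.request


def depscope_url(registry: str, package: str) -> str:
    """Build the DepScope check URL, following the pattern in the curl example."""
    return f"https://depscope.dev/api/check/{registry}/{package}"


def fetch_report(registry: str, package: str) -> dict:
    """Fetch and decode the JSON report for a package (requires network access)."""
    with urllib.request.urlopen(depscope_url(registry, package)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Example usage: retrieve the report shown on this page.
    report = fetch_report("pypi", "vllm")
    print(json.dumps(report, indent=2))
```

No authentication header is needed, per the note above; error handling (timeouts, non-200 responses) is omitted for brevity.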

Last updated: 2026-04-03T04:05:52.513885Z

Data from DepScope — Package Intelligence for AI Agents