airllm
pypi · v2.11.0
AirLLM allows a single 4GB GPU to run 70B large language models without quantization, distillation, or pruning. 8GB of VRAM is enough to run 405B Llama 3.1.
License: MIT (permissive) · 33 versions · 1 maintainer · 8 dependencies · 3,180 weekly downloads
Repository: lyogavin/airllm
Health score: 58 / 100
Health: safe to use
airllm@2.11.0 is safe to use (health: 58/100)
Health breakdown (0 – 100):
maintenance: 5/25
popularity: 6/20
security: 25/25
maturity: 12/15
community: 10/15
Vulnerabilities: 0 (none known)
API access
Get this data programmatically, free and with no authentication required:

curl https://depscope.dev/api/check/pypi/airllm
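The same check can be made from Python using only the standard library. This is a minimal sketch: only the endpoint URL above appears on this page, so the shape of the JSON response is an assumption and the helper names (`health_url`, `fetch_health`) are illustrative.

```python
import json
import urllib.request

# Endpoint taken from the curl example above.
API_BASE = "https://depscope.dev/api/check"

def health_url(registry: str, package: str) -> str:
    """Build the check URL for a package, e.g. ("pypi", "airllm")."""
    return f"{API_BASE}/{registry}/{package}"

def fetch_health(registry: str, package: str) -> dict:
    """Fetch the health report and parse it as JSON.

    The response schema is not documented on this page, so the parsed
    dict is returned as-is rather than mapped to named fields.
    """
    with urllib.request.urlopen(health_url(registry, package)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(health_url("pypi", "airllm"))
```

Calling `fetch_health("pypi", "airllm")` performs the same request as the curl command; no API key or authentication header is needed.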
Last updated: 2024-09-21T02:52:22.091498Z