flash_attn
pypi · v2.8.3 · Flash Attention: Fast and Memory-Efficient Exact Attention
License: BSD-3-Clause · 75 versions · 1 maintainer · 2 deps · 522,601 weekly downloads
Repository: Dao-AILab/flash-attention
Health: 66/100 (safe to use)
flash_attn@2.8.3 is safe to use (health: 66/100)
Health breakdown (0–100):
- maintenance: 10/25
- popularity: 14/20
- security: 25/25
- maturity: 15/15
- community: 2/15
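The overall 66/100 score appears to be the sum of the five category scores (each with its own maximum); a quick check in Python, using the numbers from the breakdown above:

```python
# Category scores from the health breakdown above; per-category maxima differ
# (25 + 20 + 25 + 15 + 15 = 100 possible points in total).
breakdown = {
    "maintenance": 10,  # out of 25
    "popularity": 14,   # out of 20
    "security": 25,     # out of 25
    "maturity": 15,     # out of 15
    "community": 2,     # out of 15
}

health = sum(breakdown.values())
print(health)  # -> 66, matching the overall score shown on the page
```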
Vulnerabilities: 0 (none known)
API access
Get this data programmatically — free, no authentication.
curl https://depscope.dev/api/check/pypi/flash_attn
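The endpoint path follows a registry/package pattern. A small helper that builds the check URL for any package (the path layout is inferred from the single curl example above, and the response format is not documented here):

```python
def depscope_check_url(registry: str, package: str) -> str:
    """Build a depscope check URL.

    The /api/check/<registry>/<package> path pattern is an assumption
    inferred from the curl example for flash_attn; other registries may
    or may not be supported.
    """
    return f"https://depscope.dev/api/check/{registry}/{package}"

print(depscope_check_url("pypi", "flash_attn"))
# -> https://depscope.dev/api/check/pypi/flash_attn
```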
Last updated · 2025-08-15T08:28:12.911581Z