flash-linear-attention
pypi · v0.5.0 · Fast linear attention models and layers
13 versions · 7 deps · 92,736 weekly downloads
fla-org/flash-linear-attention
69 / 100
Health
[email protected] is safe to use (health: 69/100)
Health breakdown (0–100)
maintenance: 25/25
popularity: 10/20
security: 25/25
maturity: 9/15
community: 0/15
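The category subscores above add up exactly to the 69/100 overall score, which suggests the health score is a straight sum of the per-category points. A minimal sketch under that assumption (the real depscope.dev aggregation formula is not shown on this page):

```python
# Assumption: overall health = sum of category points, out of the sum of
# category maxima. The numbers are taken from the breakdown shown above.
breakdown = {
    "maintenance": (25, 25),
    "popularity": (10, 20),
    "security": (25, 25),
    "maturity": (9, 15),
    "community": (0, 15),
}

score = sum(points for points, _max in breakdown.values())
out_of = sum(cap for _points, cap in breakdown.values())
print(f"health: {score}/{out_of}")  # health: 69/100
```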
Vulnerabilities: 0 (none known)
API access
Get this data programmatically — free, no authentication.
curl https://depscope.dev/api/check/pypi/flash-linear-attention

Last updated · 2026-04-21T20:25:39.473551Z
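The same endpoint can be called from Python with the standard library. Only the URL comes from this page; the helper names and the assumption that the endpoint returns JSON are illustrative:

```python
import json
import urllib.request

# Hypothetical helpers around the depscope.dev check endpoint shown above.
def check_url(registry: str, package: str) -> str:
    """Build the check URL for a package (URL pattern taken from the page)."""
    return f"https://depscope.dev/api/check/{registry}/{package}"

def fetch_health(registry: str, package: str) -> dict:
    """Fetch the health report; assumes the endpoint returns a JSON body."""
    with urllib.request.urlopen(check_url(registry, package)) as resp:
        return json.load(resp)

print(check_url("pypi", "flash-linear-attention"))
```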