#machine-learning #blas #tensor

candle-flash-attn

Flash attention layer for the candle ML framework

23 unstable releases (7 breaking)

0.8.4 Mar 15, 2025
0.8.2 Jan 7, 2025
0.8.1 Dec 7, 2024
0.8.0 Nov 12, 2024
0.3.1 Nov 12, 2023

#1208 in Machine learning

Download history: roughly 15–194 downloads/week between Dec 2024 and Mar 2025

463 downloads per month
Used in 10 crates (6 directly)

MIT/Apache

2.5MB
27K SLoC

candle-flash-attn
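
The crate provides a fused flash-attention CUDA kernel for the candle ML framework. Below is a minimal usage sketch; the tensor shapes, the half-precision/CUDA requirements, and the exact `flash_attn` call shown here are assumptions based on common usage of this crate, not taken from this page.

```rust
// Minimal sketch: calling candle-flash-attn from candle.
// Assumes a CUDA device and half-precision tensors; the `flash_attn`
// signature used below is an assumption, check the crate docs.
use candle_core::{DType, Device, Result, Tensor};

fn main() -> Result<()> {
    let device = Device::new_cuda(0)?; // flash attention runs on a CUDA GPU
    let (batch, seq_len, heads, head_dim) = (1, 128, 8, 64);

    // Query, key, and value tensors laid out as (batch, seq_len, heads, head_dim).
    let q = Tensor::randn(0f32, 1.0, (batch, seq_len, heads, head_dim), &device)?
        .to_dtype(DType::F16)?;
    let k = q.clone();
    let v = q.clone();

    // Scaled dot-product attention with a causal mask, computed by the fused
    // kernel without materializing the full seq_len x seq_len attention matrix.
    let scale = 1.0 / (head_dim as f32).sqrt();
    let out = candle_flash_attn::flash_attn(&q, &k, &v, scale, /* causal = */ true)?;
    println!("attention output shape: {:?}", out.shape());
    Ok(())
}
```

Building the crate compiles the bundled CUDA kernels, which is why it pulls in a comparatively large dependency footprint (see below) and requires the CUDA toolkit at build time.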

Dependencies

~37MB
~846K SLoC