This is the repository card of kernels-community/sage-attention, pushed to the Hub. It was built for use with the kernels library. This card was automatically generated.
## How to use
```python
# Make sure `kernels` is installed: `pip install -U kernels`
from kernels import get_kernel

# Download the compiled kernel from the Hub
kernel_module = get_kernel("kernels-community/sage-attention")

# Access one of the kernel's functions and call it
per_block_int8 = kernel_module.per_block_int8
per_block_int8(...)
```
## Available functions

- `per_block_int8`
- `per_warp_int8`
- `sub_mean`
- `per_channel_fp8`
- `sageattn`
- `sageattn3_blackwell`
## Benchmarks

No benchmarks available yet.