# flash-attention-prebuild-wheels

## Packages

> [!NOTE]
> Since v0.5.0, wheels are built with a local version label indicating the CUDA and PyTorch versions. For example, `pip list` now shows `flash_attn==2.8.3+cu130torch2.9` instead of plain `flash_attn==2.8.3`.
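The release URLs in the tables below follow a single naming pattern. As a minimal sketch (the helper name and parameters are my own, derived from the filenames in the tables, not an official API of this repo), the download URL for a given Python/PyTorch/CUDA combination can be built like this:

```python
def wheel_url(release: str, flash: str, cuda: str, torch: str, py: str) -> str:
    """Build the download URL for a prebuilt flash_attn wheel.

    Version strings are flattened into the filename tags, e.g.
    CUDA "12.8" -> "cu128", Python "3.12" -> "cp312".
    """
    cu = "cu" + cuda.replace(".", "")
    cp = "cp" + py.replace(".", "")
    name = f"flash_attn-{flash}+{cu}torch{torch}-{cp}-{cp}-linux_x86_64.whl"
    base = "https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download"
    return f"{base}/{release}/{name}"

# Example: the Python 3.12 / PyTorch 2.8 / CUDA 12.8 wheel from the 2.8.3 table.
print(wheel_url("v0.5.4", "2.8.3", "12.8", "2.8", "3.12"))
```

Pass the resulting URL directly to `pip install`. Note that the local version label (`+cu128torch2.8`) only appears in wheels built since v0.5.0.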

## Table of Contents

### 🐧 Linux x86_64

#### Flash-Attention 2.8.3

Packages for Flash-Attention 2.8.3

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.9 | 2.5 | 12.4 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.8.3+cu124torch2.5-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.5 | 12.6 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.8.3+cu126torch2.5-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.6 | 12.4 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.8.3+cu124torch2.6-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.6 | 12.6 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.8.3+cu126torch2.6-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.7 | 12.4 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.8.3+cu124torch2.7-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.7 | 12.6 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.8.3+cu126torch2.7-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.8 | 12.4 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.8.3+cu124torch2.8-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.8 | 12.6 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.8.3+cu126torch2.8-cp39-cp39-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu124torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu126torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu126torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.8.3+cu128torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu128torch2.5-cp310-cp310-linux_x86_64.whl), [Download3(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu128torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu124torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu124torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu126torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu126torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.8.3+cu128torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu128torch2.6-cp310-cp310-linux_x86_64.whl), [Download3(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu128torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.9 | [Download1(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu129torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu124torch2.7-cp310-cp310-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu124torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu126torch2.7-cp310-cp310-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu126torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 12.8 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu128torch2.7-cp310-cp310-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu128torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.8 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu124torch2.8-cp310-cp310-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu124torch2.8-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.8 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu126torch2.8-cp310-cp310-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu126torch2.8-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.8 | 12.8 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu128torch2.8-cp310-cp310-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu128torch2.8-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.8 | 12.9 | [Download1(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu129torch2.8-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.9 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu124torch2.9-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.9 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu126torch2.9-cp310-cp310-linux_x86_64.whl), [Download2(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.8.3+cu126torch2.9-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.9 | 12.8 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.8.3+cu128torch2.9-cp310-cp310-linux_x86_64.whl), [Download2(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.8.3+cu128torch2.9-cp310-cp310-linux_x86_64.whl), [Download3(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu128torch2.9-cp310-cp310-linux_x86_64.whl), [Download4(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.8.3+cu128torch2.9-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.9 | 13.0 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.8.3+cu130torch2.9-cp310-cp310-linux_x86_64.whl), [Download2(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.8.3+cu130torch2.9-cp310-cp310-linux_x86_64.whl), [Download3(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu130torch2.9-cp310-cp310-linux_x86_64.whl), [Download4(v0.4.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.18/flash_attn-2.8.3+cu130torch2.9-cp310-cp310-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu124torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu126torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu126torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.8.3+cu128torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu128torch2.5-cp311-cp311-linux_x86_64.whl), [Download3(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu128torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.6 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu124torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu124torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.6 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu126torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu126torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.6 | 12.8 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu128torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu128torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.6 | 12.9 | [Download1(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu129torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.7 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu124torch2.7-cp311-cp311-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu124torch2.7-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.7 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu126torch2.7-cp311-cp311-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu126torch2.7-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.7 | 12.8 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu128torch2.7-cp311-cp311-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu128torch2.7-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.8 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu124torch2.8-cp311-cp311-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu124torch2.8-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.8 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu126torch2.8-cp311-cp311-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu126torch2.8-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.8 | 12.8 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu128torch2.8-cp311-cp311-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu128torch2.8-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.8 | 12.9 | [Download1(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu129torch2.8-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.9 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu124torch2.9-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.9 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu126torch2.9-cp311-cp311-linux_x86_64.whl), [Download2(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.8.3+cu126torch2.9-cp311-cp311-linux_x86_64.whl), [Download3(v0.4.15)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.15/flash_attn-2.8.3+cu126torch2.9-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.9 | 12.8 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.8.3+cu128torch2.9-cp311-cp311-linux_x86_64.whl), [Download2(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.8.3+cu128torch2.9-cp311-cp311-linux_x86_64.whl), [Download3(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu128torch2.9-cp311-cp311-linux_x86_64.whl), [Download4(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.8.3+cu128torch2.9-cp311-cp311-linux_x86_64.whl), [Download5(v0.4.15)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.15/flash_attn-2.8.3+cu128torch2.9-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.9 | 13.0 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.8.3+cu130torch2.9-cp311-cp311-linux_x86_64.whl), [Download2(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.8.3+cu130torch2.9-cp311-cp311-linux_x86_64.whl), [Download3(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu130torch2.9-cp311-cp311-linux_x86_64.whl), [Download4(v0.4.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.18/flash_attn-2.8.3+cu130torch2.9-cp311-cp311-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu124torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu126torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu126torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.8.3+cu128torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu128torch2.5-cp312-cp312-linux_x86_64.whl), [Download3(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu128torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu124torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu124torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu126torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu126torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.8.3+cu128torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu128torch2.6-cp312-cp312-linux_x86_64.whl), [Download3(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu128torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.9 | [Download1(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu129torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.7 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu124torch2.7-cp312-cp312-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu124torch2.7-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.7 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu126torch2.7-cp312-cp312-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu126torch2.7-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.7 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.8.3+cu128torch2.7-cp312-cp312-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu128torch2.7-cp312-cp312-linux_x86_64.whl), [Download3(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu128torch2.7-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.8 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu124torch2.8-cp312-cp312-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu124torch2.8-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.8 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu126torch2.8-cp312-cp312-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu126torch2.8-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.8 | 12.8 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu128torch2.8-cp312-cp312-linux_x86_64.whl), [Download2(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu128torch2.8-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.8 | 12.9 | [Download1(v0.4.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu129torch2.8-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.9 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu124torch2.9-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.9 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu126torch2.9-cp312-cp312-linux_x86_64.whl), [Download2(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.8.3+cu126torch2.9-cp312-cp312-linux_x86_64.whl), [Download3(v0.4.15)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.15/flash_attn-2.8.3+cu126torch2.9-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.9 | 12.8 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.8.3+cu128torch2.9-cp312-cp312-linux_x86_64.whl), [Download2(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.8.3+cu128torch2.9-cp312-cp312-linux_x86_64.whl), [Download3(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu128torch2.9-cp312-cp312-linux_x86_64.whl), [Download4(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.8.3+cu128torch2.9-cp312-cp312-linux_x86_64.whl), [Download5(v0.4.15)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.15/flash_attn-2.8.3+cu128torch2.9-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.9 | 13.0 | [Download1(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.8.3+cu130torch2.9-cp312-cp312-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.8.3+cu130torch2.9-cp312-cp312-linux_x86_64.whl), [Download3(v0.4.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.18/flash_attn-2.8.3+cu130torch2.9-cp312-cp312-linux_x86_64.whl) |
| 3.13 | 2.5 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.8.3+cu128torch2.5-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.6 | 12.4 | [Download1(v0.4.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.12/flash_attn-2.8.3+cu124torch2.6-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.6 | 12.6 | [Download1(v0.4.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.12/flash_attn-2.8.3+cu126torch2.6-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.6 | 12.8 | [Download1(v0.4.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.12/flash_attn-2.8.3+cu128torch2.6-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.6 | 12.9 | [Download1(v0.4.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.12/flash_attn-2.8.3+cu129torch2.6-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.7 | 12.4 | [Download1(v0.4.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.12/flash_attn-2.8.3+cu124torch2.7-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.7 | 12.6 | [Download1(v0.4.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.12/flash_attn-2.8.3+cu126torch2.7-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.7 | 12.8 | [Download1(v0.4.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.12/flash_attn-2.8.3+cu128torch2.7-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.8 | 12.4 | [Download1(v0.4.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.12/flash_attn-2.8.3+cu124torch2.8-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.8 | 12.6 | [Download1(v0.4.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.12/flash_attn-2.8.3+cu126torch2.8-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.8 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.8.3+cu128torch2.8-cp313-cp313-linux_x86_64.whl), [Download2(v0.4.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.12/flash_attn-2.8.3+cu128torch2.8-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.8 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.8.3+cu129torch2.8-cp313-cp313-linux_x86_64.whl), [Download2(v0.4.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.12/flash_attn-2.8.3+cu129torch2.8-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.9 | 12.6 | [Download1(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.8.3+cu126torch2.9-cp313-cp313-linux_x86_64.whl), [Download2(v0.4.15)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.15/flash_attn-2.8.3+cu126torch2.9-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.9 | 12.8 | [Download1(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.8.3+cu128torch2.9-cp313-cp313-linux_x86_64.whl), [Download2(v0.4.15)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.15/flash_attn-2.8.3+cu128torch2.9-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.9 | 13.0 | [Download1(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.8.3+cu130torch2.9-cp313-cp313-linux_x86_64.whl), [Download2(v0.5.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.2/flash_attn-2.8.3+cu130torch2.9-cp313-cp313-linux_x86_64.whl), [Download3(v0.4.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.18/flash_attn-2.8.3+cu130torch2.9-cp313-cp313-linux_x86_64.whl) |
| 3.14 | 2.9 | 12.8 | [Download1(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.8.3+cu128torch2.9-cp314-cp314-linux_x86_64.whl) |
| 3.14 | 2.9 | 13.0 | [Download1(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.8.3+cu130torch2.9-cp314-cp314-linux_x86_64.whl) |

#### Flash-Attention 2.8.2

Packages for Flash-Attention 2.8.2

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.5 | 12.4 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu124torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.8 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu128torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.8 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu128torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.9 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu129torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 12.4 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu124torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 12.8 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu128torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.8 | 12.4 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu124torch2.8-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.8 | 12.8 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu128torch2.8-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.8 | 12.9 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu129torch2.8-cp310-cp310-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.4 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu124torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.8 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu128torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.6 | 12.4 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu124torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.6 | 12.8 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu128torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.6 | 12.9 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu129torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.7 | 12.4 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu124torch2.7-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.7 | 12.8 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu128torch2.7-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.8 | 12.4 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu124torch2.8-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.8 | 12.8 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu128torch2.8-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.8 | 12.9 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu129torch2.8-cp311-cp311-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.4 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu124torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.8 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu128torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.4 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu124torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.8 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu128torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.9 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu129torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.7 | 12.4 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu124torch2.7-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.7 | 12.8 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu128torch2.7-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.8 | 12.4 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu124torch2.8-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.8 | 12.8 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu128torch2.8-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.8 | 12.9 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu129torch2.8-cp312-cp312-linux_x86_64.whl) |

Flash-Attention 2.8.1

Packages for Flash-Attention 2.8.1

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.4 | 12.8 | [Download1(v0.3.13)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.13/flash_attn-2.8.1+cu128torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.8 | [Download1(v0.3.13)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.13/flash_attn-2.8.1+cu128torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.8 | [Download1(v0.3.13)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.13/flash_attn-2.8.1+cu128torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 12.8 | [Download1(v0.3.13)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.13/flash_attn-2.8.1+cu128torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.9 | 12.8 | [Download1(v0.4.22)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.22/flash_attn-2.8.1+cu128torch2.9-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.9 | 13.0 | [Download1(v0.4.22)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.22/flash_attn-2.8.1+cu130torch2.9-cp310-cp310-linux_x86_64.whl) |
| 3.11 | 2.4 | 12.8 | [Download1(v0.3.13)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.13/flash_attn-2.8.1+cu128torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.8 | [Download1(v0.3.13)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.13/flash_attn-2.8.1+cu128torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.6 | 12.8 | [Download1(v0.3.13)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.13/flash_attn-2.8.1+cu128torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.7 | 12.8 | [Download1(v0.3.13)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.13/flash_attn-2.8.1+cu128torch2.7-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.9 | 12.8 | [Download1(v0.4.22)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.22/flash_attn-2.8.1+cu128torch2.9-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.9 | 13.0 | [Download1(v0.4.22)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.22/flash_attn-2.8.1+cu130torch2.9-cp311-cp311-linux_x86_64.whl) |
| 3.12 | 2.4 | 12.8 | [Download1(v0.3.13)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.13/flash_attn-2.8.1+cu128torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.8 | [Download1(v0.3.13)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.13/flash_attn-2.8.1+cu128torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.8 | [Download1(v0.3.13)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.13/flash_attn-2.8.1+cu128torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.7 | 12.8 | [Download1(v0.3.13)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.13/flash_attn-2.8.1+cu128torch2.7-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.9 | 12.8 | [Download1(v0.4.22)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.22/flash_attn-2.8.1+cu128torch2.9-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.9 | 13.0 | [Download1(v0.4.22)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.22/flash_attn-2.8.1+cu130torch2.9-cp312-cp312-linux_x86_64.whl) |
| 3.13 | 2.9 | 12.8 | [Download1(v0.4.22)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.22/flash_attn-2.8.1+cu128torch2.9-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.9 | 13.0 | [Download1(v0.4.22)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.22/flash_attn-2.8.1+cu130torch2.9-cp313-cp313-linux_x86_64.whl) |

Flash-Attention 2.8.0

Packages for Flash-Attention 2.8.0

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.4 | 12.1 | [Download1(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu121torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.4 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu124torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu124torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.8 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu128torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu128torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.1 | [Download1(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu121torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.4 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu124torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.8 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu128torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu128torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.4 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu124torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu124torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.8 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu128torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 12.4 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu124torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 12.8 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu128torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.11 | 2.4 | 12.1 | [Download1(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu121torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 12.4 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download2(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu124torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 12.8 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu128torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.1 | [Download1(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu121torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.4 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu124torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.8 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu128torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.6 | 12.4 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu124torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu124torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.6 | 12.8 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu128torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu128torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.7 | 12.4 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu124torch2.7-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.7 | 12.8 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu128torch2.7-cp311-cp311-linux_x86_64.whl), [Download2(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu128torch2.7-cp311-cp311-linux_x86_64.whl) |
| 3.12 | 2.4 | 12.1 | [Download1(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu121torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 12.4 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu124torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 12.8 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu128torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.1 | [Download1(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu121torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.4 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu124torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.8 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu128torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu128torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.4 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu124torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.11)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.11/flash_attn-2.8.0+cu124torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.8 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu128torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.7 | 12.4 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu124torch2.7-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.7 | 12.8 | [Download1(v0.3.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.12/flash_attn-2.8.0+cu128torch2.7-cp312-cp312-linux_x86_64.whl) |
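Every wheel in the tables above follows one naming pattern: `flash_attn-{flash}+cu{cuda}torch{torch}-cp{py}-cp{py}-linux_x86_64.whl`, hosted under a release tag. As an illustrative sketch (the helper name `wheel_url` is an assumption for this example, not part of this repository), a download URL for any row can be assembled like this:

```python
# Sketch: assemble a wheel download URL from the naming pattern used in the
# tables above. The function name is illustrative, not part of this repo.

BASE = "https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download"

def wheel_url(tag: str, flash: str, cuda: str, torch: str, python: str) -> str:
    """Build the release-asset URL for one Python/PyTorch/CUDA combination.

    CUDA "12.8" -> "cu128", Python "3.12" -> "cp312" (dots dropped);
    the PyTorch version keeps its dot, e.g. "torch2.7".
    """
    cu = "cu" + cuda.replace(".", "")
    cp = "cp" + python.replace(".", "")
    wheel = f"flash_attn-{flash}+{cu}torch{torch}-{cp}-{cp}-linux_x86_64.whl"
    return f"{BASE}/{tag}/{wheel}"

# Example row: Python 3.12 / PyTorch 2.7 / CUDA 12.8 for Flash-Attention 2.8.0
print(wheel_url("v0.3.12", "2.8.0", "12.8", "2.7", "3.12"))
```

The resulting URL can be passed directly to `pip install`, the same as clicking a Download link in the table.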

Flash-Attention 2.7.4.post1

Packages for Flash-Attention 2.7.4.post1

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.9 | 2.5 | 12.4 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.7.4.post1+cu124torch2.5-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.5 | 12.6 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.7.4.post1+cu126torch2.5-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.6 | 12.4 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.7.4.post1+cu124torch2.6-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.6 | 12.6 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.7.4.post1+cu126torch2.6-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.7 | 12.4 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.7.4.post1+cu124torch2.7-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.7 | 12.6 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.7.4.post1+cu126torch2.7-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.8 | 12.4 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.7.4.post1+cu124torch2.8-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.8 | 12.6 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.7.4.post1+cu126torch2.8-cp39-cp39-linux_x86_64.whl) |
| 3.10 | 2.2 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu124torch2.2-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu124torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu126torch2.2-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu126torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu124torch2.3-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu124torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu126torch2.3-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu126torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu118torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu118torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu124torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu124torch2.4-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu124torch2.4-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu124torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu126torch2.4-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu126torch2.4-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu126torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu118torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu118torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu124torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu126torch2.5-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu126torch2.5-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu126torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu118torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu118torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu124torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu124torch2.6-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu124torch2.6-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu124torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu126torch2.6-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu126torch2.6-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu126torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu118torch2.7-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu118torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.7-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu126torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.9 | 12.6 | [Download1(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.7.4.post1+cu126torch2.9-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.9 | 13.0 | [Download1(v0.4.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.18/flash_attn-2.7.4.post1+cu130torch2.9-cp310-cp310-linux_x86_64.whl) |
| 3.11 | 2.2 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu124torch2.2-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu124torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu126torch2.2-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu126torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu124torch2.3-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu124torch2.3-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu126torch2.3-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu126torch2.3-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu118torch2.4-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu118torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu124torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.4-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu126torch2.4-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu126torch2.4-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu126torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu118torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu118torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu124torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu126torch2.5-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu126torch2.5-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu126torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.6 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu118torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu118torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.6 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu124torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu124torch2.6-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu124torch2.6-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu124torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.6 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu126torch2.6-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu126torch2.6-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu126torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.7 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu118torch2.7-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu118torch2.7-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.7 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.7-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu126torch2.7-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.9 | 12.6 | [Download1(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.7.4.post1+cu126torch2.9-cp311-cp311-linux_x86_64.whl) |
| 3.12 | 2.2 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu124torch2.2-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu124torch2.2-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.2 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu126torch2.2-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu126torch2.2-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.3 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu124torch2.3-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu124torch2.3-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.3 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu126torch2.3-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu126torch2.3-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu118torch2.4-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu118torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu124torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.4-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu126torch2.4-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu126torch2.4-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu126torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu118torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu118torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu124torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu126torch2.5-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu126torch2.5-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu126torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu118torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu118torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu124torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu124torch2.6-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu124torch2.6-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu124torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu126torch2.6-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.7.4.post1+cu126torch2.6-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.7.4.post1+cu126torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.7 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu118torch2.7-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu118torch2.7-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.7 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.7-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.7.4.post1+cu126torch2.7-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.9 | 12.6 |
[Download1(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.7.4.post1+cu126torch2.9-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.9 | 13.0 | [Download1(v0.4.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.18/flash_attn-2.7.4.post1+cu130torch2.9-cp312-cp312-linux_x86_64.whl) | | 3.13 | 2.9 | 12.6 | [Download1(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.7.4.post1+cu126torch2.9-cp313-cp313-linux_x86_64.whl) |

### Flash-Attention 2.7.4

Packages for Flash-Attention 2.7.4

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.5 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.7.4+cu124torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu126torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu128torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu124torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu124torch2.6-cp310-cp310-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.7.4+cu124torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu126torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu128torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu129torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu129torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu124torch2.7-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu124torch2.7-cp310-cp310-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.7.4+cu124torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu126torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.7-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu128torch2.7-cp310-cp310-linux_x86_64.whl), [Download3(v0.3.10)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.10/flash_attn-2.7.4+cu128torch2.7-cp310-cp310-linux_x86_64.whl), [Download4(v0.1.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.1.0/flash_attn-2.7.4+cu128torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.8 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu124torch2.8-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu124torch2.8-cp310-cp310-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.7.4+cu124torch2.8-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.8 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu126torch2.8-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.8 | 12.8 | [Download1(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu128torch2.8-cp310-cp310-linux_x86_64.whl), [Download2(v0.2.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.1/flash_attn-2.7.4+cu128torch2.8-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.8 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu129torch2.8-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu129torch2.8-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.9 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu124torch2.9-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.9 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu126torch2.9-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.9 | 12.8 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.7.4+cu128torch2.9-cp310-cp310-linux_x86_64.whl), [Download2(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.7.4+cu128torch2.9-cp310-cp310-linux_x86_64.whl), [Download3(v0.4.21)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.21/flash_attn-2.7.4+cu128torch2.9-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.9 | 13.0 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.7.4+cu130torch2.9-cp310-cp310-linux_x86_64.whl), [Download2(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.7.4+cu130torch2.9-cp310-cp310-linux_x86_64.whl), [Download3(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu130torch2.9-cp310-cp310-linux_x86_64.whl), [Download4(v0.4.21)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.21/flash_attn-2.7.4+cu130torch2.9-cp310-cp310-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.7.4+cu124torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu126torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu128torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.6 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu124torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu124torch2.6-cp311-cp311-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.7.4+cu124torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.6 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu126torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.6 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu128torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.6 | 12.9 | [Download1(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu129torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.7 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu124torch2.7-cp311-cp311-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu124torch2.7-cp311-cp311-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.7.4+cu124torch2.7-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.7 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu126torch2.7-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.7 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.7-cp311-cp311-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu128torch2.7-cp311-cp311-linux_x86_64.whl), [Download3(v0.3.10)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.10/flash_attn-2.7.4+cu128torch2.7-cp311-cp311-linux_x86_64.whl), [Download4(v0.1.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.1.0/flash_attn-2.7.4+cu128torch2.7-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.8 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu124torch2.8-cp311-cp311-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu124torch2.8-cp311-cp311-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.7.4+cu124torch2.8-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.8 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu126torch2.8-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.8 | 12.8 | [Download1(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu128torch2.8-cp311-cp311-linux_x86_64.whl), [Download2(v0.2.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.1/flash_attn-2.7.4+cu128torch2.8-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.8 | 12.9 | [Download1(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu129torch2.8-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.9 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu124torch2.9-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.9 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu126torch2.9-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.9 | 12.8 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.7.4+cu128torch2.9-cp311-cp311-linux_x86_64.whl), [Download2(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.7.4+cu128torch2.9-cp311-cp311-linux_x86_64.whl), [Download3(v0.4.21)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.21/flash_attn-2.7.4+cu128torch2.9-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.9 | 13.0 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.7.4+cu130torch2.9-cp311-cp311-linux_x86_64.whl), [Download2(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.7.4+cu130torch2.9-cp311-cp311-linux_x86_64.whl), [Download3(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu130torch2.9-cp311-cp311-linux_x86_64.whl), [Download4(v0.4.21)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.21/flash_attn-2.7.4+cu130torch2.9-cp311-cp311-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.7.4+cu124torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu126torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.8 | [Download1(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu128torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu124torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu124torch2.6-cp312-cp312-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.7.4+cu124torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu126torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.8 | [Download1(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu128torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.9 | [Download1(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu129torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.7 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu124torch2.7-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu124torch2.7-cp312-cp312-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.7.4+cu124torch2.7-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.7 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu126torch2.7-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.7 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.7-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu128torch2.7-cp312-cp312-linux_x86_64.whl), [Download3(v0.3.10)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.10/flash_attn-2.7.4+cu128torch2.7-cp312-cp312-linux_x86_64.whl), [Download4(v0.1.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.1.0/flash_attn-2.7.4+cu128torch2.7-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.8 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu124torch2.8-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu124torch2.8-cp312-cp312-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.7.4+cu124torch2.8-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.8 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu126torch2.8-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.8 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.8-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu128torch2.8-cp312-cp312-linux_x86_64.whl), [Download3(v0.2.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.1/flash_attn-2.7.4+cu128torch2.8-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.8 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu129torch2.8-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu129torch2.8-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.9 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu124torch2.9-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.9 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.7.4+cu126torch2.9-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.9 | 12.8 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.7.4+cu128torch2.9-cp312-cp312-linux_x86_64.whl), [Download2(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.7.4+cu128torch2.9-cp312-cp312-linux_x86_64.whl), [Download3(v0.4.21)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.21/flash_attn-2.7.4+cu128torch2.9-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.9 | 13.0 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.7.4+cu130torch2.9-cp312-cp312-linux_x86_64.whl), [Download2(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.7.4+cu130torch2.9-cp312-cp312-linux_x86_64.whl), [Download3(v0.4.21)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.21/flash_attn-2.7.4+cu130torch2.9-cp312-cp312-linux_x86_64.whl) |
| 3.13 | 2.6 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.6-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.8 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.8-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.8 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu129torch2.8-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.9 | 12.8 | [Download1(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.7.4+cu128torch2.9-cp313-cp313-linux_x86_64.whl), [Download2(v0.4.21)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.21/flash_attn-2.7.4+cu128torch2.9-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.9 | 13.0 | [Download1(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.7.4+cu130torch2.9-cp313-cp313-linux_x86_64.whl), [Download2(v0.4.21)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.21/flash_attn-2.7.4+cu130torch2.9-cp313-cp313-linux_x86_64.whl) |
| 3.14 | 2.9 | 12.8 | [Download1(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.7.4+cu128torch2.9-cp314-cp314-linux_x86_64.whl) |
| 3.14 | 2.9 | 13.0 | [Download1(v0.6.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.8/flash_attn-2.7.4+cu130torch2.9-cp314-cp314-linux_x86_64.whl) |

### Flash-Attention 2.7.3

Packages for Flash-Attention 2.7.3 | Python | PyTorch | CUDA | package | | ------ | ------- | ---- | ------- | | 3.10 | 2.0 | 11.8 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu118torch2.0-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.1 | 11.8 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu118torch2.1-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.1 | 12.1 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu121torch2.1-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.1 | 12.4 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu124torch2.1-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.2 | 11.8 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu118torch2.2-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.2 | 12.1 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu121torch2.2-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.2 | 12.4 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu124torch2.2-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.3 | 11.8 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu118torch2.3-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.3 | 12.1 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu121torch2.3-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.3 | 12.4 | 
[Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu124torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 11.8 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu118torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.1 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu121torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.4 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu124torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 11.8 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu118torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.1 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu121torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.4 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu124torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.11 | 2.0 | 11.8 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu118torch2.0-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.1 | 11.8 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu118torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.1 | 12.1 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu121torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.1 | 12.4 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu124torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 11.8 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu118torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 12.1 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu121torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 12.4 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu124torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 11.8 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu118torch2.3-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 12.1 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu121torch2.3-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 12.4 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu124torch2.3-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 11.8 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu118torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 12.1 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu121torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 12.4 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu124torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 11.8 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu118torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.1 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu121torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.4 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu124torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.12 | 2.2 | 11.8 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu118torch2.2-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.2 | 12.1 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu121torch2.2-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.2 | 12.4 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu124torch2.2-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.3 | 11.8 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu118torch2.3-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.3 | 12.1 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu121torch2.3-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.3 | 12.4 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu124torch2.3-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 11.8 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu118torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 12.1 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu121torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 12.4 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu124torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 11.8 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu118torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.1 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu121torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.4 | [Download1(v0.0.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.4/flash_attn-2.7.3+cu124torch2.5-cp312-cp312-linux_x86_64.whl) |

Flash-Attention 2.7.2.post1

Packages for Flash-Attention 2.7.2.post1

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.0 | 11.8 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu118torch2.0-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 11.8 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu118torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 12.1 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu121torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 12.4 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu124torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 11.8 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu118torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 12.1 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu121torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 12.4 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu124torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 11.8 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu118torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 12.1 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu121torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 12.4 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu124torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 11.8 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu118torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.1 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu121torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.4 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu124torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 11.8 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu118torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.1 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu121torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.4 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu124torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.11 | 2.0 | 11.8 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu118torch2.0-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.1 | 11.8 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu118torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.1 | 12.1 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu121torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.1 | 12.4 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu124torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 11.8 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu118torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 12.1 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu121torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 12.4 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu124torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 11.8 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu118torch2.3-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 12.1 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu121torch2.3-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 12.4 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu124torch2.3-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 11.8 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu118torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 12.1 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu121torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 12.4 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu124torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 11.8 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu118torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.1 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu121torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.4 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu124torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.12 | 2.2 | 11.8 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu118torch2.2-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.2 | 12.1 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu121torch2.2-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.2 | 12.4 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu124torch2.2-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.3 | 11.8 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu118torch2.3-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.3 | 12.1 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu121torch2.3-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.3 | 12.4 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu124torch2.3-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 11.8 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu118torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 12.1 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu121torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 12.4 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu124torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 11.8 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu118torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.1 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu121torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.4 | [Download1(v0.0.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.3/flash_attn-2.7.2.post1+cu124torch2.5-cp312-cp312-linux_x86_64.whl) |

Flash-Attention 2.7.0.post2

Packages for Flash-Attention 2.7.0.post2

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.0 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu118torch2.0-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu118torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu121torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu124torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu118torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu121torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu124torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu118torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu121torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu124torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu118torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu121torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu124torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu118torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu121torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu124torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.11 | 2.0 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu118torch2.0-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.1 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu118torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.1 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu121torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.1 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu124torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu118torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu121torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu124torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu118torch2.3-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu121torch2.3-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu124torch2.3-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu118torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu121torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu124torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu118torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu121torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu124torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.12 | 2.2 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu118torch2.2-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.2 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu121torch2.2-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.2 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu124torch2.2-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.3 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu118torch2.3-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.3 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu121torch2.3-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.3 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu124torch2.3-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu118torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu121torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu124torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu118torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu121torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.7.0.post2+cu124torch2.5-cp312-cp312-linux_x86_64.whl) |
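To pick a row from these tables, the local environment's version strings have to be reduced to the "major.minor" form used in the Python, PyTorch, and CUDA columns. A minimal sketch, assuming the inputs come from `torch.__version__` (which may carry a `+cuXXX` local suffix) and `torch.version.cuda`; `table_key` is a hypothetical helper, and hard-coded strings keep the example self-contained:

```python
# Reduce a full version string to the "major.minor" key used in the tables,
# stripping any "+local" suffix such as "+cu121" first.
def table_key(version: str) -> str:
    """'2.4.1+cu121' -> '2.4'; '12.1' -> '12.1'."""
    return ".".join(version.split("+")[0].split(".")[:2])

# In practice: table_key(torch.__version__), table_key(torch.version.cuda)
print(table_key("2.4.1+cu121"), table_key("12.1"))
```

The resulting pair, together with the interpreter's own `major.minor` version, identifies exactly one row per Flash-Attention version.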

Flash-Attention 2.6.3

Packages for Flash-Attention 2.6.3

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.9 | 2.5 | 12.4 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.6.3+cu124torch2.5-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.5 | 12.6 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.6.3+cu126torch2.5-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.6 | 12.4 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.6.3+cu124torch2.6-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.6 | 12.6 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.6.3+cu126torch2.6-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.7 | 12.4 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.6.3+cu124torch2.7-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.7 | 12.6 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.6.3+cu126torch2.7-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.8 | 12.4 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.6.3+cu124torch2.8-cp39-cp39-linux_x86_64.whl) |
| 3.9 | 2.8 | 12.6 | [Download1(v0.4.16)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.16/flash_attn-2.6.3+cu126torch2.8-cp39-cp39-linux_x86_64.whl) |
| 3.10 | 2.0 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu118torch2.0-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu118torch2.0-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu118torch2.1-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu118torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu121torch2.1-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu121torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 12.4 | [Download1(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu124torch2.1-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu124torch2.1-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu124torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 12.6 | [Download1(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu126torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu118torch2.2-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu118torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu121torch2.2-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu121torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu124torch2.2-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu124torch2.2-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu124torch2.2-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu124torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu126torch2.2-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu126torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu118torch2.3-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu118torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu121torch2.3-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu121torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu124torch2.3-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu124torch2.3-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu124torch2.3-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu124torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu126torch2.3-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu126torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu118torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu118torch2.4-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu118torch2.4-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu118torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu121torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu121torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu124torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu124torch2.4-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu124torch2.4-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu124torch2.4-cp310-cp310-linux_x86_64.whl), [Download5(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu124torch2.4-cp310-cp310-linux_x86_64.whl), [Download6(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu124torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu126torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu126torch2.4-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu126torch2.4-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu126torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu128torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu118torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu118torch2.5-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu118torch2.5-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu118torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu121torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu121torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download5(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download6(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download7(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download8(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu124torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu126torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu126torch2.5-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu126torch2.5-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu126torch2.5-cp310-cp310-linux_x86_64.whl), [Download5(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu126torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu128torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu128torch2.5-cp310-cp310-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu128torch2.5-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu128torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu118torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu118torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu124torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu124torch2.6-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu124torch2.6-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu124torch2.6-cp310-cp310-linux_x86_64.whl), [Download5(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu124torch2.6-cp310-cp310-linux_x86_64.whl), [Download6(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu124torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu126torch2.6-cp310-cp310-linux_x86_64.whl),
[Download2(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu126torch2.6-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu126torch2.6-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu126torch2.6-cp310-cp310-linux_x86_64.whl), [Download5(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu126torch2.6-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.6 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu128torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu128torch2.6-cp310-cp310-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu128torch2.6-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu128torch2.6-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.6 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu129torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu129torch2.6-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.7 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu118torch2.7-cp310-cp310-linux_x86_64.whl), 
[Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu118torch2.7-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.7 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu124torch2.7-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu124torch2.7-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.7 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu126torch2.7-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu126torch2.7-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu126torch2.7-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.7 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu128torch2.7-cp310-cp310-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu128torch2.7-cp310-cp310-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu128torch2.7-cp310-cp310-linux_x86_64.whl), [Download4(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu128torch2.7-cp310-cp310-linux_x86_64.whl), [Download5(v0.1.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.1.0/flash_attn-2.6.3+cu128torch2.7-cp310-cp310-linux_x86_64.whl), 
[Download6(v0.0.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.9/flash_attn-2.6.3+cu128torch2.7-cp310-cp310-linux_x86_64.whl), [Download7(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu128torch2.7-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.8 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu124torch2.8-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu124torch2.8-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.8 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu126torch2.8-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.8 | 12.8 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu128torch2.8-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu128torch2.8-cp310-cp310-linux_x86_64.whl), [Download3(v0.2.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.1/flash_attn-2.6.3+cu128torch2.8-cp310-cp310-linux_x86_64.whl), [Download4(v0.2.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.0/flash_attn-2.6.3+cu128torch2.8-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.8 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu129torch2.8-cp310-cp310-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu129torch2.8-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.9 | 12.4 | 
[Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu124torch2.9-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.9 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu126torch2.9-cp310-cp310-linux_x86_64.whl), [Download2(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.6.3+cu126torch2.9-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.9 | 12.8 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.6.3+cu128torch2.9-cp310-cp310-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu128torch2.9-cp310-cp310-linux_x86_64.whl), [Download3(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.6.3+cu128torch2.9-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.9 | 13.0 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.6.3+cu130torch2.9-cp310-cp310-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu130torch2.9-cp310-cp310-linux_x86_64.whl), [Download3(v0.4.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.18/flash_attn-2.6.3+cu130torch2.9-cp310-cp310-linux_x86_64.whl) | | 3.11 | 2.0 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu118torch2.0-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu118torch2.0-cp311-cp311-linux_x86_64.whl), 
[Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu118torch2.0-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.1 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu118torch2.1-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu118torch2.1-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu118torch2.1-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.1 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu121torch2.1-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu121torch2.1-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu121torch2.1-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.1 | 12.4 | [Download1(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu124torch2.1-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu124torch2.1-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu124torch2.1-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu124torch2.1-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.1 | 12.6 | 
[Download1(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu126torch2.1-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.2 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu118torch2.2-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu118torch2.2-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu118torch2.2-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.2 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu121torch2.2-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu121torch2.2-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu121torch2.2-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.2 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu124torch2.2-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu124torch2.2-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu124torch2.2-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu124torch2.2-cp311-cp311-linux_x86_64.whl), 
[Download5(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu124torch2.2-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.2 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu126torch2.2-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu126torch2.2-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.3 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu118torch2.3-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu118torch2.3-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu118torch2.3-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.3 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu121torch2.3-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu121torch2.3-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu121torch2.3-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.3 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu124torch2.3-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu124torch2.3-cp311-cp311-linux_x86_64.whl), 
[Download3(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu124torch2.3-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu124torch2.3-cp311-cp311-linux_x86_64.whl), [Download5(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu124torch2.3-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.3 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu126torch2.3-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu126torch2.3-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.4 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu118torch2.4-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu118torch2.4-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu118torch2.4-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu118torch2.4-cp311-cp311-linux_x86_64.whl), [Download5(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu118torch2.4-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.4 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu121torch2.4-cp311-cp311-linux_x86_64.whl), 
[Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu121torch2.4-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu121torch2.4-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.4 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download5(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download6(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download7(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu124torch2.4-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.4 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu126torch2.4-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu126torch2.4-cp311-cp311-linux_x86_64.whl), 
[Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu126torch2.4-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu126torch2.4-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.4 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu128torch2.4-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.5 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu118torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu118torch2.5-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu118torch2.5-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu118torch2.5-cp311-cp311-linux_x86_64.whl), [Download5(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu118torch2.5-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.5 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu121torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu121torch2.5-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu121torch2.5-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.5 | 12.4 | 
[Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download5(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download6(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download7(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download8(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download9(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu124torch2.5-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.5 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu126torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu126torch2.5-cp311-cp311-linux_x86_64.whl), 
[Download3(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu126torch2.5-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu126torch2.5-cp311-cp311-linux_x86_64.whl), [Download5(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu126torch2.5-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.5 | 12.8 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu128torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu128torch2.5-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu128torch2.5-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.6 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu118torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu118torch2.6-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.6 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu124torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu124torch2.6-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu124torch2.6-cp311-cp311-linux_x86_64.whl), 
[Download4(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu124torch2.6-cp311-cp311-linux_x86_64.whl), [Download5(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu124torch2.6-cp311-cp311-linux_x86_64.whl), [Download6(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu124torch2.6-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.6 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu126torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu126torch2.6-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu126torch2.6-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu126torch2.6-cp311-cp311-linux_x86_64.whl), [Download5(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu126torch2.6-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.6 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu128torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu128torch2.6-cp311-cp311-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu128torch2.6-cp311-cp311-linux_x86_64.whl), 
[Download4(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu128torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.6 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu129torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu129torch2.6-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.7 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu118torch2.7-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu118torch2.7-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.7 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu124torch2.7-cp311-cp311-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu124torch2.7-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.7 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu126torch2.7-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu126torch2.7-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu126torch2.7-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.7 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu128torch2.7-cp311-cp311-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu128torch2.7-cp311-cp311-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu128torch2.7-cp311-cp311-linux_x86_64.whl), [Download4(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu128torch2.7-cp311-cp311-linux_x86_64.whl), [Download5(v0.1.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.1.0/flash_attn-2.6.3+cu128torch2.7-cp311-cp311-linux_x86_64.whl), [Download6(v0.0.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.9/flash_attn-2.6.3+cu128torch2.7-cp311-cp311-linux_x86_64.whl), [Download7(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu128torch2.7-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.8 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu124torch2.8-cp311-cp311-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu124torch2.8-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.8 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu126torch2.8-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.8 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu128torch2.8-cp311-cp311-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu128torch2.8-cp311-cp311-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu128torch2.8-cp311-cp311-linux_x86_64.whl), [Download4(v0.2.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.1/flash_attn-2.6.3+cu128torch2.8-cp311-cp311-linux_x86_64.whl), [Download5(v0.2.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.0/flash_attn-2.6.3+cu128torch2.8-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.8 | 12.9 | [Download1(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu129torch2.8-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.9 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu124torch2.9-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.9 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu126torch2.9-cp311-cp311-linux_x86_64.whl), [Download2(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.6.3+cu126torch2.9-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.9 | 12.8 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.6.3+cu128torch2.9-cp311-cp311-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu128torch2.9-cp311-cp311-linux_x86_64.whl), [Download3(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.6.3+cu128torch2.9-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.9 | 13.0 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.6.3+cu130torch2.9-cp311-cp311-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu130torch2.9-cp311-cp311-linux_x86_64.whl), [Download3(v0.4.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.18/flash_attn-2.6.3+cu130torch2.9-cp311-cp311-linux_x86_64.whl) |
| 3.12 | 2.2 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu118torch2.2-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu118torch2.2-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu118torch2.2-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.2 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu121torch2.2-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu121torch2.2-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu121torch2.2-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.2 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu124torch2.2-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu124torch2.2-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu124torch2.2-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu124torch2.2-cp312-cp312-linux_x86_64.whl), [Download5(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu124torch2.2-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.2 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu126torch2.2-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu126torch2.2-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.3 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu118torch2.3-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu118torch2.3-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu118torch2.3-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.3 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu121torch2.3-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu121torch2.3-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu121torch2.3-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.3 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu124torch2.3-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu124torch2.3-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu124torch2.3-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu124torch2.3-cp312-cp312-linux_x86_64.whl), [Download5(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu124torch2.3-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.3 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu126torch2.3-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu126torch2.3-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu118torch2.4-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu118torch2.4-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu118torch2.4-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu118torch2.4-cp312-cp312-linux_x86_64.whl), [Download5(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu118torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu121torch2.4-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu121torch2.4-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu121torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download5(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download6(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download7(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu124torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu126torch2.4-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu126torch2.4-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu126torch2.4-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu126torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu128torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu118torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu118torch2.5-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu118torch2.5-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu118torch2.5-cp312-cp312-linux_x86_64.whl), [Download5(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu118torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu121torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu121torch2.5-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu121torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download5(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download6(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download7(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.6.3+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download8(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.6.3+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download9(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.6.3+cu124torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu126torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu126torch2.5-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu126torch2.5-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu126torch2.5-cp312-cp312-linux_x86_64.whl), [Download5(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu126torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu128torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu128torch2.5-cp312-cp312-linux_x86_64.whl), [Download3(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu128torch2.5-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu128torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu118torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu118torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu124torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu124torch2.6-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu124torch2.6-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu124torch2.6-cp312-cp312-linux_x86_64.whl), [Download5(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu124torch2.6-cp312-cp312-linux_x86_64.whl), [Download6(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu124torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu126torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu126torch2.6-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu126torch2.6-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.6.3+cu126torch2.6-cp312-cp312-linux_x86_64.whl), [Download5(v0.0.5)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.5/flash_attn-2.6.3+cu126torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.8 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu128torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu128torch2.6-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu128torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.6 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu129torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu129torch2.6-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.7 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu118torch2.7-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu118torch2.7-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.7 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu124torch2.7-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu124torch2.7-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.7 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu126torch2.7-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.6.3+cu126torch2.7-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu126torch2.7-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.7 | 12.8 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu128torch2.7-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu128torch2.7-cp312-cp312-linux_x86_64.whl), [Download3(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu128torch2.7-cp312-cp312-linux_x86_64.whl), [Download4(v0.1.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.1.0/flash_attn-2.6.3+cu128torch2.7-cp312-cp312-linux_x86_64.whl), [Download5(v0.0.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.9/flash_attn-2.6.3+cu128torch2.7-cp312-cp312-linux_x86_64.whl), [Download6(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.6.3+cu128torch2.7-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.8 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu124torch2.8-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu124torch2.8-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.8 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu126torch2.8-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.8 | 12.8 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu128torch2.8-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu128torch2.8-cp312-cp312-linux_x86_64.whl), [Download3(v0.2.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.1/flash_attn-2.6.3+cu128torch2.8-cp312-cp312-linux_x86_64.whl), [Download4(v0.2.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.0/flash_attn-2.6.3+cu128torch2.8-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.8 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu129torch2.8-cp312-cp312-linux_x86_64.whl), [Download2(v0.3.14)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.6.3+cu129torch2.8-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.9 | 12.4 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu124torch2.9-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.9 | 12.6 | [Download1(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu126torch2.9-cp312-cp312-linux_x86_64.whl), [Download2(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.6.3+cu126torch2.9-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.9 | 12.8 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.6.3+cu128torch2.9-cp312-cp312-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu128torch2.9-cp312-cp312-linux_x86_64.whl), [Download3(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.6.3+cu128torch2.9-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.9 | 13.0 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.6.3+cu130torch2.9-cp312-cp312-linux_x86_64.whl), [Download2(v0.5.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.5.4/flash_attn-2.6.3+cu130torch2.9-cp312-cp312-linux_x86_64.whl), [Download3(v0.4.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.18/flash_attn-2.6.3+cu130torch2.9-cp312-cp312-linux_x86_64.whl) |
| 3.13 | 2.6 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu129torch2.6-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.7 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu128torch2.7-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.9 | 12.6 | [Download1(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.6.3+cu126torch2.9-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.9 | 12.8 | [Download1(v0.4.17)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.17/flash_attn-2.6.3+cu128torch2.9-cp313-cp313-linux_x86_64.whl) |
| 3.13 | 2.9 | 13.0 | [Download1(v0.4.18)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.18/flash_attn-2.6.3+cu130torch2.9-cp313-cp313-linux_x86_64.whl) |
| 3.14 | 2.9 | 13.0 | [Download1(v0.6.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.9/flash_attn-2.6.3+cu130torch2.9-cp314-cp314-linux_x86_64.whl) |
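
Once you have found the wheel matching your Python, PyTorch, and CUDA versions, you can install it directly from its release URL. A minimal sketch, using the Python 3.12 / PyTorch 2.9 / CUDA 13.0 wheel from the table above as an example:

```shell
# Install a prebuilt Flash-Attention wheel straight from the GitHub release.
# Replace the URL with the wheel that matches your own Python/PyTorch/CUDA setup.
pip install https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.6.3+cu130torch2.9-cp312-cp312-linux_x86_64.whl
```

Installing from the URL avoids compiling flash-attn from source, which is the point of these prebuilt wheels.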

Flash-Attention 2.5.9

Packages for Flash-Attention 2.5.9

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.0 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu118torch2.0-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu118torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu121torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 12.4 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu124torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu118torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu121torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu124torch2.2-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu124torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu126torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu118torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu121torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu124torch2.3-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu124torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu126torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu118torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu118torch2.4-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu118torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu121torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu124torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu124torch2.4-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu124torch2.4-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu124torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu126torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu126torch2.4-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu126torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu128torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu118torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu118torch2.5-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu118torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu121torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu124torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu126torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu126torch2.5-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu126torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu128torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu118torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu118torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu124torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu124torch2.6-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu124torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu126torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu126torch2.6-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu126torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu128torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu118torch2.7-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu118torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu126torch2.7-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu126torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu128torch2.7-cp310-cp310-linux_x86_64.whl), [Download2(v0.1.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.1.0/flash_attn-2.5.9+cu128torch2.7-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.9/flash_attn-2.5.9+cu128torch2.7-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu128torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.8 | 12.8 | [Download1(v0.2.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.1/flash_attn-2.5.9+cu128torch2.8-cp310-cp310-linux_x86_64.whl), [Download2(v0.2.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.0/flash_attn-2.5.9+cu128torch2.8-cp310-cp310-linux_x86_64.whl) |
| 3.11 | 2.0 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu118torch2.0-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu118torch2.0-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.1 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu118torch2.1-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu118torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.1 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu121torch2.1-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu121torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.1 | 12.4 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu124torch2.1-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu124torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 11.8 |
[Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu118torch2.2-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu118torch2.2-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.2 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu121torch2.2-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu121torch2.2-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.2 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu124torch2.2-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu124torch2.2-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu124torch2.2-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.2 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu126torch2.2-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.3 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu118torch2.3-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu118torch2.3-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.3 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu121torch2.3-cp311-cp311-linux_x86_64.whl), 
[Download2(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu121torch2.3-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.3 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu124torch2.3-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu124torch2.3-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu124torch2.3-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.3 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu126torch2.3-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.4 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu118torch2.4-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu118torch2.4-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu118torch2.4-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu118torch2.4-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.4 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu121torch2.4-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu121torch2.4-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.4 | 12.4 | 
[Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download5(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu124torch2.4-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.4 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu126torch2.4-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu126torch2.4-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu126torch2.4-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.4 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu128torch2.4-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.5 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu118torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu118torch2.5-cp311-cp311-linux_x86_64.whl), 
[Download3(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu118torch2.5-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu118torch2.5-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.5 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu121torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu121torch2.5-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.5 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download5(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu124torch2.5-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.5 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu126torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu126torch2.5-cp311-cp311-linux_x86_64.whl), 
[Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu126torch2.5-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.5 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu128torch2.5-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.6 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu118torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu118torch2.6-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.6 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu124torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu124torch2.6-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu124torch2.6-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.6 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu126torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu126torch2.6-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu126torch2.6-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.6 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu128torch2.6-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.7 | 11.8 | 
[Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu118torch2.7-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu118torch2.7-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.7 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu126torch2.7-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu126torch2.7-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.7 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu128torch2.7-cp311-cp311-linux_x86_64.whl), [Download2(v0.1.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.1.0/flash_attn-2.5.9+cu128torch2.7-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.9/flash_attn-2.5.9+cu128torch2.7-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu128torch2.7-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.8 | 12.8 | [Download1(v0.2.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.1/flash_attn-2.5.9+cu128torch2.8-cp311-cp311-linux_x86_64.whl), [Download2(v0.2.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.0/flash_attn-2.5.9+cu128torch2.8-cp311-cp311-linux_x86_64.whl) | | 3.12 | 2.2 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu118torch2.2-cp312-cp312-linux_x86_64.whl), 
[Download2(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu118torch2.2-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.2 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu121torch2.2-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu121torch2.2-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.2 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu124torch2.2-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu124torch2.2-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu124torch2.2-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.2 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu126torch2.2-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.3 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu118torch2.3-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu118torch2.3-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.3 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu121torch2.3-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu121torch2.3-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.3 | 12.4 | 
[Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu124torch2.3-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu124torch2.3-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu124torch2.3-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.3 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu126torch2.3-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.4 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu118torch2.4-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu118torch2.4-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu118torch2.4-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu118torch2.4-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.4 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu121torch2.4-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu121torch2.4-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.4 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu124torch2.4-cp312-cp312-linux_x86_64.whl), 
[Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download5(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu124torch2.4-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.4 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu126torch2.4-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu126torch2.4-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu126torch2.4-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.4 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu128torch2.4-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.5 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu118torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu118torch2.5-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu118torch2.5-cp312-cp312-linux_x86_64.whl), 
[Download4(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu118torch2.5-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.5 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu121torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu121torch2.5-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.5 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.9+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download5(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.9+cu124torch2.5-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.5 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu126torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu126torch2.5-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu126torch2.5-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.5 | 12.8 | 
[Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu128torch2.5-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.6 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu118torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu118torch2.6-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.6 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu124torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu124torch2.6-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu124torch2.6-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.6 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu126torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu126torch2.6-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.5.9+cu126torch2.6-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.6 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu128torch2.6-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.7 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu118torch2.7-cp312-cp312-linux_x86_64.whl), 
[Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu118torch2.7-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.7 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.5.9+cu126torch2.7-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu126torch2.7-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.7 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu128torch2.7-cp312-cp312-linux_x86_64.whl), [Download2(v0.1.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.1.0/flash_attn-2.5.9+cu128torch2.7-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.9/flash_attn-2.5.9+cu128torch2.7-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.5.9+cu128torch2.7-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.8 | 12.8 | [Download1(v0.2.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.1/flash_attn-2.5.9+cu128torch2.8-cp312-cp312-linux_x86_64.whl), [Download2(v0.2.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.0/flash_attn-2.5.9+cu128torch2.8-cp312-cp312-linux_x86_64.whl) |

Flash-Attention 2.5.6

Packages for Flash-Attention 2.5.6

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.0 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu118torch2.0-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu118torch2.0-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu118torch2.1-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu118torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu121torch2.1-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu121torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu124torch2.1-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu124torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu118torch2.2-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu118torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu121torch2.2-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu121torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu124torch2.2-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu124torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu118torch2.3-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu118torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu121torch2.3-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu121torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu124torch2.3-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu124torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu118torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu118torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu121torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu121torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu124torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu124torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu118torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu118torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu121torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu121torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu124torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.11 | 2.0 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu118torch2.0-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu118torch2.0-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu118torch2.0-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.1 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu118torch2.1-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu118torch2.1-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu118torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.1 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu121torch2.1-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu121torch2.1-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu121torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.1 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu124torch2.1-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu124torch2.1-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu124torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu118torch2.2-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu118torch2.2-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu118torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu121torch2.2-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu121torch2.2-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu121torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu124torch2.2-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu124torch2.2-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu124torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu118torch2.3-cp311-cp311-linux_x86_64.whl),
[Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu118torch2.3-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu118torch2.3-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.3 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu121torch2.3-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu121torch2.3-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu121torch2.3-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.3 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu124torch2.3-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu124torch2.3-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu124torch2.3-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.4 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu118torch2.4-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu118torch2.4-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu118torch2.4-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.4 | 12.1 | 
[Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu121torch2.4-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu121torch2.4-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu121torch2.4-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.4 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu124torch2.4-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.5 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu118torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu118torch2.5-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu118torch2.5-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.5 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu121torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu121torch2.5-cp311-cp311-linux_x86_64.whl), 
[Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu121torch2.5-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.5 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu124torch2.5-cp311-cp311-linux_x86_64.whl) | | 3.12 | 2.2 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu118torch2.2-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu118torch2.2-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu118torch2.2-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.2 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu121torch2.2-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu121torch2.2-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu121torch2.2-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.2 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu124torch2.2-cp312-cp312-linux_x86_64.whl), 
[Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu124torch2.2-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu124torch2.2-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.3 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu118torch2.3-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu118torch2.3-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu118torch2.3-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.3 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu121torch2.3-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu121torch2.3-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu121torch2.3-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.3 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu124torch2.3-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu124torch2.3-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu124torch2.3-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.4 | 11.8 | 
[Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu118torch2.4-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu118torch2.4-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu118torch2.4-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.4 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu121torch2.4-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu121torch2.4-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu121torch2.4-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.4 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu124torch2.4-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.5 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu118torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu118torch2.5-cp312-cp312-linux_x86_64.whl), 
[Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu118torch2.5-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.5 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu121torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu121torch2.5-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu121torch2.5-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.5 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.5.6+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.5.6+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.5.6+cu124torch2.5-cp312-cp312-linux_x86_64.whl) |
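Every wheel filename in the tables above follows the same pattern: `flash_attn-{version}+cu{CUDA}torch{PyTorch}-cp{Py}-cp{Py}-linux_x86_64.whl`. As a minimal sketch, the filename for a given combination can be assembled programmatically (the helper name is illustrative, and the release tag hosting a given wheel still has to be looked up in the table):

```python
def wheel_filename(flash_attn: str, cuda: str, torch: str, py: str) -> str:
    """Build a flash-attn wheel filename from version strings.

    cuda "12.4" -> "cu124", py "3.12" -> "cp312"; the PyTorch
    version keeps its dot ("torch2.5").
    """
    cu = "cu" + cuda.replace(".", "")
    cp = "cp" + py.replace(".", "")
    return (
        f"flash_attn-{flash_attn}+{cu}torch{torch}-"
        f"{cp}-{cp}-linux_x86_64.whl"
    )

# Matches the Python 3.12 / PyTorch 2.5 / CUDA 12.4 row above:
print(wheel_filename("2.5.6", "12.4", "2.5", "3.12"))
# flash_attn-2.5.6+cu124torch2.5-cp312-cp312-linux_x86_64.whl
```

The full download URL is this filename appended to `https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/{release_tag}/`.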

Flash-Attention 2.4.3

Packages for Flash-Attention 2.4.3

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.0 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu118torch2.0-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu118torch2.0-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu118torch2.1-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu118torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu121torch2.1-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu121torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.1 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu124torch2.1-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu124torch2.1-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu118torch2.2-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu118torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu121torch2.2-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu121torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu124torch2.2-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu124torch2.2-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu124torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.2 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu126torch2.2-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu118torch2.3-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu118torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu121torch2.3-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu121torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu124torch2.3-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu124torch2.3-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu124torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.3 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu126torch2.3-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu118torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu118torch2.4-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu118torch2.4-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu118torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu121torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu121torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu124torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu124torch2.4-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu124torch2.4-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu124torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu126torch2.4-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu126torch2.4-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu126torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.4 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu128torch2.4-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu118torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu118torch2.5-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu118torch2.5-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu118torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu121torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu121torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu124torch2.5-cp310-cp310-linux_x86_64.whl), [Download5(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu124torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu126torch2.5-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu126torch2.5-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu126torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.5 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu128torch2.5-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu118torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu118torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu124torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu124torch2.6-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu124torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu126torch2.6-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu126torch2.6-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu126torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.6 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu128torch2.6-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu118torch2.7-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu118torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu126torch2.7-cp310-cp310-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu126torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.7 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.4.3+cu128torch2.7-cp310-cp310-linux_x86_64.whl), [Download2(v0.1.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.1.0/flash_attn-2.4.3+cu128torch2.7-cp310-cp310-linux_x86_64.whl), [Download3(v0.0.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.9/flash_attn-2.4.3+cu128torch2.7-cp310-cp310-linux_x86_64.whl), [Download4(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu128torch2.7-cp310-cp310-linux_x86_64.whl) |
| 3.10 | 2.8 | 12.8 | [Download1(v0.2.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.1/flash_attn-2.4.3+cu128torch2.8-cp310-cp310-linux_x86_64.whl), [Download2(v0.2.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.0/flash_attn-2.4.3+cu128torch2.8-cp310-cp310-linux_x86_64.whl) |
| 3.11 | 2.0 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu118torch2.0-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu118torch2.0-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu118torch2.0-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.1 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu118torch2.1-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu118torch2.1-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu118torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.1 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu121torch2.1-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu121torch2.1-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu121torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.1 | 12.4 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu124torch2.1-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu124torch2.1-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu124torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu118torch2.2-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu118torch2.2-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu118torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu121torch2.2-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu121torch2.2-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu121torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu124torch2.2-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu124torch2.2-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu124torch2.2-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu124torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu126torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu118torch2.3-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu118torch2.3-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu118torch2.3-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu121torch2.3-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu121torch2.3-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu121torch2.3-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu124torch2.3-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu124torch2.3-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu124torch2.3-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu124torch2.3-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu126torch2.3-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu118torch2.4-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu118torch2.4-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu118torch2.4-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu118torch2.4-cp311-cp311-linux_x86_64.whl), [Download5(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu118torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu121torch2.4-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu121torch2.4-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu121torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download5(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu124torch2.4-cp311-cp311-linux_x86_64.whl), [Download6(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu124torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 12.6 | 
[Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu126torch2.4-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu126torch2.4-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu126torch2.4-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.4 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu128torch2.4-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.5 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu118torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu118torch2.5-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu118torch2.5-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu118torch2.5-cp311-cp311-linux_x86_64.whl), [Download5(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu118torch2.5-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.5 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu121torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu121torch2.5-cp311-cp311-linux_x86_64.whl), 
[Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu121torch2.5-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.5 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download5(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu124torch2.5-cp311-cp311-linux_x86_64.whl), [Download6(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu124torch2.5-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.5 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu126torch2.5-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu126torch2.5-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu126torch2.5-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.5 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu128torch2.5-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.6 | 11.8 | 
[Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu118torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu118torch2.6-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.6 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu124torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu124torch2.6-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu124torch2.6-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.6 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu126torch2.6-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu126torch2.6-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu126torch2.6-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.6 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu128torch2.6-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.7 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu118torch2.7-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu118torch2.7-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.7 | 12.6 | 
[Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu126torch2.7-cp311-cp311-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu126torch2.7-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.7 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.4.3+cu128torch2.7-cp311-cp311-linux_x86_64.whl), [Download2(v0.1.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.1.0/flash_attn-2.4.3+cu128torch2.7-cp311-cp311-linux_x86_64.whl), [Download3(v0.0.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.9/flash_attn-2.4.3+cu128torch2.7-cp311-cp311-linux_x86_64.whl), [Download4(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu128torch2.7-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.8 | 12.8 | [Download1(v0.2.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.1/flash_attn-2.4.3+cu128torch2.8-cp311-cp311-linux_x86_64.whl), [Download2(v0.2.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.0/flash_attn-2.4.3+cu128torch2.8-cp311-cp311-linux_x86_64.whl) | | 3.12 | 2.2 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu118torch2.2-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu118torch2.2-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu118torch2.2-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.2 | 12.1 | 
[Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu121torch2.2-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu121torch2.2-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu121torch2.2-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.2 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu124torch2.2-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu124torch2.2-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu124torch2.2-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu124torch2.2-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.2 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu126torch2.2-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.3 | 11.8 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu118torch2.3-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu118torch2.3-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu118torch2.3-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.3 | 12.1 | 
[Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu121torch2.3-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu121torch2.3-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu121torch2.3-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.3 | 12.4 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu124torch2.3-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu124torch2.3-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu124torch2.3-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu124torch2.3-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.3 | 12.6 | [Download1(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu126torch2.3-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.4 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu118torch2.4-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu118torch2.4-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu118torch2.4-cp312-cp312-linux_x86_64.whl), 
[Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu118torch2.4-cp312-cp312-linux_x86_64.whl), [Download5(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu118torch2.4-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.4 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu121torch2.4-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu121torch2.4-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu121torch2.4-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.4 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download5(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu124torch2.4-cp312-cp312-linux_x86_64.whl), [Download6(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu124torch2.4-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.4 | 12.6 | 
[Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu126torch2.4-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu126torch2.4-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu126torch2.4-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.4 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu128torch2.4-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.5 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu118torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu118torch2.5-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu118torch2.5-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu118torch2.5-cp312-cp312-linux_x86_64.whl), [Download5(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu118torch2.5-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.5 | 12.1 | [Download1(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu121torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu121torch2.5-cp312-cp312-linux_x86_64.whl), 
[Download3(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu121torch2.5-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.5 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/flash_attn-2.4.3+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download5(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-2.4.3+cu124torch2.5-cp312-cp312-linux_x86_64.whl), [Download6(v0.0.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.0/flash_attn-2.4.3+cu124torch2.5-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.5 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu126torch2.5-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu126torch2.5-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu126torch2.5-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.5 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu128torch2.5-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.6 | 11.8 | 
[Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu118torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu118torch2.6-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.6 | 12.4 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu124torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu124torch2.6-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu124torch2.6-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.6 | 12.6 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu126torch2.6-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu126torch2.6-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.6)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.6/flash_attn-2.4.3+cu126torch2.6-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.6 | 12.8 | [Download1(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu128torch2.6-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.7 | 11.8 | [Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu118torch2.7-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu118torch2.7-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.7 | 12.6 | 
[Download1(v0.0.8)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.4.3+cu126torch2.7-cp312-cp312-linux_x86_64.whl), [Download2(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu126torch2.7-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.7 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.4.3+cu128torch2.7-cp312-cp312-linux_x86_64.whl), [Download2(v0.1.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.1.0/flash_attn-2.4.3+cu128torch2.7-cp312-cp312-linux_x86_64.whl), [Download3(v0.0.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.9/flash_attn-2.4.3+cu128torch2.7-cp312-cp312-linux_x86_64.whl), [Download4(v0.0.7)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.7/flash_attn-2.4.3+cu128torch2.7-cp312-cp312-linux_x86_64.whl) | | 3.12 | 2.8 | 12.8 | [Download1(v0.2.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.1/flash_attn-2.4.3+cu128torch2.8-cp312-cp312-linux_x86_64.whl), [Download2(v0.2.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.2.0/flash_attn-2.4.3+cu128torch2.8-cp312-cp312-linux_x86_64.whl) |

Flash-Attention 1.0.9

Packages for Flash-Attention 1.0.9 | Python | PyTorch | CUDA | package | | ------ | ------- | ---- | ------- | | 3.10 | 2.0 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu118torch2.0-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.1 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu118torch2.1-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.1 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu121torch2.1-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.1 | 12.4 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu124torch2.1-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.2 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu118torch2.2-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.2 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu121torch2.2-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.2 | 12.4 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu124torch2.2-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.3 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu118torch2.3-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.3 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu121torch2.3-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.3 | 12.4 | 
[Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu124torch2.3-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.4 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu118torch2.4-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.4 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu121torch2.4-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.4 | 12.4 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu124torch2.4-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.5 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu118torch2.5-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.5 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu121torch2.5-cp310-cp310-linux_x86_64.whl) | | 3.10 | 2.5 | 12.4 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu124torch2.5-cp310-cp310-linux_x86_64.whl) | | 3.11 | 2.0 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu118torch2.0-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.1 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu118torch2.1-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.1 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu121torch2.1-cp311-cp311-linux_x86_64.whl) | | 3.11 | 2.1 | 12.4 | 
[Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu124torch2.1-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu118torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu121torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.2 | 12.4 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu124torch2.2-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu118torch2.3-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu121torch2.3-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.3 | 12.4 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu124torch2.3-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu118torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu121torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.4 | 12.4 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu124torch2.4-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu118torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu121torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.11 | 2.5 | 12.4 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu124torch2.5-cp311-cp311-linux_x86_64.whl) |
| 3.12 | 2.2 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu118torch2.2-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.2 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu121torch2.2-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.2 | 12.4 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu124torch2.2-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.3 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu118torch2.3-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.3 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu121torch2.3-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.3 | 12.4 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu124torch2.3-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu118torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu121torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.4 | 12.4 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu124torch2.4-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 11.8 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu118torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.1 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu121torch2.5-cp312-cp312-linux_x86_64.whl) |
| 3.12 | 2.5 | 12.4 | [Download1(v0.0.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu124torch2.5-cp312-cp312-linux_x86_64.whl) |
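Any wheel in the tables can be installed directly from its release URL with pip. A minimal sketch (the URL below is the Python 3.12 / PyTorch 2.5 / CUDA 12.4 Linux x86_64 wheel from the table above; substitute the link matching your own environment):

```shell
# Install a prebuilt Flash-Attention wheel directly from the GitHub release.
# Replace the URL with the one matching your Python/PyTorch/CUDA combination.
pip install https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.1/flash_attn-1.0.9+cu124torch2.5-cp312-cp312-linux_x86_64.whl
```

Installing from the URL skips the long local CUDA compilation that `pip install flash-attn` would otherwise trigger.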

🐧 Linux arm64

Flash-Attention 2.8.3

Packages for Flash-Attention 2.8.3

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.5 | 12.4 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu124torch2.5-cp310-cp310-linux_aarch64.whl) |
| 3.10 | 2.5 | 12.8 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu128torch2.5-cp310-cp310-linux_aarch64.whl) |
| 3.10 | 2.6 | 12.8 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu128torch2.6-cp310-cp310-linux_aarch64.whl) |
| 3.10 | 2.7 | 12.8 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu128torch2.7-cp310-cp310-linux_aarch64.whl) |
| 3.10 | 2.9 | 12.4 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu124torch2.9-cp310-cp310-linux_aarch64.whl) |
| 3.10 | 2.9 | 12.8 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu128torch2.9-cp310-cp310-linux_aarch64.whl) |
| 3.10 | 2.9 | 13.0 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu130torch2.9-cp310-cp310-linux_aarch64.whl) |
| 3.11 | 2.5 | 12.4 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu124torch2.5-cp311-cp311-linux_aarch64.whl) |
| 3.11 | 2.5 | 12.8 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu128torch2.5-cp311-cp311-linux_aarch64.whl) |
| 3.11 | 2.6 | 12.8 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu128torch2.6-cp311-cp311-linux_aarch64.whl) |
| 3.11 | 2.7 | 12.8 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu128torch2.7-cp311-cp311-linux_aarch64.whl) |
| 3.11 | 2.9 | 12.4 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu124torch2.9-cp311-cp311-linux_aarch64.whl) |
| 3.11 | 2.9 | 12.8 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu128torch2.9-cp311-cp311-linux_aarch64.whl) |
| 3.11 | 2.9 | 13.0 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu130torch2.9-cp311-cp311-linux_aarch64.whl) |
| 3.12 | 2.5 | 12.4 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu124torch2.5-cp312-cp312-linux_aarch64.whl) |
| 3.12 | 2.5 | 12.8 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu128torch2.5-cp312-cp312-linux_aarch64.whl) |
| 3.12 | 2.6 | 12.8 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu128torch2.6-cp312-cp312-linux_aarch64.whl) |
| 3.12 | 2.7 | 12.8 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu128torch2.7-cp312-cp312-linux_aarch64.whl) |
| 3.12 | 2.9 | 12.4 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu124torch2.9-cp312-cp312-linux_aarch64.whl) |
| 3.12 | 2.9 | 12.8 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu128torch2.9-cp312-cp312-linux_aarch64.whl) |
| 3.12 | 2.9 | 13.0 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.8.3+cu130torch2.9-cp312-cp312-linux_aarch64.whl) |

Flash-Attention 2.7.4

Packages for Flash-Attention 2.7.4

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.5 | 12.4 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.7.4+cu124torch2.5-cp310-cp310-linux_aarch64.whl) |
| 3.10 | 2.9 | 12.4 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.7.4+cu124torch2.9-cp310-cp310-linux_aarch64.whl) |
| 3.11 | 2.5 | 12.4 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.7.4+cu124torch2.5-cp311-cp311-linux_aarch64.whl) |
| 3.11 | 2.9 | 12.4 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.7.4+cu124torch2.9-cp311-cp311-linux_aarch64.whl) |
| 3.12 | 2.5 | 12.4 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.7.4+cu124torch2.5-cp312-cp312-linux_aarch64.whl) |
| 3.12 | 2.9 | 12.4 | [Download1(v0.6.4)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.4/flash_attn-2.7.4+cu124torch2.9-cp312-cp312-linux_aarch64.whl) |

Flash-Attention 2.6.3

Packages for Flash-Attention 2.6.3

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.5 | 12.4 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu124torch2.5-cp310-cp310-linux_aarch64.whl) |
| 3.10 | 2.5 | 12.8 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu128torch2.5-cp310-cp310-linux_aarch64.whl) |
| 3.10 | 2.6 | 12.8 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu128torch2.6-cp310-cp310-linux_aarch64.whl) |
| 3.10 | 2.7 | 12.8 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu128torch2.7-cp310-cp310-linux_aarch64.whl) |
| 3.10 | 2.9 | 12.4 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu124torch2.9-cp310-cp310-linux_aarch64.whl) |
| 3.10 | 2.9 | 12.8 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu128torch2.9-cp310-cp310-linux_aarch64.whl) |
| 3.10 | 2.9 | 13.0 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu130torch2.9-cp310-cp310-linux_aarch64.whl) |
| 3.11 | 2.5 | 12.4 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu124torch2.5-cp311-cp311-linux_aarch64.whl) |
| 3.11 | 2.5 | 12.8 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu128torch2.5-cp311-cp311-linux_aarch64.whl) |
| 3.11 | 2.6 | 12.8 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu128torch2.6-cp311-cp311-linux_aarch64.whl) |
| 3.11 | 2.7 | 12.8 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu128torch2.7-cp311-cp311-linux_aarch64.whl) |
| 3.11 | 2.9 | 12.4 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu124torch2.9-cp311-cp311-linux_aarch64.whl) |
| 3.11 | 2.9 | 12.8 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu128torch2.9-cp311-cp311-linux_aarch64.whl) |
| 3.11 | 2.9 | 13.0 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu130torch2.9-cp311-cp311-linux_aarch64.whl) |
| 3.12 | 2.5 | 12.4 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu124torch2.5-cp312-cp312-linux_aarch64.whl) |
| 3.12 | 2.5 | 12.8 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu128torch2.5-cp312-cp312-linux_aarch64.whl) |
| 3.12 | 2.6 | 12.8 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu128torch2.6-cp312-cp312-linux_aarch64.whl) |
| 3.12 | 2.7 | 12.8 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu128torch2.7-cp312-cp312-linux_aarch64.whl) |
| 3.12 | 2.9 | 12.4 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu124torch2.9-cp312-cp312-linux_aarch64.whl) |
| 3.12 | 2.9 | 12.8 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu128torch2.9-cp312-cp312-linux_aarch64.whl) |
| 3.12 | 2.9 | 13.0 | [Download1(v0.6.3)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.6.3/flash_attn-2.6.3+cu130torch2.9-cp312-cp312-linux_aarch64.whl) |

🐧 Manylinux 2_24 x86_64

Flash-Attention 2.8.3

Packages for Flash-Attention 2.8.3

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.6 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.8.3+cu128torch2.6-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.10 | 2.9 | 12.8 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.8.3+cu128torch2.9-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.10 | 2.9 | 13.0 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.8.3+cu130torch2.9-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.11 | 2.9 | 12.8 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.8.3+cu128torch2.9-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.11 | 2.9 | 13.0 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.8.3+cu130torch2.9-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.12 | 2.6 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.8.3+cu128torch2.6-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.12 | 2.7 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.8.3+cu128torch2.7-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.12 | 2.9 | 12.8 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.8.3+cu128torch2.9-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.13 | 2.8 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.8.3+cu128torch2.8-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.13 | 2.8 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.8.3+cu129torch2.8-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |

Flash-Attention 2.7.4

Packages for Flash-Attention 2.7.4

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.6 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.6-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.10 | 2.6 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu129torch2.6-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.10 | 2.7 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.7-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.10 | 2.8 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu129torch2.8-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.10 | 2.9 | 12.8 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.7.4+cu128torch2.9-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.10 | 2.9 | 13.0 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.7.4+cu130torch2.9-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.11 | 2.6 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.6-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.11 | 2.7 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.7-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.11 | 2.9 | 12.8 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.7.4+cu128torch2.9-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.11 | 2.9 | 13.0 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.7.4+cu130torch2.9-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.12 | 2.7 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.7-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.12 | 2.8 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.8-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.12 | 2.8 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu129torch2.8-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.12 | 2.9 | 12.8 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.7.4+cu128torch2.9-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.12 | 2.9 | 13.0 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.7.4+cu130torch2.9-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.13 | 2.6 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.6-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.13 | 2.8 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.8-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.13 | 2.8 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu129torch2.8-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |

Flash-Attention 2.6.3

Packages for Flash-Attention 2.6.3

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.6 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu128torch2.6-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.10 | 2.6 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu129torch2.6-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.10 | 2.7 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu128torch2.7-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.10 | 2.8 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu129torch2.8-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.10 | 2.9 | 12.8 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.6.3+cu128torch2.9-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.10 | 2.9 | 13.0 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.6.3+cu130torch2.9-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.11 | 2.6 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu128torch2.6-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.11 | 2.6 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu129torch2.6-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.11 | 2.7 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu128torch2.7-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.11 | 2.8 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu128torch2.8-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.11 | 2.9 | 12.8 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.6.3+cu128torch2.9-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.11 | 2.9 | 13.0 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.6.3+cu130torch2.9-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.12 | 2.6 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu129torch2.6-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.12 | 2.8 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu129torch2.8-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.12 | 2.9 | 12.8 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.6.3+cu128torch2.9-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.12 | 2.9 | 13.0 | [Download1(v0.7.0)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.6.3+cu130torch2.9-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.13 | 2.6 | 12.9 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu129torch2.6-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |
| 3.13 | 2.7 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu128torch2.7-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl) |

🐧 Manylinux2014 x86_64

Flash-Attention 2.8.3

Packages for Flash-Attention 2.8.3

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.5 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.8.3+cu128torch2.5-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl) |
| 3.11 | 2.5 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.8.3+cu128torch2.5-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl) |
| 3.12 | 2.5 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.8.3+cu128torch2.5-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl) |
| 3.13 | 2.5 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.8.3+cu128torch2.5-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl) |

Flash-Attention 2.7.4

Packages for Flash-Attention 2.7.4

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.5 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.5-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl) |
| 3.11 | 2.5 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.7.4+cu128torch2.5-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl) |

Flash-Attention 2.6.3

Packages for Flash-Attention 2.6.3

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.5 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu128torch2.5-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl) |
| 3.12 | 2.5 | 12.8 | [Download1(v0.7.2)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.2/flash_attn-2.6.3+cu128torch2.5-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl) |

🪟 Windows x86_64

Flash-Attention 2.8.3

Packages for Flash-Attention 2.8.3

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.5 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.5-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.5 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.5-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.6 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.6-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.6 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.6-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.7 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.7-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.7 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.7-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.8 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.8-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.8 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.8-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.9 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.9-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.9 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.9-cp310-cp310-win_amd64.whl) |
| 3.11 | 2.5 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.5-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.5 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.5-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.6 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.6-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.6 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.6-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.7 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.7-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.7 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.7-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.8 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.8-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.8 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.8-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.9 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.9-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.9 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.9-cp311-cp311-win_amd64.whl), [Download2(v0.4.15)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.15/flash_attn-2.8.3+cu126torch2.9-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.9 | 13.0 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu130torch2.9-cp311-cp311-win_amd64.whl) |
| 3.12 | 2.5 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.5-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.5 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.5-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.6 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.6-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.6 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.6-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.7 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.7-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.7 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.7-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.8 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.8-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.8 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.8-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.9 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.9-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.9 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.9-cp312-cp312-win_amd64.whl), [Download2(v0.4.15)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.15/flash_attn-2.8.3+cu126torch2.9-cp312-cp312-win_amd64.whl) |
| 3.13 | 2.6 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.6-cp313-cp313-win_amd64.whl) |
| 3.13 | 2.6 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.6-cp313-cp313-win_amd64.whl) |
| 3.13 | 2.7 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.7-cp313-cp313-win_amd64.whl) |
| 3.13 | 2.7 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.7-cp313-cp313-win_amd64.whl) |
| 3.13 | 2.8 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.8-cp313-cp313-win_amd64.whl) |
| 3.13 | 2.8 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.8-cp313-cp313-win_amd64.whl) |
| 3.13 | 2.9 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu124torch2.9-cp313-cp313-win_amd64.whl) |
| 3.13 | 2.9 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu126torch2.9-cp313-cp313-win_amd64.whl), [Download2(v0.4.15)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.15/flash_attn-2.8.3+cu126torch2.9-cp313-cp313-win_amd64.whl) |
| 3.13 | 2.9 | 13.0 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.8.3+cu130torch2.9-cp313-cp313-win_amd64.whl) |
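To find the right row in these tables quickly, the local version label (e.g. `cu126torch2.9`) and CPython tag (e.g. `cp311`) used in every wheel filename can be derived from your environment's version strings. A small illustrative helper (`wheel_tags` is hypothetical, not part of this repository):

```python
def wheel_tags(py_version: str, torch_version: str, cuda_version: str) -> tuple[str, str]:
    """Return (local_label, cpython_tag) matching the wheel filenames above,
    e.g. ('cu126torch2.9', 'cp311')."""
    py_major, py_minor = py_version.split(".")[:2]
    torch_mm = ".".join(torch_version.split(".")[:2])  # '2.9.0' -> '2.9'
    cu = "".join(cuda_version.split(".")[:2])          # '12.6'  -> '126'
    return f"cu{cu}torch{torch_mm}", f"cp{py_major}{py_minor}"

if __name__ == "__main__":
    # Example: Python 3.11, PyTorch 2.9.0, CUDA 12.6
    print(wheel_tags("3.11", "2.9.0", "12.6"))  # ('cu126torch2.9', 'cp311')
```

At runtime the inputs would typically come from `platform.python_version()`, `torch.__version__`, and `torch.version.cuda`.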

Flash-Attention 2.8.2

Packages for Flash-Attention 2.8.2

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.7 | 12.8 | [Download1(v0.4.10)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.10/flash_attn-2.8.2+cu128torch2.7-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.8 | 12.8 | [Download1(v0.4.10)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.10/flash_attn-2.8.2+cu128torch2.8-cp310-cp310-win_amd64.whl) |
| 3.11 | 2.7 | 12.8 | [Download1(v0.4.10)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.10/flash_attn-2.8.2+cu128torch2.7-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.8 | 12.8 | [Download1(v0.4.10)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.10/flash_attn-2.8.2+cu128torch2.8-cp311-cp311-win_amd64.whl) |
| 3.12 | 2.7 | 12.8 | [Download1(v0.4.10)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.10/flash_attn-2.8.2+cu128torch2.7-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.8 | 12.8 | [Download1(v0.4.10)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.10/flash_attn-2.8.2+cu128torch2.8-cp312-cp312-win_amd64.whl) |
| 3.13 | 2.6 | 12.4 | [Download1(v0.4.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.12/flash_attn-2.8.2+cu124torch2.6-cp313-cp313-win_amd64.whl) |
| 3.13 | 2.7 | 12.4 | [Download1(v0.4.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.12/flash_attn-2.8.2+cu124torch2.7-cp313-cp313-win_amd64.whl) |
| 3.13 | 2.7 | 12.6 | [Download1(v0.4.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.12/flash_attn-2.8.2+cu126torch2.7-cp313-cp313-win_amd64.whl) |
| 3.13 | 2.8 | 12.4 | [Download1(v0.4.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.12/flash_attn-2.8.2+cu124torch2.8-cp313-cp313-win_amd64.whl) |
| 3.13 | 2.8 | 12.6 | [Download1(v0.4.12)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.12/flash_attn-2.8.2+cu126torch2.8-cp313-cp313-win_amd64.whl) |
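The wheel filenames in these tables follow one naming convention: `flash_attn-<version>+cu<CUDA>torch<torch>-cp<py>-cp<py>-<platform>.whl`. As a minimal sketch (the `wheel_filename` helper is illustrative, not part of this repository), you can construct the expected asset name for your environment and append it to the release download URL:

```python
def wheel_filename(flash, cuda, torch, python, platform_tag="win_amd64"):
    """Build the expected wheel filename for a release asset.

    Naming convention inferred from the assets listed above:
    flash_attn-<flash>+cu<CUDA>torch<torch>-cp<py>-cp<py>-<platform>.whl
    """
    cu = "cu" + cuda.replace(".", "")    # "12.8" -> "cu128"
    py = "cp" + python.replace(".", "")  # "3.10" -> "cp310"
    return f"flash_attn-{flash}+{cu}torch{torch}-{py}-{py}-{platform_tag}.whl"


# Example: Flash-Attention 2.8.2 for Python 3.10 / PyTorch 2.7 / CUDA 12.8
name = wheel_filename("2.8.2", "12.8", "2.7", "3.10")
url = ("https://github.com/mjun0812/flash-attention-prebuild-wheels/"
       f"releases/download/v0.4.10/{name}")
```

The resulting URL can be passed directly to `pip install <url>`. Note that the release tag (`v0.4.10` above) varies per combination, so check the matching `Download` link in the tables rather than guessing it.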

Flash-Attention 2.7.4.post1

Packages for Flash-Attention 2.7.4.post1

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.5 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu126torch2.5-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.6 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu124torch2.6-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.6 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu126torch2.6-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.7 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu124torch2.7-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.8 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu124torch2.8-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.9 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu124torch2.9-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.9 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu126torch2.9-cp310-cp310-win_amd64.whl) |
| 3.11 | 2.5 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu126torch2.5-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.6 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu124torch2.6-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.6 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu126torch2.6-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.7 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu124torch2.7-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.7 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu126torch2.7-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.8 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu126torch2.8-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.9 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu126torch2.9-cp311-cp311-win_amd64.whl) |
| 3.12 | 2.6 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu124torch2.6-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.6 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu126torch2.6-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.7 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu124torch2.7-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.7 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu126torch2.7-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.8 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu126torch2.8-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.9 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu124torch2.9-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.9 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu126torch2.9-cp312-cp312-win_amd64.whl) |
| 3.13 | 2.6 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu126torch2.6-cp313-cp313-win_amd64.whl) |
| 3.13 | 2.7 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu124torch2.7-cp313-cp313-win_amd64.whl) |
| 3.13 | 2.7 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu126torch2.7-cp313-cp313-win_amd64.whl) |
| 3.13 | 2.8 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu124torch2.8-cp313-cp313-win_amd64.whl) |
| 3.13 | 2.8 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu126torch2.8-cp313-cp313-win_amd64.whl) |
| 3.13 | 2.9 | 12.4 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu124torch2.9-cp313-cp313-win_amd64.whl) |
| 3.13 | 2.9 | 12.6 | [Download1(v0.4.19)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.19/flash_attn-2.7.4.post1+cu126torch2.9-cp313-cp313-win_amd64.whl) |

Flash-Attention 2.7.4

Packages for Flash-Attention 2.7.4

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.4 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.7.4+cu124torch2.4-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.5 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.7.4+cu124torch2.5-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.6 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.7.4+cu124torch2.6-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.7 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.7.4+cu124torch2.7-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.7 | 12.8 | [Download1(v0.4.10)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.10/flash_attn-2.7.4+cu128torch2.7-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.8 | 12.8 | [Download1(v0.4.10)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.10/flash_attn-2.7.4+cu128torch2.8-cp310-cp310-win_amd64.whl) |
| 3.11 | 2.5 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.7.4+cu124torch2.5-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.7 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.7.4+cu124torch2.7-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.7 | 12.8 | [Download1(v0.4.10)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.10/flash_attn-2.7.4+cu128torch2.7-cp311-cp311-win_amd64.whl), [Download2(v0.4.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.9/flash_attn-2.7.4+cu128torch2.7-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.8 | 12.8 | [Download1(v0.4.10)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.10/flash_attn-2.7.4+cu128torch2.8-cp311-cp311-win_amd64.whl) |
| 3.12 | 2.4 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.7.4+cu124torch2.4-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.5 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.7.4+cu124torch2.5-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.6 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.7.4+cu124torch2.6-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.7 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.7.4+cu124torch2.7-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.7 | 12.8 | [Download1(v0.4.10)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.10/flash_attn-2.7.4+cu128torch2.7-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.8 | 12.8 | [Download1(v0.4.10)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.10/flash_attn-2.7.4+cu128torch2.8-cp312-cp312-win_amd64.whl) |

Flash-Attention 2.6.3

Packages for Flash-Attention 2.6.3

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.4 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu124torch2.4-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.4 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu128torch2.4-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.5 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu124torch2.5-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.5 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu128torch2.5-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.6 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu124torch2.6-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.6 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu128torch2.6-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.7 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu124torch2.7-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.7 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu128torch2.7-cp310-cp310-win_amd64.whl) |
| 3.11 | 2.4 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu124torch2.4-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.4 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu128torch2.4-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.5 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu124torch2.5-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.5 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu128torch2.5-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.6 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu124torch2.6-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.6 | 12.6 | [Download1(v0.3.1)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.1/flash_attn-2.6.3+cu126torch2.6-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.6 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu128torch2.6-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.7 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu124torch2.7-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.7 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu128torch2.7-cp311-cp311-win_amd64.whl) |
| 3.12 | 2.4 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu124torch2.4-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.4 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu128torch2.4-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.5 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu124torch2.5-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.5 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu128torch2.5-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.6 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu124torch2.6-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.6 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu128torch2.6-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.7 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu124torch2.7-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.7 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.6.3+cu128torch2.7-cp312-cp312-win_amd64.whl) |

Flash-Attention 2.5.9

Packages for Flash-Attention 2.5.9

| Python | PyTorch | CUDA | package |
| ------ | ------- | ---- | ------- |
| 3.10 | 2.4 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu124torch2.4-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.4 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu128torch2.4-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.5 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu124torch2.5-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.5 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu128torch2.5-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.6 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu124torch2.6-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.6 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu128torch2.6-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.7 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu124torch2.7-cp310-cp310-win_amd64.whl) |
| 3.10 | 2.7 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu128torch2.7-cp310-cp310-win_amd64.whl) |
| 3.11 | 2.4 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu124torch2.4-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.4 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu128torch2.4-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.5 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu124torch2.5-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.5 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu128torch2.5-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.6 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu124torch2.6-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.6 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu128torch2.6-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.7 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu124torch2.7-cp311-cp311-win_amd64.whl) |
| 3.11 | 2.7 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu128torch2.7-cp311-cp311-win_amd64.whl) |
| 3.12 | 2.4 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu124torch2.4-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.4 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu128torch2.4-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.5 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu124torch2.5-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.5 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu128torch2.5-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.6 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu124torch2.6-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.6 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu128torch2.6-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.7 | 12.4 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu124torch2.7-cp312-cp312-win_amd64.whl) |
| 3.12 | 2.7 | 12.8 | [Download1(v0.3.9)](https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.9/flash_attn-2.5.9+cu128torch2.7-cp312-cp312-win_amd64.whl) |