vllm
27c065df - [Bugfix][V1][ROCm] Fix AITER Flash Attention Backend (Fix API Break and Local Attention Logic: affecting Llama4) (#19904)

Committed 231 days ago
Signed-off-by: tjtanaa <tunjian.tan@embeddedllm.com>
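For context, the "Local Attention Logic" in the commit title refers to the local (chunked) attention used by Llama4, where some layers only attend within a fixed-size chunk of the sequence rather than over the full causal prefix. The sketch below is illustrative only and does not reproduce the code touched by #19904; the function name and the dense-mask formulation are assumptions made for clarity (vLLM's attention backends implement this with variable-length kernels rather than dense masks).

```python
# Illustrative sketch only: NOT the code changed in #19904.
# Shows what a block-local causal mask means for chunked attention layers.
import torch

def build_local_causal_mask(seq_len: int, chunk_size: int) -> torch.Tensor:
    """Boolean mask where True marks allowed (query, key) pairs.

    A query at position q may attend to key position k only if
    k <= q (causal) and both positions fall in the same chunk of
    `chunk_size` tokens.
    """
    pos = torch.arange(seq_len)
    causal = pos[None, :] <= pos[:, None]                      # k <= q
    same_chunk = (pos[None, :] // chunk_size) == (pos[:, None] // chunk_size)
    return causal & same_chunk

if __name__ == "__main__":
    # With seq_len=8 and chunk_size=4, positions 0-3 and 4-7 form
    # independent local attention windows.
    print(build_local_causal_mask(seq_len=8, chunk_size=4).int())
```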