transformers
851b8f28 - [`kernels`] If flash attention2 is not installed / fails to import (cc on our cluster) default to kernels (#40178)

* first step if flash not installed but you set to use it
* try importing
* now default to using it
* update our tests as well
* wow yesterday I was not awake
* fixup
* style
* lol the fix was very very simple
* `RUN python3 -m pip install --no-cache-dir git+https://github.com/huggingface/kernels@main#egg=kernels` for updated dockers
* push review comments
* fix

Co-authored-by: Cyril Vallez <cyril.vallez@huggingface.co>
Co-authored-by: Cyril Vallez <cyril.vallez@gmail.com>
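The behavior this commit describes amounts to a try-import fallback: when the `flash_attn` package is missing or fails to import, transformers can route flash attention through the Hub-hosted `kernels` package instead. The snippet below is a minimal sketch of that pattern, not the actual transformers implementation; the `resolve_flash_attention` helper name and the `kernels-community/flash-attn` repo id are assumptions for illustration.

```python
# Minimal sketch of the fallback described in the commit message (not the
# actual transformers code path): prefer the locally installed flash_attn
# package, and fall back to a Hub-hosted kernel via the `kernels` library
# when the import is missing or broken.
def resolve_flash_attention():
    try:
        from flash_attn import flash_attn_func  # flash-attention 2 package
        return flash_attn_func
    except Exception:
        # Missing or broken install (e.g., an ABI mismatch on a cluster).
        # Assumption for illustration: kernels.get_kernel pulls a prebuilt
        # kernel from the Hugging Face Hub; the repo id below is hypothetical.
        from kernels import get_kernel
        flash_attn_kernel = get_kernel("kernels-community/flash-attn")
        return flash_attn_kernel.flash_attn_func
```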