transformers
Add flash attention for `gpt_bigcode`
#26479
Merged