transformers
Commit 1cfcbfca
[VLMs] fix flash-attention tests (#37603)
Committed
235 days ago
[VLMs] fix flash-attention tests (#37603)

* fix one test
* fa2 ln test
* remove keys from config recursively
* fix
* fixup
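One of the bullets above mentions removing keys from the config recursively. The commit itself is not shown here, so the following is only a minimal sketch of what such a helper could look like, assuming a nested dict-style configuration (as with VLM configs that embed a text sub-config); the function name remove_keys_recursively, the example key _attn_implementation, and the sample config are hypothetical and not taken from the actual change.

```python
from typing import Any


def remove_keys_recursively(config: dict[str, Any], keys_to_remove: set[str]) -> dict[str, Any]:
    """Return a copy of `config` with the given keys dropped at every nesting level."""
    cleaned: dict[str, Any] = {}
    for key, value in config.items():
        if key in keys_to_remove:
            continue  # drop the key at this level
        if isinstance(value, dict):
            # recurse into nested sub-configs (e.g. a text_config inside a VLM config)
            value = remove_keys_recursively(value, keys_to_remove)
        cleaned[key] = value
    return cleaned


# Hypothetical usage: strip an attention-implementation hint from both the top-level
# config and its nested text_config before re-loading a model in a test.
config = {
    "model_type": "vlm",
    "_attn_implementation": "flash_attention_2",
    "text_config": {"model_type": "llama", "_attn_implementation": "flash_attention_2"},
}
print(remove_keys_recursively(config, {"_attn_implementation"}))
```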
References
#37603 - [VLMs] fix flash-attention tests
Author
zucchini-nlp
Parents
02baa61f