transformers
1cfcbfca - [VLMs] fix flash-attention tests (#37603)

Commit · 235 days ago

[VLMs] fix flash-attention tests (#37603)

* fix one test
* fa2 ln test
* remove keys from config recursively
* fix
* fixup
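The "remove keys from config recursively" step can be sketched as follows. This is a hypothetical illustration, not the actual PR code: the function name `remove_keys_recursively` and the example config are assumptions, showing only the general pattern of dropping a key (such as an attention-implementation setting) at every nesting level of a config dict.

```python
# Hypothetical sketch (not the code from PR #37603): drop the given keys from
# a nested config dict at every level, as "remove keys from config
# recursively" in the commit message describes.
def remove_keys_recursively(config: dict, keys: set) -> dict:
    """Return a copy of `config` with every occurrence of `keys` removed."""
    cleaned = {}
    for name, value in config.items():
        if name in keys:
            continue  # drop this key wherever it appears
        if isinstance(value, dict):
            value = remove_keys_recursively(value, keys)  # recurse into sub-configs
        cleaned[name] = value
    return cleaned

# Example: a VLM-style config with a nested text sub-config (names assumed).
cfg = {
    "model_type": "vlm",
    "attn_implementation": "flash_attention_2",
    "text_config": {"attn_implementation": "flash_attention_2", "hidden_size": 768},
}
print(remove_keys_recursively(cfg, {"attn_implementation"}))
```

Returning a cleaned copy rather than mutating in place keeps the original config intact, which matters when the same config object is shared across tests.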