transformers
492ee17e
Fix paligemma detection inference (#31587)
Commit
1 year ago
Fix paligemma detection inference (#31587)

* fix extended attention mask
* add slow test for detection instance
* [run-slow]paligemma
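The commit message is terse, so as background: in transformers-style models, an "extended attention mask" usually means broadcasting a 2D padding mask of shape `(batch, seq_len)` into a 4D additive bias applied to attention logits. The sketch below illustrates that general pattern only; it is a hypothetical illustration, not the actual PaliGemma code changed in this commit.

```python
import numpy as np

def extend_attention_mask(mask_2d, dtype=np.float32):
    """Illustrative (hypothetical) extended-attention-mask construction.

    mask_2d: (batch, seq_len) array with 1 = attend, 0 = padding.
    Returns an additive bias of shape (batch, 1, 1, seq_len): 0 where
    attention is allowed, a large negative value where it is masked,
    to be added to attention logits before the softmax.
    """
    extended = mask_2d[:, None, None, :].astype(dtype)
    min_value = np.finfo(dtype).min
    return (1.0 - extended) * min_value

# One sequence of length 4 whose last position is padding.
mask = np.array([[1, 1, 1, 0]])
bias = extend_attention_mask(mask)
```

After the softmax, positions carrying the large negative bias receive effectively zero attention weight, which is how padding (or, in a multimodal model, disallowed image/text positions) is excluded.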
References
#31587 - Fix paligemma detection inference
Author
molbap
Parents
e71f2863