transformers
4ee7f51e - Embedding VLMs don't need a head (#45000)

Committed 21 days ago
Embedding VLMs don't need a head (#45000)

* squash
* fix copies
* skip, we don't need to load the base model for it
* one more regex, since there is now no prefix