transformers
Fix attention mask handling in EoMT-DINOv3 converter #59 (Open)

NielsRogge Fix attention mask handling in EoMT-DINOv3 converter
150d7568
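
For context, the class of bug a fix like this usually targets is a mismatch between boolean padding masks (True = attend) and the additive float masks (0 = keep, large negative = drop) that most attention implementations consume. A minimal sketch of that conversion, illustrative only — `to_additive_mask` is a hypothetical helper, not code from this PR:

```python
import torch

def to_additive_mask(padding_mask: torch.Tensor, dtype=torch.float32) -> torch.Tensor:
    # padding_mask: [batch, seq_len] bool, True = attend, False = padding.
    mask = padding_mask[:, None, None, :].to(dtype)   # 1.0 = keep, 0.0 = drop
    # Kept positions get a zero bias; padded positions get a very large negative
    # bias so they vanish after the softmax over attention scores.
    return (1.0 - mask) * torch.finfo(dtype).min

bool_mask = torch.tensor([[True, True, False]])
print(to_additive_mask(bool_mask).shape)  # torch.Size([1, 1, 1, 3])
```
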
NielsRogge added the codex label
NielsRogge Document EoMT-DINOv3 converter usage and improve gated download error
3a4c1368
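
A sketch of what a friendlier gated-download error can look like in a conversion script. The repo id and filename below are placeholders, and `GatedRepoError` is imported from `huggingface_hub.utils` (recent huggingface_hub releases also expose it under `huggingface_hub.errors`):

```python
from huggingface_hub import hf_hub_download
from huggingface_hub.utils import GatedRepoError

REPO_ID = "org/eomt-dinov3-checkpoint"  # placeholder, not a real repo id
FILENAME = "pytorch_model.bin"          # placeholder filename

try:
    checkpoint_path = hf_hub_download(repo_id=REPO_ID, filename=FILENAME)
except GatedRepoError as err:
    # Replace the raw HTTP error with actionable instructions for the user.
    raise SystemExit(
        f"'{REPO_ID}' is gated: accept its license on the Hugging Face Hub, "
        "log in with `huggingface-cli login` (or set HF_TOKEN), then rerun the converter."
    ) from err
```
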
NielsRogge Fix EoMT-DINOv3 small patch conversion and load
ff90f847
NielsRogge Verify RoPE outputs during EoMT-DINOv3 conversion
574497de
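
The generic shape of an output-equivalence check during conversion is to feed the same dummy input to the reference and the ported model and compare the resulting tensors. A minimal sketch, assuming both models are callables that return comparable tensors (not the verifier used in this PR):

```python
import torch

def verify_outputs(original_model, converted_model, rtol=1e-4, atol=1e-4):
    """Run both models on the same dummy batch and assert their outputs match."""
    original_model.eval()
    converted_model.eval()
    pixel_values = torch.randn(1, 3, 640, 640)  # dummy image batch
    with torch.no_grad():
        reference = original_model(pixel_values)
        candidate = converted_model(pixel_values)
    torch.testing.assert_close(candidate, reference, rtol=rtol, atol=atol)
    print("outputs match within tolerance")
```
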
NielsRogge Align positional embedding verification inputs
7e290529
NielsRogge Align masked attention replication in verifier
b4c9d053
NielsRogge Add script
b8c9b2d6
NielsRogge Merge branch 'codex/write-conversion-script-for-eomt-dinov3-checkpoin…
bc197ed5
NielsRogge Support all EoMT-DINOv3 checkpoints in converter
504b0da0
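
Supporting every checkpoint in one converter typically comes down to a table keyed by model size that supplies per-variant config overrides. The sketch below uses placeholder keys and standard ViT-S/B/L dimensions, not the actual EoMT-DINOv3 variants:

```python
# Hypothetical per-variant overrides; names and values are placeholders.
CHECKPOINT_CONFIGS = {
    "small": {"hidden_size": 384,  "num_hidden_layers": 12, "num_attention_heads": 6},
    "base":  {"hidden_size": 768,  "num_hidden_layers": 12, "num_attention_heads": 12},
    "large": {"hidden_size": 1024, "num_hidden_layers": 24, "num_attention_heads": 16},
}

def get_config_overrides(checkpoint_name: str) -> dict:
    """Pick config overrides by matching a size keyword in the checkpoint name."""
    for size, overrides in CHECKPOINT_CONFIGS.items():
        if size in checkpoint_name.lower():
            return overrides
    raise ValueError(f"Unrecognized checkpoint name: {checkpoint_name}")
```
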
NielsRogge Update script
418d331b
NielsRogge Merge branch 'codex/write-conversion-script-for-eomt-dinov3-checkpoin…
94c56c57
NielsRogge Update conversion script
43a3fbfe
NielsRogge Test conversion of large checkpoint
71c0411d
