[`Attn Masks`] Non-vmap default for attention masks #41852
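For context, "non-vmap" mask construction means building the attention mask directly from broadcasted index comparisons rather than evaluating a per-position mask function under `torch.vmap`. A minimal pure-Python sketch of the idea only; the function names and signatures below are illustrative, not the ones introduced by this PR:

```python
def causal_mask(q_len: int, kv_len: int, offset: int = 0) -> list[list[bool]]:
    """Build a causal mask by direct index comparison (no vmap).

    True means "query position q may attend to key position kv".
    `offset` accounts for cached key positions preceding the queries.
    """
    return [[kv <= q + offset for kv in range(kv_len)]
            for q in range(q_len)]


def sliding_window_mask(q_len: int, kv_len: int,
                        window: int, offset: int = 0) -> list[list[bool]]:
    """Causal mask restricted to the last `window` key positions."""
    return [[(q + offset - window) < kv <= q + offset
             for kv in range(kv_len)]
            for q in range(q_len)]
```

In a real implementation the same comparisons would run on integer index tensors with broadcasting, producing the whole mask in one vectorized step; for example, `causal_mask(2, 2)` yields `[[True, False], [True, True]]`.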
attempt 1
dfcb545d
fixup masking to work correctly with old torch
b87a139f
Merge branch 'main' into non-vmap-masks
9aed30d1
a few changes to make things a bit cleaner
969dab55
oopsie
513c8ef7
fix integer overflow on bidirectional masks via indexing fn
466acaba
rm executorch workarounds --> still need to handle on sliding etc fns…
bbaf41d8
typo
65357d9b
docs, fix older torch inplace issue, proper kwarg handling
aaaaec2b
chunked works with non vmap and older torch, add warning on non guara…
539bafad
lift unnecessary restriction on older torch
01848e3b
vasqu
changed the title from [`WIP`][`Masking`] Non-vmap default for attention masks to [`Attn Masks`] Non-vmap default for attention masks 95 days ago
Merge branch 'main' into non-vmap-masks
9dc62965
vasqu
marked this pull request as ready for review 95 days ago
vasqu
commented
on 2025-10-29
simplify a few things, restrict torch < 2.6 to non-vmap (for now)
17c7a486
try fix
4e6e799b
remove unnecessary slicing logic
26b266c4
remove legacy func
1fb7510e
harmonize slightly
4f62c81c
vasqu
merged
03538a80
into main 83 days ago
vasqu
deleted the non-vmap-masks branch 83 days ago