pytorch
73eab18a - set lowmem_dropout and fallback_random configs for all tests in test_fused_attention (#100506)

This allows all the tests in test_fused_attention to succeed when run together; otherwise, replacements are registered without the proper configs set, so some tests fail on the first run and succeed only on rerun. This is also confusing when running the full file locally.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/100506
Approved by: https://github.com/drisspg
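The fix amounts to pinning the two inductor configs named in the title so that the pattern-matcher replacements are registered under a consistent configuration regardless of which test runs first. Below is a minimal sketch, not the actual PR diff: it assumes the standard `torch._inductor.config.patch` decorator and that `lowmem_dropout` and `fallback_random` are exposed on `torch._inductor.config`, as the commit title suggests; the test name and toy test body are hypothetical.

```python
import unittest

import torch
import torch._inductor.config as inductor_config


class FusedAttentionConfigExample(unittest.TestCase):
    # Pin the configs from the commit title for this test, so replacements
    # registered during compilation see a consistent configuration.
    @inductor_config.patch(fallback_random=True, lowmem_dropout=False)
    def test_sdpa_matches_eager_under_pinned_config(self):
        # Hypothetical toy check; the real tests live in test_fused_attention.
        def sdpa(q, k, v):
            return torch.nn.functional.scaled_dot_product_attention(q, k, v)

        q, k, v = (torch.randn(1, 4, 8, 16) for _ in range(3))
        compiled = torch.compile(sdpa)
        torch.testing.assert_close(compiled(q, k, v), sdpa(q, k, v))


if __name__ == "__main__":
    unittest.main()
```

Because `config.patch` restores the previous values on exit, applying it uniformly across every test in the file (rather than only some) is what makes the suite order-independent, matching the behavior described above.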
Author: Natalia Gimelshein