Add Flash Attention 2 support to Bark #27364
change handmade attention mask to _prepare_4d_attention_mask (0eadfabc)
add flashattention2 support in Bark (0fcff472)
add flashattention2 tests on BarkSemanticModel (7ca710cb)
make style (10d81bad)
fix flashattention and tests + make style (32fb57d0)
fix memory leak and allow Bark to pass flash attention to sub-models (c2ff5f40)
make style (ef106a42)
Apply suggestions from code review (425d41d2)
remove unnecessary code from tests + justify overriding (049c2e95)
Merge branch 'bark-flashattention-2' of github.com:ylacombe/transform… (c6a34cf3)
Update tests/models/bark/test_modeling_bark.py (653fa13d)
make style (5f76f132)
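The first commit replaces a hand-rolled mask with the `_prepare_4d_attention_mask` helper. As a rough illustration of what such a helper does, the numpy sketch below (not the actual transformers implementation, which operates on torch tensors) expands a 2D padding mask into the 4D additive mask that attention layers consume:

```python
import numpy as np

# Illustrative numpy stand-in for a "2D padding mask -> 4D additive mask"
# helper, in the spirit of _prepare_4d_attention_mask. Names and dtype
# handling here are assumptions for the sketch, not the library's code.

def prepare_4d_attention_mask(mask_2d, tgt_len=None):
    """Expand a (batch, src_len) padding mask (1 = keep, 0 = pad) into a
    (batch, 1, tgt_len, src_len) additive mask (0 = keep, large negative =
    masked) that can be added to attention scores before softmax."""
    bsz, src_len = mask_2d.shape
    tgt_len = tgt_len or src_len
    # Insert head and query-length axes, then broadcast across queries.
    expanded = mask_2d[:, None, None, :].astype(np.float32)
    expanded = np.broadcast_to(expanded, (bsz, 1, tgt_len, src_len))
    # Invert: kept positions become 0.0, padded positions a large negative.
    return (1.0 - expanded) * np.finfo(np.float32).min

mask = np.array([[1, 1, 0]])  # one padded position at the end
print(prepare_4d_attention_mask(mask).shape)  # (1, 1, 3, 3)
```

Centralizing this logic in one helper lets every model share the same masking semantics instead of each re-implementing the expansion by hand.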
Merged by ylacombe into main (a5bee89c) 2 years ago.
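One commit above allows Bark, a composite model, to pass the flash-attention setting down to its sub-models. The toy sketch below shows the general pattern of propagating an attention-implementation choice from a parent model to its children; all class and attribute names here are illustrative assumptions, not the actual Bark or transformers code:

```python
# Toy sketch (hypothetical names) of forwarding an attention backend
# choice from a composite model to each of its sub-models.

class SubModelConfig:
    def __init__(self):
        self._attn_implementation = "eager"  # default attention backend

class SubModel:
    def __init__(self, config):
        self.config = config

class CompositeModel:
    """Stand-in for a model like Bark that owns several sub-models
    (e.g. semantic, coarse, and fine sub-models)."""
    def __init__(self, attn_implementation="eager"):
        self.sub_models = []
        for _ in range(3):
            cfg = SubModelConfig()
            # Forward the requested backend so every sub-model honors it,
            # rather than silently falling back to the default.
            cfg._attn_implementation = attn_implementation
            self.sub_models.append(SubModel(cfg))

model = CompositeModel(attn_implementation="flash_attention_2")
print([m.config._attn_implementation for m in model.sub_models])
```

Without this forwarding step, only the top-level model would see the requested backend while its sub-models kept the default, which is the kind of gap the commit addresses.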