transformers
Add Flash Attention 2 support to Bark #27364
Merged
Commits by ylacombe:
0eadfabc  change handmade attention mask to _prepare_4d_attention_mask
0fcff472  add Flash Attention 2 support in Bark
7ca710cb  add Flash Attention 2 tests on BarkSemanticModel
10d81bad  make style
32fb57d0  fix Flash Attention and tests + make style
c2ff5f40  fix memory leak and allow Bark to pass Flash Attention to sub-models
ef106a42  make style
sanchit-gandhi approved these changes on 2023-11-08
Further commits by ylacombe:
425d41d2  Apply suggestions from code review
049c2e95  remove unnecessary code from tests + justify overriding
c6a34cf3  Merge branch 'bark-flashattention-2' of github.com:ylacombe/transform…
amyeroberts approved these changes on 2023-11-08
653fa13d  Update tests/models/bark/test_modeling_bark.py
5f76f132  make style
ylacombe merged a5bee89c into main
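For readers of this log, a minimal usage sketch of the feature the PR adds. This is a hedged illustration, not code from the PR itself: it assumes a recent transformers release where Flash Attention 2 is selected through the generic `attn_implementation` loading argument, the checkpoint name "suno/bark-small" is only illustrative, and actually loading the model additionally requires a CUDA GPU, half-precision weights, and the `flash-attn` package.

```python
# Hedged sketch: how Bark could be loaded with Flash Attention 2 after
# this PR, following the generic transformers loading pattern.
# "suno/bark-small" and the exact kwargs are illustrative assumptions.
load_kwargs = {
    "torch_dtype": "float16",                    # FA2 needs fp16/bf16 weights
    "attn_implementation": "flash_attention_2",  # opt in to Flash Attention 2
}

# The actual load (commented out: needs a CUDA GPU, network access,
# and the `flash-attn` package installed):
# from transformers import BarkModel
# model = BarkModel.from_pretrained("suno/bark-small", **load_kwargs)

print(load_kwargs["attn_implementation"])
```

The commit "allow Bark to pass flash attention to sub-models" suggests the flag propagates to Bark's semantic, coarse, and fine sub-models rather than applying only to the top-level wrapper.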
ArthurZucker commented on 2023-11-09
