transformers
Fix bug of _prepare_4d_attention_mask
#27847
Merged

ArthurZucker merged 3 commits into huggingface:main from jiqing-feng:llama
jiqing-feng committed "use _prepare_4d_attention_mask" (8590e15b)
jiqing-feng changed the title from "use _prepare_4d_attention_mask" to "Fix bug of _prepare_4d_attention_mask" 2 years ago
jiqing-feng marked this pull request as ready for review 2 years ago
ArthurZucker approved these changes on 2023-12-05
younesbelkada commented on 2023-12-05
jiqing-feng committed "Merge branch 'huggingface:main' into llama" (199e7735)
jiqing-feng committed "fix comment" (8f9a097d)
younesbelkada approved these changes on 2023-12-06
ArthurZucker merged 4d806dba into main 2 years ago
jiqing-feng deleted the llama branch 1 year ago
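For context, `_prepare_4d_attention_mask` (in `transformers.modeling_attn_mask_utils` at the time of this PR) expands a 2D padding mask into the 4D additive mask that attention layers consume. Below is a minimal numpy sketch of that expansion, assuming the usual `[bsz, src_len]` → `[bsz, 1, tgt_len, src_len]` shape convention; the function and its defaults are illustrative, not the library's actual code, which uses torch and the dtype's minimum value instead of a fixed `-1e9`:

```python
import numpy as np

def prepare_4d_attention_mask(mask_2d, tgt_len=None, min_value=-1e9):
    """Illustrative sketch: expand a [bsz, src_len] padding mask
    (1 = attend, 0 = pad) into a [bsz, 1, tgt_len, src_len] additive mask."""
    bsz, src_len = mask_2d.shape
    if tgt_len is None:
        tgt_len = src_len
    # Insert singleton head and query dims, then broadcast over tgt_len.
    expanded = np.broadcast_to(
        mask_2d[:, None, None, :].astype(np.float32),
        (bsz, 1, tgt_len, src_len),
    )
    # Invert: attended positions become 0.0, padded positions a large
    # negative value that vanishes after softmax.
    return (1.0 - expanded) * min_value

mask = np.array([[1, 1, 0]])  # one sequence, last token is padding
out = prepare_4d_attention_mask(mask)
print(out.shape)  # (1, 1, 3, 3)
```

The additive (0 / large-negative) form lets the mask be summed directly onto the attention scores before softmax, which is why the helper returns it in that shape rather than as a boolean mask.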
