Format bert or transformers code #12646
limit line length 120 (ae090762)
fix (9cbda864)
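The "limit line length 120" commits suggest the line-length rule is enforced by a formatter configuration. A minimal sketch of what the corresponding setting could look like, assuming clang-format is the tool in use (this fragment is illustrative, not taken from the repository):

```yaml
# Hypothetical .clang-format fragment: enforce a 120-column limit
BasedOnStyle: Google
ColumnLimit: 120
```

With such a file at the repository root, running `clang-format -i` over the touched sources would rewrap any line longer than 120 columns.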
tianleiwu marked this pull request as draft 3 years ago
add more files (66931feb)
format more files (07d972f6)
tianleiwu changed the title from "Format bert cuda code to limit line length to 120" to "Format bert or transformers code" 3 years ago
fix line length > 120 (d8c1df25)
fix pragma (7483edb9)
tianleiwu marked this pull request as ready for review 3 years ago
adjust parameter order of LaunchAttentionKernel (727fb059)
wangyems dismissed these changes on 2022-08-19
add rocm files (589f35c1)
tianleiwu dismissed their stale review via 589f35c1 3 years ago
use static_cast (7f7450f5)
rocblas (a8922a22)
wangyems approved these changes on 2022-08-22
tianleiwu merged d93e6533 into main 3 years ago
tianleiwu deleted the tlwu/format_attention_code branch 3 years ago