Remove redundant transposes for rope rotation #807
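The PR title names the fix but the diff itself is not shown on this page. As an illustrative sketch only (not the actual llm-foundry code; the helper names, shapes, and broadcasting scheme below are assumptions), the idea behind removing the transposes is that a RoPE rotation acts elementwise along the head dimension, so it produces the same values whether queries are laid out as `(batch, seq, heads, head_dim)` or transposed to `(batch, heads, seq, head_dim)`; a transpose-rotate-transpose round trip is therefore redundant:

```python
import numpy as np

def rotate_half(x):
    # Split the head dimension in half and rotate: (x1, x2) -> (-x2, x1).
    x1, x2 = np.split(x, 2, axis=-1)
    return np.concatenate((-x2, x1), axis=-1)

def apply_rope(x, cos, sin):
    # cos/sin broadcast over batch/head dims; the rotation only touches the last dim.
    return x * cos + rotate_half(x) * sin

rng = np.random.default_rng(0)
b, s, h, d = 2, 8, 4, 16
q = rng.standard_normal((b, s, h, d))    # (batch, seq, heads, head_dim)
angles = rng.standard_normal((s, 1, d))  # per-position angles, shared across heads
cos, sin = np.cos(angles), np.sin(angles)

# Redundant-transpose pattern: move to (b, h, s, d), rotate, move back.
roped_via_transpose = apply_rope(
    q.transpose(0, 2, 1, 3),
    cos[:, 0][None, None],  # (1, 1, s, d)
    sin[:, 0][None, None],
).transpose(0, 2, 1, 3)

# Direct pattern: rotate in (b, s, h, d), letting cos/sin broadcast over heads.
roped_direct = apply_rope(q, cos[None], sin[None])  # cos[None]: (1, s, 1, d)

assert np.allclose(roped_via_transpose, roped_direct)
```

Because the two layouts give identical results, the rotation can be applied in whichever layout the tensors already have, saving the extra transpose pair.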
Commits:

04dd3349  Merge pull request #1 from mosaicml/main
87b2fdcd  Merge pull request #8 from mosaicml/main
c9a42e47  Merge pull request #12 from mosaicml/main
ddea9eec  Merge branch 'mosaicml:main' into main
0bcd8eee  Merge pull request #13 from mosaicml/main
f209b581  Merge pull request #14 from mosaicml/main
ec4378df  Merge pull request #15 from mosaicml/main
b4367063  Merge branch 'mosaicml:main' into main
bcace036  ..
cf4aa585  Merge branch 'mosaicml:main' into main
7c35ce89  Merge branch 'mosaicml:main' into main
0a8ebfbc  ..
6f18a332  ..
f42d5859  Merge branch 'mosaicml:main' into main
6535d049  ..
fff3b487  Merge branch 'main' into shashank/fix_redundant_transposes_rope
e2d33eb6  ..
fe4afd1f  Merge branch 'main' into shashank/fix_redundant_transposes_rope
e9fabadd  merging
d2602b1a  Merge branch 'main' into shashank/fix_redundant_transposes_rope
dakinggg approved these changes on 2023-12-20.
d94baa63  Update llmfoundry/models/layers/attention.py
1b4eb956  ..
ShashankMosaicML deleted the shashank/fix_redundant_transposes_rope branch 2 years ago.
Assignees: no one assigned.