llm-foundry
Fix pairwise attention comparison in test #737
Merged
sashaDoubov: fix attn impl not being set (60cdf36a)
sashaDoubov changed the title from "fix attn impl not being set" to "Fix pairwise attention comparison in test" 2 years ago
sashaDoubov marked this pull request as draft 2 years ago
sashaDoubov: change d_model and increase tolerance (f603b469)
sashaDoubov marked this pull request as ready for review 2 years ago
sashaDoubov requested a review from vchiley 2 years ago
sashaDoubov requested a review from dakinggg 2 years ago
sashaDoubov requested a review from ShashankMosaicML 2 years ago
dakinggg approved these changes on 2023-11-15
sashaDoubov: add special case (51a43b1b)
ShashankMosaicML approved these changes on 2023-11-15
ShashankMosaicML commented on 2023-11-15
sashaDoubov: undo typo (d9159346)
sashaDoubov merged f114dad5 into main 2 years ago
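
The commit messages above describe the fix only at a high level: explicitly set the attention implementation under test, adjust d_model, and loosen the comparison tolerance. The snippet below is a minimal, self-contained sketch of the general pairwise-comparison pattern (two attention implementations run on identical inputs and compared within a tolerance). It uses plain PyTorch as an illustration and is not the actual llm-foundry test, its config schema, or its attn_impl options.

```python
import math

import torch
import torch.nn.functional as F


def manual_attention(q, k, v):
    # Reference implementation: explicit softmax(QK^T / sqrt(d_head)) @ V.
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.shape[-1])
    return torch.softmax(scores, dim=-1) @ v


def test_attention_impls_match():
    torch.manual_seed(17)
    batch, n_heads, seq_len, d_head = 2, 4, 16, 32
    q, k, v = (torch.randn(batch, n_heads, seq_len, d_head) for _ in range(3))

    # Run both implementations on the same inputs. In the PR, the bug was that
    # the intended attention implementation was not actually being selected,
    # so the "pairwise" comparison did not exercise two different code paths.
    out_ref = manual_attention(q, k, v)
    out_sdpa = F.scaled_dot_product_attention(q, k, v)

    # A slightly loosened tolerance, since different kernels can accumulate
    # floating-point error in different orders.
    torch.testing.assert_close(out_ref, out_sdpa, rtol=1e-4, atol=1e-4)
```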
