llm-foundry
96cf646c
- Adding a fix for Cross Entropy Loss for long sequence lengths. (#795)
Commit
2 years ago
Adding a fix for Cross Entropy Loss for long sequence lengths. (#795)
References
#795 - Adding a fix for Cross Entropy Loss for long sequence lengths.
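The commit title names the problem but not the mechanism. A common failure mode when computing cross entropy over very long sequences is that a single fused call materializes a huge `(batch * seq_len, vocab)` intermediate. A minimal sketch of one standard mitigation, chunking the loss over the flattened token dimension, is below; `chunked_cross_entropy` and its `chunk_size` parameter are hypothetical illustrations, not the actual patch in #795.

```python
import torch
import torch.nn.functional as F

def chunked_cross_entropy(logits, targets, chunk_size=4096):
    """Token-averaged cross entropy computed in chunks along the
    flattened (batch * seq_len) dimension.

    Hypothetical helper: splitting the computation bounds the size of
    the per-call intermediates, which matters at long sequence lengths.
    """
    logits = logits.reshape(-1, logits.size(-1))
    targets = targets.reshape(-1)
    total = logits.new_zeros(())  # running sum of per-token losses
    count = 0                     # number of tokens seen so far
    for start in range(0, logits.size(0), chunk_size):
        lg = logits[start:start + chunk_size]
        tg = targets[start:start + chunk_size]
        # reduction="sum" lets us average once at the end, so the
        # result matches an unchunked call exactly.
        total = total + F.cross_entropy(lg, tg, reduction="sum")
        count += tg.numel()
    return total / count
```

Because the chunks are summed before the final division, the result is numerically equivalent (up to floating-point accumulation order) to a single `F.cross_entropy` call over the full tensor.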
Author
ShashankMosaicML
Parents
410d5c71