llm-foundry
289536bd - Remove redundant transposes for rope rotation (#807)

Commit
2 years ago
Remove redundant transposes for rope rotation (#807)

* Update llmfoundry/models/layers/attention.py

Co-authored-by: Daniel King <43149077+dakinggg@users.noreply.github.com>
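The idea behind the change can be sketched as follows. This is a minimal illustration, not the actual llm-foundry code: names like `apply_rope` and `rotate_half` are hypothetical here. If the rotary cos/sin tables are shaped to broadcast over the batch and head dimensions, RoPE can be applied directly to query/key tensors in `(batch, seq_len, n_heads, head_dim)` layout, avoiding a transpose to `(batch, n_heads, seq_len, head_dim)` and back.

```python
import torch

def rotate_half(x):
    # Split the head dim in two halves and rotate: (x1, x2) -> (-x2, x1).
    x1, x2 = x.chunk(2, dim=-1)
    return torch.cat((-x2, x1), dim=-1)

def apply_rope(q, k, cos, sin):
    # cos/sin are shaped (1, seq_len, 1, head_dim), so they broadcast over
    # the batch and head dimensions. The rotation is applied directly on
    # (batch, seq_len, n_heads, head_dim) tensors; no transpose to
    # (batch, n_heads, seq_len, head_dim) and back is needed.
    q_rot = q * cos + rotate_half(q) * sin
    k_rot = k * cos + rotate_half(k) * sin
    return q_rot, k_rot

# Build cos/sin tables for seq_len positions (standard RoPE frequencies).
seq_len, head_dim = 4, 8
inv_freq = 1.0 / (10000 ** (torch.arange(0, head_dim, 2).float() / head_dim))
freqs = torch.outer(torch.arange(seq_len).float(), inv_freq)
emb = torch.cat((freqs, freqs), dim=-1)            # (seq_len, head_dim)
cos = emb.cos()[None, :, None, :]                  # (1, seq_len, 1, head_dim)
sin = emb.sin()[None, :, None, :]

q = torch.randn(2, seq_len, 3, head_dim)
k = torch.randn(2, seq_len, 3, head_dim)
q_rot, k_rot = apply_rope(q, k, cos, sin)
```

The same rotation computed via the older transpose-rotate-transpose path produces identical values, which is why the transposes were redundant.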