Remove redundant transposes for RoPE rotation (#807)
* Update llmfoundry/models/layers/attention.py
Co-authored-by: Daniel King <43149077+dakinggg@users.noreply.github.com>
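A minimal sketch of why the transposes are redundant. This is hypothetical illustration code (NumPy here, not llm-foundry's actual PyTorch attention module): when the cos/sin rotation tables are given singleton batch and head axes, RoPE can be applied directly to tensors laid out as `(batch, seq, heads, head_dim)` via broadcasting, with no round-trip transpose to `(batch, heads, seq, head_dim)`. The names `rotate_half` and `apply_rope` are assumptions for the sketch.

```python
import numpy as np

def rotate_half(x: np.ndarray) -> np.ndarray:
    # Split the last dim in half and rotate the pairs: (x1, x2) -> (-x2, x1).
    x1, x2 = np.split(x, 2, axis=-1)
    return np.concatenate((-x2, x1), axis=-1)

def apply_rope(x: np.ndarray, cos: np.ndarray, sin: np.ndarray) -> np.ndarray:
    # x: (batch, seq, heads, head_dim); cos/sin: (seq, head_dim).
    # Singleton batch/head axes let cos/sin broadcast over those dims,
    # so no transpose of x is needed.
    cos = cos[None, :, None, :]
    sin = sin[None, :, None, :]
    return x * cos + rotate_half(x) * sin

batch, seq, heads, dim = 2, 8, 4, 16
rng = np.random.default_rng(0)
q = rng.standard_normal((batch, seq, heads, dim))

# Standard RoPE frequency table.
inv_freq = 1.0 / (10000 ** (np.arange(0, dim, 2) / dim))
freqs = np.outer(np.arange(seq), inv_freq)     # (seq, dim/2)
emb = np.concatenate((freqs, freqs), axis=-1)  # (seq, dim)
cos, sin = np.cos(emb), np.sin(emb)

q_rot = apply_rope(q, cos, sin)

# The transpose-heavy path computes the same result:
q_t = q.transpose(0, 2, 1, 3)                  # (batch, heads, seq, dim)
q_ref = q_t * cos[None, None] + rotate_half(q_t) * sin[None, None]
q_ref = q_ref.transpose(0, 2, 1, 3)            # back to (batch, seq, heads, dim)
assert np.allclose(q_rot, q_ref)
```

Dropping the two transposes avoids non-contiguous views (and any copies they trigger) on the hot path of attention, without changing the numerics.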