transformers
d45f47ab - Fix: Disable torch.autocast in RotaryEmbedding of Gemma and LLaMa for MPS device (#29439)

Fix: Disable torch.autocast in RotaryEmbedding of Gemma and LLaMa for MPS device (#29439)

* Fix: Disable torch.autocast in RotaryEmbedding of Gemma and LLaMa for MPS devices
* Update src/transformers/models/gemma/modeling_gemma.py
  Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
* Update llama and gemma rope to use cpu on mps device

Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>