transformers
454c0a7c
- Use `torch.get_autocast_dtype` instead of `torch.get_autocast_gpu_dtype` (#42055)
Commit
68 days ago
Use `torch.get_autocast_dtype` instead of `torch.get_autocast_gpu_dtype` (#42055)

Update dtype handling for PyTorch 2.4 compatibility in flash attention models.
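A minimal sketch of the kind of change this commit describes, not the exact code in transformers: on PyTorch 2.4+, the device-generic `torch.get_autocast_dtype(device_type)` replaces the deprecated, CUDA-only `torch.get_autocast_gpu_dtype()` when resolving the dtype that flash-attention inputs should be cast back to. The helper names and fallback logic below are illustrative assumptions.

```python
import torch


def resolve_autocast_dtype(device_type: str = "cuda") -> torch.dtype:
    """Illustrative helper: pick the autocast compute dtype for a device.

    `torch.get_autocast_dtype` is the device-generic accessor available in
    PyTorch 2.4+; older releases only expose the CUDA-specific
    `torch.get_autocast_gpu_dtype`, which is now deprecated.
    """
    if hasattr(torch, "get_autocast_dtype"):
        return torch.get_autocast_dtype(device_type)
    return torch.get_autocast_gpu_dtype()


# Hypothetical usage in a flash-attention path: layer norms may upcast
# activations to float32, so they are cast back to the autocast dtype
# before the attention kernel runs.
def maybe_downcast_for_flash_attention(hidden_states: torch.Tensor) -> torch.Tensor:
    if torch.is_autocast_enabled() and hidden_states.dtype == torch.float32:
        hidden_states = hidden_states.to(
            resolve_autocast_dtype(hidden_states.device.type)
        )
    return hidden_states
```

Passing the device type explicitly is what makes the new accessor work uniformly across CUDA and other autocast-capable backends, rather than being hard-wired to the GPU variant.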
References
#42055 - Use `torch.get_autocast_dtype` instead of `torch.get_autocast_gpu_dtype`
Author
qgallouedec
Parents
f4c8497d