pytorch
59532bd6 - [inductor] Fix a cpp wrapper codegen issue for _scaled_dot_product_efficient_attention (#102624)

Summary: This fixes a cpp_wrapper coverage drop on TIMM models as shown in the recent inference dashboard.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/102624
Approved by: https://github.com/ngimel, https://github.com/jansel
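A minimal sketch (not from the commit itself) of how the affected code path can be exercised: compiling an SDPA-based function with Inductor's cpp_wrapper mode enabled, so that the generated code goes through the C++ wrapper codegen this change fixes. Whether dispatch actually reaches _scaled_dot_product_efficient_attention depends on the PyTorch version, hardware, and input shapes/dtypes, so treat this as an illustrative assumption rather than a reproduction recipe.

import torch
import torch.nn.functional as F
import torch._inductor.config as inductor_config

# Route Inductor codegen through the C++ wrapper instead of the Python wrapper.
inductor_config.cpp_wrapper = True

def attn(q, k, v):
    # On CUDA this may dispatch to the memory-efficient attention kernel
    # (_scaled_dot_product_efficient_attention), depending on backends/inputs.
    return F.scaled_dot_product_attention(q, k, v)

compiled = torch.compile(attn)

if torch.cuda.is_available():
    q = torch.randn(2, 8, 128, 64, device="cuda", dtype=torch.float16)
    k = torch.randn_like(q)
    v = torch.randn_like(q)
    out = compiled(q, k, v)
    print(out.shape)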