Update fusion_attention to properly convert bfloat16 values #25404
Commits:
- ef171ab5  Update fusion_attention to properly convert bfloat16 values
- bee28270  Update onnxruntime/python/tools/transformers/fusion_attention.py
- 4df0d847  Update onnxruntime/python/tools/transformers/fusion_attention.py
- 67154890  Update fusion_base.py
- 7a324498  Fix
- fb479f2e  Update requirements.txt
- 566411a3  Use ir locally
- 216a59e0  Merge branch 'main' into justinchu/fix-fusion_attention
- 12957daf  update
- f55d45bc  format
- 2ba7a777  Update fusion_base.py
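The PR description is not shown here, but the title concerns converting bfloat16 values correctly in the attention fusion pass. As background, bfloat16 is a 16-bit format that keeps the upper half of a float32's bit pattern, so converting it by reinterpreting the bits as float16 (a different 16-bit format) would produce wrong values. Below is a minimal, hedged sketch of the standard bit-level bfloat16 <-> float32 mapping; this is illustrative only and is not taken from ONNX Runtime's actual implementation:

```python
import numpy as np

def bfloat16_bits_to_float32(bits: np.ndarray) -> np.ndarray:
    """Interpret uint16 bfloat16 bit patterns as float32 values.

    A bfloat16 value occupies the upper 16 bits of the equivalent
    float32, so widening to uint32 and shifting left by 16 recovers
    the exact float32 bit pattern.
    """
    return (bits.astype(np.uint32) << 16).view(np.float32)

def float32_to_bfloat16_bits(x: np.ndarray) -> np.ndarray:
    """Truncate float32 values to uint16 bfloat16 bit patterns.

    This drops the low 16 mantissa bits (round-toward-zero); real
    converters typically use round-to-nearest-even instead.
    """
    return (np.ascontiguousarray(x, dtype=np.float32).view(np.uint32) >> 16).astype(np.uint16)
```

For example, the bfloat16 bit pattern `0x3F80` maps to float32 `1.0`, since `0x3F80 << 16 == 0x3F800000`, the float32 encoding of 1.0.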
justinchuby deleted the justinchu/fix-fusion_attention branch 264 days ago.