openvino
df7c5c31 - MoE adoption fixes (#34186)

### Details:
Some Mixture of Experts patterns from frameworks may contain MatMuls with `transpose_b=false` plus an explicit Transpose operation on the weights after conversion to OV IR. These Transposes must be fused into the MatMul (rather than constant-folded) to match plugins' expectations. In this PR, `TransposeMatMul` is registered in the MOC and pre/post-processing pipelines before the first `ConstantFolding` registration.

### Tickets:
- *CVS-181449*
- *CVS-179204*
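The fusion relies on a simple algebraic identity: multiplying by an explicitly transposed weight matrix is equivalent to a MatMul whose `transpose_b` attribute is set, so the standalone Transpose node can be absorbed into the MatMul. A minimal pure-Python sketch of that equivalence (the `matmul` and `transpose` helpers are hypothetical stand-ins, not OpenVINO APIs):

```python
def transpose(m):
    """Transpose a matrix given as a list of rows."""
    return [list(row) for row in zip(*m)]

def matmul(a, b, transpose_b=False):
    """Naive matrix multiply; transpose_b mimics MatMul's attribute."""
    if transpose_b:
        b = transpose(b)
    # For each row of a, dot it with each column of b.
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

A = [[1, 2], [3, 4]]
W = [[5, 6], [7, 8]]

# MatMul(A, Transpose(W)) == MatMul(A, W, transpose_b=True)
fused = matmul(A, W, transpose_b=True)
explicit = matmul(A, transpose(W))
assert fused == explicit  # → [[17, 23], [39, 53]]
```

Because the results are identical, fusing the Transpose into the MatMul's `transpose_b` attribute preserves semantics while producing the single-node pattern the plugins expect; constant folding the Transpose instead would materialize a new weight tensor and hide the pattern.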