Fix SkipLayerNorm fusion incorrectly applied when gamma/beta are not 1D (#27459)
### Description
The `SkipLayerNormFusion` optimizer now skips fusion when the
`LayerNormalization` gamma or beta inputs are not 1D tensors (e.g. shape
`[1, 1, hidden_size]`). The `SkipLayerNormalization` kernel strictly
requires 1D gamma/beta, so fusing without this check caused a hard
runtime error.
- **`skip_layer_norm_fusion.cc`**: After matching the Add + LayerNorm
pattern, check that gamma (and beta, if present) has exactly one
dimension before proceeding with fusion; see the guard sketch after this
list. If shape info is unavailable (dynamic shapes), fusion is still
allowed and the kernel's runtime validation takes over.
- **`graph_transform_test_layernorm.cc`**: Added a
`SkipLayerNormFusion_3DGamma_NoFusion` test that builds a graph with
`Add + LayerNormalization` where gamma/beta have shape `[1, 1, 4]` and
asserts that no `SkipLayerNormalization` node is created; a sketch of
the test's shape also follows this list.
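A minimal sketch of the added guard, with helper and variable names
(`IsOneDimensional`, `ln_node`) that are illustrative rather than the exact
ones in `skip_layer_norm_fusion.cc`:

```cpp
// Sketch only: for LayerNormalization, gamma is input 1 and beta is the
// optional input 2. Helper name is hypothetical.
static bool IsOneDimensional(const onnxruntime::NodeArg* node_arg) {
  const auto* shape = node_arg->Shape();
  // No shape info (dynamic): allow fusion; the SkipLayerNormalization
  // kernel still validates gamma/beta at runtime.
  if (shape == nullptr) return true;
  return shape->dim_size() == 1;
}

// Inside the fusion loop, after matching Add + LayerNormalization:
const auto& ln_inputs = ln_node.InputDefs();
if (!IsOneDimensional(ln_inputs[1])) continue;  // gamma must be 1D
if (ln_inputs.size() > 2 && ln_inputs[2]->Exists() &&
    !IsOneDimensional(ln_inputs[2])) continue;  // beta, if present, must be 1D
```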
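And a sketch of the shape of the new test, assuming the
`ModelTestBuilder`/`TransformerTester` utilities from
`graph_transform_test_builder.h` (the actual test body may differ):

```cpp
TEST_F(GraphTransformationTests, SkipLayerNormFusion_3DGamma_NoFusion) {
  auto build_test_case = [](ModelTestBuilder& builder) {
    auto* input = builder.MakeInput<float>({2, 3, 4}, -1.0f, 1.0f);
    auto* skip = builder.MakeInput<float>({2, 3, 4}, -1.0f, 1.0f);
    // 3D gamma/beta ([1, 1, 4]) must block the fusion.
    auto* gamma = builder.MakeInitializer<float>({1, 1, 4}, -1.0f, 1.0f);
    auto* beta = builder.MakeInitializer<float>({1, 1, 4}, -1.0f, 1.0f);
    auto* add_out = builder.MakeIntermediate();
    builder.AddNode("Add", {input, skip}, {add_out});
    builder.AddNode("LayerNormalization", {add_out, gamma, beta},
                    {builder.MakeOutput()});
  };
  auto check_graph = [](InferenceSessionWrapper& session) {
    auto op_to_count = CountOpsInGraph(session.GetGraph());
    // Fusion must not fire; Add + LayerNormalization stay separate.
    ASSERT_EQ(op_to_count["com.microsoft.SkipLayerNormalization"], 0);
    ASSERT_EQ(op_to_count["Add"], 1);
  };
  TransformerTester(build_test_case, check_graph,
                    TransformerLevel::Level1, TransformerLevel::Level2);
}
```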
### Motivation and Context
Models with residual connections followed by `LayerNormalization` where
the scale/bias tensors carry extra batch/sequence dimensions (e.g.
exported as `[1, 1, hidden_size]` rather than `[hidden_size]`) would
trigger fusion and then fail at runtime:
```
Non-zero status code returned while running SkipLayerNormalization node.
Status Message: gamma is expected to have 1 dimension, got 3
```
The error appeared only with 3D inputs and disappeared at the
`ORT_ENABLE_BASIC` optimization level (which disables this fusion),
confirming the optimizer as the source of the regression.
---------
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: tianleiwu <30328909+tianleiwu@users.noreply.github.com>