[GPU] Add canonicalization for attention mask shape with rank less than 4 (#31096)
### Details:
- Add a canonicalization check for the attention mask shape. Previously,
using an attention mask with rank less than 4 could lead to kernel
compilation errors and incorrect indexing, even though the SDPA
specification allows the mask to be at least 2-dimensional, or empty if
it is to be ignored
- Related transformation PRs that squeeze the attention mask in some cases:
https://github.com/openvinotoolkit/openvino/pull/31089 and
https://github.com/openvinotoolkit/openvino/pull/30659
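
The idea behind the canonicalization can be sketched as follows: a mask of rank 2 or 3 is padded with leading size-1 dimensions up to rank 4, so that downstream kernel indexing always sees a rank-4 shape, while an empty mask (meaning "ignore") is left untouched. This is an illustrative sketch, not the actual plugin code; the function name and the exact padding policy are assumptions for demonstration.

```python
def canonicalize_mask_shape(shape: list[int], target_rank: int = 4) -> list[int]:
    """Illustrative sketch: pad an attention-mask shape with leading 1s
    up to target_rank, mirroring broadcast semantics.

    An empty shape means the mask is to be ignored, so it is returned as-is.
    Per the SDPA specification, a non-empty mask must be at least 2-D.
    """
    if not shape:  # empty mask: nothing to canonicalize
        return shape
    if len(shape) < 2:
        raise ValueError("SDPA attention mask must be at least 2-dimensional")
    missing = target_rank - len(shape)
    # Prepend size-1 dims; broadcasting makes this shape-equivalent
    return [1] * max(missing, 0) + list(shape)
```

For example, a rank-2 mask of shape `[32, 32]` becomes `[1, 1, 32, 32]`, so kernels that assume 4-D indexing no longer mis-index or fail to compile.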
### Tickets:
- [CVS-165700](https://jira.devtools.intel.com/browse/CVS-165700)
- [CVS-169359](https://jira.devtools.intel.com/browse/CVS-169359)