[QNN EP] Enable option to set QNN context priority (#18315)
### Description
Enable option qnn_context_priority to set QNN context priority, options:
"low", "normal", "normal_high", "high".
This feature allows an inference session with a higher priority to be scheduled ahead of lower-priority sessions sharing the NPU. Tested with the onnxruntime_perf_test tool using the same model.
1. Run the model on the NPU as a single instance: latency is 300 ms.
2. Run the same model on the NPU with 2 instances at the same time:
   - Case 1: both with the same priority (high) -- latency is 600 ms.
   - Case 2: one with low priority -- latency is 30,000 ms; one with high priority -- latency is 300 ms.
   - Case 3: one with normal priority -- latency is 15,000 ms; one with high priority -- latency is 300 ms.
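As a sketch of how the option could be passed from the Python API: the `qnn_context_priority` key and its four values come from this change, while the helper function and the model path below are illustrative assumptions (an actual session additionally requires a QNN-enabled onnxruntime build and a supported NPU).

```python
# Allowed values for qnn_context_priority, per this change.
VALID_PRIORITIES = ("low", "normal", "normal_high", "high")

def make_qnn_provider_options(priority: str) -> dict:
    """Build a provider-options dict for QNNExecutionProvider.

    Hypothetical helper for illustration; validates the priority
    string before handing it to the execution provider.
    """
    if priority not in VALID_PRIORITIES:
        raise ValueError(f"unsupported qnn_context_priority: {priority!r}")
    return {"qnn_context_priority": priority}

# Usage sketch (requires a QNN-enabled build; "model.onnx" is a placeholder):
# import onnxruntime as ort
# session = ort.InferenceSession(
#     "model.onnx",
#     providers=[("QNNExecutionProvider", make_qnn_provider_options("high"))],
# )
```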