[LoRA] fix `cross_attention_kwargs` problems and tighten tests (#7388)
* add debugging prints to inspect the output numbers
* restrict the test tolerances (see the test sketch after this list)
* increase the number of inference steps for more stable outputs
* take a shallow copy of `cross_attention_kwargs` before popping keys so the caller's dict is not mutated (see the sketch below)
* remove debug prints
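
A minimal sketch of what the tightened test might look like. The pipeline fixture, prompt, slice indexing, and expected values below are hypothetical placeholders rather than the actual test from this PR; the point is the pairing of more inference steps with a tighter `atol`:

```python
import numpy as np

def test_lora_output_slice(pipeline):
    # More inference steps make the output more stable, which in turn
    # lets the comparison use a tighter tolerance.
    image = pipeline(
        "a prompt",
        num_inference_steps=30,  # increased for stability (placeholder value)
        output_type="np",
    ).images[0]
    expected_slice = np.array([0.1, 0.2, 0.3])  # placeholder values
    assert np.allclose(image[-1, -3:, 0], expected_slice, atol=1e-4)  # restricted tolerance
```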
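
The core fix, sketched below: when a pipeline pops the LoRA `scale` key out of `cross_attention_kwargs`, it should first take a shallow copy so the dict the caller passed in is left intact across calls. The function and variable names here are illustrative, not the exact diffusers code:

```python
def prepare_attention_kwargs(cross_attention_kwargs=None):
    # Shallow-copy before popping: otherwise the caller's dict loses its
    # "scale" key after the first call, silently changing behavior on reuse.
    cross_attention_kwargs = (
        {} if cross_attention_kwargs is None else cross_attention_kwargs.copy()
    )
    lora_scale = cross_attention_kwargs.pop("scale", 1.0)
    return lora_scale, cross_attention_kwargs


kwargs = {"scale": 0.5}
prepare_attention_kwargs(kwargs)
assert "scale" in kwargs  # the caller's dict is unchanged
```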