[LoRA] Remove the use of deprecated LoRA functionalities such as `LoRAAttnProcessor` (#6369)
* start deprecating `LoRAAttnProcessor`.
* fix
* wrap into unet_lora_state_dict
* utilize text_encoder_lora_params
* utilize text_encoder_attn_modules
* debug
* remove print
* don't use text encoder for test_stable_diffusion_lora
* load the procs.
* set_default_attn_processor
* fix: set_default_attn_processor call.
* fix: lora_components[unet_lora_params]
* checking for 3d.
* 3d.
* more fixes.
* debug
* more debug
* hack.
* remove comments and prep for a PR.
* appropriate set_lora_weights()
* fix
* fix: test_unload_lora_sd
* use default attention processors.
* debug
* debug nan
* use NaN instead of inf
* remove comments.
* fix: test_text_encoder_lora_state_dict_unchanged
* default attention processors.
* style