[LoRA] feat: add LoRA attention processor for PT 2.0. (#3594)
* feat: add LoRA attention processor for PT 2.0.
* explicit context manager for SDPA.
* switch to flash attention.
* make shapes compatible so they work optimally with SDPA (see the attention sketch after this list).
* fix: circular import problem.
* explicitly specify the flash attention kernel in SDPA.
* fall back to the efficient-attention context manager.
* remove explicit dispatch.
* fix: removed processor.
* fix: remove Optional from type annotation.
* feat: make changes regarding LoRAAttnProcessor2_0 (usage sketch after this list).
* remove confusing warning.
* formatting.
* relax tolerance for PT 2.0.
* fix: loading message.
* remove unnecessary logging.
* add: entry to the docs.
* add: network_alpha argument.
* relax tolerance.
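
For context on the SDPA-related commits above: PyTorch 2.0's `F.scaled_dot_product_attention` expects tensors shaped `(batch, heads, seq_len, head_dim)`, which is what the shape-compatibility change refers to, and the explicit flash/efficient-attention dispatch was ultimately removed so PyTorch picks the kernel itself. Below is a minimal sketch of that attention core, assuming illustrative layer names (`to_q`, `q_lora`, etc.) and the `network_alpha / rank` rescaling; it is not the exact processor code.

```python
import torch.nn.functional as F
from torch import nn


class LoRALinearSketch(nn.Module):
    """Minimal low-rank adapter: up(down(x)), rescaled by network_alpha / rank
    when network_alpha is given (names follow the PR; values are illustrative)."""

    def __init__(self, in_features, out_features, rank=4, network_alpha=None):
        super().__init__()
        self.down = nn.Linear(in_features, rank, bias=False)
        self.up = nn.Linear(rank, out_features, bias=False)
        self.scale = network_alpha / rank if network_alpha is not None else 1.0

    def forward(self, x):
        return self.up(self.down(x)) * self.scale


def lora_sdpa(hidden_states, to_q, to_k, to_v, q_lora, k_lora, v_lora,
              heads=8, lora_scale=1.0):
    """Attention core in the spirit of the PT 2.0 LoRA processor (a sketch,
    not the exact diffusers code). Base projections stay frozen, LoRA deltas
    are added on top, and tensors are reshaped to (batch, heads, seq, head_dim)
    so F.scaled_dot_product_attention can choose the best kernel itself."""
    query = to_q(hidden_states) + lora_scale * q_lora(hidden_states)
    key = to_k(hidden_states) + lora_scale * k_lora(hidden_states)
    value = to_v(hidden_states) + lora_scale * v_lora(hidden_states)

    batch, seq_len, inner_dim = query.shape
    head_dim = inner_dim // heads

    def split_heads(t):
        # (batch, seq, inner_dim) -> (batch, heads, seq, head_dim)
        return t.view(batch, -1, heads, head_dim).transpose(1, 2)

    out = F.scaled_dot_product_attention(
        split_heads(query), split_heads(key), split_heads(value),
        dropout_p=0.0, is_causal=False,
    )
    # merge heads back to (batch, seq, inner_dim)
    return out.transpose(1, 2).reshape(batch, -1, inner_dim)
```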
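
And a hedged usage sketch for the new processor, mirroring the setup pattern from the diffusers LoRA training examples. The constructor arguments (`hidden_size`, `cross_attention_dim`, `rank`, `network_alpha`) follow this PR, but the checkpoint id and the rank/alpha values are illustrative; check the docs entry added here for the exact signature.

```python
import torch
from diffusers import StableDiffusionPipeline
from diffusers.models.attention_processor import LoRAAttnProcessor2_0

# illustrative checkpoint; any SD UNet is set up the same way
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

lora_attn_procs = {}
for name in pipe.unet.attn_processors.keys():
    # self-attention layers (attn1) have no cross-attention dim
    cross_attention_dim = (
        None if name.endswith("attn1.processor")
        else pipe.unet.config.cross_attention_dim
    )
    if name.startswith("mid_block"):
        hidden_size = pipe.unet.config.block_out_channels[-1]
    elif name.startswith("up_blocks"):
        block_id = int(name[len("up_blocks.")])
        hidden_size = list(reversed(pipe.unet.config.block_out_channels))[block_id]
    else:  # down_blocks
        block_id = int(name[len("down_blocks.")])
        hidden_size = pipe.unet.config.block_out_channels[block_id]

    lora_attn_procs[name] = LoRAAttnProcessor2_0(
        hidden_size=hidden_size,
        cross_attention_dim=cross_attention_dim,
        rank=4,            # illustrative rank
        network_alpha=4,   # the argument added at the end of this PR
    )

pipe.unet.set_attn_processor(lora_attn_procs)
```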