[LoRA] use the PyTorch classes wherever needed and start deprecation cycles (#7204)
* fix PyTorch classes and start deprecation cycles.
* remove args crafting for accommodating scale.
* remove scale check in feedforward.
* assert against nn.Linear and not CompatibleLinear.
* remove conv_cls and linear_cls.
* remove scale
* 👋 scale.
* fix: unet_2d_condition
* fix attention.py
* fix: attention.py again
* fix: unet_2d_blocks.
* fix-copies.
* more fixes.
* fix: resnet.py
* more fixes
* fix i2vgenxl unet.
* deprecate scale gently.
* fix-copies
* Apply suggestions from code review
Co-authored-by: YiYi Xu <yixu310@gmail.com>
* quality
* throw warning when scale is passed to the BasicTransformerBlock class (sketched below).
* remove scale from signature.
* cross_attention_kwargs, very nice catch by Yiyi
* fix: logger.warn
* make deprecation message clearer.
* address final comments.
* maintain the same deprecation message and also add it to activations.
* address yiyi
* fix copies
* Apply suggestions from code review
Co-authored-by: YiYi Xu <yixu310@gmail.com>
* more deprecation
* fix-copies
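
For context, a minimal sketch of the deprecation pattern these commits converge on, assuming a simplified stand-in for `BasicTransformerBlock` (the class name, signature, and warning text here are illustrative, not the exact diffusers code):

```python
import warnings
from typing import Optional

import torch
import torch.nn as nn


class BasicTransformerBlockSketch(nn.Module):
    """Illustrative stand-in showing how a deprecated `scale` kwarg is caught."""

    def forward(
        self,
        hidden_states: torch.Tensor,
        cross_attention_kwargs: Optional[dict] = None,
    ) -> torch.Tensor:
        if cross_attention_kwargs is not None and cross_attention_kwargs.get("scale") is not None:
            # `scale` used to drive LoRA scaling in the LoRACompatible* layers;
            # with plain nn.Linear / nn.Conv2d it no longer has any effect.
            warnings.warn(
                "Passing `scale` via `cross_attention_kwargs` is deprecated and will be ignored.",
                FutureWarning,
            )
        return hidden_states
```

Warning rather than raising keeps existing pipelines that still pass `scale` working through the deprecation cycle.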
---------
Co-authored-by: YiYi Xu <yixu310@gmail.com>