transformers
255c62a1 - Allow Attention and Experts to be used as standalone modules (#43622)

Committed 106 days ago
Allow Attention and Experts to be used as standalone modules (#43622)

* default
* fix all
* fix
* add warning
* strict check
* fix
* add test
* fix
* improve test