transformers
255c62a1
- Allow Attention and Experts to be used as standalone modules (#43622)
106 days ago
Allow Attention and Experts to be used as standalone modules (#43622)

* default
* fix all
* fix
* add warning
* strict check
* fix
* add test
* fix
* improve test
References
#43622 - Allow Attention and Experts to be used as standalone modules
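To show what "using Attention as a standalone module" means in substance, here is a minimal, self-contained sketch of the scaled dot-product attention that such a module computes, written in plain Python over nested lists. This is an illustrative stand-in, not the `transformers` API from this commit: the function name `attention` and its list-based signature are assumptions for the example, whereas the real library modules operate on tensors and take a model config.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(q, k, v):
    """Scaled dot-product attention over plain lists (illustrative sketch,
    not the transformers API). q has shape (n, d), k and v have shape (m, d):
    each query row attends over all key rows and returns a weighted
    average of the value rows."""
    d = len(q[0])
    out = []
    for qi in q:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(a * b for a, b in zip(qi, kj)) / math.sqrt(d) for kj in k]
        weights = softmax(scores)
        # Weighted average of the value rows.
        out.append([sum(w * vj[c] for w, vj in zip(weights, v))
                    for c in range(len(v[0]))])
    return out

# With identical (zero) keys the attention weights are uniform,
# so the output is the plain mean of the value rows.
print(attention([[1.0, 0.0]], [[0.0, 0.0], [0.0, 0.0]], [[2.0, 0.0], [0.0, 2.0]]))
```

The point of the commit is that this kind of sub-block, previously usable only inside a full model, can now be instantiated and called on its own.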
Author
Cyrilvallez
Parents
11b9b0f5