transformers
cd9f9d63
- [gpt-neox] Add attention_bias config to support model trained without attention biases (#28126)
Commit
2 years ago
[gpt-neox] Add attention_bias config to support model trained without attention biases (#28126)

* add attention_bias hparam for a model trained without attention biases
* fix argument documentation error
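A minimal sketch of how the new hparam might be used, assuming the `attention_bias` argument is exposed on `GPTNeoXConfig` as the commit title describes and that the attention submodule attribute names (`query_key_value`, `dense`) match the current GPT-NeoX implementation:

```python
from transformers import GPTNeoXConfig, GPTNeoXForCausalLM

# Configure a small GPT-NeoX model whose attention projections were trained
# without bias terms; attention_bias is assumed to default to True so that
# existing checkpoints keep their previous behavior.
config = GPTNeoXConfig(
    hidden_size=512,
    num_hidden_layers=4,
    num_attention_heads=8,
    attention_bias=False,  # hparam introduced by this commit
)

model = GPTNeoXForCausalLM(config)

# With attention_bias=False the QKV and output projections should be
# bias-free linear layers (their .bias is None).
attn = model.gpt_neox.layers[0].attention
print(attn.query_key_value.bias)  # expected: None
print(attn.dense.bias)            # expected: None
```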
References
#28126 - [gpt-neox] Add attention_bias config to support model trained without attention biases
Author
dalgarak
Parents
def581ef