transformers
cd9f9d63 - [gpt-neox] Add attention_bias config to support model trained without attention biases (#28126)

[gpt-neox] Add attention_bias config to support model trained without attention biases (#28126)

* add attention_bias hparam for a model trained without attention biases
* fix argument documentation error
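A minimal usage sketch of the option this commit introduces: with `attention_bias=False` in `GPTNeoXConfig`, the attention projection layers are built without bias terms, matching checkpoints that were trained that way. The attribute path in the assertion reflects the GPT-NeoX module layout in transformers.

```python
from transformers import GPTNeoXConfig, GPTNeoXForCausalLM

# attention_bias=False constructs the QKV and output projections
# without bias parameters (the default remains True).
config = GPTNeoXConfig(attention_bias=False)
model = GPTNeoXForCausalLM(config)

# The fused query/key/value projection now carries no bias tensor.
assert model.gpt_neox.layers[0].attention.query_key_value.bias is None
```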