transformers
[gpt-neox] Add attention_bias config to support model trained without attention biases #28126 (Merged)


dalgarak: add attention_bias hparam for a model trained without attention biases (1aa10b65)
younesbelkada commented on 2023-12-19
ArthurZucker approved these changes on 2023-12-20
dalgarak: fix argument documentation error (589ab059)
younesbelkada approved these changes on 2023-12-20
ArthurZucker merged cd9f9d63 into main 2 years ago
dalgarak deleted the gpt_neox_no_attn_biases branch 2 years ago
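
The change this PR introduces can be sketched as follows: the GPT-NeoX config gains an `attention_bias` flag (defaulting to `True` for backward compatibility with existing checkpoints), and the attention projection layers forward it to their linear layers instead of unconditionally creating bias terms. Below is a minimal, dependency-free sketch of that pattern; `Linear` here is a toy stand-in for `torch.nn.Linear`, and the class bodies are illustrative, not the real `modeling_gpt_neox.py` code.

```python
from dataclasses import dataclass


@dataclass
class GPTNeoXConfig:
    """Toy config mirroring the new hparam from this PR."""
    hidden_size: int = 64
    attention_bias: bool = True  # set False for models trained without attention biases


class Linear:
    """Stand-in for torch.nn.Linear that records whether a bias was created."""
    def __init__(self, in_features: int, out_features: int, bias: bool = True):
        self.in_features = in_features
        self.out_features = out_features
        self.has_bias = bias


class GPTNeoXAttention:
    def __init__(self, config: GPTNeoXConfig):
        # The projections honor config.attention_bias rather than
        # hard-coding bias=True.
        self.query_key_value = Linear(
            config.hidden_size, 3 * config.hidden_size, bias=config.attention_bias
        )
        self.dense = Linear(
            config.hidden_size, config.hidden_size, bias=config.attention_bias
        )


attn = GPTNeoXAttention(GPTNeoXConfig(attention_bias=False))
print(attn.query_key_value.has_bias, attn.dense.has_bias)  # → False False
```

With the actual library, this corresponds to constructing the model from a config with `attention_bias=False` when loading a checkpoint trained without attention biases.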
