transformers
d55fcbcc
- fix default num_attention_heads in segformer doc (#16612)
Commit
3 years ago
fix default num_attention_heads in segformer doc (#16612)
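The commit only corrects the documented default for SegFormer's attention heads, so the quickest way to verify what the docstring should report is to read the value off the config class itself. A minimal sketch, assuming the standard `transformers` `SegformerConfig` API (the printed value comes from the installed library, not from this commit page):

```python
# Inspect the per-stage attention-head default that the SegFormer
# docstring is meant to describe; no arguments are passed, so the
# library's own defaults are used.
from transformers import SegformerConfig

config = SegformerConfig()
print(config.num_attention_heads)  # head count for each encoder stage
```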
References
#16612 - fix default num_attention_heads in segformer doc
Author
JunMa11
Parents
b18dfd95