transformers
1650e0e5
- Fixed typo in Llama configuration docstring (#35520)
Commit
344 days ago
Fixed typo in Llama configuration docstring (#35520)

Update configuration_llama.py: there is no `num_heads` parameter, only `num_attention_heads`.
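
For reference, a minimal sketch of constructing a `LlamaConfig` with the correctly named parameter; the specific sizes below are illustrative and not taken from the commit:

```python
from transformers import LlamaConfig

# LlamaConfig exposes `num_attention_heads`; there is no `num_heads` argument.
config = LlamaConfig(
    hidden_size=2048,        # illustrative value
    num_hidden_layers=16,    # illustrative value
    num_attention_heads=16,  # parameter name the docstring now refers to
)

print(config.num_attention_heads)  # 16
```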
References
#35520 - Fixed typo in Llama configuration docstring
Author
sudarshan-mukund
Parents
3b1be043