transformers
1650e0e5 - Fixed typo in Llama configuration docstring (#35520)

Committed 344 days ago
Fixed typo in Llama configuration docstring (#35520)

Update configuration_llama.py: there is no `num_heads` parameter, only `num_attention_heads`.
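The fix can be sketched as below. The surrounding docstring wording and defaults are illustrative, not the exact `configuration_llama.py` text; only the parameter name `num_attention_heads` comes from the commit.

```python
# Sketch of the docstring fix this commit describes: the config class
# exposes `num_attention_heads`, so the docstring must not mention a
# nonexistent `num_heads` parameter. Wording here is assumed.

class LlamaConfig:
    """Configuration for a Llama model.

    Args:
        num_attention_heads (`int`, *optional*, defaults to 32):
            Number of attention heads for each attention layer.
            (Before the fix, the docstring called this `num_heads`,
            a parameter that does not exist on the class.)
    """

    def __init__(self, num_attention_heads: int = 32):
        self.num_attention_heads = num_attention_heads
```

The point of the commit is simply that the docstring must match the `__init__` signature, as above.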