transformers #29557 (Merged)
Support `num_attention_heads` != `num_key_value_heads` in Flax Llama Implementation
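When `num_key_value_heads` is smaller than `num_attention_heads` (grouped-query attention), each key/value head is shared by a group of query heads. One common way to support this is to repeat the key/value heads before the attention product. The sketch below illustrates that idea in JAX; the helper name `repeat_kv` and its exact shapes are assumptions for illustration, not necessarily the PR's implementation.

```python
import jax.numpy as jnp


def repeat_kv(hidden: jnp.ndarray, n_rep: int) -> jnp.ndarray:
    """Repeat key/value heads so each group of query heads has a matching
    key/value head (grouped-query attention).

    Shapes (assumed layout for this sketch):
        (batch, seq_len, num_kv_heads, head_dim)
        -> (batch, seq_len, num_kv_heads * n_rep, head_dim)
    """
    batch, seq_len, num_kv_heads, head_dim = hidden.shape
    if n_rep == 1:
        # num_attention_heads == num_key_value_heads: nothing to do
        return hidden
    # Insert a repeat axis, broadcast along it, then fold it into the head axis
    hidden = hidden[:, :, :, None, :]
    hidden = jnp.broadcast_to(
        hidden, (batch, seq_len, num_kv_heads, n_rep, head_dim)
    )
    return hidden.reshape(batch, seq_len, num_kv_heads * n_rep, head_dim)
```

With this layout, output heads `i * n_rep` through `i * n_rep + n_rep - 1` are all copies of key/value head `i`, so the attention computation can proceed as if `num_attention_heads == num_key_value_heads`.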
Commits: 5