llama.cpp
78eb487b
Commit
351 days ago
llama : fix qs.n_attention_wv for DeepSeek-V2 (#9156)
References
#9156 - llama : fix qs.n_attention_wv for DeepSeek-V2
Author
compilade
Parents
a77feb5d
Files
1
src/llama.cpp