llama.cpp
78eb487b - llama : fix qs.n_attention_wv for DeepSeek-V2 (#9156)
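
The commit title refers to `qs.n_attention_wv`, a counter kept during llama.cpp quantization that tallies attention V-projection tensors. A minimal, self-contained sketch of that counting idea follows, assuming (not taken from the actual patch) that DeepSeek-V2's MLA attention stores its value projection in a fused tensor named like `attn_kv_b.weight` rather than a plain `attn_v.weight`, so a name match that only looks for the latter under-counts. The `quantize_state` struct and the tensor names below are illustrative stand-ins, not llama.cpp's real definitions.

```cpp
// Sketch only: counting attention V-projection tensors by name during
// quantization, including an MLA-style fused KV tensor. Not the actual
// llama.cpp code; names are hypothetical.
#include <cstdio>
#include <string>
#include <vector>

// Hypothetical stand-in for the quantization bookkeeping struct.
struct quantize_state {
    int n_attention_wv = 0;
};

static bool is_attention_v_tensor(const std::string & name) {
    // Count classic V projections as well as a fused KV projection
    // of the kind used by MLA-style models such as DeepSeek-V2.
    return name.find("attn_v.weight")    != std::string::npos ||
           name.find("attn_kv_b.weight") != std::string::npos;
}

int main() {
    // Illustrative tensor names: one layer of each style plus an
    // unrelated tensor that must not be counted.
    const std::vector<std::string> tensor_names = {
        "blk.0.attn_v.weight",     // standard transformer layer
        "blk.1.attn_kv_b.weight",  // MLA-style (DeepSeek-V2-like) layer
        "blk.1.ffn_gate.weight",   // unrelated tensor, not counted
    };

    quantize_state qs;
    for (const auto & name : tensor_names) {
        if (is_attention_v_tensor(name)) {
            qs.n_attention_wv++;
        }
    }

    // With the fused-KV pattern included, both attention layers are counted.
    std::printf("n_attention_wv = %d\n", qs.n_attention_wv);
    return 0;
}
```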
