llama.cpp
llama : fix qs.n_attention_wv for DeepSeek-V2 (#9156)
Status: Merged
ggerganov merged 1 commit into master from compilade/fix-deepseek-n_wv

Commits:
f5f4cdef  llama : fix qs.n_attention_wv for DeepSeek-V2
compilade added the bugfix label
compilade added the Review Complexity : Low label
ggerganov approved these changes on 2024-08-26
ggerganov merged 78eb487b into master (1 year ago)
ggerganov deleted the compilade/fix-deepseek-n_wv branch (1 year ago)
Reviewers: ggerganov
Assignees: no one assigned
Labels: bugfix, Review Complexity : Low
Milestone: no milestone