llama.cpp
CUDA: fix unused warning in mmq.cu #7442 (Merged)
ggerganov merged 1 commit into ggml-org:master from JohannesGaessler:cuda-fix-mmq-warning

Commits (1):
- 219f39b0 CUDA: fix unused warning in mmq.cu

Timeline:
- ggerganov approved these changes on 2024-05-21
- mofosyne added the Nvidia GPU, Review Complexity : Low, and merge ready labels
- cebtenzzre approved these changes on 2024-05-21
- ggerganov merged fcf6538b into master (1 year ago)
- github-actions added the ggml label
Reviewers: ggerganov, cebtenzzre
Assignees: none
Labels: Nvidia GPU, Review Complexity : Low, ggml, merge ready
Milestone: none