llama.cpp
Vulkan: Fix mmq int dot float cache size
#12722
Merged
0cc4m merged 1 commit into master from 0cc4m/vulkan-mmq-dp4a-fix
Vulkan: Fix mmq int dot float cache size (0c67672c)
0cc4m requested a review from jeffbolznv 331 days ago
jeffbolznv approved these changes on 2025-04-02
github-actions added the Vulkan and ggml labels
0cc4m merged 92e3006b into master 331 days ago
0cc4m deleted the 0cc4m/vulkan-mmq-dp4a-fix branch 331 days ago
Reviewers: jeffbolznv
Assignees: no one assigned
Labels: Vulkan, ggml
Milestone: no milestone