llama.cpp
Vulkan: Fix mmq int dot float cache size #12722

Merged: 0cc4m merged 1 commit into master from 0cc4m/vulkan-mmq-dp4a-fix
Commit 0c67672c: Vulkan: Fix mmq int dot float cache size
0cc4m requested a review from jeffbolznv 331 days ago
jeffbolznv approved these changes on 2025-04-02
github-actions added the Vulkan and ggml labels
0cc4m merged 92e3006b into master 331 days ago
0cc4m deleted the 0cc4m/vulkan-mmq-dp4a-fix branch 331 days ago
