llama.cpp
metal : fix flash attention kernel requirements
#7169
Merged


ggerganov merged 2 commits into master from gg/metal-fattn-reqs
Commits
- metal : fix flash attention kernel requirements (ggerganov, 1174def5)
- metal : fix ggml_metal_supports_op (ggerganov, fecb81e3)
mofosyne added the bugfix, Review Complexity : High, and need feedback labels
mofosyne self-assigned this 1 year ago
mofosyne requested their own review 1 year ago
mofosyne approved these changes on 2024-05-10
ggerganov merged commit 18e43766 into master 1 year ago
