llama.cpp
metal : fix flash attention kernel requirements #7169 (Merged)
ggerganov merged 2 commits into master from gg/metal-fattn-reqs
Commits:
- 1174def5 metal : fix flash attention kernel requirements
- fecb81e3 metal : fix ggml_metal_supports_op
Timeline:
- mofosyne added labels: bugfix, Review Complexity : High, need feedback
- mofosyne assigned mofosyne (1 year ago)
- mofosyne requested a review from mofosyne (1 year ago)
- mofosyne approved these changes on 2024-05-10
- ggerganov merged 18e43766 into master (1 year ago)
Reviewers: mofosyne
Assignees: mofosyne
Labels: need feedback, bugfix, Review Complexity : High
Milestone: none