llama.cpp
llama : enable flash attn automatically when supported (WIP)
#10101
Open
slaren wants to merge 1 commit into master from sl/auto-flash-attn
Commit afc4a7de: llama : enable flash attn automatically when supported