llama.cpp
llama : enable flash attn automatically when supported (WIP) #10101
Open

slaren wants to merge 1 commit into master from sl/auto-flash-attn
Commit afc4a7de by slaren: llama : enable flash attn automatically when supported
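
The PR carries no description, but some context: flash attention in llama.cpp is currently an opt-in flag on the context, and this change would flip it on automatically when the backend supports it. Below is a minimal sketch of the current caller-side opt-in, assuming the `flash_attn` field of `llama_context_params` from llama.h; the automatic-detection logic itself is not shown here, since the PR does not describe it.

```cpp
// Sketch: enabling flash attention explicitly via the public llama.h API.
// After this PR, the intent is that the `flash_attn` flag would no longer
// need to be set by hand when the backend supports the flash attn op.
#include "llama.h"
#include <cstdio>

int main(int argc, char ** argv) {
    if (argc < 2) {
        fprintf(stderr, "usage: %s <model.gguf>\n", argv[0]);
        return 1;
    }

    llama_backend_init();

    llama_model_params mparams = llama_model_default_params();
    llama_model * model = llama_load_model_from_file(argv[1], mparams);
    if (!model) {
        fprintf(stderr, "failed to load model\n");
        return 1;
    }

    llama_context_params cparams = llama_context_default_params();
    cparams.flash_attn = true; // today: explicit opt-in; this PR would set it automatically

    llama_context * ctx = llama_new_context_with_model(model, cparams);
    if (!ctx) {
        fprintf(stderr, "failed to create context\n");
        llama_free_model(model);
        return 1;
    }

    // ... run inference ...

    llama_free(ctx);
    llama_free_model(model);
    llama_backend_free();
    return 0;
}
```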
