llama.cpp
18e43766 - metal : fix flash attention kernel requirements (#7169)

Committed 1 year ago
metal : fix flash attention kernel requirements (#7169)

* metal : fix flash attention kernel requirements

  ggml-ci

* metal : fix ggml_metal_supports_op

  ggml-ci