llama.cpp
07aaa0f6
- ggml : fix ggml_flash_attn to use op_params (#2387)
Commit
2 years ago
ggml : fix ggml_flash_attn to use op_params (#2387)

* ggml : fix ggml_flash_attn to use op_params
References
#2387 - ggml : fix ggml_flash_attn to use op_params
Author
slaren
Parents
fce48caf