llama.cpp
8551c44d
- context : always use non-causal attention for encoder graphs (#12447)
Commit
206 days ago
context : always use non-causal attention for encoder graphs (#12447)

* context : always use non-causal attention for encoder graphs

ggml-ci

* context : move the change to llama_context::encode()

ggml-ci
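As the commit message notes, the change forces non-causal attention whenever an encoder graph is built, and it lives in llama_context::encode(). The standalone C++ sketch below illustrates the general pattern only: temporarily override the causal-attention setting for the encoder pass and restore it afterwards. The names used here (context_params, build_graph, causal_attn) are illustrative assumptions, not the actual llama.cpp implementation.

    // Minimal sketch of the idea behind this commit: encoder graphs must
    // attend bidirectionally, so encode() overrides any causal setting and
    // restores it afterwards. All identifiers are hypothetical.
    #include <cstdio>

    struct context_params {
        bool causal_attn = true; // user-visible setting, honored by decode()
    };

    struct context {
        context_params cparams;

        void build_graph(bool causal) {
            std::printf("building graph with %s attention\n",
                        causal ? "causal" : "non-causal");
        }

        // Encoder graphs are always non-causal, regardless of cparams.
        void encode() {
            const bool causal_saved = cparams.causal_attn;
            cparams.causal_attn = false;        // force non-causal for the encoder
            build_graph(cparams.causal_attn);
            cparams.causal_attn = causal_saved; // restore the user setting
        }

        // Decoder graphs keep whatever the user configured.
        void decode() {
            build_graph(cparams.causal_attn);
        }
    };

    int main() {
        context ctx;
        ctx.encode(); // prints: building graph with non-causal attention
        ctx.decode(); // prints: building graph with causal attention
        return 0;
    }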
References
#12447 - context : always use non-causal attention for encoder graphs
Author
ggerganov
Parents
35cae5ba