Fix RWKV v6 model conversion #10913
Enable --no-context-shift for llama-perplexity example
ff3d2265
ggerganov approved these changes on 2024-12-20
RWKV 6: Fix error in ggml_cuda_op_bin_bcast
a20a94f5
ggerganov merged 0a11f8b7 into master 364 days ago