llama.cpp
Fix RWKV v6 model conversion #10913
Merged
MollySophia committed ff3d2265: Enable --no-context-shift for llama-perplexity example
github-actions added the python label
ggerganov approved these changes on 2024-12-20
MollySophia committed a20a94f5: RWKV 6: Fix error in ggml_cuda_op_bin_bcast (force-pushed 364 days ago)
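For context, `ggml_cuda_op_bin_bcast` (judging by its name) applies a binary elementwise op in which one operand may be broadcast across the other's shape. A minimal sketch of those semantics using NumPy-style broadcasting — purely illustrative, not the CUDA kernel touched by this PR:

```python
import numpy as np

def bin_bcast(op, a, b):
    # Hypothetical helper: apply a binary op where b's dimensions must each be
    # 1 or match a's, so b is repeated (broadcast) along the mismatched axes.
    return op(np.asarray(a), np.asarray(b))

# A (4, 3) tensor plus a (3,) tensor: the smaller operand is broadcast per row.
out = bin_bcast(np.add, np.ones((4, 3)), np.arange(3))
print(out.shape)  # (4, 3)
```

The names here (`bin_bcast`, `op`) are assumptions for illustration; the actual kernel dispatches per-element on the GPU rather than materializing the broadcast.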
ggerganov merged 0a11f8b7 into master 364 days ago
