llama : fix command-r inference when omitting outputs #6367
Commit 64b7d858: llama : fix command-r inference
Wuzzooy approved these changes on 2024-03-28
ggerganov approved these changes on 2024-03-28
ggerganov merged 0308f5e3 into master 1 year ago