llama.cpp
examples : evaluate tokens in batches after swapping context #1014
Merged
ggerganov merged 2 commits into ggml-org:master from grencez:batching
grencez force pushed several times 2 years ago
grencez changed the title Evaluate tokens in batches after swapping context → examples: Evaluate tokens in batches after swapping context 2 years ago
grencez changed the title examples: Evaluate tokens in batches after swapping context → examples : evaluate tokens in batches after swapping context 2 years ago
grencez marked this pull request as ready for review 2 years ago
grencez: examples : evaluate tokens in batches after swapping context (d1f02102)
grencez force pushed to d1f02102 2 years ago
ggerganov approved these changes on 2023-04-21
ggerganov: Update examples/main/main.cpp (80d1c166)
ggerganov merged 94112882 into master 2 years ago
grencez deleted the batching branch 2 years ago
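
The idea named in the PR title is the kind of loop sketched below: after the context swap drops the oldest tokens, the tokens kept for re-evaluation are fed back to the model in chunks of `n_batch` rather than one at a time. This is a minimal sketch assuming the `llama_eval()` API that `examples/main/main.cpp` used at the time; the names `embd`, `n_past`, `n_batch`, and `n_threads` follow that example's conventions, and this is not the PR's exact diff.

```cpp
#include "llama.h"

#include <cstdio>
#include <vector>

// Evaluate pending tokens n_batch at a time instead of all at once, so a
// large re-evaluation after the context swap stays within the batch size.
static bool eval_in_batches(llama_context * ctx,
                            const std::vector<llama_token> & embd,
                            int & n_past,
                            int n_batch,
                            int n_threads) {
    for (int i = 0; i < (int) embd.size(); i += n_batch) {
        int n_eval = (int) embd.size() - i;
        if (n_eval > n_batch) {
            n_eval = n_batch;
        }
        // llama_eval() returns 0 on success in the API of that era.
        if (llama_eval(ctx, &embd[i], n_eval, n_past, n_threads)) {
            fprintf(stderr, "%s : failed to eval\n", __func__);
            return false;
        }
        n_past += n_eval;
    }
    return true;
}
```

Batching the re-evaluation matters because the context swap can put hundreds of tokens back into the queue at once, and processing them in `n_batch`-sized chunks is much faster than single-token evaluation.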
