llama.cpp
Add support for batch size to `--perplexity` #407
Merged
glinscott: Add support to batch size for perplexity (9ea43d4d)
glinscott changed the title from "Add support to batch size for perplexity" to "Add support for batch size to `--perplexity`" 2 years ago
gjmulder added the enhancement label
gjmulder added the generation quality label
glinscott: Merge remote-tracking branch 'origin/master' into batch_perplexity (9179d089)
glinscott: Revert "Fix memory allocation issues and seg faults" (57dc4dc6)
glinscott: Merge branch 'master' into batch_perplexity (c3d3cd2d)
glinscott: update from merge (7392ad62)
glinscott: Remove perplexity from main (43523220)
glinscott: Merge branch 'master' into batch_perplexity (a17e745b)
glinscott: updates (864dcb26)
ggerganov added the high priority label
glinscott: Merge remote-tracking branch 'origin/master' into batch_perplexity (fbcecd59)
glinscott: Update batch size for efficiency (23fd782d)
glinscott marked this pull request as ready for review 2 years ago
ggerganov approved these changes on 2023-04-13
ggerganov merged be87b6ed into master 2 years ago
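
For context, the technique this PR is about, evaluating the perplexity dataset in fixed-size batches of tokens and accumulating per-token log-probabilities, can be sketched as follows. This is an illustrative Python model only (the function names and the toy `eval_fn` are hypothetical), not the C++ code merged here:

```python
import math

def perplexity(logprobs):
    # perplexity = exp(-(1/N) * sum of per-token log-probabilities)
    return math.exp(-sum(logprobs) / len(logprobs))

def batched_eval(tokens, eval_fn, n_batch):
    # Feed the context to the model n_batch tokens at a time,
    # instead of one token per call, collecting the log-probability
    # the model assigned to each observed token.
    logprobs = []
    for i in range(0, len(tokens), n_batch):
        logprobs.extend(eval_fn(tokens[i:i + n_batch]))
    return logprobs
```

The batch size trades memory for throughput: larger batches mean fewer model evaluations over the same text, which is the efficiency angle of the "Update batch size for efficiency" commit above.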
