llama.cpp
486ae645
Compute perplexity over prompt (#270)
Commit
2 years ago
Compute perplexity over prompt (#270)

* Compute perplexity over prompt
* More accurate perplexity calculation - over all logits in the context window (so 512x more tokens!)
* Output all perplexities
* Add timing/ETA
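For context on what this commit measures: perplexity is the exponential of the mean negative log-likelihood of the observed tokens, evaluated here at every position in the context window rather than at a single position. A minimal sketch of that formula (a hypothetical helper, not the commit's C++ code):

```python
import math

def perplexity(token_logprobs):
    # Perplexity = exp of the average negative log-likelihood
    # of the observed tokens under the model.
    nll = -sum(token_logprobs) / len(token_logprobs)
    return math.exp(nll)

# Sanity check: a model that assigns each token probability 1/4
# has perplexity 4 (it is "as confused as" a uniform 4-way choice).
print(perplexity([math.log(0.25)] * 8))
```

Evaluating over all logits in the window, as this change does, simply extends the sum to every token position the model scored.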
References
#270 - Compute perplexity over prompt
Author
glinscott
Parents
3ab3e658