llama.cpp
486ae645 - Compute perplexity over prompt (#270)

Compute perplexity over prompt (#270)

* Compute perplexity over prompt
* More accurate perplexity calculation - over all logits in the context window (so 512x more tokens!)
* Output all perplexities
* Add timing/ETA
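For context, perplexity here is the exponential of the mean negative log-likelihood of each token given the tokens before it, so evaluating all positions in the context window gives many more samples per pass. A minimal sketch of that calculation is shown below (hypothetical helper, not the actual llama.cpp implementation), assuming `log_probs[i]` already holds the model's log-probability of token `i+1` given tokens `0..i`:

```cpp
// Minimal sketch of perplexity over a token window (hypothetical, not the
// actual llama.cpp code). Assumes log_probs[i] is the model's log-probability
// of the next token at position i, taken from the logits at that position.
#include <cmath>
#include <cstdio>
#include <vector>

double perplexity(const std::vector<double> & log_probs) {
    // perplexity = exp( -(1/N) * sum_i log p(token_i | preceding tokens) )
    double nll = 0.0;
    for (double lp : log_probs) {
        nll -= lp;
    }
    return std::exp(nll / log_probs.size());
}

int main() {
    // Example: moderately confident predictions -> moderate perplexity
    std::vector<double> log_probs = { std::log(0.5), std::log(0.25), std::log(0.4) };
    printf("perplexity = %.3f\n", perplexity(log_probs));
    return 0;
}
```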