llama.cpp
43f76bf1 - main : print total token count and tokens consumed so far (#4874)

main : print total token count and tokens consumed so far (#4874)

* Token count changes
* Add show token count
* Updating before PR
* Two requested changes
* Move param def posn
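A minimal sketch of the idea behind this commit: periodically report how many tokens have been consumed so far out of the total token count while the prompt is being processed. This is not the actual llama.cpp implementation; the variable names (`embd_inp`, `n_consumed`) and the reporting interval are illustrative assumptions only.

```cpp
// Toy illustration, not the real main.cpp: print consumed vs. total tokens.
#include <cstdio>
#include <vector>

int main() {
    // Pretend these tokens came from tokenizing the prompt (hypothetical).
    std::vector<int> embd_inp(512, 0);
    const int show_every = 100;  // hypothetical reporting interval

    int n_consumed = 0;
    while (n_consumed < (int) embd_inp.size()) {
        ++n_consumed;  // consume one prompt token per step in this toy loop
        if (n_consumed % show_every == 0 || n_consumed == (int) embd_inp.size()) {
            printf("tokens consumed: %d / total: %zu\n",
                   n_consumed, embd_inp.size());
        }
    }
    return 0;
}
```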