llama.cpp
43f76bf1
- main : print total token count and tokens consumed so far (#4874)
Commit
1 year ago
main : print total token count and tokens consumed so far (#4874)

* Token count changes
* Add show token count
* Updating before PR
* Two requested changes
* Move param def posn
References
#4874 - Add total token count and intermittent tokens consumed so far
Author
pudepiedj
Parents
2f043328