llama.cpp
03fb8a00 - If first token generated from the server is the stop word the server will crash (#7038)

Commit · 1 year ago
If the first token generated by the server is the stop word, the server will crash (#7038). The following request reproduces the issue with llama13b:

```
{
  'prompt': 'Q: hello world \nA: ',
  'stop': ['\n'],
  'temperature': 0.0,
  'n_predict': 10,
  'cache_prompt': True,
  'n_probs': 10
}
```
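The repro payload can be sent to a locally running llama.cpp server. This is a minimal sketch, assuming the server's default `/completion` endpoint on port 8080; the `send_repro` helper name and the exact URL are assumptions for illustration, not taken from the commit:

```python
import json
import urllib.request

# Request body from the commit message. The stop word '\n' can match the
# very first sampled token, which is the condition that triggered the crash.
payload = {
    "prompt": "Q: hello world \nA: ",
    "stop": ["\n"],
    "temperature": 0.0,
    "n_predict": 10,
    "cache_prompt": True,
    "n_probs": 10,
}

def send_repro(url: str = "http://127.0.0.1:8080/completion") -> dict:
    """POST the payload to a running llama.cpp server (URL is an assumption).

    Before the fix in this commit, a response whose first generated token
    equaled the stop word could crash the server process.
    """
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Setting `temperature` to 0.0 makes sampling greedy, so the repro is deterministic for a given model; `n_probs` requests per-token probabilities, which was part of the original report's request shape.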