llama.cpp
fix: convert_hf_to_gguf - use existing local chat_template if mistral-format model has one.
#17749
Merged
