llama.cpp
403fbacb
- convert : Qwerky : use lora_rank_tokenshift and lora_rank_decay if present (#12667)
Committed: 262 days ago
References
#12667 - convert : Qwerky : use lora_rank_tokenshift and lora_rank_decay if present
Author: CISC
Parent: a8a1f335
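The behavior described in the commit title can be sketched as: prefer explicit `lora_rank_tokenshift` and `lora_rank_decay` values from the model's hparams when present, otherwise fall back to a default derived from the hidden size. The hparam names come from the commit title; the fallback formulas and the helper below are illustrative assumptions, not the actual llama.cpp convert code.

```python
# Illustrative sketch: use lora_rank_tokenshift / lora_rank_decay from the
# model config if present, else derive defaults from hidden_size.
# The fallback formulas are assumptions, not the actual llama.cpp logic.

def lora_ranks(hparams: dict) -> tuple[int, int]:
    hidden = hparams["hidden_size"]
    # Hypothetical RWKV-style size-derived defaults.
    default_tokenshift = max(32, hidden // 32)
    default_decay = max(32, hidden // 16)
    tokenshift = hparams.get("lora_rank_tokenshift", default_tokenshift)
    decay = hparams.get("lora_rank_decay", default_decay)
    return tokenshift, decay

# Explicit ranks in the config are used directly; absent ones fall back.
print(lora_ranks({"hidden_size": 2048,
                  "lora_rank_tokenshift": 96,
                  "lora_rank_decay": 128}))  # → (96, 128)
print(lora_ranks({"hidden_size": 2048}))     # → (64, 128)
```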