AutoAWQ
GGUF compatible quantization (2, 3, 4 bit / any bit)
#285
Merged

casper-hansen merged 5 commits into main from gguf
a0cb9e57 casper-hansen: GGUF compatible quantization (2, 3, 4 bit)
b02263f6 casper-hansen: Update example
casper-hansen changed the title from "GGUF compatible quantization (2, 3, 4 bit)" to "GGUF compatible quantization (2, 3, 4 bit / any bit)" 2 years ago
8bbf7432 casper-hansen: Change default model to Mistral
0b40094b casper-hansen: Merge branch 'main' into gguf
c7eae1b2 casper-hansen: pack() utility function. rename gguf_compatible -> export_compatible.
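The pack() utility mentioned in the commit above can be illustrated with a minimal sketch. This is an assumed, simplified reimplementation of low-bit packing (storing several 2-, 3-, or 4-bit values in each 32-bit word), not the actual AutoAWQ code; the function names and signatures are illustrative only.

```python
def pack(values, bits=4):
    """Pack unsigned `bits`-wide integers into 32-bit words, low bits first.
    Illustrative sketch only -- not the actual AutoAWQ pack() implementation."""
    per_word = 32 // bits  # values per 32-bit word (e.g. 8 for 4-bit)
    words = []
    for i in range(0, len(values), per_word):
        word = 0
        for j, v in enumerate(values[i:i + per_word]):
            assert 0 <= v < (1 << bits), "value out of range for bit width"
            word |= v << (j * bits)
        words.append(word)
    return words


def unpack(words, bits, count):
    """Inverse of pack(): recover `count` values from packed 32-bit words."""
    per_word = 32 // bits
    mask = (1 << bits) - 1
    out = []
    for w in words:
        for j in range(per_word):
            out.append((w >> (j * bits)) & mask)
    return out[:count]


# Round-trip check: 8 four-bit values fit in a single 32-bit word.
vals = [1, 2, 3, 15, 0, 7, 8, 4]
packed = pack(vals, bits=4)
assert len(packed) == 1
assert unpack(packed, bits=4, count=len(vals)) == vals
```

The same loop handles the 3-bit case (ten values per word, two bits unused), which is why a 32-bit word size works for "any bit" widths that divide into it.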
casper-hansen merged a3db8099 into main 2 years ago
casper-hansen deleted the gguf branch 2 years ago
