update get_error_factor to catch up with the latest transformers change #3021
update get_error_factor to catch up with the latest transformers change (4104d72a)
update scales (29d7e1dd)
update scales (3fe2a1c7)
SunMarc approved these changes on 2026-02-10
Merge branch 'huggingface:main' into 8bit (93318583)
skip 8bit loftq test (2e0c6c7a)