auto-round
ca59d36a - Fix multimodal and moe issue (#191)

Fix multimodal and MoE issue (#191)

* Fix Mixtral model quantization issue
* Refine code
* Bug fix
* Refine get_block_name func, bug fix
* Refine multimodal example, move AutoRound multimodal args, fix typo
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci
* Add UT, fix typo
* Fix code scan issues
* Edit UT
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci
* Fix code coverage decrease

Signed-off-by: Zhang, Weiwei1 <weiwei1.zhang@intel.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
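The commit mentions refining the `get_block_name` func so that MoE models such as Mixtral (whose decoder blocks nest expert Linear layers) are handled correctly. The sketch below is not auto-round's actual implementation; it is a minimal, hypothetical illustration of how a block-name discovery helper for block-wise quantization could locate the stack of decoder blocks in a model. The names `get_block_names`, `ToyBlock`, and `ToyModel` are assumptions made for the example only.

```python
# Hypothetical sketch (not auto-round's code): discover the repeated decoder
# blocks of a model, which is the granularity block-wise quantizers operate on.
# MoE models like Mixtral keep expert Linear layers nested inside each block,
# so only the block container needs to be found, not every expert by name.
import torch.nn as nn


def get_block_names(model: nn.Module) -> list:
    """Return qualified names of the modules inside the main block ModuleList."""
    candidates = []
    for name, module in model.named_modules():
        # Repeated decoder layers are typically stored in an nn.ModuleList
        # (e.g. model.layers in Llama/Mixtral-style architectures).
        if isinstance(module, nn.ModuleList) and len(module) > 0:
            candidates.append((name, module))
    if not candidates:
        return []
    # Nested expert ModuleLists are also candidates; pick the container holding
    # the most parameters, which is the full stack of transformer blocks.
    name, blocks = max(candidates, key=lambda c: sum(p.numel() for p in c[1].parameters()))
    return [f"{name}.{i}" for i in range(len(blocks))]


# Toy MoE-flavoured model to exercise the helper.
class ToyExpertFFN(nn.Module):
    def __init__(self, dim=8, n_experts=4):
        super().__init__()
        self.experts = nn.ModuleList([nn.Linear(dim, dim) for _ in range(n_experts)])


class ToyBlock(nn.Module):
    def __init__(self, dim=8):
        super().__init__()
        self.self_attn = nn.Linear(dim, dim)
        self.moe = ToyExpertFFN(dim)


class ToyModel(nn.Module):
    def __init__(self, dim=8, n_layers=3):
        super().__init__()
        self.embed = nn.Embedding(100, dim)
        self.layers = nn.ModuleList([ToyBlock(dim) for _ in range(n_layers)])


print(get_block_names(ToyModel()))  # ['layers.0', 'layers.1', 'layers.2']
```

Selecting the candidate with the largest parameter count (rather than the longest list) keeps per-block expert ModuleLists from shadowing the actual decoder stack, which is one way an MoE model can trip up a naive block-name lookup.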