transformers
Fix GlmMoeDsaConfig default mlp_layer_types in modular conversion
#43876
Merged
zucchini-nlp merged 3 commits into huggingface:main from weiguangli-io:codex/transformers-43864-glm-moe-config-default
Commits:
- 760c2e4a Fix GlmMoeDsaConfig default mlp layer pattern

zucchini-nlp commented on 2026-02-10

- a10f4303 fix(glm-moe-dsa): dedupe config init and colocate test
- d2125021 Apply repo consistency fixes
zucchini-nlp approved these changes on 2026-02-10
zucchini-nlp merged 476600a9 into main 37 days ago
Reviewers: zucchini-nlp
Assignees: No one assigned
Labels: None yet
Milestone: No milestone