transformers
[docs] update input documentation for MAMBA2 and MISTRAL models to include cache_position and attention_mask details
#34322
Merged

h3110Fr13nd
stevhliu approved these changes on 2024-10-25
h3110Fr13nd force pushed 1 year ago
h3110Fr13nd [docs] update input documentation for MAMBA2 and MISTRAL models to in…
4d737507
h3110Fr13nd [docs] correct input documentation for MISTRAL model to reference `in…
4be2b015
h3110Fr13nd [docs] clarify cache_position description in MISTRAL model documentation
e73d151c
h3110Fr13nd force pushed to e73d151c 1 year ago
stevhliu stevhliu merged fc1ae7f3 into main 1 year ago
