fix MllamaVisionAttention typehint #35975
Commits:
- b536cbfe: fix MllamaVisionAttention typehint
- 09947561: Update src/transformers/models/mllama/modeling_mllama.py
- 100dad2a: fix suggestion
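The commits above correct a return typehint on a vision attention module. As a minimal sketch of this kind of fix, the hypothetical class below (illustrative names only, not the actual MllamaVisionAttention code) shows a `forward` whose annotation is widened to reflect that it returns both an output and optional attention weights:

```python
from typing import Optional, Tuple, get_type_hints


class VisionAttention:
    """Hypothetical stand-in for an attention module.

    Illustrates the shape of a typehint fix: the return annotation
    is made to match what the method actually returns.
    """

    # Before a fix like this, the hint might promise only a single value;
    # the corrected hint documents the (output, optional weights) pair.
    def forward(self, hidden_state: list) -> Tuple[list, Optional[list]]:
        # Return the (unchanged) hidden state and no attention weights.
        return hidden_state, None


# Tools and readers can now recover the accurate annotation.
hints = get_type_hints(VisionAttention.forward)
print(hints["return"])
```

Static checkers such as mypy rely on these annotations, so a hint that disagrees with the actual return value produces false positives for every caller.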
kylesayrs force-pushed from 33dcfd1e to 100dad2a 338 days ago.
Assignees: no one assigned.