peft
FIX Cache position is None with transformers v5.4
#3120
Merged

BenjaminBossan committed FIX Cache position is None with transformers v5.4 (4983d8ae)
HuggingFaceDocBuilderDev commented
BenjaminBossan requested a review from zucchini-nlp 4 days ago
zucchini-nlp commented on 2026-03-30
BenjaminBossan committed Reviewer feedback: much simpler solution (9dfc901b)
BenjaminBossan marked this pull request as ready for review 4 days ago
BenjaminBossan requested a review from githubnemo 4 days ago
BenjaminBossan committed Use is_first_generation argument if present (0207696c)
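The commit message "Use is_first_generation argument if present" points at a common compatibility pattern: only pass a newer keyword argument when the installed transformers version actually accepts it, falling back gracefully on older versions. A minimal sketch of that pattern, with hypothetical function names (this is an illustration of the technique, not PEFT's actual code):

```python
import inspect


def call_with_optional_kwarg(fn, *args, **kwargs):
    """Call fn, dropping is_first_generation if fn does not accept it.

    This lets the same call site work across library versions that
    added or removed the keyword argument.
    """
    sig = inspect.signature(fn)
    if "is_first_generation" not in sig.parameters:
        kwargs.pop("is_first_generation", None)
    return fn(*args, **kwargs)


# Hypothetical stand-ins for a newer and an older API surface.
def new_api(x, is_first_generation=False):
    return (x, is_first_generation)


def old_api(x):
    return (x, None)


print(call_with_optional_kwarg(new_api, 1, is_first_generation=True))  # (1, True)
print(call_with_optional_kwarg(old_api, 1, is_first_generation=True))  # (1, None)
```

The signature inspection keeps the feature detection at the call site, so no version string comparison is needed.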
BenjaminBossan requested a review from zucchini-nlp 3 days ago
zucchini-nlp commented on 2026-03-31
zucchini-nlp approved these changes on 2026-03-31
githubnemo approved these changes on 2026-04-01
BenjaminBossan merged 52efb763 into main 2 days ago
BenjaminBossan deleted the fix-cache-position-transformers-v5.4 branch 2 days ago
