Add cartridges to PEFT #2953
initial cartridges
1060f767
add link to blog
43e9aebb
fix test
549ecc0f
further examples
03f8d430
add CartridgeConfig
d6705aec
fixes from review
48922f58
model/task families that don’t support prefix KV-cache injection
a6102539
fix example
086a6b8f
Update examples/cartridge_self_study/README.md
3b2b77c1
Update src/peft/tuners/cartridge/utils.py
3cf7c97a
fix from reviews
36889133
copyright
2d4e46aa
BenjaminBossan changed the title "Add cartidges to PEFT" → "Add cartridges to PEFT" 44 days ago
use vllm to synthesize data
69d7634d
use device_map and correct dtype
5c563a6b
teacher is just the student base model without peft
75e975b0
use single model always
65138b95
use the whole document as context for synthesis
b91a9fc3
disable thinking
b5d92d0d
Account for num_virtual_tokens in RoPE for PREFIX_TUNING/CARTRIDGE
4fb96da7
position handling by offsetting position_ids and test
c2b58a62
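The two commits above deal with making RoPE account for the injected KV prefix: when `num_virtual_tokens` cache entries are prepended, the real tokens must receive position ids that start after the virtual prefix. A minimal, torch-free sketch of that offsetting (the function name is illustrative, not PEFT's actual internals):

```python
def offset_position_ids(seq_len: int, num_virtual_tokens: int) -> list[int]:
    """Position ids for the real tokens, shifted past the virtual KV prefix.

    Without the offset, token 0 of the prompt would get rotary position 0
    even though num_virtual_tokens cache entries already occupy positions
    0 .. num_virtual_tokens - 1.
    """
    return [num_virtual_tokens + i for i in range(seq_len)]


# A 4-token prompt behind a 3-token virtual prefix gets positions 3..6:
print(offset_position_ids(4, 3))  # → [3, 4, 5, 6]
```

In the actual model this would be a tensor (`torch.arange(past_len, past_len + seq_len)`), but the indexing logic is the same.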
use PeftModel.from_pretrained
806a504d
use initialize_kv_prefix_from_past_key_values and initialize_kv_prefi…
c512a982
fix initialize_kv_prefix_from_text example in doc
9275d196
fix random seed in prompt indices
d610d985
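Fixing the seed for prompt-index sampling makes the synthesized self-study data reproducible across runs. A hedged sketch of the idea (names are illustrative, not the PR's actual code); using a local `random.Random` instance avoids mutating global RNG state:

```python
import random


def sample_prompt_indices(num_prompts: int, k: int, seed: int = 0) -> list[int]:
    """Sample k distinct prompt indices reproducibly from a fixed seed."""
    rng = random.Random(seed)  # local RNG: no side effects on random.*
    return rng.sample(range(num_prompts), k)


# Same seed, same indices on every run:
a = sample_prompt_indices(100, 5, seed=42)
b = sample_prompt_indices(100, 5, seed=42)
assert a == b
```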
use MAX_NEW_TOKENS_FOR_QUESTIONS
f02ef884
update path to the checked-in cartridges.tex
c2bc7e1e
fix _to_legacy_past_key_values for new transformers
847293e2
explicitly set return_dict=False
05aac917
rephrased
2d417dad
add prefix tuning position id test
2f7d5342
explicitly set return_dict=False
febddd87
moved the test
6589e6d4
increase default max tokens to 512
c44533b9
remove tex file
758b2de6
kashif deleted the Cartridges branch 23 days ago
Assignees: No one assigned