[LoRA] Document support for effective rank for LoRA on MOE experts #3007
support for effective rank for LoRA on MOE experts
8197d95a
add vllm url
e05baf59
add short description of merging
c8d1b0b0
test saving
93eb681d
fix formatting
6f0ba8f0
warning if moe_rank_normalization is True but no expert params
71bccbaa
fix formatting
0a16b567
remove moe_rank_normalization flag
4ec95145
Remove unused
b74f922a
fix test
bb1cf2f1
kashif changed the title from "[LoRA] support for effective rank for LoRA on MOE experts" to "[LoRA] Document support for effective rank for LoRA on MOE experts" 64 days ago
Update tests/test_target_parameters.py
8ac6914a
move to developer
e17ffbb7
full undo
4c5bd303
fix formatting
429cf27c
Merge branch 'main' into lora_moe_expert
acab47e4