peft
[LoRA] Document support for effective rank for LoRA on MOE experts
#3007
Merged

Commits
  • support for effective rank for LoRA on MOE experts
    kashif committed 69 days ago
  • add vllm url
    kashif committed 69 days ago
  • add short description of merging
    kashif committed 68 days ago
  • test saving
    kashif committed 68 days ago
  • fix formatting
    kashif committed 68 days ago
  • warning if moe_rank_normalization is True but no expert params
    kashif committed 68 days ago
  • fix formatting
    kashif committed 68 days ago
  • remove moe_rank_normalization flag
    kashif committed 68 days ago
  • Remove unused
    kashif committed 68 days ago
  • fix test
    kashif committed 68 days ago
  • Update tests/test_target_parameters.py
    kashif committed 65 days ago
  • move to developer
    kashif committed 65 days ago
  • full undo
    kashif committed 65 days ago
  • fix formatting
    kashif committed 65 days ago
  • Merge branch 'main' into lora_moe_expert
    githubnemo committed 64 days ago