[InternLM] Add support for InternLM #26302
Add config.bias to LLaMA to allow InternLM models to be ported as LLa…
0a1d1f01
Rocketknight1
changed the title from "Add config.bias to LLaMA for InternLM" to "[InternLM] Add support for InternLM" 2 years ago
Rename bias -> attention_bias and add docstring
9281056d
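The rename above reflects the core of the PR: a single config flag that toggles bias on the attention projection layers, so that bias-free LLaMA checkpoints and bias-using InternLM checkpoints can share one model class. A minimal sketch of the idea (not the actual transformers code; the class and variable names here are illustrative):

```python
import torch.nn as nn


class AttentionWithConfigurableBias(nn.Module):
    """Hypothetical sketch: an attention block whose q/k/v/o projections
    take their bias setting from a config flag, mirroring the
    `attention_bias` field this PR adds to the LLaMA config."""

    def __init__(self, hidden_size: int, attention_bias: bool):
        super().__init__()
        # LLaMA-style checkpoints would use attention_bias=False,
        # InternLM-style checkpoints attention_bias=True.
        self.q_proj = nn.Linear(hidden_size, hidden_size, bias=attention_bias)
        self.k_proj = nn.Linear(hidden_size, hidden_size, bias=attention_bias)
        self.v_proj = nn.Linear(hidden_size, hidden_size, bias=attention_bias)
        self.o_proj = nn.Linear(hidden_size, hidden_size, bias=attention_bias)


llama_style = AttentionWithConfigurableBias(64, attention_bias=False)
internlm_style = AttentionWithConfigurableBias(64, attention_bias=True)
print(llama_style.q_proj.bias)            # → None
print(internlm_style.q_proj.bias.shape)   # → torch.Size([64])
```

With this in place, porting an InternLM checkpoint amounts to loading it as a LLaMA model with the flag set, rather than maintaining a separate modeling file.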
Merge branch 'main' into port_internlm_as_llama
bcb105d6
Rocketknight1
deleted the port_internlm_as_llama branch 2 years ago