transformers
6ba63ac3 - [InternLM] Add support for InternLM (#26302)

Committed 2 years ago
[InternLM] Add support for InternLM (#26302)

* Add config.bias to LLaMA to allow InternLM models to be ported as LLaMA checkpoints
* Rename bias -> attention_bias and add docstring
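A minimal sketch of the idea behind this commit, assuming nothing beyond the commit message: a LLaMA-style config gains an `attention_bias` flag so that InternLM weights (whose attention projections carry bias terms, unlike stock LLaMA) can be loaded as LLaMA checkpoints. The class and helper below are illustrative stand-ins, not the actual transformers code.

```python
from dataclasses import dataclass

# Hypothetical minimal config (illustrative only, not the real transformers class).
@dataclass
class LlamaStyleConfig:
    hidden_size: int = 4096
    num_attention_heads: int = 32
    # Whether the q/k/v/o attention projections have bias terms.
    # Stock LLaMA: False. InternLM checkpoints ported as LLaMA: True.
    attention_bias: bool = False

def attention_projection_has_bias(config: LlamaStyleConfig) -> bool:
    # In a real model this flag would be forwarded to the projection layers,
    # e.g. nn.Linear(hidden_size, hidden_size, bias=config.attention_bias).
    return config.attention_bias

llama = LlamaStyleConfig()                        # plain LLaMA: no attention bias
internlm = LlamaStyleConfig(attention_bias=True)  # InternLM ported as LLaMA
```

The rename from `bias` to `attention_bias` narrows the flag's scope in its name: it toggles bias only on the attention projections, not on every linear layer in the model.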