transformers
2b183541 - add self.head_dim for VisionAttention in Qwen2-VL (#33211)

Committed 1 year ago
add self.head_dim for VisionAttention in Qwen2-VL (#33211)

* add self.head_dim for VisionAttention in Qwen2-VL
* add self.head_dim for VisionAttention in Qwen2-VL
* fix ci
* black the test_modeling_qwen2_vl.py
* use ruff to format test_modeling_qwen2_vl.py
* [run-slow] qwen2_vl
* use typing for python3.8
* fix the import format
* use ruff to fix the ci error I001
* [run-slow] qwen2_vl
* remove unused import
* commit for rebase
* use ruff fix ci
* [run-slow] qwen2_vl

Co-authored-by: root <liji>
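The commit title indicates that `head_dim` became a stored attribute on the `VisionAttention` module rather than being recomputed where it is needed. As a rough illustration of that pattern (a minimal sketch, not the actual Qwen2-VL implementation; the dimensions and layer names here are assumptions), an attention module storing `self.head_dim` might look like:

```python
import torch
import torch.nn as nn


class VisionAttention(nn.Module):
    """Simplified vision attention block; illustrative only."""

    def __init__(self, dim: int, num_heads: int = 16) -> None:
        super().__init__()
        self.num_heads = num_heads
        # Stored once as an attribute, as the commit title describes,
        # so downstream code can read it without re-deriving it.
        self.head_dim = dim // num_heads
        self.qkv = nn.Linear(dim, dim * 3, bias=True)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (seq_len, dim) -- a flat patch sequence, no batch dim for brevity
        seq_len, dim = x.shape
        qkv = (
            self.qkv(x)
            .reshape(seq_len, 3, self.num_heads, self.head_dim)
            .permute(1, 2, 0, 3)
        )
        q, k, v = qkv.unbind(0)  # each: (num_heads, seq_len, head_dim)
        attn = (q @ k.transpose(-2, -1)) / (self.head_dim ** 0.5)
        attn = attn.softmax(dim=-1)
        out = (attn @ v).transpose(0, 1).reshape(seq_len, dim)
        return self.proj(out)
```

Storing `head_dim` in `__init__` keeps the `dim // num_heads` derivation in one place, which matters when several methods (forward pass, scaling, cache shaping) need the same value.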