transformers

Add zero dim tensor check when using flash_attention #38280 (Merged)
Timeline:
- ranzhejiang force-pushed the branch (007835f0 → 6afac400 → a9fdcf48, 324 days ago; then a9fdcf48 → 007835f0 → 3d367724 → 8365960c, 316 days ago).
- Rocketknight1 and ArthurZucker reviewed; ArthurZucker commented on 2025-05-23.
- ranzhejiang pushed commits 8365960c and 827b2304 ("Add zero dim tensor check when using flash_attention").
- ArthurZucker merged ae32f1ad into main 290 days ago.
