transformers
ae32f1ad - Add zero dim tensor check when using flash_attention (#38280)

Commit · 205 days ago
Add zero dim tensor check when using flash_attention (#38280)

Signed-off-by: ranzhejiang <zhejiang.ran@intel.com>
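The idea behind the commit can be sketched as follows: flash-attention kernels generally cannot handle tensors with a zero-sized dimension (e.g. an empty batch or a zero-length sequence), so such inputs should be detected up front and routed to a fallback path. This is a minimal illustrative sketch, not the actual patch; the function names (`has_zero_dim`, `choose_attention`) and the dispatch logic are assumptions for illustration, written over plain shape tuples so it runs without PyTorch.

```python
def has_zero_dim(*shapes):
    """Return True if any tensor shape contains a zero-sized dimension.

    Hypothetical helper: a shape like (2, 8, 0, 64) means the tensor is
    empty along one axis, which flash-attention kernels typically reject.
    """
    return any(0 in shape for shape in shapes)


def choose_attention(query_shape, key_shape, value_shape):
    """Hypothetical dispatcher mirroring the commit's guard.

    If any of query/key/value has a zero-sized dimension, fall back to
    the eager attention path instead of calling flash attention.
    """
    if has_zero_dim(query_shape, key_shape, value_shape):
        return "eager"
    return "flash_attention_2"


# Normal shapes go to flash attention; an empty sequence falls back.
print(choose_attention((2, 8, 128, 64), (2, 8, 128, 64), (2, 8, 128, 64)))
print(choose_attention((2, 8, 0, 64), (2, 8, 0, 64), (2, 8, 0, 64)))
```

Checking the shape before dispatch is cheap and avoids a kernel-level error deep inside the attention call.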