transformers
bfb954fa - fix attention mask for flash attention

Commit
348 days ago
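The commit message refers to the attention mask used with flash attention. As general background only, here is a minimal NumPy sketch (an assumption-laden illustration, not the flash-attention kernel and not the actual patch in this commit) of how a padding mask is typically folded into attention as an additive bias before the softmax, so that padded key positions receive ~0 weight:

```python
# Illustration only: plain softmax attention with an additive padding mask.
# This is NOT the flash-attention implementation or the transformers code
# touched by commit bfb954fa; names and shapes here are hypothetical.
import numpy as np

def masked_attention(q, k, v, pad_mask):
    """q, k, v: (seq, d) arrays; pad_mask: (seq,) bool, True = real token."""
    scores = q @ k.T / np.sqrt(q.shape[-1])          # (seq, seq) raw scores
    # Convert the boolean padding mask to an additive bias: padded key
    # positions get a large negative value, driving their softmax weight to ~0.
    bias = np.where(pad_mask[None, :], 0.0, -1e9)
    scores = scores + bias
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

rng = np.random.default_rng(0)
seq, d = 4, 8
q = rng.standard_normal((seq, d))
k = rng.standard_normal((seq, d))
v = rng.standard_normal((seq, d))
pad_mask = np.array([True, True, True, False])       # last position is padding

out, w = masked_attention(q, k, v, pad_mask)
# No query attends to the padded position.
assert np.all(w[:, -1] < 1e-6)
```

If the mask is dropped or mis-shaped, padded positions leak probability mass into the output, which is the broad class of bug a "fix attention mask" commit typically addresses.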