transformers
bfb954fa - fix attention mask for flash attention

Commit
275 days ago