transformers
04b751f0 - Fix attention vizualizer (#40285)

Committed 119 days ago
Fix attention vizualizer (#40285)

* make visualizer rely on create causal mask
* format
* fixup
* fixup
* read token
* read token, duh
* what is up with that token
* small tests?
* adjust
* try with flush
* normalize for ANSI
* buffer shenanigans
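The first bullet says the visualizer now derives its mask from the library's causal-mask construction rather than building its own. As a minimal, library-free sketch of what a causal mask encodes (the `causal_mask` helper below is illustrative, not the transformers API): each query position `i` may attend only to key positions `j <= i`.

```python
def causal_mask(seq_len):
    """Return a seq_len x seq_len boolean mask; True = attention allowed.

    Illustrative only: position i can see positions 0..i (lower triangle).
    """
    return [[j <= i for j in range(seq_len)] for i in range(seq_len)]

# Render the mask the way an attention visualizer might: '#' = visible.
for row in causal_mask(4):
    print("".join("#" if ok else "." for ok in row))
# →
# #...
# ##..
# ###.
# ####
```

Relying on one shared mask constructor keeps the visualizer's picture consistent with the mask the model actually applies.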
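The "normalize for ANSI" bullet suggests the tests compare terminal output after removing color codes. A hypothetical helper for that (not taken from the transformers repo) strips SGR escape sequences so colored and plain renderings compare equal:

```python
import re

# Matches SGR (color/style) escape sequences like "\x1b[31m" or "\x1b[0m".
# Assumption for this sketch: only SGR sequences appear in the output.
ANSI_SGR_RE = re.compile(r"\x1b\[[0-9;]*m")

def strip_ansi(text):
    """Remove ANSI SGR escape sequences from text."""
    return ANSI_SGR_RE.sub("", text)

print(strip_ansi("\x1b[31mred\x1b[0m text"))  # → red text
```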