transformers
04b751f0
- Fix attention vizualizer (#40285)
Commit
119 days ago
Fix attention vizualizer (#40285)
- make visualizer rely on create causal mask
- format
- fixup
- fixup
- read token
- read token, duh
- what is up with that token
- small tests?
- adjust
- try with flush
- normalize for ANSI
- buffer shenanigans
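The first step in the commit message has the visualizer rely on causal mask creation. As a rough illustration of what such a visualizer renders (a hypothetical sketch, not the actual transformers implementation), a causal mask is lower-triangular, and can be drawn as a text grid where each row is a query position and each column a key position:

```python
def causal_mask(seq_len: int) -> list[list[bool]]:
    """Lower-triangular causal mask: query i may attend to keys j <= i."""
    return [[j <= i for j in range(seq_len)] for i in range(seq_len)]

def render(mask: list[list[bool]]) -> str:
    """Render the mask as a text grid: 'x' = attended, '.' = masked."""
    return "\n".join(
        "".join("x" if allowed else "." for allowed in row) for row in mask
    )

print(render(causal_mask(4)))
# x...
# xx..
# xxx.
# xxxx
```

The real visualizer adds ANSI coloring and explicit output flushing on top of this idea, which is what the "normalize for ANSI" and "buffer shenanigans" steps address.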
References
#40285 - Fix attention vizualizer
Author
molbap
Parents
1e1db123