transformers
Cache: use `batch_size` instead of `max_batch_size`
#32657 · Merged
Commits (3)
- "more precise name" — gante committed 1 year ago
- "better docstrings" — gante committed 1 year ago
- "Update src/transformers/cache_utils.py" — gante committed 1 year ago
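The rename the PR title describes, switching a cache argument from `max_batch_size` to `batch_size`, is typically done with a backward-compatible deprecation shim so existing callers keep working. The sketch below is a minimal, hypothetical illustration of that pattern, not the actual `transformers` cache code; the class and argument handling are assumptions for demonstration.

```python
import warnings


class Cache:
    """Hypothetical sketch of a kwarg rename with a deprecation shim.

    Accepts the old `max_batch_size` argument, emits a FutureWarning,
    and stores the value under the new `batch_size` name.
    (Illustrative only; not the transformers implementation.)
    """

    def __init__(self, batch_size=None, max_batch_size=None):
        if max_batch_size is not None:
            warnings.warn(
                "`max_batch_size` is deprecated and will be removed; "
                "use `batch_size` instead.",
                FutureWarning,
            )
            batch_size = max_batch_size
        self.batch_size = batch_size
```

Callers using the old name still work but see a warning, while new code uses `batch_size` directly, e.g. `Cache(batch_size=2)`.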