transformers · PR #32657 (Merged)
Cache: use `batch_size` instead of `max_batch_size`

Commits
  • more precise name
    gante committed 1 year ago
  • better docstrings
    gante committed 1 year ago
  • Update src/transformers/cache_utils.py
    gante committed 1 year ago
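The rename this PR describes is a common backward-compatible API change: accept the old keyword for a transition period, emit a deprecation warning, and forward its value to the new name. Below is a minimal illustrative sketch of that pattern; the `StaticCache` class, its parameters, and the warning text here are simplified stand-ins, not the actual transformers implementation.

```python
import warnings


class StaticCache:
    """Sketch of renaming a constructor kwarg while keeping the old
    name working. Hypothetical simplification of the PR's change,
    not the real transformers `StaticCache`."""

    def __init__(self, batch_size=None, max_batch_size=None):
        # Accept the deprecated name, but warn and forward it.
        if max_batch_size is not None:
            warnings.warn(
                "`max_batch_size` is deprecated; use `batch_size` instead.",
                FutureWarning,
            )
            batch_size = max_batch_size
        if batch_size is None:
            raise ValueError("`batch_size` must be provided.")
        self.batch_size = batch_size


# Old call sites keep working (with a FutureWarning) during the
# deprecation window, so downstream code can migrate gradually.
cache = StaticCache(max_batch_size=4)
print(cache.batch_size)
```

Using `FutureWarning` (rather than `DeprecationWarning`) makes the message visible to end users by default, which is the convention many user-facing libraries follow for argument renames.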