transformers
22fe73c3 - TF safetensors reduced mem usage (#24404)

* Slight comment cleanup
* Reduce peak mem usage when loading TF-format safetensor weights
* Tweak the PyTorch loading code to support lazy loading from safetensors
* Pass safe_open objects to the PyTorch loading function
* Do GPU transposes for speed
* One more tweak to reduce peak usage further
* One-line hasattr
* Fix bug when there's a shape mismatch
* Rename state_dict in the loading code to be clearer
* Use TF format everywhere for consistency
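
The lazy-loading bullets above rely on safetensors' `safe_open` handle: instead of materializing the whole checkpoint as a state dict, the loading code can pull one tensor at a time from the open file, which keeps peak memory near the size of a single weight. The snippet below is a minimal sketch of that idea, not the transformers implementation; `assign_weight` is a hypothetical helper standing in for whatever actually copies a tensor into the model.

```python
from safetensors import safe_open

def load_lazily(model, checkpoint_path, assign_weight):
    # framework="pt" asks safetensors for torch tensors ("tf" and "np" are
    # the other options); nothing is read into memory at open time.
    with safe_open(checkpoint_path, framework="pt", device="cpu") as f:
        for name in f.keys():
            tensor = f.get_tensor(name)   # only this tensor is materialized
            assign_weight(model, name, tensor)
            del tensor                    # release it before the next weight
```

Passing the `safe_open` handle itself into the loading function (rather than a pre-built dict of tensors) is what lets the caller defer each read until the weight is actually assigned.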