Support sharded safetensors in TF #29350
Rocketknight1
marked this pull request as ready for review 1 year ago
Initial commit (still lots of unfinished bits)
080c24a9
(Still untested) add safetensors sharding to save_pretrained
aaa02123
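In practice this wires sharding into the existing `save_pretrained` arguments. A minimal sketch of the resulting usage, with the Hub checkpoint and shard size chosen purely for illustration:

```python
from transformers import TFAutoModel

# Any TF model works; the checkpoint name here is illustrative.
model = TFAutoModel.from_pretrained("google-bert/bert-base-uncased")

# With safe_serialization=True and a max_shard_size small enough to force
# splitting, save_pretrained writes numbered *.safetensors shard files plus
# a model.safetensors.index.json index instead of a single tf_model.h5.
model.save_pretrained("./bert-sharded", safe_serialization=True, max_shard_size="200MB")
```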
Fix safetensors saving, update default shard size to match PT
43a1b2cd
Add proper loading of TF-format safetensors
36014c73
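Loading is the mirror image: `from_pretrained` detects the safetensors index and reads each shard in turn. A minimal sketch, continuing the illustrative directory from above:

```python
from transformers import TFAutoModel

# from_pretrained discovers model.safetensors.index.json in the directory
# and loads every shard it lists; no tf_model.h5 is needed.
reloaded = TFAutoModel.from_pretrained("./bert-sharded")
```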
Revert default size in case that changes things
ab90852f
Fix incorrect index name
7eae1b49
Update loading priority
0a9cf9ab
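The real resolution logic lives in `TFPreTrainedModel.from_pretrained` in `modeling_tf_utils.py`; the sketch below only illustrates the idea of trying candidate checkpoint filenames in a fixed order. The filenames match the library's constants, but the ordering shown is an assumption for illustration, not a copy of the implementation:

```python
# Hypothetical helper: scan a checkpoint directory for the first usable
# weight file. The ordering below is assumed, not taken from the PR.
CANDIDATE_FILES = [
    "model.safetensors",             # single-file safetensors
    "model.safetensors.index.json",  # sharded safetensors (this PR)
    "tf_model.h5",                   # legacy TF HDF5 weights
    "tf_model.h5.index.json",        # sharded TF HDF5 weights
]

def pick_checkpoint(available_files):
    for name in CANDIDATE_FILES:
        if name in available_files:
            return name
    raise OSError("No compatible weight files found")
```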
Update tests
73d3d75f
Make the tests a little more stringent
b09d7573
Expand tests
8329061e
Add sharded cross-test
ffc0e3a1
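The cross-test exercises the framework-neutral promise of safetensors: weights sharded on the PyTorch side should load straight into TF with no conversion step. A sketch of that round trip (checkpoint and shard size illustrative; requires both torch and TensorFlow installed):

```python
import tempfile

from transformers import AutoModel, TFAutoModel

with tempfile.TemporaryDirectory() as tmp:
    # Save sharded safetensors from the PyTorch side...
    pt_model = AutoModel.from_pretrained("google-bert/bert-base-uncased")
    pt_model.save_pretrained(tmp, safe_serialization=True, max_shard_size="200MB")

    # ...and load them directly into TensorFlow, shard index and all.
    tf_model = TFAutoModel.from_pretrained(tmp)
```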
Fix argument name
03efce91
One more test fix
f415eb89
Adding mlx to the list of allowed formats
e5aace5c
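Safetensors files carry a small metadata dict whose "format" key records the producing framework; the loader checks it against an allow-list, which this commit extends with "mlx". A sketch of that check (the shard file path is illustrative):

```python
from safetensors import safe_open

with safe_open("model-00001-of-00002.safetensors", framework="tf") as f:
    fmt = (f.metadata() or {}).get("format")
    # After this commit the accepted values include "mlx" alongside the
    # existing frameworks.
    assert fmt in ("pt", "tf", "flax", "mlx")
```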
Remove irrelevant block for safetensors
e0faed0c
Refactor warning logging into a separate function
a7dbf06f
Remove unused skip_logger_warnings arg
e7a2c249
Update src/transformers/modeling_tf_utils.py
7ef18f9e
Move function def
d75d69a4
Rocketknight1
deleted the supported_sharded_safetensors_loading_in_tf branch 1 year ago