pytorch
6ea790c5 - Make share_memory_ call thread safe within itself. (#96664)

Make share_memory_ call thread safe within itself. (#96664)

To achieve this, I have a per-StorageImpl lock (it was per data_ptr in the previous version of this PR, but moved to StorageImpl to ensure stability of the key before/after sharing) created when we are about to share a storage, and all other calls that share memory must wait on this lock before moving forward.

This does NOT make the call generally thread safe, as any concurrent call that is not sharing memory will still race and lead to UB. It does ensure that the sample from @robertolat in https://github.com/pytorch/pytorch/issues/95606 works fine. It does NOT fix the example from @imurray in that same issue, as the sharing call still races with the `.sum()` call. That race is expected and there is no easy way for us to make it work, I'm afraid (see the issue for more details).

Pull Request resolved: https://github.com/pytorch/pytorch/pull/96664
Approved by: https://github.com/colesbury