ca0ac3a7 - [caffe2] allow dropout to take 1.0 as dropout ratio to zero-out a layer (#72741)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/72741

As titled. Context: this is useful for quickly mitigating feature-induced overfitting. We can do omni-transfer on a trained model and apply dropout with ratio = 1.0 to the features that cause the overfitting. Directly removing those features is not feasible in omni-transfer scenarios, since the downstream FC sizes would change.

Experimental records: https://fb.quip.com/npIkAgRc8jl9#temp:C:DWC050ceaba14424d23a78462c01

Setting dropout = 1.0 on the selected features improves the eval NE over the next few hours (compared to the v0 baseline), as shown in the figures.

Test Plan:
```
buck test caffe2/caffe2/python/operator_test:dropout_op_test
```

Reviewed By: ustctf

Differential Revision: D34178732

fbshipit-source-id: 533feebe21bc582eefd756de397d5c7807c7438d
(cherry picked from commit 5dabf9c484c0bc5410e3700e3010cdabb4bf903c)
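A minimal sketch of the behavior this commit enables, assuming a caffe2 build that includes the change: running the `Dropout` operator with `ratio=1.0` in training mode should zero out the entire layer rather than raising an error. The blob name `X`, the tensor shape, and the use of `RunOperatorOnce` are illustrative choices, not taken from the commit.

```python
import numpy as np
from caffe2.python import core, workspace

# Feed an arbitrary input blob (shape chosen for illustration).
workspace.FeedBlob("X", np.random.rand(4, 8).astype(np.float32))

# With this change, ratio=1.0 is accepted and zeroes out the output.
op = core.CreateOperator(
    "Dropout",
    ["X"],
    ["Y", "mask"],
    ratio=1.0,      # zero out every element of the layer
    is_test=False,  # dropout only applies in training mode
)
workspace.RunOperatorOnce(op)

Y = workspace.FetchBlob("Y")
assert not Y.any()  # the whole output is zeroed when ratio=1.0
```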
Author: Xiaohan Wei