Fix dropout backward by not tracing it (#66823)
- dropout is a CompositeImplicitAutograd op: it has no explicit backward formula of its own, so for Autograd to implicitly materialize its backward, Autograd has to trace through the 'dropout' kernel on CPU (which in turn just executes other ATen ops that do have explicit backward implementations).
- Lazily tracing a CompositeImplicitAutograd op breaks its backward; it can't be done unless you also register an override of the Autograd kernel for the op, as is currently done for maxpool
- See #67090 for an updated safety check we're adding to prevent this in the future
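To illustrate the mechanism (a minimal, hypothetical sketch, not PyTorch's actual autograd engine): a "composite" op defines no backward of its own, and differentiation works only because the op decomposes into primitive ops that each carry an explicit backward. The `Value` class, the scalar `dropout`, and all names below are invented for illustration.

```python
import random

class Value:
    """Scalar with toy reverse-mode autograd; only primitives define _backward."""
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None

    def __mul__(self, other):
        # Primitive op: carries an explicit backward (d/da = b, d/db = a).
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the traced graph, then run each
        # primitive's explicit backward in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

def dropout(x, p, training=True):
    """'Composite' op: no backward of its own. Autograd still works
    because it decomposes into the primitive __mul__ above -- exactly
    the decomposition a lazy backend must not swallow into one opaque
    traced node, or the backward is lost."""
    if not training:
        return x
    mask = 0.0 if random.random() < p else 1.0 / (1.0 - p)
    return x * Value(mask)

random.seed(0)
x = Value(3.0)
y = dropout(x, p=0.5)
y.backward()
print(x.grad)  # 0.0 if dropped, 1/(1-p) = 2.0 if kept
```

The composite never registers a gradient formula; tracing through its body is what makes `backward()` work, which is why an opaque lazy kernel for it needs its own Autograd override.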