[AutoDiff] Add a differentiation checkpointing API. (#22033)
Add an API that makes a function be recomputed in its pullbacks whenever its values are needed, instead of having those values stored during the forward pass. This is most useful for reducing memory usage in gradient computation (i.e. reverse-mode differentiation). Our AD semantics allow this to be defined as a library function, with zero compiler changes.
```swift
gradient { x in
  withRecomputationInPullbacks(f)(x) + g(x)
}
```
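For context, here is a minimal sketch of how such an API can be defined entirely in the library, assuming the `differentiableFunction(from:)` and `pullback(at:in:)` primitives; exact generic constraints and spellings may differ from the checked-in implementation.
```swift
// A sketch, not the checked-in implementation: wrap `body` in a custom
// derivative whose pullback re-runs differentiation on `body` at `x`,
// so no intermediate values from `body` need to be stored during the
// forward pass.
func withRecomputationInPullbacks<T: Differentiable, U: Differentiable>(
  _ body: @escaping @differentiable (T) -> U
) -> @differentiable (T) -> U {
  return differentiableFunction { x in
    (value: body(x), pullback: { v in pullback(at: x, in: body)(v) })
  }
}
```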
This patch also adds a commented-out method variant. It does not work yet because it requires differentiation support for indirect parameters and results.
```swift
// Does not work yet.
gradient { x in
  g(x) + x.withRecomputationInPullbacks { x in
    h(f(x))
  }
}
```
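For reference, the commented-out method variant would look roughly like the sketch below (assuming an extension on `Differentiable` that delegates to the free function above); because `Self` and `Result` are generic, values of these types are passed indirectly, which the differentiation transform cannot handle yet.
```swift
// A sketch of the commented-out method variant, not the checked-in code.
// `Self` and `Result` are generic, so they are lowered to indirect
// parameters and results, which differentiation does not support yet.
extension Differentiable {
  func withRecomputationInPullbacks<Result: Differentiable>(
    _ body: @escaping @differentiable (Self) -> Result
  ) -> Result {
    // Delegate to the free-function form and apply it to `self`. The type
    // annotation disambiguates the free function from this method.
    let recomputing: @differentiable (Self) -> Result =
      withRecomputationInPullbacks(body)
    return recomputing(self)
  }
}
```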