gh-37822: Fix bug in TensorField.apply_map

### :memo: Checklist

- [x] The title is concise and informative.
- [x] The description explains in detail what this PR is about.
- [ ] I have linked a relevant issue or discussion.
- [x] I have created tests covering the changes.
- [x] I have updated the documentation and checked the documentation preview.

In Sage 10.4.beta3 we have
```
sage: E.<x,y> = EuclideanSpace()
sage: a = var('a')
sage: v = E.vector_field(x, a*y)
sage: v.divergence().expr()  # OK
a + 1
sage: v.apply_map(lambda t: t.subs(a=-1))
sage: v.display()
x e_x - y e_y
sage: v.divergence().expr()  # wrong!
a + 1
```
The last output should be 0. This error occurs because the method `_del_derived` is not invoked to reset the derived quantities relative to the components of a tensor field when they are modified by `apply_map`. In the above example, the differentials of the components, which are used in computing the divergence, are cached and not deleted by `apply_map`. This is corrected here.

URL: https://github.com/sagemath/sage/pull/37822
Reported by: Eric Gourgoulhon
Reviewer(s): Matthias Köppe
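The underlying cache-invalidation pattern can be illustrated with a minimal pure-Python sketch. This is a hypothetical toy class, not Sage's actual `TensorField` implementation: "divergence" is replaced by a trivial cached quantity, and the point is only that a mutating method such as `apply_map` must call the cache-resetting method (`_del_derived` in Sage), which was the missing step this PR adds.

```python
class ToyField:
    """Hypothetical sketch of a container caching derived quantities.

    Not Sage's actual class; it only demonstrates why a mutating
    ``apply_map`` must reset the cache of derived quantities.
    """

    def __init__(self, components):
        self._components = list(components)
        self._derived = {}  # cache of derived quantities

    def _del_derived(self):
        # Reset all cached derived quantities (analogue of Sage's
        # ``_del_derived``); the bug was that ``apply_map`` skipped this.
        self._derived.clear()

    def divergence(self):
        # Stand-in for a derived quantity: cache the sum of components.
        if "div" not in self._derived:
            self._derived["div"] = sum(self._components)
        return self._derived["div"]

    def apply_map(self, fun):
        # Modify the components in place, then invalidate the cache.
        self._components = [fun(c) for c in self._components]
        self._del_derived()  # without this call, stale values survive


v = ToyField([1, 2])
print(v.divergence())  # 3
v.apply_map(lambda t: -t)
print(v.divergence())  # -3; with a stale cache this would still be 3
```

Dropping the `_del_derived()` call inside `apply_map` reproduces the reported symptom: the second `divergence()` would return the cached value computed before the mutation.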