Danny Hooper <hooper@google.com> [Thu, 02 Sep 2021 14:08:45 -0700] rev 48178
fix: reduce number of tool executions
By grouping (path, ctx) pairs according to the inputs they would provide
to fixer tools, we can deduplicate tool executions and significantly
reduce the time spent running slow tools.
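To illustrate the approach (a minimal sketch under assumed callables
read_content and run_tool, not the extension's actual code), grouping by
file name plus a digest of the content lets each distinct input run
exactly once, without the grouping step itself holding file bodies in
memory:

    import hashlib

    def dedupe_fixer_runs(items, read_content, run_tool):
        """Run a slow fixer tool once per distinct (path, content) input.

        items is an iterable of (path, ctx) pairs; read_content and
        run_tool are hypothetical stand-ins for the real machinery.
        """
        # Key on a content digest rather than the content itself, so
        # grouping never keeps whole file bodies alive.
        groups = {}
        for path, ctx in items:
            key = (path, hashlib.sha1(read_content(path, ctx)).hexdigest())
            groups.setdefault(key, []).append((path, ctx))

        results = {}
        for members in groups.values():
            # Execute the tool once for the whole group of identical
            # inputs...
            path, ctx = members[0]
            fixed = run_tool(path, read_content(path, ctx))
            # ...then reuse the output for every (path, ctx) in the group.
            for member in members:
                results[member] = fixed
        return results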
This change does not handle clean files in the working copy, which could
still be deduplicated against the files in the checked-out commit. That
is a little harder to do because the filerev is not available in the
workingfilectx (and it doesn't exist for added files).
Anecdotally, this change makes some real use cases at Google 10x faster.
I think we were originally hesitant to do this because the benefits
weren't obvious, and implementing it efficiently is kind of tricky. If we
simply memoized the formatter execution function, we would be keeping
tons of file content in memory.
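For contrast, here is a hypothetical sketch of that naive memoization
(the wrapper and command handling are assumptions, not the extension's
API): the cache must key on the content itself, so every distinct file
body seen during the run stays resident.

    import functools
    import subprocess

    @functools.lru_cache(maxsize=None)
    def run_tool_memoized(cmd, path, content):
        # path participates in the key because tool behavior can depend
        # on the file name. The key also includes `content`, so the full
        # bytes of every distinct input (plus the tool's output) stay
        # pinned in memory for the lifetime of the run.
        return subprocess.run(
            cmd.split(), input=content, capture_output=True, check=True
        ).stdout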
Also included is a regression test for a corner case that I broke with my first
attempt at optimizing this code.
Differential Revision: https://phab.mercurial-scm.org/D11280
Danny Hooper <hooper@google.com> [Thu, 02 Sep 2021 14:07:55 -0700] rev 48177
fix: add test to demonstrate how many times tools are executed
The current implementation wastes a lot of effort by invoking fixer tools
more than once for the same file name and content. It is worth tracking
changes in this behavior.
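To make the redundancy concrete (a self-contained sketch with an invented
CountingTool, not the test added here), a stand-in tool can count how
often each (name, content) pair is executed:

    import collections

    class CountingTool:
        """A stand-in fixer tool that records how often each input is
        seen (hypothetical; illustrates the behavior the test tracks)."""

        def __init__(self):
            self.executions = collections.Counter()

        def fix(self, path, content):
            self.executions[(path, content)] += 1
            return content.upper()  # pretend "fixing" transforms content

    tool = CountingTool()
    for rev_content in (b"x = 1\n", b"x = 1\n", b"x = 2\n"):
        tool.fix("a.py", rev_content)

    # Without deduplication, the same (name, content) pair runs twice:
    assert tool.executions[("a.py", b"x = 1\n")] == 2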
Differential Revision: https://phab.mercurial-scm.org/D11279