changeset 43786:421ea5772039
copies: split the combination of the copies mapping in its own function
In some cases, this part takes up to 95% of a copy tracing run that takes about a
hundred seconds. This poor performance comes from the fact that we keep duplicating
and merging dictionaries that are mostly similar.
I want to experiment with smarter native code to do this, so I need to isolate
the function first.
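
To illustrate the pattern the message describes, here is a minimal toy sketch (not the Mercurial implementation; the function and helper names are invented): copies are carried as {destination: source} dictionaries from each revision down to its children, so every child receives a duplicate of its parent's dictionary, and merge work happens whenever two parents reach the same child.

def combine_copies(revs, children, targetrev, copies_for):
    """Toy propagation of {dst: src} copy mappings down to targetrev.

    revs: revisions in topological order (parents before children)
    children: {rev: [child revisions]}
    copies_for(child, parent): copies introduced on the edge parent -> child
    """
    all_copies = {}
    for r in revs:
        copies = all_copies.pop(r, {})  # copies accumulated up to r
        for c in children.get(r, ()):
            newcopies = dict(copies)            # duplicate the parent's dict...
            newcopies.update(copies_for(c, r))  # ...then add this edge's copies
            if c in all_copies:
                # a second parent already reached c: merge the two mappings
                newcopies.update(all_copies[c])
            all_copies[c] = newcopies
    return all_copies.get(targetrev, {})

On a long history this duplicates a nearly identical dictionary once per revision, which is the cost the changeset wants to isolate so it can later be reimplemented in native code.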
author   | Pierre-Yves David <pierre-yves.david@octobus.net>
date     | Wed, 13 Nov 2019 20:42:13 +0100
parents  | 3b039e43a1e6
children | be8552f25cab
files    | mercurial/copies.py
diffstat | 1 files changed, 17 insertions(+), 2 deletions(-)
--- a/mercurial/copies.py Wed Nov 13 09:39:44 2019 +0100
+++ b/mercurial/copies.py Wed Nov 13 20:42:13 2019 +0100
@@ -281,9 +281,24 @@
         iterrevs &= mrset
         iterrevs.update(roots)
         iterrevs.remove(b.rev())
+    revs = sorted(iterrevs)
+    return _combinechangesetcopies(revs, children, b.rev(), revinfo, match)
+
+
+def _combinechangesetcopies(revs, children, targetrev, revinfo, match):
+    """combine the copies information for each item of iterrevs
+
+    revs: sorted iterable of revision to visit
+    children: a {parent: [children]} mapping.
+    targetrev: the final copies destination revision (not in iterrevs)
+    revinfo(rev): a function that return (p1, p2, p1copies, p2copies, removed)
+    match: a matcher
+
+    It returns the aggregated copies information for `targetrev`.
+    """
     all_copies = {}
     alwaysmatch = match.always()
-    for r in sorted(iterrevs):
+    for r in revs:
         copies = all_copies.pop(r, None)
         if copies is None:
             # this is a root
@@ -336,7 +351,7 @@
                 else:
                     newcopies.update(othercopies)
                 all_copies[c] = newcopies
-    return all_copies[b.rev()]
+    return all_copies[targetrev]
 
 
 def _forwardcopies(a, b, base=None, match=None):
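
For reference, here is a hedged sketch of how the extracted helper could be exercised directly, based only on the docstring above. The revision numbers, copy data, revinfo stub, and matcher stand-in are invented for illustration, and the call assumes a Mercurial installation that includes this changeset.

from mercurial import copies as copiesmod

class AlwaysMatcher:
    # minimal stand-in for a matcher: matches every path
    def always(self):
        return True
    def __call__(self, path):
        return True

revs = [2, 3]                # sorted revisions to visit (roots first, target excluded)
children = {2: [3], 3: [4]}  # {parent: [children]} restricted to the relevant graph

def revinfo(rev):
    # (p1, p2, p1copies, p2copies, removed) for each child that gets visited
    data = {
        3: (2, -1, {b'b.txt': b'a.txt'}, {}, []),
        4: (3, -1, {}, {}, []),
    }
    return data[rev]

result = copiesmod._combinechangesetcopies(revs, children, 4, revinfo, AlwaysMatcher())
# result should be the aggregated copies for revision 4, here {b'b.txt': b'a.txt'}

Keeping the traversal behind this narrow interface (sorted revisions, a children mapping, a revinfo callback, and a matcher) is what makes it practical to swap in an experimental native implementation later without touching the graph computation in _changesetforwardcopies.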