tests/test-fix-metadata.t

Python hooks for "hg fix" that print out the files and revisions that were
affected, along with which fixer tools were applied. They also count how many
times they see a specific key generated by one of the fixer tools defined
below.

  $ cat >> $TESTTMP/postfixhook.py <<EOF
  > import collections
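  > # postfixfile hook: runs once for each fixed file; prints the path, the
  > # revision, and the fixer tools recorded in that file's metadata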
  > def file(ui, repo, rev=None, path=b'', metadata=None, **kwargs):
  >     ui.status(b'fixed %s in revision %d using %s\n' %
  >               (path, rev, b', '.join(metadata.keys())))
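  > # postfix hook: runs once after all fixes; counts how many files each
  > # fixer tool reported metadata for, how often the "key" field appeared,
  > # and whether any revisions or the working copy were written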
  > def summarize(ui, repo, replacements=None, wdirwritten=False,
  >               metadata=None, **kwargs):
  >     counts = collections.defaultdict(int)
  >     keys = 0
  >     for fixername, metadatalist in metadata.items():
  >         for filemetadata in metadatalist:
  >             if filemetadata is None:
  >                 continue
  >             counts[fixername] += 1
  >             if 'key' in filemetadata:
  >                 keys += 1
  >     ui.status(b'saw "key" %d times\n' % (keys,))
  >     for name, count in sorted(counts.items()):
  >         ui.status(b'fixed %d files with %s\n' % (count, name))
  >     if replacements:
  >         ui.status(b'fixed %d revisions\n' % (len(replacements),))
  >     if wdirwritten:
  >         ui.status(b'fixed the working copy\n')
  > EOF

Some mock output for fixer tools that demonstrates what can go wrong when the
expected metadata output format (a JSON object, then a zero byte, then the new
file content) is not followed: "missing" omits the metadata and zero byte
entirely, "invalid" starts with something that is not valid JSON, and "valid"
is well formed. A rough sketch of how such output can be parsed follows the
commands below.

  $ printf 'new content\n' > $TESTTMP/missing
  $ printf 'not valid json\0new content\n' > $TESTTMP/invalid
  $ printf '{"key": "value"}\0new content\n' > $TESTTMP/valid

Configure a fixer tool for each of the mock output files defined above, and
enable the hooks defined above. Disable parallelism to keep the output of the
parallel file processing phase stable.

  $ cat >> $HGRCPATH <<EOF
  > [extensions]
  > fix =
  > [fix]
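  > # <name>:metadata=true tells fix to expect JSON metadata, a zero byte, and
  > # then the new file content on the tool's stdout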
  > missing:command=cat $TESTTMP/missing
  > missing:pattern=missing
  > missing:metadata=true
  > invalid:command=cat $TESTTMP/invalid
  > invalid:pattern=invalid
  > invalid:metadata=true
  > valid:command=cat $TESTTMP/valid
  > valid:pattern=valid
  > valid:metadata=true
  > [hooks]
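  > # "python:<file>:<function>" runs an in-process Python hook; postfixfile
  > # runs for each fixed file and postfix runs once when fixing is done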
  > postfixfile = python:$TESTTMP/postfixhook.py:file
  > postfix = python:$TESTTMP/postfixhook.py:summarize
  > [worker]
  > enabled=false
  > EOF

See what happens when we execute each of the fixer tools. The tools with
missing or invalid metadata output are ignored with a warning and leave their
files unchanged; only the valid tool's new content is written back to its
file.

  $ hg init repo
  $ cd repo

  $ printf "old content\n" > invalid
  $ printf "old content\n" > missing
  $ printf "old content\n" > valid
  $ hg add -q

  $ hg fix -w
  ignored invalid output from fixer tool: invalid
  ignored invalid output from fixer tool: missing
  fixed valid in revision 2147483647 using valid
  saw "key" 1 times
  fixed 1 files with valid
  fixed the working copy

  $ cat missing invalid valid
  old content
  old content
  new content

  $ cd ..