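Test merging of manifests when two repositories each add a new file.

Set up a base repository and commit a file, alpha: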

  $ hg init base

  $ cd base
  $ echo 'alpha' > alpha
  $ hg ci -A -m 'add alpha'
  adding alpha
  $ cd ..

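Clone base into a work repository: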
  $ hg clone base work
  updating to branch default
  1 files updated, 0 files merged, 0 files removed, 0 files unresolved

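In the work clone, add and commit a new file, beta: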
  $ cd work
  $ echo 'beta' > beta
  $ hg ci -A -m 'add beta'
  adding beta
  $ cd ..

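Meanwhile, add and commit another new file, gamma, in base: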
  $ cd base
  $ echo 'gamma' > gamma
  $ hg ci -A -m 'add gamma'
  adding gamma
  $ cd ..

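Pull the gamma changeset from base into work and merge it: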
  $ cd work
  $ hg pull -q
  $ hg merge
  1 files updated, 0 files merged, 0 files removed, 0 files unresolved
  (branch merge, don't forget to commit)

Remove the working copy files and update --clean to revision 1 to simulate recovering from a failed merge:

  $ rm alpha beta gamma
  $ hg update --clean 1
  2 files updated, 0 files merged, 0 files removed, 0 files unresolved

  $ cd ..