https://bz.mercurial-scm.org/842
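
This test repeatedly updates to the null revision and commits new heads, both on
the default branch and on a new named branch, and checks in which cases commit
prints the "created new head" notice.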

  $ hg init
  $ echo foo > a
  $ hg ci -Ama
  adding a
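
"0000" is a prefix of the all-zero null node id, so this update empties the
working directory (hence the file removal below):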

  $ hg up -r0000
  0 files updated, 0 files merged, 1 files removed, 0 files unresolved

  $ echo bar > a

Should issue new head warning:
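(committing "b" on top of the null revision adds a second head to the default
branch)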

  $ hg ci -Amb
  adding a
  created new head

  $ hg up -r0000
  0 files updated, 0 files merged, 1 files removed, 0 files unresolved

  $ echo stuffy > a

Should not issue new head warning:
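(note that -q also suppresses status messages such as the "created new head"
notice)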

  $ hg ci -q -Amc

  $ hg up -r0000
  0 files updated, 0 files merged, 1 files removed, 0 files unresolved

  $ echo crap > a
  $ hg branch testing
  marked working directory as branch testing
  (branches are permanent and global, did you want a bookmark?)

Should not issue warning:
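(the commit below is the first changeset on the newly created branch "testing")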

  $ hg ci -q -Amd