largefiles: don't crash when cloning to a remote repo
The immediate crash happened when checking for requirements right after this,
but lfcommands.downloadlfiles() will also crash if --all-largefiles is
specified. That has been the case since at least 5884812686f7 (2.3-rc) without
anyone noticing.
I can't tell from the peer classes whether there is a way to make the custom
largefiles functionality work in this case, but at least it no longer crashes.
http://mercurial.selenic.com/bts/issue619
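Set up a repository with a second branch b, then update back to the base revision: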
$ hg init
$ echo a > a
$ hg ci -Ama
adding a
$ echo b > b
$ hg branch b
marked working directory as branch b
(branches are permanent and global, did you want a bookmark?)
$ hg ci -Amb
adding b
$ hg co -C 0
0 files updated, 0 files merged, 1 files removed, 0 files unresolved
Fast-forward:
$ hg merge b
1 files updated, 0 files merged, 0 files removed, 0 files unresolved
(branch merge, don't forget to commit)
$ hg ci -Ammerge
Bogus fast-forward should fail:
$ hg merge b
abort: merging with a working directory ancestor has no effect
[255]
Even with a strange revset (issue4465):
$ hg merge ::.
abort: merging with a working directory ancestor has no effect
[255]