largefiles: don't crash when cloning to a remote repo
The immediate crash was when checking for requirements right after the
clone, but lfcommands.downloadlfiles() will also crash if --all-largefiles
is specified. That has been the case since at least 5884812686f7 (2.3-rc)
without anyone noticing.
I can't tell from the peer classes whether there's a way to make the custom
largefile functionality work in this case, but at least it no longer crashes.
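For illustration only, the kind of command that hits this path is a clone
whose destination is a remote repository (with the largefiles extension
enabled); the repository name and URL below are hypothetical and are not part
of the test that follows:

$ hg clone --all-largefiles largefiles-repo ssh://example.com/remote-copy

Before this change such a clone could crash; the largefiles-specific download
step still cannot run against a remote destination, but the clone itself no
longer aborts with a traceback.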
#require serve
$ hg init server
$ cd server
$ cat >> .hg/hgrc << EOF
> [extensions]
> strip=
> EOF
$ echo 1 > foo
$ hg commit -A -m 'first'
adding foo
$ echo 2 > bar
$ hg commit -A -m 'second'
adding bar
Produce a bundle to use
$ hg strip -r 1
0 files updated, 0 files merged, 1 files removed, 0 files unresolved
saved backup bundle to $TESTTMP/server/.hg/strip-backup/ed602e697e0f-cc9fff6a-backup.hg (glob)
Serve from a bundle file
$ hg serve -R .hg/strip-backup/ed602e697e0f-cc9fff6a-backup.hg -d -p $HGPORT --pid-file=hg.pid
$ cat hg.pid >> $DAEMON_PIDS
Ensure we're serving from the bundle
$ ("$TESTDIR/get-with-headers.py" localhost:$HGPORT 'file/tip/?style=raw')
200 Script output follows
-rw-r--r-- 2 bar
-rw-r--r-- 2 foo