setdiscovery: fix hang when #heads>200 (issue2971)
When setting up the next sample, we always add all of the heads, regardless
of the desired max sample size. But if the number of heads already exceeds
this size, we then add no further nodes from the still-undecided set.
(This behavior is debatable in itself, and I'll investigate it, but it's
how we designed it for now.)
The bug was that we always added the overall repository heads, not the heads
of the remaining undecided set. Thus, once #heads>200 (the desired sample
size), we no longer made any progress.
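
To illustrate, here is a minimal, self-contained Python sketch of the
sampling step. All names (next_sample, heads_of, the parents mapping) are
hypothetical and only stand in for the real logic in
mercurial/setdiscovery.py:

def heads_of(subset, parents):
    # Heads of `subset`: nodes that are not a parent of any other
    # node in `subset`.
    parented = {p for n in subset for p in parents.get(n, ()) if p in subset}
    return subset - parented

SAMPLE_SIZE = 200  # desired max sample size

def next_sample(undecided, repo_heads, parents, fixed=True):
    # Buggy behavior (fixed=False): seed the sample with the *overall*
    # repo heads. Once len(repo_heads) >= SAMPLE_SIZE, the cap below
    # means no node from the undecided set is ever added, the remote's
    # answers decide nothing new, and discovery loops forever.
    #
    # Fixed behavior (fixed=True): seed with the heads of the
    # *undecided* set instead.
    seed = heads_of(undecided, parents) if fixed else set(repo_heads)
    sample = set(seed)
    # The seed is always included in full; only the top-up below
    # honors the size cap.
    for node in undecided - sample:
        if len(sample) >= SAMPLE_SIZE:
            break
        sample.add(node)
    return sample

With the fix, every sample contains nodes drawn from the undecided set, so
each round of discovery decides something and the undecided set strictly
shrinks.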
$ hg init
$ echo 123 > a
$ hg add a
$ hg commit -m "first" a
$ mkdir sub
$ echo 321 > sub/b
$ hg add sub/b
$ hg commit -m "second" sub/b
$ cat sub/b
321
$ hg co 0
0 files updated, 0 files merged, 1 files removed, 0 files unresolved
$ cat sub/b 2>/dev/null || echo "sub/b not present"
sub/b not present
$ test -d sub || echo "sub not present"
sub not present