worker: don't expose readinto() on _blockingreader since pickle is picky
author Martin von Zweigbergk <martinvonz@google.com>
Fri, 14 Aug 2020 20:45:49 -0700
changeset 45409 7d24201b6447
parent 45408 6ba7190ff863
child 45410 8b700e9b9fc2
worker: don't expose readinto() on _blockingreader since pickle is picky

The `pickle` module expects the input to be buffered and a whole object to be
available when `pickle.load()` is called, which is not necessarily true when we
send data from workers back to the parent process (i.e. it seems like a bad
assumption for the `pickle` module to make). We added a workaround for that in
https://phab.mercurial-scm.org/D8076, which made `read()` continue until all
the requested bytes have been read.

As we found out at work after a lot of investigation (I've spent the last two
days on this), the native (C) version of `pickle.load()` started calling
`readinto()` on its input in Python 3.8. That call was introduced in
https://github.com/python/cpython/commit/91f4380cedbae32b49adbea2518014a5624c6523,
and only the C version of `pickle.load()` makes it. Before that, only `read()`
and `readline()` were called. The problem was that `readinto()` on our
`_blockingreader` simply delegated to the underlying, *unbuffered* object, so
it could return fewer bytes than the unpickler expected.

The symptom we saw was that `hg fix` started failing sometimes on Python 3.8 on
Mac. It failed very reliably in some cases. I still haven't figured out under
what circumstances it fails, and I've been unable to reproduce it in test cases
(I've tried writing larger amounts of data, using different numbers of workers,
and making the formatters sleep). I have, however, been able to reproduce it
3-4 times on Linux, but then it stopped reproducing on the following few
hundred attempts.

To fix the problem, we can simply remove the implementation of `readinto()`:
the unpickler will then fall back to calling `read()`. That fallback was added
a bit later, in
https://github.com/python/cpython/commit/b19f7ecfa3adc6ba1544225317b9473649815b38.
However, that commit also added a check that what `read()` returns is a
`bytes`, so we also need to convert the `bytearray` we use into that. At least
I was able to add a test for that failure.

Differential Revision: https://phab.mercurial-scm.org/D8928
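
To make the description above concrete, here is a small standalone sketch (not
part of the change; the `LoggingReader` wrapper and its names are made up for
illustration) that feeds `pickle.load()` an object exposing only `read()` and
`readline()`, which is the contract `_blockingreader` provides after this
change:

  # Illustration only: a made-up wrapper recording which methods the
  # unpickler calls when readinto() is not provided.
  import io
  import pickle

  class LoggingReader(object):
      def __init__(self, data):
          self._inner = io.BytesIO(data)
          self.calls = []

      def read(self, size=-1):
          self.calls.append('read')
          # io.BytesIO.read() returns bytes; per the CPython commit above,
          # the C unpickler rejects anything else (e.g. a bytearray).
          return self._inner.read(size)

      def readline(self):
          self.calls.append('readline')
          return self._inner.readline()

  data = pickle.dumps(list(range(1000)))
  reader = LoggingReader(data)
  assert pickle.load(reader) == list(range(1000))
  print(reader.calls)

Under the C unpickler on Python 3.8+, the recorded calls should show only
`read` (and possibly `readline`), since `readinto()` is absent and the fallback
from the second CPython commit above kicks in.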
mercurial/worker.py
tests/test-fix-pickle.t
--- a/mercurial/worker.py	Tue Aug 18 15:03:57 2020 -0700
+++ b/mercurial/worker.py	Fri Aug 14 20:45:49 2020 -0700
@@ -71,8 +71,12 @@
         def __init__(self, wrapped):
             self._wrapped = wrapped
 
-        def __getattr__(self, attr):
-            return getattr(self._wrapped, attr)
+        # Do NOT implement readinto() by making it delegate to
+        # _wrapped.readinto(), since that is unbuffered. The unpickler is fine
+        # with just read() and readline(), so we don't need to implement it.
+
+        def readline(self):
+            return self._wrapped.readline()
 
         # issue multiple reads until size is fulfilled
         def read(self, size=-1):
@@ -91,7 +95,7 @@
 
             del view
             del buf[pos:]
-            return buf
+            return bytes(buf)
 
 
 else:
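
The hunk above shows only the tail of `read()`. For context, the
read-until-size approach it belongs to is roughly the following; this is a
simplified sketch with a made-up helper name (`_read_up_to`) and parameter
(`raw`), not the verbatim worker.py code:

  def _read_up_to(raw, size):
      # Collect up to `size` bytes from the unbuffered `raw`, retrying
      # short reads until the request is satisfied or EOF is hit.
      buf = bytearray(size)
      view = memoryview(buf)
      pos = 0
      while pos < size:
          ret = raw.readinto(view[pos:])
          if not ret:
              break  # EOF: return a short (possibly empty) result
          pos += ret
      del view           # release the memoryview before resizing buf
      del buf[pos:]      # drop the unfilled tail after a short read
      return bytes(buf)  # the C unpickler insists on bytes, not bytearray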
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/tests/test-fix-pickle.t	Fri Aug 14 20:45:49 2020 -0700
@@ -0,0 +1,45 @@
+A script that implements uppercasing all letters in a file.
+
+  $ UPPERCASEPY="$TESTTMP/uppercase.py"
+  $ cat > $UPPERCASEPY <<EOF
+  > import sys
+  > from mercurial.utils.procutil import setbinary
+  > setbinary(sys.stdin)
+  > setbinary(sys.stdout)
+  > sys.stdout.write(sys.stdin.read().upper())
+  > EOF
+  $ TESTLINES="foo\nbar\nbaz\n"
+  $ printf $TESTLINES | "$PYTHON" $UPPERCASEPY
+  FOO
+  BAR
+  BAZ
+
+This file attempts to test our workarounds for pickle's lack of
+support for short reads.
+
+  $ cat >> $HGRCPATH <<EOF
+  > [extensions]
+  > fix =
+  > [fix]
+  > uppercase-whole-file:command="$PYTHON" $UPPERCASEPY
+  > uppercase-whole-file:pattern=set:**
+  > EOF
+
+  $ hg init repo
+  $ cd repo
+
+# Create a file that's large enough that it seems to not fit in
+# pickle's buffer, making it use the code path that expects our
+# _blockingreader's read() method to return bytes.
+  $ echo "some stuff" > file
+  $ for i in $($TESTDIR/seq.py 13); do
+  >   cat file file > tmp
+  >   mv -f tmp file
+  > done
+  $ hg commit -Am "add large file"
+  adding file
+
+Check that we don't get a crash
+
+  $ hg fix -r .
+  saved backup bundle to $TESTTMP/repo/.hg/strip-backup/*-fix.hg (glob)