view hgext/largefiles/localstore.py @ 45095:8e04607023e5
procutil: ensure that procutil.std{out,err}.write() writes all bytes
Python 3 offers different kinds of streams, and it is not guaranteed for all of
them that a single write() call writes all bytes.
When Python is started in unbuffered mode, sys.std{out,err}.buffer are
instances of io.FileIO, whose write() can write fewer bytes than requested for
platform-specific reasons (e.g. Linux has a 0x7ffff000-byte maximum per write
and may write less if interrupted by a signal; when writing to Windows
consoles, writes are limited to 32767 bytes to avoid the "not enough space"
error). This can lead to silent loss of data, both when using
sys.std{out,err}.buffer (which may in fact not be a buffered stream) and when
using the text streams sys.std{out,err} (I’ve created a CPython bug report for
that: https://bugs.python.org/issue41221).
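As a hypothetical illustration (the 1 MiB payload and the check below are not
part of the patch), a raw stream reports how many bytes it actually accepted,
and a caller that ignores that return value silently drops the rest:

```python
import sys

payload = b'x' * (1024 * 1024)

# Under `python -u`, sys.stdout.buffer is the raw io.FileIO; its write()
# returns the number of bytes actually written, which may be less than
# len(payload) on the platforms mentioned above.
written = sys.stdout.buffer.write(payload)
if written is not None and written < len(payload):
    # Without retrying the remainder, payload[written:] is lost.
    pass
```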
Python may fix the problem at some point. For now, we implement our own wrapper
for procutil.std{out,err} that calls the raw stream’s write() method until all
bytes have been written. We don’t use sys.std{out,err} for larger writes, so I
think it’s not worth the effort to patch them.
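A minimal sketch of that retry loop (illustrative names, assuming a blocking
raw binary stream; not the actual procutil implementation) could look like
this:

```python
import sys


class writeallwrapper(object):
    """Wrap a raw binary stream and retry write() until every byte of the
    buffer has been accepted."""

    def __init__(self, orig):
        self._orig = orig

    def __getattr__(self, name):
        # Delegate flush(), fileno(), isatty(), ... to the wrapped stream.
        return getattr(self._orig, name)

    def write(self, s):
        m = memoryview(s)
        total = len(m)
        written = 0
        while written < total:
            # The raw write() may accept fewer bytes than requested
            # (signal interruption, per-write OS limits, console quirks);
            # keep writing the remaining slice.
            written += self._orig.write(m[written:])
        return total


# Hypothetical wiring: wrap the binary layer of stdout/stderr.
stdout = writeallwrapper(getattr(sys.stdout, 'buffer', sys.stdout))
stderr = writeallwrapper(getattr(sys.stderr, 'buffer', sys.stderr))
```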
author | Manuel Jacob <me@manueljacob.de>
---|---
date | Fri, 10 Jul 2020 12:27:58 +0200
parents | eef9a2d67051
children | 89a2afe31e82
# Copyright 2009-2010 Gregory P. Ward
# Copyright 2009-2010 Intelerad Medical Systems Incorporated
# Copyright 2010-2011 Fog Creek Software
# Copyright 2010-2011 Unity Technologies
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.

'''store class for local filesystem'''
from __future__ import absolute_import

from mercurial.i18n import _
from mercurial.pycompat import open

from mercurial import util

from . import (
    basestore,
    lfutil,
)


class localstore(basestore.basestore):
    '''localstore first attempts to grab files out of the store in the remote
    Mercurial repository. Failing that, it attempts to grab the files from
    the user cache.'''

    def __init__(self, ui, repo, remote):
        self.remote = remote.local()
        super(localstore, self).__init__(ui, repo, self.remote.url())

    def put(self, source, hash):
        if lfutil.instore(self.remote, hash):
            return
        lfutil.link(source, lfutil.storepath(self.remote, hash))

    def exists(self, hashes):
        retval = {}
        for hash in hashes:
            retval[hash] = lfutil.instore(self.remote, hash)
        return retval

    def _getfile(self, tmpfile, filename, hash):
        path = lfutil.findfile(self.remote, hash)
        if not path:
            raise basestore.StoreError(
                filename, hash, self.url, _(b"can't get file locally")
            )
        with open(path, b'rb') as fd:
            return lfutil.copyandhash(util.filechunkiter(fd), tmpfile)

    def _verifyfiles(self, contents, filestocheck):
        failed = False
        for cset, filename, expectedhash in filestocheck:
            storepath, exists = lfutil.findstorepath(self.repo, expectedhash)
            if not exists:
                storepath, exists = lfutil.findstorepath(
                    self.remote, expectedhash
                )
            if not exists:
                self.ui.warn(
                    _(b'changeset %s: %s references missing %s\n')
                    % (cset, filename, storepath)
                )
                failed = True
            elif contents:
                actualhash = lfutil.hashfile(storepath)
                if actualhash != expectedhash:
                    self.ui.warn(
                        _(b'changeset %s: %s references corrupted %s\n')
                        % (cset, filename, storepath)
                    )
                    failed = True
        return failed
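For orientation, a hedged usage sketch of the store methods defined above (the
helper names and the way store, tmpfile, and the hash values are obtained are
placeholders; wiring up ui/repo/remote is Mercurial-internal and omitted here):

```python
def fetch_if_missing(store, tmpfile, filename, sha):
    # exists() maps each requested hash to whether the remote repository's
    # largefile store already contains it.
    present = store.exists([sha])
    if not present[sha]:
        # _getfile() copies the blob into tmpfile, raising
        # basestore.StoreError when it cannot be found locally.
        store._getfile(tmpfile, filename, sha)


def verify_one(store, cset, filename, sha, check_contents=True):
    # _verifyfiles() takes (changeset, filename, expected hash) triples and
    # returns True if anything is missing or corrupted.
    return store._verifyfiles(check_contents, [(cset, filename, sha)])
```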