comparison hgext/largefiles/remotestore.py @ 30180:736f92c44656

largefiles: always use filechunkiter when iterating files

Before, we would sometimes use the default iterator over large files. That
iterator is line based and would add extra buffering and use odd chunk sizes
which could give some overhead.

copyandhash can't just apply a filechunkiter as it sometimes is passed a
genuine generator when downloading remotely.
author Mads Kiilerich <madski@unity3d.com>
date Wed, 12 Oct 2016 12:22:18 +0200
parents 3dcaf1c4e90d
children dcdc17551653
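The commit message contrasts Python's default line-based file iteration with fixed-size chunk iteration. A minimal sketch of what a filechunkiter-style helper does (simplified from Mercurial's util.filechunkiter; the chunk size and name here are illustrative, not the exact upstream implementation):

```python
import io

def filechunkiter(f, size=131072):
    # Simplified sketch: yield fixed-size chunks until EOF,
    # regardless of line boundaries.
    while True:
        chunk = f.read(size)
        if not chunk:
            break
        yield chunk

# The default file iterator instead yields one chunk per newline,
# which for large (often binary) files means many small, oddly
# sized reads; fixed-size chunks avoid that buffering overhead.
data = b"x" * 10 + b"\n" + b"y" * 10
chunks = list(filechunkiter(io.BytesIO(data), size=8))
```

With a 21-byte input and an 8-byte chunk size, this yields chunks of 8, 8, and 5 bytes, independent of where the newline falls.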
comparing 30179:cdef35b38026 with 30180:736f92c44656
@@ -116,11 +116,11 @@
     def _put(self, hash, fd):
         '''Put file with the given hash in the remote store.'''
         raise NotImplementedError('abstract method')

     def _get(self, hash):
-        '''Get file with the given hash from the remote store.'''
+        '''Get a iterator for content with the given hash.'''
         raise NotImplementedError('abstract method')

     def _stat(self, hashes):
         '''Get information about availability of files specified by
         hashes in the remote store. Return dictionary mapping hashes
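The `_get` docstring change reflects that callers now receive a chunk iterator rather than a file object, and the commit message notes that copyandhash must accept either a chunked file or a genuine download generator. A sketch of that copyandhash pattern (hashing with SHA-1 as largefiles does; the exact signature of the upstream lfutil helper may differ):

```python
import hashlib
import io

def copyandhash(instream, outfile):
    # Sketch of the copyandhash pattern: copy chunks from any
    # iterator (filechunkiter output or a remote-download generator)
    # to outfile while computing the SHA-1 digest on the fly.
    hasher = hashlib.sha1()
    for chunk in instream:
        hasher.update(chunk)
        outfile.write(chunk)
    return hasher.hexdigest()

# Works with any iterable of byte chunks, e.g. a plain generator.
out = io.BytesIO()
digest = copyandhash(iter([b"large", b"file"]), out)
```

Because it only iterates its input, the helper is indifferent to whether the chunks come from a local file or a streaming HTTP response, which is why the commit applies filechunkiter at the file-opening call sites rather than inside copyandhash itself.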