revlog: break up compression of large deltas
Python's zlib apparently makes an internal copy of strings passed to
compress(). To avoid this, compress strings 1M at a time, then join
them at the end if the result would be smaller than the original.
For initial commits of large but compressible files, this cuts peak
memory usage nearly in half.
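A minimal sketch of the idea described above (the helper name is illustrative, not the actual revlog.py change; the 2**20 chunk size is the 1M from the description):

import zlib

def compress_chunked(text, chunksize=2**20):
    """Compress text one chunk at a time, returning the joined
    result, or None if compression would not make it smaller."""
    z = zlib.compressobj()
    parts = []
    pos = 0
    while pos < len(text):
        # compressobj consumes a small slice at a time, so zlib
        # never copies the whole input string at once
        parts.append(z.compress(text[pos:pos + chunksize]))
        pos += chunksize
    parts.append(z.flush())
    # only join (one extra copy, bounded by the output size,
    # not the input size) if compression actually won
    if sum(map(len, parts)) < len(text):
        return "".join(parts)
    return None  # caller stores the text uncompressed

The final join does allocate one more copy of the data, but of the compressed output rather than the original text, which is what keeps the peak for large compressible files near half of what a single zlib.compress() call needs.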
#!/bin/sh
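# no server is listening yet, so this clone should fail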
hg clone http://localhost:$HGPORT/ copy
echo $?
test -d copy || echo copy: No such file or directory
cat > dumb.py <<EOF
import BaseHTTPServer, SimpleHTTPServer, os, signal, sys

def run(server_class=BaseHTTPServer.HTTPServer,
        handler_class=SimpleHTTPServer.SimpleHTTPRequestHandler):
    server_address = ('localhost', int(os.environ['HGPORT']))
    httpd = server_class(server_address, handler_class)
    httpd.serve_forever()

# exit cleanly on SIGTERM so the test harness can shut us down
signal.signal(signal.SIGTERM, lambda signum, frame: sys.exit(0))
run()
EOF
python dumb.py 2>/dev/null &
echo $! >> $DAEMON_PIDS
# give the server some time to start running
sleep 1
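# no repository exists at /foo, so the clone should fail with a 404;
# sed normalizes the volatile error text and Date header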
http_proxy= hg clone http://localhost:$HGPORT/foo copy2 2>&1 | \
sed -e 's/404.*/404/' -e 's/Date:.*/Date:/'
echo $?
kill $!