view tests/test-newcgi.t @ 16120:47ee41fcf42b

largefiles: optimize update speed by only updating changed largefiles

Historically, during 'hg update', every largefile in the working copy was
hashed (which is a very expensive operation on big files) and any largefiles
that did not have a hash matching their standin were updated.

This patch optimizes 'hg update' by keeping track of which standins have
changed between the old and new revisions, and only updating the largefiles
that have changed. This saves a lot of time by avoiding the unnecessary
calculation of a list of sha1 hashes for big files. With this patch, the time
'hg update' takes to complete is a function of how many largefiles need to be
updated and what their size is.

Performance tests on a repository with about 80 largefiles ranging from a few
MB to about 97 MB are shown below. The tests show how long it takes to run
'hg update' with no changes actually being updated.

Mercurial 2.1 release:

$ time hg update
0 files updated, 0 files merged, 0 files removed, 0 files unresolved
getting changed largefiles
0 largefiles updated, 0 removed

real    0m10.045s
user    0m9.367s
sys     0m0.674s

With this patch:

$ time hg update
0 files updated, 0 files merged, 0 files removed, 0 files unresolved

real    0m0.965s
user    0m0.845s
sys     0m0.115s

The same repository, without the largefiles extension enabled:

$ time hg update
0 files updated, 0 files merged, 0 files removed, 0 files unresolved

real    0m0.799s
user    0m0.684s
sys     0m0.111s

So before the patch, 'hg update' with no changes was approximately 9.25s
slower with largefiles enabled. With this patch, it is approximately 0.165s
slower.
author Na'Tosha Bard <natosha@unity3d.com>
date Mon, 13 Feb 2012 18:37:07 +0100
parents 8b84d040d9f9
children 7a9cbb315d84
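
The changeset description above boils down to comparing standin hashes between
the two revisions and refreshing only the largefiles whose standins differ. A
minimal, hypothetical Python sketch of that idea (the function names and data
shapes are illustrative only, not Mercurial's actual largefiles internals):

def changed_largefiles(old_standins, new_standins):
    """Return largefile paths whose standin hash differs between revisions.

    Both arguments map a standin path to the sha1 hash recorded for it in
    that revision; a missing entry means the largefile was added or removed.
    """
    return [path for path, h in new_standins.items()
            if old_standins.get(path) != h]

def update_largefiles(old_standins, new_standins, copy_from_cache):
    # Instead of re-hashing every largefile in the working copy (expensive
    # for multi-megabyte files), only touch the ones whose standins changed.
    updated = 0
    for path in changed_largefiles(old_standins, new_standins):
        copy_from_cache(path, new_standins[path])  # hypothetical helper
        updated += 1
    return updated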

  $ "$TESTDIR/hghave" no-msys || exit 80 # MSYS will translate web paths as if they were file paths

This tests whether CGI scripts written after d0db3462d568 but
before d74fc8dec2b4 still work.

  $ hg init test
  $ cat >hgweb.cgi <<HGWEB
  > #!/usr/bin/env python
  > #
  > # An example CGI script to use hgweb, edit as necessary
  > 
  > import cgitb
  > cgitb.enable()
  > 
  > from mercurial import demandimport; demandimport.enable()
  > from mercurial.hgweb import hgweb
  > from mercurial.hgweb import wsgicgi
  > from mercurial.hgweb.request import wsgiapplication
  > 
  > def make_web_app():
  >     return hgweb("test", "Empty test repository")
  > 
  > wsgicgi.launch(wsgiapplication(make_web_app))
  > HGWEB

  $ chmod 755 hgweb.cgi

  $ cat >hgweb.config <<HGWEBDIRCONF
  > [paths]
  > test = test
  > HGWEBDIRCONF

  $ cat >hgwebdir.cgi <<HGWEBDIR
  > #!/usr/bin/env python
  > #
  > # An example CGI script to export multiple hgweb repos, edit as necessary
  > 
  > import cgitb
  > cgitb.enable()
  > 
  > from mercurial import demandimport; demandimport.enable()
  > from mercurial.hgweb import hgwebdir
  > from mercurial.hgweb import wsgicgi
  > from mercurial.hgweb.request import wsgiapplication
  > 
  > def make_web_app():
  >     return hgwebdir("hgweb.config")
  > 
  > wsgicgi.launch(wsgiapplication(make_web_app))
  > HGWEBDIR

  $ chmod 755 hgwebdir.cgi
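
Both scripts above go through the intermediate wsgiapplication wrapper that
existed between d0db3462d568 and d74fc8dec2b4. After d74fc8dec2b4 the hgweb
and hgwebdir objects are WSGI applications themselves, so a later-style
hgweb.cgi looks roughly like this (repository path and name are placeholders,
not part of this test):

#!/usr/bin/env python
# Rough later-style equivalent of hgweb.cgi above: the hgweb object is
# handed straight to wsgicgi.launch, with no wsgiapplication wrapper.
from mercurial import demandimport; demandimport.enable()
from mercurial.hgweb import hgweb, wsgicgi
application = hgweb("test", "Empty test repository")
wsgicgi.launch(application)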

  $ . "$TESTDIR/cgienv"
  $ python hgweb.cgi > page1
  $ python hgwebdir.cgi > page2

  $ PATH_INFO="/test/"
  $ PATH_TRANSLATED="/var/something/test.cgi"
  $ REQUEST_URI="/test/test/"
  $ SCRIPT_URI="http://hg.omnifarious.org/test/test/"
  $ SCRIPT_URL="/test/test/"
  $ python hgwebdir.cgi > page3
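
The variables sourced from cgienv and overridden above reach the scripts
through the process environment: wsgicgi builds the WSGI request out of
os.environ, so page3 is rendered as if a client had asked the hgwebdir for
/test/test/. A tiny illustration of that mechanism (not part of the test
itself):

import os
# CGI passes request metadata via environment variables; the CGI-to-WSGI
# gateway reads them back to decide which repository and page to serve.
path_info = os.environ.get("PATH_INFO", "/")
print("request path: " + path_info)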

  $ grep -i error page1 page2 page3
  [1]