view tests/svnxml.py @ 21932:21a2f31f054d stable

largefiles: use "normallookup", if "mtime" of standin is unset

Before this patch, largefiles gotten from the "other" revision (without conflict) at "hg merge" unexpectedly become "clean" through the steps below:

  1. "merge.update()" is invoked
     1-1. standinfile SF is updated in the working directory
     1-2. the "dirstate" entry for SF is "normallookup"-ed
  2. "lfcommands.updatelfiles()" is invoked (by "overrides.hgmerge()")
     2-1. largefile LF (for SF) is updated in the working directory
     2-2. "dirstate" returns "n" for SF (because of 1-2)
     2-3. the "lfdirstate" entry for LF is "normal"-ed
     2-4. "lfdirstate" is written into ".hg/largefiles/dirstate", and the timestamp of LF is stored into the "lfdirstate" file
     (ASSUMPTION: the timestamp of LF differs from that of the "lfdirstate" file)

Then "hg status" treats LF as "clean", even though LF was updated from the "other" revision (in 2-1), because "lfilesrepo.status()" always treats "normal"-ed files (from 2-3 and 2-4) as "clean".

When the timestamp is not set (= negative value) for a standinfile in "dirstate", the largefile should be "normallookup"-ed regardless of whether a rebase is in progress, because an "n" state in "dirstate" does not ensure the "clean"-ness of the standinfile at that time. This patch therefore uses "normallookup" instead of "normal" when the "mtime" of the standin is unset.

This is a temporary fix requiring few changes. The fundamental resolution of this kind of problem in the future is to update "lfdirstate" together with "dirstate" during "merge.update" execution, perhaps by hooking "recordupdates". That is also why this patch (temporarily) uses the internal field "_map" of "dirstate" directly.

This patch uses the "[debug] dirstate.delaywrite" feature in the test to ensure that the timestamp of the largefile gotten from the "other" revision is stored into ".hg/largefiles/dirstate" (for the ASSUMPTION at 2-4).

This patch newly adds "test-largefiles-update.t" to avoid increasing the cost of running the other largefiles tests in subsequent patches (in particular, "[debug] dirstate.delaywrite" makes them noticeably slower).
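The rule the message describes is small: read the standin's entry in "dirstate", and if its state is "n" but its mtime is unset (negative), mark the largefile "normallookup" in "lfdirstate" instead of "normal". The sketch below only illustrates that rule; the function name, arguments and surrounding control flow are assumptions for illustration, not the actual hunk from "lfcommands.updatelfiles()".

def mark_largefile_state(repo, lfdirstate, lfile, standin, in_rebase):
    # Illustrative sketch only: names and structure are assumed, not the
    # real patch. It encodes the rule from the message above: an 'n'
    # state whose mtime is unset does not prove the standin is clean.
    if standin in repo.dirstate:
        entry = repo.dirstate._map[standin]   # (state, mode, size, mtime)
        state, mtime = entry[0], entry[3]
    else:
        state, mtime = '?', -1
    if state == 'n':
        if in_rebase or mtime < 0:
            # force a real comparison at the next "hg status"
            lfdirstate.normallookup(lfile)
        else:
            lfdirstate.normal(lfile)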
author FUJIWARA Katsunori <foozy@lares.dti.ne.jp>
date Tue, 22 Jul 2014 23:59:34 +0900
parents c58bdecdb800
children 812eb3b7dc43
line wrap: on
line source

# Read the output of a "svn log --xml" command on stdin, parse it and
# print a subset of attributes common to all svn versions tested by
# hg.
import sys
import xml.dom.minidom

def xmltext(e):
    # Concatenate the text nodes directly under element e.
    return ''.join(c.data for c
                   in e.childNodes
                   if c.nodeType == c.TEXT_NODE)

def parseentry(entry):
    # Convert one <logentry> element into a dict holding its revision,
    # author, commit message and changed paths.
    e = {}
    e['revision'] = entry.getAttribute('revision')
    e['author'] = xmltext(entry.getElementsByTagName('author')[0])
    e['msg'] = xmltext(entry.getElementsByTagName('msg')[0])
    e['paths'] = []
    paths = entry.getElementsByTagName('paths')
    if paths:
        paths = paths[0]
        for p in paths.getElementsByTagName('path'):
            action = p.getAttribute('action')
            path = xmltext(p)
            frompath = p.getAttribute('copyfrom-path')
            fromrev = p.getAttribute('copyfrom-rev')
            e['paths'].append((path, action, frompath, fromrev))
    return e

def parselog(data):
    # Parse a complete "svn log --xml" document into a list of entry dicts.
    entries = []
    doc = xml.dom.minidom.parseString(data)
    for e in doc.getElementsByTagName('logentry'):
        entries.append(parseentry(e))
    return entries

def printentries(entries):
    # Write UTF-8 encoded bytes: use the byte-oriented stream when one
    # exists (Python 3) and fall back to sys.stdout itself (Python 2).
    fp = getattr(sys.stdout, 'buffer', sys.stdout)
    for e in entries:
        for k in ('revision', 'author', 'msg'):
            fp.write(('%s: %s\n' % (k, e[k])).encode('utf-8'))
        for path, action, fpath, frev in sorted(e['paths']):
            frominfo = ''
            if frev:
                frominfo = ' (from %s@%s)' % (fpath, frev)
            p = ' %s %s%s\n' % (action, path, frominfo)
            fp.write(p.encode('utf-8'))

if __name__ == '__main__':
    data = sys.stdin.read()
    entries = parselog(data)
    printentries(entries)
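
For a quick look at the output format without a Subversion repository, the functions can be driven directly with a canned log document. The snippet below is only an illustration: the XML content is invented, and it assumes it is run from the tests/ directory so that the module is importable.

# Hypothetical smoke test for svnxml.py; the sample data is made up.
import svnxml

sample = '''<?xml version="1.0"?>
<log>
  <logentry revision="2">
    <author>alice</author>
    <msg>copy a onto a branch</msg>
    <paths>
      <path action="A" copyfrom-path="/trunk/a" copyfrom-rev="1">/branches/b/a</path>
    </paths>
  </logentry>
</log>
'''

svnxml.printentries(svnxml.parselog(sample))
# prints:
# revision: 2
# author: alice
# msg: copy a onto a branch
#  A /branches/b/a (from /trunk/a@1)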