view tests/basic_test_result.py @ 44861:065421e12248

files: speed up `hg files` when no flags change display

This is not the first time that slowness in this command has slowed down
tools built on top of hg. Before this change, the majority of the time was
spent merely printing the result, which is clearly not how it should be
(especially since the computation of the result also looks slow).

Running `hg files` in mozilla-central:
  parent revision:                                        1.260s
  this commit:                                            0.683s
  this commit without batching ui.write:                  0.931s
  this commit replacing the body of the loop with `pass`: 0.566s

This looks like a prime candidate for a Rust fast path, but until then it
seems reasonable to optimize the Python (a sketch of the batching idea
follows this header block).

Differential Revision: https://phab.mercurial-scm.org/D8586
author Valentin Gatien-Baron <valentin.gatienbaron@gmail.com>
date Tue, 26 May 2020 08:15:09 -0400
parents 2372284d9457
children 3a95a4e660b9
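
The batching the message refers to can be illustrated with a minimal sketch.
This is not the actual D8586 patch: `print_files` and its parameters are made
up for illustration, and only `ui.write()` is a real Mercurial API.

    def print_files(ui, names, end=b'\n'):
        # Join all formatted entries and emit them with a single ui.write()
        # call instead of one call per file, so the per-call overhead is
        # paid once rather than once per matched file.
        ui.write(b''.join(name + end for name in names))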

from __future__ import absolute_import, print_function

import unittest


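# A text test result whose hooks mostly just print the event they receive,
# which makes it easy to observe which hooks the test runner invokes.
# (unittest._TextTestResult is the long-standing alias of
# unittest.TextTestResult.)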
class TestResult(unittest._TextTestResult):
    def __init__(self, options, *args, **kwargs):
        super(TestResult, self).__init__(*args, **kwargs)
        self._options = options

        # unittest.TestResult didn't have skipped until 2.7. We need to
        # polyfill it.
        self.skipped = []

        # We have a custom "ignored" result that isn't present in any Python
        # unittest implementation. It is very similar to skipped. It may make
        # sense to map it into skip some day.
        self.ignored = []

        self.times = []
        self._firststarttime = None
        # Data stored for the benefit of generating xunit reports.
        self.successes = []
        self.faildata = {}

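    # Standard unittest result hooks, reduced here to printing the event
    # rather than recording it on the result object.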
    def addFailure(self, test, reason):
        print("FAILURE!", test, reason)

    def addSuccess(self, test):
        print("SUCCESS!", test)

    def addError(self, test, err):
        print("ERR!", test, err)

    # Polyfill.
    def addSkip(self, test, reason):
        print("SKIP!", test, reason)

    def addIgnore(self, test, reason):
        print("IGNORE!", test, reason)

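    # onStart/onEnd are hooks used by Mercurial's run-tests.py runner around
    # a whole test run; they are not part of the stock unittest result API.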
    def onStart(self, test):
        print("ON_START!", test)

    def onEnd(self):
        print("ON_END!")

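    # run-tests.py specific hook, called when a test's actual output differs
    # from the expected output; returning False indicates the mismatch was
    # not accepted here (e.g. no interactive fix-up happened).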
    def addOutputMismatch(self, test, ret, got, expected):
        return False

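    # run-tests.py passes an extra `interrupted` flag; it is ignored here and
    # the call is forwarded to the base class unchanged.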
    def stopTest(self, test, interrupted=False):
        super(TestResult, self).stopTest(test)
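

# --- Illustrative usage (not part of the original file) ---------------------
# A minimal, self-contained sketch of how this result class receives events,
# assuming plain unittest drives it on a Python that still exposes
# unittest._TextTestResult. The `options=None` argument and the `_DemoTest`
# case are hypothetical stand-ins for what run-tests.py normally supplies.
if __name__ == '__main__':
    import sys

    class _DemoTest(unittest.TestCase):
        def test_pass(self):
            self.assertTrue(True)

    result = TestResult(None, sys.stdout, descriptions=True, verbosity=1)
    suite = unittest.TestLoader().loadTestsFromTestCase(_DemoTest)
    # Running the suite should print a "SUCCESS!" line via addSuccess above.
    suite.run(result)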