tests/basic_test_result.py @ 40034:393e44324037

httppeer: report http statistics

Now that keepalive.py records the HTTP request count and the number of bytes sent and received as part of performing those requests, we can easily print a report on the activity when closing a peer instance!

Exact byte counts are globbed in tests because they are influenced by non-deterministic things, such as hostnames and port numbers. Plus, the exact byte count isn't too important anyway.

I feel obliged to note that printing the byte count could have security implications. e.g. if sending a password via HTTP basic auth, the length of that password will influence the byte count, and reporting the byte count could be a side-channel leak of the password length. I /think/ this is beyond our threshold for concern. But if we think it poses a problem, we can teach the byte count logging code to e.g. ignore sensitive HTTP request headers. We could also consider not reporting the byte count of request headers altogether. But since the wire protocol uses HTTP headers for sending command arguments, it is kind of important to report their size.

Differential Revision: https://phab.mercurial-scm.org/D4858
author Gregory Szorc <gregory.szorc@gmail.com>
date Mon, 01 Oct 2018 13:17:38 -0700
parents f4a214300957
children 2372284d9457
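
The report described in the message above lives in httppeer.py and keepalive.py, not in this test helper. As a rough sketch only: closing a peer might emit the summary along these lines, where ui.note() output is shown under --verbose, and the requestscount, sentbytescount, and receivedbytescount attributes on the opener are assumed names, not taken from this file.

    def close(self):
        # Hypothetical sketch of the statistics report printed when a
        # peer instance is closed. Attribute names are assumptions.
        try:
            self.ui.note(
                b'(sent %d HTTP requests and %d bytes; '
                b'received %d bytes in responses)\n'
                % (self._urlopener.requestscount,
                   self._urlopener.sentbytescount,
                   self._urlopener.receivedbytescount))
        finally:
            self._urlopener.close_all()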

from __future__ import absolute_import, print_function

import unittest

class TestResult(unittest._TextTestResult):

    def __init__(self, options, *args, **kwargs):
        super(TestResult, self).__init__(*args, **kwargs)
        self._options = options

        # unittest.TestResult didn't have skipped until 2.7. We need to
        # polyfill it.
        self.skipped = []

        # We have a custom "ignored" result that isn't present in any Python
        # unittest implementation. It is very similar to skipped. It may make
        # sense to map it into skip some day.
        self.ignored = []

        self.times = []
        self._firststarttime = None
        # Data stored for the benefit of generating xunit reports.
        self.successes = []
        self.faildata = {}

    def addFailure(self, test, reason):
        print("FAILURE!", test, reason)

    def addSuccess(self, test):
        print("SUCCESS!", test)

    def addError(self, test, err):
        print("ERR!", test, err)

    # Polyfill.
    def addSkip(self, test, reason):
        print("SKIP!", test, reason)

    def addIgnore(self, test, reason):
        print("IGNORE!", test, reason)

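    # onStart() and onEnd() are not part of the stock unittest result
    # API; they appear to be hooks for the harness (run-tests.py) to
    # invoke at the beginning and end of the whole run.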
    def onStart(self, test):
        print("ON_START!", test)

    def onEnd(self):
        print("ON_END!")

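    # Hook for when a test's actual output differs from the expected
    # output. Returning False here presumably means the mismatch is not
    # accepted, so the test is reported as a failure.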
    def addOutputMismatch(self, test, ret, got, expected):
        return False

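    # The 'interrupted' flag is assumed to mirror the richer TestResult
    # in run-tests.py; this basic implementation simply ignores it.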
    def stopTest(self, test, interrupted=False):
        super(TestResult, self).stopTest(test)
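
# A minimal, hypothetical driver for this class, for illustration only.
# On Python 2, unittest._TextTestResult takes (stream, descriptions,
# verbosity); the _Options stub stands in for the options object the
# real harness would pass.
if __name__ == '__main__':
    import sys

    class _Options(object):
        pass

    class DemoTest(unittest.TestCase):
        def runTest(self):
            pass

    result = TestResult(_Options(), sys.stdout, True, 1)
    demo = DemoTest()
    result.onStart(demo)
    demo.run(result)  # triggers addSuccess() above
    result.onEnd()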