view .editorconfig @ 36845:ff2370a70fe8 stable

hgweb: garbage collect on every request

There appears to be a cycle in localrepository or hgweb that is preventing repositories from being garbage collected when hgwebdir dispatches to hgweb. Every request creates a new repository instance and then leaks that object and other referenced objects.

A periodic GC to find cycles will eventually collect the old repositories. But these don't run reliably, and rapid requests to hgwebdir can result in rapidly increasing memory consumption. With the Firefox repository, repeated requests to raw-file URLs leak ~100 MB per hgwebdir request (most of this appears to be cached manifest data structures). WSGI processes quickly grow to >1 GB RSS.

Breaking the cycles in localrepository is going to be a bit of work. Because we know that hgwebdir leaks localrepository instances, let's put a band aid on the problem in the form of an explicit gc.collect() on every hgwebdir request.

As the inline comment states, ideally we'd do this in a finally block for the current request iff it dispatches to hgweb. But _runwsgi() returns an explicit value. We need the finally to run after generator exhaustion. So we'd need to refactor _runwsgi() to "yield" instead of "return." That's too much change for a patch to stable. So we implement this hack one function above and run it on every request.

The performance impact of this change should be minimal. Any impact should be offset by benefits from not having hgwebdir processes leak memory.
author Gregory Szorc <gregory.szorc@gmail.com>
date Mon, 12 Mar 2018 13:15:00 -0700
parents d30fdd6d1bf7
children 1d6066336d7b
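
The band-aid described in the changeset message above can be pictured with a short, hypothetical Python sketch. The names below (LeakyDispatcher, _runrequest, run_request) are illustrative stand-ins, not Mercurial's actual hgwebdir code; they only show the shape of the fix: the inner function returns its response, so the gc.collect() happens in the generator one level above, once per request.

    import gc


    class LeakyDispatcher(object):
        """Toy dispatcher whose per-request state may leak via reference cycles."""

        def _runrequest(self, name):
            # Stands in for a function like _runwsgi() that builds the
            # response and *returns* it, so a try/finally inside it cannot
            # wait for the caller to finish consuming the response body.
            return [b'response for ', name]

        def run_request(self, name):
            # One function above the returning one, wrap the response in a
            # generator. The finally clause runs only after the caller has
            # exhausted (or abandoned) the generator, i.e. after the request
            # is fully served; gc.collect() then breaks unreachable reference
            # cycles so leaked per-request objects can be reclaimed.
            try:
                for chunk in self._runrequest(name):
                    yield chunk
            finally:
                gc.collect()


    # Example: each request is served in full, then a collection runs.
    dispatcher = LeakyDispatcher()
    body = b''.join(dispatcher.run_request(b'repo1'))

Because the collection lives in the wrapper's finally block, it runs on every request regardless of whether the inner dispatch actually opened a repository, which is exactly the trade-off the message accepts for a stable-branch patch.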
line source

# See http://EditorConfig.org for the specification

root = true

[*.py]
indent_size = 4
indent_style = space
trim_trailing_whitespace = true

[*.{c,h}]
indent_size = 8
indent_style = tab
trim_trailing_whitespace = true