# HG changeset patch # User Nikolaj Sjujskij # Date 1368558263 -14400 # Node ID 838c6b72928d98280f7976728119ba964e3bd5f2 # Parent 0890e6fd3e00b40170a9414a28f59ac95967143a# Parent ddc7a6be20212d18f3e27d9d7e6f079a66d96f21 merge in i18n-ru diff -r 0890e6fd3e00 -r 838c6b72928d .hgsigs --- a/.hgsigs Sun May 12 15:35:53 2013 +0400 +++ b/.hgsigs Tue May 14 23:04:23 2013 +0400 @@ -64,3 +64,10 @@ 0c10cf8191469e7c3c8844922e17e71a176cb7cb 0 iD8DBQBQvQWoywK+sNU5EO8RAnq3AJoCn98u4geFx5YaQaeh99gFhCd7bQCgjoBwBSUyOvGd0yBy60E3Vv3VZhM= a4765077b65e6ae29ba42bab7834717b5072d5ba 0 iD8DBQBQ486sywK+sNU5EO8RAhmJAJ90aLfLKZhmcZN7kqphigQJxiFOQACeJ5IUZxjGKH4xzi3MrgIcx9n+dB0= f5fbe15ca7449f2c9a3cf817c86d0ae68b307214 0 iD8DBQBQ+yuYywK+sNU5EO8RAm9JAJoD/UciWvpGeKBcpGtZJBFJVcL/HACghDXSgQ+xQDjB+6uGrdgAQsRR1Lg= +a6088c05e43a8aee0472ca3a4f6f8d7dd914ebbf 0 iD8DBQBRDDROywK+sNU5EO8RAh75AJ9uJCGoCWnP0Lv/+XuYs4hvUl+sAgCcD36QgAnuw8IQXrvv684BAXAnHcA= +7511d4df752e61fe7ae4f3682e0a0008573b0402 0 iD8DBQBRFYaoywK+sNU5EO8RAuErAJoDyhXn+lptU3+AevVdwAIeNFyR2gCdHzPHyWd+JDeWCUR+pSOBi8O2ppM= +5b7175377babacce80a6c1e12366d8032a6d4340 0 iD8DBQBRMCYgywK+sNU5EO8RAq1/AKCWKlt9ysibyQgYwoxxIOZv5J8rpwCcDSHQaaf1fFZUTnQsOePwcM2Y/Sg= +50c922c1b5145dab8baefefb0437d363b6a6c21c 0 iD8DBQBRWnUnywK+sNU5EO8RAuQRAJwM42cJqJPeqJ0jVNdMqKMDqr4dSACeP0cRVGz1gitMuV0x8f3mrZrqc7I= +8a7bd2dccd44ed571afe7424cd7f95594f27c092 0 iD8DBQBRXfBvywK+sNU5EO8RAn+LAKCsMmflbuXjYRxlzFwId5ptm8TZcwCdGkyLbZcASBOkzQUm/WW1qfknJHU= +292cd385856d98bacb2c3086f8897bc660c2beea 0 iD8DBQBRcM0BywK+sNU5EO8RAjp4AKCJBykQbvXhKuvLSMxKx3a2TBiXcACfbr/kLg5GlZTF/XDPmY+PyHgI/GM= +23f785b38af38d2fca6b8f3db56b8007a84cd73a 0 iD8DBQBRgZwNywK+sNU5EO8RAmO4AJ4u2ILGuimRP6MJgE2t65LZ5dAdkACgiENEstIdrlFC80p+sWKD81kKIYI= diff -r 0890e6fd3e00 -r 838c6b72928d .hgtags --- a/.hgtags Sun May 12 15:35:53 2013 +0400 +++ b/.hgtags Tue May 14 23:04:23 2013 +0400 @@ -77,3 +77,10 @@ 0c10cf8191469e7c3c8844922e17e71a176cb7cb 2.4.1 a4765077b65e6ae29ba42bab7834717b5072d5ba 2.4.2 f5fbe15ca7449f2c9a3cf817c86d0ae68b307214 2.5-rc +a6088c05e43a8aee0472ca3a4f6f8d7dd914ebbf 2.5 +7511d4df752e61fe7ae4f3682e0a0008573b0402 2.5.1 +5b7175377babacce80a6c1e12366d8032a6d4340 2.5.2 +50c922c1b5145dab8baefefb0437d363b6a6c21c 2.5.3 +8a7bd2dccd44ed571afe7424cd7f95594f27c092 2.5.4 +292cd385856d98bacb2c3086f8897bc660c2beea 2.6-rc +23f785b38af38d2fca6b8f3db56b8007a84cd73a 2.6 diff -r 0890e6fd3e00 -r 838c6b72928d Makefile --- a/Makefile Sun May 12 15:35:53 2013 +0400 +++ b/Makefile Tue May 14 23:04:23 2013 +0400 @@ -94,6 +94,9 @@ test-%: cd tests && $(PYTHON) run-tests.py $(TESTFLAGS) $@ +check-code: + hg manifest | xargs python contrib/check-code.py + update-pot: i18n/hg.pot i18n/hg.pot: $(PYFILES) $(DOCFILES) diff -r 0890e6fd3e00 -r 838c6b72928d contrib/bash_completion --- a/contrib/bash_completion Sun May 12 15:35:53 2013 +0400 +++ b/contrib/bash_completion Tue May 14 23:04:23 2013 +0400 @@ -1,4 +1,4 @@ -# bash completion for the Mercurial distributed SCM +# bash completion for the Mercurial distributed SCM -*- sh -*- # Docs: # @@ -80,26 +80,20 @@ done } -_hg_status() +_hg_debugpathcomplete() { - local files="$(_hg_cmd status -n$1 .)" + local files="$(_hg_cmd debugpathcomplete $1 "$cur")" local IFS=$'\n' compopt -o filenames 2>/dev/null COMPREPLY=(${COMPREPLY[@]:-} $(compgen -W '$files' -- "$cur")) } -_hg_tags() +_hg_status() { - local tags="$(_hg_cmd tags -q)" + local files="$(_hg_cmd status -n$1 "glob:$cur**")" local IFS=$'\n' - COMPREPLY=(${COMPREPLY[@]:-} $(compgen -W '$tags' -- "$cur")) -} - -_hg_branches() -{ - local branches="$(_hg_cmd branches -q)" - 
local IFS=$'\n' - COMPREPLY=(${COMPREPLY[@]:-} $(compgen -W '$branches' -- "$cur")) + compopt -o filenames 2>/dev/null + COMPREPLY=(${COMPREPLY[@]:-} $(compgen -W '$files' -- "$cur")) } _hg_bookmarks() @@ -111,9 +105,9 @@ _hg_labels() { - _hg_tags - _hg_branches - _hg_bookmarks + local labels="$(_hg_cmd debuglabelcomplete "$cur")" + local IFS=$'\n' + COMPREPLY=(${COMPREPLY[@]:-} $(compgen -W '$labels' -- "$cur")) } # this is "kind of" ugly... @@ -235,7 +229,7 @@ fi _hg_labels ;; - manifest|update) + manifest|update|up|checkout|co) _hg_labels ;; pull|push|outgoing|incoming) @@ -251,20 +245,20 @@ merge) _hg_labels ;; - commit|record) + commit|ci|record) _hg_status "mar" ;; - remove) - _hg_status "d" + remove|rm) + _hg_debugpathcomplete -n ;; forget) - _hg_status "a" + _hg_debugpathcomplete -fa ;; diff) _hg_status "mar" ;; revert) - _hg_status "mard" + _hg_debugpathcomplete ;; clone) local count=$(_hg_count_non_option) @@ -294,13 +288,6 @@ # Completion for commands provided by extensions # bookmarks -_hg_bookmarks() -{ - local bookmarks="$(_hg_cmd bookmarks --quiet )" - local IFS=$'\n' - COMPREPLY=(${COMPREPLY[@]:-} $(compgen -W '$bookmarks' -- "$cur")) -} - _hg_cmd_bookmarks() { if [[ "$prev" = @(-d|--delete|-m|--rename) ]]; then diff -r 0890e6fd3e00 -r 838c6b72928d contrib/check-code.py --- a/contrib/check-code.py Sun May 12 15:35:53 2013 +0400 +++ b/contrib/check-code.py Tue May 14 23:04:23 2013 +0400 @@ -19,7 +19,8 @@ def reppython(m): comment = m.group('comment') if comment: - return "#" * len(comment) + l = len(comment.rstrip()) + return "#" * l + comment[l:] return repquote(m) def repcomment(m): @@ -73,13 +74,16 @@ (r'/dev/u?random', "don't use entropy, use /dev/zero"), (r'do\s*true;\s*done', "don't use true as loop body, use sleep 0"), (r'^( *)\t', "don't use tabs to indent"), + (r'sed (-e )?\'(\d+|/[^/]*/)i(?!\\\n)', + "put a backslash-escaped newline after sed 'i' command"), ], # warnings [ (r'^function', "don't use 'function', use old style"), (r'^diff.*-\w*N', "don't use 'diff -N'"), - (r'\$PWD', "don't use $PWD, use `pwd`"), + (r'\$PWD|\${PWD}', "don't use $PWD, use `pwd`"), (r'^([^"\'\n]|("[^"\n]*")|(\'[^\'\n]*\'))*\^', "^ must be quoted"), + (r'kill (`|\$\()', "don't use kill, use killdaemons.py") ] ] @@ -88,6 +92,7 @@ (r"<<(\S+)((.|\n)*?\n\1)", rephere), ] +winglobmsg = "use (glob) to match Windows paths too" uprefix = r"^ \$ " utestpats = [ [ @@ -100,11 +105,16 @@ "explicit exit code checks unnecessary"), (uprefix + r'set -e', "don't use set -e"), (uprefix + r'\s', "don't indent commands, use > for continued lines"), - (r'^ saved backup bundle to \$TESTTMP.*\.hg$', - "use (glob) to match Windows paths too"), + (r'^ saved backup bundle to \$TESTTMP.*\.hg$', winglobmsg), + (r'^ changeset .* references (corrupted|missing) \$TESTTMP/.*[^)]$', + winglobmsg), + (r'^ pulling from \$TESTTMP/.*[^)]$', winglobmsg, '\$TESTTMP/unix-repo$'), ], # warnings - [] + [ + (r'^ [^*?/\n]* \(glob\)$', + "warning: glob match with no glob character (?*/)"), + ] ] for i in [0, 1]: @@ -212,10 +222,11 @@ (r'(?i)descendent', "the proper spelling is descendAnt"), (r'\.debug\(\_', "don't mark debug messages for translation"), (r'\.strip\(\)\.split\(\)', "no need to strip before splitting"), - (r'^\s*except\s*:', "warning: naked except clause", r'#.*re-raises'), + (r'^\s*except\s*:', "naked except clause", r'#.*re-raises'), (r':\n( )*( ){1,3}[^ ]', "must indent 4 spaces"), (r'ui\.(status|progress|write|note|warn)\([\'\"]x', "missing _() in ui message (use () to hide false-positives)"), + 
(r'release\(.*wlock, .*lock\)', "wrong lock release order"), ], # warnings [ @@ -229,6 +240,15 @@ (?P=quote))""", reppython), ] +txtfilters = [] + +txtpats = [ + [ + ('\s$', 'trailing whitespace'), + ], + [] +] + cpats = [ [ (r'//', "don't use //-style comments"), @@ -284,6 +304,7 @@ inrevlogpats), ('layering violation ui in util', r'mercurial/util\.py', pyfilters, inutilpats), + ('txt', r'.*\.txt$', txtfilters, txtpats), ] class norepeatlogger(object): diff -r 0890e6fd3e00 -r 838c6b72928d contrib/hgk --- a/contrib/hgk Sun May 12 15:35:53 2013 +0400 +++ b/contrib/hgk Tue May 14 23:04:23 2013 +0400 @@ -30,6 +30,8 @@ interp alias {} ttk::label {} label interp alias {} ttk::scrollbar {} scrollbar interp alias {} ttk::optionMenu {} tk_optionMenu + + proc updatepalette {} {} } else { proc ::ttk::optionMenu {w varName firstValue args} { upvar #0 $varName var @@ -46,6 +48,11 @@ } return $w.menu } + proc updatepalette {} { + catch { + tk_setPalette background [ttk::style lookup client -background] + } + } } if {[tk windowingsystem] eq "win32"} { @@ -109,12 +116,18 @@ # end of win32 section } else { -if {[ttk::style theme use] eq "default"} { +if {[catch { + set theme [ttk::style theme use] +}]} { + set theme $::ttk::currentTheme +} +if {$theme eq "default"} { ttk::style theme use clam } } +updatepalette # Unify right mouse button handling. # See "mouse buttons on macintosh" thread on comp.lang.tcl @@ -134,6 +147,18 @@ } } +proc popupify {w} { + wm resizable $w 0 0 + wm withdraw $w + update + set x [expr {([winfo screenwidth .]-[winfo reqwidth $w])/2}] + set y [expr {([winfo screenheight .]-[winfo reqheight $w])/2}] + wm geometry $w +$x+$y + wm transient $w . + wm deiconify $w + wm resizable $w 1 1 +} + proc getcommits {rargs} { global commits commfd phase canv mainfont env global startmsecs nextupdate ncmupdate @@ -375,11 +400,11 @@ } } if {$audate != {}} { - set audate [clock format $audate -format "%Y-%m-%d %H:%M:%S"] + set audate [clock format $audate] } if {$comdate != {}} { set cdate($id) $comdate - set comdate [clock format $comdate -format "%Y-%m-%d %H:%M:%S"] + set comdate [clock format $comdate] } set commitinfo($id) [list $headline $auname $audate \ $comname $comdate $comment $rev $branch $bookmark] @@ -431,16 +456,13 @@ exit 2 } } - regsub -all "\r\n" $tags "\n" tags - - set lines [split $tags "\n"] - foreach f $lines { - regexp {(\S+)$} $f full - regsub {\s+(\S+)$} $f "" direct - set sha [split $full ':'] - set tag [lindex $sha 1] - lappend tagids($direct) $tag - lappend idtags($tag) $direct + + foreach {tag rev} $tags { + # we use foreach as Tcl8.4 doesn't support lassign + foreach {- id} [split $rev :] { + lappend tagids($tag) $id + lappend idtags($id) $tag + } } set status [catch {exec $env(HG) --config ui.report_untrusted=false heads} heads] @@ -450,9 +472,8 @@ exit 2 } } - regsub -all "\r\n" $heads "\n" heads - - set lines [split $heads "\n"] + + set lines [split $heads \r\n] foreach f $lines { set match "" regexp {changeset:\s+(\S+):(\S+)$} $f match id sha @@ -524,6 +545,7 @@ ttk::button $w.ok -text OK -command "destroy $w" pack $w.ok -side bottom -fill x bind $w "grab $w; focus $w" + popupify $w tkwait window $w } @@ -540,7 +562,7 @@ if {[info exists posx]} { wm geometry . 
+$posx+$posy } - + menu .bar .bar add cascade -label "File" -menu .bar.file menu .bar.file @@ -637,7 +659,7 @@ findtype Exact IgnCase Regexp] set findloc "All fields" ttk::optionMenu .ctop.top.bar.findloc findloc "All fields" Headline \ - Comments Author Committer Files Pickaxe + Comments Author Files Pickaxe pack .ctop.top.bar.findloc -side right pack .ctop.top.bar.findtype -side right # for making sure type==Exact whenever loc==Pickaxe @@ -944,6 +966,7 @@ pack $w.m -side top -fill x -padx 20 -pady 20 ttk::button $w.ok -text Close -command "destroy $w" pack $w.ok -side bottom + popupify $w } set aunextcolor 0 @@ -1924,7 +1947,7 @@ set oldsel $selectedline } set didsel 0 - set fldtypes {Headline Author Date Committer CDate Comment} + set fldtypes {Headline Author Date CDate Comment} for {set l 0} {$l < $numcommits} {incr l} { set id $lineid($l) set info $commitinfo($id) @@ -2466,12 +2489,12 @@ $ctext mark set fmark.0 0.0 $ctext mark gravity fmark.0 left set info $commitinfo($id) - $ctext insert end "Revision: [lindex $info 6]\n" + $ctext insert end "Changeset: [lindex $info 6]\n" if {[llength [lindex $info 7]] > 0} { $ctext insert end "Branch: [lindex $info 7]\n" } - $ctext insert end "Author: [lindex $info 1] [lindex $info 2]\n" - $ctext insert end "Committer: [lindex $info 3] [lindex $info 4]\n" + $ctext insert end "User: [lindex $info 1]\n" + $ctext insert end "Date: [lindex $info 2]\n" if {[info exists idbookmarks($id)]} { $ctext insert end "Bookmarks:" foreach bookmark $idbookmarks($id) { @@ -3596,7 +3619,7 @@ $ctext tag bind link0 <1> [list selbyid $id] set info $commitinfo($id) $ctext insert end "\n\t[lindex $info 0]\n" - $ctext insert end "\tAuthor:\t[lindex $info 1]\n" + $ctext insert end "\tUser:\t[lindex $info 1]\n" $ctext insert end "\tDate:\t[lindex $info 2]\n" if {[info exists children($id)]} { $ctext insert end "\nChildren:" @@ -3608,7 +3631,7 @@ $ctext insert end $child [list link link$i] $ctext tag bind link$i <1> [list selbyid $child] $ctext insert end "\n\t[lindex $info 0]" - $ctext insert end "\n\tAuthor:\t[lindex $info 1]" + $ctext insert end "\n\tUser:\t[lindex $info 1]" $ctext insert end "\n\tDate:\t[lindex $info 2]\n" } } @@ -3715,13 +3738,11 @@ set patchtop $top catch {destroy $top} toplevel $top - ttk::label $top.title -text "Generate patch" - grid $top.title - -pady 10 ttk::label $top.from -text "From:" ttk::entry $top.fromsha1 -width 40 $top.fromsha1 insert 0 $oldid $top.fromsha1 conf -state readonly - grid $top.from $top.fromsha1 -sticky w + grid $top.from $top.fromsha1 -sticky w -pady {10 0} ttk::entry $top.fromhead -width 60 $top.fromhead insert 0 $oldhead $top.fromhead conf -state readonly @@ -3750,6 +3771,8 @@ grid columnconfigure $top.buts 1 -weight 1 -uniform a grid $top.buts - -pady 10 -sticky ew focus $top.fname + popupify $top + wm title $top "Generate a patch" } proc mkpatchrev {} { @@ -3795,13 +3818,11 @@ set mktagtop $top catch {destroy $top} toplevel $top - ttk::label $top.title -text "Create tag" - grid $top.title - -pady 10 ttk::label $top.id -text "ID:" ttk::entry $top.sha1 -width 40 $top.sha1 insert 0 $rowmenuid $top.sha1 conf -state readonly - grid $top.id $top.sha1 -sticky w + grid $top.id $top.sha1 -sticky w -pady {10 0} ttk::entry $top.head -width 60 $top.head insert 0 [lindex $commitinfo($rowmenuid) 0] $top.head conf -state readonly @@ -3817,6 +3838,8 @@ grid columnconfigure $top.buts 1 -weight 1 -uniform a grid $top.buts - -pady 10 -sticky ew focus $top.tag + popupify $top + wm title $top "Create a tag" } proc domktag {} { @@ -3875,13 
+3898,11 @@ set wrcomtop $top catch {destroy $top} toplevel $top - ttk::label $top.title -text "Write commit to file" - grid $top.title - -pady 10 ttk::label $top.id -text "ID:" ttk::entry $top.sha1 -width 40 $top.sha1 insert 0 $rowmenuid $top.sha1 conf -state readonly - grid $top.id $top.sha1 -sticky w + grid $top.id $top.sha1 -sticky w -pady {10 0} ttk::entry $top.head -width 60 $top.head insert 0 [lindex $commitinfo($rowmenuid) 0] $top.head conf -state readonly @@ -3901,6 +3922,8 @@ grid columnconfigure $top.buts 1 -weight 1 -uniform a grid $top.buts - -pady 10 -sticky ew focus $top.fname + popupify $top + wm title $top "Write commit to a file" } proc wrcomgo {} { diff -r 0890e6fd3e00 -r 838c6b72928d contrib/mergetools.hgrc --- a/contrib/mergetools.hgrc Sun May 12 15:35:53 2013 +0400 +++ b/contrib/mergetools.hgrc Tue May 14 23:04:23 2013 +0400 @@ -15,7 +15,7 @@ gvimdiff.regname=path gvimdiff.priority=-9 -vimdiff.args=$local $other $base +vimdiff.args=$local $other $base -c 'redraw | echomsg "hg merge conflict, type \":cq\" to abort vimdiff"' vimdiff.check=changed vimdiff.priority=-10 @@ -25,7 +25,8 @@ gpyfm.gui=True meld.gui=True -meld.args=--label='local' $local --label='base' $base --label='other' $other +meld.args=--label='local' $local --label='merged' $base --label='other' $other -o $output +meld.check=changed meld.diffargs=-a --label='$plabel1' $parent --label='$clabel' $child tkdiff.args=$local $other -a $base -o $output diff -r 0890e6fd3e00 -r 838c6b72928d contrib/perf.py --- a/contrib/perf.py Sun May 12 15:35:53 2013 +0400 +++ b/contrib/perf.py Tue May 14 23:04:23 2013 +0400 @@ -2,7 +2,7 @@ '''helper extension to measure performance''' from mercurial import cmdutil, scmutil, util, match, commands, obsolete -from mercurial import repoview, branchmap +from mercurial import repoview, branchmap, merge, copies import time, os, sys cmdtable = {} @@ -54,6 +54,15 @@ # False)))) timer(lambda: sum(map(len, repo.status(**opts)))) +@command('perfaddremove') +def perfaddremove(ui, repo): + try: + oldquiet = repo.ui.quiet + repo.ui.quiet = True + timer(lambda: scmutil.addremove(repo, dry_run=True)) + finally: + repo.ui.quiet = oldquiet + def clearcaches(cl): # behave somewhat consistently across internal API changes if util.safehasattr(cl, 'clearcaches'): @@ -99,6 +108,15 @@ rev in s timer(d) +@command('perfdirs') +def perfdirs(ui, repo): + dirstate = repo.dirstate + 'a' in dirstate + def d(): + dirstate.dirs() + del dirstate._dirs + timer(d) + @command('perfdirstate') def perfdirstate(ui, repo): "a" in repo.dirstate @@ -124,6 +142,30 @@ ds.write() timer(d) +@command('perfmergecalculate', + [('r', 'rev', '.', 'rev to merge against')]) +def perfmergecalculate(ui, repo, rev): + wctx = repo[None] + rctx = scmutil.revsingle(repo, rev, rev) + ancestor = wctx.ancestor(rctx) + # we don't want working dir files to be stat'd in the benchmark, so prime + # that cache + wctx.dirty() + def d(): + # acceptremote is True because we don't want prompts in the middle of + # our benchmark + merge.calculateupdates(repo, wctx, rctx, ancestor, False, False, False, + acceptremote=True) + timer(d) + +@command('perfpathcopies', [], "REV REV") +def perfpathcopies(ui, repo, rev1, rev2): + ctx1 = scmutil.revsingle(repo, rev1, rev1) + ctx2 = scmutil.revsingle(repo, rev2, rev2) + def d(): + copies.pathcopies(ctx1, ctx2) + timer(d) + @command('perfmanifest') def perfmanifest(ui, repo): def d(): @@ -268,7 +310,7 @@ def perfrevset(ui, repo, expr, clear=False): """benchmark the execution time of a revset - Use the --clean 
option if need to evaluate the impact of build volative + Use the --clean option if need to evaluate the impact of build volatile revisions set cache on the revset execution. Volatile cache hold filtered and obsolete related cache.""" def d(): @@ -361,6 +403,3 @@ finally: branchmap.read = oldread branchmap.branchcache.write = oldwrite - - - diff -r 0890e6fd3e00 -r 838c6b72928d contrib/pylintrc --- a/contrib/pylintrc Sun May 12 15:35:53 2013 +0400 +++ b/contrib/pylintrc Tue May 14 23:04:23 2013 +0400 @@ -1,11 +1,11 @@ # lint Python modules using external checkers. -# +# # This is the main checker controlling the other ones and the reports # generation. It is itself both a raw checker and an astng checker in order # to: # * handle message activation / deactivation at the module level # * handle some basic but necessary stats'data (number of classes, methods...) -# +# [MASTER] # Specify a configuration file. @@ -95,7 +95,7 @@ # try to find bugs in the code using type inference -# +# [TYPECHECK] # Tells whether missing members accessed in mixin class should be ignored. A @@ -120,7 +120,7 @@ # * undefined variables # * redefinition of variable from builtins or from an outer scope # * use of variable before assignment -# +# [VARIABLES] # Tells whether we should check for unused import in __init__ files. @@ -143,7 +143,7 @@ # * dangerous default values as arguments # * redefinition of function / method / class # * uses of the global statement -# +# [BASIC] # Required attributes for module, separated by a comma @@ -197,7 +197,7 @@ # * relative / wildcard imports # * cyclic imports # * uses of deprecated modules -# +# [IMPORTS] # Deprecated modules which should not be used, separated by a comma @@ -219,7 +219,7 @@ # checks for sign of poor/misdesign: # * number of methods, attributes, local variables... # * size, complexity of functions, methods -# +# [DESIGN] # Maximum number of arguments for function / method @@ -257,7 +257,7 @@ # * attributes not defined in the __init__ method # * supported interfaces implementation # * unreachable code -# +# [CLASSES] # List of interface methods to ignore, separated by a comma. This is used for @@ -273,7 +273,7 @@ # * strict indentation # * line length # * use of <> instead of != -# +# [FORMAT] # Maximum number of characters on a single line. @@ -290,7 +290,7 @@ # checks for: # * warning notes in the code like FIXME, XXX # * PEP 263: source code with non ascii character but no encoding declaration -# +# [MISCELLANEOUS] # List of note tags to take in consideration, separated by a comma. @@ -300,7 +300,7 @@ # checks for similarities and duplicated code. This computation may be # memory / CPU intensive, so you should disable it if you experiments some # problems. -# +# [SIMILARITIES] # Minimum lines number of a similarity. 
diff -r 0890e6fd3e00 -r 838c6b72928d contrib/simplemerge --- a/contrib/simplemerge Sun May 12 15:35:53 2013 +0400 +++ b/contrib/simplemerge Tue May 14 23:04:23 2013 +0400 @@ -44,7 +44,7 @@ try: for fp in (sys.stdin, sys.stdout, sys.stderr): util.setbinary(fp) - + opts = {} try: args = fancyopts.fancyopts(sys.argv[1:], options, opts) diff -r 0890e6fd3e00 -r 838c6b72928d contrib/synthrepo.py --- a/contrib/synthrepo.py Sun May 12 15:35:53 2013 +0400 +++ b/contrib/synthrepo.py Tue May 14 23:04:23 2013 +0400 @@ -35,7 +35,7 @@ - Symlinks and binary files are ignored ''' -import bisect, collections, json, os, random, time +import bisect, collections, json, os, random, time, sys from mercurial import cmdutil, context, patch, scmutil, url, util, hg from mercurial.i18n import _ from mercurial.node import nullrev, nullid diff -r 0890e6fd3e00 -r 838c6b72928d contrib/undumprevlog --- a/contrib/undumprevlog Sun May 12 15:35:53 2013 +0400 +++ b/contrib/undumprevlog Tue May 14 23:04:23 2013 +0400 @@ -11,7 +11,7 @@ opener = scmutil.opener('.', False) tr = transaction.transaction(sys.stderr.write, opener, "undump.journal") -while 1: +while True: l = sys.stdin.readline() if not l: break diff -r 0890e6fd3e00 -r 838c6b72928d contrib/vim/hgcommand.vim --- a/contrib/vim/hgcommand.vim Sun May 12 15:35:53 2013 +0400 +++ b/contrib/vim/hgcommand.vim Tue May 14 23:04:23 2013 +0400 @@ -10,7 +10,7 @@ " Bob Hiestand for the fabulous " cvscommand.vim from which this script was directly created by " means of sed commands and minor tweaks. -" Note: +" Note: " For Vim7 the use of Bob Hiestand's vcscommand.vim " " in conjunction with Vladmir Marek's Hg backend diff -r 0890e6fd3e00 -r 838c6b72928d contrib/win32/hg.bat --- a/contrib/win32/hg.bat Sun May 12 15:35:53 2013 +0400 +++ b/contrib/win32/hg.bat Tue May 14 23:04:23 2013 +0400 @@ -4,9 +4,14 @@ setlocal set HG=%~f0 -rem Use a full path to Python (relative to this script) as the standard Python -rem install does not put python.exe on the PATH... +rem Use a full path to Python (relative to this script) if it exists, +rem as the standard Python install does not put python.exe on the PATH... +rem Otherwise, expect that python.exe can be found on the PATH. rem %~dp0 is the directory of this script -"%~dp0..\python" "%~dp0hg" %* +if exist "%~dp0..\python.exe" ( + "%~dp0..\python" "%~dp0hg" %* +) else ( + python "%~dp0hg" %* +) endlocal diff -r 0890e6fd3e00 -r 838c6b72928d contrib/win32/win32-build.txt --- a/contrib/win32/win32-build.txt Sun May 12 15:35:53 2013 +0400 +++ b/contrib/win32/win32-build.txt Tue May 14 23:04:23 2013 +0400 @@ -24,7 +24,7 @@ http://www.microsoft.com/downloads/details.aspx?FamilyID=9b2da534-3e03-4391-8a4d-074b9f2bc1bf for 64-bit: http://www.microsoft.com/downloads/details.aspx?familyid=bd2a6171-e2d6-4230-b809-9a8d7548c1b6 - + The py2exe distutils extension http://sourceforge.net/projects/py2exe/ @@ -94,7 +94,7 @@ amd64_Microsoft.VC90.CRT_(...)_9.0.21022.8(...).manifest for x64), copy it in the dist directory and rename it to Microsoft.VC90.CRT.manifest. 
-Before building the installer, you have to build Mercurial HTML documentation +Before building the installer, you have to build Mercurial HTML documentation (or fix mercurial.iss to not reference the doc directory): cd doc diff -r 0890e6fd3e00 -r 838c6b72928d contrib/wix/guids.wxi --- a/contrib/wix/guids.wxi Sun May 12 15:35:53 2013 +0400 +++ b/contrib/wix/guids.wxi Tue May 14 23:04:23 2013 +0400 @@ -19,7 +19,7 @@ - + diff -r 0890e6fd3e00 -r 838c6b72928d contrib/wix/i18n.wxs --- a/contrib/wix/i18n.wxs Sun May 12 15:35:53 2013 +0400 +++ b/contrib/wix/i18n.wxs Tue May 14 23:04:23 2013 +0400 @@ -4,7 +4,7 @@ - @@ -14,8 +14,8 @@ - diff -r 0890e6fd3e00 -r 838c6b72928d contrib/wix/mercurial.wxs --- a/contrib/wix/mercurial.wxs Sun May 12 15:35:53 2013 +0400 +++ b/contrib/wix/mercurial.wxs Tue May 14 23:04:23 2013 +0400 @@ -16,7 +16,7 @@ diff -r 0890e6fd3e00 -r 838c6b72928d contrib/zsh_completion --- a/contrib/zsh_completion Sun May 12 15:35:53 2013 +0400 +++ b/contrib/zsh_completion Tue May 14 23:04:23 2013 +0400 @@ -163,21 +163,10 @@ } _hg_labels() { - _hg_tags "$@" - _hg_bookmarks "$@" - _hg_branches "$@" + labels=("${(f)$(_hg_cmd debuglabelcomplete)}") + (( $#labels )) && _describe -t labels 'labels' labels } -_hg_tags() { - typeset -a tags - local tag rev - - _hg_cmd tags | while read tag - do - tags+=(${tag/ #[0-9]#:*}) - done - (( $#tags )) && _describe -t tags 'tags' tags -} _hg_bookmarks() { typeset -a bookmark bookmarks @@ -508,7 +497,7 @@ '(--user -u)'{-u+,--user}'[record user as commiter]:user:' \ '(--rev -r)'{-r+,--rev}'[revision]:revision:_hg_labels' \ '(--message -m)'{-m+,--message}'[use as commit message]:text:' \ - '(--logfile -l)'{-l+,--logfile}'[read commit message from ]:log file:_files -g \*.txt' + '(--logfile -l)'{-l+,--logfile}'[read commit message from ]:log file:_files' } _hg_cmd_bisect() { @@ -576,7 +565,7 @@ _arguments -s -w : $_hg_global_opts $_hg_pat_opts $_hg_subrepos_opts \ '(--addremove -A)'{-A,--addremove}'[mark new/missing files as added/removed before committing]' \ '(--message -m)'{-m+,--message}'[use as commit message]:text:' \ - '(--logfile -l)'{-l+,--logfile}'[read commit message from ]:log file:_files -g \*.txt' \ + '(--logfile -l)'{-l+,--logfile}'[read commit message from ]:log file:_files' \ '(--date -d)'{-d+,--date}'[record datecode as commit date]:date code:' \ '(--user -u)'{-u+,--user}'[record user as commiter]:user:' \ '--amend[amend the parent of the working dir]' \ @@ -940,7 +929,7 @@ _hg_cmd_view() { _arguments -s -w : $_hg_global_opts \ '(--limit -l)'{-l+,--limit}'[limit number of changes displayed]:' \ - ':revision range:_hg_tags' + ':revision range:_hg_labels' } # MQ diff -r 0890e6fd3e00 -r 838c6b72928d doc/gendoc.py --- a/doc/gendoc.py Sun May 12 15:35:53 2013 +0400 +++ b/doc/gendoc.py Tue May 14 23:04:23 2013 +0400 @@ -5,6 +5,7 @@ sys.path.append(os.path.join('..', 'mercurial', 'pure')) from mercurial import demandimport; demandimport.enable() from mercurial import encoding +from mercurial import minirst from mercurial.commands import table, globalopts from mercurial.i18n import _ from mercurial.help import helptable @@ -63,28 +64,15 @@ return d -def section(ui, s): - ui.write("%s\n%s\n\n" % (s, "\"" * encoding.colwidth(s))) - -def subsection(ui, s): - ui.write("%s\n%s\n\n" % (s, '=' * encoding.colwidth(s))) - -def subsubsection(ui, s): - ui.write("%s\n%s\n\n" % (s, "-" * encoding.colwidth(s))) - -def subsubsubsection(ui, s): - ui.write("%s\n%s\n\n" % (s, "." 
* encoding.colwidth(s))) - - def show_doc(ui): # print options - section(ui, _("Options")) + ui.write(minirst.section(_("Options"))) for optstr, desc in get_opts(globalopts): ui.write("%s\n %s\n\n" % (optstr, desc)) # print cmds - section(ui, _("Commands")) - commandprinter(ui, table, subsection) + ui.write(minirst.section(_("Commands"))) + commandprinter(ui, table, minirst.subsection) # print topics for names, sec, doc in helptable: @@ -95,13 +83,13 @@ for name in names: ui.write(".. _%s:\n" % name) ui.write("\n") - section(ui, sec) + ui.write(minirst.section(sec)) if util.safehasattr(doc, '__call__'): doc = doc() ui.write(doc) ui.write("\n") - section(ui, _("Extensions")) + ui.write(minirst.section(_("Extensions"))) ui.write(_("This section contains help for extensions that are " "distributed together with Mercurial. Help for other " "extensions is available in the help system.")) @@ -113,12 +101,12 @@ for extensionname in sorted(allextensionnames()): mod = extensions.load(None, extensionname, None) - subsection(ui, extensionname) + ui.write(minirst.subsection(extensionname)) ui.write("%s\n\n" % mod.__doc__) cmdtable = getattr(mod, 'cmdtable', None) if cmdtable: - subsubsection(ui, _('Commands')) - commandprinter(ui, cmdtable, subsubsubsection) + ui.write(minirst.subsubsection(_('Commands'))) + commandprinter(ui, cmdtable, minirst.subsubsubsection) def commandprinter(ui, cmdtable, sectionfunc): h = {} @@ -133,7 +121,7 @@ if f.startswith("debug"): continue d = get_cmd(h[f], cmdtable) - sectionfunc(ui, d['cmd']) + ui.write(sectionfunc(d['cmd'])) # synopsis ui.write("::\n\n") synopsislines = d['synopsis'].splitlines() diff -r 0890e6fd3e00 -r 838c6b72928d doc/hg.1.txt --- a/doc/hg.1.txt Sun May 12 15:35:53 2013 +0400 +++ b/doc/hg.1.txt Tue May 14 23:04:23 2013 +0400 @@ -84,7 +84,7 @@ This file can be used to define local tags which are not shared among repositories. The file format is the same as for ``.hgtags``, but it is encoded using the local system encoding. - + Some commands (e.g. revert) produce backup files ending in ``.orig``, if the ``.orig`` file already exists and is not tracked by Mercurial, it will be overwritten. diff -r 0890e6fd3e00 -r 838c6b72928d doc/style.css --- a/doc/style.css Sun May 12 15:35:53 2013 +0400 +++ b/doc/style.css Tue May 14 23:04:23 2013 +0400 @@ -303,7 +303,7 @@ div.contents.local { -moz-column-width: 10em; -moz-column-gap: 1em; - + -webkit-column-width: 10em; -webkit-column-gap: 1em; } diff -r 0890e6fd3e00 -r 838c6b72928d hgext/blackbox.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/hgext/blackbox.py Tue May 14 23:04:23 2013 +0400 @@ -0,0 +1,157 @@ +# blackbox.py - log repository events to a file for post-mortem debugging +# +# Copyright 2010 Nicolas Dumazet +# Copyright 2013 Facebook, Inc. +# +# This software may be used and distributed according to the terms of the +# GNU General Public License version 2 or any later version. + +"""log repository events to a blackbox for debugging + +Logs event information to .hg/blackbox.log to help debug and diagnose problems. +The events that get logged can be configured via the blackbox.track config key. 
+Examples:: + + [blackbox] + track = * + + [blackbox] + track = command, commandfinish, commandexception, exthook, pythonhook + + [blackbox] + track = incoming + + [blackbox] + # limit the size of a log file + maxsize = 1.5 MB + # rotate up to N log files when the current one gets too big + maxfiles = 3 + +""" + +from mercurial import util, cmdutil +from mercurial.i18n import _ +import errno, os, re + +cmdtable = {} +command = cmdutil.command(cmdtable) +testedwith = 'internal' +lastblackbox = None + +def wrapui(ui): + class blackboxui(ui.__class__): + @util.propertycache + def track(self): + return self.configlist('blackbox', 'track', ['*']) + + def _openlogfile(self): + def rotate(oldpath, newpath): + try: + os.unlink(newpath) + except OSError, err: + if err.errno != errno.ENOENT: + self.debug("warning: cannot remove '%s': %s\n" % + (newpath, err.strerror)) + try: + if newpath: + os.rename(oldpath, newpath) + except OSError, err: + if err.errno != errno.ENOENT: + self.debug("warning: cannot rename '%s' to '%s': %s\n" % + (newpath, oldpath, err.strerror)) + + fp = self._bbopener('blackbox.log', 'a') + maxsize = self.configbytes('blackbox', 'maxsize', 1048576) + if maxsize > 0: + st = os.fstat(fp.fileno()) + if st.st_size >= maxsize: + path = fp.name + fp.close() + maxfiles = self.configint('blackbox', 'maxfiles', 7) + for i in xrange(maxfiles - 1, 1, -1): + rotate(oldpath='%s.%d' % (path, i - 1), + newpath='%s.%d' % (path, i)) + rotate(oldpath=path, + newpath=maxfiles > 0 and path + '.1') + fp = self._bbopener('blackbox.log', 'a') + return fp + + def log(self, event, *msg, **opts): + global lastblackbox + super(blackboxui, self).log(event, *msg, **opts) + + if not '*' in self.track and not event in self.track: + return + + if util.safehasattr(self, '_blackbox'): + blackbox = self._blackbox + elif util.safehasattr(self, '_bbopener'): + try: + self._blackbox = self._openlogfile() + except (IOError, OSError), err: + self.debug('warning: cannot write to blackbox.log: %s\n' % + err.strerror) + del self._bbopener + self._blackbox = None + blackbox = self._blackbox + else: + # certain ui instances exist outside the context of + # a repo, so just default to the last blackbox that + # was seen. + blackbox = lastblackbox + + if blackbox: + date = util.datestr(None, '%Y/%m/%d %H:%M:%S') + user = util.getuser() + formattedmsg = msg[0] % msg[1:] + try: + blackbox.write('%s %s> %s' % (date, user, formattedmsg)) + except IOError, err: + self.debug('warning: cannot write to blackbox.log: %s\n' % + err.strerror) + lastblackbox = blackbox + + def setrepo(self, repo): + self._bbopener = repo.opener + + ui.__class__ = blackboxui + +def uisetup(ui): + wrapui(ui) + +def reposetup(ui, repo): + # During 'hg pull' a httppeer repo is created to represent the remote repo. + # It doesn't have a .hg directory to put a blackbox in, so we don't do + # the blackbox setup for it. 
+ if not repo.local(): + return + + ui.setrepo(repo) + +@command('^blackbox', + [('l', 'limit', 10, _('the number of events to show')), + ], + _('hg blackbox [OPTION]...')) +def blackbox(ui, repo, *revs, **opts): + '''view the recent repository events + ''' + + if not os.path.exists(repo.join('blackbox.log')): + return + + limit = opts.get('limit') + blackbox = repo.opener('blackbox.log', 'r') + lines = blackbox.read().split('\n') + + count = 0 + output = [] + for line in reversed(lines): + if count >= limit: + break + + # count the commands by matching lines like: 2013/01/23 19:13:36 root> + if re.match('^\d{4}/\d{2}/\d{2} \d{2}:\d{2}:\d{2} .*> .*', line): + count += 1 + output.append(line) + + ui.status('\n'.join(reversed(output))) diff -r 0890e6fd3e00 -r 838c6b72928d hgext/color.py --- a/hgext/color.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/color.py Tue May 14 23:04:23 2013 +0400 @@ -103,7 +103,7 @@ import os from mercurial import commands, dispatch, extensions, ui as uimod, util -from mercurial import templater +from mercurial import templater, error from mercurial.i18n import _ testedwith = 'internal' @@ -317,6 +317,9 @@ class colorui(uimod.ui): def popbuffer(self, labeled=False): + if self._colormode is None: + return super(colorui, self).popbuffer(labeled) + if labeled: return ''.join(self.label(a, label) for a, label in self._buffers.pop()) @@ -324,6 +327,9 @@ _colormode = 'ansi' def write(self, *args, **opts): + if self._colormode is None: + return super(colorui, self).write(*args, **opts) + label = opts.get('label', '') if self._buffers: self._buffers[-1].extend([(str(a), label) for a in args]) @@ -335,6 +341,9 @@ *[self.label(str(a), label) for a in args], **opts) def write_err(self, *args, **opts): + if self._colormode is None: + return super(colorui, self).write_err(*args, **opts) + label = opts.get('label', '') if self._colormode == 'win32': for a in args: @@ -344,6 +353,9 @@ *[self.label(str(a), label) for a in args], **opts) def label(self, msg, label): + if self._colormode is None: + return super(colorui, self).label(msg, label) + effects = [] for l in label.split(): s = _styles.get(l, '') @@ -379,16 +391,15 @@ return repo.ui.label(thing, label) def uisetup(ui): - global _terminfo_params if ui.plain(): return + if not issubclass(ui.__class__, colorui): + colorui.__bases__ = (ui.__class__,) + ui.__class__ = colorui def colorcmd(orig, ui_, opts, cmd, cmdfunc): mode = _modesetup(ui_, opts) + colorui._colormode = mode if mode: - colorui._colormode = mode - if not issubclass(ui_.__class__, colorui): - colorui.__bases__ = (ui_.__class__,) - ui_.__class__ = colorui extstyles() configstyles(ui_) return orig(ui_, opts, cmd, cmdfunc) diff -r 0890e6fd3e00 -r 838c6b72928d hgext/convert/__init__.py --- a/hgext/convert/__init__.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/convert/__init__.py Tue May 14 23:04:23 2013 +0400 @@ -61,6 +61,10 @@ --sourcesort try to preserve source revisions order, only supported by Mercurial sources. + --closesort try to move closed revisions as close as possible + to parent branches, only supported by Mercurial + sources. + If ``REVMAP`` isn't given, it will be put in a default location (``/.hg/shamap`` by default). 
The ``REVMAP`` is a simple text file that maps each source commit ID to the destination ID @@ -318,7 +322,8 @@ _('change branch names while converting'), _('FILE')), ('', 'branchsort', None, _('try to sort changesets by branches')), ('', 'datesort', None, _('try to sort changesets by date')), - ('', 'sourcesort', None, _('preserve source changesets order'))], + ('', 'sourcesort', None, _('preserve source changesets order')), + ('', 'closesort', None, _('try to reorder closed revisions'))], _('hg convert [OPTION]... SOURCE [DEST [REVMAP]]')), "debugsvnlog": (debugsvnlog, diff -r 0890e6fd3e00 -r 838c6b72928d hgext/convert/common.py --- a/hgext/convert/common.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/convert/common.py Tue May 14 23:04:23 2013 +0400 @@ -145,6 +145,11 @@ """ return False + def hasnativeclose(self): + """Return true if this source has ability to close branch. + """ + return False + def lookuprev(self, rev): """If rev is a meaningful revision reference in source, return the referenced identifier in the same format used by getcommit(). diff -r 0890e6fd3e00 -r 838c6b72928d hgext/convert/convcmd.py --- a/hgext/convert/convcmd.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/convert/convcmd.py Tue May 14 23:04:23 2013 +0400 @@ -227,6 +227,14 @@ return sorted(nodes, key=keyfn)[0] return picknext + def makeclosesorter(): + """Close order sort.""" + keyfn = lambda n: ('close' not in self.commitcache[n].extra, + self.commitcache[n].sortkey) + def picknext(nodes): + return sorted(nodes, key=keyfn)[0] + return picknext + def makedatesorter(): """Sort revisions by date.""" dates = {} @@ -246,6 +254,8 @@ picknext = makedatesorter() elif sortmode == 'sourcesort': picknext = makesourcesorter() + elif sortmode == 'closesort': + picknext = makeclosesorter() else: raise util.Abort(_('unknown sort mode: %s') % sortmode) @@ -446,13 +456,15 @@ shutil.rmtree(path, True) raise - sortmodes = ('branchsort', 'datesort', 'sourcesort') + sortmodes = ('branchsort', 'datesort', 'sourcesort', 'closesort') sortmode = [m for m in sortmodes if opts.get(m)] if len(sortmode) > 1: raise util.Abort(_('more than one sort mode specified')) sortmode = sortmode and sortmode[0] or defaultsort if sortmode == 'sourcesort' and not srcc.hasnativeorder(): raise util.Abort(_('--sourcesort is not supported by this data source')) + if sortmode == 'closesort' and not srcc.hasnativeclose(): + raise util.Abort(_('--closesort is not supported by this data source')) fmap = opts.get('filemap') if fmap: diff -r 0890e6fd3e00 -r 838c6b72928d hgext/convert/cvsps.py --- a/hgext/convert/cvsps.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/convert/cvsps.py Tue May 14 23:04:23 2013 +0400 @@ -50,7 +50,7 @@ >>> getrepopath('/foo/bar') '/foo/bar' >>> getrepopath('c:/foo/bar') - 'c:/foo/bar' + '/foo/bar' >>> getrepopath(':pserver:10/foo/bar') '/foo/bar' >>> getrepopath(':pserver:10c:/foo/bar') @@ -58,30 +58,30 @@ >>> getrepopath(':pserver:/foo/bar') '/foo/bar' >>> getrepopath(':pserver:c:/foo/bar') - 'c:/foo/bar' + '/foo/bar' >>> getrepopath(':pserver:truc@foo.bar:/foo/bar') '/foo/bar' >>> getrepopath(':pserver:truc@foo.bar:c:/foo/bar') - 'c:/foo/bar' + '/foo/bar' + >>> getrepopath('user@server/path/to/repository') + '/path/to/repository' """ # According to CVS manual, CVS paths are expressed like: # [:method:][[user][:password]@]hostname[:[port]]/path/to/repository # - # Unfortunately, Windows absolute paths start with a drive letter - # like 'c:' making it harder to parse. 
Here we assume that drive - # letters are only one character long and any CVS component before - # the repository path is at least 2 characters long, and use this - # to disambiguate. + # CVSpath is splitted into parts and then position of the first occurrence + # of the '/' char after the '@' is located. The solution is the rest of the + # string after that '/' sign including it + parts = cvspath.split(':') - if len(parts) == 1: - return parts[0] - # Here there is an ambiguous case if we have a port number - # immediately followed by a Windows driver letter. We assume this - # never happens and decide it must be CVS path component, - # therefore ignoring it. - if len(parts[-2]) > 1: - return parts[-1].lstrip('0123456789') - return parts[-2] + ':' + parts[-1] + atposition = parts[-1].find('@') + start = 0 + + if atposition != -1: + start = atposition + + repopath = parts[-1][parts[-1].find('/', start):] + return repopath def createlog(ui, directory=None, root="", rlog=True, cache=None): '''Collect the CVS rlog''' @@ -508,9 +508,15 @@ ui.status(_('creating changesets\n')) + # try to order commitids by date + mindate = {} + for e in log: + if e.commitid: + mindate[e.commitid] = min(e.date, mindate.get(e.commitid)) + # Merge changesets - log.sort(key=lambda x: (x.commitid, x.comment, x.author, x.branch, x.date, - x.branchpoints)) + log.sort(key=lambda x: (mindate.get(x.commitid), x.commitid, x.comment, + x.author, x.branch, x.date, x.branchpoints)) changesets = [] files = set() diff -r 0890e6fd3e00 -r 838c6b72928d hgext/convert/git.py --- a/hgext/convert/git.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/convert/git.py Tue May 14 23:04:23 2013 +0400 @@ -6,6 +6,7 @@ # GNU General Public License version 2 or any later version. import os +import subprocess from mercurial import util, config from mercurial.node import hex, nullid from mercurial.i18n import _ @@ -29,13 +30,15 @@ # cannot remove environment variable. Just assume none have # both issues. 
if util.safehasattr(os, 'unsetenv'): - def gitopen(self, s, noerr=False): + def gitopen(self, s, err=None): prevgitdir = os.environ.get('GIT_DIR') os.environ['GIT_DIR'] = self.path try: - if noerr: + if err == subprocess.PIPE: (stdin, stdout, stderr) = util.popen3(s) return stdout + elif err == subprocess.STDOUT: + return self.popen_with_stderr(s) else: return util.popen(s, 'rb') finally: @@ -44,13 +47,25 @@ else: os.environ['GIT_DIR'] = prevgitdir else: - def gitopen(self, s, noerr=False): - if noerr: + def gitopen(self, s, err=None): + if err == subprocess.PIPE: (sin, so, se) = util.popen3('GIT_DIR=%s %s' % (self.path, s)) return so + elif err == subprocess.STDOUT: + return self.popen_with_stderr(s) else: return util.popen('GIT_DIR=%s %s' % (self.path, s), 'rb') + def popen_with_stderr(self, s): + p = subprocess.Popen(s, shell=True, bufsize=-1, + close_fds=util.closefds, + stdin=subprocess.PIPE, + stdout=subprocess.PIPE, + stderr=subprocess.STDOUT, + universal_newlines=False, + env=None) + return p.stdout + def gitread(self, s): fh = self.gitopen(s) data = fh.read() @@ -209,12 +224,15 @@ def gettags(self): tags = {} alltags = {} - fh = self.gitopen('git ls-remote --tags "%s"' % self.path) + fh = self.gitopen('git ls-remote --tags "%s"' % self.path, + err=subprocess.STDOUT) prefix = 'refs/tags/' # Build complete list of tags, both annotated and bare ones for line in fh: line = line.strip() + if line.startswith("error:") or line.startswith("fatal:"): + raise util.Abort(_('cannot read tags from %s') % self.path) node, tag = line.split(None, 1) if not tag.startswith(prefix): continue @@ -266,7 +284,7 @@ # Origin heads for reftype in gitcmd: try: - fh = self.gitopen(gitcmd[reftype], noerr=True) + fh = self.gitopen(gitcmd[reftype], err=subprocess.PIPE) for line in fh: line = line.strip() rev, name = line.split(None, 1) diff -r 0890e6fd3e00 -r 838c6b72928d hgext/convert/hg.py --- a/hgext/convert/hg.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/convert/hg.py Tue May 14 23:04:23 2013 +0400 @@ -386,6 +386,9 @@ def hasnativeorder(self): return True + def hasnativeclose(self): + return True + def lookuprev(self, rev): try: return hex(self.repo.lookup(rev)) diff -r 0890e6fd3e00 -r 838c6b72928d hgext/factotum.py --- a/hgext/factotum.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/factotum.py Tue May 14 23:04:23 2013 +0400 @@ -47,8 +47,8 @@ from mercurial.i18n import _ from mercurial.url import passwordmgr -from mercurial import httpconnection, urllib2, util -import os +from mercurial import httpconnection, util +import os, urllib2 ERRMAX = 128 diff -r 0890e6fd3e00 -r 838c6b72928d hgext/hgk.py --- a/hgext/hgk.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/hgk.py Tue May 14 23:04:23 2013 +0400 @@ -114,7 +114,8 @@ if committer != '': ui.write(("committer %s %s %s\n" % (committer, int(date[0]), date[1]))) ui.write(("revision %d\n" % ctx.rev())) - ui.write(("branch %s\n\n" % ctx.branch())) + ui.write(("branch %s\n" % ctx.branch())) + ui.write(("phase %s\n\n" % ctx.phasestr())) if prefix != "": ui.write("%s%s\n" % (prefix, diff -r 0890e6fd3e00 -r 838c6b72928d hgext/highlight/highlight.py --- a/hgext/highlight/highlight.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/highlight/highlight.py Tue May 14 23:04:23 2013 +0400 @@ -38,12 +38,13 @@ # To get multi-line strings right, we can't format line-by-line try: - lexer = guess_lexer_for_filename(fctx.path(), text[:1024]) + lexer = guess_lexer_for_filename(fctx.path(), text[:1024], + stripnl=False) except (ClassNotFound, ValueError): try: - lexer = 
guess_lexer(text[:1024]) + lexer = guess_lexer(text[:1024], stripnl=False) except (ClassNotFound, ValueError): - lexer = TextLexer() + lexer = TextLexer(stripnl=False) formatter = HtmlFormatter(style=style) diff -r 0890e6fd3e00 -r 838c6b72928d hgext/histedit.py --- a/hgext/histedit.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/histedit.py Tue May 14 23:04:23 2013 +0400 @@ -143,6 +143,7 @@ except ImportError: import pickle import os +import sys from mercurial import cmdutil from mercurial import discovery @@ -179,7 +180,7 @@ def commitfuncfor(repo, src): """Build a commit function for the replacement of - This function ensure we apply the same treatement to all changesets. + This function ensure we apply the same treatment to all changesets. - Add a 'histedit_source' entry in extra. @@ -301,8 +302,8 @@ hg.update(repo, ctx.node()) stats = applychanges(ui, repo, oldctx, opts) if stats and stats[3] > 0: - raise util.Abort(_('Fix up the change and run ' - 'hg histedit --continue')) + raise error.InterventionRequired(_('Fix up the change and run ' + 'hg histedit --continue')) # drop the second merge parent commit = commitfuncfor(repo, oldctx) n = commit(text=oldctx.description(), user=oldctx.user(), @@ -319,17 +320,17 @@ oldctx = repo[ha] hg.update(repo, ctx.node()) applychanges(ui, repo, oldctx, opts) - raise util.Abort(_('Make changes as needed, you may commit or record as ' - 'needed now.\nWhen you are finished, run hg' - ' histedit --continue to resume.')) + raise error.InterventionRequired( + _('Make changes as needed, you may commit or record as needed now.\n' + 'When you are finished, run hg histedit --continue to resume.')) def fold(ui, repo, ctx, ha, opts): oldctx = repo[ha] hg.update(repo, ctx.node()) stats = applychanges(ui, repo, oldctx, opts) if stats and stats[3] > 0: - raise util.Abort(_('Fix up the change and run ' - 'hg histedit --continue')) + raise error.InterventionRequired( + _('Fix up the change and run hg histedit --continue')) n = repo.commit(text='fold-temp-revision %s' % ha, user=oldctx.user(), date=oldctx.date(), extra=oldctx.extra()) if n is None: @@ -390,8 +391,8 @@ hg.update(repo, ctx.node()) stats = applychanges(ui, repo, oldctx, opts) if stats and stats[3] > 0: - raise util.Abort(_('Fix up the change and run ' - 'hg histedit --continue')) + raise error.InterventionRequired( + _('Fix up the change and run hg histedit --continue')) message = oldctx.description() + '\n' message = ui.edit(message, ui.username()) commit = commitfuncfor(repo, oldctx) @@ -403,6 +404,29 @@ # We didn't make an edit, so just indicate no replaced nodes return newctx, [] +def findoutgoing(ui, repo, remote=None, force=False, opts={}): + """utility function to find the first outgoing changeset + + Used by initialisation code""" + dest = ui.expandpath(remote or 'default-push', remote or 'default') + dest, revs = hg.parseurl(dest, None)[:2] + ui.status(_('comparing with %s\n') % util.hidepassword(dest)) + + revs, checkout = hg.addbranchrevs(repo, repo, revs, None) + other = hg.peer(repo, opts, dest) + + if revs: + revs = [repo.lookup(rev) for rev in revs] + + # hexlify nodes from outgoing, because we're going to parse + # parent[0] using revsingle below, and if the binary hash + # contains special revset characters like ":" the revset + # parser can choke. 
+ outgoing = discovery.findcommonoutgoing(repo, other, revs, force=force) + if not outgoing.missing: + raise util.Abort(_('no outgoing ancestors')) + return outgoing.missing[0] + actiontable = {'p': pick, 'pick': pick, 'e': edit, @@ -427,7 +451,7 @@ _('force outgoing even for unrelated repositories')), ('r', 'rev', [], _('first revision to be edited'))], _("[PARENT]")) -def histedit(ui, repo, *parent, **opts): +def histedit(ui, repo, *freeargs, **opts): """interactively edit changeset history """ # TODO only abort if we try and histedit mq patches, not just @@ -436,41 +460,48 @@ if mq and mq.applied: raise util.Abort(_('source has mq patches applied')) - parent = list(parent) + opts.get('rev', []) - if opts.get('outgoing'): - if len(parent) > 1: - raise util.Abort( - _('only one repo argument allowed with --outgoing')) - elif parent: - parent = parent[0] - - dest = ui.expandpath(parent or 'default-push', parent or 'default') - dest, revs = hg.parseurl(dest, None)[:2] - ui.status(_('comparing with %s\n') % util.hidepassword(dest)) + # basic argument incompatibility processing + outg = opts.get('outgoing') + cont = opts.get('continue') + abort = opts.get('abort') + force = opts.get('force') + rules = opts.get('commands', '') + revs = opts.get('rev', []) + goal = 'new' # This invocation goal, in new, continue, abort + if force and not outg: + raise util.Abort(_('--force only allowed with --outgoing')) + if cont: + if util.any((outg, abort, revs, freeargs, rules)): + raise util.Abort(_('no arguments allowed with --continue')) + goal = 'continue' + elif abort: + if util.any((outg, revs, freeargs, rules)): + raise util.Abort(_('no arguments allowed with --abort')) + goal = 'abort' + else: + if os.path.exists(os.path.join(repo.path, 'histedit-state')): + raise util.Abort(_('history edit already in progress, try ' + '--continue or --abort')) + if outg: + if revs: + raise util.Abort(_('no revisions allowed with --outgoing')) + if len(freeargs) > 1: + raise util.Abort( + _('only one repo argument allowed with --outgoing')) + else: + revs.extend(freeargs) + if len(revs) != 1: + raise util.Abort( + _('histedit requires exactly one parent revision')) - revs, checkout = hg.addbranchrevs(repo, repo, revs, None) - other = hg.peer(repo, opts, dest) - if revs: - revs = [repo.lookup(rev) for rev in revs] - - parent = discovery.findcommonoutgoing( - repo, other, [], force=opts.get('force')).missing[0:1] - else: - if opts.get('force'): - raise util.Abort(_('--force only allowed with --outgoing')) - - if opts.get('continue', False): - if len(parent) != 0: - raise util.Abort(_('no arguments allowed with --continue')) + if goal == 'continue': (parentctxnode, rules, keep, topmost, replacements) = readstate(repo) currentparent, wantnull = repo.dirstate.parents() parentctx = repo[parentctxnode] parentctx, repl = bootstrapcontinue(ui, repo, parentctx, rules, opts) replacements.extend(repl) - elif opts.get('abort', False): - if len(parent) != 0: - raise util.Abort(_('no arguments allowed with --abort')) + elif goal == 'abort': (parentctxnode, rules, keep, topmost, replacements) = readstate(repo) mapping, tmpnodes, leafs, _ntm = processreplacement(repo, replacements) ui.debug('restore wc to old parent %s\n' % node.short(topmost)) @@ -481,28 +512,29 @@ return else: cmdutil.bailifchanged(repo) - if os.path.exists(os.path.join(repo.path, 'histedit-state')): - raise util.Abort(_('history edit already in progress, try ' - '--continue or --abort')) topmost, empty = repo.dirstate.parents() - - if len(parent) != 1: - raise 
util.Abort(_('histedit requires exactly one parent revision')) - parent = scmutil.revsingle(repo, parent[0]).node() + if outg: + if freeargs: + remote = freeargs[0] + else: + remote = None + root = findoutgoing(ui, repo, remote, force, opts) + else: + root = revs[0] + root = scmutil.revsingle(repo, root).node() keep = opts.get('keep', False) - revs = between(repo, parent, topmost, keep) + revs = between(repo, root, topmost, keep) if not revs: - ui.warn(_('nothing to edit\n')) - return 1 + raise util.Abort(_('%s is not an ancestor of working directory') % + node.short(root)) ctxs = [repo[r] for r in revs] - rules = opts.get('commands', '') if not rules: rules = '\n'.join([makedesc(c) for c in ctxs]) rules += '\n\n' - rules += editcomment % (node.short(parent), node.short(topmost)) + rules += editcomment % (node.short(root), node.short(topmost)) rules = ui.edit(rules, ui.username()) # Save edit rules in .hg/histedit-last-edit.txt in case # the user needs to ask for help after something @@ -511,14 +543,17 @@ f.write(rules) f.close() else: - f = open(rules) + if rules == '-': + f = sys.stdin + else: + f = open(rules) rules = f.read() f.close() rules = [l for l in (r.strip() for r in rules.splitlines()) if l and not l[0] == '#'] rules = verifyrules(rules, repo, ctxs) - parentctx = repo[parent].parents()[0] + parentctx = repo[root].parents()[0] keep = opts.get('keep', False) replacements = [] @@ -576,14 +611,15 @@ # note: does not take non linear new change in account (but previous # implementation didn't used them anyway (issue3655) newchildren = [c.node() for c in repo.set('(%d::.)', parentctx)] - if not newchildren: - # `parentctxnode` should match but no result. This means that - # currentnode is not a descendant from parentctxnode. - msg = _('working directory parent is not a descendant of %s') - hint = _('update to %s or descendant and run "hg histedit ' - '--continue" again') % parentctx - raise util.Abort(msg % parentctx, hint=hint) - newchildren.pop(0) # remove parentctxnode + if parentctx.node() != node.nullid: + if not newchildren: + # `parentctxnode` should match but no result. This means that + # currentnode is not a descendant from parentctxnode. + msg = _('%s is not an ancestor of working directory') + hint = _('update to %s or descendant and run "hg histedit ' + '--continue" again') % parentctx + raise util.Abort(msg % parentctx, hint=hint) + newchildren.pop(0) # remove parentctxnode # Commit dirty working directory if necessary new = None m, a, r, d = repo.status()[:4] @@ -613,16 +649,22 @@ replacements.append((ctx.node(), tuple(newchildren))) if action in ('f', 'fold'): - # finalize fold operation if applicable - if new is None: - new = newchildren[-1] + if newchildren: + # finalize fold operation if applicable + if new is None: + new = newchildren[-1] + else: + newchildren.pop() # remove new from internal changes + parentctx, repl = finishfold(ui, repo, parentctx, ctx, new, opts, + newchildren) + replacements.extend(repl) else: - newchildren.pop() # remove new from internal changes - parentctx, repl = finishfold(ui, repo, parentctx, ctx, new, opts, - newchildren) - replacements.extend(repl) + # newchildren is empty if the fold did not result in any commit + # this happen when all folded change are discarded during the + # merge. 
+ replacements.append((ctx.node(), (parentctx.node(),))) elif newchildren: - # otherwize update "parentctx" before proceding to further operation + # otherwise update "parentctx" before proceeding to further operation parentctx = repo[newchildren[-1]] return parentctx, replacements @@ -674,25 +716,30 @@ or a rule on a changeset outside of the user-given range. """ parsed = [] - if len(rules) != len(ctxs): - raise util.Abort(_('must specify a rule for each changeset once')) + expected = set(str(c) for c in ctxs) + seen = set() for r in rules: if ' ' not in r: raise util.Abort(_('malformed line "%s"') % r) action, rest = r.split(' ', 1) - if ' ' in rest.strip(): - ha, rest = rest.split(' ', 1) - else: - ha = r.strip() + ha = rest.strip().split(' ', 1)[0] try: - if repo[ha] not in ctxs: - raise util.Abort( - _('may not use changesets other than the ones listed')) + ha = str(repo[ha]) # ensure its a short hash except error.RepoError: raise util.Abort(_('unknown changeset %s listed') % ha) + if ha not in expected: + raise util.Abort( + _('may not use changesets other than the ones listed')) + if ha in seen: + raise util.Abort(_('duplicated command for changeset %s') % ha) + seen.add(ha) if action not in actiontable: raise util.Abort(_('unknown action "%s"') % action) parsed.append([action, ha]) + missing = sorted(expected - seen) # sort to stabilize output + if missing: + raise util.Abort(_('missing rules for changeset %s') % missing[0], + hint=_('do you want to use the drop action?')) return parsed def processreplacement(repo, replacements): diff -r 0890e6fd3e00 -r 838c6b72928d hgext/keyword.py --- a/hgext/keyword.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/keyword.py Tue May 14 23:04:23 2013 +0400 @@ -384,7 +384,7 @@ fn = 'demo.txt' tmpdir = tempfile.mkdtemp('', 'kwdemo.') ui.note(_('creating temporary repository at %s\n') % tmpdir) - repo = localrepo.localrepository(ui, tmpdir, True) + repo = localrepo.localrepository(repo.baseui, tmpdir, True) ui.setconfig('keyword', fn, '') svn = ui.configbool('keywordset', 'svn') # explicitly set keywordset for demo output diff -r 0890e6fd3e00 -r 838c6b72928d hgext/largefiles/__init__.py --- a/hgext/largefiles/__init__.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/largefiles/__init__.py Tue May 14 23:04:23 2013 +0400 @@ -41,11 +41,30 @@ enabled for this to work. When you pull a changeset that affects largefiles from a remote -repository, Mercurial behaves as normal. However, when you update to -such a revision, any largefiles needed by that revision are downloaded -and cached (if they have never been downloaded before). This means -that network access may be required to update to changesets you have -not previously updated to. +repository, the largefiles for the changeset will by default not be +pulled down. However, when you update to such a revision, any +largefiles needed by that revision are downloaded and cached (if +they have never been downloaded before). One way to pull largefiles +when pulling is thus to use --update, which will update your working +copy to the latest pulled revision (and thereby downloading any new +largefiles). + +If you want to pull largefiles you don't need for update yet, then +you can use pull with the `--lfrev` option or the :hg:`lfpull` command. + +If you know you are pulling from a non-default location and want to +download all the largefiles that correspond to the new changesets at +the same time, then you can pull with `--lfrev "pulled()"`. 
+ +If you just want to ensure that you will have the largefiles needed to +merge or rebase with new heads that you are pulling, then you can pull +with `--lfrev "head(pulled())"` flag to pre-emptively download any largefiles +that are new in the heads you are pulling. + +Keep in mind that network access may now be required to update to +changesets that you have not previously updated to. The nature of the +largefiles extension means that updating is no longer guaranteed to +be a local-only operation. If you already have large files tracked by Mercurial without the largefiles extension, you will need to convert your repository in diff -r 0890e6fd3e00 -r 838c6b72928d hgext/largefiles/basestore.py --- a/hgext/largefiles/basestore.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/largefiles/basestore.py Tue May 14 23:04:23 2013 +0400 @@ -8,7 +8,6 @@ '''base class for store implementations and store-related utility code''' -import binascii import re from mercurial import util, node, hg @@ -39,11 +38,12 @@ self.url = url def put(self, source, hash): - '''Put source file into the store under /.''' + '''Put source file into the store so it can be retrieved by hash.''' raise NotImplementedError('abstract method') def exists(self, hashes): - '''Check to see if the store contains the given hashes.''' + '''Check to see if the store contains the given hashes. Given an + iterable of hashes it returns a mapping from hash to bool.''' raise NotImplementedError('abstract method') def get(self, files): @@ -59,32 +59,42 @@ missing = [] ui = self.ui + util.makedirs(lfutil.storepath(self.repo, '')) + at = 0 + available = self.exists(set(hash for (_filename, hash) in files)) for filename, hash in files: ui.progress(_('getting largefiles'), at, unit='lfile', total=len(files)) at += 1 ui.note(_('getting %s:%s\n') % (filename, hash)) + if not available.get(hash): + ui.warn(_('%s: largefile %s not available from %s\n') + % (filename, hash, self.url)) + missing.append(filename) + continue + storefilename = lfutil.storepath(self.repo, hash) - tmpfile = util.atomictempfile(storefilename, + tmpfile = util.atomictempfile(storefilename + '.tmp', createmode=self.repo.store.createmode) try: - hhash = binascii.hexlify(self._getfile(tmpfile, filename, hash)) + hhash = self._getfile(tmpfile, filename, hash) except StoreError, err: ui.warn(err.longmessage()) hhash = "" + tmpfile.close() if hhash != hash: if hhash != "": ui.warn(_('%s: data corruption (expected %s, got %s)\n') % (filename, hash, hhash)) - tmpfile.discard() # no-op if it's already closed + util.unlink(storefilename + '.tmp') missing.append(filename) continue - tmpfile.close() + util.rename(storefilename + '.tmp', storefilename) lfutil.linktousercache(self.repo, hash) success.append((filename, hhash)) @@ -95,40 +105,47 @@ '''Verify the existence (and, optionally, contents) of every big file revision referenced by every changeset in revs. 
Return 0 if all is well, non-zero on any errors.''' - write = self.ui.write failed = False - write(_('searching %d changesets for largefiles\n') % len(revs)) + self.ui.status(_('searching %d changesets for largefiles\n') % + len(revs)) verified = set() # set of (filename, filenode) tuples for rev in revs: cctx = self.repo[rev] cset = "%d:%s" % (cctx.rev(), node.short(cctx.node())) - failed = util.any(self._verifyfile( - cctx, cset, contents, standin, verified) for standin in cctx) + for standin in cctx: + if self._verifyfile(cctx, cset, contents, standin, verified): + failed = True numrevs = len(verified) numlfiles = len(set([fname for (fname, fnode) in verified])) if contents: - write(_('verified contents of %d revisions of %d largefiles\n') - % (numrevs, numlfiles)) + self.ui.status( + _('verified contents of %d revisions of %d largefiles\n') + % (numrevs, numlfiles)) else: - write(_('verified existence of %d revisions of %d largefiles\n') - % (numrevs, numlfiles)) - + self.ui.status( + _('verified existence of %d revisions of %d largefiles\n') + % (numrevs, numlfiles)) return int(failed) def _getfile(self, tmpfile, filename, hash): '''Fetch one revision of one file from the store and write it to tmpfile. Compute the hash of the file on-the-fly as it - downloads and return the binary hash. Close tmpfile. Raise + downloads and return the hash. Close tmpfile. Raise StoreError if unable to download the file (e.g. it does not exist in the store).''' raise NotImplementedError('abstract method') def _verifyfile(self, cctx, cset, contents, standin, verified): '''Perform the actual verification of a file in the store. + 'cset' is only used in warnings. + 'contents' controls verification of content hash. + 'standin' is the standin path of the largefile to verify. + 'verified' is maintained as a set of already verified files. + Returns _true_ if it is a standin and any problems are found! ''' raise NotImplementedError('abstract method') @@ -163,6 +180,7 @@ path = '' remote = repo else: + path, _branches = hg.parseurl(path) remote = hg.peer(repo, {}, path) # The path could be a scheme so use Mercurial's normal functionality diff -r 0890e6fd3e00 -r 838c6b72928d hgext/largefiles/lfcommands.py --- a/hgext/largefiles/lfcommands.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/largefiles/lfcommands.py Tue May 14 23:04:23 2013 +0400 @@ -8,7 +8,7 @@ '''High-level command function for lfconvert, plus the cmdtable.''' -import os +import os, errno import shutil from mercurial import util, match as match_, hg, node, context, error, \ @@ -215,20 +215,12 @@ raise util.Abort(_('largefile %s becomes symlink') % f) # largefile was modified, update standins - fullpath = rdst.wjoin(f) - util.makedirs(os.path.dirname(fullpath)) m = util.sha1('') m.update(ctx[f].data()) hash = m.hexdigest() if f not in lfiletohash or lfiletohash[f] != hash: - try: - fd = open(fullpath, 'wb') - fd.write(ctx[f].data()) - finally: - if fd: - fd.close() + rdst.wwrite(f, ctx[f].data(), ctx[f].flags()) executable = 'x' in ctx[f].flags() - os.chmod(fullpath, lfutil.getmode(executable)) lfutil.writestandin(rdst, lfutil.standin(f), hash, executable) lfiletohash[f] = hash @@ -368,9 +360,9 @@ ui.progress(_('uploading largefiles'), None) def verifylfiles(ui, repo, all=False, contents=False): - '''Verify that every big file revision in the current changeset + '''Verify that every largefile revision in the current changeset exists in the central store. 
With --contents, also verify that - the contents of each big file revision are correct (SHA-1 hash + the contents of each local largefile file revision are correct (SHA-1 hash matches the revision ID). With --all, check every changeset in this repository.''' if all: @@ -403,22 +395,13 @@ toget = [] for lfile in lfiles: - # If we are mid-merge, then we have to trust the standin that is in the - # working copy to have the correct hashvalue. This is because the - # original hg.merge() already updated the standin as part of the normal - # merge process -- we just have to update the largefile to match. - if (getattr(repo, "_ismerging", False) and - os.path.exists(repo.wjoin(lfutil.standin(lfile)))): - expectedhash = lfutil.readstandin(repo, lfile) - else: + try: expectedhash = repo[node][lfutil.standin(lfile)].data().strip() - - # if it exists and its hash matches, it might have been locally - # modified before updating and the user chose 'local'. in this case, - # it will not be in any store, so don't look for it. - if ((not os.path.exists(repo.wjoin(lfile)) or - expectedhash != lfutil.hashfile(repo.wjoin(lfile))) and - not lfutil.findfile(repo, expectedhash)): + except IOError, err: + if err.errno == errno.ENOENT: + continue # node must be None and standin wasn't found in wctx + raise + if not lfutil.findfile(repo, expectedhash): toget.append((lfile, expectedhash)) if toget: @@ -435,11 +418,12 @@ pass totalsuccess = 0 totalmissing = 0 - for ctx in cmdutil.walkchangerevs(repo, matchfn, {'rev' : rev}, - prepare): - success, missing = cachelfiles(ui, repo, ctx.node()) - totalsuccess += len(success) - totalmissing += len(missing) + if rev != []: # walkchangerevs on empty list would return all revs + for ctx in cmdutil.walkchangerevs(repo, matchfn, {'rev' : rev}, + prepare): + success, missing = cachelfiles(ui, repo, ctx.node()) + totalsuccess += len(success) + totalmissing += len(missing) ui.status(_("%d additional largefiles cached\n") % totalsuccess) if totalmissing > 0: ui.status(_("%d largefiles failed to download\n") % totalmissing) @@ -458,7 +442,7 @@ if printmessage and lfiles: ui.status(_('getting changed largefiles\n')) printed = True - cachelfiles(ui, repo, '.', lfiles) + cachelfiles(ui, repo, None, lfiles) updated, removed = 0, 0 for f in lfiles: @@ -500,6 +484,8 @@ # use normallookup() to allocate entry in largefiles dirstate, # because lack of it misleads lfilesrepo.status() into # recognition that such cache missing files are REMOVED. + if lfile not in repo[None]: # not switched to normal file + util.unlinkpath(abslfile, ignoremissing=True) lfdirstate.normallookup(lfile) return None # don't try to set the mode else: @@ -516,7 +502,8 @@ # lfile is added to the repository again. This happens when a # largefile is converted back to a normal file: the standin # disappears, but a new (normal) file appears as the lfile. 
- if os.path.exists(abslfile) and lfile not in repo[None]: + if (os.path.exists(abslfile) and + repo.dirstate.normalize(lfile) not in repo[None]): util.unlinkpath(abslfile) ret = -1 state = repo.dirstate[lfutil.standin(lfile)] @@ -536,22 +523,40 @@ lfdirstate.drop(lfile) return ret -def catlfile(repo, lfile, rev, filename): - hash = lfutil.readstandin(repo, lfile, rev) - if not lfutil.inusercache(repo.ui, hash): - store = basestore._openstore(repo) - success, missing = store.get([(lfile, hash)]) - if len(success) != 1: - raise util.Abort( - _('largefile %s is not in cache and could not be downloaded') - % lfile) - path = lfutil.usercachepath(repo.ui, hash) - fpout = cmdutil.makefileobj(repo, filename) - fpin = open(path, "rb") - fpout.write(fpin.read()) - fpout.close() - fpin.close() - return 0 +def lfpull(ui, repo, source="default", **opts): + """pull largefiles for the specified revisions from the specified source + + Pull largefiles that are referenced from local changesets but missing + locally, pulling from a remote repository to the local cache. + + If SOURCE is omitted, the 'default' path will be used. + See :hg:`help urls` for more information. + + .. container:: verbose + + Some examples: + + - pull largefiles for all branch heads:: + + hg lfpull -r "head() and not closed()" + + - pull largefiles on the default branch:: + + hg lfpull -r "branch(default)" + """ + repo.lfpullsource = source + + revs = opts.get('rev', []) + if not revs: + raise util.Abort(_('no revisions specified')) + revs = scmutil.revrange(repo, revs) + + numcached = 0 + for rev in revs: + ui.note(_('pulling largefiles for revision %s\n') % rev) + (cached, missing) = cachelfiles(ui, repo, rev) + numcached += len(cached) + ui.status(_("%d largefiles cached\n") % numcached) # -- hg commands declarations ------------------------------------------------ @@ -565,6 +570,11 @@ _('convert from a largefiles repo to a normal repo')), ], _('hg lfconvert SOURCE DEST [FILE ...]')), + 'lfpull': (lfpull, + [('r', 'rev', [], _('pull largefiles for these revisions')) + ] + commands.remoteopts, + _('-r REV... 
[-e CMD] [--remotecmd CMD] [SOURCE]') + ), } commands.inferrepo += " lfconvert" diff -r 0890e6fd3e00 -r 838c6b72928d hgext/largefiles/lfutil.py --- a/hgext/largefiles/lfutil.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/largefiles/lfutil.py Tue May 14 23:04:23 2013 +0400 @@ -9,7 +9,6 @@ '''largefiles utility code: must not import other modules in this package.''' import os -import errno import platform import shutil import stat @@ -39,6 +38,7 @@ return lfsize def link(src, dest): + util.makedirs(os.path.dirname(dest)) try: util.oslink(src, dest) except OSError: @@ -86,7 +86,6 @@ elif inusercache(repo.ui, hash): repo.ui.note(_('found %s in system cache\n') % hash) path = storepath(repo, hash) - util.makedirs(os.path.dirname(path)) link(usercachepath(repo.ui, hash), path) return path return None @@ -127,14 +126,7 @@ matcher = getstandinmatcher(repo) for standin in repo.dirstate.walk(matcher, [], False, False): lfile = splitstandin(standin) - hash = readstandin(repo, lfile) lfdirstate.normallookup(lfile) - try: - if hash == hashfile(repo.wjoin(lfile)): - lfdirstate.normal(lfile) - except OSError, err: - if err.errno != errno.ENOENT: - raise return lfdirstate def lfdirstatestatus(lfdirstate, repo, rev): @@ -203,10 +195,10 @@ def copytostoreabsolute(repo, file, hash): - util.makedirs(os.path.dirname(storepath(repo, hash))) if inusercache(repo.ui, hash): link(usercachepath(repo.ui, hash), storepath(repo, hash)) elif not getattr(repo, "_isconverting", False): + util.makedirs(os.path.dirname(storepath(repo, hash))) dst = util.atomictempfile(storepath(repo, hash), createmode=repo.store.createmode) for chunk in util.filechunkiter(open(file, 'rb')): @@ -217,27 +209,16 @@ def linktousercache(repo, hash): path = usercachepath(repo.ui, hash) if path: - util.makedirs(os.path.dirname(path)) link(storepath(repo, hash), path) def getstandinmatcher(repo, pats=[], opts={}): '''Return a match object that applies pats to the standin directory''' standindir = repo.wjoin(shortname) if pats: - # patterns supplied: search standin directory relative to current dir - cwd = repo.getcwd() - if os.path.isabs(cwd): - # cwd is an absolute path for hg -R - # work relative to the repository root in this case - cwd = '' - pats = [os.path.join(standindir, cwd, pat) for pat in pats] - elif os.path.isdir(standindir): + pats = [os.path.join(standindir, pat) for pat in pats] + else: # no patterns: relative to repo root pats = [standindir] - else: - # no patterns and no standin dir: return matcher that matches nothing - return match_.match(repo.root, None, [], exact=True) - # no warnings about missing files or directories match = scmutil.match(repo[None], pats, opts) match.bad = lambda f, msg: None @@ -296,23 +277,16 @@ def writestandin(repo, standin, hash, executable): '''write hash to /''' - writehash(hash, repo.wjoin(standin), executable) + repo.wwrite(standin, hash + '\n', executable and 'x' or '') def copyandhash(instream, outfile): '''Read bytes from instream (iterable) and write them to outfile, - computing the SHA-1 hash of the data along the way. Close outfile - when done and return the binary hash.''' + computing the SHA-1 hash of the data along the way. Return the hash.''' hasher = util.sha1('') for data in instream: hasher.update(data) outfile.write(data) - - # Blecch: closing a file that somebody else opened is rude and - # wrong. But it's so darn convenient and practical! After all, - # outfile was opened just to copy and hash. 
- outfile.close() - - return hasher.digest() + return hasher.hexdigest() def hashrepofile(repo, file): return hashfile(repo.wjoin(file)) @@ -322,53 +296,17 @@ return '' hasher = util.sha1('') fd = open(file, 'rb') - for data in blockstream(fd): + for data in util.filechunkiter(fd, 128 * 1024): hasher.update(data) fd.close() return hasher.hexdigest() -class limitreader(object): - def __init__(self, f, limit): - self.f = f - self.limit = limit - - def read(self, length): - if self.limit == 0: - return '' - length = length > self.limit and self.limit or length - self.limit -= length - return self.f.read(length) - - def close(self): - pass - -def blockstream(infile, blocksize=128 * 1024): - """Generator that yields blocks of data from infile and closes infile.""" - while True: - data = infile.read(blocksize) - if not data: - break - yield data - # same blecch as copyandhash() above - infile.close() - -def writehash(hash, filename, executable): - util.makedirs(os.path.dirname(filename)) - util.writefile(filename, hash + '\n') - os.chmod(filename, getmode(executable)) - def getexecutable(filename): mode = os.stat(filename).st_mode return ((mode & stat.S_IXUSR) and (mode & stat.S_IXGRP) and (mode & stat.S_IXOTH)) -def getmode(executable): - if executable: - return 0755 - else: - return 0644 - def urljoin(first, second, *arg): def join(left, right): if not left.endswith('/'): @@ -408,14 +346,6 @@ def __init__(self, storetypes): self.storetypes = storetypes -def getcurrentheads(repo): - branches = repo.branchmap() - heads = [] - for branch in branches: - newheads = repo.branchheads(branch) - heads = heads + newheads - return heads - def getstandinsstate(repo): standins = [] matcher = getstandinmatcher(repo) diff -r 0890e6fd3e00 -r 838c6b72928d hgext/largefiles/localstore.py --- a/hgext/largefiles/localstore.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/largefiles/localstore.py Tue May 14 23:04:23 2013 +0400 @@ -8,9 +8,6 @@ '''store class for local filesystem''' -import os - -from mercurial import util from mercurial.i18n import _ import lfutil @@ -26,11 +23,9 @@ super(localstore, self).__init__(ui, repo, self.remote.url()) def put(self, source, hash): - util.makedirs(os.path.dirname(lfutil.storepath(self.remote, hash))) if lfutil.instore(self.remote, hash): return - lfutil.link(lfutil.storepath(self.repo, hash), - lfutil.storepath(self.remote, hash)) + lfutil.link(source, lfutil.storepath(self.remote, hash)) def exists(self, hashes): retval = {} @@ -40,11 +35,8 @@ def _getfile(self, tmpfile, filename, hash): - if lfutil.instore(self.remote, hash): - path = lfutil.storepath(self.remote, hash) - elif lfutil.inusercache(self.ui, hash): - path = lfutil.usercachepath(self.ui, hash) - else: + path = lfutil.findfile(self.remote, hash) + if not path: raise basestore.StoreError(filename, hash, self.url, _("can't get file locally")) fd = open(path, 'rb') @@ -63,23 +55,19 @@ return False expecthash = fctx.data()[0:40] + storepath = lfutil.storepath(self.remote, expecthash) verified.add(key) if not lfutil.instore(self.remote, expecthash): self.ui.warn( - _('changeset %s: %s missing\n' - ' (looked for hash %s)\n') - % (cset, filename, expecthash)) + _('changeset %s: %s references missing %s\n') + % (cset, filename, storepath)) return True # failed if contents: - storepath = lfutil.storepath(self.remote, expecthash) actualhash = lfutil.hashfile(storepath) if actualhash != expecthash: self.ui.warn( - _('changeset %s: %s: contents differ\n' - ' (%s:\n' - ' expected hash %s,\n' - ' but got %s)\n') - % (cset, 
filename, storepath, expecthash, actualhash)) + _('changeset %s: %s references corrupted %s\n') + % (cset, filename, storepath)) return True # failed return False diff -r 0890e6fd3e00 -r 838c6b72928d hgext/largefiles/overrides.py --- a/hgext/largefiles/overrides.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/largefiles/overrides.py Tue May 14 23:04:23 2013 +0400 @@ -19,6 +19,7 @@ import lfutil import lfcommands +import basestore # -- Utility functions: commonly/repeatedly needed functionality --------------- @@ -34,6 +35,7 @@ manifest) m._files = filter(notlfile, m._files) m._fmap = set(m._files) + m._always = False origmatchfn = m.matchfn m.matchfn = lambda f: notlfile(f) and origmatchfn(f) or None return m @@ -251,6 +253,7 @@ standins = [lfutil.standin(f) for f in m._files] m._files.extend(standins) m._fmap = set(m._files) + m._always = False origmatchfn = m.matchfn def lfmatchfn(f): lf = lfutil.splitstandin(f) @@ -274,7 +277,7 @@ contents = opts.pop('lfc', False) result = orig(ui, repo, *pats, **opts) - if large: + if large or all or contents: result = result or lfcommands.verifylfiles(ui, repo, all, contents) return result @@ -330,7 +333,7 @@ # largefiles. This makes the merge proceed and we can then handle this # case further in the overridden manifestmerge function below. def overridecheckunknownfile(origfn, repo, wctx, mctx, f): - if lfutil.standin(f) in wctx: + if lfutil.standin(repo.dirstate.normalize(f)) in wctx: return False return origfn(repo, wctx, mctx, f) @@ -360,29 +363,35 @@ # Finally, the merge.applyupdates function will then take care of # writing the files into the working copy and lfcommands.updatelfiles # will update the largefiles. -def overridemanifestmerge(origfn, repo, p1, p2, pa, overwrite, partial): - actions = origfn(repo, p1, p2, pa, overwrite, partial) +def overridemanifestmerge(origfn, repo, p1, p2, pa, branchmerge, force, + partial, acceptremote=False): + overwrite = force and not branchmerge + actions = origfn(repo, p1, p2, pa, branchmerge, force, partial, + acceptremote) processed = [] for action in actions: if overwrite: processed.append(action) continue - f, m = action[:2] + f, m, args, msg = action choices = (_('&Largefile'), _('&Normal file')) - if m == "g" and lfutil.splitstandin(f) in p1 and f in p2: + + splitstandin = lfutil.splitstandin(f) + if (m == "g" and splitstandin is not None and + splitstandin in p1 and f in p2): # Case 1: normal file in the working copy, largefile in # the second parent - lfile = lfutil.splitstandin(f) + lfile = splitstandin standin = f msg = _('%s has been turned into a largefile\n' 'use (l)argefile or keep as (n)ormal file?') % lfile if repo.ui.promptchoice(msg, choices, 0) == 0: - processed.append((lfile, "r")) - processed.append((standin, "g", p2.flags(standin))) + processed.append((lfile, "r", None, msg)) + processed.append((standin, "g", (p2.flags(standin),), msg)) else: - processed.append((standin, "r")) + processed.append((standin, "r", None, msg)) elif m == "g" and lfutil.standin(f) in p1 and f in p2: # Case 2: largefile in the working copy, normal file in # the second parent @@ -391,10 +400,10 @@ msg = _('%s has been turned into a normal file\n' 'keep as (l)argefile or use (n)ormal file?') % lfile if repo.ui.promptchoice(msg, choices, 0) == 0: - processed.append((lfile, "r")) + processed.append((lfile, "r", None, msg)) else: - processed.append((standin, "r")) - processed.append((lfile, "g", p2.flags(lfile))) + processed.append((standin, "r", None, msg)) + processed.append((lfile, "g", (p2.flags(lfile),), msg)) 
else: processed.append(action) @@ -513,6 +522,7 @@ lfile = lambda f: lfutil.standin(f) in manifest m._files = [lfutil.standin(f) for f in m._files if lfile(f)] m._fmap = set(m._files) + m._always = False origmatchfn = m.matchfn m.matchfn = lambda f: (lfutil.isstandin(f) and (f in manifest) and @@ -619,6 +629,7 @@ m._files = [tostandin(f) for f in m._files] m._files = [f for f in m._files if f is not None] m._fmap = set(m._files) + m._always = False origmatchfn = m.matchfn def matchfn(f): if lfutil.isstandin(f): @@ -684,15 +695,8 @@ return result def hgmerge(orig, repo, node, force=None, remind=True): - # Mark the repo as being in the middle of a merge, so that - # updatelfiles() will know that it needs to trust the standins in - # the working copy, not in the standins in the current node - repo._ismerging = True - try: - result = orig(repo, node, force, remind) - lfcommands.updatelfiles(repo.ui, repo) - finally: - repo._ismerging = False + result = orig(repo, node, force, remind) + lfcommands.updatelfiles(repo.ui, repo) return result # When we rebase a repository with remotely changed largefiles, we need to @@ -700,6 +704,9 @@ # working copy def overridepull(orig, ui, repo, source=None, **opts): revsprepull = len(repo) + if not source: + source = 'default' + repo.lfpullsource = source if opts.get('rebase', False): repo._isrebasing = True try: @@ -713,9 +720,6 @@ def _dummy(*args, **kwargs): pass commands.postincoming = _dummy - if not source: - source = 'default' - repo.lfpullsource = source try: result = commands.pull(ui, repo, source, **opts) finally: @@ -726,31 +730,50 @@ finally: repo._isrebasing = False else: - if not source: - source = 'default' - repo.lfpullsource = source - oldheads = lfutil.getcurrentheads(repo) result = orig(ui, repo, source, **opts) - # If we do not have the new largefiles for any new heads we pulled, we - # will run into a problem later if we try to merge or rebase with one of - # these heads, so cache the largefiles now directly into the system - # cache. - ui.status(_("caching new largefiles\n")) + revspostpull = len(repo) + lfrevs = opts.get('lfrev', []) + if opts.get('all_largefiles'): + lfrevs.append('pulled()') + if lfrevs and revspostpull > revsprepull: numcached = 0 - heads = lfutil.getcurrentheads(repo) - newheads = set(heads).difference(set(oldheads)) - for head in newheads: - (cached, missing) = lfcommands.cachelfiles(ui, repo, head) - numcached += len(cached) + repo.firstpulled = revsprepull # for pulled() revset expression + try: + for rev in scmutil.revrange(repo, lfrevs): + ui.note(_('pulling largefiles for revision %s\n') % rev) + (cached, missing) = lfcommands.cachelfiles(ui, repo, rev) + numcached += len(cached) + finally: + del repo.firstpulled ui.status(_("%d largefiles cached\n") % numcached) - if opts.get('all_largefiles'): - revspostpull = len(repo) - revs = [] - for rev in xrange(revsprepull + 1, revspostpull): - revs.append(repo[rev].rev()) - lfcommands.downloadlfiles(ui, repo, revs) return result +def pulledrevsetsymbol(repo, subset, x): + """``pulled()`` + Changesets that just has been pulled. + + Only available with largefiles from pull --lfrev expressions. + + .. 
container:: verbose + + Some examples: + + - pull largefiles for all new changesets:: + + hg pull -lfrev "pulled()" + + - pull largefiles for all new branch heads:: + + hg pull -lfrev "head(pulled()) and not closed()" + + """ + + try: + firstpulled = repo.firstpulled + except AttributeError: + raise util.Abort(_("pulled() only available in --lfrev")) + return [r for r in subset if r >= firstpulled] + def overrideclone(orig, ui, source, dest=None, **opts): d = dest if d is None: @@ -769,17 +792,6 @@ sourcerepo, destrepo = result repo = destrepo.local() - # The .hglf directory must exist for the standin matcher to match - # anything (which listlfiles uses for each rev), and .hg/largefiles is - # assumed to exist by the code that caches the downloaded file. These - # directories exist if clone updated to any rev. (If the repo does not - # have largefiles, download never gets to the point of needing - # .hg/largefiles, and the standin matcher won't match anything anyway.) - if 'largefiles' in repo.requirements: - if opts.get('noupdate'): - util.makedirs(repo.wjoin(lfutil.shortname)) - util.makedirs(repo.join(lfutil.longname)) - # Caching is implicitly limited to 'rev' option, since the dest repo was # truncated at that point. The user may expect a download count with # this option, so attempt whether or not this is a largefile repo. @@ -1149,10 +1161,49 @@ def overridecat(orig, ui, repo, file1, *pats, **opts): ctx = scmutil.revsingle(repo, opts.get('rev')) - if not lfutil.standin(file1) in ctx: - result = orig(ui, repo, file1, *pats, **opts) - return result - return lfcommands.catlfile(repo, file1, ctx.rev(), opts.get('output')) + err = 1 + notbad = set() + m = scmutil.match(ctx, (file1,) + pats, opts) + origmatchfn = m.matchfn + def lfmatchfn(f): + lf = lfutil.splitstandin(f) + if lf is None: + return origmatchfn(f) + notbad.add(lf) + return origmatchfn(lf) + m.matchfn = lfmatchfn + origbadfn = m.bad + def lfbadfn(f, msg): + if not f in notbad: + return origbadfn(f, msg) + m.bad = lfbadfn + for f in ctx.walk(m): + fp = cmdutil.makefileobj(repo, opts.get('output'), ctx.node(), + pathname=f) + lf = lfutil.splitstandin(f) + if lf is None: + # duplicating unreachable code from commands.cat + data = ctx[f].data() + if opts.get('decode'): + data = repo.wwritedata(f, data) + fp.write(data) + else: + hash = lfutil.readstandin(repo, lf, ctx.rev()) + if not lfutil.inusercache(repo.ui, hash): + store = basestore._openstore(repo) + success, missing = store.get([(lf, hash)]) + if len(success) != 1: + raise util.Abort( + _('largefile %s is not in cache and could not be ' + 'downloaded') % lf) + path = lfutil.usercachepath(repo.ui, hash) + fpin = open(path, "rb") + for chunk in util.filechunkiter(fpin, 128 * 1024): + fp.write(chunk) + fpin.close() + fp.close() + err = 0 + return err def mercurialsinkbefore(orig, sink): sink.repo._isconverting = True diff -r 0890e6fd3e00 -r 838c6b72928d hgext/largefiles/proto.py --- a/hgext/largefiles/proto.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/largefiles/proto.py Tue May 14 23:04:23 2013 +0400 @@ -16,6 +16,11 @@ '\n\nPlease enable it in your Mercurial config ' 'file.\n') +# these will all be replaced by largefiles.uisetup +capabilitiesorig = None +ssholdcallstream = None +httpoldcallstream = None + def putlfile(repo, proto, sha): '''Put a largefile into a repository's local store and into the user cache.''' @@ -58,23 +63,21 @@ # ssh proto does for string responses. 
def generator(): yield '%d\n' % length - for chunk in f: + for chunk in util.filechunkiter(f): yield chunk return wireproto.streamres(generator()) def statlfile(repo, proto, sha): - '''Return '2\n' if the largefile is missing, '1\n' if it has a - mismatched checksum, or '0\n' if it is in good condition''' + '''Return '2\n' if the largefile is missing, '0\n' if it seems to be in + good condition. + + The value 1 is reserved for mismatched checksum, but that is too expensive + to be verified on every stat and must be caught be running 'hg verify' + server side.''' filename = lfutil.findfile(repo, sha) if not filename: return '2\n' - fd = None - try: - fd = open(filename, 'rb') - return lfutil.hexsha1(fd) == sha and '0\n' or '1\n' - finally: - if fd: - fd.close() + return '0\n' def wirereposetup(ui, repo): class lfileswirerepository(repo.__class__): @@ -111,6 +114,7 @@ _('putlfile failed (unexpected response):'), ret) def getlfile(self, sha): + """returns an iterable with the chunks of the file with sha sha""" stream = self._callstream("getlfile", sha=sha) length = stream.readline() try: @@ -118,7 +122,17 @@ except ValueError: self._abort(error.ResponseError(_("unexpected response:"), length)) - return (length, stream) + + # SSH streams will block if reading more than length + for chunk in util.filechunkiter(stream, 128 * 1024, length): + yield chunk + # HTTP streams must hit the end to process the last empty + # chunk of Chunked-Encoding so the connection can be reused. + if issubclass(self.__class__, httppeer.httppeer): + chunk = stream.read(1) + if chunk: + self._abort(error.ResponseError(_("unexpected response:"), + chunk)) @batchable def statlfile(self, sha): diff -r 0890e6fd3e00 -r 838c6b72928d hgext/largefiles/remotestore.py --- a/hgext/largefiles/remotestore.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/largefiles/remotestore.py Tue May 14 23:04:23 2013 +0400 @@ -29,7 +29,7 @@ _('remotestore: put %s to remote store %s') % (source, self.url)) def exists(self, hashes): - return self._verify(hashes) + return dict((h, s == 0) for (h, s) in self._stat(hashes).iteritems()) def sendfile(self, filename, hash): self.ui.debug('remotestore: sendfile(%s, %s)\n' % (filename, hash)) @@ -47,15 +47,8 @@ fd.close() def _getfile(self, tmpfile, filename, hash): - # quit if the largefile isn't there - stat = self._stat(hash) - if stat == 1: - raise util.Abort(_('remotestore: largefile %s is invalid') % hash) - elif stat == 2: - raise util.Abort(_('remotestore: largefile %s is missing') % hash) - try: - length, infile = self._get(hash) + chunks = self._get(hash) except urllib2.HTTPError, e: # 401s get converted to util.Aborts; everything else is fine being # turned into a StoreError @@ -68,13 +61,7 @@ except IOError, e: raise basestore.StoreError(filename, hash, self.url, str(e)) - # Mercurial does not close its SSH connections after writing a stream - if length is not None: - infile = lfutil.limitreader(infile, length) - return lfutil.copyandhash(lfutil.blockstream(infile), tmpfile) - - def _verify(self, hashes): - return self._stat(hashes) + return lfutil.copyandhash(chunks, tmpfile) def _verifyfile(self, cctx, cset, contents, standin, verified): filename = lfutil.splitstandin(standin) @@ -87,7 +74,8 @@ verified.add(key) - stat = self._stat(hash) + expecthash = fctx.data()[0:40] + stat = self._stat([expecthash])[expecthash] if not stat: return False elif stat == 1: diff -r 0890e6fd3e00 -r 838c6b72928d hgext/largefiles/reposetup.py --- a/hgext/largefiles/reposetup.py Sun May 12 15:35:53 2013 +0400 +++ 
b/hgext/largefiles/reposetup.py Tue May 14 23:04:23 2013 +0400 @@ -8,7 +8,6 @@ '''setup for largefiles repositories: reposetup''' import copy -import types import os from mercurial import context, error, manifest, match as match_, util, \ @@ -27,15 +26,6 @@ if not repo.local(): return proto.wirereposetup(ui, repo) - for name in ('status', 'commitctx', 'commit', 'push'): - method = getattr(repo, name) - if (isinstance(method, types.FunctionType) and - method.func_name == 'wrap'): - ui.warn(_('largefiles: repo method %r appears to have already been' - ' wrapped by another extension: ' - 'largefiles may behave incorrectly\n') - % name) - class lfilesrepo(repo.__class__): lfstatus = False def status_nolfiles(self, *args, **kwargs): @@ -125,127 +115,146 @@ if match is None: match = match_.always(self.root, self.getcwd()) - # First check if there were files specified on the - # command line. If there were, and none of them were - # largefiles, we should just bail here and let super - # handle it -- thus gaining a big performance boost. - lfdirstate = lfutil.openlfdirstate(ui, self) - if match.files() and not match.anypats(): - for f in lfdirstate: - if match(f): - break - else: - return super(lfilesrepo, self).status(node1, node2, - match, listignored, listclean, - listunknown, listsubrepos) + wlock = None + try: + try: + # updating the dirstate is optional + # so we don't wait on the lock + wlock = self.wlock(False) + except error.LockError: + pass - # Create a copy of match that matches standins instead - # of largefiles. - def tostandins(files): - if not working: - return files - newfiles = [] - dirstate = self.dirstate - for f in files: - sf = lfutil.standin(f) - if sf in dirstate: - newfiles.append(sf) - elif sf in dirstate.dirs(): - # Directory entries could be regular or - # standin, check both - newfiles.extend((f, sf)) + # First check if there were files specified on the + # command line. If there were, and none of them were + # largefiles, we should just bail here and let super + # handle it -- thus gaining a big performance boost. + lfdirstate = lfutil.openlfdirstate(ui, self) + if match.files() and not match.anypats(): + for f in lfdirstate: + if match(f): + break else: - newfiles.append(f) - return newfiles - - m = copy.copy(match) - m._files = tostandins(m._files) + return super(lfilesrepo, self).status(node1, node2, + match, listignored, listclean, + listunknown, listsubrepos) - result = super(lfilesrepo, self).status(node1, node2, m, - ignored, clean, unknown, listsubrepos) - if working: + # Create a copy of match that matches standins instead + # of largefiles. 
+ def tostandins(files): + if not working: + return files + newfiles = [] + dirstate = self.dirstate + for f in files: + sf = lfutil.standin(f) + if sf in dirstate: + newfiles.append(sf) + elif sf in dirstate.dirs(): + # Directory entries could be regular or + # standin, check both + newfiles.extend((f, sf)) + else: + newfiles.append(f) + return newfiles - def sfindirstate(f): - sf = lfutil.standin(f) - dirstate = self.dirstate - return sf in dirstate or sf in dirstate.dirs() + m = copy.copy(match) + m._files = tostandins(m._files) - match._files = [f for f in match._files - if sfindirstate(f)] - # Don't waste time getting the ignored and unknown - # files from lfdirstate - s = lfdirstate.status(match, [], False, - listclean, False) - (unsure, modified, added, removed, missing, _unknown, - _ignored, clean) = s - if parentworking: - for lfile in unsure: - standin = lfutil.standin(lfile) - if standin not in ctx1: - # from second parent - modified.append(lfile) - elif ctx1[standin].data().strip() \ - != lfutil.hashfile(self.wjoin(lfile)): - modified.append(lfile) - else: - clean.append(lfile) - lfdirstate.normal(lfile) - else: - tocheck = unsure + modified + added + clean - modified, added, clean = [], [], [] + result = super(lfilesrepo, self).status(node1, node2, m, + ignored, clean, unknown, listsubrepos) + if working: + + def sfindirstate(f): + sf = lfutil.standin(f) + dirstate = self.dirstate + return sf in dirstate or sf in dirstate.dirs() - for lfile in tocheck: - standin = lfutil.standin(lfile) - if inctx(standin, ctx1): - if ctx1[standin].data().strip() != \ - lfutil.hashfile(self.wjoin(lfile)): + match._files = [f for f in match._files + if sfindirstate(f)] + # Don't waste time getting the ignored and unknown + # files from lfdirstate + s = lfdirstate.status(match, [], False, + listclean, False) + (unsure, modified, added, removed, missing, _unknown, + _ignored, clean) = s + if parentworking: + for lfile in unsure: + standin = lfutil.standin(lfile) + if standin not in ctx1: + # from second parent + modified.append(lfile) + elif ctx1[standin].data().strip() \ + != lfutil.hashfile(self.wjoin(lfile)): modified.append(lfile) else: clean.append(lfile) - else: - added.append(lfile) + lfdirstate.normal(lfile) + else: + tocheck = unsure + modified + added + clean + modified, added, clean = [], [], [] - # Standins no longer found in lfdirstate has been removed - for standin in ctx1.manifest(): - if not lfutil.isstandin(standin): - continue - lfile = lfutil.splitstandin(standin) - if not match(lfile): - continue - if lfile not in lfdirstate: - removed.append(lfile) + for lfile in tocheck: + standin = lfutil.standin(lfile) + if inctx(standin, ctx1): + if ctx1[standin].data().strip() != \ + lfutil.hashfile(self.wjoin(lfile)): + modified.append(lfile) + else: + clean.append(lfile) + else: + added.append(lfile) - # Filter result lists - result = list(result) + # Standins no longer found in lfdirstate has been + # removed + for standin in ctx1.manifest(): + if not lfutil.isstandin(standin): + continue + lfile = lfutil.splitstandin(standin) + if not match(lfile): + continue + if lfile not in lfdirstate: + removed.append(lfile) + + # Filter result lists + result = list(result) - # Largefiles are not really removed when they're - # still in the normal dirstate. Likewise, normal - # files are not really removed if they are still in - # lfdirstate. This happens in merges where files - # change type. 
- removed = [f for f in removed if f not in self.dirstate] - result[2] = [f for f in result[2] if f not in lfdirstate] + # Largefiles are not really removed when they're + # still in the normal dirstate. Likewise, normal + # files are not really removed if they are still in + # lfdirstate. This happens in merges where files + # change type. + removed = [f for f in removed + if f not in self.dirstate] + result[2] = [f for f in result[2] + if f not in lfdirstate] - lfiles = set(lfdirstate._map) - # Unknown files - result[4] = set(result[4]).difference(lfiles) - # Ignored files - result[5] = set(result[5]).difference(lfiles) - # combine normal files and largefiles - normals = [[fn for fn in filelist - if not lfutil.isstandin(fn)] - for filelist in result] - lfiles = (modified, added, removed, missing, [], [], clean) - result = [sorted(list1 + list2) - for (list1, list2) in zip(normals, lfiles)] - else: - def toname(f): - if lfutil.isstandin(f): - return lfutil.splitstandin(f) - return f - result = [[toname(f) for f in items] for items in result] + lfiles = set(lfdirstate._map) + # Unknown files + result[4] = set(result[4]).difference(lfiles) + # Ignored files + result[5] = set(result[5]).difference(lfiles) + # combine normal files and largefiles + normals = [[fn for fn in filelist + if not lfutil.isstandin(fn)] + for filelist in result] + lfiles = (modified, added, removed, missing, [], [], + clean) + result = [sorted(list1 + list2) + for (list1, list2) in zip(normals, lfiles)] + else: + def toname(f): + if lfutil.isstandin(f): + return lfutil.splitstandin(f) + return f + result = [[toname(f) for f in items] + for items in result] - lfdirstate.write() + if wlock: + lfdirstate.write() + + finally: + if wlock: + wlock.release() if not listunknown: result[4] = [] @@ -299,9 +308,9 @@ lfdirstate = lfutil.openlfdirstate(ui, self) dirtymatch = match_.always(self.root, self.getcwd()) s = lfdirstate.status(dirtymatch, [], False, False, False) - modifiedfiles = [] - for i in s: - modifiedfiles.extend(i) + (unsure, modified, added, removed, _missing, _unknown, + _ignored, _clean) = s + modifiedfiles = unsure + modified + added + removed lfiles = lfutil.listlfiles(self) # this only loops through largefiles that exist (not # removed/renamed) @@ -446,7 +455,7 @@ the largefiles. So we do the following: For directories that only have largefiles as matches, - we explicitly add the largefiles to the matchlist and remove + we explicitly add the largefiles to the match list and remove the directory. In other cases, we leave the match list unmodified. 
''' diff -r 0890e6fd3e00 -r 838c6b72928d hgext/largefiles/uisetup.py --- a/hgext/largefiles/uisetup.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/largefiles/uisetup.py Tue May 14 23:04:23 2013 +0400 @@ -9,7 +9,7 @@ '''setup for largefiles extension: uisetup''' from mercurial import archival, cmdutil, commands, extensions, filemerge, hg, \ - httppeer, localrepo, merge, scmutil, sshpeer, wireproto + httppeer, localrepo, merge, scmutil, sshpeer, wireproto, revset from mercurial.i18n import _ from mercurial.hgweb import hgweb_mod, webcommands from mercurial.subrepo import hgsubrepo @@ -52,11 +52,12 @@ entry = extensions.wrapcommand(commands.table, 'verify', overrides.overrideverify) - verifyopt = [('', 'large', None, _('verify largefiles')), + verifyopt = [('', 'large', None, + _('verify that all largefiles in current revision exists')), ('', 'lfa', None, - _('verify all revisions of largefiles not just current')), + _('verify largefiles in all revisions, not just current')), ('', 'lfc', None, - _('verify largefile contents not just existence'))] + _('verify local largefile contents, not just existence'))] entry[1].extend(verifyopt) entry = extensions.wrapcommand(commands.table, 'debugstate', @@ -78,8 +79,12 @@ entry = extensions.wrapcommand(commands.table, 'pull', overrides.overridepull) pullopt = [('', 'all-largefiles', None, - _('download all pulled versions of largefiles'))] + _('download all pulled versions of largefiles (DEPRECATED)')), + ('', 'lfrev', [], + _('download largefiles for these revisions'), _('REV'))] entry[1].extend(pullopt) + revset.symbols['pulled'] = overrides.pulledrevsetsymbol + entry = extensions.wrapcommand(commands.table, 'clone', overrides.overrideclone) cloneopt = [('', 'all-largefiles', None, diff -r 0890e6fd3e00 -r 838c6b72928d hgext/largefiles/wirestore.py --- a/hgext/largefiles/wirestore.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/largefiles/wirestore.py Tue May 14 23:04:23 2013 +0400 @@ -26,6 +26,9 @@ return self.remote.getlfile(hash) def _stat(self, hashes): + '''For each hash, return 0 if it is available, other values if not. + It is usually 2 if the largefile is missing, but might be 1 the server + has a corrupted copy.''' batch = self.remote.batch() futures = {} for hash in hashes: @@ -33,5 +36,5 @@ batch.submit() retval = {} for hash in hashes: - retval[hash] = not futures[hash].value + retval[hash] = futures[hash].value return retval diff -r 0890e6fd3e00 -r 838c6b72928d hgext/mq.py --- a/hgext/mq.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/mq.py Tue May 14 23:04:23 2013 +0400 @@ -282,7 +282,7 @@ if phase is not None: backup = repo.ui.backupconfig('phases', 'new-commit') # Marking the repository as committing an mq patch can be used - # to optimize operations like _branchtags(). + # to optimize operations like branchtags(). repo._committingpatch = True try: if phase is not None: @@ -297,7 +297,7 @@ pass class queue(object): - def __init__(self, ui, path, patchdir=None): + def __init__(self, ui, baseui, path, patchdir=None): self.basepath = path try: fh = open(os.path.join(path, 'patches.queue')) @@ -312,6 +312,7 @@ self.path = patchdir or curpath self.opener = scmutil.opener(self.path) self.ui = ui + self.baseui = baseui self.applieddirty = False self.seriesdirty = False self.added = [] @@ -1571,7 +1572,7 @@ r = list(dd) a = list(aa) - # create 'match' that includes the files to be recommited. + # create 'match' that includes the files to be recommitted. # apply matchfn via repo.status to ensure correct case handling. 
cm, ca, cr, cd = repo.status(patchparent, match=matchfn)[:4] allmatches = set(cm + ca + cr + cd) @@ -1774,9 +1775,7 @@ return True def qrepo(self, create=False): - ui = self.ui.copy() - ui.setconfig('paths', 'default', '', overlay=False) - ui.setconfig('paths', 'default-push', '', overlay=False) + ui = self.baseui.copy() if create or os.path.isdir(self.join(".hg")): return hg.repository(ui, path=self.path, create=create) @@ -2761,7 +2760,7 @@ if not newpath: ui.warn(_("no saved queues found, please use -n\n")) return 1 - mergeq = queue(ui, repo.path, newpath) + mergeq = queue(ui, repo.baseui, repo.path, newpath) ui.warn(_("merging with queue at: %s\n") % mergeq.path) ret = q.push(repo, patch, force=opts.get('force'), list=opts.get('list'), mergeq=mergeq, all=opts.get('all'), move=opts.get('move'), @@ -2795,7 +2794,7 @@ opts = fixkeepchangesopts(ui, opts) localupdate = True if opts.get('name'): - q = queue(ui, repo.path, repo.join(opts.get('name'))) + q = queue(ui, repo.baseui, repo.path, repo.join(opts.get('name'))) ui.warn(_('using patch queue: %s\n') % q.path) localupdate = False else: @@ -3037,7 +3036,22 @@ wlock = repo.wlock() try: urev = repo.mq.qparents(repo, revs[0]) - repo.dirstate.rebuild(urev, repo[urev].manifest()) + uctx = repo[urev] + + # only reset the dirstate for files that would actually change + # between the working context and uctx + descendantrevs = repo.revs("%s::." % uctx.rev()) + changedfiles = [] + for rev in descendantrevs: + # blindly reset the files, regardless of what actually changed + changedfiles.extend(repo[rev].files()) + + # reset files that only changed in the dirstate too + dirstate = repo.dirstate + dirchanges = [f for f in dirstate if dirstate[f] != 'n'] + changedfiles.extend(dirchanges) + + repo.dirstate.rebuild(urev, uctx.manifest(), changedfiles) repo.dirstate.write() update = False finally: @@ -3398,7 +3412,7 @@ class mqrepo(repo.__class__): @util.propertycache def mq(self): - return queue(self.ui, self.path) + return queue(self.ui, self.baseui, self.path) def abortifwdirpatched(self, errmsg, force=False): if self.mq.applied and not force: @@ -3454,6 +3468,12 @@ % short(mqtags[-1][0])) return result + # do not add fake tags for filtered revisions + included = self.changelog.hasnode + mqtags = [mqt for mqt in mqtags if included(mqt[0])] + if not mqtags: + return result + mqtags.append((mqtags[-1][0], 'qtip')) mqtags.append((mqtags[0][0], 'qbase')) mqtags.append((self.changelog.parents(mqtags[0][0])[0], 'qparent')) diff -r 0890e6fd3e00 -r 838c6b72928d hgext/pager.py --- a/hgext/pager.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/pager.py Tue May 14 23:04:23 2013 +0400 @@ -47,7 +47,7 @@ normal behavior. 
''' -import atexit, sys, os, signal, subprocess +import atexit, sys, os, signal, subprocess, errno, shlex from mercurial import commands, dispatch, util, extensions from mercurial.i18n import _ @@ -94,6 +94,8 @@ @atexit.register def killpager(): + if util.safehasattr(signal, "SIGINT"): + signal.signal(signal.SIGINT, signal.SIG_IGN) pager.stdin.close() os.dup2(stdout, sys.stdout.fileno()) os.dup2(stderr, sys.stderr.fileno()) diff -r 0890e6fd3e00 -r 838c6b72928d hgext/patchbomb.py --- a/hgext/patchbomb.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/patchbomb.py Tue May 14 23:04:23 2013 +0400 @@ -540,7 +540,13 @@ fp.close() else: if not sendmail: - sendmail = mail.connect(ui, mbox=mbox) + verifycert = ui.config('smtp', 'verifycert') + if opts.get('insecure'): + ui.setconfig('smtp', 'verifycert', 'loose') + try: + sendmail = mail.connect(ui, mbox=mbox) + finally: + ui.setconfig('smtp', 'verifycert', verifycert) ui.status(_('sending '), subj, ' ...\n') ui.progress(_('sending'), i, item=subj, total=len(msgs)) if not mbox: diff -r 0890e6fd3e00 -r 838c6b72928d hgext/rebase.py --- a/hgext/rebase.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/rebase.py Tue May 14 23:04:23 2013 +0400 @@ -15,7 +15,7 @@ ''' from mercurial import hg, util, repair, merge, cmdutil, commands, bookmarks -from mercurial import extensions, patch, scmutil, phases, obsolete +from mercurial import extensions, patch, scmutil, phases, obsolete, error from mercurial.commands import templateopts from mercurial.node import nullrev from mercurial.lock import release @@ -68,6 +68,9 @@ same rebase or they will end up with duplicated changesets after pulling in your rebased changesets. + In its default configuration, Mercurial will prevent you from + rebasing published changes. See :hg:`help phases` for details. + If you don't specify a destination changeset (``-d/--dest``), rebase uses the tipmost head of the current named branch as the destination. (The destination changeset is not modified by @@ -85,6 +88,11 @@ the whole branch. If you specify neither ``-s`` nor ``-b``, rebase uses the parent of the working directory as the base. + For advanced usage, a third way is available through the ``--rev`` + option. It allows you to specify an arbitrary set of changesets to + rebase. Descendants of revs you specify with this option are not + automatically included in the rebase. + By default, rebase recreates the changesets in the source branch as descendants of dest and then destroys the originals. Use ``--keep`` to preserve the original source changesets. Some @@ -104,6 +112,7 @@ Returns 0 on success, 1 if nothing to rebase. 
""" originalwd = target = None + activebookmark = None external = nullrev state = {} skipped = set() @@ -151,7 +160,7 @@ ui.warn(_('tool option will be ignored\n')) (originalwd, target, state, skipped, collapsef, keepf, - keepbranchesf, external) = restorestatus(repo) + keepbranchesf, external, activebookmark) = restorestatus(repo) if abortf: return abort(repo, originalwd, target, state) else: @@ -200,10 +209,6 @@ _("can't remove original changesets with" " unrebased descendants"), hint=_('use --keep to keep original changesets')) - elif not keepf and not repo[root].mutable(): - raise util.Abort(_("can't rebase immutable changeset %s") - % repo[root], - hint=_('see hg help phases for details')) else: result = buildstate(repo, dest, rebaseset, collapsef) @@ -211,6 +216,10 @@ # Empty state built, nothing to rebase ui.status(_('nothing to rebase\n')) return 1 + elif not keepf and not repo[root].mutable(): + raise util.Abort(_("can't rebase immutable changeset %s") + % repo[root], + hint=_('see hg help phases for details')) else: originalwd, target, state = result if collapsef: @@ -237,7 +246,7 @@ # Keep track of the current bookmarks in order to reset them later currentbookmarks = repo._bookmarks.copy() - activebookmark = repo._bookmarkcurrent + activebookmark = activebookmark or repo._bookmarkcurrent if activebookmark: bookmarks.unsetcurrent(repo) @@ -250,7 +259,7 @@ ui.progress(_("rebasing"), pos, ("%d:%s" % (rev, repo[rev])), _('changesets'), total) storestatus(repo, originalwd, target, state, collapsef, keepf, - keepbranchesf, external) + keepbranchesf, external, activebookmark) p1, p2 = defineparents(repo, rev, target, state, targetancestors) if len(repo.parents()) == 2: @@ -260,8 +269,9 @@ ui.setconfig('ui', 'forcemerge', opts.get('tool', '')) stats = rebasenode(repo, rev, p1, state, collapsef) if stats and stats[3] > 0: - raise util.Abort(_('unresolved conflicts (see hg ' - 'resolve, then hg rebase --continue)')) + raise error.InterventionRequired( + _('unresolved conflicts (see hg ' + 'resolve, then hg rebase --continue)')) finally: ui.setconfig('ui', 'forcemerge', '') cmdutil.duplicatecopies(repo, rev, target) @@ -308,6 +318,9 @@ for k, v in state.iteritems(): if v > nullmerge: nstate[repo[k].node()] = repo[v].node() + # XXX this is the same as dest.node() for the non-continue path -- + # this should probably be cleaned up + targetnode = repo[target].node() if not keepf: collapsedas = None @@ -316,7 +329,7 @@ clearrebased(ui, repo, state, skipped, collapsedas) if currentbookmarks: - updatebookmarks(repo, nstate, currentbookmarks, **opts) + updatebookmarks(repo, targetnode, nstate, currentbookmarks) clearstatus(repo) ui.note(_("rebase completed\n")) @@ -493,19 +506,19 @@ mq.seriesdirty = True mq.savedirty() -def updatebookmarks(repo, nstate, originalbookmarks, **opts): - 'Move bookmarks to their correct changesets' +def updatebookmarks(repo, targetnode, nstate, originalbookmarks): + 'Move bookmarks to their correct changesets, and delete divergent ones' marks = repo._bookmarks for k, v in originalbookmarks.iteritems(): if v in nstate: - if nstate[v] > nullmerge: - # update the bookmarks for revs that have moved - marks[k] = nstate[v] + # update the bookmarks for revs that have moved + marks[k] = nstate[v] + bookmarks.deletedivergent(repo, [targetnode], k) marks.write() def storestatus(repo, originalwd, target, state, collapse, keep, keepbranches, - external): + external, activebookmark): 'Store the current status to allow recovery' f = repo.opener("rebasestate", "w") 
f.write(repo[originalwd].hex() + '\n') @@ -514,6 +527,7 @@ f.write('%d\n' % int(collapse)) f.write('%d\n' % int(keep)) f.write('%d\n' % int(keepbranches)) + f.write('%s\n' % (activebookmark or '')) for d, v in state.iteritems(): oldrev = repo[d].hex() if v > nullmerge: @@ -534,6 +548,7 @@ target = None collapse = False external = nullrev + activebookmark = None state = {} f = repo.opener("rebasestate") for i, l in enumerate(f.read().splitlines()): @@ -549,6 +564,10 @@ keep = bool(int(l)) elif i == 5: keepbranches = bool(int(l)) + elif i == 6 and not (len(l) == 81 and ':' in l): + # line 6 is a recent addition, so for backwards compatibility + # check that the line doesn't look like the oldrev:newrev lines + activebookmark = l else: oldrev, newrev = l.split(':') if newrev in (str(nullmerge), str(revignored)): @@ -566,7 +585,7 @@ repo.ui.debug('computed skipped revs: %s\n' % skipped) repo.ui.debug('rebase status resumed\n') return (originalwd, target, state, skipped, - collapse, keep, keepbranches, external) + collapse, keep, keepbranches, external, activebookmark) except IOError, err: if err.errno != errno.ENOENT: raise @@ -681,8 +700,8 @@ # If we have multiple roots, we may have "hole" in the rebase set. # Rebase roots that descend from those "hole" should not be detached as # other root are. We use the special `revignored` to inform rebase that - # the revision should be ignored but that `defineparent` should search - # a rebase destination that make sense regarding rebaset topology. + # the revision should be ignored but that `defineparents` should search + # a rebase destination that make sense regarding rebased topology. rebasedomain = set(repo.revs('%ld::%ld', rebaseset, rebaseset)) for ignored in set(rebasedomain) - set(rebaseset): state[ignored] = revignored diff -r 0890e6fd3e00 -r 838c6b72928d hgext/record.py --- a/hgext/record.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/record.py Tue May 14 23:04:23 2013 +0400 @@ -76,7 +76,7 @@ if m: yield 'range', m.groups() else: - raise patch.PatchError('unknown patch content: %r' % line) + yield 'other', line class header(object): """patch header @@ -228,6 +228,9 @@ self.headers.append(h) self.header = h + def addother(self, line): + pass # 'other' lines are ignored + def finished(self): self.addcontext([]) return self.headers @@ -239,12 +242,14 @@ 'range': addrange}, 'context': {'file': newfile, 'hunk': addhunk, - 'range': addrange}, + 'range': addrange, + 'other': addother}, 'hunk': {'context': addcontext, 'file': newfile, 'range': addrange}, 'range': {'context': addcontext, 'hunk': addhunk}, + 'other': {'other': addother}, } p = parser() @@ -531,7 +536,11 @@ fp.seek(0) # 1. 
filter patch, so we have intending-to apply subset of it - chunks = filterpatch(ui, parsepatch(fp)) + try: + chunks = filterpatch(ui, parsepatch(fp)) + except patch.PatchError, err: + raise util.Abort(_('error parsing patch: %s') % err) + del fp contenders = set() diff -r 0890e6fd3e00 -r 838c6b72928d hgext/relink.py --- a/hgext/relink.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/relink.py Tue May 14 23:04:23 2013 +0400 @@ -41,7 +41,7 @@ if (not util.safehasattr(util, 'samefile') or not util.safehasattr(util, 'samedevice')): raise util.Abort(_('hardlinks are not supported on this system')) - src = hg.repository(ui, ui.expandpath(origin or 'default-relink', + src = hg.repository(repo.baseui, ui.expandpath(origin or 'default-relink', origin or 'default')) ui.status(_('relinking %s to %s\n') % (src.store.path, repo.store.path)) if repo.root == src.root: diff -r 0890e6fd3e00 -r 838c6b72928d hgext/schemes.py --- a/hgext/schemes.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/schemes.py Tue May 14 23:04:23 2013 +0400 @@ -62,7 +62,10 @@ def instance(self, ui, url, create): # Should this use the util.url class, or is manual parsing better? - url = url.split('://', 1)[1] + try: + url = url.split('://', 1)[1] + except IndexError: + raise util.Abort(_("no '://' in scheme url '%s'") % url) parts = url.split('/', self.parts) if len(parts) > self.parts: tail = parts[-1] diff -r 0890e6fd3e00 -r 838c6b72928d hgext/share.py --- a/hgext/share.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/share.py Tue May 14 23:04:23 2013 +0400 @@ -59,7 +59,7 @@ lock and lock.release() # update store, spath, sopener and sjoin of repo - repo.__init__(ui, repo.root) + repo.__init__(repo.baseui, repo.root) cmdtable = { "share": diff -r 0890e6fd3e00 -r 838c6b72928d hgext/transplant.py --- a/hgext/transplant.py Sun May 12 15:35:53 2013 +0400 +++ b/hgext/transplant.py Tue May 14 23:04:23 2013 +0400 @@ -7,7 +7,8 @@ '''command to transplant changesets from another branch -This extension allows you to transplant patches from another branch. +This extension allows you to transplant changes to another parent revision, +possibly in another repository. The transplant is done using 'diff' patches. Transplanted patches are recorded in .hg/transplant/transplants, as a map from a changeset hash to its hash in the source repository. 
@@ -294,10 +295,10 @@ return n - def resume(self, repo, source, opts=None): + def resume(self, repo, source, opts): '''recover last transaction and apply remaining changesets''' if os.path.exists(os.path.join(self.path, 'journal')): - n, node = self.recover(repo) + n, node = self.recover(repo, source, opts) self.ui.status(_('%s transplanted as %s\n') % (short(node), short(n))) seriespath = os.path.join(self.path, 'series') @@ -312,7 +313,7 @@ self.apply(repo, source, revmap, merges, opts) - def recover(self, repo): + def recover(self, repo, source, opts): '''commit working directory using journal metadata''' node, user, date, message, parents = self.readlog() merge = False @@ -492,10 +493,9 @@ return (transplants, merges) @command('transplant', - [('s', 'source', '', _('pull patches from REPO'), _('REPO')), - ('b', 'branch', [], - _('pull patches from branch BRANCH'), _('BRANCH')), - ('a', 'all', None, _('pull all changesets up to BRANCH')), + [('s', 'source', '', _('transplant changesets from REPO'), _('REPO')), + ('b', 'branch', [], _('use this source changeset as head'), _('REV')), + ('a', 'all', None, _('pull all changesets up to the --branch revisions')), ('p', 'prune', [], _('skip over REV'), _('REV')), ('m', 'merge', [], _('merge at REV'), _('REV')), ('', 'parent', '', @@ -503,7 +503,7 @@ ('e', 'edit', False, _('invoke editor on commit messages')), ('', 'log', None, _('append transplant info to log message')), ('c', 'continue', None, _('continue last transplant session ' - 'after repair')), + 'after fixing conflicts')), ('', 'filter', '', _('filter changesets through command'), _('CMD'))], _('hg transplant [-s REPO] [-b BRANCH [-a]] [-p REV] ' @@ -513,9 +513,13 @@ Selected changesets will be applied on top of the current working directory with the log of the original changeset. The changesets - are copied and will thus appear twice in the history. Use the - rebase extension instead if you want to move a whole branch of - unpublished changesets. + are copied and will thus appear twice in the history with different + identities. + + Consider using the graft command if everything is inside the same + repository - it will use merges and will usually give a better result. + Use the rebase extension if the changesets are unpublished and you want + to move them instead of copying them. If --log is specified, log messages will have a comment appended of the form:: @@ -526,16 +530,19 @@ Its argument will be invoked with the current changelog message as $1 and the patch as $2. - If --source/-s is specified, selects changesets from the named - repository. If --branch/-b is specified, selects changesets from - the branch holding the named revision, up to that revision. If - --all/-a is specified, all changesets on the branch will be - transplanted, otherwise you will be prompted to select the - changesets you want. + --source/-s specifies another repository to use for selecting changesets, + just as if it temporarily had been pulled. + If --branch/-b is specified, these revisions will be used as + heads when deciding which changsets to transplant, just as if only + these revisions had been pulled. + If --all/-a is specified, all the revisions up to the heads specified + with --branch will be transplanted. - :hg:`transplant --branch REV --all` will transplant the - selected branch (up to the named revision) onto your current - working directory. 
+ Example: + + - transplant all changes up to REV on top of your current revision:: + + hg transplant --branch REV --all You can optionally mark selected transplanted changesets as merge changesets. You will not be prompted to transplant any ancestors @@ -557,13 +564,16 @@ if match(node): yield node - def transplantwalk(repo, root, branches, match=util.always): - if not branches: - branches = repo.heads() + def transplantwalk(repo, dest, heads, match=util.always): + '''Yield all nodes that are ancestors of a head but not ancestors + of dest. + If no heads are specified, the heads of repo will be used.''' + if not heads: + heads = repo.heads() ancestors = [] - for branch in branches: - ancestors.append(repo.changelog.ancestor(root, branch)) - for node in repo.changelog.nodesbetween(ancestors, branches)[0]: + for head in heads: + ancestors.append(repo.changelog.ancestor(dest, head)) + for node in repo.changelog.nodesbetween(ancestors, heads)[0]: if match(node): yield node @@ -571,11 +581,11 @@ if opts.get('continue'): if opts.get('branch') or opts.get('all') or opts.get('merge'): raise util.Abort(_('--continue is incompatible with ' - 'branch, all or merge')) + '--branch, --all and --merge')) return if not (opts.get('source') or revs or opts.get('merge') or opts.get('branch')): - raise util.Abort(_('no source URL, branch tag or revision ' + raise util.Abort(_('no source URL, branch revision or revision ' 'list provided')) if opts.get('all'): if not opts.get('branch'): @@ -608,12 +618,12 @@ sourcerepo = opts.get('source') if sourcerepo: peer = hg.peer(repo, opts, ui.expandpath(sourcerepo)) - branches = map(peer.lookup, opts.get('branch', ())) + heads = map(peer.lookup, opts.get('branch', ())) source, csets, cleanupfn = bundlerepo.getremotechanges(ui, repo, peer, - onlyheads=branches, force=True) + onlyheads=heads, force=True) else: source = repo - branches = map(source.lookup, opts.get('branch', ())) + heads = map(source.lookup, opts.get('branch', ())) cleanupfn = None try: @@ -623,8 +633,8 @@ tf = tp.transplantfilter(repo, source, p1) if opts.get('prune'): - prune = [source.lookup(r) - for r in scmutil.revrange(source, opts.get('prune'))] + prune = set(source.lookup(r) + for r in scmutil.revrange(source, opts.get('prune'))) matchfn = lambda x: tf(x) and x not in prune else: matchfn = tf @@ -637,7 +647,7 @@ if source != repo: alltransplants = incwalk(source, csets, match=matchfn) else: - alltransplants = transplantwalk(source, p1, branches, + alltransplants = transplantwalk(source, p1, heads, match=matchfn) if opts.get('all'): revs = alltransplants diff -r 0890e6fd3e00 -r 838c6b72928d i18n/de.po --- a/i18n/de.po Sun May 12 15:35:53 2013 +0400 +++ b/i18n/de.po Tue May 14 23:04:23 2013 +0400 @@ -21,7 +21,7 @@ "Project-Id-Version: Mercurial\n" "Report-Msgid-Bugs-To: \n" "POT-Creation-Date: 2012-09-01 14:20+0200\n" -"PO-Revision-Date: 2012-09-10 09:39+0100\n" +"PO-Revision-Date: 2013-02-05 22:00+0100\n" "Last-Translator: Martin Schröder \n" "Language-Team: \n" "Language: de\n" @@ -8105,7 +8105,7 @@ msgstr "" " j - Diese Änderung übernehmen\n" " n - Diese Änderung überspringen\n" -" b - Diese Änderung manuell bearbeiten" +" e - Diese Änderung manuell bearbeiten" msgid "" " s - skip remaining changes to this file\n" diff -r 0890e6fd3e00 -r 838c6b72928d i18n/el.po --- a/i18n/el.po Sun May 12 15:35:53 2013 +0400 +++ b/i18n/el.po Tue May 14 23:04:23 2013 +0400 @@ -1,8 +1,8 @@ # Greek translations for Mercurial # Ελληνική μετάφραση των μηνυμάτων του Mercurial -# +# # Copyright (C) 2009 Matt Mackall και 
άλλοι -# +# msgid "" msgstr "" "Project-Id-Version: Mercurial\n" @@ -14181,7 +14181,7 @@ msgstr "" msgid "&Delete" -msgstr "&Διαγραφή" +msgstr "" #, python-format msgid "" @@ -14190,7 +14190,7 @@ msgstr "" msgid "&Deleted" -msgstr "&Διαγράφηκε" +msgstr "" #, python-format msgid "update failed to remove %s: %s!\n" @@ -14422,7 +14422,7 @@ msgstr "" msgid "&Remote" -msgstr "&Απομακρυσμένο:" +msgstr "" #, python-format msgid "" diff -r 0890e6fd3e00 -r 838c6b72928d i18n/fr.po --- a/i18n/fr.po Sun May 12 15:35:53 2013 +0400 +++ b/i18n/fr.po Tue May 14 23:04:23 2013 +0400 @@ -1,17 +1,17 @@ # French translations for Mercurial # Traductions françaises de Mercurial # Copyright (C) 2009 Matt Mackall and others -# +# # Quelques règles : # - dans l'aide d'une commande, la première ligne descriptive # commence par un verbe au présent sans majuscule # - dans l'aide d'une commande, la description des options # utilise un verbe à l'infinitif -# +# # Note : la terminologie ci-dessous est loin d'être complète, figée ou # parfaite. À compléter et à améliorer, particulièrement les # termes comportant un point d'interrogation... -# +# # Dictionnaire de termes courants : # - to apply a patch appliquer un patch # - a branch une branche @@ -46,7 +46,7 @@ # untracked non suivi, non géré, pas sous contrôle du dépôt, # hors révision ? # - the working directory le répertoire de travail -# +# # Termes très courants repris de l'anglais - à utiliser sans guillemets # pour ne pas alourdir inutilement la traduction : # - a diff un diff ? (ou également un patch ? synonymes...) @@ -54,7 +54,7 @@ # - a patch un patch # - a tag un tag # - to tag taguer -# +# # Termes anglais avec une signification très spécifique, difficile à # paraphraser sans alourdir ou perdre le sens - à utiliser avec guillemets : # - a bundle un \"bundle\" @@ -62,7 +62,7 @@ # - a changeset un \"changeset\" # - a changegroup un \"changegroup\" # - the tip la révision \"tip\" -# +# # Termes dont le classement / la traduction sont à déterminer : # - a commit un commit, un \"commit\" # - to commit \"committer\" ? (beuark, même dit tous les jours) @@ -73,7 +73,7 @@ # - to push propager ? (utilisé par svn pour commit) # publier ? pousser ?? envoyer ?? # - the series file (mq) ? -# +# # Notes : # - (cédric) je verrais bien l'ajout d'une rubrique générale dans l'aide # (par exemple 'hg help glossary') librement remplissable par chaque équipe @@ -81,7 +81,7 @@ # qui vont être rencontrés dans mercurial - et en particulier permettrait # de faire le lien avec des termes franglisants parfois utilisés # (par ex. fusionner = "merger", etc.) ... ? 
-# +# msgid "" msgstr "" "Project-Id-Version: Mercurial\n" diff -r 0890e6fd3e00 -r 838c6b72928d i18n/it.po --- a/i18n/it.po Sun May 12 15:35:53 2013 +0400 +++ b/i18n/it.po Tue May 14 23:04:23 2013 +0400 @@ -6,7 +6,7 @@ "Project-Id-Version: Mercurial\n" "Report-Msgid-Bugs-To: \n" "POT-Creation-Date: 2011-03-22 22:04+0100\n" -"PO-Revision-Date: 2011-03-15 17:05+0100\n" +"PO-Revision-Date: 2013-04-05 14:47+0100\n" "Last-Translator: Stefano Tortarolo \n" "Language-Team: Italian \n" "Language: it\n" @@ -14,6 +14,7 @@ "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" "Plural-Forms: nplurals=2; plural=(n != 1);\n" +"X-Generator: Poedit 1.5.5\n" #, python-format msgid " (default: %s)" @@ -1862,13 +1863,13 @@ "used.\n" msgstr "" -#, fuzzy, python-format +#, python-format msgid "%s should not have CRLF line endings" -msgstr " %s in %s non dovrebbe avere %s fine riga" - -#, fuzzy, python-format +msgstr "%s non dovrebbe avere fine riga CRLF" + +#, python-format msgid "%s should not have LF line endings" -msgstr " %s in %s non dovrebbe avere %s fine riga" +msgstr "%s non dovrebbe avere fine riga LF" #, python-format msgid "warning: ignoring .hgeol file due to parse error at %s: %s\n" @@ -5439,17 +5440,15 @@ msgid "hardlinks are not supported on this system" msgstr "hardlink non supportati su questo sistema" -#, fuzzy msgid "must specify local origin repository" -msgstr "%s non è un repository locale Mercurial" +msgstr "è necessario specificare il repository di origine locale" #, python-format msgid "relinking %s to %s\n" msgstr "sto ricollegando %s a %s\n" -#, fuzzy msgid "there is nothing to relink\n" -msgstr "non c'è nulla di cui fare il merge" +msgstr "non c'è nulla di cui fare il relink\n" #, python-format msgid "tip has %d files, estimated total number of files: %s\n" @@ -5649,7 +5648,7 @@ msgstr "nessuna opzione\n" msgid "transplant changesets from another branch" -msgstr "" +msgstr "trapianta changeset da un'altro branch" msgid "" " Selected changesets will be applied on top of the current working\n" @@ -6023,11 +6022,11 @@ msgstr "il nome '%s' è riservato" msgid "options --message and --logfile are mutually exclusive" -msgstr "" +msgstr "le opzioni --message e --logfile sono mutualmente esclusive" #, python-format msgid "can't read commit message '%s': %s" -msgstr "" +msgstr "impossibile leggere il messaggio di commit '%s': %s" msgid "limit must be a positive integer" msgstr "limit dev'essere un intero positivo " @@ -6116,7 +6115,7 @@ msgstr "(considera di usare --after)\n" msgid "child process failed to start" -msgstr "avvio fallito del processo figlio" +msgstr "fallito l'avvio del processo figlio" #, python-format msgid "changeset: %d:%s\n" @@ -6128,7 +6127,7 @@ #, python-format msgid "bookmark: %s\n" -msgstr "segnalibro: %s\n" +msgstr "segnalibro: %s\n" #, python-format msgid "tag: %s\n" @@ -6161,11 +6160,11 @@ #, python-format msgid "files: %s\n" -msgstr "file: %s\n" +msgstr "file: %s\n" #, python-format msgid "copies: %s\n" -msgstr "copie: %s\n" +msgstr "copie: %s\n" #, python-format msgid "extra: %s=%s\n" @@ -13379,13 +13378,13 @@ msgid "not removing repo %s because it has changes.\n" msgstr "non rimuovo il repository %s in quanto contiene modifiche\n" -#, fuzzy, python-format +#, python-format msgid "cloning subrepo %s\n" -msgstr "clonazione in corso del subrepository %s da %s\n" - -#, fuzzy, python-format +msgstr "clonazione in corso del subrepository %s\n" + +#, python-format msgid "pulling subrepo %s\n" -msgstr "pull in corso del subrepository %s da %s\n" 
+msgstr "pull in corso del subrepository %s\n" #, python-format msgid "revision %s does not exist in subrepo %s\n" @@ -13463,7 +13462,7 @@ #, python-format msgid "template file %s: %s" -msgstr "" +msgstr "file template %s: %s" msgid "cannot use transaction when it is already committed/aborted" msgstr "" diff -r 0890e6fd3e00 -r 838c6b72928d i18n/ja.po --- a/i18n/ja.po Sun May 12 15:35:53 2013 +0400 +++ b/i18n/ja.po Tue May 14 23:04:23 2013 +0400 @@ -1,7 +1,7 @@ # Japanese translation for Mercurial # Mercurial 日本語翻訳 # -# Copyright (C) 2009-2012 the Mercurial team +# Copyright (C) 2009-2013 the Mercurial team # # ======================================== # 【翻訳用語集】 @@ -50,11 +50,13 @@ # backout 打ち消し # basename パス名末尾要素 # binary バイナリ +# bisect 二分探索 # branch ブランチ # bundle( file) バンドルファイル # change チェンジセット/差分 # changegroup( file) バンドルファイル # changeset リビジョン (or 「チェンジセット」/「変更内容」) +# changeset description コミットログ # changeset hash ハッシュ値 # changeset header ヘッダ情報 # changeset log コミットログ @@ -64,9 +66,11 @@ # commit コミット # commit comment コミットログ # commit message コミットログ +# commit text コミットログ # copy(of file, repo) 複製 # default(, by) 指定が無い場合/通常は # delete (作業領域からの)ファイル削除 +# description(, changeset) コミットログ # diff 差分 # directory ディレクトリ # dirstate dirstate @@ -94,12 +98,14 @@ # lock ロック # manifest マニフェスト or 管理対象(一覧) # merge マージ +# modify(modified) 変更(ファイル/作業領域)、改変(リビジョン) # must(A must B) A は B してください # node リビジョン # note 備考 # patch パッチ # platform 稼働環境 # pop(patch) (パッチの)適用解除 +# pruned xxxxx (obsoleted with no successors) xxxxxx # pull (追加リビジョンの)取り込み # push (追加リビジョンの)反映 # push(patch) (パッチの)適用 @@ -118,10 +124,13 @@ # server サーバ # source url (of subrepo) (サブリポジトリの)参照先 URL # subrepo サブリポジトリ +# successor (changeset) 後継リビジョン +# successors set 後継セット # summary 要約(情報) # support(, not) (未)サポート # support(, un) (未)サポート # tag タグ +# topological xxxx 構造的 # tracked xxxx 構成管理対象の xxxx # tracked, un 未登録 # type, xxxxx xxxx 種別 @@ -135,7 +144,7 @@ msgstr "" "Project-Id-Version: Mercurial\n" "Report-Msgid-Bugs-To: \n" -"POT-Creation-Date: 2012-11-30 17:45+0900\n" +"POT-Creation-Date: 2013-04-30 18:34+0900\n" "PO-Revision-Date: 2009-11-16 21:24+0100\n" "Last-Translator: Japanese translation team \n" "Language-Team: Japanese\n" @@ -602,6 +611,53 @@ msgstr "" "acl: ユーザ \"%s\" はファイル \"%s\" が許可されていません(リビジョン \"%s\")" +msgid "log repository events to a blackbox for debugging" +msgstr "" + +msgid "" +"Logs event information to .hg/blackbox.log to help debug and diagnose " +"problems.\n" +"The events that get logged can be configured via the blackbox.track config " +"key.\n" +"Examples:" +msgstr "" + +msgid "" +" [blackbox]\n" +" track = *" +msgstr "" + +msgid "" +" [blackbox]\n" +" track = command, commandfinish, commandexception, exthook, pythonhook" +msgstr "" + +msgid "" +" [blackbox]\n" +" track = incoming" +msgstr "" + +msgid "" +" [blackbox]\n" +" # limit the size of a log file\n" +" maxsize = 1.5 MB\n" +" # rotate up to N log files when the current one gets too big\n" +" maxfiles = 3" +msgstr "" + +msgid "the number of events to show" +msgstr "イベント表示数" + +msgid "hg blackbox [OPTION]..." +msgstr "hg blackbox [OPTION]..." + +msgid "" +"view the recent repository events\n" +" " +msgstr "" +"最新のリポジトリイベントの表示\n" +" " + msgid "hooks for integrating with the Bugzilla bug tracker" msgstr "Bugzilla バグ管理システムとの連携用フック集" @@ -1642,6 +1698,10 @@ msgid "ignoring unknown color/effect %r (configured in color.%s)\n" msgstr "未知の色/効果指定 %r を無視(color.%s で設定記述)\n" +#. i18n: "label" is a keyword +msgid "label expects two arguments" +msgstr "label の引数は2つです" + #. 
i18n: 'always', 'auto', and 'never' are keywords and should #. not be translated msgid "when to colorize (boolean, always, auto, or never)" @@ -1741,7 +1801,15 @@ " supported by Mercurial sources." msgstr "" " --sourcesort 変換元のリビジョン順序を維持します。 変換元形式が\n" -" Mercurial でのみサポートされています。" +" Mercurial の場合のみサポートされます。" + +msgid "" +" --closesort try to move closed revisions as close as possible\n" +" to parent branches, only supported by Mercurial\n" +" sources." +msgstr "" +" --closesort 閉鎖実施リビジョンを、 極力親ブランチ傍に移動します。\n" +" 変換元形式が Mercurial の場合のみサポートされます。" msgid "" " If ``REVMAP`` isn't given, it will be put in a default location\n" @@ -1970,8 +2038,8 @@ " :convert.cvsps.cache: Set to False to disable remote log caching,\n" " for testing and debugging purposes. Default is True." msgstr "" -" :convert.cvsps.cache: リモートログのキャッシュを抑止します\n" -" (試験およびデバッグ用)。 デフォルト値は True。" +" :convert.cvsps.cache: (試験およびデバッグ用) False 設定により、\n" +" リモートログのキャッシュを抑止します。 デフォルト値は True。" msgid "" " :convert.cvsps.fuzz: Specify the maximum time (in seconds) that is\n" @@ -2011,22 +2079,31 @@ " デフォルト値は ``{{mergefrombranch ([-\\w]+)}}``" msgid "" -" :hook.cvslog: Specify a Python function to be called at the end of\n" +" :convert.localtimezone: use local time (as determined by the TZ\n" +" environment variable) for changeset date/times. The default\n" +" is False (use UTC)." +msgstr "" +" :convert.localtimezone: 新規リビジョンの作成日時情報に、 実行環境の、\n" +" タイムゾーンを使用します (TZ 環境変数から推定)。 デフォルト値は\n" +" False です (UTC として扱います)。" + +msgid "" +" :hooks.cvslog: Specify a Python function to be called at the end of\n" " gathering the CVS log. The function is passed a list with the\n" " log entries, and can modify the entries in-place, or add or\n" " delete them." msgstr "" -" :hook.cvslog: CVS のログ収集処理後に呼ばれる Python 関数。\n" +" :hooks.cvslog: CVS のログ収集処理後に呼ばれる Python 関数。\n" " 関数呼び出しの際には、 ログエントリの一覧が渡され、\n" " 一覧要素の改変や、 追加/削除を、 直接実施できます。" msgid "" -" :hook.cvschangesets: Specify a Python function to be called after\n" +" :hooks.cvschangesets: Specify a Python function to be called after\n" " the changesets are calculated from the CVS log. The\n" " function is passed a list with the changeset entries, and can\n" " modify the changesets in-place, or add or delete them." msgstr "" -" :hook.cvschangesets: CVS ログからのリビジョン算出完了後に呼ばれる\n" +" :hooks.cvschangesets: CVS ログからのリビジョン算出完了後に呼ばれる\n" " Python 関数。 関数呼び出しの際には、 リビジョン一覧が渡され、\n" " リビジョンの改変や、 追加/削除を、 直接実施できます。" @@ -2224,13 +2301,16 @@ msgstr "変換時のブランチ名変換用ファイル" msgid "try to sort changesets by branches" -msgstr "ブランチによるリビジョンの並び替えを試す" +msgstr "ブランチによるリビジョンの並び替え" msgid "try to sort changesets by date" -msgstr "日付によるリビジョンの並び替えを試す" +msgstr "日付によるリビジョンの並び替え" msgid "preserve source changesets order" -msgstr "元リポジトリでのリビジョンの並び順を尊重" +msgstr "元リポジトリでのリビジョン順を尊重" + +msgid "try to reorder closed revisions" +msgstr "閉鎖実施リビジョン群の並び替え" msgid "hg convert [OPTION]... SOURCE [DEST [REVMAP]]" msgstr "hg convert [OPTION]... 
SOURCE [DEST [REVMAP]]" @@ -2298,7 +2378,7 @@ #, python-format msgid "%s is not available in %s anymore" -msgstr "%s は %s において存在しません" +msgstr "%s は %s に存在しません" #, python-format msgid "%s.%s symlink has no target" @@ -2418,6 +2498,9 @@ msgid "--sourcesort is not supported by this data source" msgstr "指定の変換元は --sourcesort が未サポートです" +msgid "--closesort is not supported by this data source" +msgstr "指定の変換元は --closesort が未サポートです" + #, python-format msgid "%s does not look like a CVS checkout" msgstr "%s は CVS 作業領域ではないと思われます" @@ -2463,6 +2546,9 @@ msgid "reading cvs log cache %s\n" msgstr "CVS ログキャッシュ %s 読み込み中\n" +msgid "ignoring old cache\n" +msgstr "古いログキャッシュを無視します\n" + #, python-format msgid "cache has %d log entries\n" msgstr "キャッシュには %d 件のログエントリがあります\n" @@ -2573,6 +2659,10 @@ msgstr "%r オブジェクトが %s から読み込めません" #, python-format +msgid "cannot read submodules config file in %s" +msgstr "%s におけるサブモジュールの設定ファイルが読み込めません" + +#, python-format msgid "cannot read changes in %s" msgstr "%s の変更を読み込めません" @@ -2708,8 +2798,8 @@ "svn: cannot probe remote repository, assume it could be a subversion " "repository. Use --source-type if you know better.\n" msgstr "" -"svn: subversion の連携先リポジトリの確認に失敗しました。 --source-type の使用" -"を検討してください。\n" +"svn: 連携先リポジトリの確認に失敗しました。連携先を subversion リポジトリと仮" +"定します。他の形式の場合は --source-type を使用してください。\n" #, python-format msgid "%s does not look like a Subversion repository" @@ -3038,7 +3128,7 @@ msgstr "" "本エクステンションは、 リビジョン間、 ないしリビジョンと作業領域の間で、\n" "差分表示を行う際に、 外部コマンドを利用可能にします。 外部コマンドは、\n" -"設定で改変可能なオプション群と、 2つの引数 (比較対象ファイルを格納した、\n" +"設定で変更可能なオプション群と、 2つの引数 (比較対象ファイルを格納した、\n" "スナップショットディレクトリへのパス) を使って起動されます。" msgid "" @@ -3550,9 +3640,6 @@ msgid "do not display revision or any of its ancestors" msgstr "当該リビジョンとその祖先の表示を抑止" -msgid "show hidden changesets (DEPRECATED)" -msgstr "隠れたリビジョンの表示 (DEPRECATED)" - msgid "[OPTION]... [FILE]" msgstr "[OPTION]... 
[FILE]" @@ -3889,7 +3976,7 @@ " pick 7c2fd3b9020c Add delta" msgid "" -" # Edit history between 633536316234 and 7c2fd3b9020c\n" +" # Edit history between c561b4e977df and 7c2fd3b9020c\n" " #\n" " # Commands:\n" " # p, pick = use commit\n" @@ -3900,7 +3987,7 @@ " # m, mess = edit message without changing commit content\n" " #" msgstr "" -" # 633536316234 から 7c2fd3b9020c にかけての履歴の編集\n" +" # c561b4e977df から 7c2fd3b9020c にかけての履歴の編集\n" " #\n" " # 指定可能コマンド:\n" " # p, pick = リビジョンを採用\n" @@ -3908,7 +3995,7 @@ " # f, fold = リビジョンを採用: 但し直前のリビジョンに併合\n" " # (このリビジョンが N 番目なら、N - 1 番目に併合)\n" " # d, drop = リビジョンを破棄\n" -" # m, mess = 改変内容を維持しつつ、コミットログを修正\n" +" # m, mess = 変更内容を維持しつつ、コミットログを修正\n" " #" msgid "" @@ -3921,8 +4008,8 @@ msgstr "" "このファイル中の ``#`` で始まる行は無視されます。 履歴編集対象に対して、\n" "各リビジョン毎の処理内容 (rule) を指定してください。 例えば \"Add beta\"\n" -"による改変よりも \"Add gamma\" による改変を先に実施した上で、 \"Add\n" -"delta\" による改変を \"Add beta\" へと併合 (fold) する場合なら、\n" +"による変更よりも \"Add gamma\" による変更を先に実施した上で、 \"Add\n" +"delta\" による変更を \"Add beta\" へと併合 (fold) する場合なら、\n" "以下のように記述します::" msgid "" @@ -4050,11 +4137,10 @@ "履歴は改変前の状態に戻ります。" msgid "" -"If we clone the example repository above and add three more changes, such " -"that\n" -"we have the following history::" -msgstr "" -"上記の実行例におけるリポジトリを複製し、 そこで履歴を3つ追加した結果、\n" +"If we clone the histedit-ed example repository above and add four more\n" +"changes, such that we have the following history::" +msgstr "" +"改変済みの実行例リポジトリを複製し、 そこで履歴を4つ追加した結果、\n" "以下の様な履歴になったものと仮定します::" msgid "" @@ -4124,16 +4210,16 @@ "# m, mess = edit message without changing commit content\n" "#\n" msgstr "" -" # %s から %s にかけての履歴の編集\n" -" #\n" -" # 指定可能コマンド:\n" -" # p, pick = リビジョンを採用\n" -" # e, edit = リビジョンを採用: 但し修正のために一旦実行を中断\n" -" # f, fold = リビジョンを採用: 但し直前のリビジョンに併合\n" -" # (このリビジョンが N 番目なら、N - 1 番目に併合)\n" -" # d, drop = リビジョンを破棄\n" -" # m, mess = 改変内容を維持しつつ、コミットログを修正\n" -" #\n" +"# %s から %s にかけての履歴の編集\n" +"#\n" +"# 指定可能コマンド:\n" +"# p, pick = リビジョンを採用\n" +"# e, edit = リビジョンを採用: 但し修正のために一旦実行を中断\n" +"# f, fold = リビジョンを採用: 但し直前のリビジョンに併合\n" +"# (このリビジョンが N 番目なら、N - 1 番目に併合)\n" +"# d, drop = リビジョンを破棄\n" +"# m, mess = 変更内容を維持しつつ、コミットログを修正\n" +"#\n" msgid "Fix up the change and run hg histedit --continue" msgstr "衝突解消後に \"hg histedit --continue\" してください" @@ -4153,6 +4239,13 @@ msgid "%s: empty changeset" msgstr "%s: 空のリビジョン" +#, python-format +msgid "comparing with %s\n" +msgstr "%s と比較中\n" + +msgid "no outgoing ancestors" +msgstr "反映候補リビジョンがありません" + msgid "Read history edits from the specified file." 
msgstr "履歴改変手順を指定ファイルから読み込み" @@ -4187,13 +4280,6 @@ msgid "source has mq patches applied" msgstr "元リポジトリでは MQ パッチが適用中です" -msgid "only one repo argument allowed with --outgoing" -msgstr "--outgoing 指定時には、引数は1つしか指定できません" - -#, python-format -msgid "comparing with %s\n" -msgstr "%s と比較中\n" - msgid "--force only allowed with --outgoing" msgstr "--outgoing 指定時のみ --force を指定可能です" @@ -4206,15 +4292,18 @@ msgid "history edit already in progress, try --continue or --abort" msgstr "履歴改変は継続中です。 --continue または --abort を指定してください" +msgid "no revisions allowed with --outgoing" +msgstr "--outgoing とリビジョン指定は併用できません" + +msgid "only one repo argument allowed with --outgoing" +msgstr "--outgoing 指定時には、引数は1つしか指定できません" + msgid "histedit requires exactly one parent revision" msgstr "履歴改変には単一の親リビジョンを指定してください" -msgid "nothing to edit\n" -msgstr "改変の必要なリビジョンがありません\n" - -#, python-format -msgid "working directory parent is not a descendant of %s" -msgstr "作業領域の親リビジョンは %s の子孫ではありません" +#, python-format +msgid "%s is not an ancestor of working directory" +msgstr "%s は作業領域の祖先ではありません" #, python-format msgid "update to %s or descendant and run \"hg histedit --continue\" again" @@ -4228,25 +4317,33 @@ msgid "cannot edit immutable changeset: %s" msgstr "改変不能なリビジョンがあります: %s" -msgid "must specify a rule for each changeset once" -msgstr "1リビジョン毎に1つのルール指定が必要です" - #, python-format msgid "malformed line \"%s\"" msgstr "不正な行 \"%s\"" -msgid "may not use changesets other than the ones listed" -msgstr "対象範囲以外のリビジョンは指定できません" - #, python-format msgid "unknown changeset %s listed" msgstr "未知のリビジョン %s が指定されました" +msgid "may not use changesets other than the ones listed" +msgstr "対象範囲以外のリビジョンは指定できません" + +#, python-format +msgid "duplicated command for changeset %s" +msgstr "リビジョン %s へのルール指定が複数あります" + #, python-format msgid "unknown action \"%s\"" msgstr "未知の操作 \"%s\" が指定されました" #, python-format +msgid "missing rules for changeset %s" +msgstr "リビジョン %s へのルール指定がありません" + +msgid "do you want to use the drop action?" +msgstr "リビジョンの破棄には drop コマンド指定が必要です" + +#, python-format msgid "histedit: moving bookmarks %s from %s to %s\n" msgstr "histedit: ブックマーク %s を %s から %s に移動中\n" @@ -4396,12 +4493,6 @@ msgid "cannot start: socket is already bound" msgstr "ソケットが既にバインドされているため開始できません" -msgid "" -"cannot start: tried linking .hg/inotify.sock to a temporary socket but .hg/" -"inotify.sock already exists" -msgstr "" -"一時ソケットに使用する .hg/inotify.sock が既に存在するため開始できません" - #, python-format msgid "answering query for %r\n" msgstr "%r への問い合わせに返答中\n" @@ -4476,7 +4567,7 @@ "Keywords expand to the changeset data pertaining to the latest change\n" "relative to the working directory parent of each file." msgstr "" -"作業領域の各ファイルに対する直近の更新内容を使用して、\n" +"作業領域の各ファイルに対する直近の変更内容を使用して、\n" "キーワードの展開が行われます" msgid "" @@ -4570,7 +4661,7 @@ msgstr "" "複数行に渡る展開や、 CVS の $Log$ のような増加する内容の展開は\n" "未サポートです。 キーワードテンプレート設定 \"Log = {desc}\" は、\n" -"コミットメッセージの最初の一行を埋め込みます。\n" +"コミットログの最初の一行を埋め込みます。\n" #, python-format msgid "overwriting %s expanding keywords\n" @@ -4850,24 +4941,64 @@ "enabled for this to work." msgstr "" "連携先リポジトリに反映しようとするリビジョンが、 大容量ファイルに対して、\n" -"追加/改変を実施している場合、 該当するリビジョンの大容量ファイルが、\n" +"追加/変更を実施している場合、 該当するリビジョンの大容量ファイルが、\n" "連携先に転送されます。 この際、 連携先リポジトリで稼動する Mercurial は、\n" "largefiles エクステンションが有効になっていなければなりません。" msgid "" "When you pull a changeset that affects largefiles from a remote\n" -"repository, Mercurial behaves as normal. 
However, when you update to\n" -"such a revision, any largefiles needed by that revision are downloaded\n" -"and cached (if they have never been downloaded before). This means\n" -"that network access may be required to update to changesets you have\n" -"not previously updated to." -msgstr "" -"連携先リポジトリから取り込まれるリビジョンが、 大容量ファイルに対して、\n" -"影響のあるものである場合、 Mercurial の挙動は通常と変わりません。\n" -"但し、 当該リビジョンでの作業領域更新の際に、 必要な大容量ファイルが、\n" -"転送/キャッシュされます (事前入手されていない場合のみ)。 そのため、\n" -"これまで :hg:`update` 対象になっていないリビジョンを使用する際には、\n" -"ネットワーク接続を必要とする可能性が生じます。" +"repository, the largefiles for the changeset will by default not be\n" +"pulled down. However, when you update to such a revision, any\n" +"largefiles needed by that revision are downloaded and cached (if\n" +"they have never been downloaded before). One way to pull largefiles\n" +"when pulling is thus to use --update, which will update your working\n" +"copy to the latest pulled revision (and thereby downloading any new\n" +"largefiles)." +msgstr "" +"連携先から取り込むリビジョンが、 大容量ファイルに関するものであっても、\n" +"特に指定が無ければ、 大容量ファイルはダウンロードされません。 その一方で、\n" +"大容量ファイルに関係するリビジョンで、 作業領域を更新しようとした場合、\n" +"必要とされる (且つ未取得な) 大容量ファイルのダウンロードと、\n" +"キャッシュ領域への格納が実施されます。 履歴取り込みと同時に、\n" +"大容量ファイルを取得する方法としては、 作業領域を最新リビジョンで更新する\n" +"--update を、 履歴取り込みの際に指定する方法があります。" + +msgid "" +"If you want to pull largefiles you don't need for update yet, then\n" +"you can use pull with the `--lfrev` option or the :hg:`lfpull` command." +msgstr "" +"作業領域更新では必要とされない大容量ファイルも取得したい場合は、\n" +"履歴取り込みの際に `--lfrev` を指定するか、 :hg:`lfpull` を使用します。" + +msgid "" +"If you know you are pulling from a non-default location and want to\n" +"download all the largefiles that correspond to the new changesets at\n" +"the same time, then you can pull with `--lfrev \"pulled()\"`." +msgstr "" +"リビジョン取り込みの際に、 関連する全大容量ファイルを取得したい場合は、\n" +"`--lfrev \"pulled()\"` を指定してください。" + +msgid "" +"If you just want to ensure that you will have the largefiles needed to\n" +"merge or rebase with new heads that you are pulling, then you can pull\n" +"with `--lfrev \"head(pulled())\"` flag to pre-emptively download any " +"largefiles\n" +"that are new in the heads you are pulling." +msgstr "" +"取得対象大容量ファイルを、 取り込まれた新規ヘッドリビジョンのマージや移動\n" +"(rebase) に必要なものだけに限定したい場合は、 `--lfrev \"head(pulled())\"`\n" +"を指定してください。" + +msgid "" +"Keep in mind that network access may now be required to update to\n" +"changesets that you have not previously updated to. The nature of the\n" +"largefiles extension means that updating is no longer guaranteed to\n" +"be a local-only operation." +msgstr "" +"関連する大容量ファイルが未取得な場合は、 作業領域更新であっても、\n" +"ネットワークアクセスが必要になるかもしれない点に留意してください。\n" +"largefiles エクステンション使用時には、 作業領域更新操作であっても、\n" +"作業中のリポジトリに閉じた操作ではない可能性があるのです。" msgid "" "If you already have large files tracked by Mercurial without the\n" @@ -4984,6 +5115,49 @@ " --to-normal を指定します。 変換後リポジトリは、 largefiles\n" " エクステンション無しでも使用できます。" +msgid "pull largefiles for the specified revisions from the specified source" +msgstr "指定リビジョンに関連する大容量ファイルの取り込み" + +msgid "" +" Pull largefiles that are referenced from local changesets but missing\n" +" locally, pulling from a remote repository to the local cache." +msgstr "" +" 作業中リポジトリ中にある指定リビジョンに関連する大容量ファイルのうち、\n" +" 未取得のものを取り込み、 キャッシュ領域に保存します。" + +msgid "" +" If SOURCE is omitted, the 'default' path will be used.\n" +" See :hg:`help urls` for more information." +msgstr "" +" 連携先が省略された場合、 'default' パスが連携先として使用されます。\n" +" 詳細は :hg:`help urls` を参照してください。" + +msgid " .. container:: verbose" +msgstr " .. 
container:: verbose" + +msgid " Some examples:" +msgstr " 例:" + +msgid " - pull largefiles for all branch heads::" +msgstr " - 全名前付きブランチのヘッドに関連する大容量ファイルを取得::" + +msgid " hg lfpull -r \"head() and not closed()\"" +msgstr " hg lfpull -r \"head() and not closed()\"" + +msgid " - pull largefiles on the default branch::" +msgstr " - default ブランチのリビジョンに関連する大容量ファイルを取得::" + +msgid "" +" hg lfpull -r \"branch(default)\"\n" +" " +msgstr "" +" hg lfpull -r \"branch(default)\"\n" +" " + +#, python-format +msgid "error getting id %s from url %s for file %s: %s\n" +msgstr "識別子 %s (連携先 %s のファイル %s) に対するエラー: %s\n" + msgid "getting largefiles" msgstr "大容量ファイルの取得中" @@ -4992,6 +5166,10 @@ msgstr "ファイル %s の取得中:%s\n" #, python-format +msgid "%s: largefile %s not available from %s\n" +msgstr "%s: 大容量ファイル %s は %s に存在しません\n" + +#, python-format msgid "%s: data corruption (expected %s, got %s)\n" msgstr "%s: データ破損を検出 (想定ハッシュ値 %s に対して %s)\n" @@ -5066,15 +5244,22 @@ msgstr "大容量ファイル %d 個の取得に失敗\n" msgid "getting changed largefiles\n" -msgstr "更新された大容量ファイルの取得中\n" +msgstr "変更された大容量ファイルの取得中\n" #, python-format msgid "%d largefiles updated, %d removed\n" msgstr "大容量ファイルの更新数 %d、 削除数 %d\n" -#, python-format -msgid "largefile %s is not in cache and could not be downloaded" -msgstr "大容量ファイル %s はキャッシュされておらず、ダウンロードもできません" +msgid "no revisions specified" +msgstr "リビジョン指定がありません" + +#, python-format +msgid "pulling largefiles for revision %s\n" +msgstr "リビジョン %s に関連する大容量ファイルの取得中\n" + +#, python-format +msgid "%d largefiles cached\n" +msgstr "大容量ファイル %d 個をキャッシュ\n" msgid "minimum size (MB) for files to be converted as largefiles" msgstr "大容量ファイル化するファイルの最小サイズ (MB)" @@ -5085,6 +5270,12 @@ msgid "hg lfconvert SOURCE DEST [FILE ...]" msgstr "hg lfconvert SOURCE DEST [FILE ...]" +msgid "pull largefiles for these revisions" +msgstr "指定リビジョンに関連する大容量ファイルを入手" + +msgid "-r REV... [-e CMD] [--remotecmd CMD] [SOURCE]" +msgstr "-r REV... 
[-e CMD] [--remotecmd CMD] [SOURCE]" + #, python-format msgid "largefiles: size must be number (not %s)\n" msgstr "largefiles: サイズは数値で指定してください (%s は不正です)\n" @@ -5108,24 +5299,12 @@ msgstr "ファイルが手元にありません" #, python-format -msgid "" -"changeset %s: %s missing\n" -" (looked for hash %s)\n" -msgstr "" -"リビジョン %s: %s が見つかりません\n" -" (ハッシュ値 %s を想定)\n" - -#, python-format -msgid "" -"changeset %s: %s: contents differ\n" -" (%s:\n" -" expected hash %s,\n" -" but got %s)\n" -msgstr "" -"リビジョン %s: %s の内容が異なります\n" -" (%s:\n" -" 想定ハッシュ値 %s に対して\n" -" 実際のハッシュ値は %s)\n" +msgid "changeset %s: %s references missing %s\n" +msgstr "リビジョン %s: %s が参照している %s が不在です\n" + +#, python-format +msgid "changeset %s: %s references corrupted %s\n" +msgstr "リビジョン %s: %s が参照している %s が破損しています\n" #, python-format msgid "%s already a largefile\n" @@ -5139,17 +5318,17 @@ msgstr "ファイル名指定がありません" #, python-format -msgid "not removing %s: %s (use forget to undo)\n" -msgstr "%s は削除されません: %s (取り消し機能は forget)\n" - -msgid "file still exists" -msgstr "ファイルは維持されます" - -msgid "file is modified" -msgstr "ファイルは改変されています" - -msgid "file has been marked for add" -msgstr "追加登録予定のファイルです" +msgid "not removing %s: file still exists\n" +msgstr "%s は削除されません: ファイルは維持されます\n" + +#, python-format +msgid "not removing %s: file is modified (use -f to force removal)\n" +msgstr "" +"%s は削除されません: ファイルは変更されています(削除の強行は -f を指定)\n" + +#, python-format +msgid "not removing %s: file has been marked for add (use forget to undo)\n" +msgstr "%s は削除されません: 追加登録対象ファイルです (取り消しは forget)\n" #, python-format msgid "removing %s\n" @@ -5211,12 +5390,8 @@ msgid "destination largefile already exists" msgstr "大容量ファイルの複製先は既に存在します" -msgid "caching new largefiles\n" -msgstr "更新された大容量ファイルのキャッシュ中\n" - -#, python-format -msgid "%d largefiles cached\n" -msgstr "大容量ファイル %d 個をキャッシュ\n" +msgid "pulled() only available in --lfrev" +msgstr "pulled() 述語は --lfrev 指定でのみ有効です" #, python-format msgid "--all-largefiles is incompatible with non-local destination %s" @@ -5259,6 +5434,10 @@ msgid "largefiles: %d to upload\n" msgstr "largefiles: %d 個の転送予定ファイル\n" +#, python-format +msgid "largefile %s is not in cache and could not be downloaded" +msgstr "大容量ファイル %s はキャッシュされておらず、ダウンロードもできません" + msgid "largefile contents do not match hash" msgstr "大容量ファイルの内容が想定ハッシュ値と一致しません" @@ -5299,14 +5478,6 @@ msgstr "remotestore: ファイル %s を開くことができません: %s" #, python-format -msgid "remotestore: largefile %s is invalid" -msgstr "remotestore: 大容量ファイル %s は無効です" - -#, python-format -msgid "remotestore: largefile %s is missing" -msgstr "remotestore: 大容量ファイル %s は存在しません" - -#, python-format msgid "changeset %s: %s: contents differ\n" msgstr "リビジョン %s: %s: 内容が異なります\n" @@ -5315,14 +5486,6 @@ msgstr "リビジョン %s: ファイル %s が不在です\n" #, python-format -msgid "" -"largefiles: repo method %r appears to have already been wrapped by another " -"extension: largefiles may behave incorrectly\n" -msgstr "" -"largefiles: 他のエクステンションが %r 機能を書き換えている模様です:" -"largefiles が想定外の挙動をする可能性があります\n" - -#, python-format msgid "file \"%s\" is a largefile standin" msgstr "\"%s\" は大容量ファイルの代理ファイルです" @@ -5337,23 +5500,29 @@ msgstr "" "指定サイズ (単位:MB) 以上のファイルを、 大容量ファイルとして追加 (既定値:10)" -msgid "verify largefiles" -msgstr "大容量ファイルを検証" - -msgid "verify all revisions of largefiles not just current" -msgstr "現リビジョン以外でも、 大容量ファイルに関する検証を実施" - -msgid "verify largefile contents not just existence" -msgstr "大容量ファイルの存在確認以外に、 内容の検証も実施" +msgid "verify that all largefiles in current revision exists" +msgstr "現リビジョンの大容量ファイルの存在を検証" + +msgid "verify largefiles in all revisions, not just 
current" +msgstr "大容量ファイルの検証を全リビジョンで実施" + +msgid "verify local largefile contents, not just existence" +msgstr "大容量ファイルの存在と内容の両方を検証" + +msgid "display largefiles dirstate" +msgstr "大容量ファイルの作業領域状態を表示" msgid "display outgoing largefiles" msgstr "転送対象大容量ファイルを表示" -msgid "download all pulled versions of largefiles" -msgstr "取り込みリビジョンにおいて、 大容量ファイルを全て取得" +msgid "download all pulled versions of largefiles (DEPRECATED)" +msgstr "取り込みリビジョンに関連する大容量ファイルを全て取得 (非推奨)" + +msgid "download largefiles for these revisions" +msgstr "指定リビジョンに関連する大容量ファイルを取得" msgid "download all versions of all largefiles" -msgstr "全リビジョンにおいて、 大容量ファイルを全て取得" +msgstr "全リビジョンに関連する大容量ファイルを取得" msgid "manage a stack of patches" msgstr "パッチ併用の管理" @@ -6471,7 +6640,7 @@ msgstr "パッチに記録された親リビジョンに対して適用" msgid "list patch name in commit text" -msgstr "コミットメッセージとしてパッチ名を列挙" +msgstr "コミットログとしてパッチ名を列挙" msgid "apply all patches" msgstr "全てのパッチを適用" @@ -6631,7 +6800,7 @@ msgstr "※ このオプションは無視されます (非推奨)" msgid "do not modify working copy during strip" -msgstr "処理中の作業領域更新を抑止" +msgstr "処理中の作業領域変更を抑止" msgid "remove revs only reachable from given bookmark" msgstr "指定ブックマークから、 到達可能なリビジョンのみを除外" @@ -6798,11 +6967,11 @@ #, python-format msgid "number of unguarded, unapplied patches has changed from %d to %d\n" -msgstr "ガード設定の変更により、 適用除外パッチ数が %d から %d になりました\n" +msgstr "現適用位置以後の適用可能な未適用パッチ数が %d から %d になりました\n" #, python-format msgid "number of guarded, applied patches has changed from %d to %d\n" -msgstr "ガード設定の変更により、 適用対象パッチ数が %d から %d になりました\n" +msgstr "現適用位置までの適用除外対象パッチ数が %d から %d になりました\n" msgid "guards in series file:\n" msgstr "パッチに設定されているガードの一覧:\n" @@ -6858,9 +7027,6 @@ " この機能は、 上流のリポジトリでパッチが受理された場合や、\n" " パッチ内容を上流リポジトリに反映する場合などに有用です。" -msgid "no revisions specified" -msgstr "リビジョン指定がありません" - msgid "warning: uncommitted changes in the working directory\n" msgstr "警告: 作業領域の変更が未コミットです\n" @@ -7185,7 +7351,7 @@ " リポジトリの絶対パスが使用されます。 ``notify.strip`` によって、\n" " リポジトリのパスを、 相対パス化することができます。 例えば、\n" " ``notify.strip=3`` は ``/long/path/repository`` を ``repository``\n" -" に改変します。 デフォルト値は 0。" +" に変更します。 デフォルト値は 0。" msgid "" "notify.domain\n" @@ -7428,15 +7594,15 @@ "message contains two or three body parts:" msgstr "" "個々のメールの Subject ヘッダは、 \"[PATCH M of N]\" で始まり、 対応する\n" -"リビジョンのコミットメッセージの最初の行の内容が記載されます。 メールの\n" -"本文は、 以下の様な2ないし3の部位から構成されます:" +"リビジョンのコミットログの最初の行の内容が記載されます。 メールの本文は、\n" +"以下の様な2ないし3の部位から構成されます:" msgid "" "- The changeset description.\n" "- [Optional] The result of running diffstat on the patch.\n" "- The patch itself, as generated by :hg:`export`." msgstr "" -"- コミットメッセージ\n" +"- コミットログ\n" "- パッチの差分統計(diffstat)結果 [省略可能]\n" "- :hg:`export` 形式と同様のパッチ内容" @@ -7596,9 +7762,9 @@ " description." msgstr "" " 個々のメールの Subject ヘッダは、 \"[PATCH M of N]\" で始まり、\n" -" 対応するリビジョンのコミットメッセージの1行目が記載されます。\n" +" 対応するリビジョンのコミットログの1行目が記載されます。\n" " メール本文は、 2ないし3の部位から構成されます。\n" -" 最初の部位にはコミットメッセージの続きが配置されます。" +" 最初の部位にはコミットログの続きが配置されます。" msgid "" " With the -d/--diffstat option, if the diffstat program is\n" @@ -7945,7 +8111,7 @@ " - Ignored files (unless --all is specified)\n" " - New files added to the repository (with :hg:`add`)" msgstr "" -" - 改変の有無に関わらず、管理下にあるファイル\n" +" - Mercurial の管理下にあるファイル(変更の有無に関わらず)\n" " - 無視対象ファイル (--all 指定の無い場合)\n" " - 新規登録されたファイル (:hg:`add` 実施対象)" @@ -8076,6 +8242,13 @@ " 悪影響があります。" msgid "" +" In its default configuration, Mercurial will prevent you from\n" +" rebasing published changes. See :hg:`help phases` for details." 
+msgstr "" +" 通常の設定では、 公開済みリビジョンは、 移動できません。 詳細は\n" +" :hg:`help phases` を参照してください。" + +msgid "" " If you don't specify a destination changeset (``-d/--dest``),\n" " rebase uses the tipmost head of the current named branch as the\n" " destination. (The destination changeset is not modified by\n" @@ -8111,6 +8284,17 @@ " 作業領域の親リビジョンを \"base\" とみなします。" msgid "" +" For advanced usage, a third way is available through the ``--rev``\n" +" option. It allows you to specify an arbitrary set of changesets to\n" +" rebase. Descendants of revs you specify with this option are not\n" +" automatically included in the rebase." +msgstr "" +" より踏み込んだ対象指定として、 ``--rev`` を使用する方法があります。\n" +" ``--rev`` を使うことで、任意のリビジョンを、 移動対象に指定できます。\n" +" この方法でリビジョンを指定した場合、 指定リビジョンの子孫は、\n" +" 自動的には移動対象に含まれません。" + +msgid "" " By default, rebase recreates the changesets in the source branch\n" " as descendants of dest and then destroys the originals. Use\n" " ``--keep`` to preserve the original source changesets. Some\n" @@ -8183,6 +8367,9 @@ msgid "use --keep to keep original changesets" msgstr "元リビジョンを維持する場合は --keep を使用してください" +msgid "nothing to rebase\n" +msgstr "移動の必要はありません\n" + #, python-format msgid "can't rebase immutable changeset %s" msgstr "改変不能なリビジョン %s は移動できません" @@ -8190,9 +8377,6 @@ msgid "see hg help phases for details" msgstr "詳細は \"hg help phases\" を参照してください" -msgid "nothing to rebase\n" -msgstr "移動の必要はありません\n" - msgid "cannot collapse multiple named branches" msgstr "複数の名前付きブランチの単一化はできません" @@ -8245,9 +8429,6 @@ msgid "no matching revisions" msgstr "合致するリビジョンはありません" -msgid "can't rebase multiple roots" -msgstr "複数リビジョン由来のリビジョンは移動できません" - msgid "source is ancestor of destination" msgstr "移動元は移動先の祖先です" @@ -8299,7 +8480,7 @@ msgstr "&No - この変更をスキップします" msgid "&Edit the change manually" -msgstr "&Edit - 変更を手動で改変します" +msgstr "&Edit - 変更内容を手動で編集します" msgid "&Skip remaining changes to this file" msgstr "&Skip - このファイルの残りの変更を全てスキップします" @@ -8320,10 +8501,10 @@ msgstr "&?" msgid "cannot edit patch for whole file" -msgstr "ファイル全体に対するパッチの手動改変できません" +msgstr "ファイル全体に対するパッチは編集できません" msgid "cannot edit patch for binary file" -msgstr "バイナリファイル向けパッチの手動改変はできません" +msgstr "バイナリファイル向けパッチは編集できません" msgid "" "\n" @@ -8346,7 +8527,7 @@ "パッチ適用が成功した場合、 編集後の差分は、 記録対象に追加されます。\n" "適用が失敗した場合、 却下差分はファイルに保存されます。 再試行の際は、\n" "このファイルを利用可能です。 差分の全行が削除された場合、\n" -"改変作業は中断され、差分はそのまま維持されます。\n" +"編集作業は中断され、差分はそのまま維持されます。\n" msgid "edit failed" msgstr "編集に失敗" @@ -8398,7 +8579,7 @@ msgstr "" " y - この変更を記録します\n" " n - この変更をスキップします\n" -" e - この変更を手動で改変します" +" e - この変更を手動で編集します" msgid "" " s - skip remaining changes to this file\n" @@ -8443,6 +8624,10 @@ msgid "cannot partially commit a merge (use \"hg commit\" instead)" msgstr "マージの部分コミットはできません (\"hg commit\" を使用してください)" +#, python-format +msgid "error parsing patch: %s" +msgstr "パッチ解析に失敗: %s" + msgid "no changes to record\n" msgstr "記録可能な変更がありません\n" @@ -8628,6 +8813,10 @@ msgstr "定義済みスキーマは、 同名スキーマ定義により、 上書き可能です。\n" #, python-format +msgid "no '://' in scheme url '%s'" +msgstr "'://' 記述がスキーマ URL '%s' に含まれていません" + +#, python-format msgid "custom scheme %s:// conflicts with drive letter %s:\\\n" msgstr "独自スキーマ %s:// は、 ドライブ文字 %s:\\ と衝突します\n" @@ -8692,8 +8881,13 @@ msgid "command to transplant changesets from another branch" msgstr "別ブランチからのリビジョンの移植" -msgid "This extension allows you to transplant patches from another branch." -msgstr "本エクステンションは、 別ブランチからのリビジョン移植を可能にします。" +msgid "" +"This extension allows you to transplant changes to another parent revision,\n" +"possibly in another repository. 
The transplant is done using 'diff' patches." +msgstr "" +"本エクステンションにより、 一連のリビジョン群を別な親リビジョン\n" +"(リポジトリ横断も可能) の先に移植できます。 移植はパッチ形式 ('diff')\n" +"を元に実施されます (※ 訳注: rebase や graft は 3-way マージで実施)。" msgid "" "Transplanted patches are recorded in .hg/transplant/transplants, as a\n" @@ -8780,14 +8974,14 @@ msgid "no such option\n" msgstr "そのようなオプションはありません\n" -msgid "pull patches from REPO" -msgstr "指定リポジトリからの取り込み" - -msgid "pull patches from branch BRANCH" -msgstr "指定ブランチからの取り込み" - -msgid "pull all changesets up to BRANCH" -msgstr "指定ブランチの全てを取り込む" +msgid "transplant changesets from REPO" +msgstr "指定リポジトリからのリビジョンの移植" + +msgid "use this source changeset as head" +msgstr "指定リビジョンを移植元のヘッドとみなす" + +msgid "pull all changesets up to the --branch revisions" +msgstr "--branch での指定ブランチの全てを取り込む" msgid "skip over REV" msgstr "指定リビジョンのスキップ" @@ -8801,8 +8995,8 @@ msgid "append transplant info to log message" msgstr "コミットログへの移植情報の付与" -msgid "continue last transplant session after repair" -msgstr "中断された直前の移植作業の再開" +msgid "continue last transplant session after fixing conflicts" +msgstr "衝突解消後における、中断された直前の移植作業の再開" msgid "filter changesets through command" msgstr "コマンドによるリビジョンのフィルタリング" @@ -8816,14 +9010,23 @@ msgid "" " Selected changesets will be applied on top of the current working\n" " directory with the log of the original changeset. The changesets\n" -" are copied and will thus appear twice in the history. Use the\n" -" rebase extension instead if you want to move a whole branch of\n" -" unpublished changesets." +" are copied and will thus appear twice in the history with different\n" +" identities." msgstr "" " 移植対象リビジョンは、 作業領域の親リビジョンの子孫として、\n" " コミットログを維持しつつ複製されます。 移植での複製により、\n" -" 移植対象と同内容のリビジョンは、 履歴上に2回登場することになります。\n" -" 未公開リビジョンのブランチを、 まとめて移動したい場合は、 rebase\n" +" 移植対象と同内容のリビジョンが、 履歴上に2回 (識別子はそれぞれ異なる)\n" +" 登場することになります。" + +msgid "" +" Consider using the graft command if everything is inside the same\n" +" repository - it will use merges and will usually give a better result.\n" +" Use the rebase extension if the changesets are unpublished and you want\n" +" to move them instead of copying them." +msgstr "" +" 移植元/先が同一リポジトリの場合は、 graft の使用を検討しましょう。\n" +" graft の内部処理は 3-way マージを使用するため、 多くの場合で transplant\n" +" よりも良い結果が得られます。未公開リビジョンの移動の場合は、rebase\n" " エクステンションを使用してください。" msgid "" @@ -8845,27 +9048,30 @@ " 第2引数にはパッチが格納されたファイルが指定されます。" msgid "" -" If --source/-s is specified, selects changesets from the named\n" -" repository. If --branch/-b is specified, selects changesets from\n" -" the branch holding the named revision, up to that revision. If\n" -" --all/-a is specified, all changesets on the branch will be\n" -" transplanted, otherwise you will be prompted to select the\n" -" changesets you want." -msgstr "" -" --source/-s が指定された場合は、 指定リポジトリが移植元になります。\n" -" --branch/-b が指定された場合は、 指定リビジョンを含むブランチの、\n" -" 指定リビジョンまでが移植元になります。 --all/-a が指定された場合は、\n" -" 指定ブランチ中の全てのリビジョンが移植対象となり、 それ以外の場合は、\n" -" 移植候補リビジョンに関して、 移植要否の問い合わせが発生します。" - -msgid "" -" :hg:`transplant --branch REV --all` will transplant the\n" -" selected branch (up to the named revision) onto your current\n" -" working directory." 
-msgstr "" -" :hg:`transplant --branch REV --all` 実行により、\n" -" 指定された名前付きブランチ (の中の REV までのリビジョン) が、\n" -" 作業領域の親リビジョンの子として、 移植されます。" +" --source/-s specifies another repository to use for selecting " +"changesets,\n" +" just as if it temporarily had been pulled.\n" +" If --branch/-b is specified, these revisions will be used as\n" +" heads when deciding which changsets to transplant, just as if only\n" +" these revisions had been pulled.\n" +" If --all/-a is specified, all the revisions up to the heads specified\n" +" with --branch will be transplanted." +msgstr "" +" 移植対象リビジョンは --source/-s で指定したリポジトリから、\n" +" 移植時に取り込むことが可能です。 --branch/-b が指定された場合、\n" +" 指定ブランチのみの取り込みを仮定して、 移植対象が決定されます。\n" +" --all/-a が指定された場合は、 指定リビジョンに至る全リビジョンが、\n" +" 移植対象とみなされます。" + +msgid " Example:" +msgstr " 実行例:" + +msgid "" +" - transplant all changes up to REV on top of your current revision::" +msgstr " - REV までの全リビジョンを、現リビジョン上に移植::" + +msgid " hg transplant --branch REV --all" +msgstr " hg transplant --branch REV --all" msgid "" " You can optionally mark selected transplanted changesets as merge\n" @@ -8907,11 +9113,11 @@ " --continue/-c` を実行することで、 中断された移植を再開可能です。\n" " " -msgid "--continue is incompatible with branch, all or merge" +msgid "--continue is incompatible with --branch, --all and --merge" msgstr "--continue は --branch、 --all、 --merge と併用できません" -msgid "no source URL, branch tag or revision list provided" -msgstr "元 URL、 ブランチタグ、 リビジョン一覧のいずれも指定されていません" +msgid "no source URL, branch revision or revision list provided" +msgstr "元 URL、 ブランチ名、 リビジョン一覧のいずれも指定されていません" msgid "--all requires a branch revision" msgstr "--all にはブランチリビジョン指定が必要です" @@ -9200,6 +9406,9 @@ msgid "archiving" msgstr "アーカイブ中" +msgid "no files match the archive pattern" +msgstr "指定パターンに合致するファイルがありません" + #, python-format msgid "malformed line in .hg/bookmarks: %r\n" msgstr "不正な .hg/bookmarks 記述行: %r\n" @@ -9225,9 +9434,8 @@ msgid "unknown parent" msgstr "未知の親" -#, python-format -msgid "integrity check failed on %s:%d" -msgstr "%s:%d の一貫性チェックに失敗" +msgid "unknown delta base" +msgstr "未知の差分ベース" msgid "cannot create new bundle repository" msgstr "バンドルリポジトリの新規作成はできません" @@ -9510,6 +9718,10 @@ msgstr "HG: ブランチ '%s'" #, python-format +msgid "HG: bookmark '%s'" +msgstr "HG: ブックマーク '%s'" + +#, python-format msgid "HG: subrepo %s" msgstr "HG: サブリポジトリ %s" @@ -9531,6 +9743,17 @@ msgid "empty commit message" msgstr "コミットログがありません" +msgid "created new head\n" +msgstr "新規ヘッドが増えました\n" + +#, python-format +msgid "reopening closed branch head %d\n" +msgstr "閉鎖済みブランチヘッド %d の再開中\n" + +#, python-format +msgid "committed changeset %d:%s\n" +msgstr "コミット対象リビジョン %d:%s\n" + #, python-format msgid "forgetting %s\n" msgstr "%s の追加登録を取り消し中\n" @@ -9608,6 +9831,9 @@ msgid "display help and exit" msgstr "ヘルプ情報を表示して終了" +msgid "consider hidden changesets" +msgstr "不可視状態のリビジョンも対象に含める" + msgid "do not perform actions, just print output" msgstr "実施予定の処理内容の表示のみで処理実施は抑止" @@ -9708,9 +9934,6 @@ msgstr "" " ファイル名指定が無い場合、 作業領域中の全ファイルが対象となります。" -msgid " .. container:: verbose" -msgstr " .. container:: verbose" - msgid "" " An example showing how new (unknown) files are added\n" " automatically by :hg:`add`::" @@ -10011,12 +10234,10 @@ " cancel the merge and leave the child of REV as a head to be\n" " merged separately." 
msgstr "" -" 1.8 版より前の本コマンドの --merge 無し時挙動は、\n" -" 打消しを --merge 付きで実行した後で、\n" -" :hg:`update --clean .` を実行した場合と等価です。\n" -" ここでの :hg:`update --clean .` 実行は、\n" -" マージの実施をキャンセルし、\n" -" 打ち消しリビジョンを後から別途マージできるように、\n" +" 1.7 版より前の本コマンドの --merge 無し時挙動は、 打消しを --merge\n" +" 付きで実行した後に :hg:`update --clean .` 実行したものと等価です。\n" +" ここでの :hg:`update --clean .` 実行は、 マージ実施をキャンセルし、\n" +" 後から別途マージできるように、 打ち消しリビジョンを、\n" " ヘッドのまま残す働きをします。" msgid "please specify just one revision" @@ -10070,7 +10291,7 @@ msgstr "[-gbsr] [-U] [-c CMD] [REV]" msgid "subdivision search of changesets" -msgstr "リビジョンの分割探索" +msgstr "リビジョンの二分探索" msgid "" " This command helps to find changesets which introduce problems. To\n" @@ -10082,7 +10303,7 @@ " bad, and bisect will either update to another candidate changeset\n" " or announce that it has found the bad revision." msgstr "" -" 問題発生契機となるリビジョンの特定を補助します。 使用開始の際には、\n" +" 問題発生契機となるリビジョンを探索します。 使用開始の際には、\n" " 問題が発生する既知のリビジョンのうち、 最古のものを bad とマークし、\n" " 問題が発生しない既知のリビジョンのうち、 最新のものを good とマーク\n" " します。 本コマンドは、 検証対象リビジョンで作業領域を更新します(-U/\n" @@ -10113,13 +10334,10 @@ " はリビジョンのスキップ、 127 (コマンド不在) は探索中断、\n" " その他の非 0 終了コードは bad とみなされます。" -msgid " Some examples:" -msgstr " 例:" - msgid "" " - start a bisection with known bad revision 12, and good revision 34::" msgstr "" -" - 既知の bad なリビジョン 12 と good なリビジョン 34 から検索開始::" +" - 既知の bad なリビジョン 12 と good なリビジョン 34 から探索開始::" msgid "" " hg bisect --bad 34\n" @@ -10132,7 +10350,7 @@ " - advance the current bisection by marking current revision as good " "or\n" " bad::" -msgstr " - 現リビジョンを good ないし bad 化して検索状態を進める::" +msgstr " - 現リビジョンを good ないし bad 化して探索を継続::" msgid "" " hg bisect --good\n" @@ -10156,8 +10374,17 @@ " hg bisect --skip\n" " hg bisect --skip 23" +msgid "" +" - skip all revisions that do not touch directories ``foo`` or ``bar``" +msgstr " - ``foo`` と ``bar`` の両方を変更したリビジョン以外をスキップ::" + +msgid "" +" hg bisect --skip '!( file(\"path:foo\") & file(\"path:bar\") )'" +msgstr "" +" hg bisect --skip '!( file(\"path:foo\") & file(\"path:bar\") )'" + msgid " - forget the current bisection::" -msgstr " - 現行の検索状態をクリア::" +msgstr " - 現行の探索状態をクリア::" msgid " hg bisect --reset" msgstr " hg bisect --reset" @@ -10182,7 +10409,7 @@ msgid "" " - see all changesets whose states are already known in the current\n" " bisection::" -msgstr " - 現在の検索において、 状態の判明しているリビジョン全てを表示::" +msgstr " - 現在の探索において、 状態の判明しているリビジョン全てを表示::" msgid " hg log -r \"bisect(pruned)\"" msgstr " hg log -r \"bisect(pruned)\"" @@ -10197,7 +10424,7 @@ msgstr " hg log -r \"bisect(current)\"" msgid " - see all changesets that took part in the current bisection::" -msgstr " - 現在の検索対象になっているリビジョン全てを表示::" +msgstr " - 現在の探索対象範囲のリビジョン全てを表示::" msgid " hg log -r \"bisect(range)\"" msgstr " hg log -r \"bisect(range)\"" @@ -10225,7 +10452,7 @@ "the common ancestor, %s.\n" msgstr "" "このリビジョンの祖先に対する確認は完全ではありません。\n" -"共通の祖先 %s から検索を継続する場合、\n" +"共通の祖先 %s から探索を継続する場合、\n" "--extend 付きで \"hg bisect\" を実行してください。\n" msgid "Due to skipped revisions, the first good revision could be any of:\n" @@ -10235,10 +10462,10 @@ msgstr "検証省略により、 最初の bad なリビジョンは以下から選択可能です:\n" msgid "cannot bisect (no known good revisions)" -msgstr "分割探索できません(good リビジョンが未指定です)" +msgstr "二分探索できません(good リビジョンが未指定です)" msgid "cannot bisect (no known bad revisions)" -msgstr "分割探索できません(bad リビジョンが未指定です)" +msgstr "二分探索できません(bad リビジョンが未指定です)" msgid "(use of 'hg bisect ' is deprecated)\n" msgstr "('hg bisect ' 形式の実行は推奨されません)\n" @@ -10333,6 +10560,15 @@ " bookmarks エクステンションを有効にしてください。" msgid "" +" If you set a bookmark called '@', new clones of the repository 
will\n" +" have that revision checked out (and the bookmark made active) by\n" +" default." +msgstr "" +" ブックマーク '@' が設定されている場合、 特に指定がなければ、\n" +" 複製先の作業領域は、 そのリビジョンで更新されます (ブックマーク '@'\n" +" はアクティブになります)" + +msgid "" " With -i/--inactive, the new bookmark will not be made the active\n" " bookmark. If -r/--rev is given, the new bookmark will not be made\n" " active even if -i/--inactive is not given. If no NAME is given, the\n" @@ -10350,6 +10586,10 @@ msgstr "空白文字だけで構成されたタグ名は不正です" #, python-format +msgid "moving bookmark '%s' forward from %s\n" +msgstr "ブックマーク '%s' を %s から前方に移動中\n" + +#, python-format msgid "bookmark '%s' already exists (use -f to force)" msgstr "ブックマーク '%s' は存在します(強制実行する場合は -f を指定)" @@ -10692,6 +10932,13 @@ " タグ「実施」リビジョンは、 複製先に取り込まれません。" msgid "" +" If the source repository has a bookmark called '@' set, that\n" +" revision will be checked out in the new repository by default." +msgstr "" +" 複製元リポジトリにブックマーク '@' が設定されている場合、\n" +" 特に指定がなければ、 複製先の作業領域は、 そのリビジョンで更新されます。" + +msgid "" " To check out a particular version, use -u/--update, or\n" " -U/--noupdate to create a clone with no working directory." msgstr "" @@ -10732,8 +10979,8 @@ " place their metadata under the .hg directory, such as mq." msgstr "" " この方法は最速の複製方法かもしれませんが、 常に安全とは限りません。\n" -" 操作の単一性は保障されません (リポジトリの複製中改変の防止は、 \n" -" 利用者責務) し、 利用するエディタのファイル改変時の振る舞いが、\n" +" 操作の単一性は保障されません (複製中のリポジトリの変更防止は、 \n" +" 利用者責務) し、 利用するエディタのファイル変更時の振る舞いが、\n" " ハードリンクを破棄するものである必要があります (Emacs および多くの\n" " Linux 系ツールは、 そのように振舞います)。 この制約は、\n" " MQ エクステンションのように、 .hg ディレクトリ配下に、\n" @@ -10755,8 +11002,9 @@ " d) the changeset specified with -r\n" " e) the tipmost head specified with -b\n" " f) the tipmost head specified with the url#branch source syntax\n" -" g) the tipmost head of the default branch\n" -" h) tip" +" g) the revision marked with the '@' bookmark, if present\n" +" h) the tipmost head of the default branch\n" +" i) tip" msgstr "" " a) -U が指定されるか、 元リポジトリ履歴が空の場合は null リビジョン\n" " b) -u . 
が指定され、 且つ元リポジトリが同一ホストの場合、\n" @@ -10766,8 +11014,9 @@ " d) -r で指定されたリビジョン\n" " e) -b で指定sれたブランチの最新ヘッドリビジョン\n" " f) url#branch 形式で指定されたブランチの最新ヘッドリビジョン\n" -" g) default ブランチの最新ヘッドリビジョン\n" -" h) tip" +" g) ブックマーク '@' が存在する場合は、そのリビジョン\n" +" h) default ブランチの最新ヘッドリビジョン\n" +" i) tip" msgid " - clone a remote repository to a new directory named hg/::" msgstr " - 遠隔ホストのリポジトリを、 新規 hg/ ディレクトリ配下に複製::" @@ -10903,18 +11152,12 @@ " 成功時のコマンドの終了値は 0、 変更が検出できない場合は 1 です。\n" " " -msgid "can only close branch heads" -msgstr "ブランチヘッドのみ閉鎖できます" - msgid "cannot amend recursively" msgstr "サブリポジトリを含む再帰的な改変はできません" msgid "cannot amend public changesets" msgstr "public フェーズのリビジョンは改変できません" -msgid "cannot amend merge changesets" -msgstr "マージ実施リビジョンは改変できません" - msgid "cannot amend while merging" msgstr "マージ実施中の作業領域では改変できません" @@ -10926,18 +11169,7 @@ #, python-format msgid "nothing changed (%d missing files, see 'hg status')\n" -msgstr "改変はありません (%d 個のファイルが不在。 'hg status' で確認を)\n" - -msgid "created new head\n" -msgstr "新規ヘッドが増えました\n" - -#, python-format -msgid "reopening closed branch head %d\n" -msgstr "閉鎖済みブランチヘッド %d の再開中\n" - -#, python-format -msgid "committed changeset %d:%s\n" -msgstr "コミット対象リビジョン %d:%s\n" +msgstr "変更はありません (%d 個のファイルが不在。 'hg status' で確認を)\n" msgid "record a copy that has already occurred" msgstr "手動で複製済みのファイルに対して、 複製の旨を記録" @@ -10994,7 +11226,7 @@ msgstr "2ないし3の引数が必要です" msgid "add single file mergeable changes" -msgstr "ファイルを1つ登録して、 リビジョン毎にマージ可能な改変を実施" +msgstr "ファイルを1つ登録して、 マージ可能な変更をリビジョン毎に実施" msgid "add single file all revs overwrite" msgstr "ファイルを1つ登録して、 リビジョン毎に上書きを実施" @@ -11292,7 +11524,7 @@ msgstr " エディタが起動できません(vi にも PATH が通っていません)\n" msgid " (specify a commit editor in your configuration file)\n" -msgstr " (コミットメッセージ用エディタを設定ファイルで設定してください)\n" +msgstr " (コミットログ用エディタを設定ファイルで設定してください)\n" #, python-format msgid " Can't find editor '%s' in PATH\n" @@ -11326,6 +11558,12 @@ " 各 ID 毎の既知性を、 0 と 1 で表現したリストが出力されます。\n" " " +msgid "LABEL..." +msgstr "LABEL..." + +msgid "complete \"labels\" - tags, open branch names, bookmark names" +msgstr "" + msgid "markers flag" msgstr "廃止マーカ用フラグ" @@ -11335,6 +11573,37 @@ msgid "create arbitrary obsolete marker" msgstr "任意の廃止状態の設定" +msgid " With no arguments, displays the list of obsolescence markers." +msgstr " 引数指定が無い場合、 廃止マーカを一覧表示します。" + +msgid "complete an entire path" +msgstr "パス全体を補完" + +msgid "show only normal files" +msgstr "通常ファイルのみを表示" + +msgid "show only added files" +msgstr "追加登録されたファイルを表示" + +msgid "show only removed files" +msgstr "登録除外されたファイルを表示" + +msgid "FILESPEC..." +msgstr "FILESPEC..." + +msgid "complete part or all of a tracked path" +msgstr "管理対象ファイルのパスの一部/全部の補完" + +msgid "" +" This command supports shells that offer path name completion. It\n" +" currently completes only files already known to the dirstate." +msgstr "" + +msgid "" +" Completion extends only to the next path segment unless\n" +" --full is specified, in which case entire paths are used." +msgstr "" + msgid "REPO NAMESPACE [KEY OLD NEW]" msgstr "REPO NAMESPACE [KEY OLD NEW]" @@ -11375,12 +11644,27 @@ msgid "revision to rebuild to" msgstr "再構築対象リビジョン" -msgid "[-r REV] [REV]" -msgstr "[-r REV] [REV]" +msgid "[-r REV]" +msgstr "[-r REV]" msgid "rebuild the dirstate as it would look like for the given revision" msgstr "指定リビジョン時点相当の dirstate の再構築" +msgid " If no revision is specified the first current parent will be used." 
+msgstr " リビジョン指定が無い場合、 作業領域の第1親指定とみなします。" + +msgid "" +" The dirstate will be set to the files of the given revision.\n" +" The actual working directory content or existing dirstate\n" +" information such as adds or removes is not considered." +msgstr "" + +msgid "" +" One use of this command is to make the next :hg:`status` invocation\n" +" check the actual file content.\n" +" " +msgstr "" + msgid "revision to debug" msgstr "デバッグ対象リビジョン" @@ -11450,6 +11734,86 @@ msgid "revision to check" msgstr "確認対象リビジョン" +msgid "[-r REV] [REV]" +msgstr "[-r REV] [REV]" + +msgid "[REV]" +msgstr "[REV]" + +msgid "show set of successors for revision" +msgstr "指定リビジョンの後継セットの表示" + +msgid "" +" A successors set of changeset A is a consistent group of revisions that\n" +" succeed A. It contains non-obsolete changesets only." +msgstr "" +" リビジョン A の後継リビジョンの、 意味のある集まりのことを、\n" +" 『リビジョン A の後継セット』と呼びます。 後継セットには、\n" +" 廃止リビジョンが含まれません。" + +msgid "" +" In most cases a changeset A has a single successors set containing a " +"single\n" +" successor (changeset A replaced by A')." +msgstr "" +" 殆どの場合、 リビジョン A の後継セットは1つだけで、 その構成要素は、\n" +" 単一後継リビジョン (対象 A を置き換える A') のみです。" + +msgid "" +" A changeset that is made obsolete with no successors are called \"pruned" +"\".\n" +" Such changesets have no successors sets at all." +msgstr "" +" 廃止の際に、 後継リビジョン指定が無かったリビジョンは、 \"pruned\"\n" +" と呼ばれます。 このようなリビジョンには、 後継セットが存在しません。" + +msgid "" +" A changeset that has been \"split\" will have a successors set " +"containing\n" +" more than one successor." +msgstr "" +" 分割 (\"split\") されたリビジョンには、 後継リビジョンを複数持つ、\n" +" 後継セットが存在します。" + +msgid "" +" A changeset that has been rewritten in multiple different ways is " +"called\n" +" \"divergent\". Such changesets have multiple successor sets (each of " +"which\n" +" may also be split, i.e. have multiple successors)." +msgstr "" +" 複数の異なる方法で書き換えられたリビジョンを、 \"分岐\" (divergent)\n" +" と呼びます。 分岐リビジョンには、 複数の後継セットが存在します。\n" +" (複数の後継セットのそれぞれが、 複数の後継リビジョンから構成される、\n" +" 分割後継セットかもしれません)" + +msgid " Results are displayed as follows::" +msgstr " 実行結果は以下のように表示されます::" + +msgid "" +" \n" +" \n" +" \n" +" \n" +" " +msgstr "" +" <リビジョン-1>\n" +" <後継-1A>\n" +" <リビジョン-2>\n" +" <後継-2A>\n" +" <後継-2B1> <後継-2B2> <後継-2B3>" + +msgid "" +" Here rev2 has two possible (i.e. divergent) successors sets. The first\n" +" holds one element, whereas the second holds three (i.e. the changeset " +"has\n" +" been split).\n" +" " +msgstr "" +" 上記の実行例では、 リビジョン-2には、 2つの後継セットが存在しています\n" +" (=分岐)。 1つ目の後継セットは、 単一リビジョンで構成されていますが、\n" +" 2つ目の後継セットは、 3つのリビジョンから構成されています (=分割)。" + msgid "show how files match on given patterns" msgstr "指定パターンへのファイル合致状況の表示" @@ -11545,15 +11909,18 @@ msgid "revisions to export" msgstr "対象リビジョン" -msgid "[OPTION]... [-o OUTFILESPEC] [-r] REV..." -msgstr "[OPTION]... [-o OUTFILESPEC] [-r] REV..." +msgid "[OPTION]... [-o OUTFILESPEC] [-r] [REV]..." +msgstr "[OPTION]... [-o OUTFILESPEC] [-r] [REV]..." msgid "dump the header and diffs for one or more changesets" msgstr "1つ以上のリビジョンに対するヘッダおよび変更内容の出力" -msgid " Print the changeset header and diffs for one or more revisions." -msgstr "" -" 1つ以上のリビジョンに対して、 ヘッダ情報および変更内容を表示します。" +msgid "" +" Print the changeset header and diffs for one or more revisions.\n" +" If no revision is given, the parent of the working directory is used." 
+msgstr "" +" 指定リビジョンに対して、 ヘッダ情報および変更内容を表示します。\n" +" リビジョン指定が無い場合は、 作業領域の第1親指定とみなします。" msgid "" " The information shown in the changeset header is: author, date,\n" @@ -11899,7 +12266,7 @@ msgstr "指定リビジョンの子孫となるヘッドのみを表示" msgid "show topological heads only" -msgstr "子を持たない全てのリビジョンを表示" +msgstr "構造的なヘッドのみを表示" msgid "show active branchheads only (DEPRECATED)" msgstr "アクティブなブランチヘッドのみを表示 (非推奨)" @@ -11925,7 +12292,7 @@ msgstr "" " リポジトリの「ヘッド」とは、\n" " 子リビジョンを持たないリビジョンのことです。\n" -" 改変作業の実施や、 update/merge コマンド実施の際には、\n" +" 変更作業の実施や、 update/merge コマンド実施の際には、\n" " このリビジョンを対象とするのが一般的です。\n" " 「ブランチヘッド」とは、 同じ名前付きブランチ内に、\n" " 子リビジョンを持たないリビジョンのことです。" @@ -12010,128 +12377,6 @@ " 成功時のコマンド終了値は 0 です。\n" " " -#, python-format -msgid "" -"\n" -"aliases: %s\n" -msgstr "" -"\n" -"別名: %s\n" - -msgid "(no help text available)" -msgstr "(ヘルプはありません)" - -#, python-format -msgid "shell alias for::" -msgstr "シェルコマンドの別名::" - -#, python-format -msgid " %s" -msgstr " %s" - -#, python-format -msgid "alias for: hg %s" -msgstr "コマンドの別名: hg %s" - -#, python-format -msgid "%s" -msgstr "%s" - -#, python-format -msgid "use \"hg help -e %s\" to show help for the %s extension" -msgstr "\"hg help -e %s\" によってエクステンション %s のヘルプが表示されます" - -msgid "options:" -msgstr "オプション:" - -msgid "global options:" -msgstr "グローバルオプション:" - -#, python-format -msgid "" -"\n" -"use \"hg help %s\" to show the full help text\n" -msgstr "" -"\n" -"\"hg help %s\" で詳細なヘルプが表示されます\n" - -#, python-format -msgid "use \"hg -v help %s\" to show more complete help and the global options" -msgstr "省略されたヘルプの詳細やグローバルオプションの表示は \"hg -v help %s\"" - -#, python-format -msgid "use \"hg -v help %s\" to show the global options" -msgstr "グローバルオプションの表示は \"hg -v help %s\"" - -msgid "basic commands:" -msgstr "基本コマンド:" - -msgid "list of commands:" -msgstr "コマンド一覧:" - -msgid "no commands defined\n" -msgstr "コマンドが定義されていません\n" - -msgid "enabled extensions:" -msgstr "有効化されているエクステンション:" - -msgid "" -"\n" -"additional help topics:" -msgstr "" -"\n" -"追加のヘルプトピック:" - -msgid "use \"hg help\" for the full list of commands" -msgstr "全コマンドの一覧は \"hg help\" で表示されます" - -msgid "use \"hg help\" for the full list of commands or \"hg -v\" for details" -msgstr "" -"全コマンドの一覧は \"hg help\" で、 コマンド詳細は \"hg -v\" で表示されます" - -#, python-format -msgid "use \"hg help %s\" to show the full help text" -msgstr "詳細なヘルプの表示は \"hg help %s\"" - -#, python-format -msgid "use \"hg -v help%s\" to show builtin aliases and global options" -msgstr "組み込み別名およびグローバルオプションの表示は \"hg -v help%s\"" - -#, python-format -msgid "use \"hg help -v %s\" to show more complete help" -msgstr "省略されたヘルプの詳細の表示は \"hg help %s\"" - -#, python-format -msgid "" -"\n" -"use \"hg help -c %s\" to see help for the %s command\n" -msgstr "" -"\n" -"\"hg help -c %s\" によってコマンド %s のヘルプが表示されます\n" - -msgid "no help text available" -msgstr "ヘルプはありません" - -#, python-format -msgid "%s extension - %s" -msgstr "%s エクステンション - %s" - -msgid "use \"hg help extensions\" for information on enabling extensions\n" -msgstr "\"hg help extensions\" で有効なエクステンションの情報が表示されます\n" - -#, python-format -msgid "'%s' is provided by the following extension:" -msgstr "以下のエクステンションにより '%s' が提供されています:" - -msgid "Topics" -msgstr "トピック" - -msgid "Extension Commands" -msgstr "エクステンション由来のコマンド" - -msgid "Mercurial Distributed SCM\n" -msgstr "Mercurial - 分散構成管理ツール\n" - msgid "identify the specified revision" msgstr "当該リビジョンの識別情報を表示" @@ -12222,7 +12467,7 @@ msgstr "作業領域の更新のみで、 コミット実施を抑止" msgid "apply patch without touching the working directory" -msgstr "作業領域を改変せずにパッチを適用" +msgstr "作業領域の内容を変更せずにパッチを適用" 
msgid "apply patch to the nodes from which it was generated" msgstr "パッチ作成時と同じ親リビジョンに対して適用" @@ -12294,7 +12539,7 @@ " patches will be applied on top of the working directory parent\n" " revision." msgstr "" -" --bypass 指定時は、 作業領域の改変無しに変更内容を反映します。\n" +" --bypass 指定時は、 作業領域内容の変更無しに、 履歴に記録します。\n" " --exact 指定が無い場合、 変更は作業領域の親リビジョンに適用されます。" msgid "" @@ -12349,15 +12594,15 @@ msgid "cannot use --similarity with --bypass" msgstr "--similarity と --bypass は併用できません" -msgid "patch is damaged or loses information" -msgstr "パッチには破損ないし情報の欠落があります" - msgid "applied to working directory" msgstr "作業領域への適用" msgid "not a Mercurial patch" msgstr "Mercurial 向けのパッチではありません" +msgid "patch is damaged or loses information" +msgstr "パッチには破損ないし情報の欠落があります" + #. i18n: refers to a short changeset id #, python-format msgid "created %s" @@ -12567,7 +12812,7 @@ " --removed を指定してください。" msgid " - changesets with full descriptions and file lists::" -msgstr " - 全リビジョンのコミットメッセージとファイル一覧の表示::" +msgstr " - 全リビジョンのコミットログとファイル一覧の表示::" msgid " hg log -v" msgstr " hg log -v" @@ -12644,9 +12889,6 @@ msgid "list files from all revisions" msgstr "関連する全ファイルの表示" -msgid "[-r REV]" -msgstr "[-r REV]" - msgid "output the current or given revision of the project manifest" msgstr "現時点ないし指定時点でのリポジトリマニフェストの出力" @@ -12928,7 +13170,7 @@ msgstr "[-p|-d|-s] [-f] [-r] REV..." msgid "set or show the current phase name" -msgstr "現行フェーズ状態の改変ないし表示" +msgstr "現行フェーズ状態の変更ないし表示" msgid " With no argument, show the phase name of specified revisions." msgstr " 引数無しの場合、 指定リビジョンのフェーズ名を表示します。" @@ -13034,13 +13276,6 @@ " 最後のリビジョンを ``-r`` の引数にして :hg:`pull -r X` を実行します。" msgid "" -" If SOURCE is omitted, the 'default' path will be used.\n" -" See :hg:`help urls` for more information." -msgstr "" -" 連携先が省略された場合、 'default' パスが連携先として使用されます。\n" -" 詳細は :hg:`help urls` を参照してください。" - -msgid "" " Returns 0 on success, 1 if an update had unresolved files.\n" " " msgstr "" @@ -13221,7 +13456,7 @@ " ファイルの状態 (横) と、 オプション指定 (縦) の組み合わせと挙動は、\n" " 以下の一覧を参照してください。\n" " ファイルの状態は、 :hg:`status` の表示に倣い、\n" -" 追加 (Added) [A]、 改変無し (Clean) [C]、 改変有り (Modified) [M]\n" +" 追加 (Added) [A]、 変更無し (Clean) [C]、 変更有り (Modified) [M]\n" " および不在 (Missing) [!] で表します。\n" " 挙動は、 警告 (Warn) [W]、 構成管理からの登録除外 (Remove) [R]\n" " および作業領域からの削除 (Delete) [D] で表します::" @@ -13268,19 +13503,6 @@ msgid "not removing %s: file is untracked\n" msgstr "%s は削除されません: 未登録ファイルです\n" -#, python-format -msgid "not removing %s: file still exists (use -f to force removal)\n" -msgstr "%s は削除されません: ファイルは保持されます(削除の強行は -f を指定)\n" - -#, python-format -msgid "not removing %s: file is modified (use -f to force removal)\n" -msgstr "" -"%s は削除されません: ファイルは改変されています(削除の強行は -f を指定)\n" - -#, python-format -msgid "not removing %s: file has been marked for add (use forget to undo)\n" -msgstr "%s は削除されません: 追加登録対象ファイルです (取り消しは forget)\n" - msgid "record a rename that has already occurred" msgstr "手動で改名済みのファイルに対して、 改名の旨を記録" @@ -13450,7 +13672,7 @@ " リビジョン指定が無い場合は、 \n" " 指定されたファイル/ディレクトリを、\n" " 作業領域の親リビジョン時点の内容へと復旧します。\n" -" 本コマンドは対象ファイルに対して、 状態を「改変無し」とし、\n" +" 本コマンドは対象ファイルに対して、 状態を「変更無し」とし、\n" " add/remove/copy/rename の実施予定を取り消します。\n" " 作業領域が複数の親リビジョンを持つ場合、\n" " いずれかのリビジョンを明示的に指定して下さい。" @@ -13472,8 +13694,8 @@ " Modified files are saved with a .orig suffix before reverting.\n" " To disable these backups, use --no-backup." 
msgstr "" -" 改変ファイルの復旧の際には、 復旧前の内容が .orig 拡張子を付けた\n" -" ファイルに保存されます。 この保存は --no-backup で無効化されます。" +" 変更ありのファイルを復旧した場合、 .orig 拡張子を付けたファイルに、\n" +" 復旧前の内容が保存されます。 この保存は --no-backup で無効化されます。" msgid "you can't specify a revision and a date" msgstr "リビジョンと日時は同時には指定できません" @@ -13537,7 +13759,8 @@ " repository." msgstr "" " トランザクションとは、 コマンド実行による、 新規リビジョンの作成や、\n" -" 外部からのリビジョンの取り込みといった、 改変操作を一括化するものです。" +" 外部からのリビジョンの取り込みといった、 リポジトリ操作を、\n" +" ひとまとめにするものです。" msgid "" " For example, the following commands are transactional, and their\n" @@ -13736,12 +13959,6 @@ msgid "show only modified files" msgstr "変更されたファイルを表示" -msgid "show only added files" -msgstr "追加登録されたファイルを表示" - -msgid "show only removed files" -msgstr "登録除外されたファイルを表示" - msgid "show only deleted (but tracked) files" msgstr "削除されたファイル(登録除外は未実施)を表示" @@ -13761,7 +13978,7 @@ msgstr "当該リビジョンとの差分で状態を判定" msgid "list the changed files of a revision" -msgstr "指定リビジョンにおける更新ファイルの一覧" +msgstr "指定リビジョンにおける変更対象ファイルの一覧" msgid "show changed files in the working directory" msgstr "作業領域のファイル操作状況の表示" @@ -13827,7 +14044,7 @@ " I = ignored\n" " = origin of the previous file listed as A (added)" msgstr "" -" M = 改変有り(Modified)\n" +" M = 変更有り(Modified)\n" " A = 追加登録予定(Added)\n" " R = 登録除外予定(Removed)\n" " C = 変更無し(Clean)\n" @@ -13946,7 +14163,7 @@ msgstr " (閉鎖済み)" msgid " (clean)" -msgstr " (改変無し)" +msgstr " (変更無し)" msgid " (new branch head)" msgstr " (新規ブランチヘッド)" @@ -14102,8 +14319,8 @@ msgid "not at a branch head (use -f to force)" msgstr "親リビジョンがブランチのヘッドではありません(強制実行は -f 指定)" -msgid "null revision specified" -msgstr "null リビジョンが指定されました" +msgid "cannot tag null revision" +msgstr "null リビジョンにはタグ付けできません" msgid "list repository tags" msgstr "リポジトリ中のタグ一覧の表示" @@ -14264,6 +14481,10 @@ " 特定のファイルだけを以前の状態に戻す場合は、\n" " :hg:`revert [-r リビジョン] ファイル名` を使用してください。" +#, python-format +msgid "updating to active bookmark %s\n" +msgstr "アクティブなブックマーク %s への更新中\n" + msgid "cannot specify both -c/--check and -C/--clean" msgstr "-c/--check と -C/--clean は併用できません" @@ -14325,13 +14546,13 @@ msgstr "%s を読み込めません(%s)" #, python-format +msgid "unknown revision '%s'" +msgstr "'%s' は未知のリビジョンです" + +#, python-format msgid "working directory has unknown parent '%s'!" msgstr "作業領域の親 '%s' が未知のリビジョンです!" -#, python-format -msgid "unknown revision '%s'" -msgstr "'%s' は未知のリビジョンです" - msgid "not found in manifest" msgstr "マニフェストにありません" @@ -14547,6 +14768,10 @@ msgid "broken pipe\n" msgstr "パイプ破壊(EPIPE)\n" +#, python-format +msgid "abort: %s: '%s'\n" +msgstr "中断: %s: '%s'\n" + msgid "interrupted!\n" msgstr "中断されました!\n" @@ -14685,6 +14910,9 @@ msgid "*** failed to import extension %s: %s\n" msgstr "*** %s のインポートに失敗: %s\n" +msgid "(no help text available)" +msgstr "(ヘルプはありません)" + #, python-format msgid "warning: error finding commands in %s\n" msgstr "警告: ファイル %s でのコマンド解析中にエラー発生\n" @@ -14770,6 +14998,10 @@ msgid "merging %s incomplete! (edit conflicts, then use 'hg resolve --mark')\n" msgstr "%s のマージは不完全です (衝突解消後に 'hg resolve --mark' が必要)\n" +#, python-format +msgid "warning: internal:merge cannot merge symlinks for %s\n" +msgstr "警告: internal:merge はシンボリックリンク %s のマージができません\n" + msgid "" "``internal:dump``\n" "Creates three versions of the files to merge, containing the\n" @@ -14821,7 +15053,7 @@ " File that is modified according to status." msgstr "" "``modified()``\n" -" 更新ステータスを持つファイル (※ 訳注: 未コミット時点でのみ判定可能)" +" 変更ステータスを持つファイル (※ 訳注: 未コミット時点でのみ判定可能)" #. 
i18n: "modified" is a keyword msgid "modified takes no arguments" @@ -15029,6 +15261,17 @@ msgstr "不明なエンコーディング '%s' が指定されました" msgid "" +"``eol(style)``\n" +" File contains newlines of the given style (dos, unix, mac). Binary\n" +" files are excluded, files with mixed line endings match multiple\n" +" styles." +msgstr "" +"``eol(style)``\n" +" 指定形式 (dos, unix, mac) の改行を含むファイル。\n" +" バイナリファイルは除外されます。 複数形式が混在するファイルは、\n" +" 複数の形式指定に合致します。" + +msgid "" "``copied()``\n" " File that is recorded as being copied." msgstr "" @@ -15065,10 +15308,10 @@ #, python-format msgid "unknown bisect kind %s" -msgstr "未知の分岐種類 %s" +msgstr "未知の探索種別 %s" msgid "invalid bisect state" -msgstr "bisect 状態が不正です" +msgstr "探索状態が不正です" #. i18n: bisect changeset status msgid "good" @@ -15094,6 +15337,9 @@ msgid "bad (implicit)" msgstr "bad (推定)" +msgid "enabled extensions:" +msgstr "有効化されているエクステンション:" + msgid "disabled extensions:" msgstr "無効化されているエクステンション:" @@ -15164,6 +15410,122 @@ msgid "Working with Phases" msgstr "フェーズの利用" +#, python-format +msgid "" +"\n" +"aliases: %s\n" +msgstr "" +"\n" +"別名: %s\n" + +#, python-format +msgid "shell alias for::" +msgstr "シェルコマンドの別名::" + +#, python-format +msgid " %s" +msgstr " %s" + +#, python-format +msgid "alias for: hg %s" +msgstr "コマンドの別名: hg %s" + +#, python-format +msgid "%s" +msgstr "%s" + +#, python-format +msgid "use \"hg help -e %s\" to show help for the %s extension" +msgstr "\"hg help -e %s\" によってエクステンション %s のヘルプが表示されます" + +msgid "options:" +msgstr "オプション:" + +msgid "global options:" +msgstr "グローバルオプション:" + +#, python-format +msgid "" +"\n" +"use \"hg help %s\" to show the full help text\n" +msgstr "" +"\n" +"\"hg help %s\" で詳細なヘルプが表示されます\n" + +#, python-format +msgid "use \"hg -v help %s\" to show more complete help and the global options" +msgstr "省略されたヘルプの詳細やグローバルオプションの表示は \"hg -v help %s\"" + +#, python-format +msgid "use \"hg -v help %s\" to show the global options" +msgstr "グローバルオプションの表示は \"hg -v help %s\"" + +msgid "basic commands:" +msgstr "基本コマンド:" + +msgid "list of commands:" +msgstr "コマンド一覧:" + +msgid "no commands defined\n" +msgstr "コマンドが定義されていません\n" + +msgid "" +"\n" +"additional help topics:" +msgstr "" +"\n" +"追加のヘルプトピック:" + +msgid "use \"hg help\" for the full list of commands" +msgstr "全コマンドの一覧は \"hg help\" で表示されます" + +msgid "use \"hg help\" for the full list of commands or \"hg -v\" for details" +msgstr "" +"全コマンドの一覧は \"hg help\" で、 コマンド詳細は \"hg -v\" で表示されます" + +#, python-format +msgid "use \"hg help %s\" to show the full help text" +msgstr "詳細なヘルプの表示は \"hg help %s\"" + +#, python-format +msgid "use \"hg -v help%s\" to show builtin aliases and global options" +msgstr "組み込み別名およびグローバルオプションの表示は \"hg -v help%s\"" + +#, python-format +msgid "use \"hg help -v %s\" to show more complete help" +msgstr "省略されたヘルプの詳細の表示は \"hg help %s\"" + +#, python-format +msgid "" +"\n" +"use \"hg help -c %s\" to see help for the %s command\n" +msgstr "" +"\n" +"\"hg help -c %s\" によってコマンド %s のヘルプが表示されます\n" + +msgid "no help text available" +msgstr "ヘルプはありません" + +#, python-format +msgid "%s extension - %s" +msgstr "%s エクステンション - %s" + +msgid "use \"hg help extensions\" for information on enabling extensions\n" +msgstr "\"hg help extensions\" で有効なエクステンションの情報が表示されます\n" + +#, python-format +msgid "'%s' is provided by the following extension:" +msgstr "以下のエクステンションにより '%s' が提供されています:" + +msgid "Topics" +msgstr "トピック" + +msgid "Extension Commands" +msgstr "エクステンション由来のコマンド" + +msgid "Mercurial Distributed SCM\n" +msgstr "Mercurial - 分散構成管理ツール\n" + msgid "" "The Mercurial system uses a set 
of configuration files to control\n" "aspects of its behavior." @@ -16383,7 +16745,7 @@ " priority.incoming.autobuild = 1" msgstr "" " [hooks]\n" -" # 更新の取り込み毎に作業領域を更新\n" +" # 履歴の取り込み毎に作業領域を更新\n" " changegroup.update = hg update\n" " # ホスト毎設定の無効化\n" " incoming =\n" @@ -16786,12 +17148,12 @@ msgid "" " [hostfingerprints]\n" -" hg.intevation.org = 38:76:52:7c:87:26:9a:8f:4a:f8:d3:de:08:45:3b:ea:" -"d6:4b:ee:cc" +" hg.intevation.org = 44:ed:af:1f:97:11:b6:01:7a:48:45:fc:10:3c:b7:f9:" +"d4:89:2a:9d" msgstr "" " [hostfingerprints]\n" -" hg.intevation.org = 38:76:52:7c:87:26:9a:8f:4a:f8:d3:de:08:45:3b:ea:" -"d6:4b:ee:cc" +" hg.intevation.org = 44:ed:af:1f:97:11:b6:01:7a:48:45:fc:10:3c:b7:f9:" +"d4:89:2a:9d" msgid "This feature is only supported when using Python 2.6 or later." msgstr "本機能は、 Python 2.6 以降でのみ使用可能です。" @@ -17010,24 +17372,6 @@ " ツールの戻り値がマージ成功を示す場合でも、 常にマージ成否を問い合わせ。" msgid "" -"``checkchanged``\n" -" True is equivalent to ``check = changed``.\n" -" Default: False" -msgstr "" -"``checkchanged``\n" -" 本設定を True にするのは、 ``check = changed`` 設定と等価です。\n" -" デフォルト値: False" - -msgid "" -"``checkconflicts``\n" -" True is equivalent to ``check = conflicts``.\n" -" Default: False" -msgstr "" -"``checkconflicts``\n" -" 本設定を True にするのは、 ``check = conflicts`` 設定と等価です。\n" -" デフォルト値: False" - -msgid "" "``fixeol``\n" " Attempt to fix up EOL changes caused by the merge tool.\n" " Default: False" @@ -17335,6 +17679,39 @@ " デフォルト値: None (結果は標準エラー出力から出力)" msgid "" +"``sort``\n" +" Sort field. Specific to the ``ls`` instrumenting profiler.\n" +" One of ``callcount``, ``reccallcount``, ``totaltime`` and\n" +" ``inlinetime``.\n" +" Default: inlinetime." +msgstr "" +"``sort``\n" +" 出力の整列。 詳細プロファイラ ``ls`` 固有の設定。\n" +" ``callcount``, ``reccallcount``, ``totaltime`` または ``inlinetime``\n" +" から1つを指定してください。 デフォルト値: inlinetime" + +msgid "" +"``limit``\n" +" Number of lines to show. Specific to the ``ls`` instrumenting profiler.\n" +" Default: 30." +msgstr "" +"``limit``\n" +" 表示対象行数。 詳細プロファイラ ``ls`` 固有の設定。 デフォルト値: 30" + +msgid "" +"``nested``\n" +" Show at most this number of lines of drill-down info after each main " +"entry.\n" +" This can help explain the difference between Total and Inline.\n" +" Specific to the ``ls`` instrumenting profiler.\n" +" Default: 5." +msgstr "" +"``nested``\n" +" 個々のメインエントリ以後の、 掘り下げ (drill-down) 情報表示の、\n" +" 最大行数。 Total と Inline の差の説明を助けます。\n" +" 詳細プロファイラ ``ls`` 固有の設定。 デフォルト値: 5" + +msgid "" "``revsetalias``\n" "---------------" msgstr "" @@ -17416,14 +17793,16 @@ " Host name of mail server, e.g. \"mail.example.com\"." msgstr "" "``host``\n" -" SMTP サーバのホスト名。 設定例: \"mail.example.com\"" +" メールサーバのホスト名。 設定例: \"mail.example.com\"" msgid "" "``port``\n" -" Optional. Port to connect to on mail server. Default: 25." +" Optional. Port to connect to on mail server. Default: 465 (if\n" +" ``tls`` is smtps) or 25 (otherwise)." msgstr "" "``port``\n" -" 省略可能。 SMTP サーバのポート番号。 デフォルト値: 25" +" 省略可能。 メールサーバのポート番号。 デフォルト値: 465 (``tls``\n" +" 設定が smtps の場合) あるいは 25 (それ以外)" msgid "" "``tls``\n" @@ -17432,16 +17811,37 @@ " smtps or none. Default: none." msgstr "" "``tls``\n" -" 省略可能。 SMTP サーバ接続における TLS 接続の有無/方式の指定。\n" +" 省略可能。 メールサーバ接続における TLS 接続の有無/方式の指定。\n" " starttls、 smtps ないし none。 デフォルト値: none" msgid "" +"``verifycert``\n" +" Optional. Verification for the certificate of mail server, when\n" +" ``tls`` is starttls or smtps. \"strict\", \"loose\" or False. 
For\n" +" \"strict\" or \"loose\", the certificate is verified as same as the\n" +" verification for HTTPS connections (see ``[hostfingerprints]`` and\n" +" ``[web] cacerts`` also). For \"strict\", sending email is also\n" +" aborted, if there is no configuration for mail server in\n" +" ``[hostfingerprints]`` and ``[web] cacerts``. --insecure for\n" +" :hg:`email` overwrites this as \"loose\". Default: \"strict\"." +msgstr "" +"``verifycert``\n" +" 省略可能。 ``tls`` 設定が starttls あるいは smtps の場合における、\n" +" メールサーバの証明書の検証方式。 \"strict\" \"loose\" あるいは False。\n" +" \"strict\" あるいは \"loose\" の場合、HTTPS接続の際と同じ要領で、\n" +" 証明書が検証されます。(``[hostfingerprints]`` および ``[web] cacerts``\n" +" も参照) \"strict\" の場合、 接続先メールサーバに関する設定が\n" +" ``[hostfingerprints]`` と ``[web] cacerts`` のいずれにも無い場合も、\n" +" メール送信が中断されます。 :hg:`email` に --insecure が指定された場合、\n" +" この設定値は \"loose\" で上書きされます。デフォルト値: \"strict\"" + +msgid "" "``username``\n" " Optional. User name for authenticating with the SMTP server.\n" " Default: none." msgstr "" "``username``\n" -" 省略可能。 SMTP サーバ接続の認証におけるユーザ名。\n" +" 省略可能。 メールサーバ接続の認証におけるユーザ名。\n" " デフォルト値: none" msgid "" @@ -17451,7 +17851,7 @@ " password; non-interactive sessions will fail. Default: none." msgstr "" "``password``\n" -" 省略可能。 SMTP サーバ接続の認証におけるパスワード。\n" +" 省略可能。 メールサーバ接続の認証におけるパスワード。\n" " 無指定の場合、 対話的な実行であれば、\n" " パスワード入力プロンプトが表示されますが、\n" " 非対話的な実行であれば、 処理が中断されます。\n" @@ -17993,6 +18393,15 @@ " デフォルト値: False" msgid "" +"``archivesubrepos``\n" +" Whether to recurse into subrepositories when archiving. Default is\n" +" False." +msgstr "" +"``archivesubrepos``\n" +" アーカイブ作成における、 サブリポジトリへの再帰実施の有無。\n" +" デフォルト値: False" + +msgid "" "``baseurl``\n" " Base URL to use when publishing URLs in other locations, so\n" " third-party tools like email notification hooks can construct\n" @@ -18334,10 +18743,93 @@ msgid "" "``templates``\n" -" Where to find the HTML templates. Default is install path.\n" +" Where to find the HTML templates. Default is install path." msgstr "" "``templates``\n" -" HTML テンプレートの検索先。 無指定時はインストール先。\n" +" HTML テンプレートの検索先。 無指定時はインストール先。" + +msgid "" +"``websub``\n" +"----------" +msgstr "" +"``websub``\n" +"----------" + +msgid "" +"Web substitution filter definition. You can use this section to\n" +"define a set of regular expression substitution patterns which\n" +"let you automatically modify the hgweb server output." +msgstr "" + +msgid "" +"The default hgweb templates only apply these substitution patterns\n" +"on the revision description fields. You can apply them anywhere\n" +"you want when you create your own templates by adding calls to the\n" +"\"websub\" filter (usually after calling the \"escape\" filter)." +msgstr "" + +msgid "" +"This can be used, for example, to convert issue references to links\n" +"to your issue tracker, or to convert \"markdown-like\" syntax into\n" +"HTML (see the examples below)." +msgstr "" + +msgid "" +"Each entry in this section names a substitution filter.\n" +"The value of each entry defines the substitution expression itself.\n" +"The websub expressions follow the old interhg extension syntax,\n" +"which in turn imitates the Unix sed replacement syntax::" +msgstr "" + +msgid " patternname = s/SEARCH_REGEX/REPLACE_EXPRESSION/[i]" +msgstr " patternname = s/SEARCH_REGEX/REPLACE_EXPRESSION/[i]" + +msgid "" +"You can use any separator other than \"/\". The final \"i\" is optional\n" +"and indicates that the search must be case insensitive." 
+msgstr "" + +msgid "Examples::" +msgstr "記述例::" + +msgid "" +" [websub]\n" +" issues = s|issue(\\d+)|issue" +"\\1|i\n" +" italic = s/\\b_(\\S+)_\\b/\\1<\\/i>/\n" +" bold = s/\\*\\b(\\S+)\\b\\*/\\1<\\/b>/" +msgstr "" +" [websub]\n" +" issues = s|issue(\\d+)|issue" +"\\1|i\n" +" italic = s/\\b_(\\S+)_\\b/\\1<\\/i>/\n" +" bold = s/\\*\\b(\\S+)\\b\\*/\\1<\\/b>/" + +msgid "" +"``worker``\n" +"----------" +msgstr "" +"``worker``\n" +"----------" + +msgid "" +"Parallel master/worker configuration. We currently perform working\n" +"directory updates in parallel on Unix-like systems, which greatly\n" +"helps performance." +msgstr "" +"並列実施に関する設定。 現状では、 Unix 系システムにおける作業領域更新で、\n" +"処理が並列に実施され、大規模リポジトリにおける大幅な性能改善が見込まれます。" + +msgid "" +"``numcpus``\n" +" Number of CPUs to use for parallel operations. Default is 4 or the\n" +" number of CPUs on the system, whichever is larger. A zero or\n" +" negative value is treated as ``use the default``.\n" +msgstr "" +"``numcpus``\n" +" 並列実施に使用可能な CPU 数。 デフォルト値は 4 あるいは\n" +" システム上の CPU 数の大きい方の値。 0 あるいは負値は、\n" +" 『デフォルト値の使用』を意味します。\n" msgid "Some commands allow the user to specify a date, e.g.:" msgstr "以下のコマンドで日時指定が可能です:" @@ -18365,7 +18857,10 @@ "- ``2006-12-6``\n" "- ``12-6``\n" "- ``12/6``\n" -"- ``12/6/6`` (Dec 6 2006)" +"- ``12/6/6`` (Dec 6 2006)\n" +"- ``today`` (midnight)\n" +"- ``yesterday`` (midnight)\n" +"- ``now`` - right now" msgstr "" "- ``Wed Dec 6 13:18:29 2006`` (「ローカルタイムゾーン」を想定)\n" "- ``Dec 6 13:18 -0600`` (「今年」を想定、 タイムゾーンはオフセット指定)\n" @@ -18379,7 +18874,10 @@ "- ``2006-12-6``\n" "- ``12-6``\n" "- ``12/6``\n" -"- ``12/6/6`` (2006年12月6日)" +"- ``12/6/6`` (2006年12月6日)\n" +"- ``today`` (当日午前0時)\n" +"- ``yesterday`` (昨日午前0時)\n" +"- ``now`` - 現在時刻" msgid "Lastly, there is Mercurial's internal format:" msgstr "最後に、 Mercurial 固有の内部形式を示します:" @@ -18512,8 +19010,8 @@ msgstr "" "HGENCODING\n" " Mercurial によるロケール自動検出の上書き。 この設定は、 ユーザ名、\n" -" コミットメッセージ、 タグ名およびブランチ名を内部データ形式に変換する\n" -" 際に使用されます。 この環境変数設定は、 コマンドラインでの --encoding\n" +" コミットログ、 タグ名およびブランチ名の、 記録の際に使用されます。\n" +" この環境変数設定は、 コマンドラインでの --encoding\n" " 使用により、 更に上書きすることができます。" msgid "" @@ -18776,7 +19274,7 @@ msgid "" "Mercurial supports a functional language for selecting a set of\n" -"files. " +"files." msgstr "Mercurial はファイル指定のための問い合わせ言語を提供しています。" msgid "" @@ -18942,8 +19440,8 @@ "Branch\n" " (名詞) [ブランチ] ヘッドではない (= 他に子リビジョンを持つ)\n" " リビジョンを親として、 作成された子リビジョン。\n" -" これは 「位相的 (topological) ブランチ」 と呼ばれます。\n" -" ('Branch, topological' 参照) 位相的ブランチが名前を持つ場合は\n" +" これは 「構造的 (topological) ブランチ」 と呼ばれます。\n" +" ('Branch, topological' 参照) 構造的ブランチが名前を持つ場合は\n" " 「名前付きブランチ」、 名前を持たない場合は「名前無しブランチ」\n" " と呼ばれます。 (※ 訳注: 名前を「持つ/持たない」は、\n" " 「親と異なる名前」を持つ/持たない、 を意味します)\n" @@ -19028,7 +19526,7 @@ " :hg:`branches --active`." msgstr "" "Branch, inactive\n" -" [非アクティブなブランチ] 位相的なヘッドが無い名前付きブランチは、\n" +" [非アクティブなブランチ] 構造的なヘッドが無い名前付きブランチは、\n" " 非アクティブなブランチとみなされます。 例えば default ブランチに、\n" " 機能実装用の名前付きブランチがマージされると、 機能実装用ブランチは、\n" " 非アクティブになります。 :hg:`branches` は、 --active 指定が無い場合、\n" @@ -19072,10 +19570,10 @@ msgstr "" " 名前付きブランチは、 リポジトリの履歴を構成するリビジョン群を、\n" " 重複の無い部分集合へと分割する、 名前空間の一種とも言えます。\n" -" 名前付きブランチは、 必ずしも位相的ブランチである必要はありません。\n" +" 名前付きブランチは、 必ずしも構造的ブランチである必要はありません。\n" " ある名前付きブランチ (default でも可) のヘッドとなるリビジョンを親に、\n" " 別の名前付きブランチを新規生成した場合、 元ブランチに対して、\n" -" 以後の新規リビジョン追加が無ければ、 元ブランチは (位相的な意味で)\n" +" 以後の新規リビジョン追加が無ければ、 元ブランチは (構造的な意味で)\n" " 『分岐』したのではなく、 名前が付いているだけと言えます。" msgid "" @@ -19094,11 +19592,11 @@ " current, possibly default, branch." 
msgstr "" "Branch, topological\n" -" [位相的ブランチ] ヘッドではない (= 他に子リビジョンを持つ)\n" +" [構造的ブランチ] ヘッドではない (= 他に子リビジョンを持つ)\n" " リビジョンを親として、 新規に作成されたリビジョンは、 \n" -" 位相的ブランチとなります。 位相的ブランチに名前が与えられた場合、\n" +" 構造的ブランチとなります。 構造的ブランチに名前が与えられた場合、\n" " それは名前付きブランチとなります。 (※ 訳注: 名前付きブランチは、\n" -" 必ずしも位相的ブランチとは限りません) 名前が与えられない場合は、\n" +" 必ずしも構造的ブランチとは限りません) 名前が与えられない場合は、\n" " 現行の名前付きブランチ (一般的には default) における、\n" " 名前無しブランチとなります。" @@ -19434,14 +19932,14 @@ " A topological head which has not been closed." msgstr "" "Head, repository\n" -" [リポジトリ(の)ヘッド] 閉鎖されていない、 位相的なヘッド。" +" [リポジトリ(の)ヘッド] 閉鎖されていない、 構造的なヘッド。" msgid "" "Head, topological\n" " A changeset with no children in the repository." msgstr "" "Head, topological\n" -" [位相的(な)ヘッド] リポジトリ内に、 子を持たないリビジョン。" +" [構造的(な)ヘッド] リポジトリ内に、 子を持たないリビジョン。" msgid "" "History, immutable\n" @@ -19571,7 +20069,7 @@ " changeset into another." msgstr "" " (動詞) [パッチ(を)当て(る)] あるリビジョン時点の内容に対する、\n" -" パッチ適用による改変操作 (※ 訳注: 暗に新規リビジョンの生成を想定)。" +" パッチ適用による変更操作 (※ 訳注: 暗に新規リビジョンの生成を想定)。" msgid " Example: \"You will need to patch that revision.\"" msgstr " Example: \"そのリビジョンへのパッチ当てが必要です。\"" @@ -19823,7 +20321,7 @@ "対象ファイルの名前を列挙することで、 これらを無視することができます。\n" "``.hgignore`` は明示的に手動で作成しなければなりません。\n" "一般的には、 このファイルも構成管理対象に含めますので、\n" -"更新内容の反映や取り込みによって、 設定内容は他のリポジトリにも伝播します。" +"履歴の反映や取り込みによって、 設定内容は他のリポジトリにも伝播します。" msgid "" "An untracked file is ignored if its path relative to the repository\n" @@ -20089,7 +20587,7 @@ "を使用してください。\n" msgid "To merge files Mercurial uses merge tools." -msgstr "Mercurial での更新内容マージには、 マージツールを使用します。" +msgstr "Mercurial でのファイル内容のマージには、 マージツールを使用します。" msgid "" "A merge tool combines two different versions of a file into a merged\n" @@ -20348,7 +20846,7 @@ msgid "" ".. note::\n" -" Patterns specified in ``.hgignore`` are not rooted. \n" +" Patterns specified in ``.hgignore`` are not rooted.\n" " Please see :hg:`help hgignore` for details." msgstr "" ".. note::\n" @@ -20620,7 +21118,7 @@ msgid " - resynchronize draft changesets relative to a remote repository::" msgstr " - 連携先リポジトリに応じて、リビジョンを draft フェーズ化::" -msgid " hg phase -fd 'outgoing(URL)' " +msgid " hg phase -fd 'outgoing(URL)'" msgstr " hg phase -fd 'outgoing(URL)'" msgid "" @@ -20876,7 +21374,7 @@ msgstr " hg log -r \"branch(default) and 1.5:: and not merge()\"" msgid "- Open branch heads::" -msgstr "- 閉鎖 (close) されていないブランチンのヘッド::" +msgstr "- 閉鎖 (close) されていないブランチのヘッド::" msgid " hg log -r \"head() and not closed()\"" msgstr " hg log -r \"head() and not closed()\"" @@ -20905,11 +21403,9 @@ " 時点の内容に含まれないもの::" msgid "" -" hg log -r \"(keyword(bug) or keyword(issue)) and not ancestors(tagged" -"())\"\n" -msgstr "" -" hg log -r \"(keyword(bug) or keyword(issue)) and not ancestors(tagged" -"())\"\n" +" hg log -r \"(keyword(bug) or keyword(issue)) and not ancestors(tag())\"\n" +msgstr "" +" hg log -r \"(keyword(bug) or keyword(issue)) and not ancestors(tag())\"\n" msgid "" "Subrepositories let you nest external repositories or projects into a\n" @@ -21112,11 +21608,11 @@ msgstr "" ":commit: コミットの実施により、 親リポジトリと配下のサブリポジトリ間の、\n" " 整合性の取れた対応関係が、 (親リポジトリ側に) 記録されます。\n" -" 未コミット改変を持つサブリポジトリは、 コミット動作を中断させます。\n" +" 未コミット変更があるサブリポジトリは、 コミット動作を中断させます。\n" " -S/--subrepos を指定するか、 設定ファイル記述 (:hg:`help config` 参照)\n" " での \"ui.commitsubrepos=True\" 設定により、 コミット実施の際に、\n" " サブリポジトリ中の未コミット変更が、 再帰的にコミットされます。\n" -" 全てのサブリポジトリから、 未コミット改変が無くなった後で、\n" +" 全てのサブリポジトリから、 未コミット変更が無くなった後で、\n" " 各サブリポジトリの状態記録が、 親リポジトリにおいてコミットされます。" msgid "" @@ -21312,8 +21808,110 @@ msgid "List of filters:" msgstr "フィルター一覧(入力と、 それに対する出力):" -msgid ".. filtersmarker\n" -msgstr ".. 
filtersmarker\n" +msgid ".. filtersmarker" +msgstr ".. filtersmarker" + +msgid "" +"Note that a filter is nothing more than a function call, i.e.\n" +"``expr|filter`` is equivalent to ``filter(expr)``." +msgstr "" +"フィルタは関数呼び出しに過ぎません。 例えば、 ``expr|filter`` は\n" +"``filter(expr)`` と等価です。" + +msgid "In addition to filters, there are some basic built-in functions:" +msgstr "フィルタの他に、 以下の様な基本的な組み込み関数があります:" + +msgid "- date(date[, fmt])" +msgstr "- date(date[, fmt])" + +msgid "- fill(text[, width])" +msgstr "- fill(text[, width])" + +msgid "- get(dict, key)" +msgstr "- get(dict, key)" + +msgid "- if(expr, then[, else])" +msgstr "- if(expr, then[, else])" + +msgid "- ifeq(expr, expr, then[, else])" +msgstr "- ifeq(expr, expr, then[, else])" + +msgid "- join(list, sep)" +msgstr "- join(list, sep)" + +msgid "- label(label, expr)" +msgstr "- label(label, expr)" + +msgid "- sub(pat, repl, expr)" +msgstr "- sub(pat, repl, expr)" + +msgid "- rstdoc(text, style)" +msgstr "- rstdoc(text, style)" + +msgid "Also, for any expression that returns a list, there is a list operator:" +msgstr "また、 列挙形式を返す expr に対しては、 以下の様な記述が可能です:" + +msgid "- expr % \"{template}\"" +msgstr "- expr % \"{template}\"" + +msgid "Some sample command line templates:" +msgstr "コマンドラインでのテンプレート指定例:" + +msgid "- Format lists, e.g. files::" +msgstr "- files のような列挙形式の整形::" + +msgid " $ hg log -r 0 --template \"files:\\n{files % ' {file}\\n'}\"" +msgstr " $ hg log -r 0 --template \"files:\\n{files % ' {file}\\n'}\"" + +msgid "- Join the list of files with a \", \"::" +msgstr "- ファイル一覧を \", \" で連結::" + +msgid " $ hg log -r 0 --template \"files: {join(files, ', ')}\\n\"" +msgstr " $ hg log -r 0 --template \"files: {join(files, ', ')}\\n\"" + +msgid "- Format date::" +msgstr "- 日時情報の整形::" + +msgid " $ hg log -r 0 --template \"{date(date, '%Y')}\\n\"" +msgstr " $ hg log -r 0 --template \"{date(date, '%Y')}\\n\"" + +msgid "- Output the description set to a fill-width of 30::" +msgstr "- コミットログの各行を30桁で揃えて出力::" + +msgid " $ hg log -r 0 --template \"{fill(desc, '30')}\"" +msgstr " $ hg log -r 0 --template \"{fill(desc, '30')}\"" + +msgid "- Use a conditional to test for the default branch::" +msgstr "- default ブランチか否かで表示内容を切り替え::" + +msgid "" +" $ hg log -r 0 --template \"{ifeq(branch, 'default', 'on the main " +"branch',\n" +" 'on branch {branch}')}\\n\"" +msgstr "" +" $ hg log -r 0 --template \"{ifeq(branch, 'default', 'on the main " +"branch',\n" +" 'on branch {branch}')}\\n\"" + +msgid "- Append a newline if not empty::" +msgstr "- 空でない場合は改行を追加::" + +msgid " $ hg tip --template \"{if(author, '{author}\\n')}\"" +msgstr " $ hg tip --template \"{if(author, '{author}\\n')}\"" + +msgid "- Label the output for use with the color extension::" +msgstr "- color エクステンション向けに、出力をラベル付け::" + +msgid "" +" $ hg log -r 0 --template \"{label('changeset.{phase}', node|short)}\\n\"" +msgstr "" +" $ hg log -r 0 --template \"{label('changeset.{phase}', node|short)}\\n\"" + +msgid "- Invert the firstline filter, i.e. 
everything but the first line::" +msgstr "- firstline フィルタの逆(一行目以外)::" + +msgid " $ hg log -r 0 --template \"{sub(r'^.*\\n?\\n?', '', desc)}\\n\"\n" +msgstr " $ hg log -r 0 --template \"{sub(r'^.*\\n?\\n?', '', desc)}\\n\"\n" msgid "Valid URLs are of the form::" msgstr "有効な URL 指定は以下の形式です::" @@ -21536,6 +22134,14 @@ msgstr "(マージ結果の commit を忘れずに)\n" #, python-format +msgid "websub: invalid pattern for %s: %s\n" +msgstr "websub: %s のパターンが不正です: %s\n" + +#, python-format +msgid "websub: invalid regexp for %s: %s\n" +msgstr "websub: %s の正規表現が不正です: %s\n" + +#, python-format msgid "config file %s not found!" msgstr "設定ファイル %s が見つかりません!" @@ -21832,12 +22438,13 @@ msgstr "" #, python-format +msgid "push includes divergent changeset: %s!" +msgstr "履歴反映対象に分岐 (divergent) した後継リビジョンが含まれます!: %s" + +#, python-format msgid "updating %s to public failed!\n" msgstr "%s のフェーズの public 化に失敗!\n" -msgid "failed to push some obsolete markers!\n" -msgstr "リビジョンの廃止情報の反映に失敗しました!\n" - #, python-format msgid "%d changesets found\n" msgstr "%d 個のリビジョンがあります\n" @@ -21870,6 +22477,9 @@ msgid "received file revlog group is empty" msgstr "ファイルのリビジョンログが空です" +msgid "received spurious file revlog entry" +msgstr "ファイルのリビジョンログが不正な情報を含んでいます" + #, python-format msgid "missing file data for %s:%s - run hg verify" msgstr "%s:%s のファイルデータが不在です - hg verify を実施してください" @@ -21908,25 +22518,35 @@ msgid "transferred %s in %.1f seconds (%s/sec)\n" msgstr "%s を %.1f 秒で送信しました(%s/秒)\n" +msgid "SMTPS requires Python 2.6 or later" +msgstr "SMTPS の利用には Python 2.6 以降が必要です" + msgid "can't use TLS: Python SSL support not installed" msgstr "TLS を利用できません: Python SSL サポートがインストールされていません" -msgid "(using smtps)\n" -msgstr "(SMTP を使用)\n" - msgid "smtp.host not configured - cannot send mail" msgstr "設定ファイルに smtp.host 指定がありません - メール送信に失敗" #, python-format +msgid "invalid smtp.verifycert configuration: %s" +msgstr "smtp.verifycert 設定が不正です: %s" + +msgid "(using smtps)\n" +msgstr "(smtps を使用中)\n" + +#, python-format msgid "sending mail: smtp host %s, port %s\n" msgstr "メール送信中: SMTP ホスト %s、 ポート番号 %s\n" msgid "(using starttls)\n" -msgstr "(starttls を使用)\n" +msgstr "(starttls を使用中)\n" + +msgid "(verifying remote certificate)\n" +msgstr "(接続先の証明書を検証中)\n" #, python-format msgid "(authenticating to mail server as %s)\n" -msgstr "(%s としてメールサーバの認証中)\n" +msgstr "(メールサーバに %s として認証要求中)\n" #, python-format msgid "sending mail: %s\n" @@ -21975,6 +22595,10 @@ msgstr "差分コンテキストでの行数指定が不正です: %r" #, python-format +msgid "warning: cannot merge flags for %s\n" +msgstr "警告: ファイル %s の属性設定はマージできません\n" + +#, python-format msgid "%s: untracked file differs\n" msgstr "%s: 未登録ファイルに差分あり\n" @@ -21986,29 +22610,12 @@ msgid "case-folding collision between %s and %s" msgstr "ファイル名の文字大小の問題で %s と %s が衝突します" -#, python-format -msgid "" -" conflicting flags for %s\n" -"(n)one, e(x)ec or sym(l)ink?" -msgstr "" -"ファイル %s のビット設定に衝突があります\n" -"どの設定にしますか? 無効:(n)one 実行可能:e(x)ec リンク:sym(l)ink" - -msgid "&None" -msgstr "&None" - -msgid "E&xec" -msgstr "E&xec" - -msgid "Sym&link" -msgstr "Sym&link" - msgid "resolving manifests\n" msgstr "管理ファイル一覧を解決しています\n" #, python-format msgid "" -" local changed %s which remote deleted\n" +"local changed %s which remote deleted\n" "use (c)hanged version or (d)elete?" 
msgstr "" "変更したファイル %s は別リビジョンで登録除外されています\n" @@ -22031,9 +22638,6 @@ msgid "&Deleted" msgstr "&Deleted" -msgid "updating" -msgstr "更新中" - #, python-format msgid "update failed to remove %s: %s!\n" msgstr "%s の削除に失敗: %s!\n" @@ -22042,6 +22646,9 @@ msgid "getting %s\n" msgstr "%s を取得しています\n" +msgid "updating" +msgstr "更新中" + #, python-format msgid "getting %s to %s\n" msgstr "%s から %s に複製中\n" @@ -22120,6 +22727,9 @@ msgid "unexpected old value" msgstr "旧値の指定は想定外です" +msgid "failed to push some obsolete markers!\n" +msgstr "リビジョンの廃止情報の反映に失敗しました!\n" + #, python-format msgid "unexpected token: %s" msgstr "未知の記述: %s" @@ -22306,15 +22916,16 @@ msgstr "%x は互換性のないリビジョンフラグです" #, python-format +msgid "integrity check failed on %s:%d" +msgstr "%s:%d の一貫性チェックに失敗" + +#, python-format msgid "%s not found in the transaction" msgstr "トランザクション中に %s は見つかりませんでした" msgid "consistency error in delta" msgstr "差分情報の不整合" -msgid "unknown delta base" -msgstr "未知の差分ベース" - #, python-format msgid "can't use %s here" msgstr "ここでは %s を使用できません" @@ -22331,19 +22942,21 @@ msgstr "adds にはパターンを指定してください" msgid "" -"``ancestor(single, single)``\n" -" Greatest common ancestor of the two changesets." -msgstr "" -"``ancestor(single, single)``\n" -" 2つのリビジョンに共通な最新の祖先。" - -#. i18n: "ancestor" is a keyword -msgid "ancestor requires two arguments" -msgstr "ancestor の引数は2つです" - -#. i18n: "ancestor" is a keyword -msgid "ancestor arguments must be single revisions" -msgstr "ancestor の引数にはそれぞれ単一リビジョンを指定してください" +"``ancestor(*changeset)``\n" +" Greatest common ancestor of the changesets." +msgstr "" +"``ancestor(*changeset)``\n" +" 指定リビジョン郡に共通な最新の祖先。" + +msgid "" +" Accepts 0 or more changesets.\n" +" Will return empty list when passed no args.\n" +" Greatest common ancestor of a single changeset is that changeset." +msgstr "" +" 任意の数のリビジョンを指定可能です。\n" +" リビジョン指定が無い場合、結果は空となります。\n" +" 1つのリビジョンだけが指定された場合、\n" +" そのリビジョン自身が『共通の祖先』とみなされます。" msgid "" "``ancestors(set)``\n" @@ -22385,7 +22998,7 @@ " - ``good``, ``bad``, ``skip``: 各状態にマークされたリビジョン群\n" " - ``goods``, ``bads`` : good ないし bad と判断されたリビジョン群\n" " - ``range`` : 探索範囲中のリビジョン群 \n" -" - ``pruned`` : 状態が確定したリビジョン群\n" +" - ``pruned`` : goods/bads あるいは skip 相当のリビジョン群\n" " - ``untested`` : 状態が未確定のリビジョン群\n" " - ``ignored`` : 探索対象から除外されたリビジョン群\n" " - ``current`` : 現在の探索対象リビジョン" @@ -22459,6 +23072,19 @@ msgstr "bumped には引数が指定できません" msgid "" +"``bundle()``\n" +" Changesets in the bundle." +msgstr "" +"``bundle()``\n" +" バンドルファイル中のリビジョン群。" + +msgid " Bundle must be specified by the -R option." +msgstr " バンドルファイルは -R オプションで指定される必要があります。" + +msgid "no bundle provided - specify with -R" +msgstr "バンドルファイルが指定されていません。-R を使って指定してください。" + +msgid "" "``children(set)``\n" " Child changesets of changesets in set." msgstr "" @@ -22550,6 +23176,18 @@ " 指定相当とみなします。" msgid "" +"``divergent()``\n" +" Final successors of changesets with an alternative set of final " +"successors." +msgstr "" +"``divergent()``\n" +" 他の最終後継リビジョンが存在する、 最終後継リビジョン群。" + +#. i18n: "divergent" is a keyword +msgid "divergent takes no arguments" +msgstr "divergent には引数が指定できません" + +msgid "" "``draft()``\n" " Changeset in draft phase." msgstr "" @@ -22606,7 +23244,7 @@ " Changesets connected to the specified filelog." msgstr "" "``filelog(pattern)``\n" -" パターンに合致するファイルの改変に関連付けられたリビジョン群。" +" パターンに合致するファイルの変更に関連付けられたリビジョン群。" msgid "" " For performance reasons, ``filelog()`` does not show every changeset\n" @@ -22701,7 +23339,7 @@ " Changesets affecting files matched by pattern." 
msgstr "" "``file(pattern)``\n" -" パターンに合致するファイルに改変を行ったリビジョン群。" +" パターンに合致するファイルに変更を行ったリビジョン群。" msgid "" " For a faster but less accurate result, consider using ``filelog()``\n" @@ -22818,7 +23456,7 @@ " Changesets with more than one child." msgstr "" "``branchpoint()``\n" -" 子リビジョンを1つ以上持つリビジョン群。" +" 子リビジョンを2つ以上持つリビジョン群。" #. i18n: "branchpoint" is a keyword msgid "branchpoint takes no arguments" @@ -22836,7 +23474,7 @@ " Changesets modifying files matched by pattern." msgstr "" "``modifies(pattern)``\n" -" パターンに合致するファイルを改変したリビジョン群。" +" パターンに合致するファイルを変更したリビジョン群。" #. i18n: "modifies" is a keyword msgid "modifies requires a pattern" @@ -23158,10 +23796,6 @@ msgid "the argument to tag must be a string" msgstr "tag には文字列を指定してください" -#, python-format -msgid "no tags exist that match '%s'" -msgstr "'%s' に合致するタグはありません" - msgid "" "``unstable()``\n" " Non-obsolete changesets with obsolete ancestors." @@ -23216,6 +23850,9 @@ msgid "%r cannot be used in a name" msgstr "%r は名前定義に使用できません" +msgid "cannot use an integer as a name" +msgstr "数値だけの名前は使用できません" + #, python-format msgid "ui.portablefilenames value is invalid ('%s')" msgstr "ui.portablefilenames 値が不正です ('%s')" @@ -23244,6 +23881,10 @@ msgid "could not symlink to %r: %s" msgstr "%r に対してシンボリックリンクできません: %s" +#, python-format +msgid "%s not under root '%s'" +msgstr "%s はルートディレクトリ '%s' の配下にはありません" + msgid "empty revision range" msgstr "リビジョンの範囲指定が空です" @@ -23347,6 +23988,10 @@ msgstr "ホスト %s のフィンガープリントが検証できません (Python が古いため)" #, python-format +msgid "certificate for %s can't be verified (Python too old)" +msgstr "%s の証明書は検証できません (Python が古いため)" + +#, python-format msgid "warning: certificate for %s can't be verified (Python too old)\n" msgstr "警告: %s の証明書は検証できません (Python が古いため)\n" @@ -23376,11 +24021,18 @@ "指定してください" #, python-format +msgid "%s certificate with fingerprint %s not verified" +msgstr "%s の証明書のフィンガープリント %s は検証できません" + +msgid "check hostfingerprints or web.cacerts config setting" +msgstr "hostfingerprint または web.cacerts 設定を確認してください" + +#, python-format msgid "" "warning: %s certificate with fingerprint %s not verified (check " "hostfingerprints or web.cacerts config setting)\n" msgstr "" -"警告: %s の証明書 (fingerprint は %s) 検証を省略(設定ファイルの " +"警告: %s の証明書 (フィンガープリントは %s) 検証を省略(設定ファイルの " "hostfingerprints ないし web.cacerts 設定を確認してください)\n" #, python-format @@ -23398,6 +24050,10 @@ msgstr "ファイル名キャッシュに不正なエントリ: %s 行目" #, python-format +msgid "(in subrepo %s)" +msgstr "(サブリポジトリ %s で発生)" + +#, python-format msgid "warning: subrepo spec file %s not found\n" msgstr "警告: サブリポジトリの spec ファイル %s が見つかりません\n" @@ -23459,9 +24115,8 @@ " サブリポジトリ %s のリビジョンに差分が検出されました\n" "どちらを採用しますか? 手元(%s):(l)ocal 連携先(%s):(r)emote\n" -#, python-format -msgid "default path for subrepository %s not found" -msgstr "サブリポジトリ %s の が見つかりません" +msgid "default path for subrepository not found" +msgstr "サブリポジトリの連携先が見つかりません" #, python-format msgid "unknown subrepo type %s" @@ -23488,6 +24143,10 @@ msgstr "サブリポジトリ %s に %s から取り込み中\n" #, python-format +msgid "no changes made to subrepo %s since last push to %s\n" +msgstr "サブリポジトリ %s は、直前の %s への反映以降の変更がありません\n" + +#, python-format msgid "pushing subrepo %s to %s\n" msgstr "サブリポジトリ %s から %s へ反映中\n" @@ -23660,7 +24319,7 @@ " ※ 後述する rfc3339date フィルタの説明も参照してください。" msgid ":localdate: Date. Converts a date to local date." -msgstr ":localdate: 日時情報。 ローカル日時で可読化します。" +msgstr ":localdate: 日時情報。 日時情報をローカルタイムゾーンに変換します。" msgid ":nonempty: Any text. Returns '(none)' if the string is empty." 
msgstr ":nonempty: 文字列。 与えられた文字列が空の場合 '(none)'となります。" @@ -23733,9 +24392,9 @@ " S: skipped, U: untested, I: ignored). Returns single space if `text`\n" " is not a valid bisection status." msgstr "" -":shortbisect: 文字列。 `文字列` を2分探索 (bisect) 状態とみなし、\n" +":shortbisect: 文字列。 `文字列` を二分探索 (bisect) 状態とみなし、\n" " 状態に見合った1文字 (G: good, B: bad, S: skipped, U: untested, I:\n" -" ignored) を返します。 `文字列` が2分探索状態として不適切な場合、\n" +" ignored) を返します。 `文字列` が二分探索状態として不適切な場合、\n" " 空白文字を返します。" msgid ":shortdate: Date. Returns a date like \"2006-09-18\"." @@ -23792,7 +24451,7 @@ msgstr ":author: 文字列。 リビジョンの作者名(記録情報そのまま)。" msgid ":bisect: String. The changeset bisection status." -msgstr ":bisect: 文字列。 当該リビジョンの2分探索状態。" +msgstr ":bisect: 文字列。 当該リビジョンの二分探索状態。" msgid "" ":branch: String. The name of the branch on which the changeset was\n" @@ -23810,7 +24469,8 @@ msgid "" ":bookmarks: List of strings. Any bookmarks associated with the\n" " changeset." -msgstr ":tags: 文字列列挙。 当該リビジョンに付与されたブックマークの一覧。" +msgstr "" +":bookmarks: 文字列列挙。 当該リビジョンに付与されたブックマークの一覧。" msgid ":children: List of strings. The children of the changeset." msgstr ":children: 文字列列挙。 リビジョンの子供。" @@ -23819,7 +24479,7 @@ msgstr ":date: 日時情報。 リビジョンが記録された日時。" msgid ":desc: String. The text of the changeset description." -msgstr ":desc: 文字列。 リビジョンのコミットメッセージ。" +msgstr ":desc: 文字列。 リビジョンのコミットログ。" msgid "" ":diffstat: String. Statistics of changes with the following format:\n" @@ -23944,6 +24604,14 @@ msgid "filter %s expects one argument" msgstr "フィルタ %s は引数が1つ必要です" +#. i18n: "get" is a keyword +msgid "get() expects two arguments" +msgstr "get() の引数は2つです" + +#. i18n: "get" is a keyword +msgid "get() expects a dict as first argument" +msgstr "get() の第1引数は辞書でなければなりません" + #. i18n: "join" is a keyword msgid "join expects one or two arguments" msgstr "join の引数は1つないし2つです" @@ -23960,6 +24628,10 @@ msgid "ifeq expects three or four arguments" msgstr "ifeq は3ないし4の引数が必要です" +#. 
i18n: "rstdoc" is a keyword +msgid "rstdoc expects two arguments" +msgstr "rstdoc の引数は2つです" + msgid "unmatched quotes" msgstr "引用符の対応関係が不正です" @@ -24018,6 +24690,10 @@ msgid "%s.%s is not an integer ('%s')" msgstr "%s.%s の値 ('%s') は整数値ではありません" +#, python-format +msgid "%s.%s is not a byte quantity ('%s')" +msgstr "%s.%s の値 ('%s') はバイト数を表す値ではありません" + msgid "enter a commit username:" msgstr "コミットするユーザ名を入力してください:" @@ -24041,6 +24717,9 @@ msgid "password: " msgstr "パスワード: " +msgid "cannot create new union repository" +msgstr "" + msgid "http authorization required" msgstr "HTTP 認証に失敗" @@ -24085,6 +24764,15 @@ msgid "negative timestamp: %d" msgstr "負のタイムスタンプ: %d" +msgid "now" +msgstr "now" + +msgid "today" +msgstr "today" + +msgid "yesterday" +msgstr "yesterday" + #, python-format msgid "invalid date: %r" msgstr "不正な日付: %r" @@ -24165,6 +24853,58 @@ msgid "file:// URLs can only refer to localhost" msgstr "file:// URL が参照できるのはローカルホストのみです" +#, python-format +msgid "%.0f s" +msgstr "%.0f 秒" + +#, python-format +msgid "%.1f s" +msgstr "%.1f 秒" + +#, python-format +msgid "%.2f s" +msgstr "%.2f 秒" + +#, python-format +msgid "%.3f s" +msgstr "%.3f 秒" + +#, python-format +msgid "%.1f ms" +msgstr "%.1f ミリ秒" + +#, python-format +msgid "%.2f ms" +msgstr "%.2f ミリ秒" + +#, python-format +msgid "%.3f ms" +msgstr "%.3f ミリ秒" + +#, python-format +msgid "%.1f us" +msgstr "%.1f マイクロ秒" + +#, python-format +msgid "%.2f us" +msgstr "%.2f マイクロ秒" + +#, python-format +msgid "%.3f us" +msgstr "%.3f マイクロ秒" + +#, python-format +msgid "%.1f ns" +msgstr "%.1f ナノ秒" + +#, python-format +msgid "%.2f ns" +msgstr "%.2f ナノ秒" + +#, python-format +msgid "%.3f ns" +msgstr "%.3f ナノ秒" + msgid "cannot verify bundle or remote repos" msgstr "ローカルリポジトリ以外は検証できません" @@ -24343,3 +25083,6 @@ msgid "push failed:" msgstr "履歴反映に失敗:" + +msgid "number of cpus must be an integer" +msgstr "CPU 数は数値でなければなりません" diff -r 0890e6fd3e00 -r 838c6b72928d i18n/polib.py --- a/i18n/polib.py Sun May 12 15:35:53 2013 +0400 +++ b/i18n/polib.py Tue May 14 23:04:23 2013 +0400 @@ -277,7 +277,7 @@ an instance of :class:`~polib._BaseEntry`. """ return self.find(entry.msgid, by='msgid') is not None - + def __eq__(self, other): return unicode(self) == unicode(other) @@ -502,7 +502,7 @@ 7*4+entries_len*8, # start of value index 0, keystart # size and offset of hash table # Important: we don't use hash tables - ) + ) output += array.array("i", offsets).tostring() output += ids output += strs @@ -631,7 +631,7 @@ def __init__(self, *args, **kwargs): """ - Constructor, accepts all keywords arguments accepted by + Constructor, accepts all keywords arguments accepted by :class:`~polib._BaseFile` class. """ _BaseFile.__init__(self, *args, **kwargs) @@ -774,7 +774,7 @@ Returns the string representation of the entry. """ return unicode(self).encode(self.encoding) - + def __eq__(self, other): return unicode(self) == unicode(other) @@ -787,7 +787,7 @@ specialchars_count = 0 for c in ['\\', '\n', '\r', '\t', '"']: specialchars_count += field.count(c) - # comparison must take into account fieldname length + one space + # comparison must take into account fieldname length + one space # + 2 quotes (eg. 
msgid "") flength = len(fieldname) + 3 if plural_index: @@ -890,9 +890,9 @@ filelist.append(fpath) filestr = ' '.join(filelist) if wrapwidth > 0 and len(filestr) + 3 > wrapwidth: - # textwrap split words that contain hyphen, this is not - # what we want for filenames, so the dirty hack is to - # temporally replace hyphens with a char that a file cannot + # textwrap split words that contain hyphen, this is not + # what we want for filenames, so the dirty hack is to + # temporally replace hyphens with a char that a file cannot # contain, like "*" ret += [l.replace('*', '-') for l in wrap( filestr.replace('-', '*'), @@ -1099,7 +1099,7 @@ self.add('PP', all, 'PP') self.add('CT', ['ST', 'HE', 'GC', 'OC', 'FL', 'TC', 'PC', 'PM', 'PP', 'MS', 'MX'], 'CT') - self.add('MI', ['ST', 'HE', 'GC', 'OC', 'FL', 'CT', 'TC', 'PC', + self.add('MI', ['ST', 'HE', 'GC', 'OC', 'FL', 'CT', 'TC', 'PC', 'PM', 'PP', 'MS', 'MX'], 'MI') self.add('MP', ['TC', 'GC', 'PC', 'PM', 'PP', 'MI'], 'MP') self.add('MS', ['MI', 'MP', 'TC'], 'MS') @@ -1213,7 +1213,7 @@ # since entries are added when another entry is found, we must add # the last entry here (only if there are lines) self.instance.append(self.current_entry) - # before returning the instance, check if there's metadata and if + # before returning the instance, check if there's metadata and if # so extract it in a dict firstentry = self.instance[0] if firstentry.msgid == '': # metadata found @@ -1512,7 +1512,7 @@ # close opened file self.fhandle.close() return self.instance - + def _build_entry(self, msgid, msgstr=None, msgid_plural=None, msgstr_plural=None): msgctxt_msgid = msgid.split('\x04') @@ -1551,7 +1551,7 @@ drop_whitespace option. """ def __init__(self, *args, **kwargs): - drop_whitespace = kwargs.pop('drop_whitespace', True) + drop_whitespace = kwargs.pop('drop_whitespace', True) textwrap.TextWrapper.__init__(self, *args, **kwargs) self.drop_whitespace = drop_whitespace diff -r 0890e6fd3e00 -r 838c6b72928d i18n/pt_BR.po --- a/i18n/pt_BR.po Sun May 12 15:35:53 2013 +0400 +++ b/i18n/pt_BR.po Tue May 14 23:04:23 2013 +0400 @@ -1,13 +1,13 @@ # Brazilian Portuguese translations for Mercurial # Traduções do Mercurial para português do Brasil # Copyright (C) 2011 Matt Mackall and others -# +# # Translators: # Diego Oliveira # Wagner Bruna -# +# # Translation dictionary: -# +# # archive pacote # branch ramificar (v.), ramo (s.) # bundle bundle @@ -26,12 +26,12 @@ # tip tip (tag), ponta # update atualizar (v.), atualização (s.) # working directory diretório de trabalho -# +# msgid "" msgstr "" "Project-Id-Version: Mercurial\n" "Report-Msgid-Bugs-To: \n" -"POT-Creation-Date: 2013-01-21 11:21-0200\n" +"POT-Creation-Date: 2013-04-20 18:56-0300\n" "PO-Revision-Date: 2011-06-28 09:55+0200\n" "Last-Translator: Wagner Bruna \n" "Language-Team: Brazilian Portuguese\n" @@ -498,6 +498,66 @@ msgid "acl: user \"%s\" not allowed on \"%s\" (changeset \"%s\")" msgstr "acl: o acesso do usuário \"%s\" a \"%s\" não foi permitido (revisão \"%s\")" +msgid "log repository events to a blackbox for debugging" +msgstr "registra eventos do repositório para depuração" + +msgid "" +"Logs event information to .hg/blackbox.log to help debug and diagnose problems.\n" +"The events that get logged can be configured via the blackbox.track config key.\n" +"Examples::" +msgstr "" +"Registra informação de eventos no arquivo .hg/blackbox.log para auxiliar\n" +"depuração e diagnóstico de problemas. 
Os eventos que serão registrados\n" +"podem ser configurados através da opção de configuração blackbox.track.\n" +"Exemplos::" + +msgid "" +" [blackbox]\n" +" track = *" +msgstr "" +" [blackbox]\n" +" track = *" + +msgid "" +" [blackbox]\n" +" track = command, commandfinish, commandexception, exthook, pythonhook" +msgstr "" +" [blackbox]\n" +" track = command, commandfinish, commandexception, exthook, pythonhook" + +msgid "" +" [blackbox]\n" +" track = incoming" +msgstr "" +" [blackbox]\n" +" track = incoming" + +msgid "" +" [blackbox]\n" +" # limit the size of a log file\n" +" maxsize = 1.5 MB\n" +" # rotate up to N log files when the current one gets too big\n" +" maxfiles = 3" +msgstr "" +" [blackbox]\n" +" # limita o tamanho de um arquivo de log\n" +" maxsize = 1.5 MB\n" +" # rotaciona até N arquivos de log quando o atual se tornar grande demais\n" +" maxfiles = 3" + +msgid "the number of events to show" +msgstr "número de eventos a serem mostrados" + +msgid "hg blackbox [OPTION]..." +msgstr "hg blackbox [OPÇÃO]..." + +msgid "" +"view the recent repository events\n" +" " +msgstr "" +"visualiza os eventos recentes do repositório\n" +" " + msgid "hooks for integrating with the Bugzilla bug tracker" msgstr "ganchos para integração com o bug tracker Bugzilla" @@ -1675,6 +1735,15 @@ " suportada apenas pela origem Mercurial." msgid "" +" --closesort try to move closed revisions as close as possible\n" +" to parent branches, only supported by Mercurial\n" +" sources." +msgstr "" +" --closesort tenta mover revisões fechadas o mais próximo\n" +" possível de seus ramos pais, opção suportada\n" +" apenas pela origem Mercurial." + +msgid "" " If ``REVMAP`` isn't given, it will be put in a default location\n" " (``/.hg/shamap`` by default). The ``REVMAP`` is a simple\n" " text file that maps each source commit ID to the destination ID\n" @@ -2190,6 +2259,9 @@ msgid "preserve source changesets order" msgstr "preserva a ordem de revisões da origem" +msgid "try to reorder closed revisions" +msgstr "tenta reordenar revisões fechadas" + msgid "hg convert [OPTION]... SOURCE [DEST [REVMAP]]" msgstr "hg convert [OPÇÃO]... ORIGEM [DESTINO [REVMAP]]" @@ -2380,6 +2452,9 @@ msgid "--sourcesort is not supported by this data source" msgstr "--sourcesort não é suportado para esta origem de dados" +msgid "--closesort is not supported by this data source" +msgstr "--closesort não é suportado para esta origem de dados" + #, python-format msgid "%s does not look like a CVS checkout" msgstr "%s não parece ser uma cópia de trabalho do CVS" @@ -4155,6 +4230,13 @@ msgid "%s: empty changeset" msgstr "%s: revisão vazia" +#, python-format +msgid "comparing with %s\n" +msgstr "comparando com %s\n" + +msgid "no outgoing ancestors" +msgstr "nenhum ancestral a ser enviado" + msgid "Read history edits from the specified file." msgstr "Lê alterações de histórico a partir do arquivo especificado." 
@@ -4189,13 +4271,6 @@ msgid "source has mq patches applied" msgstr "a origem tem patches mq aplicados" -msgid "only one repo argument allowed with --outgoing" -msgstr "apenas um repositório pode ser usado com --outgoing" - -#, python-format -msgid "comparing with %s\n" -msgstr "comparando com %s\n" - msgid "--force only allowed with --outgoing" msgstr "--force só é permitido com --outgoing" @@ -4209,15 +4284,18 @@ msgstr "" "uma edição de histórico já está em progresso, tente --continue ou --abort" +msgid "no revisions allowed with --outgoing" +msgstr "nenhuma revisão é permitida com --outgoing" + +msgid "only one repo argument allowed with --outgoing" +msgstr "apenas um repositório pode ser usado com --outgoing" + msgid "histedit requires exactly one parent revision" msgstr "histedit requer exatamente uma revisão pai" -msgid "nothing to edit\n" -msgstr "nada para editar\n" - -#, python-format -msgid "working directory parent is not a descendant of %s" -msgstr "a revisão do diretório de trabalho não é descendente de %s" +#, python-format +msgid "%s is not an ancestor of working directory" +msgstr "%s não é um ancestral do diretório de trabalho" #, python-format msgid "update to %s or descendant and run \"hg histedit --continue\" again" @@ -4232,25 +4310,33 @@ msgid "cannot edit immutable changeset: %s" msgstr "não é possível editar uma revisão imutável: %s" -msgid "must specify a rule for each changeset once" -msgstr "é necessário especificar uma vez uma regra para cada revisão" - #, python-format msgid "malformed line \"%s\"" msgstr "linha malformada \"%s\"" -msgid "may not use changesets other than the ones listed" -msgstr "não é possível usar revisões além das listadas" - #, python-format msgid "unknown changeset %s listed" msgstr "revisão desconhecida %s listada" +msgid "may not use changesets other than the ones listed" +msgstr "não é possível usar revisões além das listadas" + +#, python-format +msgid "duplicated command for changeset %s" +msgstr "comando duplicado para a revisão %s" + #, python-format msgid "unknown action \"%s\"" msgstr "ação desconhecida \"%s\"" #, python-format +msgid "missing rules for changeset %s" +msgstr "regras faltando para a revisão %s" + +msgid "do you want to use the drop action?" +msgstr "você gostaria de usar a ação drop?" + +#, python-format msgid "histedit: moving bookmarks %s from %s to %s\n" msgstr "histedit: movendo marcadores %s de %s para %s\n" @@ -4867,18 +4953,62 @@ msgid "" "When you pull a changeset that affects largefiles from a remote\n" -"repository, Mercurial behaves as normal. However, when you update to\n" -"such a revision, any largefiles needed by that revision are downloaded\n" -"and cached (if they have never been downloaded before). This means\n" -"that network access may be required to update to changesets you have\n" -"not previously updated to." -msgstr "" -"Ao trazer revisões de um repositório remoto que afetam largefiles, o\n" -"Mercurial se comporta normalmente. No entanto, ao atualizar para tal\n" +"repository, the largefiles for the changeset will by default not be\n" +"pulled down. However, when you update to such a revision, any\n" +"largefiles needed by that revision are downloaded and cached (if\n" +"they have never been downloaded before). One way to pull largefiles\n" +"when pulling is thus to use --update, which will update your working\n" +"copy to the latest pulled revision (and thereby downloading any new\n" +"largefiles)." 
+msgstr "" +"Ao trazer revisões de um repositório remoto que afetam largefiles,\n" +"os largefiles para essas revisões por padrão não serão trazidos.\n" +"No entanto, ao atualizar para tal\n" "revisão, quaisquer largefiles necessárias para tal revisão serão\n" -"baixadas e guardadas em um cache (se elas nunca foram baixadas antes)\n" -". Isto quer dizer que acesso à rede pode ser necessário para atualizar\n" -"para revisões que ainda não estiveram no diretório de trabalho." +"baixadas e guardadas em um cache (se elas nunca foram baixadas antes).\n" +"Para trazer os largefiles em uma operação pull, você pode usar a opção\n" +"--update, que atualizará sua cópia de trabalho para a última revisão\n" +"trazida (e consequentemente baixará qualquer novo largefile)." + +msgid "" +"If you want to pull largefiles you don't need for update yet, then\n" +"you can use pull with the `--lfrev` option or the :hg:`lfpull` command." +msgstr "" +"Se você quiser baixar largefiles que ainda não são necessários para\n" +"um update, use a opção `--lfrev` com o comando pull, ou o comando\n" +":hg:`lfpull`." + +msgid "" +"If you know you are pulling from a non-default location and want to\n" +"download all the largefiles that correspond to the new changesets at\n" +"the same time, then you can pull with `--lfrev \"pulled()\"`." +msgstr "" +"Se você estiver trazendo revisões de um local não-padrão e quiser\n" +"ao mesmo tempo baixar todos os largefiles que correspondam a essas\n" +"revisões, use o comando pull com as opções `--lfrev \"pulled()\"`." + +msgid "" +"If you just want to ensure that you will have the largefiles needed to\n" +"merge or rebase with new heads that you are pulling, then you can pull\n" +"with `--lfrev \"head(pulled())\"` flag to pre-emptively download any largefiles\n" +"that are new in the heads you are pulling." +msgstr "" +"Se você quiser apenas garantir que você terá os largefiles necessários\n" +"para realizar merge ou rebase com as novas cabeças que você estiver\n" +"trazendo, você pode usar `--lfrev \"head(pulled())\"` com o comando\n" +"pull para baixar preemptivamente quaisquer largefiles novos nas cabeças\n" +"trazidas." + +msgid "" +"Keep in mind that network access may now be required to update to\n" +"changesets that you have not previously updated to. The nature of the\n" +"largefiles extension means that updating is no longer guaranteed to\n" +"be a local-only operation." +msgstr "" +"Tenha em mente que acesso à rede pode ser necessário para atualizar\n" +"para revisões para as quais você não atualizou anteriormente. O\n" +"modo de operação da extensão largefiles implica que não é mais\n" +"garantido que a operação update seja apenas local." msgid "" "If you already have large files tracked by Mercurial without the\n" @@ -5003,8 +5133,50 @@ " normais; após essa conversão, o repositório DEST poderá ser\n" " usado normalmente, sem a extensão largefiles." -#, python-format -msgid "error getting %s from %s for %s: %s\n" +msgid "pull largefiles for the specified revisions from the specified source" +msgstr "" +"traz largefiles para as revisões especificadas a partir da origem " +"especificada" + +msgid "" +" Pull largefiles that are referenced from local changesets but missing\n" +" locally, pulling from a remote repository to the local cache." +msgstr "" +" Traz de um repositório remoto e adiciona ao cache local\n" +" largefiles que são referenciadas por revisões locais mas\n" +" não estão presentes localmente." 
+ +msgid "" +" If SOURCE is omitted, the 'default' path will be used.\n" +" See :hg:`help urls` for more information." +msgstr "" +" Se ORIGEM for omitida, o caminho 'default' será usado. Veja\n" +" :hg:`help urls` para mais informações." + +msgid " .. container:: verbose" +msgstr " .. container:: verbose" + +msgid " Some examples:" +msgstr " Alguns exemplos::" + +msgid " - pull largefiles for all branch heads::" +msgstr " - traz largefiles para todas as cabeças de ramo::" + +msgid " hg lfpull -r \"head() and not closed()\"" +msgstr " hg lfpull -r \"head() and not closed()\"" + +msgid " - pull largefiles on the default branch::" +msgstr " - traz largefiles no ramo default::" + +msgid "" +" hg lfpull -r \"branch(default)\"\n" +" " +msgstr "" +" hg lfpull -r \"branch(default)\"\n" +" " + +#, python-format +msgid "error getting id %s from url %s for file %s: %s\n" msgstr "erro ao obter id %s a partir da url %s para o arquivo %s: %s\n" msgid "getting largefiles" @@ -5015,6 +5187,10 @@ msgstr "obtendo %s:%s\n" #, python-format +msgid "%s: largefile %s not available from %s\n" +msgstr "%s: largefile %s não está disponível em %s\n" + +#, python-format msgid "%s: data corruption (expected %s, got %s)\n" msgstr "%s: corrupção de dados (esperado %s, obtido %s)\n" @@ -5095,9 +5271,16 @@ msgid "%d largefiles updated, %d removed\n" msgstr "%d largefiles atualizados, %d removidos\n" -#, python-format -msgid "largefile %s is not in cache and could not be downloaded" -msgstr "o largefile %s não está no cache e não pôde ser baixado" +msgid "no revisions specified" +msgstr "nenhuma revisão especificada" + +#, python-format +msgid "pulling largefiles for revision %s\n" +msgstr "trazendo largefiles para a revisão %s\n" + +#, python-format +msgid "%d largefiles cached\n" +msgstr "%d largefiles adicionados ao cache\n" msgid "minimum size (MB) for files to be converted as largefiles" msgstr "tamanho mínimo (MB) para arquivos serem convertidos em largefiles" @@ -5108,6 +5291,12 @@ msgid "hg lfconvert SOURCE DEST [FILE ...]" msgstr "hg lfconvert ORIGEM DEST [ARQUIVO ...]" +msgid "pull largefiles for these revisions" +msgstr "traz largefiles para estas revisões" + +msgid "-r REV... [-e CMD] [--remotecmd CMD] [SOURCE]" +msgstr "-r REV... 
[-e CMD] [--remotecmd CMD] [ORIGEM]" + #, python-format msgid "largefiles: size must be number (not %s)\n" msgstr "largefiles: o tamanho deve ser um número (e não %s)\n" @@ -5131,24 +5320,12 @@ msgstr "não é possível obter o arquivo localmente" #, python-format -msgid "" -"changeset %s: %s missing\n" -" (looked for hash %s)\n" -msgstr "" -"revisão %s: %s faltando\n" -" (procurou pelo hash %s)\n" - -#, python-format -msgid "" -"changeset %s: %s: contents differ\n" -" (%s:\n" -" expected hash %s,\n" -" but got %s)\n" -msgstr "" -"revisão %s: %s: o conteúdo é diferente\n" -" (%s:\n" -" esperado hash %s,\n" -" mas obteve %s)\n" +msgid "changeset %s: %s references missing %s\n" +msgstr "revisão %s: %s referencia %s faltando\n" + +#, python-format +msgid "changeset %s: %s references corrupted %s\n" +msgstr "revisão %s: %s referencia %s corrompido\n" #, python-format msgid "%s already a largefile\n" @@ -5236,12 +5413,8 @@ msgid "destination largefile already exists" msgstr "largefile de destino já existe" -msgid "caching new largefiles\n" -msgstr "adicionando novos largefiles ao cache\n" - -#, python-format -msgid "%d largefiles cached\n" -msgstr "%d largefiles adicionados ao cache\n" +msgid "pulled() only available in --lfrev" +msgstr "pulled() só está disponível para --lfrev" #, python-format msgid "--all-largefiles is incompatible with non-local destination %s" @@ -5286,6 +5459,10 @@ msgid "largefiles: %d to upload\n" msgstr "largefiles: %d a serem enviados\n" +#, python-format +msgid "largefile %s is not in cache and could not be downloaded" +msgstr "o largefile %s não está no cache e não pôde ser baixado" + msgid "largefile contents do not match hash" msgstr "o conteúdo do largefile não combina com o hash" @@ -5326,14 +5503,6 @@ msgstr "remotestore: não foi possível abrir arquivo %s: %s" #, python-format -msgid "remotestore: largefile %s is invalid" -msgstr "remotestore: largefile %s é inválido" - -#, python-format -msgid "remotestore: largefile %s is missing" -msgstr "remotestore: largefile %s está faltando" - -#, python-format msgid "changeset %s: %s: contents differ\n" msgstr "revisão %s: %s: o conteúdo está diferente\n" @@ -5342,14 +5511,6 @@ msgstr "revisão %s: %s faltando\n" #, python-format -msgid "" -"largefiles: repo method %r appears to have already been wrapped by another " -"extension: largefiles may behave incorrectly\n" -msgstr "" -"largefiles: o método %r do repositório parece já ter sido sobreposto\n" -"por uma outra extensão: largefiles pode se comportar incorretamente\n" - -#, python-format msgid "file \"%s\" is a largefile standin" msgstr "o arquivo \"%s\" é um standin largefile" @@ -5365,14 +5526,15 @@ "adiciona todos os arquivos acima deste tamanho (em megabytes) como " "largefiles (padrão: 10)" -msgid "verify largefiles" -msgstr "verifica largefiles" - -msgid "verify all revisions of largefiles not just current" -msgstr "verifica todas as revisões de largefiles, não apenas a revisão atual" - -msgid "verify largefile contents not just existence" -msgstr "verifica conteúdos de largefiles, não apenas existência" +msgid "verify that all largefiles in current revision exists" +msgstr "verifica se todos os largefiles na revisão atual existem" + +msgid "verify largefiles in all revisions, not just current" +msgstr "" +"verifica largefiles em todas as revisões, e não apenas na revisão atual" + +msgid "verify local largefile contents, not just existence" +msgstr "verifica conteúdos de largefiles locais, e não apenas sua existência" msgid "display largefiles dirstate" msgstr 
"mostra o dirstate de largefiles" @@ -5380,8 +5542,11 @@ msgid "display outgoing largefiles" msgstr "exibe largefiles a serem enviados" -msgid "download all pulled versions of largefiles" -msgstr "baixa todas as versões trazidas dos largefiles" +msgid "download all pulled versions of largefiles (DEPRECATED)" +msgstr "baixa todas as versões trazidas dos largefiles (OBSOLETO)" + +msgid "download largefiles for these revisions" +msgstr "baixa largefiles para estas revisões" msgid "download all versions of all largefiles" msgstr "baixa todas as versões de todos os largefiles" @@ -6913,9 +7078,6 @@ " a um repositório upstream, ou se você pretender enviar essas\n" " mudanças para upstream." -msgid "no revisions specified" -msgstr "nenhuma revisão especificada" - msgid "warning: uncommitted changes in the working directory\n" msgstr "aviso: mudanças não consolidadas no diretório de trabalho\n" @@ -8160,6 +8322,13 @@ " repositórios, que aparecerão como duplicatas das revisões rebaseadas." msgid "" +" In its default configuration, Mercurial will prevent you from\n" +" rebasing published changes. See :hg:`help phases` for details." +msgstr "" +" Em sua configuração padrão, o Mercurial impede que revisões\n" +" públicas sejam rebaseadas. Veja :hg:`help phases` para mais detalhes." + +msgid "" " If you don't specify a destination changeset (``-d/--dest``),\n" " rebase uses the tipmost head of the current named branch as the\n" " destination. (The destination changeset is not modified by\n" @@ -8198,6 +8367,18 @@ " como revisão base." msgid "" +" For advanced usage, a third way is available through the ``--rev``\n" +" option. It allows you to specify an arbitrary set of changesets to\n" +" rebase. Descendants of revs you specify with this option are not\n" +" automatically included in the rebase." +msgstr "" +" Um modo avançado adicional está disponível através da opção\n" +" ``--rev``. Ele possibilita a especificação de um conjunto\n" +" arbitrário de revisões a serem rebaseadas. Os descendentes\n" +" das revisões especificadas dessa maneira não serão\n" +" automaticamente incluídos no rebaseamento." + +msgid "" " By default, rebase recreates the changesets in the source branch\n" " as descendants of dest and then destroys the originals. Use\n" " ``--keep`` to preserve the original source changesets. 
Some\n" @@ -8273,6 +8454,9 @@ msgid "use --keep to keep original changesets" msgstr "use --keep para manter as revisões originais" +msgid "nothing to rebase\n" +msgstr "nada para rebasear\n" + #, python-format msgid "can't rebase immutable changeset %s" msgstr "não é possível rebasear a revisão imutável %s" @@ -8280,9 +8464,6 @@ msgid "see hg help phases for details" msgstr "veja hg help phases para mais detalhes" -msgid "nothing to rebase\n" -msgstr "nada para rebasear\n" - msgid "cannot collapse multiple named branches" msgstr "não é possível colapsar múltiplos ramos nomeados" @@ -8542,6 +8723,10 @@ msgid "cannot partially commit a merge (use \"hg commit\" instead)" msgstr "não é possível consolidar parcialmente uma mesclagem (use \"hg commit\")" +#, python-format +msgid "error parsing patch: %s" +msgstr "erro decodificando patch: %s" + msgid "no changes to record\n" msgstr "nenhuma mudança a ser gravada\n" @@ -8730,6 +8915,10 @@ "mesmo nome.\n" #, python-format +msgid "no '://' in scheme url '%s'" +msgstr "nenhum '://' na url da extensão scheme '%s'" + +#, python-format msgid "custom scheme %s:// conflicts with drive letter %s:\\\n" msgstr "esquema personalizado %s:// conflita com a letra de unidade %s:\\\n" @@ -8795,8 +8984,13 @@ msgid "command to transplant changesets from another branch" msgstr "comando para transplantar revisões de um outro ramo" -msgid "This extension allows you to transplant patches from another branch." -msgstr "Esta extensão lhe permite transplantar patches de outro ramo." +msgid "" +"This extension allows you to transplant changes to another parent revision,\n" +"possibly in another repository. The transplant is done using 'diff' patches." +msgstr "" +"Esta extensão possibilita o transplante de mudanças para outra\n" +"revisão pai, possivelmente em outro repositório. O transplante\n" +"é realizado usando patches no formato 'diff'." msgid "" "Transplanted patches are recorded in .hg/transplant/transplants, as a\n" @@ -8883,14 +9077,14 @@ msgid "no such option\n" msgstr "não existe tal opção\n" -msgid "pull patches from REPO" -msgstr "traz patches do REPOSITÓRIO" - -msgid "pull patches from branch BRANCH" -msgstr "traz patches do ramo RAMO" - -msgid "pull all changesets up to BRANCH" -msgstr "traz todas as revisões até RAMO" +msgid "transplant changesets from REPO" +msgstr "transplanta revisões de REPO" + +msgid "use this source changeset as head" +msgstr "usa esta revisão de origem como cabeça" + +msgid "pull all changesets up to the --branch revisions" +msgstr "traz todas as revisões até as revisões do --branch" msgid "skip over REV" msgstr "omite revisão REV" @@ -8904,8 +9098,8 @@ msgid "append transplant info to log message" msgstr "anexa informações de transplante à mensagem de log" -msgid "continue last transplant session after repair" -msgstr "continua a última sessão de transplante após reparos" +msgid "continue last transplant session after fixing conflicts" +msgstr "continua a última sessão de transplante após a resolução de conflitos" msgid "filter changesets through command" msgstr "filtra revisões através do comando" @@ -8920,15 +9114,25 @@ msgid "" " Selected changesets will be applied on top of the current working\n" " directory with the log of the original changeset. The changesets\n" -" are copied and will thus appear twice in the history. Use the\n" -" rebase extension instead if you want to move a whole branch of\n" -" unpublished changesets." +" are copied and will thus appear twice in the history with different\n" +" identities." 
msgstr "" " As revisões selecionadas serão aplicadas sobre o diretório de\n" " trabalho atual com a descrição da revisão original. As revisões\n" -" são copiadas, e portanto aparecerão duas vezes no histórico. Se\n" -" ao invés disso você quiser mover um ramo inteiro de revisões não\n" -" publicadas, use a extensão rebase." +" são copiadas, e portanto aparecerão duas vezes no histórico com\n" +" identidades diferentes." + +msgid "" +" Consider using the graft command if everything is inside the same\n" +" repository - it will use merges and will usually give a better result.\n" +" Use the rebase extension if the changesets are unpublished and you want\n" +" to move them instead of copying them." +msgstr "" +" Se tudo estiver dentro do mesmo repositório, considere usar o\n" +" comando graft - ele usará mesclagens para combinar as mudanças,\n" +" tipicamente produzindo melhores resultados.\n" +" Use a extensão rebase se as revisões não tiverem sido publicadas\n" +" e você quiser movê-las ao invés de copiá-las." msgid "" " If --log is specified, log messages will have a comment appended\n" @@ -8950,28 +9154,32 @@ " changelog em $1 e o patch em $2." msgid "" -" If --source/-s is specified, selects changesets from the named\n" -" repository. If --branch/-b is specified, selects changesets from\n" -" the branch holding the named revision, up to that revision. If\n" -" --all/-a is specified, all changesets on the branch will be\n" -" transplanted, otherwise you will be prompted to select the\n" -" changesets you want." -msgstr "" -" Se --source/-s for especificado, seleciona revisões do\n" -" repositório pedido. Se --branch/-b for especificado, seleciona\n" -" revisões do ramo que contém a revisão especificada, até essa\n" -" revisão. Se --all/-a for especificado, todas as revisões do\n" -" ramo serão transplantadas, de outro modo as revisões a serem\n" -" transplantadas serão pedidas interativamente." - -msgid "" -" :hg:`transplant --branch REV --all` will transplant the\n" -" selected branch (up to the named revision) onto your current\n" -" working directory." -msgstr "" -" :hg:`transplant --branch REV --all` irá transplantar o ramo\n" -" selecionado (até a revisão pedida) no seu diretório de trabalho\n" -" atual." +" --source/-s specifies another repository to use for selecting changesets,\n" +" just as if it temporarily had been pulled.\n" +" If --branch/-b is specified, these revisions will be used as\n" +" heads when deciding which changsets to transplant, just as if only\n" +" these revisions had been pulled.\n" +" If --all/-a is specified, all the revisions up to the heads specified\n" +" with --branch will be transplanted." +msgstr "" +" --source/-s especifica outro repositório a ser usado para\n" +" selecionar revisões, temporariamente como se tivessem sido\n" +" trazidas usando pull.\n" +" Se --branch/-b for especificado, estas revisões serão usadas\n" +" como cabeças ao decidir quais revisões serão transplantadas,\n" +" como se apenas essas revisões tivessem sido trazidas.\n" +" Se --all/-a for especificado, todas as revisões até as\n" +" cabeças especificadas com --branch serão transplantadas." 
+ +msgid " Example:" +msgstr " Exemplo:" + +msgid "" +" - transplant all changes up to REV on top of your current revision::" +msgstr " - transplanta todas as mudanças até REV sobre sua revisão atual::" + +msgid " hg transplant --branch REV --all" +msgstr " hg transplant --branch REV --all" msgid "" " You can optionally mark selected transplanted changesets as merge\n" @@ -9011,10 +9219,10 @@ " :hg:`transplant --continue/-c` para retomar o transplante.\n" " " -msgid "--continue is incompatible with branch, all or merge" -msgstr "--continue é incompatível com branch, all ou merge" - -msgid "no source URL, branch tag or revision list provided" +msgid "--continue is incompatible with --branch, --all and --merge" +msgstr "--continue é incompatível com --branch, --all e --merge" + +msgid "no source URL, branch revision or revision list provided" msgstr "URL de origem, nome de ramo ou lista de revisões não fornecidas" msgid "--all requires a branch revision" @@ -9318,6 +9526,9 @@ msgid "archiving" msgstr "empacotando" +msgid "no files match the archive pattern" +msgstr "nenhum arquivo corresponde ao padrão pedido" + #, python-format msgid "malformed line in .hg/bookmarks: %r\n" msgstr "linha malformada em .hg/bookmarks: %r\n" @@ -9634,6 +9845,10 @@ msgstr "HG: ramo '%s'" #, python-format +msgid "HG: bookmark '%s'" +msgstr "HG: marcador '%s'" + +#, python-format msgid "HG: subrepo %s" msgstr "HG: subrepo %s" @@ -9655,6 +9870,17 @@ msgid "empty commit message" msgstr "mensagem de consolidação vazia" +msgid "created new head\n" +msgstr "nova cabeça criada\n" + +#, python-format +msgid "reopening closed branch head %d\n" +msgstr "reabrindo cabeça de ramo fechada %d\n" + +#, python-format +msgid "committed changeset %d:%s\n" +msgstr "consolidada a revisão %d:%s\n" + #, python-format msgid "forgetting %s\n" msgstr "esquecendo %s\n" @@ -9844,9 +10070,6 @@ " Se nomes não forem dados, adiciona todos os arquivos ao\n" " repositório." -msgid " .. container:: verbose" -msgstr " .. container:: verbose" - msgid "" " An example showing how new (unknown) files are added\n" " automatically by :hg:`add`::" @@ -10254,9 +10477,6 @@ " abortará a bissecção, e qualquer outro código maior que 0\n" " marcará a revisão como ruim." -msgid " Some examples:" -msgstr " Alguns exemplos::" - msgid "" " - start a bisection with known bad revision 12, and good revision 34::" msgstr "" @@ -10492,6 +10712,15 @@ " " msgid "" +" If you set a bookmark called '@', new clones of the repository will\n" +" have that revision checked out (and the bookmark made active) by\n" +" default." +msgstr "" +" Se você definir um marcador chamado '@', novos clones do\n" +" repositório por padrão atualizarão para essa revisão e\n" +" tornarão esse marcador ativo." + +msgid "" " With -i/--inactive, the new bookmark will not be made the active\n" " bookmark. If -r/--rev is given, the new bookmark will not be made\n" " active even if -i/--inactive is not given. If no NAME is given, the\n" @@ -10508,6 +10737,10 @@ msgstr "nomes de marcadores não podem conter apenas espaços em branco" #, python-format +msgid "moving bookmark '%s' forward from %s\n" +msgstr "movendo marcador '%s' para frente a partir de %s\n" + +#, python-format msgid "bookmark '%s' already exists (use -f to force)" msgstr "o marcador '%s' já existe (use -f para forçar)" @@ -10854,6 +11087,14 @@ " etiquetada mas não incluirá a revisão que define a etiqueta em si." 
msgid "" +" If the source repository has a bookmark called '@' set, that\n" +" revision will be checked out in the new repository by default." +msgstr "" +" Se o repositório de origem possuir um marcador de nome '@'\n" +" definido, por padrão o novo repositório será atualizado para\n" +" essa revisão." + +msgid "" " To check out a particular version, use -u/--update, or\n" " -U/--noupdate to create a clone with no working directory." msgstr "" @@ -10918,8 +11159,9 @@ " d) the changeset specified with -r\n" " e) the tipmost head specified with -b\n" " f) the tipmost head specified with the url#branch source syntax\n" -" g) the tipmost head of the default branch\n" -" h) tip" +" g) the revision marked with the '@' bookmark, if present\n" +" h) the tipmost head of the default branch\n" +" i) tip" msgstr "" " a) null, se for passada a opção -U ou se o repositório de origem não\n" " tiver revisões\n" @@ -10931,8 +11173,9 @@ " e) a cabeça mais recente especificada com -b\n" " f) a cabeça mais recente especificada com a sintaxe de origem\n" " url#ramo\n" -" g) a cabeça mais recente do ramo default\n" -" h) a tip" +" g) a revisão apontada pelo marcador '@', se presente\n" +" h) a cabeça mais recente do ramo default\n" +" i) a tip" msgid " - clone a remote repository to a new directory named hg/::" msgstr "" @@ -11074,18 +11317,12 @@ " Devolve 0 para indicar sucesso, 1 se nada mudou.\n" " " -msgid "can only close branch heads" -msgstr "só pode fechar cabeças de ramo" - msgid "cannot amend recursively" msgstr "não é possível emendar recursivamente" msgid "cannot amend public changesets" msgstr "não é possível emendar revisões públicas" -msgid "cannot amend merge changesets" -msgstr "não é possível emendar revisões de mesclagem" - msgid "cannot amend while merging" msgstr "não é possível emendar durante uma mesclagem" @@ -11099,17 +11336,6 @@ msgid "nothing changed (%d missing files, see 'hg status')\n" msgstr "nada mudou (%d arquivos faltando, veja 'hg status')\n" -msgid "created new head\n" -msgstr "nova cabeça criada\n" - -#, python-format -msgid "reopening closed branch head %d\n" -msgstr "reabrindo cabeça de ramo fechada %d\n" - -#, python-format -msgid "committed changeset %d:%s\n" -msgstr "consolidada a revisão %d:%s\n" - msgid "record a copy that has already occurred" msgstr "grava uma cópia que já ocorreu" @@ -11371,7 +11597,7 @@ msgstr "aplica o filespec nesta revisão" msgid "[-r REV] FILESPEC" -msgstr "[-r REV] FILESPEC" +msgstr "[-r REV] PADRÃOARQ" msgid "parse and apply a fileset specification" msgstr "interpreta e aplica uma especificação de fileset" @@ -11501,6 +11727,14 @@ " desconhecidos e conhecidos.\n" " " +msgid "LABEL..." +msgstr "RÓTULO..." + +msgid "complete \"labels\" - tags, open branch names, bookmark names" +msgstr "" +"completa \"rótulos\" - etiquetas, nomes de ramos abertos, nomes de " +"marcadores" + msgid "markers flag" msgstr "flag de marcação" @@ -11510,6 +11744,43 @@ msgid "create arbitrary obsolete marker" msgstr "cria uma marcação de obsolescência arbitrária" +msgid " With no arguments, displays the list of obsolescence markers." +msgstr " Sem parâmetros, mostra a lista de marcações de obsolescência." + +msgid "complete an entire path" +msgstr "completa um caminho completo" + +msgid "show only normal files" +msgstr "mostra apenas arquivos normais" + +msgid "show only added files" +msgstr "mostra apenas arquivos adicionados" + +msgid "show only removed files" +msgstr "mostra apenas arquivos removidos" + +msgid "FILESPEC..." +msgstr "PADRÃOARQ..." 
+ +msgid "complete part or all of a tracked path" +msgstr "completa todo ou uma parte de um caminho rastreado" + +msgid "" +" This command supports shells that offer path name completion. It\n" +" currently completes only files already known to the dirstate." +msgstr "" +" Este comando suporta shells que ofereçam completação de\n" +" nomes de caminhos. Atualmente, apenas arquivos já presentes\n" +" no dirstate serão completados." + +msgid "" +" Completion extends only to the next path segment unless\n" +" --full is specified, in which case entire paths are used." +msgstr "" +" A completação por padrão é feita apenas para o próximo segmento\n" +" do caminho. Se --full for especificado, são usados caminhos\n" +" completos." + msgid "REPO NAMESPACE [KEY OLD NEW]" msgstr "REPOSITÓRIO NAMESPACE [CHAVE ANTIGO NOVO]" @@ -11551,12 +11822,35 @@ msgid "revision to rebuild to" msgstr "revisão para a qual reconstruir" -msgid "[-r REV] [REV]" -msgstr "[-r REV] [REV]" +msgid "[-r REV]" +msgstr "[-r REV]" msgid "rebuild the dirstate as it would look like for the given revision" msgstr "reconstrói o dirstate como ele pareceria para a revisão dada" +msgid " If no revision is specified the first current parent will be used." +msgstr "" +" Se nenhuma revisão for especificada, será usado o primeiro pai do " +"diretório de trabalho." + +msgid "" +" The dirstate will be set to the files of the given revision.\n" +" The actual working directory content or existing dirstate\n" +" information such as adds or removes is not considered." +msgstr "" +" O dirstate será definido para os arquivos da revisão pedida.\n" +" O conteúdo real do diretório de trabalho e informações existentes\n" +" no dirstate (como adições e remoções) não são considerados." + +msgid "" +" One use of this command is to make the next :hg:`status` invocation\n" +" check the actual file content.\n" +" " +msgstr "" +" Um uso para este comando é fazer com que a próxima execução\n" +" de :hg:`status` verifique o conteúdo real dos arquivos.\n" +" " + msgid "revision to debug" msgstr "revisão a ser depurada" @@ -11627,6 +11921,9 @@ msgid "revision to check" msgstr "revisão para verificar" +msgid "[-r REV] [REV]" +msgstr "[-r REV] [REV]" + msgid "[REV]" msgstr "[REV]" @@ -11642,7 +11939,7 @@ msgid "" " In most cases a changeset A has a single successors set containing a single\n" -" successors (changeset A replaced by A')." +" successor (changeset A replaced by A')." msgstr "" " Na maior parte dos casos uma revisão A possui um único conjunto\n" " de sucessores contendo um único sucessor (revisão A substituída por A')." @@ -11656,7 +11953,7 @@ msgid "" " A changeset that has been \"split\" will have a successors set containing\n" -" more than one successors." +" more than one successor." msgstr "" " Uma revisão que tenha sido \"dividida\" terá um conjunto de\n" " sucessores contendo mais de um sucessor." @@ -11803,16 +12100,19 @@ msgid "revisions to export" msgstr "revisões a serem exportadas" -msgid "[OPTION]... [-o OUTFILESPEC] [-r] REV..." -msgstr "[OPÇÃO]... [-o PADRÃOARQSAÍDA] [-r] REV..." +msgid "[OPTION]... [-o OUTFILESPEC] [-r] [REV]..." +msgstr "[OPÇÃO]... [-o PADRÃOARQSAÍDA] [-r] [REV]..." msgid "dump the header and diffs for one or more changesets" msgstr "exibe o cabeçalho e diffs para uma ou mais revisões" -msgid " Print the changeset header and diffs for one or more revisions." -msgstr "" -" Imprime o cabeçalho de revisão e diffs para uma ou mais\n" -" revisões." 
+msgid "" +" Print the changeset header and diffs for one or more revisions.\n" +" If no revision is given, the parent of the working directory is used." +msgstr "" +" Imprime o conteúdo dos arquivos especificados na revisão pedida.\n" +" Se a revisão não for pedida, será usado o pai do diretório de\n" +" trabalho." msgid "" " The information shown in the changeset header is: author, date,\n" @@ -12286,132 +12586,6 @@ " Devolve 0 para indicar sucesso.\n" " " -#, python-format -msgid "" -"\n" -"aliases: %s\n" -msgstr "" -"\n" -"apelidos: %s\n" - -msgid "(no help text available)" -msgstr "(texto de ajuda não disponível)" - -#, python-format -msgid "shell alias for::" -msgstr "apelido de shell para::" - -#, python-format -msgid " %s" -msgstr " %s" - -#, python-format -msgid "alias for: hg %s" -msgstr "apelido para: hg %s" - -#, python-format -msgid "%s" -msgstr "%s" - -#, python-format -msgid "use \"hg help -e %s\" to show help for the %s extension" -msgstr "use \"hg help -e %s\" para mostrar a ajuda para a extensão %s" - -msgid "options:" -msgstr "opções:" - -msgid "global options:" -msgstr "opções globais:" - -#, python-format -msgid "" -"\n" -"use \"hg help %s\" to show the full help text\n" -msgstr "" -"\n" -"use \"hg help %s\" para mostrar o texto completo de ajuda\n" - -#, python-format -msgid "use \"hg -v help %s\" to show more complete help and the global options" -msgstr "" -"use \"hg -v help %s\" para mostrar um texto de ajuda mais completo e opções " -"globais" - -#, python-format -msgid "use \"hg -v help %s\" to show the global options" -msgstr "use \"hg -v help %s\" para mostrar opções globais" - -msgid "basic commands:" -msgstr "comandos básicos:" - -msgid "list of commands:" -msgstr "lista de comandos:" - -msgid "no commands defined\n" -msgstr "nenhum comando definido\n" - -msgid "enabled extensions:" -msgstr "extensões habilitadas:" - -msgid "" -"\n" -"additional help topics:" -msgstr "" -"\n" -"tópicos adicionais de ajuda:" - -msgid "use \"hg help\" for the full list of commands" -msgstr "use \"hg help\" para a lista completa de comandos" - -msgid "use \"hg help\" for the full list of commands or \"hg -v\" for details" -msgstr "use \"hg help\" para a lista completa de comandos ou \"hg -v\" para detalhes" - -#, python-format -msgid "use \"hg help %s\" to show the full help text" -msgstr "use \"hg help %s\" para mostrar o texto completo de ajuda" - -#, python-format -msgid "use \"hg -v help%s\" to show builtin aliases and global options" -msgstr "" -"use \"hg -v help%s\" para mostrar apelidos pré-definidos de comandos e " -"opções globais" - -#, python-format -msgid "use \"hg help -v %s\" to show more complete help" -msgstr "use \"hg help -v %s\" para mostrar um texto de ajuda mais completo" - -#, python-format -msgid "" -"\n" -"use \"hg help -c %s\" to see help for the %s command\n" -msgstr "" -"\n" -"use \"hg help -c %s\" para mostrar a ajuda do comando %s\n" - -msgid "no help text available" -msgstr "texto de ajuda não disponível" - -#, python-format -msgid "%s extension - %s" -msgstr "extensão %s - %s" - -msgid "use \"hg help extensions\" for information on enabling extensions\n" -msgstr "" -"use \"hg help extensions\" para informações sobre como habilitar extensões\n" - -#, python-format -msgid "'%s' is provided by the following extension:" -msgstr "'%s' é fornecido pela seguinte extensão:" - -msgid "Topics" -msgstr "Tópicos" - -msgid "Extension Commands" -msgstr "Comandos de Extensões" - -msgid "Mercurial Distributed SCM\n" -msgstr "Sistema de controle de versão 
distribuído Mercurial\n" - msgid "identify the specified revision" msgstr "identifica a revisão especificada" @@ -12641,15 +12815,15 @@ msgid "cannot use --similarity with --bypass" msgstr "não se pode usar --similarity com --bypass" -msgid "patch is damaged or loses information" -msgstr "o patch está danificado ou perde informação" - msgid "applied to working directory" msgstr "aplicado no diretório de trabalho" msgid "not a Mercurial patch" msgstr "não é um patch do Mercurial" +msgid "patch is damaged or loses information" +msgstr "o patch está danificado ou perde informação" + #. i18n: refers to a short changeset id #, python-format msgid "created %s" @@ -12949,9 +13123,6 @@ msgid "list files from all revisions" msgstr "lista os arquivos de todas as revisões" -msgid "[-r REV]" -msgstr "[-r REV]" - msgid "output the current or given revision of the project manifest" msgstr "mostra a revisão atual ou pedida do manifesto do projeto" @@ -13354,13 +13525,6 @@ " revisão listada por :hg:`incoming`." msgid "" -" If SOURCE is omitted, the 'default' path will be used.\n" -" See :hg:`help urls` for more information." -msgstr "" -" Se ORIGEM for omitida, o caminho 'default' será usado. Veja\n" -" :hg:`help urls` para mais informações." - -msgid "" " Returns 0 on success, 1 if an update had unresolved files.\n" " " msgstr "" @@ -14068,12 +14232,6 @@ msgid "show only modified files" msgstr "mostra apenas arquivos modificados" -msgid "show only added files" -msgstr "mostra apenas arquivos adicionados" - -msgid "show only removed files" -msgstr "mostra apenas arquivos removidos" - msgid "show only deleted (but tracked) files" msgstr "mostra apenas arquivos removidos (mas rastreados)" @@ -14444,8 +14602,8 @@ msgid "not at a branch head (use -f to force)" msgstr "não está em uma cabeça de ramo (use -f para forçar)" -msgid "null revision specified" -msgstr "foi especificada a revisão nula" +msgid "cannot tag null revision" +msgstr "não é possível adicionar uma etiqueta na revisão nula" msgid "list repository tags" msgstr "lista as etiquetas do repositório" @@ -14615,6 +14773,10 @@ " Se você quiser reverter apenas um arquivo para seu conteúdo em\n" " uma revisão anterior, use :hg:`revert [-r REV] NOME`." +#, python-format +msgid "updating to active bookmark %s\n" +msgstr "atualizando para o marcador ativo %s\n" + msgid "cannot specify both -c/--check and -C/--clean" msgstr "não se pode especificar ao mesmo tempo -c/--check e -C/--clean" @@ -15047,6 +15209,9 @@ msgid "*** failed to import extension %s: %s\n" msgstr "*** falha ao importar a extensão %s: %s\n" +msgid "(no help text available)" +msgstr "(texto de ajuda não disponível)" + #, python-format msgid "warning: error finding commands in %s\n" msgstr "aviso: erro ao localizar comandos em %s\n" @@ -15396,6 +15561,18 @@ msgstr "codificação desconhecida '%s'" msgid "" +"``eol(style)``\n" +" File contains newlines of the given style (dos, unix, mac). Binary\n" +" files are excluded, files with mixed line endings match multiple\n" +" styles." +msgstr "" +"``eol(estilo)``\n" +" O arquivo contém quebras de linha no estilo especificado\n" +" (dos, unix, mac). Arquivos binários são excluídos; arquivos\n" +" contendo quebras de linha misturadas são incluídos em cada\n" +" um dos estilos correspondentes." + +msgid "" "``copied()``\n" " File that is recorded as being copied." 
msgstr "" @@ -15461,6 +15638,9 @@ msgid "bad (implicit)" msgstr "ruim (implicitamente)" +msgid "enabled extensions:" +msgstr "extensões habilitadas:" + msgid "disabled extensions:" msgstr "extensões desabilitadas:" @@ -15531,6 +15711,126 @@ msgid "Working with Phases" msgstr "Trabalhando Com Fases" +#, python-format +msgid "" +"\n" +"aliases: %s\n" +msgstr "" +"\n" +"apelidos: %s\n" + +#, python-format +msgid "shell alias for::" +msgstr "apelido de shell para::" + +#, python-format +msgid " %s" +msgstr " %s" + +#, python-format +msgid "alias for: hg %s" +msgstr "apelido para: hg %s" + +#, python-format +msgid "%s" +msgstr "%s" + +#, python-format +msgid "use \"hg help -e %s\" to show help for the %s extension" +msgstr "use \"hg help -e %s\" para mostrar a ajuda para a extensão %s" + +msgid "options:" +msgstr "opções:" + +msgid "global options:" +msgstr "opções globais:" + +#, python-format +msgid "" +"\n" +"use \"hg help %s\" to show the full help text\n" +msgstr "" +"\n" +"use \"hg help %s\" para mostrar o texto completo de ajuda\n" + +#, python-format +msgid "use \"hg -v help %s\" to show more complete help and the global options" +msgstr "" +"use \"hg -v help %s\" para mostrar um texto de ajuda mais completo e opções " +"globais" + +#, python-format +msgid "use \"hg -v help %s\" to show the global options" +msgstr "use \"hg -v help %s\" para mostrar opções globais" + +msgid "basic commands:" +msgstr "comandos básicos:" + +msgid "list of commands:" +msgstr "lista de comandos:" + +msgid "no commands defined\n" +msgstr "nenhum comando definido\n" + +msgid "" +"\n" +"additional help topics:" +msgstr "" +"\n" +"tópicos adicionais de ajuda:" + +msgid "use \"hg help\" for the full list of commands" +msgstr "use \"hg help\" para a lista completa de comandos" + +msgid "use \"hg help\" for the full list of commands or \"hg -v\" for details" +msgstr "use \"hg help\" para a lista completa de comandos ou \"hg -v\" para detalhes" + +#, python-format +msgid "use \"hg help %s\" to show the full help text" +msgstr "use \"hg help %s\" para mostrar o texto completo de ajuda" + +#, python-format +msgid "use \"hg -v help%s\" to show builtin aliases and global options" +msgstr "" +"use \"hg -v help%s\" para mostrar apelidos pré-definidos de comandos e " +"opções globais" + +#, python-format +msgid "use \"hg help -v %s\" to show more complete help" +msgstr "use \"hg help -v %s\" para mostrar um texto de ajuda mais completo" + +#, python-format +msgid "" +"\n" +"use \"hg help -c %s\" to see help for the %s command\n" +msgstr "" +"\n" +"use \"hg help -c %s\" para mostrar a ajuda do comando %s\n" + +msgid "no help text available" +msgstr "texto de ajuda não disponível" + +#, python-format +msgid "%s extension - %s" +msgstr "extensão %s - %s" + +msgid "use \"hg help extensions\" for information on enabling extensions\n" +msgstr "" +"use \"hg help extensions\" para informações sobre como habilitar extensões\n" + +#, python-format +msgid "'%s' is provided by the following extension:" +msgstr "'%s' é fornecido pela seguinte extensão:" + +msgid "Topics" +msgstr "Tópicos" + +msgid "Extension Commands" +msgstr "Comandos de Extensões" + +msgid "Mercurial Distributed SCM\n" +msgstr "Sistema de controle de versão distribuído Mercurial\n" + msgid "" "The Mercurial system uses a set of configuration files to control\n" "aspects of its behavior." 
@@ -17175,10 +17475,10 @@ msgid "" " [hostfingerprints]\n" -" hg.intevation.org = 38:76:52:7c:87:26:9a:8f:4a:f8:d3:de:08:45:3b:ea:d6:4b:ee:cc" +" hg.intevation.org = 44:ed:af:1f:97:11:b6:01:7a:48:45:fc:10:3c:b7:f9:d4:89:2a:9d" msgstr "" " [hostfingerprints]\n" -" hg.intevation.org = 38:76:52:7c:87:26:9a:8f:4a:f8:d3:de:08:45:3b:ea:d6:4b:ee:cc" +" hg.intevation.org = 44:ed:af:1f:97:11:b6:01:7a:48:45:fc:10:3c:b7:f9:d4:89:2a:9d" msgid "This feature is only supported when using Python 2.6 or later." msgstr "" @@ -17721,6 +18021,42 @@ " são impressos na saída de erros." msgid "" +"``sort``\n" +" Sort field. Specific to the ``ls`` instrumenting profiler.\n" +" One of ``callcount``, ``reccallcount``, ``totaltime`` and\n" +" ``inlinetime``.\n" +" Default: inlinetime." +msgstr "" +"``sort``\n" +" Campo de ordenação. Específico para o profiler de instrumentação.\n" +" Pode ter os valores ``callcount``, ``reccallcount``, ``totaltime``\n" +" e ``inlinetime``.\n" +" Padrão: inlinetime." + +msgid "" +"``limit``\n" +" Number of lines to show. Specific to the ``ls`` instrumenting profiler.\n" +" Default: 30." +msgstr "" +"``limit``\n" +" Número de linhas mostradas. Específico do profiler de instrumentação ``ls``.\n" +" Padrão: 30." + +msgid "" +"``nested``\n" +" Show at most this number of lines of drill-down info after each main entry.\n" +" This can help explain the difference between Total and Inline.\n" +" Specific to the ``ls`` instrumenting profiler.\n" +" Default: 5." +msgstr "" +"``nested``\n" +" Mostra no máximo este número de linhas de informações\n" +" após cada entrada principal. Isto pode\n" +" ajudar a explicar a diferença entre Total e Inline.\n" +" Específico do profiler de instrumentação ``ls``.\n" +" Padrão: 5." + +msgid "" "``revsetalias``\n" "---------------" msgstr "" @@ -17809,10 +18145,12 @@ msgid "" "``port``\n" -" Optional. Port to connect to on mail server. Default: 25." +" Optional. Port to connect to on mail server. Default: 465 (if\n" +" ``tls`` is smtps) or 25 (otherwise)." msgstr "" "``port``\n" -" Opcional. Porta usada pelo servidor de emails. Padrão: 25." +" Opcional. Porta usada pelo servidor de emails. O valor padrão é\n" +" 465 se ``tls`` = smtps, ou 25 caso contrário." msgid "" "``tls``\n" @@ -17824,6 +18162,29 @@ " emails: starttls, smtps ou none. Padrão: none." msgid "" +"``verifycert``\n" +" Optional. Verification for the certificate of mail server, when\n" +" ``tls`` is starttls or smtps. \"strict\", \"loose\" or False. For\n" +" \"strict\" or \"loose\", the certificate is verified as same as the\n" +" verification for HTTPS connections (see ``[hostfingerprints]`` and\n" +" ``[web] cacerts`` also). For \"strict\", sending email is also\n" +" aborted, if there is no configuration for mail server in\n" +" ``[hostfingerprints]`` and ``[web] cacerts``. --insecure for\n" +" :hg:`email` overwrites this as \"loose\". Default: \"strict\"." +msgstr "" +"``verifycert``\n" +" Opcional. Verificação do certificado do servidor de emails, se\n" +" ``tls`` for starttls ou smtps. Pode ser \"strict\", \"loose\" ou False.\n" +" Para \"strict\" ou \"loose\", o certificado será verificado da mesma\n" +" maneira que para conexões HTTPS (veja ``[hostfingerprints]`` e\n" +" ``[web] cacerts``).\n" +" Para \"strict\", o envio de emails também será abortado, se não\n" +" houver configuração para o servidor de emails em\n" +" ``[hostfingerprints]`` e ``[web] cacerts``.\n" +" A opção --insecure de :hg:`email` força o valor \"loose\".\n" +" O padrão é \"strict\"." 
+ +msgid "" "``username``\n" " Optional. User name for authenticating with the SMTP server.\n" " Default: none." @@ -18755,10 +19116,110 @@ msgid "" "``templates``\n" -" Where to find the HTML templates. Default is install path.\n" +" Where to find the HTML templates. Default is install path." msgstr "" "``templates``\n" -" Onde modelos HTML são encontrados. O padrão é o caminho de instalação.\n" +" Onde modelos HTML são encontrados. O padrão é o caminho de instalação." + +msgid "" +"``websub``\n" +"----------" +msgstr "" +"``websub``\n" +"----------" + +msgid "" +"Web substitution filter definition. You can use this section to\n" +"define a set of regular expression substitution patterns which\n" +"let you automatically modify the hgweb server output." +msgstr "" +"Definições de filtros de substituição para web. Nesta seção você\n" +"pode definir um conjunto de substituições por expressões regulares\n" +"para modificação automática da saída do servidor hgweb." + +msgid "" +"The default hgweb templates only apply these substitution patterns\n" +"on the revision description fields. You can apply them anywhere\n" +"you want when you create your own templates by adding calls to the\n" +"\"websub\" filter (usually after calling the \"escape\" filter)." +msgstr "" +"Os modelos padrão do hgweb aplicam estas substituições apenas\n" +"nos campos de descrição de revisões. Ao criar seus próprios\n" +"modelos, você pode aplicá-los em outros locais adicionando\n" +"chamadas ao filtro \"websub\" (tipicamente após chamar o\n" +"filtro \"escape\")." + +msgid "" +"This can be used, for example, to convert issue references to links\n" +"to your issue tracker, or to convert \"markdown-like\" syntax into\n" +"HTML (see the examples below)." +msgstr "" +"Isto pode ser usado por exemplo para converter referências para um\n" +"tíquete em links para o seu rastreador de tíquetes, ou converter\n" +"sintaxe \"markdown\" em HTML (veja exemplos abaixo)." + +msgid "" +"Each entry in this section names a substitution filter.\n" +"The value of each entry defines the substitution expression itself.\n" +"The websub expressions follow the old interhg extension syntax,\n" +"which in turn imitates the Unix sed replacement syntax::" +msgstr "" +"Cada entrada nesta seção dá nome a um filtro de substituição.\n" +"O valor de cada entrada define a própria expressão de substituição.\n" +"As expressões websub usam a sintaxe da antiga extensão interhg,\n" +"que por sua vez imita a sintaxe de substituição do comando sed do Unix::" + +msgid " patternname = s/SEARCH_REGEX/REPLACE_EXPRESSION/[i]" +msgstr " nomedopadrao = s/EXPRESSAO_DE_BUSCA/EXPRESSAO_DE_SUBSTITUICAO/[i]" + +msgid "" +"You can use any separator other than \"/\". The final \"i\" is optional\n" +"and indicates that the search must be case insensitive." +msgstr "" +"Além de \"/\", você pode usar qualquer outro separador. O \"i\"\n" +"final é opcional e indica que a busca não deve ser sensível a\n" +"maiúsculas e minúsculas." + +msgid "Examples::" +msgstr "Exemplos::" + +msgid "" +" [websub]\n" +" issues = s|issue(\\d+)|issue\\1|i\n" +" italic = s/\\b_(\\S+)_\\b/\\1<\\/i>/\n" +" bold = s/\\*\\b(\\S+)\\b\\*/\\1<\\/b>/" +msgstr "" +" [websub]\n" +" issues = s|issue(\\d+)|issue\\1|i\n" +" italic = s/\\b_(\\S+)_\\b/\\1<\\/i>/\n" +" bold = s/\\*\\b(\\S+)\\b\\*/\\1<\\/b>/" + +msgid "" +"``worker``\n" +"----------" +msgstr "" +"``worker``\n" +"----------" + +msgid "" +"Parallel master/worker configuration. 
We currently perform working\n" +"directory updates in parallel on Unix-like systems, which greatly\n" +"helps performance." +msgstr "" +"Configuração master/worker em paralelo. Em sistemas semelhantes ao\n" +"Unix, atualizações do diretório de trabalho são feitas com processos\n" +"paralelos, o que melhora consideravelmente o desempenho." + +msgid "" +"``numcpus``\n" +" Number of CPUs to use for parallel operations. Default is 4 or the\n" +" number of CPUs on the system, whichever is larger. A zero or\n" +" negative value is treated as ``use the default``.\n" +msgstr "" +"``numcpus``\n" +" Número de CPUs a serem usadas para operações em paralelo. O padrão\n" +" é o maior valor entre 4 e o número de CPUs no sistema. Zero ou\n" +" negativo indicam o uso do valor padrão.\n" msgid "Some commands allow the user to specify a date, e.g.:" msgstr "Alguns comandos permitem ao usuário especificar uma data, como:" @@ -18786,7 +19247,10 @@ "- ``2006-12-6``\n" "- ``12-6``\n" "- ``12/6``\n" -"- ``12/6/6`` (Dec 6 2006)" +"- ``12/6/6`` (Dec 6 2006)\n" +"- ``today`` (midnight)\n" +"- ``yesterday`` (midnight)\n" +"- ``now`` - right now" msgstr "" "- ``Wed Dec 6 13:18:29 2006`` (assumido fuso horário local)\n" "- ``Dec 6 13:18 -0600`` (ano atual, defasagem de horário local fornecida)\n" @@ -18800,7 +19264,10 @@ "- ``2006-12-6``\n" "- ``12-6``\n" "- ``12/6``\n" -"- ``12/6/6`` (Dec 6 2006)" +"- ``12/6/6`` (Dec 6 2006)\n" +"- ``today`` (hoje, à meia noite)\n" +"- ``yesterday`` (ontem, à meia noite)\n" +"- ``now`` - neste momento" msgid "Lastly, there is Mercurial's internal format:" msgstr "E por fim, há um formato interno do Mercurial:" @@ -19203,7 +19670,7 @@ msgid "" "Mercurial supports a functional language for selecting a set of\n" -"files. " +"files." msgstr "" "O Mercurial suporta uma linguagem funcional para selecionar um conjunto\n" "de arquivos." @@ -20830,7 +21297,7 @@ msgid "" ".. note::\n" -" Patterns specified in ``.hgignore`` are not rooted. \n" +" Patterns specified in ``.hgignore`` are not rooted.\n" " Please see :hg:`help hgignore` for details." msgstr "" ".. note::\n" @@ -21120,8 +21587,8 @@ " - sincroniza novamente revisões de rascunho relativas a um repositório " "remoto::" -msgid " hg phase -fd 'outgoing(URL)' " -msgstr " hg phase -fd 'outgoing(URL)' " +msgid " hg phase -fd 'outgoing(URL)'" +msgstr " hg phase -fd 'outgoing(URL)'" msgid "" "See :hg:`help phase` for more information on manually manipulating phases.\n" @@ -21422,10 +21889,10 @@ msgid "" " hg log -r \"(keyword(bug) or keyword(issue)) and not " -"ancestors(tagged())\"\n" +"ancestors(tag())\"\n" msgstr "" " hg log -r \"(keyword(bug) or keyword(issue)) and not " -"ancestors(tagged())\"\n" +"ancestors(tag())\"\n" msgid "" "Subrepositories let you nest external repositories or projects into a\n" @@ -21849,8 +22316,109 @@ msgid "List of filters:" msgstr "Lista de filtros:" -msgid ".. filtersmarker\n" -msgstr ".. filtersmarker\n" +msgid ".. filtersmarker" +msgstr ".. filtersmarker" + +msgid "" +"Note that a filter is nothing more than a function call, i.e.\n" +"``expr|filter`` is equivalent to ``filter(expr)``." +msgstr "" +"Note que um filtro é apenas uma chamada de função, ou seja,\n" +"``expr|filtro`` é equivalente a ``filtro(expr)``." 
+ +msgid "In addition to filters, there are some basic built-in functions:" +msgstr "Além de filtros, há algumas funções básicas disponíveis:" + +msgid "- date(date[, fmt])" +msgstr "- date(data[, formato])" + +msgid "- fill(text[, width])" +msgstr "- fill(texto[, comprimento])" + +msgid "- get(dict, key)" +msgstr "- get(dicionário, chave)" + +msgid "- if(expr, then[, else])" +msgstr "- if(expr, então[, senão])" + +msgid "- ifeq(expr, expr, then[, else])" +msgstr "- ifeq(expr, expr, então[, senão])" + +msgid "- join(list, sep)" +msgstr "- join(lista, separador)" + +msgid "- label(label, expr)" +msgstr "- label(label, expr)" + +msgid "- sub(pat, repl, expr)" +msgstr "- sub(padrão, substituição, expr)" + +msgid "- rstdoc(text, style)" +msgstr "- rstdoc(texto, estilo)" + +msgid "" +"Also, for any expression that returns a list, there is a list operator:" +msgstr "" +"Além disso, para cada expressão que devolve uma lista, há um\n" +"operador de lista:" + +msgid "- expr % \"{template}\"" +msgstr "- expr % \"{modelo}\"" + +msgid "Some sample command line templates:" +msgstr "Alguns exemplos de modelos de linha de comando:" + +msgid "- Format lists, e.g. files::" +msgstr "- Formatação de listas, por exemplo arquivos::" + +msgid " $ hg log -r 0 --template \"files:\\n{files % ' {file}\\n'}\"" +msgstr " $ hg log -r 0 --template \"files:\\n{files % ' {file}\\n'}\"" + +msgid "- Join the list of files with a \", \"::" +msgstr "- Juntar a lista de arquivos com \", \"::" + +msgid " $ hg log -r 0 --template \"files: {join(files, ', ')}\\n\"" +msgstr " $ hg log -r 0 --template \"files: {join(files, ', ')}\\n\"" + +msgid "- Format date::" +msgstr "- Formatação de datas::" + +msgid " $ hg log -r 0 --template \"{date(date, '%Y')}\\n\"" +msgstr " $ hg log -r 0 --template \"{date(date, '%Y')}\\n\"" + +msgid "- Output the description set to a fill-width of 30::" +msgstr "- Informar as descrições em um campo de largura 30::" + +msgid " $ hg log -r 0 --template \"{fill(desc, '30')}\"" +msgstr " $ hg log -r 0 --template \"{fill(desc, '30')}\"" + +msgid "- Use a conditional to test for the default branch::" +msgstr "- Usar uma conditional para testar pelo ramo default::" + +msgid "" +" $ hg log -r 0 --template \"{ifeq(branch, 'default', 'on the main branch',\n" +" 'on branch {branch}')}\\n\"" +msgstr "" +" $ hg log -r 0 --template \"{ifeq(branch, 'default', 'on the main branch',\n" +" 'on branch {branch}')}\\n\"" + +msgid "- Append a newline if not empty::" +msgstr "- Anexar uma quebra de linha se não for vazio::" + +msgid " $ hg tip --template \"{if(author, '{author}\\n')}\"" +msgstr " $ hg tip --template \"{if(author, '{author}\\n')}\"" + +msgid "- Label the output for use with the color extension::" +msgstr "- Rotular a saída para uso da extensão color::" + +msgid " $ hg log -r 0 --template \"{label('changeset.{phase}', node|short)}\\n\"" +msgstr " $ hg log -r 0 --template \"{label('changeset.{phase}', node|short)}\\n\"" + +msgid "- Invert the firstline filter, i.e. 
everything but the first line::" +msgstr "- Inverter o filtro firstline, ou seja, tudo menos a primeira linha::" + +msgid " $ hg log -r 0 --template \"{sub(r'^.*\\n?\\n?', '', desc)}\\n\"\n" +msgstr " $ hg log -r 0 --template \"{sub(r'^.*\\n?\\n?', '', desc)}\\n\"\n" msgid "Valid URLs are of the form::" msgstr "URLs válidas são da forma::" @@ -22078,6 +22646,14 @@ msgstr "(mesclagem de ramo, não esqueça de consolidar)\n" #, python-format +msgid "websub: invalid pattern for %s: %s\n" +msgstr "websub: padrão inválido para %s: %s\n" + +#, python-format +msgid "websub: invalid regexp for %s: %s\n" +msgstr "websub: expressão regular inválida para %s: %s\n" + +#, python-format msgid "config file %s not found!" msgstr "arquivo de configuração %s não encontrado!" @@ -22375,6 +22951,12 @@ msgstr "o destino não suporta push" #, python-format +msgid "cannot lock source repo, skipping local %s phase update\n" +msgstr "" +"não é possível travar o repositório de origem, a mudança local para fase " +"'%s' não será feita\n" + +#, python-format msgid "push includes obsolete changeset: %s!" msgstr "push inclui uma revisão obsoleta: %s!" @@ -22394,9 +22976,6 @@ msgid "updating %s to public failed!\n" msgstr "a atualização da fase de %s para pública falhou!\n" -msgid "failed to push some obsolete markers!\n" -msgstr "erro ao enviar algumas marcações de obsolescência!\n" - #, python-format msgid "%d changesets found\n" msgstr "%d revisões encontradas\n" @@ -22470,15 +23049,22 @@ msgid "transferred %s in %.1f seconds (%s/sec)\n" msgstr "transferidos %s em %.1f segundos (%s/s)\n" +msgid "SMTPS requires Python 2.6 or later" +msgstr "SMTPS exige Python 2.6 ou posterior" + msgid "can't use TLS: Python SSL support not installed" msgstr "impossível usar TLS: suporte Python a SSL não instalado" +msgid "smtp.host not configured - cannot send mail" +msgstr "servidor smtp não configurado - impossível enviar e-mail" + +#, python-format +msgid "invalid smtp.verifycert configuration: %s" +msgstr "configuração smtp.verifycert inválida: %s" + msgid "(using smtps)\n" msgstr "(usando smtps)\n" -msgid "smtp.host not configured - cannot send mail" -msgstr "servidor smtp não configurado - impossível enviar e-mail" - #, python-format msgid "sending mail: smtp host %s, port %s\n" msgstr "enviando e-mail: servidor smtp %s, porta %s\n" @@ -22486,6 +23072,9 @@ msgid "(using starttls)\n" msgstr "(usando starttls)\n" +msgid "(verifying remote certificate)\n" +msgstr "(verificando certificado remoto)\n" + #, python-format msgid "(authenticating to mail server as %s)\n" msgstr "(autenticando com o servidor de e-mail como %s)\n" @@ -22562,10 +23151,10 @@ #, python-format msgid "" -" local changed %s which remote deleted\n" +"local changed %s which remote deleted\n" "use (c)hanged version or (d)elete?" msgstr "" -" local alterou %s, que a remota removeu\n" +"local alterou %s, que a remota removeu\n" "use (c) a versão alterada, ou (d) apague?" 
msgid "&Changed" @@ -22585,9 +23174,6 @@ msgid "&Deleted" msgstr "(&D) apagada" -msgid "updating" -msgstr "atualizando" - #, python-format msgid "update failed to remove %s: %s!\n" msgstr "update falhou ao remover %s: %s!\n" @@ -22596,6 +23182,9 @@ msgid "getting %s\n" msgstr "obtendo %s\n" +msgid "updating" +msgstr "atualizando" + #, python-format msgid "getting %s to %s\n" msgstr "obtendo %s para %s\n" @@ -22680,6 +23269,9 @@ msgid "unexpected old value" msgstr "valor antigo inesperado" +msgid "failed to push some obsolete markers!\n" +msgstr "erro ao enviar algumas marcações de obsolescência!\n" + #, python-format msgid "unexpected token: %s" msgstr "token inesperado: %s" @@ -22894,19 +23486,20 @@ msgstr "adds requer um padrão" msgid "" -"``ancestor(single, single)``\n" -" Greatest common ancestor of the two changesets." -msgstr "" -"``ancestor(revisão, revisão)``\n" -" Maior ancestral comum das duas revisões." - -#. i18n: "ancestor" is a keyword -msgid "ancestor requires two arguments" -msgstr "ancestor requer dois argumentos" - -#. i18n: "ancestor" is a keyword -msgid "ancestor arguments must be single revisions" -msgstr "os argumentos de ancestor devem ser revisões únicas" +"``ancestor(*changeset)``\n" +" Greatest common ancestor of the changesets." +msgstr "" +"``ancestor(*revisões)``\n" +" Maior ancestral comum das revisões." + +msgid "" +" Accepts 0 or more changesets.\n" +" Will return empty list when passed no args.\n" +" Greatest common ancestor of a single changeset is that changeset." +msgstr "" +" Aceita 0 ou mais revisões.\n" +" Se não forem passadas revisões, devolve uma lista vazia.\n" +" O maior ancestral comum de uma única revisão é a própria revisão." msgid "" "``ancestors(set)``\n" @@ -23746,10 +24339,6 @@ msgid "the argument to tag must be a string" msgstr "o argumento de tag deve ser uma string" -#, python-format -msgid "no tags exist that match '%s'" -msgstr "não existem etiquetas que correspondem a '%s'" - msgid "" "``unstable()``\n" " Non-obsolete changesets with obsolete ancestors." 
@@ -23803,6 +24392,9 @@ msgid "%r cannot be used in a name" msgstr "\"%s\" não pode ser usado em um nome" +msgid "cannot use an integer as a name" +msgstr "um inteiro não pode ser usado como um nome" + #, python-format msgid "ui.portablefilenames value is invalid ('%s')" msgstr "o valor de ui.portablefilenames é inválido ('%s')" @@ -23944,6 +24536,11 @@ "Python muito antiga)" #, python-format +msgid "certificate for %s can't be verified (Python too old)" +msgstr "" +"o certificado %s não pode ser verificado (versão do Python muito antiga)" + +#, python-format msgid "warning: certificate for %s can't be verified (Python too old)\n" msgstr "" "aviso: certificado %s não pode ser verificado (versão do Python muito " @@ -23975,6 +24572,14 @@ "inseguro" #, python-format +msgid "%s certificate with fingerprint %s not verified" +msgstr "" +"o certificado para o host %s com impressão digital %s não foi verificado" + +msgid "check hostfingerprints or web.cacerts config setting" +msgstr "verifique as configurações hostfingerprints ou web.cacerts" + +#, python-format msgid "" "warning: %s certificate with fingerprint %s not verified (check " "hostfingerprints or web.cacerts config setting)\n" @@ -24092,6 +24697,10 @@ msgstr "trazendo sub-repositório %s de %s\n" #, python-format +msgid "no changes made to subrepo %s since last push to %s\n" +msgstr "nenhuma mudança no sub-repositório %s desde o último push para %s\n" + +#, python-format msgid "pushing subrepo %s to %s\n" msgstr "enviando sub-repositório %s para %s\n" @@ -24582,6 +25191,14 @@ msgid "filter %s expects one argument" msgstr "o filtro %s espera um argumento" +#. i18n: "get" is a keyword +msgid "get() expects two arguments" +msgstr "get() exige dois argumentos" + +#. i18n: "get" is a keyword +msgid "get() expects a dict as first argument" +msgstr "get() exige um dicionário como primeiro argumento" + #. i18n: "join" is a keyword msgid "join expects one or two arguments" msgstr "join exige um ou dois argumentos" @@ -24598,6 +25215,10 @@ msgid "ifeq expects three or four arguments" msgstr "ifeq espera três ou quatro argumentos" +#. 
i18n: "rstdoc" is a keyword +msgid "rstdoc expects two arguments" +msgstr "rstdoc exige dois argumentos" + msgid "unmatched quotes" msgstr "aspas não combinam" @@ -24656,6 +25277,10 @@ msgid "%s.%s is not an integer ('%s')" msgstr "%s.%s não é um inteiro ('%s')" +#, python-format +msgid "%s.%s is not a byte quantity ('%s')" +msgstr "%s.%s não é uma quantidade de bytes ('%s')" + msgid "enter a commit username:" msgstr "entre o nome do usuário para consolidação:" @@ -24679,6 +25304,9 @@ msgid "password: " msgstr "senha: " +msgid "cannot create new union repository" +msgstr "não é possível criar novo repositório de união" + msgid "http authorization required" msgstr "autorização http requerida" @@ -24723,6 +25351,15 @@ msgid "negative timestamp: %d" msgstr "timestamp negativo: %d" +msgid "now" +msgstr "now" + +msgid "today" +msgstr "today" + +msgid "yesterday" +msgstr "yesterday" + #, python-format msgid "invalid date: %r" msgstr "data inválida: %r" @@ -24803,6 +25440,58 @@ msgid "file:// URLs can only refer to localhost" msgstr "URLs file:// só podem se referir a localhost" +#, python-format +msgid "%.0f s" +msgstr "%.0f s" + +#, python-format +msgid "%.1f s" +msgstr "%.1f s" + +#, python-format +msgid "%.2f s" +msgstr "%.2f s" + +#, python-format +msgid "%.3f s" +msgstr "%.3f s" + +#, python-format +msgid "%.1f ms" +msgstr "%.1f ms" + +#, python-format +msgid "%.2f ms" +msgstr "%.2f ms" + +#, python-format +msgid "%.3f ms" +msgstr "%.3f ms" + +#, python-format +msgid "%.1f us" +msgstr "%.1f us" + +#, python-format +msgid "%.2f us" +msgstr "%.2f us" + +#, python-format +msgid "%.3f us" +msgstr "%.3f us" + +#, python-format +msgid "%.1f ns" +msgstr "%.1f ns" + +#, python-format +msgid "%.2f ns" +msgstr "%.2f ns" + +#, python-format +msgid "%.3f ns" +msgstr "%.3f ns" + msgid "cannot verify bundle or remote repos" msgstr "impossível verificar bundle ou repositório remoto" @@ -24981,3 +25670,6 @@ msgid "push failed:" msgstr "o push falhou:" + +msgid "number of cpus must be an integer" +msgstr "o número de cpus deve ser um inteiro" diff -r 0890e6fd3e00 -r 838c6b72928d i18n/ru.po --- a/i18n/ru.po Sun May 12 15:35:53 2013 +0400 +++ b/i18n/ru.po Tue May 14 23:04:23 2013 +0400 @@ -7,15 +7,15 @@ # Файловая структура # # repository — хранилище (не «репозиторий») -# subrepository — подхранилище (не «субрепозиторий») -# store — склад (= место хранения файлов в LargefilesExtension), +# subrepository — подхранилище (не «субрепозиторий») +# store — склад (= место хранения файлов в LargefilesExtension), # (не «хранилище») # directory — каталог (не «директорий», не «директория») # folder — папка # source — источник -# source file — исходный файл +# source file — исходный файл # destination — назначение -# destination file — файл назначения +# destination file — файл назначения # # # Названия основных действий @@ -33,30 +33,30 @@ # checkout — извлечь (напр. из хранилища Subversion) # resolve — уладить (конфликт) # revert — вернуть -# discard — отбросить +# discard — отбросить # -# Слова на тему «удаление» +# Слова на тему «удаление» # # clean — чистый (~ локальный файл, не содержащий изменений по сравнению с хранилищем) -# clean working copy — чистая рабочая копия (= не содержащая изменённых файлов) -# purge — зачистить (напр. удалить неиспользуемые файлы из рабочей копии), +# clean working copy — чистая рабочая копия (= не содержащая изменённых файлов) +# purge — зачистить (напр. 
удалить неиспользуемые файлы из рабочей копии), # (не «удалить», не «очистить») # strip — срезать (~ ревизию и всех её потомков) -# strip revision — срезать ревизию +# strip revision — срезать ревизию # delete — удалить, уничтожить (напр. стереть с диска) -# remove — изъять (из-под контроля версий, не «удалить») +# remove — изъять (из-под контроля версий, не «удалить») # # # Слова на тему «слияние» # # branch — ветка (не «ветвь», не «бранч»), гл. ветвление -# frozen branch — замороженная ветка +# frozen branch — замороженная ветка # named branch — именованная ветка # diff — различия (не «дифф») # merge — слить (сущ. «слияние») # backout — обратить изменения (~ сделанные в ранней ревизии) # rollback — откатить (~ транзакцию) -# cancel — отменить +# cancel — отменить # # Слова на тему «управление изменениями» # @@ -67,10 +67,10 @@ # patch — заплатка, патч, накладывать заплатку, патчить, пропатчить # ??? apply patch — наложить заплатку (не «применить заплатку») # ??? unapply patch — отпороть заплатку -# ??? fold patch(es) — подшить заплатки (~ к самой верхней наложенной заплатке) +# ??? fold patch(es) — подшить заплатки (~ к самой верхней наложенной заплатке) # ??? chunk, hunk — лоскут (= часть заплатки) # ??? shelf — долгий ящик (= место для откладывания изменений) ;) -# shelve — отложить изменения (в долгий ящик) +# shelve — отложить изменения (в долгий ящик) # # # Разное @@ -78,12 +78,12 @@ # amend - исправлять # extension — расширение # option — параметр (не «опция») -# options — параметры (не «настройки») +# options — параметры (не «настройки») # settings — настройки (не «установки») # hook — хук, перехватчик, (не «крючок», не «ловушка», не «уловка», не «зацепка») # hook script — хук, скрипт-перехватчик (не «скрипт ловушки») # иногда слово hook употребляется в том же смысле, что и intercepted action (например, в документации) -# — тогда его следует переводить как «перехватываемое действие/событие» или «точка/момент перехвата» +# — тогда его следует переводить как «перехватываемое действие/событие» или «точка/момент перехвата» # # alias — псевдоним # changeset — набор изменений @@ -99,7 +99,7 @@ # hash хэш # glob глоб, glob, (thg: «маска файла (glob)») # binary бинарный -# basename ? +# basename ? # import импортировать # export экспортировать # rename переименовывать @@ -152,7 +152,7 @@ # - пометить как-то строки, которые не требуют перевода или не будут переводиться, # чтобы можно легко понять, сколько еще осталось сделать. # - в какой форме д.б. глаголы сообщений о текущем действии: -# 1 л ед.ч. - загружаю обновления, +# 1 л ед.ч. - загружаю обновления, # 1 л мн.ч - загружаем обновления, -- Так! // comment by Andrei Polushin # 3 л - загружаются обновления ? # Сюда же относятся сообщения cannot do smth: не могу сделать что-то или что-то невозможно? @@ -161,7 +161,7 @@ # - bisect - можно во многих местах употреблять термин "бисекция" (употребляется в thg) # вместо неуклюжего "метод деления пополам". Это устоявшийся термин. # - в строке должно быть не более 75 символов! -# - переводить ли примеры конфигов? я оставил как есть, чтобы пользователь +# - переводить ли примеры конфигов? я оставил как есть, чтобы пользователь # привыкал к англ, т.к. все настройки Mercurial на англ # - Attention Caution !Danger! Error Hint Important Note Tip Warning! # какая разница? 
@@ -6693,7 +6693,7 @@ msgid "hg qpush [-f] [-l] [-a] [--move] [PATCH | INDEX]" msgstr "hg qpush [-f] [-l] [-a] [--move] [ПАТЧ | ИНДЕКС]" -# MAYBE: поместить (добавить) следующий патч в стек +# MAYBE: поместить (добавить) следующий патч в стек msgid "push the next patch onto the stack" msgstr "протолкнуть следующий патч в стек" diff -r 0890e6fd3e00 -r 838c6b72928d i18n/zh_CN.po --- a/i18n/zh_CN.po Sun May 12 15:35:53 2013 +0400 +++ b/i18n/zh_CN.po Tue May 14 23:04:23 2013 +0400 @@ -1,25 +1,25 @@ # Chinese (simplified) translation for Mercurial # This file is distributed under the same license as Mercurial -# +# # Copyright (C) 2009 the Mercurial team # Dongsheng Song , 2009 -# +# # Update with pot file: # msgmerge --update zh_CN.po hg.pot # msgfmt --statistics -c zh_CN.po -# +# # Please test your translation before commit: # python setup.py build_py -c -d . build_ext -i build_mo # LC_ALL=zh_CN.UTF-8 ./hg -# +# # Please format your translation before commit: # msgcat --width=80 --sort-by-file -o zh_CN_new.po zh_CN.po # mv -f zh_CN_new.po zh_CN.po -# +# # Please remove '#: filename:line' lines before submit to hg: # msgcat --width=80 --no-location -o zh_CN_new.po zh_CN.po # mv -f zh_CN_new.po zh_CN.po -# +# # Dictionary: # blame 追溯 # branch 分支 @@ -42,7 +42,7 @@ # versioned 受版本控制 # working copy 工作副本 # ... -# +# msgid "" msgstr "" "Project-Id-Version: Mercurial 1.3\n" diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/ancestor.py --- a/mercurial/ancestor.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/ancestor.py Tue May 14 23:04:23 2013 +0400 @@ -8,7 +8,129 @@ import heapq, util from node import nullrev -def ancestor(a, b, pfunc): +def ancestors(pfunc, *orignodes): + """ + Returns the common ancestors of a and b that are furthest from a + root (as measured by longest path). + + pfunc must return a list of parent vertices for a given vertex. 
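    A minimal illustrative sketch (the dict of parents below is an
    assumption made for the example, not the real changelog API):
    revisions 1 and 2 both descend from 0, and 3 is a merge of 1 and 2.

        parents = {0: [], 1: [0], 2: [0], 3: [1, 2]}
        pfunc = lambda v: parents[v]
        ancestors(pfunc, 1, 2)     # -> set([0])
        ancestors(pfunc, 3)        # -> set([3]), a lone node is its own gca
        ancestors(pfunc, 1, 2, 3)  # -> set([0])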
+ """ + if not isinstance(orignodes, set): + orignodes = set(orignodes) + if nullrev in orignodes: + return set() + if len(orignodes) <= 1: + return orignodes + + def candidates(nodes): + allseen = (1 << len(nodes)) - 1 + seen = [0] * (max(nodes) + 1) + for i, n in enumerate(nodes): + seen[n] = 1 << i + poison = 1 << (i + 1) + + gca = set() + interesting = left = len(nodes) + nv = len(seen) - 1 + while nv >= 0 and interesting: + v = nv + nv -= 1 + if not seen[v]: + continue + sv = seen[v] + if sv < poison: + interesting -= 1 + if sv == allseen: + gca.add(v) + sv |= poison + if v in nodes: + left -= 1 + if left <= 1: + # history is linear + return set([v]) + if sv < poison: + for p in pfunc(v): + sp = seen[p] + if p == nullrev: + continue + if sp == 0: + seen[p] = sv + interesting += 1 + elif sp != sv: + seen[p] |= sv + else: + for p in pfunc(v): + if p == nullrev: + continue + sp = seen[p] + if sp and sp < poison: + interesting -= 1 + seen[p] = sv + return gca + + def deepest(nodes): + interesting = {} + count = max(nodes) + 1 + depth = [0] * count + seen = [0] * count + mapping = [] + for (i, n) in enumerate(sorted(nodes)): + depth[n] = 1 + b = 1 << i + seen[n] = b + interesting[b] = 1 + mapping.append((b, n)) + nv = count - 1 + while nv >= 0 and len(interesting) > 1: + v = nv + nv -= 1 + dv = depth[v] + if dv == 0: + continue + sv = seen[v] + for p in pfunc(v): + if p == nullrev: + continue + dp = depth[p] + nsp = sp = seen[p] + if dp <= dv: + depth[p] = dv + 1 + if sp != sv: + interesting[sv] += 1 + nsp = seen[p] = sv + if sp: + interesting[sp] -= 1 + if interesting[sp] == 0: + del interesting[sp] + elif dv == dp - 1: + nsp = sp | sv + if nsp == sp: + continue + seen[p] = nsp + interesting.setdefault(nsp, 0) + interesting[nsp] += 1 + interesting[sp] -= 1 + if interesting[sp] == 0: + del interesting[sp] + interesting[sv] -= 1 + if interesting[sv] == 0: + del interesting[sv] + + if len(interesting) != 1: + return [] + + k = 0 + for i in interesting: + k |= i + return set(n for (i, n) in mapping if k & i) + + gca = candidates(orignodes) + + if len(gca) <= 1: + return gca + return deepest(gca) + +def genericancestor(a, b, pfunc): """ Returns the common ancestor of a and b that is furthest from a root (as measured by longest path) or None if no ancestor is @@ -30,7 +152,7 @@ depth = {} while visit: vertex = visit[-1] - pl = pfunc(vertex) + pl = [p for p in pfunc(vertex) if p != nullrev] parentcache[vertex] = pl if not pl: depth[vertex] = 0 diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/archival.py --- a/mercurial/archival.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/archival.py Tue May 14 23:04:23 2013 +0400 @@ -13,6 +13,7 @@ import cStringIO, os, tarfile, time, zipfile import zlib, gzip import struct +import error # from unzip source code: _UNX_IFREG = 0x8000 @@ -288,20 +289,25 @@ files = [f for f in ctx.manifest().keys() if matchfn(f)] else: files = ctx.manifest().keys() - files.sort() total = len(files) - repo.ui.progress(_('archiving'), 0, unit=_('files'), total=total) - for i, f in enumerate(files): - ff = ctx.flags(f) - write(f, 'x' in ff and 0755 or 0644, 'l' in ff, ctx[f].data) - repo.ui.progress(_('archiving'), i + 1, item=f, - unit=_('files'), total=total) - repo.ui.progress(_('archiving'), None) + if total: + files.sort() + repo.ui.progress(_('archiving'), 0, unit=_('files'), total=total) + for i, f in enumerate(files): + ff = ctx.flags(f) + write(f, 'x' in ff and 0755 or 0644, 'l' in ff, ctx[f].data) + repo.ui.progress(_('archiving'), i + 1, item=f, + unit=_('files'), 
total=total) + repo.ui.progress(_('archiving'), None) if subrepos: for subpath in sorted(ctx.substate): sub = ctx.sub(subpath) submatch = matchmod.narrowmatcher(subpath, matchfn) - sub.archive(repo.ui, archiver, prefix, submatch) + total += sub.archive(repo.ui, archiver, prefix, submatch) + + if total == 0: + raise error.Abort(_('no files match the archive pattern')) archiver.done() + return total diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/bdiff.c --- a/mercurial/bdiff.c Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/bdiff.c Tue May 14 23:04:23 2013 +0400 @@ -347,6 +347,11 @@ if (!PyArg_ParseTuple(args, "s#s#:bdiff", &sa, &la, &sb, &lb)) return NULL; + if (la > UINT_MAX || lb > UINT_MAX) { + PyErr_SetString(PyExc_ValueError, "bdiff inputs too large"); + return NULL; + } + _save = PyEval_SaveThread(); an = splitlines(sa, la, &al); bn = splitlines(sb, lb, &bl); @@ -381,18 +386,9 @@ for (h = l.next; h; h = h->next) { if (h->a1 != la || h->b1 != lb) { len = bl[h->b1].l - bl[lb].l; - -#define checkputbe32(__x, __c) \ - if (__x > UINT_MAX) { \ - PyErr_SetString(PyExc_ValueError, \ - "bdiff: value too large for putbe32"); \ - goto nomem; \ - } \ - putbe32((uint32_t)(__x), __c); - - checkputbe32(al[la].l - al->l, rb); - checkputbe32(al[h->a1].l - al->l, rb + 4); - checkputbe32(len, rb + 8); + putbe32((uint32_t)(al[la].l - al->l), rb); + putbe32((uint32_t)(al[h->a1].l - al->l), rb + 4); + putbe32((uint32_t)len, rb + 8); memcpy(rb + 12, bl[lb].l, len); rb += 12 + len; } diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/bookmarks.py --- a/mercurial/bookmarks.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/bookmarks.py Tue May 14 23:04:23 2013 +0400 @@ -134,6 +134,19 @@ finally: wlock.release() +def iscurrent(repo, mark=None, parents=None): + '''Tell whether the current bookmark is also active + + I.e., the bookmark listed in .hg/bookmarks.current also points to a + parent of the working directory. + ''' + if not mark: + mark = repo._bookmarkcurrent + if not parents: + parents = [p.node() for p in repo[None].parents()] + marks = repo._bookmarks + return (mark in marks and marks[mark] in parents) + def updatecurrentbookmark(repo, oldnode, curbranch): try: return update(repo, oldnode, repo.branchtip(curbranch)) @@ -143,23 +156,42 @@ else: raise util.Abort(_("branch %s not found") % curbranch) +def deletedivergent(repo, deletefrom, bm): + '''Delete divergent versions of bm on nodes in deletefrom. 
+ + Return True if at least one bookmark was deleted, False otherwise.''' + deleted = False + marks = repo._bookmarks + divergent = [b for b in marks if b.split('@', 1)[0] == bm.split('@', 1)[0]] + for mark in divergent: + if mark and marks[mark] in deletefrom: + if mark != bm: + del marks[mark] + deleted = True + return deleted + def update(repo, parents, node): + deletefrom = parents marks = repo._bookmarks update = False cur = repo._bookmarkcurrent if not cur: return False - toupdate = [b for b in marks if b.split('@', 1)[0] == cur.split('@', 1)[0]] - for mark in toupdate: - if mark and marks[mark] in parents: - old = repo[marks[mark]] - new = repo[node] - if old.descendant(new) and mark == cur: - marks[cur] = new.node() - update = True - if mark != cur: - del marks[mark] + if marks[cur] in parents: + old = repo[marks[cur]] + new = repo[node] + divs = [repo[b] for b in marks + if b.split('@', 1)[0] == cur.split('@', 1)[0]] + anc = repo.changelog.ancestors([new.rev()]) + deletefrom = [b.node() for b in divs if b.rev() in anc or b == new] + if old.descendant(new): + marks[cur] = new.node() + update = True + + if deletedivergent(repo, deletefrom, cur): + update = True + if update: marks.write() return update @@ -170,9 +202,10 @@ marks = getattr(repo, '_bookmarks', {}) d = {} + hasnode = repo.changelog.hasnode for k, v in marks.iteritems(): # don't expose local divergent bookmarks - if '@' not in k or k.endswith('@'): + if hasnode(v) and ('@' not in k or k.endswith('@')): d[k] = hex(v) return d @@ -193,14 +226,13 @@ finally: w.release() -def updatefromremote(ui, repo, remote, path): +def updatefromremote(ui, repo, remotemarks, path): ui.debug("checking for updated bookmarks\n") - rb = remote.listkeys('bookmarks') changed = False localmarks = repo._bookmarks - for k in sorted(rb): + for k in sorted(remotemarks): if k in localmarks: - nr, nl = rb[k], localmarks[k] + nr, nl = remotemarks[k], localmarks[k] if nr in repo: cr = repo[nr] cl = repo[nl] @@ -229,9 +261,9 @@ localmarks[n] = cr.node() changed = True ui.warn(_("divergent bookmark %s stored as %s\n") % (k, n)) - elif rb[k] in repo: + elif remotemarks[k] in repo: # add remote bookmarks for changes we already have - localmarks[k] = repo[rb[k]].node() + localmarks[k] = repo[remotemarks[k]].node() changed = True ui.status(_("adding remote bookmark %s\n") % k) @@ -265,19 +297,7 @@ # (new != nullrev has been excluded by the previous check) return True elif repo.obsstore: - # We only need this complicated logic if there is obsolescence - # XXX will probably deserve an optimised revset. 
- nm = repo.changelog.nodemap - validdests = set([old]) - plen = -1 - # compute the whole set of successors or descendants - while len(validdests) != plen: - plen = len(validdests) - succs = set(c.node() for c in validdests) - mutable = [c.node() for c in validdests if c.mutable()] - succs.update(obsolete.allsuccessors(repo.obsstore, mutable)) - known = (n for n in succs if n in nm) - validdests = set(repo.set('%ln::', known)) - return new in validdests + return new.node() in obsolete.foreground(repo, [old.node()]) else: + # still an independant clause as it is lazyer (and therefore faster) return old.descendant(new) diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/branchmap.py --- a/mercurial/branchmap.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/branchmap.py Tue May 14 23:04:23 2013 +0400 @@ -95,7 +95,7 @@ def _hashfiltered(self, repo): """build hash of revision filtered in the current cache - Tracking tipnode and tiprev is not enough to ensure validaty of the + Tracking tipnode and tiprev is not enough to ensure validity of the cache as they do not help to distinct cache that ignored various revision bellow tiprev. @@ -114,9 +114,9 @@ return key def validfor(self, repo): - """Is the cache content valide regarding a repo + """Is the cache content valid regarding a repo - - False when cached tipnode are unknown or if we detect a strip. + - False when cached tipnode is unknown or if we detect a strip. - True when cache is up to date or a subset of current repo.""" try: return ((self.tipnode == repo.changelog.node(self.tiprev)) diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/bundlerepo.py --- a/mercurial/bundlerepo.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/bundlerepo.py Tue May 14 23:04:23 2013 +0400 @@ -22,18 +22,15 @@ # How it works: # To retrieve a revision, we need to know the offset of the revision in # the bundle (an unbundle object). We store this offset in the index - # (start). - # - # basemap is indexed with revisions coming from the bundle, and it - # maps to the revision that is the base of the corresponding delta. + # (start). The base of the delta is stored in the base field. # # To differentiate a rev in the bundle from a rev in the revlog, we - # check revision against basemap. + # check revision against repotiprev. opener = scmutil.readonlyvfs(opener) revlog.revlog.__init__(self, opener, indexfile) self.bundle = bundle - self.basemap = {} # mapping rev to delta base rev n = len(self) + self.repotiprev = n - 1 chain = None self.bundlerevs = set() # used by 'bundle()' revset expression while True: @@ -68,9 +65,8 @@ baserev = self.rev(deltabase) # start, size, full unc. 
size, base (unused), link, p1, p2, node - e = (revlog.offset_type(start, 0), size, -1, -1, link, + e = (revlog.offset_type(start, 0), size, -1, baserev, link, self.rev(p1), self.rev(p2), node) - self.basemap[n] = baserev self.index.insert(-1, e) self.nodemap[node] = n self.bundlerevs.add(n) @@ -78,22 +74,22 @@ n += 1 def _chunk(self, rev): - # Warning: in case of bundle, the diff is against self.basemap, - # not against rev - 1 + # Warning: in case of bundle, the diff is against what we stored as + # delta base, not against rev - 1 # XXX: could use some caching - if rev not in self.basemap: + if rev <= self.repotiprev: return revlog.revlog._chunk(self, rev) self.bundle.seek(self.start(rev)) return self.bundle.read(self.length(rev)) def revdiff(self, rev1, rev2): """return or calculate a delta between two revisions""" - if rev1 in self.basemap and rev2 in self.basemap: + if rev1 > self.repotiprev and rev2 > self.repotiprev: # hot path for bundle - revb = self.basemap[rev2] + revb = self.index[rev2][3] if revb == rev1: return self._chunk(rev2) - elif rev1 not in self.basemap and rev2 not in self.basemap: + elif rev1 <= self.repotiprev and rev2 <= self.repotiprev: return revlog.revlog.revdiff(self, rev1, rev2) return mdiff.textdiff(self.revision(self.node(rev1)), @@ -117,12 +113,12 @@ chain = [] iterrev = rev # reconstruct the revision if it is from a changegroup - while iterrev in self.basemap: + while iterrev > self.repotiprev: if self._cache and self._cache[1] == iterrev: text = self._cache[2] break chain.append(iterrev) - iterrev = self.basemap[iterrev] + iterrev = self.index[iterrev][3] if text is None: text = revlog.revlog.revision(self, iterrev) @@ -364,9 +360,15 @@ bundle = None if not localrepo: # use the created uncompressed bundlerepo - localrepo = bundlerepo = bundlerepository(ui, repo.root, fname) + localrepo = bundlerepo = bundlerepository(repo.baseui, repo.root, + fname) # this repo contains local and other now, so filter out local again common = repo.heads() + if localrepo: + # Part of common may be remotely filtered + # So use an unfiltered version + # The discovery process probably need cleanup to avoid that + localrepo = localrepo.unfiltered() csets = localrepo.changelog.findmissing(common, rheads) diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/byterange.py --- a/mercurial/byterange.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/byterange.py Tue May 14 23:04:23 2013 +0400 @@ -238,7 +238,6 @@ unquote, addclosehook, addinfourl import ftplib import socket -import sys import mimetypes import email @@ -320,7 +319,7 @@ headers = email.message_from_string(headers) return addinfourl(fp, headers, req.get_full_url()) except ftplib.all_errors, msg: - raise IOError('ftp error', msg), sys.exc_info()[2] + raise IOError('ftp error', msg) def connect_ftp(self, user, passwd, host, port, dirs): fw = ftpwrapper(user, passwd, host, port, dirs) @@ -350,7 +349,7 @@ try: self.ftp.nlst(file) except ftplib.error_perm, reason: - raise IOError('ftp error', reason), sys.exc_info()[2] + raise IOError('ftp error', reason) # Restore the transfer mode! self.ftp.voidcmd(cmd) # Try to retrieve as a file @@ -364,7 +363,7 @@ fp = RangeableFileObject(fp, (rest,'')) return (fp, retrlen) elif not str(reason).startswith('550'): - raise IOError('ftp error', reason), sys.exc_info()[2] + raise IOError('ftp error', reason) if not conn: # Set transfer mode to ASCII! 
self.ftp.voidcmd('TYPE A') diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/changelog.py --- a/mercurial/changelog.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/changelog.py Tue May 14 23:04:23 2013 +0400 @@ -183,7 +183,7 @@ """filtered version of revlog.rev""" r = super(changelog, self).rev(node) if r in self.filteredrevs: - raise error.LookupError(node, self.indexfile, _('no node')) + raise error.LookupError(hex(node), self.indexfile, _('no node')) return r def node(self, rev): diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/cmdutil.py --- a/mercurial/cmdutil.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/cmdutil.py Tue May 14 23:04:23 2013 +0400 @@ -12,6 +12,7 @@ import match as matchmod import subrepo, context, repair, graphmod, revset, phases, obsolete import changelog +import bookmarks import lock as lockmod def parsealiases(cmd): @@ -169,7 +170,8 @@ inst.args[0]) def makefileobj(repo, pat, node=None, desc=None, total=None, - seqno=None, revwidth=None, mode='wb', pathname=None): + seqno=None, revwidth=None, mode='wb', modemap={}, + pathname=None): writable = mode not in ('r', 'rb') @@ -195,9 +197,11 @@ return pat if util.safehasattr(pat, 'read') and 'r' in mode: return pat - return open(makefilename(repo, pat, node, desc, total, seqno, revwidth, - pathname), - mode) + fn = makefilename(repo, pat, node, desc, total, seqno, revwidth, pathname) + mode = modemap.get(fn, mode) + if mode == 'wb': + modemap[fn] = 'ab' + return open(fn, mode) def openrevlog(repo, cmd, file_, opts): """opens the changelog, manifest, a filelog or a given revlog""" @@ -538,6 +542,7 @@ total = len(revs) revwidth = max([len(str(rev)) for rev in revs]) + filemode = {} def single(rev, seqno, fp): ctx = repo[rev] @@ -553,7 +558,8 @@ desc_lines = ctx.description().rstrip().split('\n') desc = desc_lines[0] #Commit always has a first line. fp = makefileobj(repo, template, node, desc=desc, total=total, - seqno=seqno, revwidth=revwidth, mode='ab') + seqno=seqno, revwidth=revwidth, mode='wb', + modemap=filemode) if fp != template: shouldclose = True if fp and fp != sys.stdout and util.safehasattr(fp, 'name'): @@ -569,6 +575,7 @@ write("# HG changeset patch\n") write("# User %s\n" % ctx.user()) write("# Date %d %d\n" % ctx.date()) + write("# %s\n" % util.datestr(ctx.date())) if branch and branch != 'default': write("# Branch %s\n" % branch) write("# Node ID %s\n" % hex(node)) @@ -1015,8 +1022,6 @@ follow = opts.get('follow') or opts.get('follow_first') - if not len(repo): - return [] if opts.get('rev'): revs = scmutil.revrange(repo, opts.get('rev')) elif follow: @@ -1203,6 +1208,13 @@ if ff.match(x): wanted.discard(x) + # Choose a small initial window if we will probably only visit a + # few commits. + limit = loglimit(opts) + windowsize = 8 + if limit: + windowsize = min(limit, windowsize) + # Now that wanted is correctly initialized, we can iterate over the # revision range, yielding only revisions in wanted. 
def iterate(): @@ -1214,7 +1226,7 @@ def want(rev): return rev in wanted - for i, window in increasingwindows(0, len(revs)): + for i, window in increasingwindows(0, len(revs), windowsize): nrevs = [rev for rev in revs[i:i + window] if want(rev)] for rev in sorted(nrevs): fns = fncache.get(rev) @@ -1576,10 +1588,13 @@ forgot.extend(forget) return bad, forgot -def duplicatecopies(repo, rev, p1): - "Reproduce copies found in the source revision in the dirstate for grafts" - for dst, src in copies.pathcopies(repo[p1], repo[rev]).iteritems(): - repo.dirstate.copy(src, dst) +def duplicatecopies(repo, rev, fromrev): + '''reproduce copies from fromrev to rev in the dirstate''' + for dst, src in copies.pathcopies(repo[fromrev], repo[rev]).iteritems(): + # copies.pathcopies returns backward renames, so dst might not + # actually be in the dirstate + if repo.dirstate[dst] in "nma": + repo.dirstate.copy(src, dst) def commit(ui, repo, commitfunc, pats, opts): '''commit the specified files or all outstanding changes''' @@ -1643,7 +1658,13 @@ # Also update it from the intermediate commit or from the wctx extra.update(ctx.extra()) - files = set(old.files()) + if len(old.parents()) > 1: + # ctx.files() isn't reliable for merges, so fall back to the + # slower repo.status() method + files = set([fn for st in repo.status(base, old)[:3] + for fn in st]) + else: + files = set(old.files()) # Second, we use either the commit we just did, or if there were no # changes the parent of the working directory as the version of the @@ -1708,7 +1729,7 @@ extra['amend_source'] = old.hex() new = context.memctx(repo, - parents=[base.node(), nullid], + parents=[base.node(), old.p2().node()], text=message, files=files, filectxfn=filectxfn, @@ -1769,7 +1790,7 @@ finally: if newid is None: repo.dirstate.invalidate() - lockmod.release(wlock, lock) + lockmod.release(lock, wlock) return newid def commiteditor(repo, ctx, subs): @@ -1793,6 +1814,8 @@ edittext.append(_("HG: branch merge")) if ctx.branch(): edittext.append(_("HG: branch '%s'") % ctx.branch()) + if bookmarks.iscurrent(repo): + edittext.append(_("HG: bookmark '%s'") % repo._bookmarkcurrent) edittext.extend([_("HG: subrepo %s") % s for s in subs]) edittext.extend([_("HG: added %s") % f for f in added]) edittext.extend([_("HG: changed %s") % f for f in modified]) @@ -1812,6 +1835,52 @@ return text +def commitstatus(repo, node, branch, bheads=None, opts={}): + ctx = repo[node] + parents = ctx.parents() + + if (not opts.get('amend') and bheads and node not in bheads and not + [x for x in parents if x.node() in bheads and x.branch() == branch]): + repo.ui.status(_('created new head\n')) + # The message is not printed for initial roots. For the other + # changesets, it is printed in the following situations: + # + # Par column: for the 2 parents with ... + # N: null or no parent + # B: parent is on another named branch + # C: parent is a regular non head changeset + # H: parent was a branch head of the current branch + # Msg column: whether we print "created new head" message + # In the following, it is assumed that there already exists some + # initial branch heads of the current branch, otherwise nothing is + # printed anyway. 
+ # + # Par Msg Comment + # N N y additional topo root + # + # B N y additional branch root + # C N y additional topo head + # H N n usual case + # + # B B y weird additional branch root + # C B y branch merge + # H B n merge with named branch + # + # C C y additional head from merge + # C H n merge with a head + # + # H H n head merge: head count decreases + + if not opts.get('close_branch'): + for r in parents: + if r.closesbranch() and r.branch() == branch: + repo.ui.status(_('reopening closed branch head %d\n') % r) + + if repo.ui.debugflag: + repo.ui.write(_('committed changeset %d:%s\n') % (int(ctx), ctx.hex())) + elif repo.ui.verbose: + repo.ui.write(_('committed changeset %d:%s\n') % (int(ctx), ctx)) + def revert(ui, repo, ctx, parents, *pats, **opts): parent, p2 = parents node = ctx.node() @@ -1986,6 +2055,12 @@ checkout(f) normal(f) + copied = copies.pathcopies(repo[parent], ctx) + + for f in add[0] + undelete[0] + revert[0]: + if f in copied: + repo.dirstate.copy(copied[f], f) + if targetsubs: # Revert the subrepos on the revert list for sub in targetsubs: diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/commands.py --- a/mercurial/commands.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/commands.py Tue May 14 23:04:23 2013 +0400 @@ -7,9 +7,9 @@ from node import hex, bin, nullid, nullrev, short from lock import release -from i18n import _, gettext +from i18n import _ import os, re, difflib, time, tempfile, errno -import hg, scmutil, util, revlog, extensions, copies, error, bookmarks +import hg, scmutil, util, revlog, copies, error, bookmarks import patch, help, encoding, templatekw, discovery import archival, changegroup, cmdutil, hbisect import sshserver, hgweb, hgweb.server, commandserver @@ -456,14 +456,11 @@ wlock = repo.wlock() try: branch = repo.dirstate.branch() + bheads = repo.branchheads(branch) hg.clean(repo, node, show_stats=False) repo.dirstate.setbranch(branch) - revert_opts = opts.copy() - revert_opts['date'] = None - revert_opts['all'] = True - revert_opts['rev'] = hex(parent) - revert_opts['no_backup'] = None - revert(ui, repo, **revert_opts) + rctx = scmutil.revsingle(repo, hex(parent)) + cmdutil.revert(ui, repo, rctx, repo.dirstate.parents()) if not opts.get('merge') and op1 != node: try: ui.setconfig('ui', 'forcemerge', opts.get('tool', '')) @@ -471,13 +468,18 @@ finally: ui.setconfig('ui', 'forcemerge', '') - commit_opts = opts.copy() - commit_opts['addremove'] = False - if not commit_opts['message'] and not commit_opts['logfile']: + e = cmdutil.commiteditor + if not opts['message'] and not opts['logfile']: # we don't translate commit messages - commit_opts['message'] = "Backed out changeset %s" % short(node) - commit_opts['force_editor'] = True - commit(ui, repo, **commit_opts) + opts['message'] = "Backed out changeset %s" % short(node) + e = cmdutil.commitforceeditor + + def commitfunc(ui, repo, message, match, opts): + return repo.commit(message, opts.get('user'), opts.get('date'), + match, editor=e) + newnode = cmdutil.commit(ui, repo, commitfunc, [], opts) + cmdutil.commitstatus(repo, newnode, branch, bheads) + def nice(node): return '%d:%s' % (repo.changelog.rev(node), short(node)) ui.status(_('changeset %s backs out changeset %s\n') % @@ -785,6 +787,10 @@ repositories to support bookmarks. For versions prior to 1.8, this means the bookmarks extension must be enabled. + If you set a bookmark called '@', new clones of the repository will + have that revision checked out (and the bookmark made active) by + default. 
+ With -i/--inactive, the new bookmark will not be made the active bookmark. If -r/--rev is given, the new bookmark will not be made active even if -i/--inactive is not given. If no NAME is given, the @@ -802,8 +808,31 @@ scmutil.checknewlabel(repo, mark, 'bookmark') return mark - def checkconflict(repo, mark, force=False): + def checkconflict(repo, mark, force=False, target=None): if mark in marks and not force: + if target: + if marks[mark] == target and target == cur: + # re-activating a bookmark + return + anc = repo.changelog.ancestors([repo[target].rev()]) + bmctx = repo[marks[mark]] + divs = [repo[b].node() for b in marks + if b.split('@', 1)[0] == mark.split('@', 1)[0]] + + # allow resolving a single divergent bookmark even if moving + # the bookmark across branches when a revision is specified + # that contains a divergent bookmark + if bmctx.rev() not in anc and target in divs: + bookmarks.deletedivergent(repo, [target], mark) + return + + deletefrom = [b for b in divs + if repo[b].rev() in anc or b == target] + bookmarks.deletedivergent(repo, deletefrom, mark) + if bmctx.rev() in anc: + ui.status(_("moving bookmark '%s' forward from %s\n") % + (mark, short(bmctx.node()))) + return raise util.Abort(_("bookmark '%s' already exists " "(use -f to force)") % mark) if ((mark in repo.branchmap() or mark == repo.dirstate.branch()) @@ -846,13 +875,15 @@ if inactive and mark == repo._bookmarkcurrent: bookmarks.setcurrent(repo, None) return - checkconflict(repo, mark, force) + tgt = cur if rev: - marks[mark] = scmutil.revsingle(repo, rev).node() - else: - marks[mark] = cur - if not inactive and cur == marks[mark]: + tgt = scmutil.revsingle(repo, rev).node() + checkconflict(repo, mark, force, tgt) + marks[mark] = tgt + if not inactive and cur == marks[mark] and not rev: bookmarks.setcurrent(repo, mark) + elif cur != tgt and mark == repo._bookmarkcurrent: + bookmarks.setcurrent(repo, None) marks.write() # Same message whether trying to deactivate the current bookmark (-i @@ -869,7 +900,7 @@ else: # show bookmarks for bmark, n in sorted(marks.iteritems()): current = repo._bookmarkcurrent - if bmark == current and n == cur: + if bmark == current: prefix, label = '*', 'bookmarks.current' else: prefix, label = ' ', '' @@ -912,6 +943,9 @@ Returns 0 on success. """ + if label: + label = label.strip() + if not opts.get('clean') and not label: ui.write("%s\n" % repo.dirstate.branch()) return @@ -1062,7 +1096,7 @@ dest = ui.expandpath(dest or 'default-push', dest or 'default') dest, branches = hg.parseurl(dest, opts.get('branch')) other = hg.peer(repo, opts, dest) - revs, checkout = hg.addbranchrevs(repo, other, branches, revs) + revs, checkout = hg.addbranchrevs(repo, repo, branches, revs) heads = revs and map(repo.lookup, revs) or revs outgoing = discovery.findcommonoutgoing(repo, other, onlyheads=heads, @@ -1146,6 +1180,9 @@ tag will include the tagged changeset but not the changeset containing the tag. + If the source repository has a bookmark called '@' set, that + revision will be checked out in the new repository by default. + To check out a particular version, use -u/--update, or -U/--noupdate to create a clone with no working directory. 
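Both help texts above now mention the special '@' bookmark. A small sketch of the resulting default checkout when cloning through the Python API; the paths and the presence of an '@' bookmark in the source repository are assumptions for illustration:

from mercurial import ui as uimod, hg

u = uimod.ui()
# roughly `hg clone src-repo dst-repo`: when src-repo carries a bookmark
# named '@', the new working directory is updated to that revision and the
# bookmark is made active, rather than updating to the tip of default
hg.clone(u, {}, 'src-repo', 'dst-repo')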
@@ -1181,8 +1218,9 @@ d) the changeset specified with -r e) the tipmost head specified with -b f) the tipmost head specified with the url#branch source syntax - g) the tipmost head of the default branch - h) tip + g) the revision marked with the '@' bookmark, if present + h) the tipmost head of the default branch + i) tip Examples: @@ -1277,10 +1315,6 @@ extra = {} if opts.get('close_branch'): - if repo['.'].node() not in repo.branchheads(): - # The topo heads set is included in the branch heads set of the - # current branch, so it's sufficient to test branchheads - raise util.Abort(_('can only close branch heads')) extra['close'] = 1 branch = repo[None].branch() @@ -1293,8 +1327,6 @@ old = repo['.'] if old.phase() == phases.public: raise util.Abort(_('cannot amend public changesets')) - if len(old.parents()) > 1: - raise util.Abort(_('cannot amend merge changesets')) if len(repo[None].parents()) > 1: raise util.Abort(_('cannot amend while merging')) if (not obsolete._enabled) and old.children(): @@ -1353,50 +1385,7 @@ ui.status(_("nothing changed\n")) return 1 - ctx = repo[node] - parents = ctx.parents() - - if (not opts.get('amend') and bheads and node not in bheads and not - [x for x in parents if x.node() in bheads and x.branch() == branch]): - ui.status(_('created new head\n')) - # The message is not printed for initial roots. For the other - # changesets, it is printed in the following situations: - # - # Par column: for the 2 parents with ... - # N: null or no parent - # B: parent is on another named branch - # C: parent is a regular non head changeset - # H: parent was a branch head of the current branch - # Msg column: whether we print "created new head" message - # In the following, it is assumed that there already exists some - # initial branch heads of the current branch, otherwise nothing is - # printed anyway. - # - # Par Msg Comment - # N N y additional topo root - # - # B N y additional branch root - # C N y additional topo head - # H N n usual case - # - # B B y weird additional branch root - # C B y branch merge - # H B n merge with named branch - # - # C C y additional head from merge - # C H n merge with a head - # - # H H n head merge: head count decreases - - if not opts.get('close_branch'): - for r in parents: - if r.closesbranch() and r.branch() == branch: - ui.status(_('reopening closed branch head %d\n') % r) - - if ui.debugflag: - ui.write(_('committed changeset %d:%s\n') % (int(ctx), ctx.hex())) - elif ui.verbose: - ui.write(_('committed changeset %d:%s\n') % (int(ctx), ctx)) + cmdutil.commitstatus(repo, node, branch, bheads, opts) @command('copy|cp', [('A', 'after', None, _('record a copy that has already occurred')), @@ -2093,12 +2082,34 @@ flags = repo.known([bin(s) for s in ids]) ui.write("%s\n" % ("".join([f and "1" or "0" for f in flags]))) +@command('debuglabelcomplete', [], _('LABEL...')) +def debuglabelcomplete(ui, repo, *args): + '''complete "labels" - tags, open branch names, bookmark names''' + + labels = set() + labels.update(t[0] for t in repo.tagslist()) + labels.update(repo._bookmarks.keys()) + for heads in repo.branchmap().itervalues(): + for h in heads: + ctx = repo[h] + if not ctx.closesbranch(): + labels.add(ctx.branch()) + completions = set() + if not args: + args = [''] + for a in args: + completions.update(l for l in labels if l.startswith(a)) + ui.write('\n'.join(sorted(completions))) + ui.write('\n') + @command('debugobsolete', [('', 'flags', 0, _('markers flag')), ] + commitopts2, _('[OBSOLETED [REPLACEMENT] [REPL... 
]')) def debugobsolete(ui, repo, precursor=None, *successors, **opts): - """create arbitrary obsolete marker""" + """create arbitrary obsolete marker + + With no arguments, displays the list of obsolescence markers.""" def parsenodeid(s): try: # We do not use revsingle/revrange functions here to accept @@ -2140,6 +2151,75 @@ sorted(m.metadata().items())))) ui.write('\n') +@command('debugpathcomplete', + [('f', 'full', None, _('complete an entire path')), + ('n', 'normal', None, _('show only normal files')), + ('a', 'added', None, _('show only added files')), + ('r', 'removed', None, _('show only removed files'))], + _('FILESPEC...')) +def debugpathcomplete(ui, repo, *specs, **opts): + '''complete part or all of a tracked path + + This command supports shells that offer path name completion. It + currently completes only files already known to the dirstate. + + Completion extends only to the next path segment unless + --full is specified, in which case entire paths are used.''' + + def complete(path, acceptable): + dirstate = repo.dirstate + spec = os.path.normpath(os.path.join(os.getcwd(), path)) + rootdir = repo.root + os.sep + if spec != repo.root and not spec.startswith(rootdir): + return [], [] + if os.path.isdir(spec): + spec += '/' + spec = spec[len(rootdir):] + fixpaths = os.sep != '/' + if fixpaths: + spec = spec.replace(os.sep, '/') + speclen = len(spec) + fullpaths = opts['full'] + files, dirs = set(), set() + adddir, addfile = dirs.add, files.add + for f, st in dirstate.iteritems(): + if f.startswith(spec) and st[0] in acceptable: + if fixpaths: + f = f.replace('/', os.sep) + if fullpaths: + addfile(f) + continue + s = f.find(os.sep, speclen) + if s >= 0: + adddir(f[:s + 1]) + else: + addfile(f) + return files, dirs + + acceptable = '' + if opts['normal']: + acceptable += 'nm' + if opts['added']: + acceptable += 'a' + if opts['removed']: + acceptable += 'r' + cwd = repo.getcwd() + if not specs: + specs = ['.'] + + files, dirs = set(), set() + for spec in specs: + f, d = complete(spec, acceptable or 'nmar') + files.update(f) + dirs.update(d) + if not files and len(dirs) == 1: + # force the shell to consider a completion that matches one + # directory and zero files to be ambiguous + dirs.add(iter(dirs).next() + '.') + files.update(dirs) + ui.write('\n'.join(repo.pathto(p, cwd) for p in sorted(files))) + ui.write('\n') + @command('debugpushkey', [], _('REPO NAMESPACE [KEY OLD NEW]')) def debugpushkey(ui, repopath, namespace, *keyinfo, **opts): '''access the pushkey key/value protocol @@ -2182,11 +2262,21 @@ (abs(pa._depth - pb._depth), pvec._hamming(pa._vec, pb._vec), pa.distance(pb), rel)) -@command('debugrebuildstate', +@command('debugrebuilddirstate|debugrebuildstate', [('r', 'rev', '', _('revision to rebuild to'), _('REV'))], - _('[-r REV] [REV]')) -def debugrebuildstate(ui, repo, rev="tip"): - """rebuild the dirstate as it would look like for the given revision""" + _('[-r REV]')) +def debugrebuilddirstate(ui, repo, rev): + """rebuild the dirstate as it would look like for the given revision + + If no revision is specified the first current parent will be used. + + The dirstate will be set to the files of the given revision. + The actual working directory content or existing dirstate + information such as adds or removes is not considered. + + One use of this command is to make the next :hg:`status` invocation + check the actual file content. 
+ """ ctx = scmutil.revsingle(repo, rev) wlock = repo.wlock() try: @@ -2420,7 +2510,7 @@ finally: wlock.release() -@command('debugstate', +@command('debugdirstate|debugstate', [('', 'nodates', None, _('do not display the saved mtime')), ('', 'datesort', None, _('sort by saved mtime'))], _('[OPTION]...')) @@ -2645,11 +2735,12 @@ ('', 'switch-parent', None, _('diff against the second parent')), ('r', 'rev', [], _('revisions to export'), _('REV')), ] + diffopts, - _('[OPTION]... [-o OUTFILESPEC] [-r] REV...')) + _('[OPTION]... [-o OUTFILESPEC] [-r] [REV]...')) def export(ui, repo, *changesets, **opts): """dump the header and diffs for one or more changesets Print the changeset header and diffs for one or more revisions. + If no revision is given, the parent of the working directory is used. The information shown in the changeset header is: author, date, branch name (if non-default), changeset hash, parent(s) and commit @@ -2705,6 +2796,8 @@ Returns 0 on success. """ changesets += tuple(opts.get('rev', [])) + if not changesets: + changesets = ['.'] revs = scmutil.revrange(repo, changesets) if not revs: raise util.Abort(_("export requires at least one changeset")) @@ -2851,9 +2944,13 @@ return -1 # check for ancestors of dest branch - for rev in repo.revs('::. and %ld', revs): - ui.warn(_('skipping ancestor revision %s\n') % rev) - revs.remove(rev) + crev = repo['.'].rev() + ancestors = repo.changelog.ancestors([crev], inclusive=True) + # don't mutate while iterating, create a copy + for rev in list(revs): + if rev in ancestors: + ui.warn(_('skipping ancestor revision %s\n') % rev) + revs.remove(rev) if not revs: return -1 @@ -2867,7 +2964,9 @@ # check ancestors for earlier grafts ui.debug('scanning for duplicate grafts\n') - for ctx in repo.set("::. - ::%ld", revs): + + for rev in repo.changelog.findmissingrevs(revs, [crev]): + ctx = repo[rev] n = ctx.extra().get('source') if n in ids: r = repo[n].rev() @@ -2881,7 +2980,7 @@ elif ctx.hex() in ids: r = ids[ctx.hex()] ui.warn(_('skipping already grafted revision %s ' - '(was grafted from %d)\n') % (r, ctx.rev())) + '(was grafted from %d)\n') % (r, rev)) revs.remove(r) if not revs: return -1 @@ -2991,7 +3090,7 @@ if opts.get('ignore_case'): reflags |= re.I try: - regexp = re.compile(pattern, reflags) + regexp = util.compilere(pattern, reflags) except re.error, inst: ui.warn(_("grep: invalid match pattern: %s\n") % inst) return 1 @@ -3245,7 +3344,7 @@ ('k', 'keyword', '', _('show topics matching keyword')), ], _('[-ec] [TOPIC]')) -def help_(ui, name=None, unknowncmd=False, full=True, **opts): +def help_(ui, name=None, **opts): """show help for a given topic or a help overview With no arguments, print a list of commands with short help messages. @@ -3258,291 +3357,9 @@ textwidth = min(ui.termwidth(), 80) - 2 - def helpcmd(name): - try: - aliases, entry = cmdutil.findcmd(name, table, strict=unknowncmd) - except error.AmbiguousCommand, inst: - # py3k fix: except vars can't be used outside the scope of the - # except block, nor can be used inside a lambda. 
python issue4617 - prefix = inst.args[0] - select = lambda c: c.lstrip('^').startswith(prefix) - rst = helplist(select) - return rst - - rst = [] - - # check if it's an invalid alias and display its error if it is - if getattr(entry[0], 'badalias', False): - if not unknowncmd: - ui.pushbuffer() - entry[0](ui) - rst.append(ui.popbuffer()) - return rst - - # synopsis - if len(entry) > 2: - if entry[2].startswith('hg'): - rst.append("%s\n" % entry[2]) - else: - rst.append('hg %s %s\n' % (aliases[0], entry[2])) - else: - rst.append('hg %s\n' % aliases[0]) - # aliases - if full and not ui.quiet and len(aliases) > 1: - rst.append(_("\naliases: %s\n") % ', '.join(aliases[1:])) - rst.append('\n') - - # description - doc = gettext(entry[0].__doc__) - if not doc: - doc = _("(no help text available)") - if util.safehasattr(entry[0], 'definition'): # aliased command - if entry[0].definition.startswith('!'): # shell alias - doc = _('shell alias for::\n\n %s') % entry[0].definition[1:] - else: - doc = _('alias for: hg %s\n\n%s') % (entry[0].definition, doc) - doc = doc.splitlines(True) - if ui.quiet or not full: - rst.append(doc[0]) - else: - rst.extend(doc) - rst.append('\n') - - # check if this command shadows a non-trivial (multi-line) - # extension help text - try: - mod = extensions.find(name) - doc = gettext(mod.__doc__) or '' - if '\n' in doc.strip(): - msg = _('use "hg help -e %s" to show help for ' - 'the %s extension') % (name, name) - rst.append('\n%s\n' % msg) - except KeyError: - pass - - # options - if not ui.quiet and entry[1]: - rst.append('\n%s\n\n' % _("options:")) - rst.append(help.optrst(entry[1], ui.verbose)) - - if ui.verbose: - rst.append('\n%s\n\n' % _("global options:")) - rst.append(help.optrst(globalopts, ui.verbose)) - - if not ui.verbose: - if not full: - rst.append(_('\nuse "hg help %s" to show the full help text\n') - % name) - elif not ui.quiet: - omitted = _('use "hg -v help %s" to show more complete' - ' help and the global options') % name - notomitted = _('use "hg -v help %s" to show' - ' the global options') % name - help.indicateomitted(rst, omitted, notomitted) - - return rst - - - def helplist(select=None): - # list of commands - if name == "shortlist": - header = _('basic commands:\n\n') - else: - header = _('list of commands:\n\n') - - h = {} - cmds = {} - for c, e in table.iteritems(): - f = c.split("|", 1)[0] - if select and not select(f): - continue - if (not select and name != 'shortlist' and - e[0].__module__ != __name__): - continue - if name == "shortlist" and not f.startswith("^"): - continue - f = f.lstrip("^") - if not ui.debugflag and f.startswith("debug"): - continue - doc = e[0].__doc__ - if doc and 'DEPRECATED' in doc and not ui.verbose: - continue - doc = gettext(doc) - if not doc: - doc = _("(no help text available)") - h[f] = doc.splitlines()[0].rstrip() - cmds[f] = c.lstrip("^") - - rst = [] - if not h: - if not ui.quiet: - rst.append(_('no commands defined\n')) - return rst - - if not ui.quiet: - rst.append(header) - fns = sorted(h) - for f in fns: - if ui.verbose: - commands = cmds[f].replace("|",", ") - rst.append(" :%s: %s\n" % (commands, h[f])) - else: - rst.append(' :%s: %s\n' % (f, h[f])) - - if not name: - exts = help.listexts(_('enabled extensions:'), extensions.enabled()) - if exts: - rst.append('\n') - rst.extend(exts) - - rst.append(_("\nadditional help topics:\n\n")) - topics = [] - for names, header, doc in help.helptable: - topics.append((names[0], header)) - for t, desc in topics: - rst.append(" :%s: %s\n" % (t, desc)) - - 
optlist = [] - if not ui.quiet: - if ui.verbose: - optlist.append((_("global options:"), globalopts)) - if name == 'shortlist': - optlist.append((_('use "hg help" for the full list ' - 'of commands'), ())) - else: - if name == 'shortlist': - msg = _('use "hg help" for the full list of commands ' - 'or "hg -v" for details') - elif name and not full: - msg = _('use "hg help %s" to show the full help ' - 'text') % name - else: - msg = _('use "hg -v help%s" to show builtin aliases and ' - 'global options') % (name and " " + name or "") - optlist.append((msg, ())) - - if optlist: - for title, options in optlist: - rst.append('\n%s\n' % title) - if options: - rst.append('\n%s\n' % help.optrst(options, ui.verbose)) - return rst - - def helptopic(name): - for names, header, doc in help.helptable: - if name in names: - break - else: - raise error.UnknownCommand(name) - - rst = ["%s\n\n" % header] - # description - if not doc: - rst.append(" %s\n" % _("(no help text available)")) - if util.safehasattr(doc, '__call__'): - rst += [" %s\n" % l for l in doc().splitlines()] - - if not ui.verbose: - omitted = (_('use "hg help -v %s" to show more complete help') % - name) - help.indicateomitted(rst, omitted) - - try: - cmdutil.findcmd(name, table) - rst.append(_('\nuse "hg help -c %s" to see help for ' - 'the %s command\n') % (name, name)) - except error.UnknownCommand: - pass - return rst - - def helpext(name): - try: - mod = extensions.find(name) - doc = gettext(mod.__doc__) or _('no help text available') - except KeyError: - mod = None - doc = extensions.disabledext(name) - if not doc: - raise error.UnknownCommand(name) - - if '\n' not in doc: - head, tail = doc, "" - else: - head, tail = doc.split('\n', 1) - rst = [_('%s extension - %s\n\n') % (name.split('.')[-1], head)] - if tail: - rst.extend(tail.splitlines(True)) - rst.append('\n') - - if not ui.verbose: - omitted = (_('use "hg help -v %s" to show more complete help') % - name) - help.indicateomitted(rst, omitted) - - if mod: - try: - ct = mod.cmdtable - except AttributeError: - ct = {} - modcmds = set([c.split('|', 1)[0] for c in ct]) - rst.extend(helplist(modcmds.__contains__)) - else: - rst.append(_('use "hg help extensions" for information on enabling ' - 'extensions\n')) - return rst - - def helpextcmd(name): - cmd, ext, mod = extensions.disabledcmd(ui, name, - ui.configbool('ui', 'strict')) - doc = gettext(mod.__doc__).splitlines()[0] - - rst = help.listexts(_("'%s' is provided by the following " - "extension:") % cmd, {ext: doc}, indent=4) - rst.append('\n') - rst.append(_('use "hg help extensions" for information on enabling ' - 'extensions\n')) - return rst - - - rst = [] - kw = opts.get('keyword') - if kw: - matches = help.topicmatch(kw) - for t, title in (('topics', _('Topics')), - ('commands', _('Commands')), - ('extensions', _('Extensions')), - ('extensioncommands', _('Extension Commands'))): - if matches[t]: - rst.append('%s:\n\n' % title) - rst.extend(minirst.maketable(sorted(matches[t]), 1)) - rst.append('\n') - elif name and name != 'shortlist': - i = None - if unknowncmd: - queries = (helpextcmd,) - elif opts.get('extension'): - queries = (helpext,) - elif opts.get('command'): - queries = (helpcmd,) - else: - queries = (helptopic, helpcmd, helpext, helpextcmd) - for f in queries: - try: - rst = f(name) - i = None - break - except error.UnknownCommand, inst: - i = inst - if i: - raise i - else: - # program name - if not ui.quiet: - rst = [_("Mercurial Distributed SCM\n"), '\n'] - rst.extend(helplist()) - keep = ui.verbose and 
['verbose'] or [] - text = ''.join(rst) + text = help.help_(ui, name, **opts) + formatted, pruned = minirst.format(text, textwidth, keep=keep) if 'verbose' in pruned: keep.append('omitted') @@ -3802,11 +3619,6 @@ wlock = lock = tr = None msgs = [] - def checkexact(repo, n, nodeid): - if opts.get('exact') and hex(n) != nodeid: - repo.rollback() - raise util.Abort(_('patch is damaged or loses information')) - def tryone(ui, hunk, parents): tmpname, message, user, date, branch, nodeid, p1, p2 = \ patch.extract(ui, hunk) @@ -3878,7 +3690,6 @@ n = repo.commit(message, opts.get('user') or user, opts.get('date') or date, match=m, editor=editor) - checkexact(repo, n, nodeid) else: if opts.get('exact') or opts.get('import_branch'): branch = branch or 'default' @@ -3900,9 +3711,10 @@ editor=cmdutil.commiteditor) repo.savecommitmessage(memctx.description()) n = memctx.commit() - checkexact(repo, n, nodeid) finally: store.close() + if opts.get('exact') and hex(n) != nodeid: + raise util.Abort(_('patch is damaged or loses information')) if n: # i18n: refers to a short changeset id msg = _('created %s') % short(n) @@ -4247,10 +4059,10 @@ displayer.show(ctx, copies=copies, matchfn=revmatchfn) for ctx in cmdutil.walkchangerevs(repo, matchfn, opts, prep): + if displayer.flush(ctx.rev()): + count += 1 if count == limit: break - if displayer.flush(ctx.rev()): - count += 1 displayer.close() @command('manifest', @@ -4713,14 +4525,15 @@ ui.status(_('pulling from %s\n') % util.hidepassword(source)) revs, checkout = hg.addbranchrevs(repo, other, branches, opts.get('rev')) + remotebookmarks = other.listkeys('bookmarks') + if opts.get('bookmark'): if not revs: revs = [] - rb = other.listkeys('bookmarks') for b in opts['bookmark']: - if b not in rb: + if b not in remotebookmarks: raise util.Abort(_('remote bookmark %s not found!') % b) - revs.append(rb[b]) + revs.append(remotebookmarks[b]) if revs: try: @@ -4731,7 +4544,7 @@ raise util.Abort(err) modheads = repo.pull(other, heads=revs, force=opts.get('force')) - bookmarks.updatefromremote(ui, repo, other, source) + bookmarks.updatefromremote(ui, repo, remotebookmarks, source) if checkout: checkout = str(repo.changelog.rev(other.lookup(checkout))) repo._subtoppath = source @@ -4747,7 +4560,7 @@ for b in opts['bookmark']: # explicit pull overrides local bookmark if any ui.status(_("importing bookmark %s\n") % b) - marks[b] = repo[rb[b]].node() + marks[b] = repo[remotebookmarks[b]].node() marks.write() return ret @@ -5308,9 +5121,9 @@ if not repo: raise error.RepoError(_("there is no Mercurial repository" " here (.hg not found)")) - o = repo.root - - app = hgweb.hgweb(o, baseui=ui) + o = repo + + app = hgweb.hgweb(o, baseui=baseui) class service(object): def init(self): @@ -5567,12 +5380,11 @@ # i18n: column positioning for "hg summary" ui.write(_('bookmarks:'), label='log.bookmark') if current is not None: - try: - marks.remove(current) + if current in marks: ui.write(' *' + current, label='bookmarks.current') - except ValueError: - # current bookmark not in parent ctx marks - pass + marks.remove(current) + else: + ui.write(' [%s]' % current, label='bookmarks.current') for m in marks: ui.write(' ' + m, label='log.bookmark') ui.write('\n', label='log.bookmark') @@ -5664,25 +5476,32 @@ if opts.get('remote'): t = [] source, branches = hg.parseurl(ui.expandpath('default')) + sbranch = branches[0] other = hg.peer(repo, {}, source) revs, checkout = hg.addbranchrevs(repo, other, branches, opts.get('rev')) + if revs: + revs = [other.lookup(rev) for rev in revs] 
ui.debug('comparing with %s\n' % util.hidepassword(source)) repo.ui.pushbuffer() - commoninc = discovery.findcommonincoming(repo, other) + commoninc = discovery.findcommonincoming(repo, other, heads=revs) _common, incoming, _rheads = commoninc repo.ui.popbuffer() if incoming: t.append(_('1 or more incoming')) dest, branches = hg.parseurl(ui.expandpath('default-push', 'default')) + dbranch = branches[0] revs, checkout = hg.addbranchrevs(repo, repo, branches, None) if source != dest: other = hg.peer(repo, {}, dest) + ui.debug('comparing with %s\n' % util.hidepassword(dest)) + if (source != dest or (sbranch is not None and sbranch != dbranch)): commoninc = None - ui.debug('comparing with %s\n' % util.hidepassword(dest)) + if revs: + revs = [repo.lookup(rev) for rev in revs] repo.ui.pushbuffer() - outgoing = discovery.findcommonoutgoing(repo, other, + outgoing = discovery.findcommonoutgoing(repo, other, onlyheads=revs, commoninc=commoninc) repo.ui.popbuffer() o = outgoing.missing @@ -5808,7 +5627,7 @@ # don't allow tagging the null rev if (not opts.get('remove') and scmutil.revsingle(repo, rev_).rev() == nullrev): - raise util.Abort(_("null revision specified")) + raise util.Abort(_("cannot tag null revision")) repo.tag(names, r, message, opts.get('local'), opts.get('user'), date) finally: @@ -5961,7 +5780,12 @@ # with no argument, we also move the current bookmark, if any movemarkfrom = None if rev is None: - movemarkfrom = repo['.'].node() + curmark = repo._bookmarkcurrent + if bookmarks.iscurrent(repo): + movemarkfrom = repo['.'].node() + elif curmark: + ui.status(_("updating to active bookmark %s\n") % curmark) + rev = curmark # if we defined a bookmark, we have to remember the original bookmark name brev = rev diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/config.py --- a/mercurial/config.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/config.py Tue May 14 23:04:23 2013 +0400 @@ -44,6 +44,7 @@ def __init__(self, data=None): self._data = {} self._source = {} + self._unset = [] if data: for k in data._data: self._data[k] = data[k].copy() @@ -58,6 +59,10 @@ for d in self.sections(): yield d def update(self, src): + for s, n in src._unset: + if s in self and n in self._data[s]: + del self._data[s][n] + del self._source[(s, n)] for s in src: if s not in self: self._data[s] = sortdict() @@ -173,6 +178,7 @@ continue if self.get(section, name) is not None: del self._data[section][name] + self._unset.append((section, name)) continue raise error.ParseError(l.rstrip(), ("%s:%s" % (src, line))) diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/context.py --- a/mercurial/context.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/context.py Tue May 14 23:04:23 2013 +0400 @@ -291,16 +291,16 @@ try: return self._manifest[path], self._manifest.flags(path) except KeyError: - raise error.LookupError(self._node, path, - _('not found in manifest')) + raise error.ManifestLookupError(self._node, path, + _('not found in manifest')) if '_manifestdelta' in self.__dict__ or path in self.files(): if path in self._manifestdelta: return (self._manifestdelta[path], self._manifestdelta.flags(path)) node, flag = self._repo.manifest.find(self._changeset[0], path) if not node: - raise error.LookupError(self._node, path, - _('not found in manifest')) + raise error.ManifestLookupError(self._node, path, + _('not found in manifest')) return node, flag @@ -374,16 +374,7 @@ @propertycache def _dirs(self): - dirs = set() - for f in self._manifest: - pos = f.rfind('/') - while pos != -1: - f = f[:pos] - if f in dirs: - break # dirs 
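The config.py hunk above records ``%unset`` directives in ``_unset`` so
that ``update()`` can drop those values from the object being updated.
A toy model of that merge behaviour (the real class also tracks value
sources and preserves insertion order)::

    class layeredconfig(object):
        def __init__(self):
            self._data = {}    # section -> {name: value}
            self._unset = []   # (section, name) pairs seen as '%unset'

        def set(self, section, name, value):
            self._data.setdefault(section, {})[name] = value

        def unset(self, section, name):
            # '%unset name' drops any local value and records the
            # directive so later update() calls can honour it
            self._data.get(section, {}).pop(name, None)
            self._unset.append((section, name))

        def update(self, src):
            # values unset in the overlay disappear from the base ...
            for section, name in src._unset:
                self._data.get(section, {}).pop(name, None)
            # ... and the overlay's remaining values win, as before
            for section, items in src._data.items():
                self._data.setdefault(section, {}).update(items)

    base = layeredconfig()
    base.set('ui', 'username', 'alice')
    overlay = layeredconfig()
    overlay.unset('ui', 'username')
    base.update(overlay)
    assert 'username' not in base._data['ui']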
already contains this and above - dirs.add(f) - pos = f.rfind('/') - return dirs + return scmutil.dirs(self._manifest) def dirs(self): return self._dirs @@ -426,12 +417,12 @@ # repository is filtered this may lead to `filectx` trying to build # `changectx` for filtered revision. In such case we fallback to # creating `changectx` on the unfiltered version of the reposition. - # This fallback should not be an issue because`changectx` from - # `filectx` are not used in complexe operation that care about + # This fallback should not be an issue because `changectx` from + # `filectx` are not used in complex operations that care about # filtering. # # This fallback is a cheap and dirty fix that prevent several - # crash. It does not ensure the behavior is correct. However the + # crashes. It does not ensure the behavior is correct. However the # behavior was not correct before filtering either and "incorrect # behavior" is seen as better as "crash" # @@ -698,7 +689,8 @@ needed = {base: 1} while visit: f = visit[-1] - if f not in pcache: + pcached = f in pcache + if not pcached: pcache[f] = parents(f) ready = True @@ -707,14 +699,21 @@ if p not in hist: ready = False visit.append(p) + if not pcached: needed[p] = needed.get(p, 0) + 1 if ready: visit.pop() - curr = decorate(f.data(), f) + reusable = f in hist + if reusable: + curr = hist[f] + else: + curr = decorate(f.data(), f) for p in pl: - curr = pair(hist[p], curr) + if not reusable: + curr = pair(hist[p], curr) if needed[p] == 1: del hist[p] + del needed[p] else: needed[p] -= 1 @@ -765,7 +764,7 @@ return pl a, b = (self._path, self._filenode), (fc2._path, fc2._filenode) - v = ancestor.ancestor(a, b, parents) + v = ancestor.genericancestor(a, b, parents) if v: f, n = v return filectx(self._repo, f, fileid=n, filelog=flcache[f]) @@ -1138,8 +1137,24 @@ finally: wlock.release() + def markcommitted(self, node): + """Perform post-commit cleanup necessary after committing this ctx + + Specifically, this updates backing stores this working context + wraps to reflect the fact that the changes reflected by this + workingctx have been committed. For example, it marks + modified and added files as normal in the dirstate. 
+ + """ + + for f in self.modified() + self.added(): + self._repo.dirstate.normal(f) + for f in self.removed(): + self._repo.dirstate.drop(f) + self._repo.dirstate.setparents(node) + def dirs(self): - return set(self._repo.dirstate.dirs()) + return self._repo.dirstate.dirs() class workingfilectx(filectx): """A workingfilectx object makes access to data related to a particular diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/copies.py --- a/mercurial/copies.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/copies.py Tue May 14 23:04:23 2013 +0400 @@ -133,11 +133,13 @@ # we currently don't try to find where old files went, too expensive # this means we can miss a case like 'hg rm b; hg cp a b' cm = {} - for f in b: - if f not in a: - ofctx = _tracefile(b[f], a) - if ofctx: - cm[f] = ofctx.path() + missing = set(b.manifest().iterkeys()) + missing.difference_update(a.manifest().iterkeys()) + + for f in missing: + ofctx = _tracefile(b[f], a) + if ofctx: + cm[f] = ofctx.path() # combine copies from dirstate if necessary if w is not None: @@ -333,8 +335,8 @@ # generate a directory move map d1, d2 = c1.dirs(), c2.dirs() - d1.add('') - d2.add('') + d1.addpath('/') + d2.addpath('/') invalid = set() dirmove = {} diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/dicthelpers.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/mercurial/dicthelpers.py Tue May 14 23:04:23 2013 +0400 @@ -0,0 +1,55 @@ +# dicthelpers.py - helper routines for Python dicts +# +# Copyright 2013 Facebook +# +# This software may be used and distributed according to the terms of the +# GNU General Public License version 2 or any later version. + +def diff(d1, d2, default=None): + '''Return all key-value pairs that are different between d1 and d2. + + This includes keys that are present in one dict but not the other, and + keys whose values are different. The return value is a dict with values + being pairs of values from d1 and d2 respectively, and missing values + treated as default, so if a value is missing from one dict and the same as + default in the other, it will not be returned.''' + res = {} + if d1 is d2: + # same dict, so diff is empty + return res + + for k1, v1 in d1.iteritems(): + v2 = d2.get(k1, default) + if v1 != v2: + res[k1] = (v1, v2) + + for k2 in d2: + if k2 not in d1: + v2 = d2[k2] + if v2 != default: + res[k2] = (default, v2) + + return res + +def join(d1, d2, default=None): + '''Return all key-value pairs from both d1 and d2. + + This is akin to an outer join in relational algebra. The return value is a + dict with values being pairs of values from d1 and d2 respectively, and + missing values represented as default.''' + res = {} + + for k1, v1 in d1.iteritems(): + if k1 in d2: + res[k1] = (v1, d2[k1]) + else: + res[k1] = (v1, default) + + if d1 is d2: + return res + + for k2 in d2: + if k2 not in d1: + res[k2] = (default, d2[k2]) + + return res diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/dirs.c --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/mercurial/dirs.c Tue May 14 23:04:23 2013 +0400 @@ -0,0 +1,305 @@ +/* + dirs.c - dynamic directory diddling for dirstates + + Copyright 2013 Facebook + + This software may be used and distributed according to the terms of + the GNU General Public License, incorporated herein by reference. +*/ + +#define PY_SSIZE_T_CLEAN +#include +#include "util.h" + +/* + * This is a multiset of directory names, built from the files that + * appear in a dirstate or manifest. 
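The semantics of ``diff()`` and ``join()`` in the new dicthelpers module
are easiest to see on small inputs; assuming the module as added above is
importable (it targets Python 2), the following asserts hold::

    from mercurial.dicthelpers import diff, join

    d1 = {'a': 1, 'b': 2, 'c': 3}
    d2 = {'b': 2, 'c': 4, 'd': 5}

    # only differing keys are reported; a missing key counts as the default
    assert diff(d1, d2) == {'a': (1, None), 'c': (3, 4), 'd': (None, 5)}

    # join() is an outer join and keeps every key
    assert join(d1, d2) == {'a': (1, None), 'b': (2, 2),
                            'c': (3, 4), 'd': (None, 5)}

    # a key missing on one side but equal to the default on the other
    # is not reported as a difference
    assert diff({'x': None}, {}) == {}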
+ * + * A few implementation notes: + * + * We modify Python integers for refcounting, but those integers are + * never visible to Python code. + * + * We mutate strings in-place, but leave them immutable once they can + * be seen by Python code. + */ +typedef struct { + PyObject_HEAD + PyObject *dict; +} dirsObject; + +static inline Py_ssize_t _finddir(PyObject *path, Py_ssize_t pos) +{ + const char *s = PyString_AS_STRING(path); + + while (pos != -1) { + if (s[pos] == '/') + break; + pos -= 1; + } + + return pos; +} + +static int _addpath(PyObject *dirs, PyObject *path) +{ + const char *cpath = PyString_AS_STRING(path); + Py_ssize_t pos = PyString_GET_SIZE(path); + PyObject *key = NULL; + int ret = -1; + + while ((pos = _finddir(path, pos - 1)) != -1) { + PyObject *val; + + /* It's likely that every prefix already has an entry + in our dict. Try to avoid allocating and + deallocating a string for each prefix we check. */ + if (key != NULL) + ((PyStringObject *)key)->ob_shash = -1; + else { + /* Force Python to not reuse a small shared string. */ + key = PyString_FromStringAndSize(cpath, + pos < 2 ? 2 : pos); + if (key == NULL) + goto bail; + } + PyString_GET_SIZE(key) = pos; + PyString_AS_STRING(key)[pos] = '\0'; + + val = PyDict_GetItem(dirs, key); + if (val != NULL) { + PyInt_AS_LONG(val) += 1; + continue; + } + + /* Force Python to not reuse a small shared int. */ + val = PyInt_FromLong(0x1eadbeef); + + if (val == NULL) + goto bail; + + PyInt_AS_LONG(val) = 1; + ret = PyDict_SetItem(dirs, key, val); + Py_DECREF(val); + if (ret == -1) + goto bail; + Py_CLEAR(key); + } + ret = 0; + +bail: + Py_XDECREF(key); + + return ret; +} + +static int _delpath(PyObject *dirs, PyObject *path) +{ + Py_ssize_t pos = PyString_GET_SIZE(path); + PyObject *key = NULL; + int ret = -1; + + while ((pos = _finddir(path, pos - 1)) != -1) { + PyObject *val; + + key = PyString_FromStringAndSize(PyString_AS_STRING(path), pos); + + if (key == NULL) + goto bail; + + val = PyDict_GetItem(dirs, key); + if (val == NULL) { + PyErr_SetString(PyExc_ValueError, + "expected a value, found none"); + goto bail; + } + + if (--PyInt_AS_LONG(val) <= 0 && + PyDict_DelItem(dirs, key) == -1) + goto bail; + Py_CLEAR(key); + } + ret = 0; + +bail: + Py_XDECREF(key); + + return ret; +} + +static int dirs_fromdict(PyObject *dirs, PyObject *source, char skipchar) +{ + PyObject *key, *value; + Py_ssize_t pos = 0; + + while (PyDict_Next(source, &pos, &key, &value)) { + if (!PyString_Check(key)) { + PyErr_SetString(PyExc_TypeError, "expected string key"); + return -1; + } + if (skipchar) { + PyObject *st; + + if (!PyTuple_Check(value) || + PyTuple_GET_SIZE(value) == 0) { + PyErr_SetString(PyExc_TypeError, + "expected non-empty tuple"); + return -1; + } + + st = PyTuple_GET_ITEM(value, 0); + + if (!PyString_Check(st) || PyString_GET_SIZE(st) == 0) { + PyErr_SetString(PyExc_TypeError, + "expected non-empty string " + "at tuple index 0"); + return -1; + } + + if (PyString_AS_STRING(st)[0] == skipchar) + continue; + } + + if (_addpath(dirs, key) == -1) + return -1; + } + + return 0; +} + +static int dirs_fromiter(PyObject *dirs, PyObject *source) +{ + PyObject *iter, *item = NULL; + int ret; + + iter = PyObject_GetIter(source); + if (iter == NULL) + return -1; + + while ((item = PyIter_Next(iter)) != NULL) { + if (!PyString_Check(item)) { + PyErr_SetString(PyExc_TypeError, "expected string"); + break; + } + + if (_addpath(dirs, item) == -1) + break; + Py_CLEAR(item); + } + + ret = PyErr_Occurred() ? 
-1 : 0; + Py_XDECREF(item); + return ret; +} + +/* + * Calculate a refcounted set of directory names for the files in a + * dirstate. + */ +static int dirs_init(dirsObject *self, PyObject *args) +{ + PyObject *dirs = NULL, *source = NULL; + char skipchar = 0; + int ret = -1; + + self->dict = NULL; + + if (!PyArg_ParseTuple(args, "|Oc:__init__", &source, &skipchar)) + return -1; + + dirs = PyDict_New(); + + if (dirs == NULL) + return -1; + + if (source == NULL) + ret = 0; + else if (PyDict_Check(source)) + ret = dirs_fromdict(dirs, source, skipchar); + else if (skipchar) + PyErr_SetString(PyExc_ValueError, + "skip character is only supported " + "with a dict source"); + else + ret = dirs_fromiter(dirs, source); + + if (ret == -1) + Py_XDECREF(dirs); + else + self->dict = dirs; + + return ret; +} + +PyObject *dirs_addpath(dirsObject *self, PyObject *args) +{ + PyObject *path; + + if (!PyArg_ParseTuple(args, "O!:addpath", &PyString_Type, &path)) + return NULL; + + if (_addpath(self->dict, path) == -1) + return NULL; + + Py_RETURN_NONE; +} + +static PyObject *dirs_delpath(dirsObject *self, PyObject *args) +{ + PyObject *path; + + if (!PyArg_ParseTuple(args, "O!:delpath", &PyString_Type, &path)) + return NULL; + + if (_delpath(self->dict, path) == -1) + return NULL; + + Py_RETURN_NONE; +} + +static int dirs_contains(dirsObject *self, PyObject *value) +{ + return PyString_Check(value) ? PyDict_Contains(self->dict, value) : 0; +} + +static void dirs_dealloc(dirsObject *self) +{ + Py_XDECREF(self->dict); + PyObject_Del(self); +} + +static PyObject *dirs_iter(dirsObject *self) +{ + return PyObject_GetIter(self->dict); +} + +static PySequenceMethods dirs_sequence_methods; + +static PyMethodDef dirs_methods[] = { + {"addpath", (PyCFunction)dirs_addpath, METH_VARARGS, "add a path"}, + {"delpath", (PyCFunction)dirs_delpath, METH_VARARGS, "remove a path"}, + {NULL} /* Sentinel */ +}; + +static PyTypeObject dirsType = { PyObject_HEAD_INIT(NULL) }; + +void dirs_module_init(PyObject *mod) +{ + dirs_sequence_methods.sq_contains = (objobjproc)dirs_contains; + dirsType.tp_name = "parsers.dirs"; + dirsType.tp_new = PyType_GenericNew; + dirsType.tp_basicsize = sizeof(dirsObject); + dirsType.tp_dealloc = (destructor)dirs_dealloc; + dirsType.tp_as_sequence = &dirs_sequence_methods; + dirsType.tp_flags = Py_TPFLAGS_DEFAULT; + dirsType.tp_doc = "dirs"; + dirsType.tp_iter = (getiterfunc)dirs_iter; + dirsType.tp_methods = dirs_methods; + dirsType.tp_init = (initproc)dirs_init; + + if (PyType_Ready(&dirsType) < 0) + return; + Py_INCREF(&dirsType); + + PyModule_AddObject(mod, "dirs", (PyObject *)&dirsType); +} diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/dirstate.py --- a/mercurial/dirstate.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/dirstate.py Tue May 14 23:04:23 2013 +0400 @@ -9,10 +9,8 @@ from node import nullid from i18n import _ import scmutil, util, ignore, osutil, parsers, encoding -import struct, os, stat, errno -import cStringIO +import os, stat, errno, gc -_format = ">cllll" propertycache = util.propertycache filecache = scmutil.filecache _rangemask = 0x7fffffff @@ -27,26 +25,6 @@ def join(self, obj, fname): return obj._join(fname) -def _finddirs(path): - pos = path.rfind('/') - while pos != -1: - yield path[:pos] - pos = path.rfind('/', 0, pos) - -def _incdirs(dirs, path): - for base in _finddirs(path): - if base in dirs: - dirs[base] += 1 - return - dirs[base] = 1 - -def _decdirs(dirs, path): - for base in _finddirs(path): - if dirs[base] > 1: - dirs[base] -= 1 - return - del dirs[base] - class 
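dirs.c above keeps a reference count per directory prefix so that
containment checks stay cheap while files are added and removed. A rough
pure-Python equivalent of what ``addpath``/``delpath`` maintain (a sketch,
not the C module's exact behaviour)::

    class dirsmultiset(object):
        """Reference-counted set of directory prefixes (simplified)."""
        def __init__(self, files=()):
            self._dirs = {}
            for f in files:
                self.addpath(f)

        def addpath(self, path):
            # 'a/b/x' contributes the prefixes 'a/b' and 'a'
            pos = path.rfind('/')
            while pos != -1:
                d = path[:pos]
                self._dirs[d] = self._dirs.get(d, 0) + 1
                pos = path.rfind('/', 0, pos)

        def delpath(self, path):
            pos = path.rfind('/')
            while pos != -1:
                d = path[:pos]
                if self._dirs[d] > 1:
                    self._dirs[d] -= 1
                else:
                    del self._dirs[d]
                pos = path.rfind('/', 0, pos)

        def __contains__(self, d):
            return d in self._dirs

    ds = dirsmultiset(['a/b/x', 'a/c/y'])
    assert 'a' in ds and 'a/b' in ds
    ds.delpath('a/b/x')
    assert 'a/b' not in ds and 'a' in ds   # 'a' still referenced by 'a/c/y'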
dirstate(object): def __init__(self, opener, ui, root, validate): @@ -81,8 +59,9 @@ @propertycache def _foldmap(self): f = {} - for name in self._map: - f[util.normcase(name)] = name + for name, s in self._map.iteritems(): + if s[0] != 'r': + f[util.normcase(name)] = name for name in self._dirs: f[util.normcase(name)] = name f['.'] = '.' # prevents useless util.fspath() invocation @@ -115,11 +94,7 @@ @propertycache def _dirs(self): - dirs = {} - for f, s in self._map.iteritems(): - if s[0] != 'r': - _incdirs(dirs, f) - return dirs + return scmutil.dirs(self._map, 'r') def dirs(self): return self._dirs @@ -156,11 +131,14 @@ def flagfunc(self, buildfallback): if self._checklink and self._checkexec: def f(x): - p = self._join(x) - if os.path.islink(p): - return 'l' - if util.isexec(p): - return 'x' + try: + st = os.lstat(self._join(x)) + if util.statislink(st): + return 'l' + if util.statisexec(st): + return 'x' + except OSError: + pass return '' return f @@ -225,6 +203,9 @@ for x in sorted(self._map): yield x + def iteritems(self): + return self._map.iteritems() + def parents(self): return [self._validate(p) for p in self._pl] @@ -287,7 +268,23 @@ if not st: return - p = parsers.parse_dirstate(self._map, self._copymap, st) + # Python's garbage collector triggers a GC each time a certain number + # of container objects (the number being defined by + # gc.get_threshold()) are allocated. parse_dirstate creates a tuple + # for each file in the dirstate. The C version then immediately marks + # them as not to be tracked by the collector. However, this has no + # effect on when GCs are triggered, only on what objects the GC looks + # into. This means that O(number of files) GCs are unavoidable. + # Depending on when in the process's lifetime the dirstate is parsed, + # this can get very expensive. As a workaround, disable GC while + # parsing the dirstate. 
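The comment above ends by disabling the collector for the duration of the
parse; in isolation the pattern looks like this, restoring the previous
collector state even if parsing fails::

    import gc

    def parse_without_gc(parse, *args):
        # Creating one tuple per dirstate entry would otherwise trigger
        # O(number of files) collection passes; reference counting still
        # frees objects, only cycle detection is deferred.
        gcenabled = gc.isenabled()
        gc.disable()
        try:
            return parse(*args)
        finally:
            # re-enable only if it was enabled before, so callers that
            # already run with gc off stay that way
            if gcenabled:
                gc.enable()

    assert parse_without_gc(len, [1, 2, 3]) == 3
    assert gc.isenabled()   # assuming the collector was on to begin with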
+ gcenabled = gc.isenabled() + gc.disable() + try: + p = parsers.parse_dirstate(self._map, self._copymap, st) + finally: + if gcenabled: + gc.enable() if not self._dirtypl: self._pl = p @@ -317,7 +314,7 @@ def _droppath(self, f): if self[f] not in "?r" and "_dirs" in self.__dict__: - _decdirs(self._dirs, f) + self._dirs.delpath(f) def _addpath(self, f, state, mode, size, mtime): oldstate = self[f] @@ -326,14 +323,14 @@ if f in self._dirs: raise util.Abort(_('directory %r already in dirstate') % f) # shadows - for d in _finddirs(f): + for d in scmutil.finddirs(f): if d in self._dirs: break if d in self._map and self[d] != 'r': raise util.Abort( _('file %r in dirstate clashes with %r') % (d, f)) if oldstate in "?r" and "_dirs" in self.__dict__: - _incdirs(self._dirs, f) + self._dirs.addpath(f) self._dirty = True self._map[f] = (state, mode, size, mtime) @@ -484,13 +481,18 @@ self._lastnormaltime = 0 self._dirty = True - def rebuild(self, parent, files): + def rebuild(self, parent, allfiles, changedfiles=None): + changedfiles = changedfiles or allfiles + oldmap = self._map self.clear() - for f in files: - if 'x' in files.flags(f): - self._map[f] = ('n', 0777, -1, 0) + for f in allfiles: + if f not in changedfiles: + self._map[f] = oldmap[f] else: - self._map[f] = ('n', 0666, -1, 0) + if 'x' in allfiles.flags(f): + self._map[f] = ('n', 0777, -1, 0) + else: + self._map[f] = ('n', 0666, -1, 0) self._pl = (parent, nullid) self._dirty = True @@ -508,45 +510,14 @@ # use the modification time of the newly created temporary file as the # filesystem's notion of 'now' now = util.fstat(st).st_mtime - copymap = self._copymap - try: - finish(parsers.pack_dirstate(self._map, copymap, self._pl, now)) - return - except AttributeError: - pass - - now = int(now) - cs = cStringIO.StringIO() - pack = struct.pack - write = cs.write - write("".join(self._pl)) - for f, e in self._map.iteritems(): - if e[0] == 'n' and e[3] == now: - # The file was last modified "simultaneously" with the current - # write to dirstate (i.e. within the same second for file- - # systems with a granularity of 1 sec). This commonly happens - # for at least a couple of files on 'update'. - # The user could change the file without changing its size - # within the same second. Invalidate the file's stat data in - # dirstate, forcing future 'status' calls to compare the - # contents of the file. This prevents mistakenly treating such - # files as clean. 
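The removed pure-Python packer above documents why entries touched in the
same second as the dirstate write are stored with invalidated stat data;
the C ``pack_dirstate`` that replaces it keeps the same rule. A minimal
sketch of that rule, using (state, mode, size, mtime) tuples::

    def invalidate_same_second(dmap, now):
        # A file changed again within the same second would keep a stat
        # record identical to the one on disk, so a later 'status' could
        # wrongly report it clean; size -1 / mtime -1 force a content
        # comparison next time.
        now = int(now)
        for f, e in dmap.items():
            if e[0] == 'n' and e[3] == now:
                dmap[f] = (e[0], 0, -1, -1)
        return dmap

    dmap = {'a': ('n', 0o644, 12, 1000), 'b': ('n', 0o644, 7, 999)}
    invalidate_same_second(dmap, 1000)
    assert dmap['a'] == ('n', 0, -1, -1)
    assert dmap['b'] == ('n', 0o644, 7, 999)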
- e = (e[0], 0, -1, -1) # mark entry as 'unset' - self._map[f] = e - - if f in copymap: - f = "%s\0%s" % (f, copymap[f]) - e = pack(_format, e[0], e[1], e[2], e[3], len(f)) - write(e) - write(f) - finish(cs.getvalue()) + finish(parsers.pack_dirstate(self._map, self._copymap, self._pl, now)) def _dirignore(self, f): if f == '.': return False if self._ignore(f): return True - for p in _finddirs(f): + for p in scmutil.finddirs(f): if self._ignore(p): return True return False @@ -589,6 +560,7 @@ dirignore = util.always matchfn = match.matchfn + matchalways = match.always() badfn = match.bad dmap = self._map normpath = util.normpath @@ -696,26 +668,53 @@ if not ignore(nf): match.dir(nf) wadd(nf) - if nf in dmap and matchfn(nf): + if nf in dmap and (matchalways or matchfn(nf)): results[nf] = None elif kind == regkind or kind == lnkkind: if nf in dmap: - if matchfn(nf): + if matchalways or matchfn(nf): results[nf] = st - elif matchfn(nf) and not ignore(nf): + elif (matchalways or matchfn(nf)) and not ignore(nf): results[nf] = st - elif nf in dmap and matchfn(nf): + elif nf in dmap and (matchalways or matchfn(nf)): results[nf] = None + for s in subrepos: + del results[s] + del results['.hg'] + # step 3: report unseen items in the dmap hash if not skipstep3 and not exact: - visit = sorted([f for f in dmap if f not in results and matchfn(f)]) - nf = iter(visit).next - for st in util.statfiles([join(i) for i in visit]): - results[nf()] = st - for s in subrepos: - del results[s] - del results['.hg'] + if not results and matchalways: + visit = dmap.keys() + else: + visit = [f for f in dmap if f not in results and matchfn(f)] + visit.sort() + + if unknown: + # unknown == True means we walked the full directory tree above. + # So if a file is not seen it was either a) not matching matchfn + # b) ignored, c) missing, or d) under a symlink directory. + audit_path = scmutil.pathauditor(self._root) + + for nf in iter(visit): + # Report ignored items in the dmap as long as they are not + # under a symlink directory. + if audit_path.check(nf): + try: + results[nf] = lstat(join(nf)) + except OSError: + # file doesn't exist + results[nf] = None + else: + # It's either missing or under a symlink directory + results[nf] = None + else: + # We may not have walked the full directory tree above, + # so stat everything we missed. 
+ nf = iter(visit).next + for st in util.statfiles([join(i) for i in visit]): + results[nf()] = st return results def status(self, match, subrepos, ignored, clean, unknown): diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/dispatch.py --- a/mercurial/dispatch.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/dispatch.py Tue May 14 23:04:23 2013 +0400 @@ -151,6 +151,9 @@ commands.help_(ui, inst.args[0], unknowncmd=True) except error.UnknownCommand: commands.help_(ui, 'shortlist') + except error.InterventionRequired, inst: + ui.warn("%s\n" % inst) + return 1 except util.Abort, inst: ui.warn(_("abort: %s\n") % inst) if inst.hint: @@ -247,6 +250,7 @@ (_("** Mercurial Distributed SCM (version %s)\n") % myver) + (_("** Extensions loaded: %s\n") % ", ".join([x[0] for x in extensions.extensions()]))) + ui.log("commandexception", "%s\n%s\n", warning, traceback.format_exc()) ui.warn(warning) raise @@ -333,7 +337,7 @@ self.cmdname = cmd = args.pop(0) args = map(util.expandpath, args) - for invalidarg in ("--cwd", "-R", "--repository", "--repo"): + for invalidarg in ("--cwd", "-R", "--repository", "--repo", "--config"): if _earlygetopt([invalidarg], args): def fn(ui, *args): ui.warn(_("error in definition for alias '%s': %s may only " @@ -481,6 +485,22 @@ The values are listed in the order they appear in args. The options and values are removed from args. + + >>> args = ['x', '--cwd', 'foo', 'y'] + >>> _earlygetopt(['--cwd'], args), args + (['foo'], ['x', 'y']) + + >>> args = ['x', '--cwd=bar', 'y'] + >>> _earlygetopt(['--cwd'], args), args + (['bar'], ['x', 'y']) + + >>> args = ['x', '-R', 'foo', 'y'] + >>> _earlygetopt(['-R'], args), args + (['foo'], ['x', 'y']) + + >>> args = ['x', '-Rbar', 'y'] + >>> _earlygetopt(['-R'], args), args + (['bar'], ['x', 'y']) """ try: argcount = args.index("--") @@ -490,14 +510,22 @@ values = [] pos = 0 while pos < argcount: - if args[pos] in aliases: - if pos + 1 >= argcount: - # ignore and let getopt report an error if there is no value - break + fullarg = arg = args[pos] + equals = arg.find('=') + if equals > -1: + arg = arg[:equals] + if arg in aliases: del args[pos] - values.append(args.pop(pos)) - argcount -= 2 - elif args[pos][:2] in shortopts: + if equals > -1: + values.append(fullarg[equals + 1:]) + argcount -= 1 + else: + if pos + 1 >= argcount: + # ignore and let getopt report an error if there is no value + break + values.append(args.pop(pos)) + argcount -= 2 + elif arg[:2] in shortopts: # short option can have no following space, e.g. 
hg log -Rfoo values.append(args.pop(pos)[2:]) argcount -= 1 @@ -507,10 +535,8 @@ def runcommand(lui, repo, cmd, fullargs, ui, options, d, cmdpats, cmdoptions): # run pre-hook, and abort if it fails - ret = hook.hook(lui, repo, "pre-%s" % cmd, False, args=" ".join(fullargs), - pats=cmdpats, opts=cmdoptions) - if ret: - return ret + hook.hook(lui, repo, "pre-%s" % cmd, True, args=" ".join(fullargs), + pats=cmdpats, opts=cmdoptions) ret = _runcommand(ui, options, cmd, d) # run post-hook, passing command result hook.hook(lui, repo, "post-%s" % cmd, False, args=" ".join(fullargs), @@ -736,18 +762,25 @@ ui.warn(_("warning: --repository ignored\n")) msg = ' '.join(' ' in a and repr(a) or a for a in fullargs) - ui.log("command", msg + "\n") + ui.log("command", '%s\n', msg) d = lambda: util.checksignature(func)(ui, *args, **cmdoptions) + starttime = time.time() + ret = None try: - return runcommand(lui, repo, cmd, fullargs, ui, options, d, - cmdpats, cmdoptions) + ret = runcommand(lui, repo, cmd, fullargs, ui, options, d, + cmdpats, cmdoptions) + return ret finally: + duration = time.time() - starttime + ui.log("commandfinish", "%s exited %s after %0.2f seconds\n", + cmd, ret, duration) if repo and repo != req.repo: repo.close() def lsprofile(ui, func, fp): format = ui.config('profiling', 'format', default='text') field = ui.config('profiling', 'sort', default='inlinetime') + limit = ui.configint('profiling', 'limit', default=30) climit = ui.configint('profiling', 'nested', default=5) if format not in ['text', 'kcachegrind']: @@ -776,7 +809,7 @@ # format == 'text' stats = lsprof.Stats(p.getstats()) stats.sort(field) - stats.pprint(limit=30, file=fp, climit=climit) + stats.pprint(limit=limit, file=fp, climit=climit) def statprofile(ui, func, fp): try: diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/error.py --- a/mercurial/error.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/error.py Tue May 14 23:04:23 2013 +0400 @@ -27,9 +27,15 @@ def __str__(self): return RevlogError.__str__(self) +class ManifestLookupError(LookupError): + pass + class CommandError(Exception): """Exception raised on errors in parsing the command line.""" +class InterventionRequired(Exception): + """Exception raised when a command requires human intervention.""" + class Abort(Exception): """Raised if a command needs to print an error and exit.""" def __init__(self, *args, **kw): diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/extensions.py --- a/mercurial/extensions.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/extensions.py Tue May 14 23:04:23 2013 +0400 @@ -11,7 +11,7 @@ _extensions = {} _order = [] -_ignore = ['hbisect', 'bookmarks', 'parentrevspec'] +_ignore = ['hbisect', 'bookmarks', 'parentrevspec', 'interhg'] def extensions(): for name in _order: @@ -50,7 +50,6 @@ raise def load(ui, name, path): - # unused ui argument kept for backwards compatibility if name.startswith('hgext.') or name.startswith('hgext/'): shortname = name[6:] else: diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/fileset.py --- a/mercurial/fileset.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/fileset.py Tue May 14 23:04:23 2013 +0400 @@ -353,6 +353,29 @@ return s +def eol(mctx, x): + """``eol(style)`` + File contains newlines of the given style (dos, unix, mac). Binary + files are excluded, files with mixed line endings match multiple + styles. 
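The unix and mac branches of ``eol()`` were truncated in this hunk;
independent of the exact expressions used there, the classification the
docstring describes can be sketched as follows (the regular expressions
and the binary check below are assumptions, not quotes of the patch)::

    import re

    def eolstyles(data):
        # binary content matches no style; mixed files report several
        if '\0' in data:                   # stands in for util.binary()
            return set()
        styles = set()
        if '\r\n' in data:
            styles.add('dos')
        if re.search('(?<!\r)\n', data):   # a \n not preceded by \r
            styles.add('unix')
        if re.search('\r(?!\n)', data):    # a \r not followed by \n
            styles.add('mac')
        return styles

    assert eolstyles('a\r\nb\r\n') == set(['dos'])
    assert eolstyles('a\nb\n') == set(['unix'])
    assert eolstyles('a\r\nb\n') == set(['dos', 'unix'])
    assert eolstyles('a\0b\n') == set()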
+ """ + + # i18n: "encoding" is a keyword + enc = getstring(x, _("encoding requires an encoding name")) + + s = [] + for f in mctx.existing(): + d = mctx.ctx[f].data() + if util.binary(d): + continue + if (enc == 'dos' or enc == 'win') and '\r\n' in d: + s.append(f) + elif enc == 'unix' and re.search('(? 2: # ascii() only knows how to add or remove a single column between two diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/help.py --- a/mercurial/help.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/help.py Tue May 14 23:04:23 2013 +0400 @@ -6,9 +6,10 @@ # GNU General Public License version 2 or any later version. from i18n import gettext, _ -import itertools, sys, os +import itertools, sys, os, error import extensions, revset, fileset, templatekw, templatefilters, filemerge import encoding, util, minirst +import cmdutil def listexts(header, exts, indent=1): '''return a text listing of the given extensions''' @@ -206,3 +207,298 @@ addtopicsymbols('revsets', '.. predicatesmarker', revset.symbols) addtopicsymbols('templates', '.. keywordsmarker', templatekw.dockeywords) addtopicsymbols('templates', '.. filtersmarker', templatefilters.filters) + +def help_(ui, name, unknowncmd=False, full=True, **opts): + ''' + Generate the help for 'name' as unformatted restructured text. If + 'name' is None, describe the commands available. + ''' + + import commands # avoid cycle + + def helpcmd(name): + try: + aliases, entry = cmdutil.findcmd(name, commands.table, + strict=unknowncmd) + except error.AmbiguousCommand, inst: + # py3k fix: except vars can't be used outside the scope of the + # except block, nor can be used inside a lambda. python issue4617 + prefix = inst.args[0] + select = lambda c: c.lstrip('^').startswith(prefix) + rst = helplist(select) + return rst + + rst = [] + + # check if it's an invalid alias and display its error if it is + if getattr(entry[0], 'badalias', False): + if not unknowncmd: + ui.pushbuffer() + entry[0](ui) + rst.append(ui.popbuffer()) + return rst + + # synopsis + if len(entry) > 2: + if entry[2].startswith('hg'): + rst.append("%s\n" % entry[2]) + else: + rst.append('hg %s %s\n' % (aliases[0], entry[2])) + else: + rst.append('hg %s\n' % aliases[0]) + # aliases + if full and not ui.quiet and len(aliases) > 1: + rst.append(_("\naliases: %s\n") % ', '.join(aliases[1:])) + rst.append('\n') + + # description + doc = gettext(entry[0].__doc__) + if not doc: + doc = _("(no help text available)") + if util.safehasattr(entry[0], 'definition'): # aliased command + if entry[0].definition.startswith('!'): # shell alias + doc = _('shell alias for::\n\n %s') % entry[0].definition[1:] + else: + doc = _('alias for: hg %s\n\n%s') % (entry[0].definition, doc) + doc = doc.splitlines(True) + if ui.quiet or not full: + rst.append(doc[0]) + else: + rst.extend(doc) + rst.append('\n') + + # check if this command shadows a non-trivial (multi-line) + # extension help text + try: + mod = extensions.find(name) + doc = gettext(mod.__doc__) or '' + if '\n' in doc.strip(): + msg = _('use "hg help -e %s" to show help for ' + 'the %s extension') % (name, name) + rst.append('\n%s\n' % msg) + except KeyError: + pass + + # options + if not ui.quiet and entry[1]: + rst.append('\n%s\n\n' % _("options:")) + rst.append(optrst(entry[1], ui.verbose)) + + if ui.verbose: + rst.append('\n%s\n\n' % _("global options:")) + rst.append(optrst(commands.globalopts, ui.verbose)) + + if not ui.verbose: + if not full: + rst.append(_('\nuse "hg help %s" to show the full help text\n') + % name) + elif not ui.quiet: + omitted = 
_('use "hg -v help %s" to show more complete' + ' help and the global options') % name + notomitted = _('use "hg -v help %s" to show' + ' the global options') % name + indicateomitted(rst, omitted, notomitted) + + return rst + + + def helplist(select=None): + # list of commands + if name == "shortlist": + header = _('basic commands:\n\n') + else: + header = _('list of commands:\n\n') + + h = {} + cmds = {} + for c, e in commands.table.iteritems(): + f = c.split("|", 1)[0] + if select and not select(f): + continue + if (not select and name != 'shortlist' and + e[0].__module__ != commands.__name__): + continue + if name == "shortlist" and not f.startswith("^"): + continue + f = f.lstrip("^") + if not ui.debugflag and f.startswith("debug"): + continue + doc = e[0].__doc__ + if doc and 'DEPRECATED' in doc and not ui.verbose: + continue + doc = gettext(doc) + if not doc: + doc = _("(no help text available)") + h[f] = doc.splitlines()[0].rstrip() + cmds[f] = c.lstrip("^") + + rst = [] + if not h: + if not ui.quiet: + rst.append(_('no commands defined\n')) + return rst + + if not ui.quiet: + rst.append(header) + fns = sorted(h) + for f in fns: + if ui.verbose: + commacmds = cmds[f].replace("|",", ") + rst.append(" :%s: %s\n" % (commacmds, h[f])) + else: + rst.append(' :%s: %s\n' % (f, h[f])) + + if not name: + exts = listexts(_('enabled extensions:'), extensions.enabled()) + if exts: + rst.append('\n') + rst.extend(exts) + + rst.append(_("\nadditional help topics:\n\n")) + topics = [] + for names, header, doc in helptable: + topics.append((names[0], header)) + for t, desc in topics: + rst.append(" :%s: %s\n" % (t, desc)) + + optlist = [] + if not ui.quiet: + if ui.verbose: + optlist.append((_("global options:"), commands.globalopts)) + if name == 'shortlist': + optlist.append((_('use "hg help" for the full list ' + 'of commands'), ())) + else: + if name == 'shortlist': + msg = _('use "hg help" for the full list of commands ' + 'or "hg -v" for details') + elif name and not full: + msg = _('use "hg help %s" to show the full help ' + 'text') % name + else: + msg = _('use "hg -v help%s" to show builtin aliases and ' + 'global options') % (name and " " + name or "") + optlist.append((msg, ())) + + if optlist: + for title, options in optlist: + rst.append('\n%s\n' % title) + if options: + rst.append('\n%s\n' % optrst(options, ui.verbose)) + return rst + + def helptopic(name): + for names, header, doc in helptable: + if name in names: + break + else: + raise error.UnknownCommand(name) + + rst = [minirst.section(header)] + + # description + if not doc: + rst.append(" %s\n" % _("(no help text available)")) + if util.safehasattr(doc, '__call__'): + rst += [" %s\n" % l for l in doc().splitlines()] + + if not ui.verbose: + omitted = (_('use "hg help -v %s" to show more complete help') % + name) + indicateomitted(rst, omitted) + + try: + cmdutil.findcmd(name, commands.table) + rst.append(_('\nuse "hg help -c %s" to see help for ' + 'the %s command\n') % (name, name)) + except error.UnknownCommand: + pass + return rst + + def helpext(name): + try: + mod = extensions.find(name) + doc = gettext(mod.__doc__) or _('no help text available') + except KeyError: + mod = None + doc = extensions.disabledext(name) + if not doc: + raise error.UnknownCommand(name) + + if '\n' not in doc: + head, tail = doc, "" + else: + head, tail = doc.split('\n', 1) + rst = [_('%s extension - %s\n\n') % (name.split('.')[-1], head)] + if tail: + rst.extend(tail.splitlines(True)) + rst.append('\n') + + if not ui.verbose: + omitted = 
(_('use "hg help -v %s" to show more complete help') % + name) + indicateomitted(rst, omitted) + + if mod: + try: + ct = mod.cmdtable + except AttributeError: + ct = {} + modcmds = set([c.split('|', 1)[0] for c in ct]) + rst.extend(helplist(modcmds.__contains__)) + else: + rst.append(_('use "hg help extensions" for information on enabling ' + 'extensions\n')) + return rst + + def helpextcmd(name): + cmd, ext, mod = extensions.disabledcmd(ui, name, + ui.configbool('ui', 'strict')) + doc = gettext(mod.__doc__).splitlines()[0] + + rst = listexts(_("'%s' is provided by the following " + "extension:") % cmd, {ext: doc}, indent=4) + rst.append('\n') + rst.append(_('use "hg help extensions" for information on enabling ' + 'extensions\n')) + return rst + + + rst = [] + kw = opts.get('keyword') + if kw: + matches = topicmatch(kw) + for t, title in (('topics', _('Topics')), + ('commands', _('Commands')), + ('extensions', _('Extensions')), + ('extensioncommands', _('Extension Commands'))): + if matches[t]: + rst.append('%s:\n\n' % title) + rst.extend(minirst.maketable(sorted(matches[t]), 1)) + rst.append('\n') + elif name and name != 'shortlist': + i = None + if unknowncmd: + queries = (helpextcmd,) + elif opts.get('extension'): + queries = (helpext,) + elif opts.get('command'): + queries = (helpcmd,) + else: + queries = (helptopic, helpcmd, helpext, helpextcmd) + for f in queries: + try: + rst = f(name) + i = None + break + except error.UnknownCommand, inst: + i = inst + if i: + raise i + else: + # program name + if not ui.quiet: + rst = [_("Mercurial Distributed SCM\n"), '\n'] + rst.extend(helplist()) + + return ''.join(rst) diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/help/config.txt --- a/mercurial/help/config.txt Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/help/config.txt Tue May 14 23:04:23 2013 +0400 @@ -85,6 +85,9 @@ be read. Mercurial checks each of these locations in the specified order until one or more configuration files are detected. +.. note:: The registry key ``HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Mercurial`` + is used when running 32-bit Python on 64-bit Windows. + Syntax ====== @@ -745,7 +748,7 @@ For example:: [hostfingerprints] - hg.intevation.org = 38:76:52:7c:87:26:9a:8f:4a:f8:d3:de:08:45:3b:ea:d6:4b:ee:cc + hg.intevation.org = 44:ed:af:1f:97:11:b6:01:7a:48:45:fc:10:3c:b7:f9:d4:89:2a:9d This feature is only supported when using Python 2.6 or later. @@ -990,6 +993,22 @@ file exists, it is replaced. Default: None, data is printed on stderr +``sort`` + Sort field. Specific to the ``ls`` instrumenting profiler. + One of ``callcount``, ``reccallcount``, ``totaltime`` and + ``inlinetime``. + Default: inlinetime. + +``limit`` + Number of lines to show. Specific to the ``ls`` instrumenting profiler. + Default: 30. + +``nested`` + Show at most this number of lines of drill-down info after each main entry. + This can help explain the difference between Total and Inline. + Specific to the ``ls`` instrumenting profiler. + Default: 5. + ``revsetalias`` --------------- @@ -1030,12 +1049,23 @@ Host name of mail server, e.g. "mail.example.com". ``port`` - Optional. Port to connect to on mail server. Default: 25. + Optional. Port to connect to on mail server. Default: 465 (if + ``tls`` is smtps) or 25 (otherwise). ``tls`` Optional. Method to enable TLS when connecting to mail server: starttls, smtps or none. Default: none. +``verifycert`` + Optional. Verification for the certificate of mail server, when + ``tls`` is starttls or smtps. "strict", "loose" or False. 
For + "strict" or "loose", the certificate is verified as same as the + verification for HTTPS connections (see ``[hostfingerprints]`` and + ``[web] cacerts`` also). For "strict", sending email is also + aborted, if there is no configuration for mail server in + ``[hostfingerprints]`` and ``[web] cacerts``. --insecure for + :hg:`email` overwrites this as "loose". Default: "strict". + ``username`` Optional. User name for authenticating with the SMTP server. Default: none. @@ -1446,3 +1476,48 @@ ``templates`` Where to find the HTML templates. Default is install path. + +``websub`` +---------- + +Web substitution filter definition. You can use this section to +define a set of regular expression substitution patterns which +let you automatically modify the hgweb server output. + +The default hgweb templates only apply these substitution patterns +on the revision description fields. You can apply them anywhere +you want when you create your own templates by adding calls to the +"websub" filter (usually after calling the "escape" filter). + +This can be used, for example, to convert issue references to links +to your issue tracker, or to convert "markdown-like" syntax into +HTML (see the examples below). + +Each entry in this section names a substitution filter. +The value of each entry defines the substitution expression itself. +The websub expressions follow the old interhg extension syntax, +which in turn imitates the Unix sed replacement syntax:: + + patternname = s/SEARCH_REGEX/REPLACE_EXPRESSION/[i] + +You can use any separator other than "/". The final "i" is optional +and indicates that the search must be case insensitive. + +Examples:: + + [websub] + issues = s|issue(\d+)|issue\1|i + italic = s/\b_(\S+)_\b/\1<\/i>/ + bold = s/\*\b(\S+)\b\*/\1<\/b>/ + +``worker`` +---------- + +Parallel master/worker configuration. We currently perform working +directory updates in parallel on Unix-like systems, which greatly +helps performance. + +``numcpus`` + Number of CPUs to use for parallel operations. Default is 4 or the + number of CPUs on the system, whichever is larger. A zero or + negative value is treated as ``use the default``. diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/help/dates.txt --- a/mercurial/help/dates.txt Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/help/dates.txt Tue May 14 23:04:23 2013 +0400 @@ -18,6 +18,9 @@ - ``12-6`` - ``12/6`` - ``12/6/6`` (Dec 6 2006) +- ``today`` (midnight) +- ``yesterday`` (midnight) +- ``now`` - right now Lastly, there is Mercurial's internal format: diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/help/filesets.txt --- a/mercurial/help/filesets.txt Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/help/filesets.txt Tue May 14 23:04:23 2013 +0400 @@ -1,5 +1,5 @@ Mercurial supports a functional language for selecting a set of -files. +files. Like other file patterns, this pattern type is indicated by a prefix, 'set:'. The language supports a number of predicates which are joined diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/help/patterns.txt --- a/mercurial/help/patterns.txt Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/help/patterns.txt Tue May 14 23:04:23 2013 +0400 @@ -7,7 +7,7 @@ Alternate pattern notations must be specified explicitly. .. note:: - Patterns specified in ``.hgignore`` are not rooted. + Patterns specified in ``.hgignore`` are not rooted. Please see :hg:`help hgignore` for details. 
To use a plain path name without any pattern matching, start it with diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/help/phases.txt --- a/mercurial/help/phases.txt Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/help/phases.txt Tue May 14 23:04:23 2013 +0400 @@ -79,6 +79,6 @@ - resynchronize draft changesets relative to a remote repository:: - hg phase -fd 'outgoing(URL)' + hg phase -fd 'outgoing(URL)' See :hg:`help phase` for more information on manually manipulating phases. diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/help/revsets.txt --- a/mercurial/help/revsets.txt Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/help/revsets.txt Tue May 14 23:04:23 2013 +0400 @@ -118,4 +118,4 @@ - Changesets mentioning "bug" or "issue" that are not in a tagged release:: - hg log -r "(keyword(bug) or keyword(issue)) and not ancestors(tagged())" + hg log -r "(keyword(bug) or keyword(issue)) and not ancestors(tag())" diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/help/templates.txt --- a/mercurial/help/templates.txt Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/help/templates.txt Tue May 14 23:04:23 2013 +0400 @@ -38,3 +38,65 @@ List of filters: .. filtersmarker + +Note that a filter is nothing more than a function call, i.e. +``expr|filter`` is equivalent to ``filter(expr)``. + +In addition to filters, there are some basic built-in functions: + +- date(date[, fmt]) + +- fill(text[, width]) + +- get(dict, key) + +- if(expr, then[, else]) + +- ifeq(expr, expr, then[, else]) + +- join(list, sep) + +- label(label, expr) + +- sub(pat, repl, expr) + +- rstdoc(text, style) + +Also, for any expression that returns a list, there is a list operator: + +- expr % "{template}" + +Some sample command line templates: + +- Format lists, e.g. files:: + + $ hg log -r 0 --template "files:\n{files % ' {file}\n'}" + +- Join the list of files with a ", ":: + + $ hg log -r 0 --template "files: {join(files, ', ')}\n" + +- Format date:: + + $ hg log -r 0 --template "{date(date, '%Y')}\n" + +- Output the description set to a fill-width of 30:: + + $ hg log -r 0 --template "{fill(desc, '30')}" + +- Use a conditional to test for the default branch:: + + $ hg log -r 0 --template "{ifeq(branch, 'default', 'on the main branch', + 'on branch {branch}')}\n" + +- Append a newline if not empty:: + + $ hg tip --template "{if(author, '{author}\n')}" + +- Label the output for use with the color extension:: + + $ hg log -r 0 --template "{label('changeset.{phase}', node|short)}\n" + +- Invert the firstline filter, i.e. 
everything but the first line:: + + $ hg log -r 0 --template "{sub(r'^.*\n?\n?', '', desc)}\n" diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/hg.py --- a/mercurial/hg.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/hg.py Tue May 14 23:04:23 2013 +0400 @@ -9,8 +9,8 @@ from i18n import _ from lock import release from node import hex, nullid -import localrepo, bundlerepo, httppeer, sshpeer, statichttprepo, bookmarks -import lock, util, extensions, error, node, scmutil, phases, url +import localrepo, bundlerepo, unionrepo, httppeer, sshpeer, statichttprepo +import bookmarks, lock, util, extensions, error, node, scmutil, phases, url import cmdutil, discovery import merge as mergemod import verify as verifymod @@ -64,6 +64,7 @@ schemes = { 'bundle': bundlerepo, + 'union': unionrepo, 'file': _local, 'http': httppeer, 'https': httppeer, @@ -122,7 +123,7 @@ def defaultdest(source): '''return default destination of clone if none is given''' - return os.path.basename(os.path.normpath(util.url(source).path)) + return os.path.basename(os.path.normpath(util.url(source).path or '')) def share(ui, source, dest=None, update=True): '''create a shared repository''' @@ -171,14 +172,11 @@ r = repository(ui, root) default = srcrepo.ui.config('paths', 'default') - if not default: - # set default to source for being able to clone subrepos - default = os.path.abspath(util.urllocalpath(origsource)) - fp = r.opener("hgrc", "w", text=True) - fp.write("[paths]\n") - fp.write("default = %s\n" % default) - fp.close() - r.ui.setconfig('paths', 'default', default) + if default: + fp = r.opener("hgrc", "w", text=True) + fp.write("[paths]\n") + fp.write("default = %s\n" % default) + fp.close() if update: r.ui.status(_("updating working directory\n")) @@ -558,7 +556,7 @@ revs = [repo.lookup(rev) for rev in scmutil.revrange(repo, revs)] other = peer(repo, opts, dest) - outgoing = discovery.findcommonoutgoing(repo, other, revs, + outgoing = discovery.findcommonoutgoing(repo.unfiltered(), other, revs, force=opts.get('force')) o = outgoing.missing if not o: diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/hgweb/common.py --- a/mercurial/hgweb/common.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/hgweb/common.py Tue May 14 23:04:23 2013 +0400 @@ -18,6 +18,15 @@ HTTP_SERVER_ERROR = 500 +def ismember(ui, username, userlist): + """Check if username is a member of userlist. + + If userlist has a single '*' member, all users are considered members. + Can be overriden by extensions to provide more complex authorization + schemes. + """ + return userlist == ['*'] or username in userlist + def checkauthz(hgweb, req, op): '''Check permission for operation based on request data (including authentication info). 
Return if op allowed, else raise an ErrorResponse @@ -26,12 +35,11 @@ user = req.env.get('REMOTE_USER') deny_read = hgweb.configlist('web', 'deny_read') - if deny_read and (not user or deny_read == ['*'] or user in deny_read): + if deny_read and (not user or ismember(hgweb.repo.ui, user, deny_read)): raise ErrorResponse(HTTP_UNAUTHORIZED, 'read not authorized') allow_read = hgweb.configlist('web', 'allow_read') - result = (not allow_read) or (allow_read == ['*']) - if not (result or user in allow_read): + if allow_read and (not ismember(hgweb.repo.ui, user, allow_read)): raise ErrorResponse(HTTP_UNAUTHORIZED, 'read not authorized') if op == 'pull' and not hgweb.allowpull: @@ -51,12 +59,11 @@ raise ErrorResponse(HTTP_FORBIDDEN, 'ssl required') deny = hgweb.configlist('web', 'deny_push') - if deny and (not user or deny == ['*'] or user in deny): + if deny and (not user or ismember(hgweb.repo.ui, user, deny)): raise ErrorResponse(HTTP_UNAUTHORIZED, 'push not authorized') allow = hgweb.configlist('web', 'allow_push') - result = allow and (allow == ['*'] or user in allow) - if not result: + if not (allow and ismember(hgweb.repo.ui, user, allow)): raise ErrorResponse(HTTP_UNAUTHORIZED, 'push not authorized') # Hooks for hgweb permission checks; extensions can add hooks here. @@ -129,7 +136,7 @@ for part in parts: if (part in ('', os.curdir, os.pardir) or os.sep in part or os.altsep is not None and os.altsep in part): - return "" + return fpath = os.path.join(*parts) if isinstance(directory, str): directory = [directory] @@ -144,7 +151,6 @@ data = fp.read() fp.close() req.respond(HTTP_OK, ct, body=data) - return "" except TypeError: raise ErrorResponse(HTTP_SERVER_ERROR, 'illegal filename') except OSError, err: diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/hgweb/hgweb_mod.py --- a/mercurial/hgweb/hgweb_mod.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/hgweb/hgweb_mod.py Tue May 14 23:04:23 2013 +0400 @@ -7,12 +7,14 @@ # GNU General Public License version 2 or any later version. 
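``checkauthz`` above now funnels every list check through ``ismember()``,
which treats a single ``'*'`` entry as matching everyone and is the hook
point for authorization extensions. The resulting read-access logic is
roughly the following (the ``ui`` argument is dropped for brevity)::

    def ismember(username, userlist):
        # a lone '*' entry means every (possibly anonymous) user matches
        return userlist == ['*'] or username in userlist

    def readallowed(user, allow_read, deny_read):
        # deny lists win over allow lists; an empty allow list means
        # everybody may read
        if deny_read and (not user or ismember(user, deny_read)):
            return False
        if allow_read and not ismember(user, allow_read):
            return False
        return True

    assert readallowed('alice', [], [])                    # open repo
    assert not readallowed(None, [], ['*'])                # deny everyone
    assert readallowed('alice', ['alice', 'bob'], [])
    assert not readallowed('mallory', ['alice', 'bob'], [])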
import os -from mercurial import ui, hg, hook, error, encoding, templater, util +from mercurial import ui, hg, hook, error, encoding, templater, util, repoview +from mercurial.templatefilters import websub +from mercurial.i18n import _ from common import get_stat, ErrorResponse, permhooks, caching from common import HTTP_OK, HTTP_NOT_MODIFIED, HTTP_BAD_REQUEST from common import HTTP_NOT_FOUND, HTTP_SERVER_ERROR from request import wsgirequest -import webcommands, protocol, webutil +import webcommands, protocol, webutil, re perms = { 'changegroup': 'pull', @@ -24,7 +26,7 @@ 'pushkey': 'push', } -def makebreadcrumb(url): +def makebreadcrumb(url, prefix=''): '''Return a 'URL breadcrumb' list A 'URL breadcrumb' is a list of URL-name pairs, @@ -33,6 +35,8 @@ ''' if url.endswith('/'): url = url[:-1] + if prefix: + url = '/' + prefix + url relpath = url if relpath.startswith('/'): relpath = relpath[1:] @@ -59,9 +63,11 @@ else: self.repo = repo - self.repo = self.repo.filtered('served') + self.repo = self._getview(self.repo) self.repo.ui.setconfig('ui', 'report_untrusted', 'off') + self.repo.baseui.setconfig('ui', 'report_untrusted', 'off') self.repo.ui.setconfig('ui', 'nontty', 'true') + self.repo.baseui.setconfig('ui', 'nontty', 'true') hook.redirect(True) self.mtime = -1 self.size = -1 @@ -71,6 +77,7 @@ # a repo owner may set web.templates in .hg/hgrc to get any file # readable by the user running the CGI script self.templatepath = self.config('web', 'templates') + self.websubtable = self.loadwebsub() # The CGI scripts are often run by a user different from the repo owner. # Trust the settings from the .hg/hgrc files by default. @@ -86,17 +93,24 @@ return self.repo.ui.configlist(section, name, default, untrusted=untrusted) + def _getview(self, repo): + viewconfig = self.config('web', 'view', 'served') + if viewconfig == 'all': + return repo.unfiltered() + elif viewconfig in repoview.filtertable: + return repo.filtered(viewconfig) + else: + return repo.filtered('served') + def refresh(self, request=None): - if request: - self.repo.ui.environ = request.env st = get_stat(self.repo.spath) # compare changelog size in addition to mtime to catch # rollbacks made less than a second ago if st.st_mtime != self.mtime or st.st_size != self.size: self.mtime = st.st_mtime self.size = st.st_size - self.repo = hg.repository(self.repo.ui, self.repo.root) - self.repo = self.repo.filtered('served') + r = hg.repository(self.repo.baseui, self.repo.root) + self.repo = self._getview(r) self.maxchanges = int(self.config("web", "maxchanges", 10)) self.stripecount = int(self.config("web", "stripes", 1)) self.maxshortchanges = int(self.config("web", "maxshortchanges", @@ -105,6 +119,8 @@ self.allowpull = self.configbool("web", "allowpull", True) encoding.encoding = self.config("web", "encoding", encoding.encoding) + if request: + self.repo.ui.environ = request.env def run(self): if not os.environ.get('GATEWAY_INTERFACE', '').startswith("CGI/1."): @@ -231,10 +247,11 @@ return content - except error.LookupError, err: + except (error.LookupError, error.RepoLookupError), err: req.respond(HTTP_NOT_FOUND, ctype) msg = str(err) - if 'manifest' not in msg: + if (util.safehasattr(err, 'name') and + not isinstance(err, error.ManifestLookupError)): msg = 'revision not found: %s' % err.name return tmpl('error', error=msg) except (error.RepoError, error.RevlogError), inst: @@ -247,6 +264,47 @@ return [''] return tmpl('error', error=inst.message) + def loadwebsub(self): + websubtable = [] + websubdefs = 
self.repo.ui.configitems('websub') + # we must maintain interhg backwards compatibility + websubdefs += self.repo.ui.configitems('interhg') + for key, pattern in websubdefs: + # grab the delimiter from the character after the "s" + unesc = pattern[1] + delim = re.escape(unesc) + + # identify portions of the pattern, taking care to avoid escaped + # delimiters. the replace format and flags are optional, but + # delimiters are required. + match = re.match( + r'^s%s(.+)(?:(?<=\\\\)|(? 0 - need[0] -= len(s) - return needed - blocks = list(itertools.takewhile(pred, self._done_chunks)) - self._done_chunks = self._done_chunks[len(blocks):] - over_read = sum(map(len, blocks)) - amt - if over_read > 0 and blocks: - logger.debug('need to reinsert %d data into done chunks', over_read) - last = blocks[-1] - blocks[-1], reinsert = last[:-over_read], last[-over_read:] - self._done_chunks.insert(0, reinsert) + blocks = [] + need = amt + while self._done_chunks: + b = self.popchunk() + if len(b) > need: + nb = b[:need] + self.pushchunk(b[need:]) + b = nb + blocks.append(b) + need -= len(b) + if need == 0: + break result = ''.join(blocks) assert len(result) == amt or (self._finished and len(result) < amt) + return result + def readto(self, delimstr, blocks = None): + """return available data chunks up to the first one in which delimstr + occurs. No data will be returned after delimstr -- the chunk in which + it occurs will be split and the remainder pushed back onto the available + data queue. If blocks is supplied chunks will be added to blocks, otherwise + a new list will be allocated. + """ + if blocks is None: + blocks = [] + + while self._done_chunks: + b = self.popchunk() + i = b.find(delimstr) + len(delimstr) + if i: + if i < len(b): + self.pushchunk(b[i:]) + blocks.append(b[:i]) + break + else: + blocks.append(b) + + return blocks + def _load(self, data): # pragma: no cover """Subclasses must implement this. @@ -121,7 +155,7 @@ assert not self._finished, ( 'tried to add data (%r) to a closed reader!' 
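``loadwebsub()`` above turns each ``name = s/SEARCH/REPL/[i]`` entry into
a compiled pattern plus replacement (the parsing expression itself is
partially garbled in this hunk). A simplified version of the
parse-and-apply step, ignoring escaped delimiters; the bug-tracker URL is
made up for illustration::

    import re

    def parsewebsub(pattern):
        # the delimiter is whatever character follows the leading 's'
        delim = pattern[1]
        parts = pattern[2:].split(delim)
        search, repl = parts[0], parts[1]
        flags = re.IGNORECASE if len(parts) > 2 and 'i' in parts[2] else 0
        return re.compile(search, flags), repl

    def websub(text, table):
        for regexp, repl in table:
            text = regexp.sub(repl, text)
        return text

    table = [parsewebsub(r's|issue(\d+)|<a href="/bts/issue\1">issue\1</a>|i')]
    assert (websub('fixes Issue42', table) ==
            'fixes <a href="/bts/issue42">issue42</a>')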
% data) logger.debug('%s read an additional %d data', self.name, len(data)) - self._done_chunks.append(data) + self.addchunk(data) class CloseIsEndReader(AbstractSimpleReader): @@ -190,6 +224,6 @@ self._finished = True logger.debug('closing chunked reader due to chunk of length 0') return - self._done_chunks.append(data[block_start:block_start + amt]) + self.addchunk(data[block_start:block_start + amt]) position = block_start + amt + len(self._eol) # no-check-code diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/httppeer.py --- a/mercurial/httppeer.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/httppeer.py Tue May 14 23:04:23 2013 +0400 @@ -145,13 +145,14 @@ raise error.OutOfBandError(resp.read()) # accept old "text/plain" and "application/hg-changegroup" for now if not (proto.startswith('application/mercurial-') or - proto.startswith('text/plain') or + (proto.startswith('text/plain') + and not resp.headers.get('content-length')) or proto.startswith('application/hg-changegroup')): self.ui.debug("requested URL: '%s'\n" % util.hidepassword(cu)) raise error.RepoError( _("'%s' does not appear to be an hg repository:\n" "---%%<--- (%s)\n%s\n---%%<---\n") - % (safeurl, proto or 'no content-type', resp.read())) + % (safeurl, proto or 'no content-type', resp.read(1024))) if proto.startswith('application/mercurial-'): try: diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/localrepo.py --- a/mercurial/localrepo.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/localrepo.py Tue May 14 23:04:23 2013 +0400 @@ -8,7 +8,7 @@ from i18n import _ import peer, changegroup, subrepo, discovery, pushkey, obsolete, repoview import changelog, dirstate, filelog, manifest, context, bookmarks, phases -import lock, transaction, store, encoding, base85 +import lock, transaction, store, encoding import scmutil, util, extensions, hook, error, revset import match as matchmod import merge as mergemod @@ -49,7 +49,7 @@ def hasunfilteredcache(repo, name): - """check if an repo and a unfilteredproperty cached value for """ + """check if a repo has an unfilteredpropertycache value for """ return name in vars(repo.unfiltered()) def unfilteredmethod(orig): @@ -153,7 +153,7 @@ return self.requirements[:] def __init__(self, baseui, path=None, create=False): - self.wvfs = scmutil.vfs(path, expand=True) + self.wvfs = scmutil.vfs(path, expandpath=True, realpath=True) self.wopener = self.wvfs self.root = self.wvfs.base self.path = self.wvfs.join(".hg") @@ -209,8 +209,10 @@ self.sharedpath = self.path try: - s = os.path.realpath(self.opener.read("sharedpath").rstrip('\n')) - if not os.path.exists(s): + vfs = scmutil.vfs(self.vfs.read("sharedpath").rstrip('\n'), + realpath=True) + s = vfs.base + if not vfs.exists(): raise error.RepoError( _('.hg/sharedpath points to nonexistent directory %s') % s) self.sharedpath = s @@ -310,13 +312,13 @@ def unfiltered(self): """Return unfiltered version of the repository - Intended to be ovewritten by filtered repo.""" + Intended to be overwritten by filtered repo.""" return self def filtered(self, name): """Return a filtered version of a repository""" # build a new class with the mixin and the current class - # (possibily subclass of the repo) + # (possibly subclass of the repo) class proxycls(repoview.repoview, self.unfiltered().__class__): pass return proxycls(self, name) @@ -705,14 +707,18 @@ def setparents(self, p1, p2=nullid): copies = self.dirstate.setparents(p1, p2) + pctx = self[p1] if copies: # Adjust copy records, the dirstate cannot do it, it # requires access to parents manifests. 
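The reworked ``read()`` above pulls whole chunks off the queue and pushes
any unread tail back, and ``readto()`` does the same up to a delimiter. A
standalone model of that queue behaviour (simplified; the real
``readto()`` returns the list of blocks rather than a joined string)::

    class chunkqueue(object):
        def __init__(self, chunks=()):
            self.chunks = list(chunks)

        def read(self, amt):
            blocks, need = [], amt
            while self.chunks and need > 0:
                b = self.chunks.pop(0)
                if len(b) > need:
                    # push the unread tail back for the next call
                    self.chunks.insert(0, b[need:])
                    b = b[:need]
                blocks.append(b)
                need -= len(b)
            return ''.join(blocks)

        def readto(self, delim):
            blocks = []
            while self.chunks:
                b = self.chunks.pop(0)
                i = b.find(delim)
                if i != -1:
                    i += len(delim)
                    if i < len(b):
                        self.chunks.insert(0, b[i:])
                    blocks.append(b[:i])
                    break
                blocks.append(b)
            return ''.join(blocks)

    q = chunkqueue(['abc', 'def\r\n', 'ghi'])
    assert q.read(4) == 'abcd'
    assert q.readto('\r\n') == 'ef\r\n'
    assert q.read(10) == 'ghi'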
Preserve them # only for entries added to first parent. - pctx = self[p1] for f in copies: if f not in pctx and copies[f] in pctx: self.dirstate.copy(copies[f], f) + if p2 == nullid: + for f, s in sorted(self.dirstate.copies().items()): + if f not in pctx and s not in pctx: + self.dirstate.copy(None, f) def filectx(self, path, changeid=None, fileid=None): """changeid can be a changeset revision, node, or tag. @@ -729,7 +735,7 @@ return self.wopener(f, mode) def _link(self, f): - return os.path.islink(self.wjoin(f)) + return self.wvfs.islink(f) def _loadfilter(self, filter): if filter not in self.filterpats: @@ -777,7 +783,7 @@ def wread(self, filename): if self._link(filename): - data = os.readlink(self.wjoin(filename)) + data = self.wvfs.readlink(filename) else: data = self.wopener.read(filename) return self._filter(self._encodefilterpats, filename, data) @@ -789,7 +795,7 @@ else: self.wopener.write(filename, data) if 'x' in flags: - util.setflags(self.wjoin(filename), False, True) + self.wvfs.setflags(filename, False, True) def wwritedata(self, filename, data): return self._filter(self._decodefilterpats, filename, data) @@ -800,12 +806,12 @@ return tr.nest() # abort here if the journal already exists - if os.path.exists(self.sjoin("journal")): + if self.svfs.exists("journal"): raise error.RepoError( _("abandoned transaction found - run hg recover")) self._writejournal(desc) - renames = [(x, undoname(x)) for x in self._journalfiles()] + renames = [(vfs, x, undoname(x)) for vfs, x in self._journalfiles()] tr = transaction.transaction(self.ui.warn, self.sopener, self.sjoin("journal"), @@ -815,13 +821,15 @@ return tr def _journalfiles(self): - return (self.sjoin('journal'), self.join('journal.dirstate'), - self.join('journal.branch'), self.join('journal.desc'), - self.join('journal.bookmarks'), - self.sjoin('journal.phaseroots')) + return ((self.svfs, 'journal'), + (self.vfs, 'journal.dirstate'), + (self.vfs, 'journal.branch'), + (self.vfs, 'journal.desc'), + (self.vfs, 'journal.bookmarks'), + (self.svfs, 'journal.phaseroots')) def undofiles(self): - return [undoname(x) for x in self._journalfiles()] + return [vfs.join(undoname(x)) for vfs, x in self._journalfiles()] def _writejournal(self, desc): self.opener.write("journal.dirstate", @@ -838,7 +846,7 @@ def recover(self): lock = self.lock() try: - if os.path.exists(self.sjoin("journal")): + if self.svfs.exists("journal"): self.ui.status(_("rolling back interrupted transaction\n")) transaction.rollback(self.sopener, self.sjoin("journal"), self.ui.warn) @@ -855,7 +863,7 @@ try: wlock = self.wlock() lock = self.lock() - if os.path.exists(self.sjoin("undo")): + if self.svfs.exists("undo"): return self._rollback(dryrun, force) else: self.ui.warn(_("no rollback information available\n")) @@ -897,18 +905,16 @@ parents = self.dirstate.parents() self.destroying() transaction.rollback(self.sopener, self.sjoin('undo'), ui.warn) - if os.path.exists(self.join('undo.bookmarks')): - util.rename(self.join('undo.bookmarks'), - self.join('bookmarks')) - if os.path.exists(self.sjoin('undo.phaseroots')): - util.rename(self.sjoin('undo.phaseroots'), - self.sjoin('phaseroots')) + if self.vfs.exists('undo.bookmarks'): + self.vfs.rename('undo.bookmarks', 'bookmarks') + if self.svfs.exists('undo.phaseroots'): + self.svfs.rename('undo.phaseroots', 'phaseroots') self.invalidate() parentgone = (parents[0] not in self.changelog.nodemap or parents[1] not in self.changelog.nodemap) if parentgone: - util.rename(self.join('undo.dirstate'), self.join('dirstate')) + 
self.vfs.rename('undo.dirstate', 'dirstate') try: branch = self.opener.read('undo.branch') self.dirstate.setbranch(encoding.tolocal(branch)) @@ -962,7 +968,7 @@ delattr(self.unfiltered(), 'dirstate') def invalidate(self): - unfiltered = self.unfiltered() # all filecaches are stored on unfiltered + unfiltered = self.unfiltered() # all file caches are stored unfiltered for k in self._filecache: # dirstate is invalidated separately in invalidatedirstate() if k == 'dirstate': @@ -1230,12 +1236,14 @@ elif f not in self.dirstate: fail(f, _("file not tracked!")) + cctx = context.workingctx(self, text, user, date, extra, changes) + if (not force and not extra.get("close") and not merge - and not (changes[0] or changes[1] or changes[2]) + and not cctx.files() and wctx.branch() == wctx.p1().branch()): return None - if merge and changes[3]: + if merge and cctx.deleted(): raise util.Abort(_("cannot commit merge with missing files")) ms = mergemod.mergestate(self) @@ -1244,7 +1252,6 @@ raise util.Abort(_("unresolved merge conflicts " "(see hg help resolve)")) - cctx = context.workingctx(self, text, user, date, extra, changes) if editor: cctx._text = editor(self, cctx, subs) edited = (text != cctx._text) @@ -1278,11 +1285,7 @@ # update bookmarks, dirstate and mergestate bookmarks.update(self, [p1, p2], ret) - for f in changes[0] + changes[1]: - self.dirstate.normal(f) - for f in changes[2]: - self.dirstate.drop(f) - self.dirstate.setparents(ret) + cctx.markcommitted(ret) ms.reset() finally: wlock.release() @@ -1397,12 +1400,6 @@ '''Inform the repository that nodes have been destroyed. Intended for use by strip and rollback, so there's a common place for anything that has to be done after destroying history. - - If you know the branchheadcache was uptodate before nodes were removed - and you also know the set of candidate new heads that may have resulted - from the destruction, you can set newheadnodes. This will enable the - code to update the branchheads cache, rather than having future code - decide it's invalid and regenerating it from scratch. ''' # When one tries to: # 1) destroy nodes thus calling this method (e.g. strip) @@ -1412,12 +1409,11 @@ # removed. We can either remove phasecache from the filecache, # causing it to reload next time it is accessed, or simply filter # the removed nodes now and write the updated cache. - if '_phasecache' in self._filecache: - self._phasecache.filterunknown(self) - self._phasecache.write() + self._phasecache.filterunknown(self) + self._phasecache.write() # update the 'served' branch cache to help read only server process - # Thanks to branchcach collaboration this is done from the nearest + # Thanks to branchcache collaboration this is done from the nearest # filtered subset and it is expected to be fast. branchmap.updatecache(self.filtered('served')) @@ -1541,12 +1537,12 @@ modified, added, clean = [], [], [] withflags = mf1.withflags() | mf2.withflags() - for fn in mf2: + for fn, mf2node in mf2.iteritems(): if fn in mf1: if (fn not in deleted and ((fn in withflags and mf1.flags(fn) != mf2.flags(fn)) or - (mf1[fn] != mf2[fn] and - (mf2[fn] or ctx1[fn].cmp(ctx2[fn]))))): + (mf1[fn] != mf2node and + (mf2node or ctx1[fn].cmp(ctx2[fn]))))): modified.append(fn) elif listclean: clean.append(fn) @@ -1688,10 +1684,14 @@ "changegroupsubset.")) else: cg = remote.changegroupsubset(fetch, heads, 'pull') - clstart = len(self.changelog) + # we use unfiltered changelog here because hidden revision must + # be taken in account for phase synchronization. 
They may + # becomes public and becomes visible again. + cl = self.unfiltered().changelog + clstart = len(cl) result = self.addchangegroup(cg, 'pull', remote.url()) - clend = len(self.changelog) - added = [self.changelog.node(r) for r in xrange(clstart, clend)] + clend = len(cl) + added = [cl.node(r) for r in xrange(clstart, clend)] # compute target subset if heads is None: @@ -1717,17 +1717,15 @@ # should be seen as public phases.advanceboundary(self, phases.public, subset) - if obsolete._enabled: - self.ui.debug('fetching remote obsolete markers\n') - remoteobs = remote.listkeys('obsolete') - if 'dump0' in remoteobs: - if tr is None: - tr = self.transaction(trname) - for key in sorted(remoteobs, reverse=True): - if key.startswith('dump'): - data = base85.b85decode(remoteobs[key]) - self.obsstore.mergemarkers(tr, data) - self.invalidatevolatilesets() + def gettransaction(): + if tr is None: + return self.transaction(trname) + return tr + + obstr = obsolete.syncpull(self, remote, gettransaction) + if obstr is not None: + tr = obstr + if tr is not None: tr.close() finally: @@ -1764,8 +1762,31 @@ if not remote.canpush(): raise util.Abort(_("destination does not support push")) unfi = self.unfiltered() + def localphasemove(nodes, phase=phases.public): + """move to in the local source repo""" + if locallock is not None: + phases.advanceboundary(self, phase, nodes) + else: + # repo is not locked, do not change any phases! + # Informs the user that phases should have been moved when + # applicable. + actualmoves = [n for n in nodes if phase < self[n].phase()] + phasestr = phases.phasenames[phase] + if actualmoves: + self.ui.status(_('cannot lock source repo, skipping local' + ' %s phase update\n') % phasestr) # get local lock as we might write phase data - locallock = self.lock() + locallock = None + try: + locallock = self.lock() + except IOError, err: + if err.errno != errno.EACCES: + raise + # source repo cannot be locked. + # We do not abort the push, but just disable the local phase + # synchronisation. + msg = 'cannot lock source repository: %s\n' % err + self.ui.debug(msg) try: self.checkpush(force, revs) lock = None @@ -1870,18 +1891,32 @@ cheads.extend(c.node() for c in revset) # even when we don't push, exchanging phase data is useful remotephases = remote.listkeys('phases') + if (self.ui.configbool('ui', '_usedassubrepo', False) + and remotephases # server supports phases + and ret is None # nothing was pushed + and remotephases.get('publishing', False)): + # When: + # - this is a subrepo push + # - and remote support phase + # - and no changeset was pushed + # - and remote is publishing + # We may be in issue 3871 case! + # We drop the possible phase synchronisation done by + # courtesy to publish changesets possibly locally draft + # on the remote. 
+ remotephases = {'publishing': 'True'} if not remotephases: # old server or public only repo - phases.advanceboundary(self, phases.public, cheads) + localphasemove(cheads) # don't push any phase data as there is nothing to push else: ana = phases.analyzeremotephases(self, cheads, remotephases) pheads, droots = ana ### Apply remote phase on local if remotephases.get('publishing', False): - phases.advanceboundary(self, phases.public, cheads) + localphasemove(cheads) else: # publish = False - phases.advanceboundary(self, phases.public, pheads) - phases.advanceboundary(self, phases.draft, cheads) + localphasemove(pheads) + localphasemove(cheads, phases.draft) ### Apply local phase on remote # Get the list of all revs draft on remote by public here. @@ -1898,22 +1933,13 @@ self.ui.warn(_('updating %s to public failed!\n') % newremotehead) self.ui.debug('try to push obsolete markers to remote\n') - if (obsolete._enabled and self.obsstore and - 'obsolete' in remote.listkeys('namespaces')): - rslts = [] - remotedata = self.listkeys('obsolete') - for key in sorted(remotedata, reverse=True): - # reverse sort to ensure we end with dump0 - data = remotedata[key] - rslts.append(remote.pushkey('obsolete', key, '', data)) - if [r for r in rslts if not r]: - msg = _('failed to push some obsolete markers!\n') - self.ui.warn(msg) + obsolete.syncpush(self, remote) finally: if lock is not None: lock.release() finally: - locallock.release() + if locallock is not None: + locallock.release() self.ui.debug("checking for updated bookmarks\n") rb = remote.listkeys('bookmarks') @@ -2390,6 +2416,12 @@ for n in added: self.hook("incoming", node=hex(n), source=srctype, url=url) + + newheads = [h for h in self.heads() if h not in oldheads] + self.ui.log("incoming", + "%s incoming changes - new heads: %s\n", + len(added), + ', '.join([hex(c[:6]) for c in newheads])) self._afterlock(runhooks) finally: @@ -2557,9 +2589,9 @@ def aftertrans(files): renamefiles = [tuple(t) for t in files] def a(): - for src, dest in renamefiles: + for vfs, src, dest in renamefiles: try: - util.rename(src, dest) + vfs.rename(src, dest) except OSError: # journal file does not yet exist pass return a diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/lock.py --- a/mercurial/lock.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/lock.py Tue May 14 23:04:23 2013 +0400 @@ -36,6 +36,7 @@ self.releasefn = releasefn self.desc = desc self.postrelease = [] + self.pid = os.getpid() self.lock() def __del__(self): @@ -71,7 +72,7 @@ return if lock._host is None: lock._host = socket.gethostname() - lockname = '%s:%s' % (lock._host, os.getpid()) + lockname = '%s:%s' % (lock._host, self.pid) while not self.held: try: util.makelock(lockname, self.f) @@ -133,6 +134,9 @@ self.held -= 1 elif self.held == 1: self.held = 0 + if os.getpid() != self.pid: + # we forked, and are not the parent + return if self.releasefn: self.releasefn() try: diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/lsprof.py --- a/mercurial/lsprof.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/lsprof.py Tue May 14 23:04:23 2013 +0400 @@ -50,9 +50,9 @@ ccount = 0 if climit and e.calls: for se in e.calls: - file.write(cols % ("+%s" % se.callcount, se.reccallcount, + file.write(cols % (se.callcount, se.reccallcount, se.totaltime, se.inlinetime, - "+%s" % label(se.code))) + " %s" % label(se.code))) count += 1 ccount += 1 if limit is not None and count == limit: diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/mail.py --- a/mercurial/mail.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/mail.py Tue May 
14 23:04:23 2013 +0400 @@ -6,8 +6,8 @@ # GNU General Public License version 2 or any later version. from i18n import _ -import util, encoding -import os, smtplib, socket, quopri, time +import util, encoding, sslutil +import os, smtplib, socket, quopri, time, sys import email.Header, email.MIMEText, email.Utils _oldheaderinit = email.Header.Header.__init__ @@ -30,6 +30,59 @@ email.Header.Header.__dict__['__init__'] = _unifiedheaderinit +class STARTTLS(smtplib.SMTP): + '''Derived class to verify the peer certificate for STARTTLS. + + This class allows to pass any keyword arguments to SSL socket creation. + ''' + def __init__(self, sslkwargs, **kwargs): + smtplib.SMTP.__init__(self, **kwargs) + self._sslkwargs = sslkwargs + + def starttls(self, keyfile=None, certfile=None): + if not self.has_extn("starttls"): + msg = "STARTTLS extension not supported by server" + raise smtplib.SMTPException(msg) + (resp, reply) = self.docmd("STARTTLS") + if resp == 220: + self.sock = sslutil.ssl_wrap_socket(self.sock, keyfile, certfile, + **self._sslkwargs) + if not util.safehasattr(self.sock, "read"): + # using httplib.FakeSocket with Python 2.5.x or earlier + self.sock.read = self.sock.recv + self.file = smtplib.SSLFakeFile(self.sock) + self.helo_resp = None + self.ehlo_resp = None + self.esmtp_features = {} + self.does_esmtp = 0 + return (resp, reply) + +if util.safehasattr(smtplib.SMTP, '_get_socket'): + class SMTPS(smtplib.SMTP): + '''Derived class to verify the peer certificate for SMTPS. + + This class allows to pass any keyword arguments to SSL socket creation. + ''' + def __init__(self, sslkwargs, keyfile=None, certfile=None, **kwargs): + self.keyfile = keyfile + self.certfile = certfile + smtplib.SMTP.__init__(self, **kwargs) + self.default_port = smtplib.SMTP_SSL_PORT + self._sslkwargs = sslkwargs + + def _get_socket(self, host, port, timeout): + if self.debuglevel > 0: + print >> sys.stderr, 'connect:', (host, port) + new_socket = socket.create_connection((host, port), timeout) + new_socket = sslutil.ssl_wrap_socket(new_socket, + self.keyfile, self.certfile, + **self._sslkwargs) + self.file = smtplib.SSLFakeFile(new_socket) + return new_socket +else: + def SMTPS(sslkwargs, keyfile=None, certfile=None, **kwargs): + raise util.Abort(_('SMTPS requires Python 2.6 or later')) + def _smtp(ui): '''build an smtp connection and return a function to send mail''' local_hostname = ui.config('smtp', 'local_hostname') @@ -39,15 +92,30 @@ smtps = tls == 'smtps' if (starttls or smtps) and not util.safehasattr(socket, 'ssl'): raise util.Abort(_("can't use TLS: Python SSL support not installed")) - if smtps: - ui.note(_('(using smtps)\n')) - s = smtplib.SMTP_SSL(local_hostname=local_hostname) - else: - s = smtplib.SMTP(local_hostname=local_hostname) mailhost = ui.config('smtp', 'host') if not mailhost: raise util.Abort(_('smtp.host not configured - cannot send mail')) - mailport = util.getport(ui.config('smtp', 'port', 25)) + verifycert = ui.config('smtp', 'verifycert', 'strict') + if verifycert not in ['strict', 'loose']: + if util.parsebool(verifycert) is not False: + raise util.Abort(_('invalid smtp.verifycert configuration: %s') + % (verifycert)) + if (starttls or smtps) and verifycert: + sslkwargs = sslutil.sslkwargs(ui, mailhost) + else: + sslkwargs = {} + if smtps: + ui.note(_('(using smtps)\n')) + s = SMTPS(sslkwargs, local_hostname=local_hostname) + elif starttls: + s = STARTTLS(sslkwargs, local_hostname=local_hostname) + else: + s = smtplib.SMTP(local_hostname=local_hostname) + if smtps: + defaultport = 
465 + else: + defaultport = 25 + mailport = util.getport(ui.config('smtp', 'port', defaultport)) ui.note(_('sending mail: smtp host %s, port %s\n') % (mailhost, mailport)) s.connect(host=mailhost, port=mailport) @@ -56,6 +124,9 @@ s.ehlo() s.starttls() s.ehlo() + if (starttls or smtps) and verifycert: + ui.note(_('(verifying remote certificate)\n')) + sslutil.validator(ui, mailhost)(s.sock, verifycert == 'strict') username = ui.config('smtp', 'username') password = ui.config('smtp', 'password') if username and not password: diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/manifest.py --- a/mercurial/manifest.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/manifest.py Tue May 14 23:04:23 2013 +0400 @@ -6,7 +6,7 @@ # GNU General Public License version 2 or any later version. from i18n import _ -import mdiff, parsers, error, revlog, util +import mdiff, parsers, error, revlog, util, dicthelpers import array, struct class manifestdict(dict): @@ -25,10 +25,13 @@ self._flags[f] = flags def copy(self): return manifestdict(self, dict.copy(self._flags)) + def flagsdiff(self, d2): + return dicthelpers.diff(self._flags, d2._flags, "") class manifest(revlog.revlog): def __init__(self, opener): - self._mancache = None + # we expect to deal with not more than three revs at a time in merge + self._mancache = util.lrucachedict(3) revlog.revlog.__init__(self, opener, "00manifest.i") def parse(self, lines): @@ -51,12 +54,12 @@ def read(self, node): if node == revlog.nullid: return manifestdict() # don't upset local cache - if self._mancache and self._mancache[0] == node: - return self._mancache[1] + if node in self._mancache: + return self._mancache[node][0] text = self.revision(node) arraytext = array.array('c', text) mapping = self.parse(text) - self._mancache = (node, mapping, arraytext) + self._mancache[node] = (mapping, arraytext) return mapping def _search(self, m, s, lo=0, hi=None): @@ -102,8 +105,9 @@ def find(self, node, f): '''look up entry for a single file efficiently. 
return (node, flags) pair if found, (None, None) if not.''' - if self._mancache and self._mancache[0] == node: - return self._mancache[1].get(f), self._mancache[1].flags(f) + if node in self._mancache: + mapping = self._mancache[node][0] + return mapping.get(f), mapping.flags(f) text = self.revision(node) start, end = self._search(text, f) if start == end: @@ -143,7 +147,7 @@ # if we're using the cache, make sure it is valid and # parented by the same node we're diffing against - if not (changed and self._mancache and p1 and self._mancache[0] == p1): + if not (changed and p1 and (p1 in self._mancache)): files = sorted(map) checkforbidden(files) @@ -156,7 +160,7 @@ cachedelta = None else: added, removed = changed - addlist = self._mancache[2] + addlist = self._mancache[p1][1] checkforbidden(added) # combine the changed lists into one list for sorting @@ -208,6 +212,6 @@ text = util.buffer(arraytext) n = self.addrevision(text, transaction, link, p1, p2, cachedelta) - self._mancache = (n, map, arraytext) + self._mancache[n] = (map, arraytext) return n diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/match.py --- a/mercurial/match.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/match.py Tue May 14 23:04:23 2013 +0400 @@ -62,6 +62,7 @@ self._files = [] self._anypats = bool(include or exclude) self._ctx = ctx + self._always = False if include: pats = _normalize(include, 'glob', root, cwd, auditor) @@ -103,6 +104,7 @@ m = lambda f: not em(f) else: m = lambda f: True + self._always = True self.matchfn = m self._fmap = set(self._files) @@ -130,7 +132,7 @@ def anypats(self): return self._anypats def always(self): - return False + return self._always class exact(match): def __init__(self, root, cwd, files): @@ -139,8 +141,7 @@ class always(match): def __init__(self, root, cwd): match.__init__(self, root, cwd, []) - def always(self): - return True + self._always = True class narrowmatcher(match): """Adapt a matcher to work on a subdirectory only. @@ -175,6 +176,7 @@ self._cwd = matcher._cwd self._path = path self._matcher = matcher + self._always = matcher._always self._files = [f[len(path) + 1:] for f in matcher._files if f.startswith(path + "/")] @@ -342,7 +344,7 @@ r.append('/'.join(root) or '.') elif kind in ('relpath', 'path'): r.append(name or '.') - elif kind == 'relglob': + else: # relglob, re, relre r.append('.') return r diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/merge.py --- a/mercurial/merge.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/merge.py Tue May 14 23:04:23 2013 +0400 @@ -7,7 +7,8 @@ from node import nullid, nullrev, hex, bin from i18n import _ -import error, util, filemerge, copies, subrepo +from mercurial import obsolete +import error, util, filemerge, copies, subrepo, worker, dicthelpers import errno, os, shutil class mergestate(object): @@ -109,54 +110,6 @@ raise util.Abort(_("untracked files in working directory differ " "from files in requested revision")) -def _remains(f, m, ma, workingctx=False): - """check whether specified file remains after merge. - - It is assumed that specified file is not contained in the manifest - of the other context. 
- """ - if f in ma: - n = m[f] - if n != ma[f]: - return True # because it is changed locally - # even though it doesn't remain, if "remote deleted" is - # chosen in manifestmerge() - elif workingctx and n[20:] == "a": - return True # because it is added locally (linear merge specific) - else: - return False # because it is removed remotely - else: - return True # because it is added locally - -def _checkcollision(mctx, extractxs): - "check for case folding collisions in the destination context" - folded = {} - for fn in mctx: - fold = util.normcase(fn) - if fold in folded: - raise util.Abort(_("case-folding collision between %s and %s") - % (fn, folded[fold])) - folded[fold] = fn - - if extractxs: - wctx, actx = extractxs - # class to delay looking up copy mapping - class pathcopies(object): - @util.propertycache - def map(self): - # {dst@mctx: src@wctx} copy mapping - return copies.pathcopies(wctx, mctx) - pc = pathcopies() - - for fn in wctx: - fold = util.normcase(fn) - mfn = folded.get(fold, None) - if (mfn and mfn != fn and pc.map.get(mfn) != fn and - _remains(fn, wctx.manifest(), actx.manifest(), True) and - _remains(mfn, mctx.manifest(), actx.manifest())): - raise util.Abort(_("case-folding collision between %s and %s") - % (mfn, fn)) - def _forgetremoved(wctx, mctx, branchmerge): """ Forget removed files @@ -176,146 +129,296 @@ state = branchmerge and 'r' or 'f' for f in wctx.deleted(): if f not in mctx: - actions.append((f, state)) + actions.append((f, state, None, "forget deleted")) if not branchmerge: for f in wctx.removed(): if f not in mctx: - actions.append((f, "f")) + actions.append((f, "f", None, "forget removed")) return actions -def manifestmerge(repo, p1, p2, pa, overwrite, partial): +def _checkcollision(repo, wmf, actions, prompts): + # build provisional merged manifest up + pmmf = set(wmf) + + def addop(f, args): + pmmf.add(f) + def removeop(f, args): + pmmf.discard(f) + def nop(f, args): + pass + + def renameop(f, args): + f2, fd, flags = args + if f: + pmmf.discard(f) + pmmf.add(fd) + def mergeop(f, args): + f2, fd, move = args + if move: + pmmf.discard(f) + pmmf.add(fd) + + opmap = { + "a": addop, + "d": renameop, + "dr": nop, + "e": nop, + "f": addop, # untracked file should be kept in working directory + "g": addop, + "m": mergeop, + "r": removeop, + "rd": nop, + } + for f, m, args, msg in actions: + op = opmap.get(m) + assert op, m + op(f, args) + + opmap = { + "cd": addop, + "dc": addop, + } + for f, m in prompts: + op = opmap.get(m) + assert op, m + op(f, None) + + # check case-folding collision in provisional merged manifest + foldmap = {} + for f in sorted(pmmf): + fold = util.normcase(f) + if fold in foldmap: + raise util.Abort(_("case-folding collision between %s and %s") + % (f, foldmap[fold])) + foldmap[fold] = f + +def manifestmerge(repo, wctx, p2, pa, branchmerge, force, partial, + acceptremote=False): """ Merge p1 and p2 with ancestor pa and generate merge action list - overwrite = whether we clobber working files + branchmerge and force are as passed in to update partial = function to filter file lists + acceptremote = accept the incoming changes without prompting """ - def act(msg, m, f, *args): - repo.ui.debug(" %s: %s -> %s\n" % (f, msg, m)) - actions.append((f, m) + args) - + overwrite = force and not branchmerge actions, copy, movewithdir = [], {}, {} + followcopies = False if overwrite: - pa = p1 + pa = wctx elif pa == p2: # backwards - pa = p1.p1() + pa = wctx.p1() + elif not branchmerge and not wctx.dirty(missing=True): + pass elif pa and 
repo.ui.configbool("merge", "followcopies", True): - ret = copies.mergecopies(repo, p1, p2, pa) + followcopies = True + + # manifests fetched in order are going to be faster, so prime the caches + [x.manifest() for x in + sorted(wctx.parents() + [p2, pa], key=lambda x: x.rev())] + + if followcopies: + ret = copies.mergecopies(repo, wctx, p2, pa) copy, movewithdir, diverge, renamedelete = ret for of, fl in diverge.iteritems(): - act("divergent renames", "dr", of, fl) + actions.append((of, "dr", (fl,), "divergent renames")) for of, fl in renamedelete.iteritems(): - act("rename and delete", "rd", of, fl) + actions.append((of, "rd", (fl,), "rename and delete")) repo.ui.note(_("resolving manifests\n")) - repo.ui.debug(" overwrite: %s, partial: %s\n" - % (bool(overwrite), bool(partial))) - repo.ui.debug(" ancestor: %s, local: %s, remote: %s\n" % (pa, p1, p2)) + repo.ui.debug(" branchmerge: %s, force: %s, partial: %s\n" + % (bool(branchmerge), bool(force), bool(partial))) + repo.ui.debug(" ancestor: %s, local: %s, remote: %s\n" % (pa, wctx, p2)) - m1, m2, ma = p1.manifest(), p2.manifest(), pa.manifest() + m1, m2, ma = wctx.manifest(), p2.manifest(), pa.manifest() copied = set(copy.values()) copied.update(movewithdir.values()) if '.hgsubstate' in m1: # check whether sub state is modified - for s in sorted(p1.substate): - if p1.sub(s).dirty(): + for s in sorted(wctx.substate): + if wctx.sub(s).dirty(): m1['.hgsubstate'] += "+" break + aborts, prompts = [], [] # Compare manifests - visit = m1.iteritems() - if repo.ui.debugflag: - visit = sorted(visit) - for f, n in visit: + fdiff = dicthelpers.diff(m1, m2) + flagsdiff = m1.flagsdiff(m2) + diff12 = dicthelpers.join(fdiff, flagsdiff) + + for f, (n12, fl12) in diff12.iteritems(): + if n12: + n1, n2 = n12 + else: # file contents didn't change, but flags did + n1 = n2 = m1.get(f, None) + if n1 is None: + # Since n1 == n2, the file isn't present in m2 either. This + # means that the file was removed or deleted locally and + # removed remotely, but that residual entries remain in flags. + # This can happen in manifests generated by workingctx. 
+ continue + if fl12: + fl1, fl2 = fl12 + else: # flags didn't change, file contents did + fl1 = fl2 = m1.flags(f) + if partial and not partial(f): continue - if f in m2: - n2 = m2[f] - fl1, fl2, fla = m1.flags(f), m2.flags(f), ma.flags(f) + if n1 and n2: + fla = ma.flags(f) nol = 'l' not in fl1 + fl2 + fla a = ma.get(f, nullid) - if n == n2 and fl1 == fl2: - pass # same - keep local - elif n2 == a and fl2 == fla: + if n2 == a and fl2 == fla: pass # remote unchanged - keep local - elif n == a and fl1 == fla: # local unchanged - use remote - if n == n2: # optimization: keep local content - act("update permissions", "e", f, fl2) + elif n1 == a and fl1 == fla: # local unchanged - use remote + if n1 == n2: # optimization: keep local content + actions.append((f, "e", (fl2,), "update permissions")) else: - act("remote is newer", "g", f, fl2) + actions.append((f, "g", (fl2,), "remote is newer")) elif nol and n2 == a: # remote only changed 'x' - act("update permissions", "e", f, fl2) - elif nol and n == a: # local only changed 'x' - act("remote is newer", "g", f, fl) + actions.append((f, "e", (fl2,), "update permissions")) + elif nol and n1 == a: # local only changed 'x' + actions.append((f, "g", (fl1,), "remote is newer")) else: # both changed something - act("versions differ", "m", f, f, f, False) + actions.append((f, "m", (f, f, False), "versions differ")) elif f in copied: # files we'll deal with on m2 side pass - elif f in movewithdir: # directory rename + elif n1 and f in movewithdir: # directory rename f2 = movewithdir[f] - act("remote renamed directory to " + f2, "d", f, None, f2, - m1.flags(f)) - elif f in copy: + actions.append((f, "d", (None, f2, fl1), + "remote renamed directory to " + f2)) + elif n1 and f in copy: f2 = copy[f] - act("local copied/moved to " + f2, "m", f, f2, f, False) - elif f in ma: # clean, a different, no remote - if n != ma[f]: - if repo.ui.promptchoice( - _(" local changed %s which remote deleted\n" - "use (c)hanged version or (d)elete?") % f, - (_("&Changed"), _("&Delete")), 0): - act("prompt delete", "r", f) - else: - act("prompt keep", "a", f) - elif n[20:] == "a": # added, no remote - act("remote deleted", "f", f) + actions.append((f, "m", (f2, f, False), + "local copied/moved to " + f2)) + elif n1 and f in ma: # clean, a different, no remote + if n1 != ma[f]: + prompts.append((f, "cd")) # prompt changed/deleted + elif n1[20:] == "a": # added, no remote + actions.append((f, "f", None, "remote deleted")) else: - act("other deleted", "r", f) - - visit = m2.iteritems() - if repo.ui.debugflag: - visit = sorted(visit) - for f, n in visit: - if partial and not partial(f): - continue - if f in m1 or f in copied: # files already visited - continue - if f in movewithdir: + actions.append((f, "r", None, "other deleted")) + elif n2 and f in movewithdir: f2 = movewithdir[f] - act("local renamed directory to " + f2, "d", None, f, f2, - m2.flags(f)) - elif f in copy: + actions.append((None, "d", (f, f2, fl2), + "local renamed directory to " + f2)) + elif n2 and f in copy: f2 = copy[f] if f2 in m2: - act("remote copied to " + f, "m", - f2, f, f, False) + actions.append((f2, "m", (f, f, False), + "remote copied to " + f)) + else: + actions.append((f2, "m", (f, f, True), + "remote moved to " + f)) + elif n2 and f not in ma: + # local unknown, remote created: the logic is described by the + # following table: + # + # force branchmerge different | action + # n * n | get + # n * y | abort + # y n * | get + # y y n | get + # y y y | merge + # + # Checking whether the files are 
different is expensive, so we + # don't do that when we can avoid it. + if force and not branchmerge: + actions.append((f, "g", (fl2,), "remote created")) else: - act("remote moved to " + f, "m", - f2, f, f, True) - elif f not in ma: - if (not overwrite - and _checkunknownfile(repo, p1, p2, f)): - act("remote differs from untracked local", - "m", f, f, f, False) + different = _checkunknownfile(repo, wctx, p2, f) + if force and branchmerge and different: + actions.append((f, "m", (f, f, False), + "remote differs from untracked local")) + elif not force and different: + aborts.append((f, "ud")) + else: + actions.append((f, "g", (fl2,), "remote created")) + elif n2 and n2 != ma[f]: + prompts.append((f, "dc")) # prompt deleted/changed + + for f, m in sorted(aborts): + if m == "ud": + repo.ui.warn(_("%s: untracked file differs\n") % f) + else: assert False, m + if aborts: + raise util.Abort(_("untracked files in working directory differ " + "from files in requested revision")) + + if not util.checkcase(repo.path): + # check collision between files only in p2 for clean update + if (not branchmerge and + (force or not wctx.dirty(missing=True, branch=False))): + _checkcollision(repo, m2, [], []) + else: + _checkcollision(repo, m1, actions, prompts) + + for f, m in sorted(prompts): + if m == "cd": + if acceptremote: + actions.append((f, "r", None, "remote delete")) + elif repo.ui.promptchoice( + _("local changed %s which remote deleted\n" + "use (c)hanged version or (d)elete?") % f, + (_("&Changed"), _("&Delete")), 0): + actions.append((f, "r", None, "prompt delete")) else: - act("remote created", "g", f, m2.flags(f)) - elif n != ma[f]: - if repo.ui.promptchoice( + actions.append((f, "a", None, "prompt keep")) + elif m == "dc": + if acceptremote: + actions.append((f, "g", (m2.flags(f),), "remote recreating")) + elif repo.ui.promptchoice( _("remote changed %s which local deleted\n" "use (c)hanged version or leave (d)eleted?") % f, (_("&Changed"), _("&Deleted")), 0) == 0: - act("prompt recreating", "g", f, m2.flags(f)) - + actions.append((f, "g", (m2.flags(f),), "prompt recreating")) + else: assert False, m return actions def actionkey(a): return a[1] == "r" and -1 or 0, a +def getremove(repo, mctx, overwrite, args): + """apply usually-non-interactive updates to the working directory + + mctx is the context to be merged into the working copy + + yields tuples for progress updates + """ + verbose = repo.ui.verbose + unlink = util.unlinkpath + wjoin = repo.wjoin + fctx = mctx.filectx + wwrite = repo.wwrite + audit = repo.wopener.audit + i = 0 + for arg in args: + f = arg[0] + if arg[1] == 'r': + if verbose: + repo.ui.note(_("removing %s\n") % f) + audit(f) + try: + unlink(wjoin(f), ignoremissing=True) + except OSError, inst: + repo.ui.warn(_("update failed to remove %s: %s!\n") % + (f, inst.strerror)) + else: + if verbose: + repo.ui.note(_("getting %s\n") % f) + wwrite(f, fctx(f).data(), arg[2][0]) + if i == 100: + yield i, f + i = 0 + i += 1 + if i > 0: + yield i, f + def applyupdates(repo, actions, wctx, mctx, actx, overwrite): """apply the merge action list to the working directory @@ -335,12 +438,13 @@ # prescan for merges for a in actions: - f, m = a[:2] + f, m, args, msg = a + repo.ui.debug(" %s: %s -> %s\n" % (f, msg, m)) if m == "m": # merge - f2, fd, move = a[2:] + f2, fd, move = args if fd == '.hgsubstate': # merged internally continue - repo.ui.debug("preserving %s for resolve of %s\n" % (f, fd)) + repo.ui.debug(" preserving %s for resolve of %s\n" % (f, fd)) fcl = wctx[f] fco = mctx[f2] if 
mctx == actx: # backwards, use working dir parent as ancestor @@ -366,27 +470,47 @@ util.unlinkpath(repo.wjoin(f)) numupdates = len(actions) - for i, a in enumerate(actions): - f, m = a[:2] - repo.ui.progress(_('updating'), i + 1, item=f, total=numupdates, + workeractions = [a for a in actions if a[1] in 'gr'] + updateactions = [a for a in workeractions if a[1] == 'g'] + updated = len(updateactions) + removeactions = [a for a in workeractions if a[1] == 'r'] + removed = len(removeactions) + actions = [a for a in actions if a[1] not in 'gr'] + + hgsub = [a[1] for a in workeractions if a[0] == '.hgsubstate'] + if hgsub and hgsub[0] == 'r': + subrepo.submerge(repo, wctx, mctx, wctx, overwrite) + + z = 0 + prog = worker.worker(repo.ui, 0.001, getremove, (repo, mctx, overwrite), + removeactions) + for i, item in prog: + z += i + repo.ui.progress(_('updating'), z, item=item, total=numupdates, unit=_('files')) - if m == "r": # remove - repo.ui.note(_("removing %s\n") % f) - audit(f) - if f == '.hgsubstate': # subrepo states need updating - subrepo.submerge(repo, wctx, mctx, wctx, overwrite) - try: - util.unlinkpath(repo.wjoin(f), ignoremissing=True) - except OSError, inst: - repo.ui.warn(_("update failed to remove %s: %s!\n") % - (f, inst.strerror)) - removed += 1 - elif m == "m": # merge + prog = worker.worker(repo.ui, 0.001, getremove, (repo, mctx, overwrite), + updateactions) + for i, item in prog: + z += i + repo.ui.progress(_('updating'), z, item=item, total=numupdates, + unit=_('files')) + + if hgsub and hgsub[0] == 'g': + subrepo.submerge(repo, wctx, mctx, wctx, overwrite) + + _updating = _('updating') + _files = _('files') + progress = repo.ui.progress + + for i, a in enumerate(actions): + f, m, args, msg = a + progress(_updating, z + i + 1, item=f, total=numupdates, unit=_files) + if m == "m": # merge + f2, fd, move = args if fd == '.hgsubstate': # subrepo states need updating subrepo.submerge(repo, wctx, mctx, wctx.ancestor(mctx), overwrite) continue - f2, fd, move = a[2:] audit(fd) r = ms.resolve(fd, wctx, mctx) if r is not None and r > 0: @@ -396,15 +520,8 @@ updated += 1 else: merged += 1 - elif m == "g": # get - flags = a[2] - repo.ui.note(_("getting %s\n") % f) - repo.wwrite(f, mctx.filectx(f).data(), flags) - updated += 1 - if f == '.hgsubstate': # subrepo states need updating - subrepo.submerge(repo, wctx, mctx, wctx, overwrite) elif m == "d": # directory rename - f2, fd, flags = a[2:] + f2, fd, flags = args if f: repo.ui.note(_("moving %s to %s\n") % (f, fd)) audit(f) @@ -415,53 +532,44 @@ repo.wwrite(fd, mctx.filectx(f2).data(), flags) updated += 1 elif m == "dr": # divergent renames - fl = a[2] + fl, = args repo.ui.warn(_("note: possible conflict - %s was renamed " "multiple times to:\n") % f) for nf in fl: repo.ui.warn(" %s\n" % nf) elif m == "rd": # rename and delete - fl = a[2] + fl, = args repo.ui.warn(_("note: possible conflict - %s was deleted " "and renamed to:\n") % f) for nf in fl: repo.ui.warn(" %s\n" % nf) elif m == "e": # exec - flags = a[2] + flags, = args audit(f) util.setflags(repo.wjoin(f), 'l' in flags, 'x' in flags) updated += 1 ms.commit() - repo.ui.progress(_('updating'), None, total=numupdates, unit=_('files')) + progress(_updating, None, total=numupdates, unit=_files) return updated, merged, removed, unresolved -def calculateupdates(repo, tctx, mctx, ancestor, branchmerge, force, partial): +def calculateupdates(repo, tctx, mctx, ancestor, branchmerge, force, partial, + acceptremote=False): "Calculate the actions needed to merge mctx into tctx" actions = 
[] - folding = not util.checkcase(repo.path) - if folding: - # collision check is not needed for clean update - if (not branchmerge and - (force or not tctx.dirty(missing=True, branch=False))): - _checkcollision(mctx, None) - else: - _checkcollision(mctx, (tctx, ancestor)) - if not force: - _checkunknown(repo, tctx, mctx) + actions += manifestmerge(repo, tctx, mctx, + ancestor, + branchmerge, force, + partial, acceptremote) if tctx.rev() is None: actions += _forgetremoved(tctx, mctx, branchmerge) - actions += manifestmerge(repo, tctx, mctx, - ancestor, - force and not branchmerge, - partial) return actions def recordupdates(repo, actions, branchmerge): "record merge actions to the dirstate" for a in actions: - f, m = a[:2] + f, m, args, msg = a if m == "r": # remove if branchmerge: repo.dirstate.remove(f) @@ -480,7 +588,7 @@ else: repo.dirstate.normal(f) elif m == "m": # merge - f2, fd, move = a[2:] + f2, fd, move = args if branchmerge: # We've done a branch merge, mark this file as merged # so that we properly record the merger later @@ -503,7 +611,7 @@ if move: repo.dirstate.drop(f) elif m == "d": # directory rename - f2, fd, flag = a[2:] + f2, fd, flag = args if not f2 and f not in repo.dirstate: # untracked file moved continue @@ -528,10 +636,11 @@ branchmerge = whether to merge between branches force = whether to force branch merging or file overwriting partial = a function to filter file lists (dirstate not updated) - mergeancestor = if false, merging with an ancestor (fast-forward) - is only allowed between different named branches. This flag - is used by rebase extension as a temporary fix and should be - avoided in general. + mergeancestor = whether it is merging with an ancestor. If true, + we should accept the incoming changes for any prompts that occur. + If false, merging with an ancestor (fast-forward) is only allowed + between different named branches. This flag is used by rebase extension + as a temporary fix and should be avoided in general. The table below shows all the behaviors of the update command given the -c and -C or no options, whether the working directory @@ -605,21 +714,30 @@ "subrepository '%s'") % s) elif not overwrite: - if pa == p1 or pa == p2: # linear - pass # all good - elif wc.dirty(missing=True): - raise util.Abort(_("crosses branches (merge branches or use" - " --clean to discard changes)")) - elif onode is None: - raise util.Abort(_("crosses branches (merge branches or update" - " --check to force update)")) - else: - # Allow jumping branches if clean and specific rev given - pa = p1 + if pa not in (p1, p2): # nolinear + dirty = wc.dirty(missing=True) + if dirty or onode is None: + # Branching is a bit strange to ensure we do the minimal + # amount of call to obsolete.background. 
+ foreground = obsolete.foreground(repo, [p1.node()]) + # note: the variable contains a random identifier + if repo[node].node() in foreground: + pa = p1 # allow updating to successors + elif dirty: + msg = _("crosses branches (merge branches or use" + " --clean to discard changes)") + raise util.Abort(msg) + else: # node is none + msg = _("crosses branches (merge branches or update" + " --check to force update)") + raise util.Abort(msg) + else: + # Allow jumping branches if clean and specific rev given + pa = p1 ### calculate phase actions = calculateupdates(repo, wc, p2, pa, - branchmerge, force, partial) + branchmerge, force, partial, mergeancestor) ### apply phase if not branchmerge: # just jump to the new rev diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/minirst.py --- a/mercurial/minirst.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/minirst.py Tue May 14 23:04:23 2013 +0400 @@ -22,6 +22,20 @@ import util, encoding from i18n import _ +import cgi + +def section(s): + return "%s\n%s\n\n" % (s, "\"" * encoding.colwidth(s)) + +def subsection(s): + return "%s\n%s\n\n" % (s, '=' * encoding.colwidth(s)) + +def subsubsection(s): + return "%s\n%s\n\n" % (s, "-" * encoding.colwidth(s)) + +def subsubsubsection(s): + return "%s\n%s\n\n" % (s, "." * encoding.colwidth(s)) + def replace(text, substs): ''' Apply a list of (find, replace) pairs to a text. @@ -512,6 +526,9 @@ headernest = '' listnest = [] + def escape(s): + return cgi.escape(s, True) + def openlist(start, level): if not listnest or listnest[-1][0] != start: listnest.append((start, level)) @@ -525,37 +542,38 @@ lines = b['lines'] if btype == 'admonition': - admonition = _admonitiontitles[b['admonitiontitle']] - text = ' '.join(map(str.strip, lines)) + admonition = escape(_admonitiontitles[b['admonitiontitle']]) + text = escape(' '.join(map(str.strip, lines))) out.append('

<p>\n<b>%s</b> %s\n</p>\n' % (admonition, text)) elif btype == 'paragraph': - out.append('<p>\n%s\n</p>\n' % '\n'.join(lines)) + out.append('<p>\n%s\n</p>\n' % escape('\n'.join(lines))) elif btype == 'margin': pass elif btype == 'literal': - out.append('<pre>\n%s\n</pre>\n' % '\n'.join(lines)) + out.append('<pre>\n%s\n</pre>\n' % escape('\n'.join(lines))) elif btype == 'section': i = b['underline'] if i not in headernest: headernest += i level = headernest.index(i) + 1 - out.append('<h%d>%s</h%d>\n' % (level, lines[0], level)) + out.append('<h%d>%s</h%d>\n' % (level, escape(lines[0]), level)) elif btype == 'table': table = b['table'] - t = [] + out.append('<table>\n') for row in table: - l = [] - for v in zip(row): - if not t: - l.append('<th>%s</th>' % v) - else: - l.append('<td>%s</td>' % v) - t.append(' <tr>%s</tr>\n' % ''.join(l)) - out.append('<table>\n%s</table>\n' % ''.join(t)) + out.append('<tr>') + for v in row: + out.append('<td>') + out.append(escape(v)) + out.append('</td>') + out.append('\n') + out.pop() + out.append('</tr>\n') + out.append('</table>\n') elif btype == 'definition': openlist('dl', level) - term = lines[0] - text = ' '.join(map(str.strip, lines[1:])) + term = escape(lines[0]) + text = escape(' '.join(map(str.strip, lines[1:]))) out.append('<dt>%s\n<dd>%s\n' % (term, text)) elif btype == 'bullet': bullet, head = lines[0].split(' ', 1) @@ -563,16 +581,16 @@ openlist('ul', level) else: openlist('ol', level) - out.append('<li> %s\n' % ' '.join([head] + lines[1:])) + out.append('<li> %s\n' % escape(' '.join([head] + lines[1:]))) elif btype == 'field': openlist('dl', level) - key = b['key'] - text = ' '.join(map(str.strip, lines)) + key = escape(b['key']) + text = escape(' '.join(map(str.strip, lines))) out.append('<dt>%s\n<dd>%s\n' % (key, text)) elif btype == 'option': openlist('dl', level) - opt = b['optstr'] - desc = ' '.join(map(str.strip, lines)) + opt = escape(b['optstr']) + desc = escape(' '.join(map(str.strip, lines))) out.append('<dt>%s\n<dd>
    %s\n' % (opt, desc)) # close lists if indent level of next block is lower diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/obsolete.py --- a/mercurial/obsolete.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/obsolete.py Tue May 14 23:04:23 2013 +0400 @@ -46,7 +46,7 @@ (A, (C, C)) We use a single marker to distinct the "split" case from the "divergence" - case. If two independants operation rewrite the same changeset A in to A' and + case. If two independents operation rewrite the same changeset A in to A' and A'' when have an error case: divergent rewriting. We can detect it because two markers will be created independently: @@ -129,8 +129,9 @@ # # But by transitivity Ad is also a successors of A. To avoid having Ad marked # as bumped too, we add the `bumpedfix` flag to the marker. . -# This flag mean that the successors are an interdiff that fix the bumped -# situation, breaking the transitivity of "bumped" here. +# This flag mean that the successors express the changes between the public and +# bumped version and fix the situation, breaking the transitivity of +# "bumped" here. bumpedfix = 1 def _readmarkers(data): @@ -249,6 +250,8 @@ """ if metadata is None: metadata = {} + if 'date' not in metadata: + metadata['date'] = "%d %d" % util.makedate() if len(prec) != 20: raise ValueError(prec) for succ in succs: @@ -367,6 +370,43 @@ finally: lock.release() +def syncpush(repo, remote): + """utility function to push bookmark to a remote + + Exist mostly to allow overridding for experimentation purpose""" + if (_enabled and repo.obsstore and + 'obsolete' in remote.listkeys('namespaces')): + rslts = [] + remotedata = repo.listkeys('obsolete') + for key in sorted(remotedata, reverse=True): + # reverse sort to ensure we end with dump0 + data = remotedata[key] + rslts.append(remote.pushkey('obsolete', key, '', data)) + if [r for r in rslts if not r]: + msg = _('failed to push some obsolete markers!\n') + repo.ui.warn(msg) + +def syncpull(repo, remote, gettransaction): + """utility function to pull bookmark to a remote + + The `gettransaction` is function that return the pull transaction, creating + one if necessary. We return the transaction to inform the calling code that + a new transaction have been created (when applicable). + + Exists mostly to allow overridding for experimentation purpose""" + tr = None + if _enabled: + repo.ui.debug('fetching remote obsolete markers\n') + remoteobs = remote.listkeys('obsolete') + if 'dump0' in remoteobs: + tr = gettransaction() + for key in sorted(remoteobs, reverse=True): + if key.startswith('dump'): + data = base85.b85decode(remoteobs[key]) + repo.obsstore.mergemarkers(tr, data) + repo.invalidatevolatilesets() + return tr + def allmarkers(repo): """all obsolete markers known in a repository""" for markerdata in repo.obsstore: @@ -402,6 +442,33 @@ seen.add(suc) remaining.add(suc) +def foreground(repo, nodes): + """return all nodes in the "foreground" of other node + + The foreground of a revision is anything reachable using parent -> children + or precursor -> sucessor relation. It is very similars to "descendant" but + augmented with obsolescence information. + + Beware that possible obsolescence cycle may result if complexe situation. + """ + repo = repo.unfiltered() + foreground = set(repo.set('%ln::', nodes)) + if repo.obsstore: + # We only need this complicated logic if there is obsolescence + # XXX will probably deserve an optimised revset. 
+ nm = repo.changelog.nodemap + plen = -1 + # compute the whole set of successors or descendants + while len(foreground) != plen: + plen = len(foreground) + succs = set(c.node() for c in foreground) + mutable = [c.node() for c in foreground if c.mutable()] + succs.update(allsuccessors(repo.obsstore, mutable)) + known = (n for n in succs if n in nm) + foreground = set(repo.set('%ln::', known)) + return set(c.node() for c in foreground) + + def successorssets(repo, initialnode, cache=None): """Return all set of successors of initial nodes @@ -510,7 +577,7 @@ # In such a situation, we arbitrary set the successors sets of # the node to nothing (node pruned) to break the cycle. # - # If no break was encountered we proceeed to phase 2. + # If no break was encountered we proceed to phase 2. # # Phase 2 computes successors sets of CURRENT (case 4); see details # in phase 2 itself. @@ -551,13 +618,13 @@ # successors sets of all its "successors" node. # # Each different marker is a divergence in the obsolescence - # history. It contributes successors sets dictinct from other + # history. It contributes successors sets distinct from other # markers. # # Within a marker, a successor may have divergent successors # sets. In such a case, the marker will contribute multiple # divergent successors sets. If multiple successors have - # divergents successors sets, a cartesian product is used. + # divergent successors sets, a cartesian product is used. # # At the end we post-process successors sets to remove # duplicated entry and successors set that are strict subset of diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/parsers.c --- a/mercurial/parsers.c Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/parsers.c Tue May 14 23:04:23 2013 +0400 @@ -326,7 +326,8 @@ if (getintat(v, 3, &mtime) == -1) goto bail; if (*s == 'n' && mtime == (uint32_t)now) { - /* See dirstate.py:write for why we do this. */ + /* See pure/parsers.py:pack_dirstate for why we do + * this. */ if (PyDict_SetItem(map, k, dirstate_unset) == -1) goto bail; mode = 0, size = -1, mtime = -1; @@ -1162,6 +1163,367 @@ } } +static inline void index_get_parents(indexObject *self, int rev, int *ps) +{ + if (rev >= self->length - 1) { + PyObject *tuple = PyList_GET_ITEM(self->added, + rev - self->length + 1); + ps[0] = (int)PyInt_AS_LONG(PyTuple_GET_ITEM(tuple, 5)); + ps[1] = (int)PyInt_AS_LONG(PyTuple_GET_ITEM(tuple, 6)); + } else { + const char *data = index_deref(self, rev); + ps[0] = getbe32(data + 24); + ps[1] = getbe32(data + 28); + } +} + +typedef uint64_t bitmask; + +/* + * Given a disjoint set of revs, return all candidates for the + * greatest common ancestor. 
In revset notation, this is the set + * "heads(::a and ::b and ...)" + */ +static PyObject *find_gca_candidates(indexObject *self, const int *revs, + int revcount) +{ + const bitmask allseen = (1ull << revcount) - 1; + const bitmask poison = 1ull << revcount; + PyObject *gca = PyList_New(0); + int i, v, interesting, left; + int maxrev = -1; + long sp; + bitmask *seen; + + for (i = 0; i < revcount; i++) { + if (revs[i] > maxrev) + maxrev = revs[i]; + } + + seen = calloc(sizeof(*seen), maxrev + 1); + if (seen == NULL) + return PyErr_NoMemory(); + + for (i = 0; i < revcount; i++) + seen[revs[i]] = 1ull << i; + + interesting = left = revcount; + + for (v = maxrev; v >= 0 && interesting; v--) { + long sv = seen[v]; + int parents[2]; + + if (!sv) + continue; + + if (sv < poison) { + interesting -= 1; + if (sv == allseen) { + PyObject *obj = PyInt_FromLong(v); + if (obj == NULL) + goto bail; + if (PyList_Append(gca, obj) == -1) { + Py_DECREF(obj); + goto bail; + } + sv |= poison; + for (i = 0; i < revcount; i++) { + if (revs[i] == v) { + if (--left <= 1) + goto done; + break; + } + } + } + } + index_get_parents(self, v, parents); + + for (i = 0; i < 2; i++) { + int p = parents[i]; + if (p == -1) + continue; + sp = seen[p]; + if (sv < poison) { + if (sp == 0) { + seen[p] = sv; + interesting++; + } + else if (sp != sv) + seen[p] |= sv; + } else { + if (sp && sp < poison) + interesting--; + seen[p] = sv; + } + } + } + +done: + free(seen); + return gca; +bail: + free(seen); + Py_XDECREF(gca); + return NULL; +} + +/* + * Given a disjoint set of revs, return the subset with the longest + * path to the root. + */ +static PyObject *find_deepest(indexObject *self, PyObject *revs) +{ + const Py_ssize_t revcount = PyList_GET_SIZE(revs); + static const Py_ssize_t capacity = 24; + int *depth, *interesting = NULL; + int i, j, v, ninteresting; + PyObject *dict = NULL, *keys; + long *seen = NULL; + int maxrev = -1; + long final; + + if (revcount > capacity) { + PyErr_Format(PyExc_OverflowError, + "bitset size (%ld) > capacity (%ld)", + (long)revcount, (long)capacity); + return NULL; + } + + for (i = 0; i < revcount; i++) { + int n = (int)PyInt_AsLong(PyList_GET_ITEM(revs, i)); + if (n > maxrev) + maxrev = n; + } + + depth = calloc(sizeof(*depth), maxrev + 1); + if (depth == NULL) + return PyErr_NoMemory(); + + seen = calloc(sizeof(*seen), maxrev + 1); + if (seen == NULL) { + PyErr_NoMemory(); + goto bail; + } + + interesting = calloc(sizeof(*interesting), 2 << revcount); + if (interesting == NULL) { + PyErr_NoMemory(); + goto bail; + } + + for (i = 0; i < revcount; i++) { + int n = (int)PyInt_AsLong(PyList_GET_ITEM(revs, i)); + long b = 1l << i; + depth[n] = 1; + seen[n] = b; + interesting[b] = 1; + } + + ninteresting = (int)revcount; + + for (v = maxrev; v >= 0 && ninteresting > 1; v--) { + int dv = depth[v]; + int parents[2]; + long sv; + + if (dv == 0) + continue; + + sv = seen[v]; + index_get_parents(self, v, parents); + + for (i = 0; i < 2; i++) { + int p = parents[i]; + long nsp, sp; + int dp; + + if (p == -1) + continue; + + dp = depth[p]; + nsp = sp = seen[p]; + if (dp <= dv) { + depth[p] = dv + 1; + if (sp != sv) { + interesting[sv] += 1; + nsp = seen[p] = sv; + if (sp) { + interesting[sp] -= 1; + if (interesting[sp] == 0) + ninteresting -= 1; + } + } + } + else if (dv == dp - 1) { + nsp = sp | sv; + if (nsp == sp) + continue; + seen[p] = nsp; + interesting[nsp] += 1; + interesting[sp] -= 1; + if (interesting[sp] == 0) + ninteresting -= 1; + } + } + interesting[sv] -= 1; + if (interesting[sv] == 0) + 
ninteresting -= 1; + } + + final = 0; + j = ninteresting; + for (i = 0; i < (int)(2 << revcount) && j > 0; i++) { + if (interesting[i] == 0) + continue; + final |= i; + j -= 1; + } + if (final == 0) + return PyList_New(0); + + dict = PyDict_New(); + if (dict == NULL) + goto bail; + + j = ninteresting; + for (i = 0; i < revcount && j > 0; i++) { + PyObject *key; + + if ((final & (1 << i)) == 0) + continue; + + key = PyList_GET_ITEM(revs, i); + Py_INCREF(key); + Py_INCREF(Py_None); + if (PyDict_SetItem(dict, key, Py_None) == -1) { + Py_DECREF(key); + Py_DECREF(Py_None); + goto bail; + } + j -= 1; + } + + keys = PyDict_Keys(dict); + + free(depth); + free(seen); + free(interesting); + Py_DECREF(dict); + + return keys; +bail: + free(depth); + free(seen); + free(interesting); + Py_XDECREF(dict); + + return NULL; +} + +/* + * Given a (possibly overlapping) set of revs, return the greatest + * common ancestors: those with the longest path to the root. + */ +static PyObject *index_ancestors(indexObject *self, PyObject *args) +{ + PyObject *ret = NULL, *gca = NULL; + Py_ssize_t argcount, i, len; + bitmask repeat = 0; + int revcount = 0; + int *revs; + + argcount = PySequence_Length(args); + revs = malloc(argcount * sizeof(*revs)); + if (argcount > 0 && revs == NULL) + return PyErr_NoMemory(); + len = index_length(self) - 1; + + for (i = 0; i < argcount; i++) { + static const int capacity = 24; + PyObject *obj = PySequence_GetItem(args, i); + bitmask x; + long val; + + if (!PyInt_Check(obj)) { + PyErr_SetString(PyExc_TypeError, + "arguments must all be ints"); + goto bail; + } + val = PyInt_AsLong(obj); + if (val == -1) { + ret = PyList_New(0); + goto done; + } + if (val < 0 || val >= len) { + PyErr_SetString(PyExc_IndexError, + "index out of range"); + goto bail; + } + /* this cheesy bloom filter lets us avoid some more + * expensive duplicate checks in the common set-is-disjoint + * case */ + x = 1ull << (val & 0x3f); + if (repeat & x) { + int k; + for (k = 0; k < revcount; k++) { + if (val == revs[k]) + goto duplicate; + } + } + else repeat |= x; + if (revcount >= capacity) { + PyErr_Format(PyExc_OverflowError, + "bitset size (%d) > capacity (%d)", + revcount, capacity); + goto bail; + } + revs[revcount++] = (int)val; + duplicate:; + } + + if (revcount == 0) { + ret = PyList_New(0); + goto done; + } + if (revcount == 1) { + PyObject *obj; + ret = PyList_New(1); + if (ret == NULL) + goto bail; + obj = PyInt_FromLong(revs[0]); + if (obj == NULL) + goto bail; + PyList_SET_ITEM(ret, 0, obj); + goto done; + } + + gca = find_gca_candidates(self, revs, revcount); + if (gca == NULL) + goto bail; + + if (PyList_GET_SIZE(gca) <= 1) { + ret = gca; + Py_INCREF(gca); + } + else if (PyList_GET_SIZE(gca) == 1) { + ret = PyList_GET_ITEM(gca, 0); + Py_INCREF(ret); + } + else ret = find_deepest(self, gca); + +done: + free(revs); + Py_XDECREF(gca); + + return ret; + +bail: + free(revs); + Py_XDECREF(gca); + Py_XDECREF(ret); + return NULL; +} + /* * Invalidate any trie entries introduced by added revs. 
*/ @@ -1234,8 +1596,14 @@ self->ntrev = (int)start; } self->length = start + 1; - if (start < self->raw_length) + if (start < self->raw_length) { + if (self->cache) { + Py_ssize_t i; + for (i = start; i < self->raw_length; i++) + Py_CLEAR(self->cache[i]); + } self->raw_length = start; + } goto done; } @@ -1399,6 +1767,8 @@ }; static PyMethodDef index_methods[] = { + {"ancestors", (PyCFunction)index_ancestors, METH_VARARGS, + "return the gca set of the given revs"}, {"clearcaches", (PyCFunction)index_clearcaches, METH_NOARGS, "clear the index caches"}, {"get", (PyCFunction)index_m_get, METH_VARARGS, @@ -1521,8 +1891,12 @@ {NULL, NULL} }; +void dirs_module_init(PyObject *mod); + static void module_init(PyObject *mod) { + dirs_module_init(mod); + indexType.tp_new = PyType_GenericNew; if (PyType_Ready(&indexType) < 0) return; diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/patch.py --- a/mercurial/patch.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/patch.py Tue May 14 23:04:23 2013 +0400 @@ -314,7 +314,7 @@ gitpatches = [] for line in lr: line = line.rstrip(' \r\n') - if line.startswith('diff --git'): + if line.startswith('diff --git a/'): m = gitre.match(line) if m: if gp: @@ -1211,7 +1211,7 @@ emitfile = False yield 'file', (afile, bfile, h, gp and gp.copy() or None) yield 'hunk', h - elif x.startswith('diff --git'): + elif x.startswith('diff --git a/'): m = gitre.match(x.rstrip(' \r\n')) if not m: continue @@ -1756,6 +1756,8 @@ else: header.append('deleted file mode %s\n' % gitmode[man1.flags(f)]) + if util.binary(to): + dodiff = 'binary' elif not to or util.binary(to): # regular diffs cannot represent empty file deletion losedatafn(f) @@ -1813,7 +1815,7 @@ addresult() # set numbers to 0 anyway when starting new file adds, removes, isbinary = 0, 0, False - if line.startswith('diff --git'): + if line.startswith('diff --git a/'): filename = gitre.search(line).group(1) elif line.startswith('diff -r'): # format: "diff -r ... -r ... filename" diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/pathencode.c --- a/mercurial/pathencode.c Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/pathencode.c Tue May 14 23:04:23 2013 +0400 @@ -45,7 +45,7 @@ H, /* ".h" */ HGDI, /* ".hg", ".d", or ".i" */ SPACE, - DEFAULT, /* byte of a path component after the first */ + DEFAULT /* byte of a path component after the first */ }; /* state machine for dir-encoding */ @@ -53,7 +53,7 @@ DDOT, DH, DHGDI, - DDEFAULT, + DDEFAULT }; static inline int inset(const uint32_t bitset[], char c) @@ -696,7 +696,7 @@ return 0; } -#define MAXENCODE 4096 * 3 +#define MAXENCODE 4096 * 4 static PyObject *hashencode(const char *src, Py_ssize_t len) { diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/phases.py --- a/mercurial/phases.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/phases.py Tue May 14 23:04:23 2013 +0400 @@ -266,7 +266,15 @@ filtered = True if filtered: self.dirty = True - self._phaserevs = None + # filterunknown is called by repo.destroyed; we may have no changes in + # root, but the phaserevs contents are certainly invalid (or at least we + # have no proper way to check that; related to issue 3858). + # + # The other caller is __init__, which has no _phaserevs initialized + # anyway. If this changes, we should consider adding a dedicated + # "destroyed" function to phasecache, or a proper cache key mechanism + # (see the branchmap one). + self._phaserevs = None def advanceboundary(repo, targetphase, nodes): """Add nodes to a phase changing other nodes phases if necessary.
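The mercurial/patch.py hunks above tighten the prefix test from 'diff --git' to 'diff --git a/', so that text which merely begins with the phrase (for example a quoted header inside a commit message or a patch-of-a-patch) is no longer taken for a new file boundary. A small illustration; the regexp shown is a simplified stand-in for the module-level gitre, not a quote of it:

    import re
    gitre = re.compile(r'diff --git a/(.*) b/(.*)')   # simplified assumption

    lines = [
        'diff --git a/mercurial/patch.py b/mercurial/patch.py',   # real header
        'diff --git output is described in git-diff(1) ...',      # prose, not a header
    ]
    headers = [l for l in lines if l.startswith('diff --git a/') and gitre.match(l)]
    # only the first line is kept; with the old startswith('diff --git') test the
    # second line would also have been handed to gitre and treated as a file header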
diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/posix.py --- a/mercurial/posix.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/posix.py Tue May 14 23:04:23 2013 +0400 @@ -194,19 +194,66 @@ import fcntl # only needed on darwin, missing on jython def normcase(path): + ''' + Normalize a filename for OS X-compatible comparison: + - escape-encode invalid characters + - decompose to NFD + - lowercase + + >>> normcase('UPPER') + 'upper' + >>> normcase('Caf\xc3\xa9') + 'cafe\\xcc\\x81' + >>> normcase('\xc3\x89') + 'e\\xcc\\x81' + >>> normcase('\xb8\xca\xc3\xca\xbe\xc8.JPG') # issue3918 + '%b8%ca%c3\\xca\\xbe%c8.jpg' + ''' + + try: + path.decode('ascii') # throw exception for non-ASCII character + return path.lower() + except UnicodeDecodeError: + pass try: u = path.decode('utf-8') except UnicodeDecodeError: - # percent-encode any characters that don't round-trip - p2 = path.decode('utf-8', 'ignore').encode('utf-8') - s = "" - pos = 0 + # OS X percent-encodes any bytes that aren't valid utf-8 + s = '' + g = '' + l = 0 for c in path: - if p2[pos:pos + 1] == c: + o = ord(c) + if l and o < 128 or o >= 192: + # we want a continuation byte, but didn't get one + s += ''.join(["%%%02X" % ord(x) for x in g]) + g = '' + l = 0 + if l == 0 and o < 128: + # ascii s += c - pos += 1 + elif l == 0 and 194 <= o < 245: + # valid leading bytes + if o < 224: + l = 1 + elif o < 240: + l = 2 + else: + l = 3 + g = c + elif l > 0 and 128 <= o < 192: + # valid continuations + g += c + l -= 1 + if not l: + s += g + g = '' else: - s += "%%%02X" % ord(c) + # invalid + s += "%%%02X" % o + + # any remaining partial characters + s += ''.join(["%%%02X" % ord(x) for x in g]) u = s.decode('utf-8') # Decompose then lowercase (HFS+ technote specifies lower) @@ -552,3 +599,11 @@ if self.realpath != self.path: okayifmissing(os.unlink, self.realpath) okayifmissing(os.rmdir, os.path.dirname(self.realpath)) + +def statislink(st): + '''check whether a stat result is a symlink''' + return st and stat.S_ISLNK(st.st_mode) + +def statisexec(st): + '''check whether a stat result is an executable file''' + return st and (st.st_mode & 0100 != 0) diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/pure/osutil.py --- a/mercurial/pure/osutil.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/pure/osutil.py Tue May 14 23:04:23 2013 +0400 @@ -68,7 +68,7 @@ _INVALID_HANDLE_VALUE = _HANDLE(-1).value - # CreateFile + # CreateFile _FILE_SHARE_READ = 0x00000001 _FILE_SHARE_WRITE = 0x00000002 _FILE_SHARE_DELETE = 0x00000004 diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/pure/parsers.py --- a/mercurial/pure/parsers.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/pure/parsers.py Tue May 14 23:04:23 2013 +0400 @@ -7,7 +7,7 @@ from mercurial.node import bin, nullid from mercurial import util -import struct, zlib +import struct, zlib, cStringIO _pack = struct.pack _unpack = struct.unpack @@ -87,3 +87,29 @@ copymap[f] = c dmap[f] = e[:4] return parents + +def pack_dirstate(dmap, copymap, pl, now): + now = int(now) + cs = cStringIO.StringIO() + write = cs.write + write("".join(pl)) + for f, e in dmap.iteritems(): + if e[0] == 'n' and e[3] == now: + # The file was last modified "simultaneously" with the current + # write to dirstate (i.e. within the same second for file- + # systems with a granularity of 1 sec). This commonly happens + # for at least a couple of files on 'update'. + # The user could change the file without changing its size + # within the same second. 
Invalidate the file's stat data in + # dirstate, forcing future 'status' calls to compare the + # contents of the file. This prevents mistakenly treating such + # files as clean. + e = (e[0], 0, -1, -1) # mark entry as 'unset' + dmap[f] = e + + if f in copymap: + f = "%s\0%s" % (f, copymap[f]) + e = _pack(">cllll", e[0], e[1], e[2], e[3], len(f)) + write(e) + write(f) + return cs.getvalue() diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/pvec.py --- a/mercurial/pvec.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/pvec.py Tue May 14 23:04:23 2013 +0400 @@ -169,7 +169,7 @@ self._bs = hashorctx self._depth, self._vec = _split(base85.b85decode(hashorctx)) else: - self._vec = ctxpvec(ctx) + self._vec = ctxpvec(hashorctx) def __str__(self): return self._bs diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/repair.py --- a/mercurial/repair.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/repair.py Tue May 14 23:04:23 2013 +0400 @@ -119,6 +119,7 @@ if backup == "all": backupfile = _bundle(repo, stripbases, cl.heads(), node, topic) repo.ui.status(_("saved backup bundle to %s\n") % backupfile) + repo.ui.log("backupbundle", "saved backup bundle to %s\n", backupfile) if saveheads or savebases: # do not compress partial bundle if we remove it from disk later chgrpfile = _bundle(repo, savebases, saveheads, node, 'temp', diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/repoview.py --- a/mercurial/repoview.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/repoview.py Tue May 14 23:04:23 2013 +0400 @@ -9,7 +9,7 @@ import copy import phases import util -import obsolete, bookmarks, revset +import obsolete, revset def hideablerevs(repo): @@ -32,7 +32,7 @@ if r not in hideable] for par in repo[None].parents(): blockers.append(par.rev()) - for bm in bookmarks.listbookmarks(repo).values(): + for bm in repo._bookmarks.values(): blockers.append(repo[bm].rev()) blocked = cl.ancestors(blockers, inclusive=True) return frozenset(r for r in hideable if r not in blocked) @@ -55,7 +55,6 @@ return frozenset(hiddens | secrets) else: return hiddens - return frozenset() def computemutable(repo): """compute the set of revision that should be filtered when used a server @@ -149,7 +148,7 @@ repoview.method() --> repo.__class__.method(repoview) The inheritance has to be done dynamically because `repo` can be of any - subclasses of `localrepo`. Eg: `bundlerepo` or `httprepo`. + subclasses of `localrepo`. Eg: `bundlerepo` or `statichttprepo`. """ def __init__(self, repo, filtername): @@ -158,7 +157,7 @@ object.__setattr__(self, '_clcachekey', None) object.__setattr__(self, '_clcache', None) - # not a cacheproperty on purpose we shall implement a proper cache later + # not a propertycache on purpose we shall implement a proper cache later @property def changelog(self): """return a filtered version of the changeset @@ -210,7 +209,7 @@ def __delattr__(self, attr): return delattr(self._unfilteredrepo, attr) - # The `requirement` attribut is initialiazed during __init__. But + # The `requirements` attribute is initialized during __init__. But # __getattr__ won't be called as it also exists on the class. 
We need # explicit forwarding to main repo here @property diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/revlog.py --- a/mercurial/revlog.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/revlog.py Tue May 14 23:04:23 2013 +0400 @@ -91,6 +91,14 @@ return bin[1:] raise RevlogError(_("unknown compression type %r") % t) +# index v0: +# 4 bytes: offset +# 4 bytes: compressed length +# 4 bytes: base rev +# 4 bytes: link rev +# 32 bytes: parent 1 nodeid +# 32 bytes: parent 2 nodeid +# 32 bytes: nodeid indexformatv0 = ">4l20s20s20s" v0shaoffset = 56 @@ -697,20 +705,15 @@ def ancestor(self, a, b): """calculate the least common ancestor of nodes a and b""" - # fast path, check if it is a descendant a, b = self.rev(a), self.rev(b) - start, end = sorted((a, b)) - if self.descendant(start, end): - return self.node(start) - - def parents(rev): - return [p for p in self.parentrevs(rev) if p != nullrev] - - c = ancestor.ancestor(a, b, parents) - if c is None: - return nullid - - return self.node(c) + try: + ancs = self.index.ancestors(a, b) + except (AttributeError, OverflowError): + ancs = ancestor.ancestors(self.parentrevs, a, b) + if ancs: + # choose a consistent winner when there's a tie + return min(map(self.node, ancs)) + return nullid def _match(self, id): if isinstance(id, int): diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/revset.py --- a/mercurial/revset.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/revset.py Tue May 14 23:04:23 2013 +0400 @@ -8,7 +8,6 @@ import re import parser, util, error, discovery, hbisect, phases import node -import bookmarks as bookmarksmod import match as matchmod from i18n import _ import encoding @@ -223,13 +222,9 @@ return stringset(repo, subset, x) def rangeset(repo, subset, x, y): - m = getset(repo, subset, x) - if not m: - m = getset(repo, list(repo), x) - - n = getset(repo, subset, y) - if not n: - n = getset(repo, list(repo), y) + cl = repo.changelog + m = getset(repo, cl, x) + n = getset(repo, cl, y) if not m or not n: return [] @@ -243,12 +238,10 @@ return [x for x in r if x in s] def dagrange(repo, subset, x, y): - if subset: - r = list(repo) - xs = _revsbetween(repo, getset(repo, r, x), getset(repo, r, y)) - s = set(subset) - return [r for r in xs if r in s] - return [] + r = list(repo) + xs = _revsbetween(repo, getset(repo, r, x), getset(repo, r, y)) + s = set(subset) + return [r for r in xs if r in s] def andset(repo, subset, x, y): return getset(repo, getset(repo, subset, x), y) @@ -282,20 +275,32 @@ return checkstatus(repo, subset, pat, 1) def ancestor(repo, subset, x): - """``ancestor(single, single)`` - Greatest common ancestor of the two changesets. + """``ancestor(*changeset)`` + Greatest common ancestor of the changesets. + + Accepts 0 or more changesets. + Will return empty list when passed no args. + Greatest common ancestor of a single changeset is that changeset. 
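The body that follows folds the changelog's pairwise ancestor over all of the arguments. In plain Python the fold looks roughly like this (a sketch; `repo` is assumed to be a local repository and `revs` a list of revision numbers, and the argument validation of the real predicate is omitted):

    def multiancestor(repo, revs):
        # greatest common ancestor of an arbitrary list of revision numbers
        cl = repo.changelog
        anc = None
        for r in revs:
            if anc is None:
                anc = r
            else:
                anc = cl.rev(cl.ancestor(cl.node(anc), cl.node(r)))
        return anc   # None when revs is empty; the rev itself for a single rev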
""" # i18n: "ancestor" is a keyword - l = getargs(x, 2, 2, _("ancestor requires two arguments")) - r = list(repo) - a = getset(repo, r, l[0]) - b = getset(repo, r, l[1]) - if len(a) != 1 or len(b) != 1: - # i18n: "ancestor" is a keyword - raise error.ParseError(_("ancestor arguments must be single revisions")) - an = [repo[a[0]].ancestor(repo[b[0]]).rev()] + l = getlist(x) + rl = list(repo) + anc = None - return [r for r in an if r in subset] + # (getset(repo, rl, i) for i in l) generates a list of lists + rev = repo.changelog.rev + ancestor = repo.changelog.ancestor + node = repo.changelog.node + for revs in (getset(repo, rl, i) for i in l): + for r in revs: + if anc is None: + anc = r + else: + anc = rev(ancestor(node(anc), node(r))) + + if anc is not None and anc in subset: + return [anc] + return [] def _ancestors(repo, subset, x, followfirst=False): args = getset(repo, list(repo), x) @@ -326,7 +331,7 @@ raise error.ParseError(_("~ expects a number")) ps = set() cl = repo.changelog - for r in getset(repo, subset, x): + for r in getset(repo, cl, x): for i in range(n): r = cl.parentrevs(r)[0] ps.add(r) @@ -379,14 +384,14 @@ _('the argument to bookmark must be a string')) kind, pattern, matcher = _stringmatcher(bm) if kind == 'literal': - bmrev = bookmarksmod.listbookmarks(repo).get(bm, None) + bmrev = repo._bookmarks.get(bm, None) if not bmrev: raise util.Abort(_("bookmark '%s' does not exist") % bm) bmrev = repo[bmrev].rev() return [r for r in subset if r == bmrev] else: matchrevs = set() - for name, bmrev in bookmarksmod.listbookmarks(repo).iteritems(): + for name, bmrev in repo._bookmarks.iteritems(): if matcher(name): matchrevs.add(bmrev) if not matchrevs: @@ -398,7 +403,7 @@ return [r for r in subset if r in bmrevs] bms = set([repo[r].rev() - for r in bookmarksmod.listbookmarks(repo).values()]) + for r in repo._bookmarks.values()]) return [r for r in subset if r in bms] def branch(repo, subset, x): @@ -1139,7 +1144,7 @@ raise error.ParseError(_("^ expects a number 0, 1, or 2")) ps = set() cl = repo.changelog - for r in getset(repo, subset, x): + for r in getset(repo, cl, x): if n == 0: ps.add(r) elif n == 1: @@ -1489,8 +1494,6 @@ s = set([repo[tn].rev()]) else: s = set([cl.rev(n) for t, n in repo.tagslist() if matcher(t)]) - if not s: - raise util.Abort(_("no tags exist that match '%s'") % pattern) else: s = set([cl.rev(n) for t, n in repo.tagslist() if t != 'tip']) return [r for r in subset if r in s] diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/scmposix.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/mercurial/scmposix.py Tue May 14 23:04:23 2013 +0400 @@ -0,0 +1,32 @@ +import sys, os +import osutil + +def _rcfiles(path): + rcs = [os.path.join(path, 'hgrc')] + rcdir = os.path.join(path, 'hgrc.d') + try: + rcs.extend([os.path.join(rcdir, f) + for f, kind in osutil.listdir(rcdir) + if f.endswith(".rc")]) + except OSError: + pass + return rcs + +def systemrcpath(): + path = [] + if sys.platform == 'plan9': + root = 'lib/mercurial' + else: + root = 'etc/mercurial' + # old mod_python does not set sys.argv + if len(getattr(sys, 'argv', [])) > 0: + p = os.path.dirname(os.path.dirname(sys.argv[0])) + path.extend(_rcfiles(os.path.join(p, root))) + path.extend(_rcfiles('/' + root)) + return path + +def userrcpath(): + if sys.platform == 'plan9': + return [os.environ['home'] + '/lib/hgrc'] + else: + return [os.path.expanduser('~/.hgrc')] diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/scmutil.py --- a/mercurial/scmutil.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/scmutil.py Tue 
May 14 23:04:23 2013 +0400 @@ -6,9 +6,18 @@ # GNU General Public License version 2 or any later version. from i18n import _ -import util, error, osutil, revset, similar, encoding, phases +from mercurial.node import nullrev +import util, error, osutil, revset, similar, encoding, phases, parsers import match as matchmod -import os, errno, re, stat, sys, glob +import os, errno, re, stat, glob + +if os.name == 'nt': + import scmwindows as scmplatform +else: + import scmposix as scmplatform + +systemrcpath = scmplatform.systemrcpath +userrcpath = scmplatform.userrcpath def nochangesfound(ui, repo, excluded=None): '''Report no changes for push/pull, excluded is None or a list of @@ -17,6 +26,10 @@ secretlist = [] if excluded: for n in excluded: + if n not in repo: + # discovery should not have included the filtered revision, + # we have to explicitly exclude it until discovery is cleanup. + continue ctx = repo[n] if ctx.phase() >= phases.secret and not ctx.extinct(): secretlist.append(n) @@ -28,11 +41,18 @@ ui.status(_("no changes found\n")) def checknewlabel(repo, lbl, kind): + # Do not use the "kind" parameter in ui output. + # It makes strings difficult to translate. if lbl in ['tip', '.', 'null']: raise util.Abort(_("the name '%s' is reserved") % lbl) for c in (':', '\0', '\n', '\r'): if c in lbl: raise util.Abort(_("%r cannot be used in a name") % c) + try: + int(lbl) + raise util.Abort(_("cannot use an integer as a name")) + except ValueError: + pass def checkfilename(f): '''Check that the filename f is an acceptable filename for a tracked file''' @@ -174,6 +194,13 @@ # want to add "foo/bar/baz" before checking if there's a "foo/.hg" self.auditeddir.update(prefixes) + def check(self, path): + try: + self(path) + return True + except (OSError, util.Abort): + return False + class abstractvfs(object): """Abstract base class; cannot be instantiated""" @@ -217,6 +244,9 @@ def isdir(self, path=None): return os.path.isdir(self.join(path)) + def islink(self, path=None): + return os.path.islink(self.join(path)) + def makedir(self, path=None, notindexed=True): return util.makedir(self.join(path), notindexed) @@ -229,6 +259,15 @@ def readdir(self, path=None, stat=None, skip=None): return osutil.listdir(self.join(path), stat, skip) + def rename(self, src, dst): + return util.rename(self.join(src), self.join(dst)) + + def readlink(self, path): + return os.readlink(self.join(path)) + + def setflags(self, path, l, x): + return util.setflags(self.join(path), l, x) + def stat(self, path=None): return os.stat(self.join(path)) @@ -238,9 +277,11 @@ This class is used to hide the details of COW semantics and remote file access from higher level code. ''' - def __init__(self, base, audit=True, expand=False): - if expand: - base = os.path.realpath(util.expandpath(base)) + def __init__(self, base, audit=True, expandpath=False, realpath=False): + if expandpath: + base = util.expandpath(base) + if realpath: + base = os.path.realpath(base) self.base = base self._setmustaudit(audit) self.createmode = None @@ -289,8 +330,7 @@ # to a directory. Let the posixfile() call below raise IOError. 
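The vfs constructor change above splits the old all-or-nothing expand flag into independent expandpath and realpath switches, so callers can ask for tilde/environment expansion without also forcing symlink resolution, or combine both. A usage sketch, assuming the class is exposed as scmutil.vfs as elsewhere in this series:

    from mercurial import scmutil
    # expand "~" and environment variables only
    wvfs = scmutil.vfs('~/src/hg-repo', expandpath=True)
    # additionally resolve symlinks to a canonical path
    # (the combination is what the old expand=True used to do)
    svfs = scmutil.vfs('~/src/hg-repo/.hg/store', expandpath=True, realpath=True)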
if basename: if atomictemp: - if not os.path.isdir(dirname): - util.makedirs(dirname, self.createmode) + util.ensuredirs(dirname, self.createmode) return util.atomictempfile(f, mode, self.createmode) try: if 'w' in mode: @@ -308,8 +348,7 @@ if e.errno != errno.ENOENT: raise nlink = 0 - if not os.path.isdir(dirname): - util.makedirs(dirname, self.createmode) + util.ensuredirs(dirname, self.createmode) if nlink > 0: if self._trustnlink is None: self._trustnlink = nlink > 1 or util.checknlink(f) @@ -328,9 +367,7 @@ except OSError: pass - dirname = os.path.dirname(linkname) - if not os.path.exists(dirname): - util.makedirs(dirname, self.createmode) + util.ensuredirs(os.path.dirname(linkname), self.createmode) if self._cansymlink: try: @@ -518,84 +555,6 @@ _rcpath = osrcpath() return _rcpath -if os.name != 'nt': - - def rcfiles(path): - rcs = [os.path.join(path, 'hgrc')] - rcdir = os.path.join(path, 'hgrc.d') - try: - rcs.extend([os.path.join(rcdir, f) - for f, kind in osutil.listdir(rcdir) - if f.endswith(".rc")]) - except OSError: - pass - return rcs - - def systemrcpath(): - path = [] - if sys.platform == 'plan9': - root = 'lib/mercurial' - else: - root = 'etc/mercurial' - # old mod_python does not set sys.argv - if len(getattr(sys, 'argv', [])) > 0: - p = os.path.dirname(os.path.dirname(sys.argv[0])) - path.extend(rcfiles(os.path.join(p, root))) - path.extend(rcfiles('/' + root)) - return path - - def userrcpath(): - if sys.platform == 'plan9': - return [os.environ['home'] + '/lib/hgrc'] - else: - return [os.path.expanduser('~/.hgrc')] - -else: - - import _winreg - - def systemrcpath(): - '''return default os-specific hgrc search path''' - rcpath = [] - filename = util.executablepath() - # Use mercurial.ini found in directory with hg.exe - progrc = os.path.join(os.path.dirname(filename), 'mercurial.ini') - if os.path.isfile(progrc): - rcpath.append(progrc) - return rcpath - # Use hgrc.d found in directory with hg.exe - progrcd = os.path.join(os.path.dirname(filename), 'hgrc.d') - if os.path.isdir(progrcd): - for f, kind in osutil.listdir(progrcd): - if f.endswith('.rc'): - rcpath.append(os.path.join(progrcd, f)) - return rcpath - # else look for a system rcpath in the registry - value = util.lookupreg('SOFTWARE\\Mercurial', None, - _winreg.HKEY_LOCAL_MACHINE) - if not isinstance(value, str) or not value: - return rcpath - value = util.localpath(value) - for p in value.split(os.pathsep): - if p.lower().endswith('mercurial.ini'): - rcpath.append(p) - elif os.path.isdir(p): - for f, kind in osutil.listdir(p): - if f.endswith('.rc'): - rcpath.append(os.path.join(p, f)) - return rcpath - - def userrcpath(): - '''return os-specific hgrc search path to the user dir''' - home = os.path.expanduser('~') - path = [os.path.join(home, 'mercurial.ini'), - os.path.join(home, '.hgrc')] - userprofile = os.environ.get('USERPROFILE') - if userprofile: - path.append(os.path.join(userprofile, 'mercurial.ini')) - path.append(os.path.join(userprofile, '.hgrc')) - return path - def revsingle(repo, revspec, default='.'): if not revspec: return repo[default] @@ -647,6 +606,8 @@ start, end = spec.split(_revrangesep, 1) start = revfix(repo, start, 0) end = revfix(repo, end, len(repo) - 1) + if end == nullrev and start <= 0: + start = nullrev rangeiter = repo.changelog.revs(start, end) if not seen and not l: # by far the most common case: revs = ["-1:0"] @@ -730,30 +691,33 @@ rejected = [] m.bad = lambda x, y: rejected.append(x) - for abs in repo.walk(m): - target = repo.wjoin(abs) - good = True - try: - 
audit_path(abs) - except (OSError, util.Abort): - good = False - rel = m.rel(abs) - exact = m.exact(abs) - if good and abs not in repo.dirstate: + ctx = repo[None] + dirstate = repo.dirstate + walkresults = dirstate.walk(m, sorted(ctx.substate), True, False) + for abs, st in walkresults.iteritems(): + dstate = dirstate[abs] + if dstate == '?' and audit_path.check(abs): unknown.append(abs) - if repo.ui.verbose or not exact: - repo.ui.status(_('adding %s\n') % ((pats and rel) or abs)) - elif (repo.dirstate[abs] != 'r' and - (not good or not os.path.lexists(target) or - (os.path.isdir(target) and not os.path.islink(target)))): + elif dstate != 'r' and not st: deleted.append(abs) - if repo.ui.verbose or not exact: - repo.ui.status(_('removing %s\n') % ((pats and rel) or abs)) # for finding renames - elif repo.dirstate[abs] == 'r': + elif dstate == 'r': removed.append(abs) - elif repo.dirstate[abs] == 'a': + elif dstate == 'a': added.append(abs) + + unknownset = set(unknown) + toprint = unknownset.copy() + toprint.update(deleted) + for abs in sorted(toprint): + if repo.ui.verbose or not m.exact(abs): + rel = m.rel(abs) + if abs in unknownset: + status = _('adding %s\n') % ((pats and rel) or abs) + else: + status = _('removing %s\n') % ((pats and rel) or abs) + repo.ui.status(status) + copies = {} if similarity > 0: for old, new, score in similar.findrenames(repo, @@ -780,49 +744,6 @@ return 1 return 0 -def updatedir(ui, repo, patches, similarity=0): - '''Update dirstate after patch application according to metadata''' - if not patches: - return [] - copies = [] - removes = set() - cfiles = patches.keys() - cwd = repo.getcwd() - if cwd: - cfiles = [util.pathto(repo.root, cwd, f) for f in patches.keys()] - for f in patches: - gp = patches[f] - if not gp: - continue - if gp.op == 'RENAME': - copies.append((gp.oldpath, gp.path)) - removes.add(gp.oldpath) - elif gp.op == 'COPY': - copies.append((gp.oldpath, gp.path)) - elif gp.op == 'DELETE': - removes.add(gp.path) - - wctx = repo[None] - for src, dst in copies: - dirstatecopy(ui, repo, wctx, src, dst, cwd=cwd) - if (not similarity) and removes: - wctx.remove(sorted(removes), True) - - for f in patches: - gp = patches[f] - if gp and gp.mode: - islink, isexec = gp.mode - dst = repo.wjoin(gp.path) - # patch won't create empty files - if gp.op == 'ADD' and not os.path.lexists(dst): - flags = (isexec and 'x' or '') + (islink and 'l' or '') - repo.wwrite(gp.path, '', flags) - util.setflags(dst, islink, isexec) - addremove(repo, cfiles, similarity=similarity) - files = patches.keys() - files.extend([r for r in removes if r not in files]) - return sorted(files) - def dirstatecopy(ui, repo, wctx, src, dst, dryrun=False, cwd=None): """Update the dirstate to reflect the intent of copying src to dst. For different reasons it might not end with dst being marked as copied from src. 
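Several vfs hunks a little above replace the isdir-check-plus-makedirs pattern with a single util.ensuredirs call, removing the window between the existence check and the mkdir. ensuredirs itself is introduced elsewhere in this series and is not shown here; its behaviour is roughly the following (a sketch under that assumption, not the actual implementation):

    import os, errno

    def ensuredirs(name, mode=None):
        # like os.makedirs, but tolerate the directory appearing concurrently
        if os.path.isdir(name):
            return
        try:
            os.makedirs(name)
        except OSError as err:
            if err.errno == errno.EEXIST and os.path.isdir(name):
                return
            raise
        if mode is not None:
            os.chmod(name, mode)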
@@ -985,3 +906,48 @@ del obj.__dict__[self.name] except KeyError: raise AttributeError(self.name) + +class dirs(object): + '''a multiset of directory names from a dirstate or manifest''' + + def __init__(self, map, skip=None): + self._dirs = {} + addpath = self.addpath + if util.safehasattr(map, 'iteritems') and skip is not None: + for f, s in map.iteritems(): + if s[0] != skip: + addpath(f) + else: + for f in map: + addpath(f) + + def addpath(self, path): + dirs = self._dirs + for base in finddirs(path): + if base in dirs: + dirs[base] += 1 + return + dirs[base] = 1 + + def delpath(self, path): + dirs = self._dirs + for base in finddirs(path): + if dirs[base] > 1: + dirs[base] -= 1 + return + del dirs[base] + + def __iter__(self): + return self._dirs.iterkeys() + + def __contains__(self, d): + return d in self._dirs + +if util.safehasattr(parsers, 'dirs'): + dirs = parsers.dirs + +def finddirs(path): + pos = path.rfind('/') + while pos != -1: + yield path[:pos] + pos = path.rfind('/', 0, pos) diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/scmwindows.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/mercurial/scmwindows.py Tue May 14 23:04:23 2013 +0400 @@ -0,0 +1,46 @@ +import os +import osutil +import util +import _winreg + +def systemrcpath(): + '''return default os-specific hgrc search path''' + rcpath = [] + filename = util.executablepath() + # Use mercurial.ini found in directory with hg.exe + progrc = os.path.join(os.path.dirname(filename), 'mercurial.ini') + if os.path.isfile(progrc): + rcpath.append(progrc) + return rcpath + # Use hgrc.d found in directory with hg.exe + progrcd = os.path.join(os.path.dirname(filename), 'hgrc.d') + if os.path.isdir(progrcd): + for f, kind in osutil.listdir(progrcd): + if f.endswith('.rc'): + rcpath.append(os.path.join(progrcd, f)) + return rcpath + # else look for a system rcpath in the registry + value = util.lookupreg('SOFTWARE\\Mercurial', None, + _winreg.HKEY_LOCAL_MACHINE) + if not isinstance(value, str) or not value: + return rcpath + value = util.localpath(value) + for p in value.split(os.pathsep): + if p.lower().endswith('mercurial.ini'): + rcpath.append(p) + elif os.path.isdir(p): + for f, kind in osutil.listdir(p): + if f.endswith('.rc'): + rcpath.append(os.path.join(p, f)) + return rcpath + +def userrcpath(): + '''return os-specific hgrc search path to the user dir''' + home = os.path.expanduser('~') + path = [os.path.join(home, 'mercurial.ini'), + os.path.join(home, '.hgrc')] + userprofile = os.environ.get('USERPROFILE') + if userprofile: + path.append(os.path.join(userprofile, 'mercurial.ini')) + path.append(os.path.join(userprofile, '.hgrc')) + return path diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/sshpeer.py --- a/mercurial/sshpeer.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/sshpeer.py Tue May 14 23:04:23 2013 +0400 @@ -70,7 +70,10 @@ (_serverquote(remotecmd), _serverquote(self.path)))) ui.note(_('running %s\n') % cmd) cmd = util.quotecommand(cmd) - self.pipeo, self.pipei, self.pipee = util.popen3(cmd) + + # while self.subprocess isn't used, having it allows the subprocess to + # to clean up correctly later + self.pipeo, self.pipei, self.pipee, self.subprocess = util.popen4(cmd) # skip any noise generated by remote shell self._callstream("hello") diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/sslutil.py --- a/mercurial/sslutil.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/sslutil.py Tue May 14 23:04:23 2013 +0400 @@ -99,7 +99,7 @@ self.ui = ui self.host = host - def __call__(self, sock): + def __call__(self, sock, 
strict=False): host = self.host cacerts = self.ui.config('web', 'cacerts') hostfingerprint = self.ui.config('hostfingerprints', host) @@ -107,13 +107,22 @@ if hostfingerprint: raise util.Abort(_("host fingerprint for %s can't be " "verified (Python too old)") % host) + if strict: + raise util.Abort(_("certificate for %s can't be verified " + "(Python too old)") % host) if self.ui.configbool('ui', 'reportoldssl', True): self.ui.warn(_("warning: certificate for %s can't be verified " "(Python too old)\n") % host) return + if not sock.cipher(): # work around http://bugs.python.org/issue13721 raise util.Abort(_('%s ssl connection error') % host) - peercert = sock.getpeercert(True) + try: + peercert = sock.getpeercert(True) + peercert2 = sock.getpeercert() + except AttributeError: + raise util.Abort(_('%s ssl connection error') % host) + if not peercert: raise util.Abort(_('%s certificate error: ' 'no certificate received') % host) @@ -129,13 +138,18 @@ self.ui.debug('%s certificate matched fingerprint %s\n' % (host, nicefingerprint)) elif cacerts: - msg = _verifycert(sock.getpeercert(), host) + msg = _verifycert(peercert2, host) if msg: raise util.Abort(_('%s certificate error: %s') % (host, msg), hint=_('configure hostfingerprint %s or use ' '--insecure to connect insecurely') % nicefingerprint) self.ui.debug('%s certificate successfully verified\n' % host) + elif strict: + raise util.Abort(_('%s certificate with fingerprint %s not ' + 'verified') % (host, nicefingerprint), + hint=_('check hostfingerprints or web.cacerts ' + 'config setting')) else: self.ui.warn(_('warning: %s certificate with fingerprint %s not ' 'verified (check hostfingerprints or web.cacerts ' diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/statichttprepo.py --- a/mercurial/statichttprepo.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/statichttprepo.py Tue May 14 23:04:23 2013 +0400 @@ -10,7 +10,7 @@ from i18n import _ import changelog, byterange, url, error import localrepo, manifest, util, scmutil, store -import urllib, urllib2, errno +import urllib, urllib2, errno, os class httprangereader(object): def __init__(self, url, opener): diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/subrepo.py --- a/mercurial/subrepo.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/subrepo.py Tue May 14 23:04:23 2013 +0400 @@ -5,7 +5,7 @@ # This software may be used and distributed according to the terms of the # GNU General Public License version 2 or any later version. 
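Stepping back briefly to the dirs multiset added to scmutil a few hunks above (the C version from parsers replaces it when available): it keeps a per-directory count so that a directory disappears only when the last tracked path under it is removed, which keeps directory membership tests cheap. A minimal usage sketch of that class:

    # using the pure-Python dirs class defined in the scmutil hunk above
    d = dirs(['a/b/c.txt', 'a/b/d.txt', 'e/f.txt'])
    'a/b' in d            # True: tracked files live under a/b
    'e' in d              # True
    d.delpath('a/b/c.txt')
    'a/b' in d            # still True; only the count dropped
    d.delpath('a/b/d.txt')
    'a/b' in d            # False: the last file under a/b is gone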
-import errno, os, re, xml.dom.minidom, shutil, posixpath +import errno, os, re, xml.dom.minidom, shutil, posixpath, sys import stat, subprocess, tarfile from i18n import _ import config, scmutil, util, node, error, cmdutil, bookmarks, match as matchmod @@ -14,11 +14,34 @@ nullstate = ('', '', 'empty') +def _expandedabspath(path): + ''' + get a path or url and if it is a path expand it and return an absolute path + ''' + expandedpath = util.urllocalpath(util.expandpath(path)) + u = util.url(expandedpath) + if not u.scheme: + path = util.normpath(os.path.abspath(u.path)) + return path + +def _getstorehashcachename(remotepath): + '''get a unique filename for the store hash cache of a remote repository''' + return util.sha1(_expandedabspath(remotepath)).hexdigest()[0:12] + +def _calcfilehash(filename): + data = '' + if os.path.exists(filename): + fd = open(filename, 'rb') + data = fd.read() + fd.close() + return util.sha1(data).hexdigest() + class SubrepoAbort(error.Abort): """Exception class used to avoid handling a subrepo error more than once""" def __init__(self, *args, **kw): error.Abort.__init__(self, *args, **kw) self.subrepo = kw.get('subrepo') + self.cause = kw.get('cause') def annotatesubrepoerror(func): def decoratedmethod(self, *args, **kargs): @@ -31,7 +54,8 @@ subrepo = subrelpath(self) errormsg = str(ex) + ' ' + _('(in subrepo %s)') % subrepo # avoid handling this exception by raising a SubrepoAbort exception - raise SubrepoAbort(errormsg, hint=ex.hint, subrepo=subrepo) + raise SubrepoAbort(errormsg, hint=ex.hint, subrepo=subrepo, + cause=sys.exc_info()) return res return decoratedmethod @@ -264,6 +288,9 @@ return repo.ui.config('paths', 'default-push') if repo.ui.config('paths', 'default'): return repo.ui.config('paths', 'default') + if repo.sharedpath != repo.path: + # chop off the .hg component to get the default path form + return os.path.dirname(repo.sharedpath) if abort: raise util.Abort(_("default path for subrepository not found")) @@ -297,6 +324,13 @@ class abstractsubrepo(object): + def storeclean(self, path): + """ + returns true if the repository has not changed since it was last + cloned from or pushed to a given repository. + """ + return False + def dirty(self, ignoreupdate=False): """returns true if the dirstate of the subrepo is dirty or does not match current stored state. 
If ignoreupdate is true, only check @@ -389,6 +423,7 @@ ui.progress(_('archiving (%s)') % relpath, i + 1, unit=_('files'), total=total) ui.progress(_('archiving (%s)') % relpath, None) + return total def walk(self, match): ''' @@ -420,8 +455,75 @@ v = r.ui.config(s, k) if v: self._repo.ui.setconfig(s, k, v) + self._repo.ui.setconfig('ui', '_usedassubrepo', 'True') self._initrepo(r, state[0], create) + def storeclean(self, path): + clean = True + lock = self._repo.lock() + itercache = self._calcstorehash(path) + try: + for filehash in self._readstorehashcache(path): + if filehash != itercache.next(): + clean = False + break + except StopIteration: + # the cached and current pull states have a different size + clean = False + if clean: + try: + itercache.next() + # the cached and current pull states have a different size + clean = False + except StopIteration: + pass + lock.release() + return clean + + def _calcstorehash(self, remotepath): + '''calculate a unique "store hash" + + This method is used to to detect when there are changes that may + require a push to a given remote path.''' + # sort the files that will be hashed in increasing (likely) file size + filelist = ('bookmarks', 'store/phaseroots', 'store/00changelog.i') + yield '# %s\n' % _expandedabspath(remotepath) + for relname in filelist: + absname = os.path.normpath(self._repo.join(relname)) + yield '%s = %s\n' % (relname, _calcfilehash(absname)) + + def _getstorehashcachepath(self, remotepath): + '''get a unique path for the store hash cache''' + return self._repo.join(os.path.join( + 'cache', 'storehash', _getstorehashcachename(remotepath))) + + def _readstorehashcache(self, remotepath): + '''read the store hash cache for a given remote repository''' + cachefile = self._getstorehashcachepath(remotepath) + if not os.path.exists(cachefile): + return '' + fd = open(cachefile, 'r') + pullstate = fd.readlines() + fd.close() + return pullstate + + def _cachestorehash(self, remotepath): + '''cache the current store hash + + Each remote repo requires its own store hash cache, because a subrepo + store may be "clean" versus a given remote repo, but not versus another + ''' + cachefile = self._getstorehashcachepath(remotepath) + lock = self._repo.lock() + storehash = list(self._calcstorehash(remotepath)) + cachedir = os.path.dirname(cachefile) + if not os.path.exists(cachedir): + util.makedirs(cachedir, notindexed=True) + fd = open(cachefile, 'w') + fd.writelines(storehash) + fd.close() + lock.release() + @annotatesubrepoerror def _initrepo(self, parentrepo, source, create): self._repo._subparent = parentrepo @@ -479,14 +581,15 @@ @annotatesubrepoerror def archive(self, ui, archiver, prefix, match=None): self._get(self._state + ('hg',)) - abstractsubrepo.archive(self, ui, archiver, prefix, match) - + total = abstractsubrepo.archive(self, ui, archiver, prefix, match) rev = self._state[1] ctx = self._repo[rev] for subpath in ctx.substate: s = subrepo(ctx, subpath) submatch = matchmod.narrowmatcher(subpath, match) - s.archive(ui, archiver, os.path.join(prefix, self._path), submatch) + total += s.archive( + ui, archiver, os.path.join(prefix, self._path), submatch) + return total @annotatesubrepoerror def dirty(self, ignoreupdate=False): @@ -540,12 +643,18 @@ update=False) self._repo = cloned.local() self._initrepo(parentrepo, source, create=True) + self._cachestorehash(srcurl) else: self._repo.ui.status(_('pulling subrepo %s from %s\n') % (subrelpath(self), srcurl)) + cleansub = self.storeclean(srcurl) + remotebookmarks = 
other.listkeys('bookmarks') self._repo.pull(other) - bookmarks.updatefromremote(self._repo.ui, self._repo, other, - srcurl) + bookmarks.updatefromremote(self._repo.ui, self._repo, + remotebookmarks, srcurl) + if cleansub: + # keep the repo clean after pull + self._cachestorehash(srcurl) @annotatesubrepoerror def get(self, state, overwrite=False): @@ -595,10 +704,20 @@ return False dsturl = _abssource(self._repo, True) + if not force: + if self.storeclean(dsturl): + self._repo.ui.status( + _('no changes made to subrepo %s since last push to %s\n') + % (subrelpath(self), dsturl)) + return None self._repo.ui.status(_('pushing subrepo %s to %s\n') % (subrelpath(self), dsturl)) other = hg.peer(self._repo, {'ssh': ssh}, dsturl) - return self._repo.push(other, force, newbranch=newbranch) + res = self._repo.push(other, force, newbranch=newbranch) + + # the repo is now clean + self._cachestorehash(dsturl) + return res @annotatesubrepoerror def outgoing(self, ui, dest, opts): @@ -649,7 +768,7 @@ opts['rev'] = substate[1] pats = [] - if not opts['all']: + if not opts.get('all'): pats = ['set:modified()'] self.filerevert(ui, *pats, **opts) @@ -659,7 +778,7 @@ def filerevert(self, ui, *pats, **opts): ctx = self._repo[opts['rev']] parents = self._repo.dirstate.parents() - if opts['all']: + if opts.get('all'): pats = ['set:modified()'] else: pats = [] @@ -1147,7 +1266,7 @@ if remote not in tracking: # create a new local tracking branch - local = remote.split('/', 2)[2] + local = remote.split('/', 3)[3] checkout(['-b', local, remote]) elif self._gitisancestor(branch2rev[tracking[remote]], remote): # When updating to a tracked remote branch, @@ -1266,9 +1385,10 @@ os.remove(path) def archive(self, ui, archiver, prefix, match=None): + total = 0 source, revision = self._state if not revision: - return + return total self._fetch(source, revision) # Parse git's native archive command. @@ -1289,9 +1409,11 @@ data = tar.extractfile(info).read() archiver.addfile(os.path.join(prefix, self._path, info.name), info.mode, info.issym(), data) + total += 1 ui.progress(_('archiving (%s)') % relpath, i + 1, unit=_('files')) ui.progress(_('archiving (%s)') % relpath, None) + return total @annotatesubrepoerror diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/tags.py --- a/mercurial/tags.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/tags.py Tue May 14 23:04:23 2013 +0400 @@ -132,9 +132,10 @@ if (bnode != anode and anode in bhist and (bnode not in ahist or len(bhist) > len(ahist))): anode = bnode + else: + tagtypes[name] = tagtype ahist.extend([n for n in bhist if n not in ahist]) alltags[name] = anode, ahist - tagtypes[name] = tagtype # The tag cache only stores info about heads, not the tag contents diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templatefilters.py --- a/mercurial/templatefilters.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templatefilters.py Tue May 14 23:04:23 2013 +0400 @@ -5,6 +5,7 @@ # This software may be used and distributed according to the terms of the # GNU General Public License version 2 or any later version. +from i18n import _ import cgi, re, os, time, urllib import encoding, node, util, error import hbisect @@ -391,6 +392,15 @@ "xmlescape": xmlescape, } +def websub(text, websubtable): + """:websub: Any text. Only applies to hgweb. Applies the regular + expression replacements defined in the websub section. 
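In hgweb the websubtable argument is built from the [websub] section of the configuration; the filter itself, visible just below, is simply a chain of re.sub calls. A sketch of the effect (the section syntax shown in the comment and the bug-tracker URL are illustrative assumptions, not part of this patch):

    import re
    # hypothetical hgrc entry:
    #   [websub]
    #   issues = s|issue(\d+)|<a href="https://bts.example.org/issue\1">issue\1</a>|i
    websubtable = [(re.compile(r'issue(\d+)', re.I),
                    r'<a href="https://bts.example.org/issue\1">issue\1</a>')]
    # using the websub() defined in the templatefilters hunk here
    websub('fix pathauditor, related to issue3858', websubtable)
    # -> 'fix pathauditor, related to
    #     <a href="https://bts.example.org/issue3858">issue3858</a>'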
+ """ + if websubtable: + for regexp, format in websubtable: + text = regexp.sub(format, text) + return text + def fillfunc(context, mapping, args): if not (1 <= len(args) <= 2): raise error.ParseError(_("fill expects one or two arguments")) diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templatekw.py --- a/mercurial/templatekw.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templatekw.py Tue May 14 23:04:23 2013 +0400 @@ -14,9 +14,13 @@ # "{files % '{file}\n'}" (hgweb-style with inlining and function support) class _hybrid(object): - def __init__(self, gen, values): + def __init__(self, gen, values, joinfmt=None): self.gen = gen self.values = values + if joinfmt: + self.joinfmt = joinfmt + else: + self.joinfmt = lambda x: x.values()[0] def __iter__(self): return self.gen def __call__(self): @@ -244,8 +248,8 @@ copies.append((fn, rename[0])) c = [{'name': x[0], 'source': x[1]} for x in copies] - return showlist('file_copy', c, plural='file_copies', - element='file', **args) + f = _showlist('file_copy', c, plural='file_copies', **args) + return _hybrid(f, c, lambda x: '%s (%s)' % (x['name'], x['source'])) # showfilecopiesswitch() displays file copies only if copy records are # provided before calling the templater, usually with a --copies @@ -256,8 +260,8 @@ """ copies = args['revcache'].get('copies') or [] c = [{'name': x[0], 'source': x[1]} for x in copies] - return showlist('file_copy', c, plural='file_copies', - element='file', **args) + f = _showlist('file_copy', c, plural='file_copies', **args) + return _hybrid(f, c, lambda x: '%s (%s)' % (x['name'], x['source'])) def showfiledels(**args): """:file_dels: List of strings. Files removed by this changeset.""" diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templater.py --- a/mercurial/templater.py Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templater.py Tue May 14 23:04:23 2013 +0400 @@ -9,6 +9,7 @@ import sys, os, re import util, config, templatefilters, parser, error import types +import minirst # template parsing @@ -207,6 +208,19 @@ f = context._filters[n] return (runfilter, (args[0][0], args[0][1], f)) +def get(context, mapping, args): + if len(args) != 2: + # i18n: "get" is a keyword + raise error.ParseError(_("get() expects two arguments")) + + dictarg = args[0][0](context, mapping, args[0][1]) + if not util.safehasattr(dictarg, 'get'): + # i18n: "get" is a keyword + raise error.ParseError(_("get() expects a dict as first argument")) + + key = args[1][0](context, mapping, args[1][1]) + yield dictarg.get(key) + def join(context, mapping, args): if not (1 <= len(args) <= 2): # i18n: "join" is a keyword @@ -214,7 +228,8 @@ joinset = args[0][0](context, mapping, args[0][1]) if util.safehasattr(joinset, '__call__'): - joinset = [x.values()[0] for x in joinset()] + jf = joinset.joinfmt + joinset = [jf(x) for x in joinset()] joiner = " " if len(args) > 1: @@ -236,6 +251,8 @@ pat = stringify(args[0][0](context, mapping, args[0][1])) rpl = stringify(args[1][0](context, mapping, args[1][1])) src = stringify(args[2][0](context, mapping, args[2][1])) + src = stringify(runtemplate(context, mapping, + compiletemplate(src, context))) yield re.sub(pat, rpl, src) def if_(context, mapping, args): @@ -274,6 +291,16 @@ t = stringify(args[1][0](context, mapping, args[1][1])) yield runtemplate(context, mapping, compiletemplate(t, context)) +def rstdoc(context, mapping, args): + if len(args) != 2: + # i18n: "rstdoc" is a keyword + raise error.ParseError(_("rstdoc expects two arguments")) + + text = stringify(args[0][0](context, mapping, args[0][1])) 
+ style = stringify(args[1][0](context, mapping, args[1][1])) + + return minirst.format(text, style=style, keep=['verbose']) + methods = { "string": lambda e, c: (runstring, e[1]), "symbol": lambda e, c: (runsymbol, e[1]), @@ -285,11 +312,13 @@ } funcs = { + "get": get, "if": if_, "ifeq": ifeq, "join": join, + "label": label, + "rstdoc": rstdoc, "sub": sub, - "label": label, } # template engine diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/atom/bookmarkentry.tmpl --- a/mercurial/templates/atom/bookmarkentry.tmpl Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templates/atom/bookmarkentry.tmpl Tue May 14 23:04:23 2013 +0400 @@ -1,7 +1,7 @@ {bookmark|escape} - - {urlbase}{url}#bookmark-{node} + + {urlbase}{url|urlescape}#bookmark-{node} {date|rfc3339date} {date|rfc3339date} {bookmark|strip|escape} diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/atom/bookmarks.tmpl --- a/mercurial/templates/atom/bookmarks.tmpl Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templates/atom/bookmarks.tmpl Tue May 14 23:04:23 2013 +0400 @@ -1,7 +1,7 @@ {header} - {urlbase}{url} - - + {urlbase}{url|urlescape} + + {repo|escape}: bookmarks {repo|escape} bookmark history Mercurial SCM diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/atom/branchentry.tmpl --- a/mercurial/templates/atom/branchentry.tmpl Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templates/atom/branchentry.tmpl Tue May 14 23:04:23 2013 +0400 @@ -1,7 +1,7 @@ {branch|escape} - - {urlbase}{url}#branch-{node} + + {urlbase}{url|urlescape}#branch-{node} {date|rfc3339date} {date|rfc3339date} diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/atom/branches.tmpl --- a/mercurial/templates/atom/branches.tmpl Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templates/atom/branches.tmpl Tue May 14 23:04:23 2013 +0400 @@ -1,7 +1,7 @@ {header} - {urlbase}{url} - - + {urlbase}{url|urlescape} + + {repo|escape}: branches {repo|escape} branch history Mercurial SCM diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/atom/changelog.tmpl --- a/mercurial/templates/atom/changelog.tmpl Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templates/atom/changelog.tmpl Tue May 14 23:04:23 2013 +0400 @@ -1,8 +1,8 @@ {header} - {urlbase}{url} - - + {urlbase}{url|urlescape} + + {repo|escape} Changelog {latestentry%feedupdated} diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/atom/changelogentry.tmpl --- a/mercurial/templates/atom/changelogentry.tmpl Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templates/atom/changelogentry.tmpl Tue May 14 23:04:23 2013 +0400 @@ -1,7 +1,7 @@ {desc|strip|firstline|strip|escape|nonempty} - {urlbase}{url}#changeset-{node} - + {urlbase}{url|urlescape}#changeset-{node} + {author|person|escape} {author|email|obfuscate} diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/atom/error.tmpl --- a/mercurial/templates/atom/error.tmpl Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templates/atom/error.tmpl Tue May 14 23:04:23 2013 +0400 @@ -1,8 +1,8 @@ {header} - {urlbase}{url} - - + {urlbase}{url|urlescape} + + Error 1970-01-01T00:00:00+00:00 diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/atom/filelog.tmpl --- a/mercurial/templates/atom/filelog.tmpl Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templates/atom/filelog.tmpl Tue May 14 23:04:23 2013 +0400 @@ -1,6 +1,6 @@ {header} - {urlbase}{url}atom-log/tip/{file|escape} - + {urlbase}{url|urlescape}atom-log/tip/{file|escape} + {repo|escape}: {file|escape} history {latestentry%feedupdated} diff -r 0890e6fd3e00 -r 838c6b72928d 
mercurial/templates/atom/tagentry.tmpl --- a/mercurial/templates/atom/tagentry.tmpl Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templates/atom/tagentry.tmpl Tue May 14 23:04:23 2013 +0400 @@ -1,7 +1,7 @@ {tag|escape} - - {urlbase}{url}#tag-{node} + + {urlbase}{url|urlescape}#tag-{node} {date|rfc3339date} {date|rfc3339date} {tag|strip|escape} diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/atom/tags.tmpl --- a/mercurial/templates/atom/tags.tmpl Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templates/atom/tags.tmpl Tue May 14 23:04:23 2013 +0400 @@ -1,7 +1,7 @@ {header} - {urlbase}{url} - - + {urlbase}{url|urlescape} + + {repo|escape}: tags {repo|escape} tag history Mercurial SCM diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/coal/header.tmpl --- a/mercurial/templates/coal/header.tmpl Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templates/coal/header.tmpl Tue May 14 23:04:23 2013 +0400 @@ -1,7 +1,7 @@ - + - - + + diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/coal/map --- a/mercurial/templates/coal/map Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templates/coal/map Tue May 14 23:04:23 2013 +0400 @@ -13,14 +13,21 @@ help = ../paper/help.tmpl helptopics = ../paper/helptopics.tmpl -helpentry = '{topic|escape}{summary|escape}' +helpentry = ' + + + {topic|escape} + + + {summary|escape} + ' -naventry = '{label|escape} ' -navshortentry = '{label|escape} ' -navgraphentry = '{label|escape} ' -filenaventry = '{label|escape} ' -filedifflink = '{file|escape} ' -filenodelink = '{file|escape} ' +naventry = '{label|escape} ' +navshortentry = '{label|escape} ' +navgraphentry = '{label|escape} ' +filenaventry = '{label|escape} ' +filedifflink = '{file|escape} ' +filenodelink = '{file|escape} ' filenolink = '{file|escape} ' fileellipses = '...' diffstatlink = ../paper/diffstat.tmpl @@ -38,10 +45,10 @@ direntry = ' - - dir. {basename|escape}/ + + dir. {basename|escape}/ - + {emptydirs|escape} @@ -52,8 +59,8 @@ fileentry = ' - - file {basename|escape} + + file {basename|escape} {size} @@ -72,7 +79,7 @@ annotateline = ' - {author|user}@{rev} {linenumber} {line|escape} @@ -97,19 +104,19 @@ changelogparent = ' parent {rev}: - {node|short} + {node|short} ' -changesetparent = '{node|short} ' +changesetparent = '{node|short} ' -filerevparent = '{rename%filerename}{node|short} ' -filerevchild = '{node|short} ' +filerevparent = '{rename%filerename}{node|short} ' +filerevchild = '{node|short} ' filerename = '{file|escape}@' filelogrename = ' base - + {file|escape}@{node|short} ' @@ -117,17 +124,17 @@ parent: - + {rename%filerename}{node|short} ' -changesetchild = ' {node|short}' +changesetchild = ' {node|short}' changelogchild = ' child - + {node|short} @@ -136,7 +143,7 @@ child: - + {node|short} @@ -145,7 +152,7 @@ tagentry = ' - + {tag|escape} @@ -157,7 +164,7 @@ bookmarkentry = ' - + {bookmark|escape} @@ -169,7 +176,7 @@ branchentry = ' - + {branch|escape} @@ -186,41 +193,41 @@ filediffparent = ' parent {rev}: - {node|short} + {node|short} ' filelogparent = ' parent {rev}: - {node|short} + {node|short} ' filediffchild = ' child {rev}: - {node|short} + {node|short} ' filelogchild = ' child {rev}: - {node|short} + {node|short} ' indexentry = ' - {name|escape} + {name|escape} {description} {contact|obfuscate} {lastchange|rfc822date} {archives%indexarchiveentry} \n' -indexarchiveentry = ' ↓{type|escape}' +indexarchiveentry = ' ↓{type|escape}' index = ../paper/index.tmpl archiveentry = '
  • - {type|escape} + {type|escape}
  • ' notfound = ../paper/notfound.tmpl error = ../paper/error.tmpl urlparameter = '{separator}{name}={value|urlescape}' hiddenformentry = '' -breadcrumb = '> {name} ' +breadcrumb = '> {name|escape} ' diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/bookmarks.tmpl --- a/mercurial/templates/gitweb/bookmarks.tmpl Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templates/gitweb/bookmarks.tmpl Tue May 14 23:04:23 2013 +0400 @@ -1,9 +1,9 @@ {header} {repo|escape}: Bookmarks + href="{url|urlescape}atom-bookmarks" title="Atom feed for {repo|escape}"/> + href="{url|urlescape}rss-bookmarks" title="RSS feed for {repo|escape}"/> @@ -13,15 +13,15 @@ diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/branches.tmpl --- a/mercurial/templates/gitweb/branches.tmpl Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templates/gitweb/branches.tmpl Tue May 14 23:04:23 2013 +0400 @@ -1,9 +1,9 @@ {header} {repo|escape}: Branches + href="{url|urlescape}atom-branches" title="Atom feed for {repo|escape}"/> + href="{url|urlescape}rss-branches" title="RSS feed for {repo|escape}"/> @@ -13,15 +13,15 @@ diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/changelog.tmpl --- a/mercurial/templates/gitweb/changelog.tmpl Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templates/gitweb/changelog.tmpl Tue May 14 23:04:23 2013 +0400 @@ -1,9 +1,9 @@ {header} {repo|escape}: Changelog + href="{url|urlescape}atom-log" title="Atom feed for {repo|escape}"/> + href="{url|urlescape}rss-log" title="RSS feed for {repo|escape}"/> @@ -12,7 +12,7 @@ Mercurial {pathdef%breadcrumb} / changelog -
    + {sessionvars%hiddenformentry}
    @@ -41,7 +41,7 @@
    -{desc|strip|escape|addbreaks|nonempty} +{desc|strip|escape|websub|addbreaks|nonempty}
    diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/error.tmpl --- a/mercurial/templates/gitweb/error.tmpl Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templates/gitweb/error.tmpl Tue May 14 23:04:23 2013 +0400 @@ -1,9 +1,9 @@ {header} {repo|escape}: Error + href="{url|urlescape}atom-log" title="Atom feed for {repo|escape}"/> + href="{url|urlescape}rss-log" title="RSS feed for {repo|escape}"/> @@ -13,14 +13,14 @@
    diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/fileannotate.tmpl --- a/mercurial/templates/gitweb/fileannotate.tmpl Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templates/gitweb/fileannotate.tmpl Tue May 14 23:04:23 2013 +0400 @@ -1,9 +1,9 @@ {header} {repo|escape}: {file|escape}@{node|short} (annotated) + href="{url|urlescape}atom-log" title="Atom feed for {repo|escape}"/> + href="{url|urlescape}rss-log" title="RSS feed for {repo|escape}"/> @@ -13,23 +13,23 @@ @@ -46,7 +46,7 @@ {branch%filerevbranch} changeset {rev} - {node|short} + {node|short} {parent%fileannotateparent} {child%fileannotatechild} @@ -56,7 +56,7 @@
    -{desc|strip|escape|addbreaks|nonempty} +{desc|strip|escape|websub|addbreaks|nonempty}
    diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/filecomparison.tmpl --- a/mercurial/templates/gitweb/filecomparison.tmpl Sun May 12 15:35:53 2013 +0400 +++ b/mercurial/templates/gitweb/filecomparison.tmpl Tue May 14 23:04:23 2013 +0400 @@ -1,9 +1,9 @@ {header} {repo|escape}: comparison {file|escape} + href="{url|urlescape}atom-log" title="Atom feed for {repo|escape}"/> + href="{url|urlescape}rss-log" title="RSS feed for {repo|escape}"/> @@ -13,23 +13,23 @@ @@ -39,7 +39,7 @@ {branch%filerevbranch} - + {parent%filecompparent} {child%filecompchild}
    changeset {rev}{node|short}
    {node|short}
diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/filediff.tmpl
--- a/mercurial/templates/gitweb/filediff.tmpl	Sun May 12 15:35:53 2013 +0400
+++ b/mercurial/templates/gitweb/filediff.tmpl	Tue May 14 23:04:23 2013 +0400
@@ -1,9 +1,9 @@
 {header}
 {repo|escape}: diff {file|escape}
+ href="{url|urlescape}atom-log" title="Atom feed for {repo|escape}"/>
+ href="{url|urlescape}rss-log" title="RSS feed for {repo|escape}"/>
@@ -13,23 +13,23 @@
@@ -39,7 +39,7 @@
 {branch%filerevbranch}
 changeset {rev}
- {node|short}
+ {node|short}
 {parent%filediffparent}
 {child%filediffchild}
diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/filelog.tmpl
--- a/mercurial/templates/gitweb/filelog.tmpl	Sun May 12 15:35:53 2013 +0400
+++ b/mercurial/templates/gitweb/filelog.tmpl	Tue May 14 23:04:23 2013 +0400
@@ -1,9 +1,9 @@
 {header}
 {repo|escape}: File revisions
+ href="{url|urlescape}atom-log" title="Atom feed for {repo|escape}"/>
+ href="{url|urlescape}rss-log" title="RSS feed for {repo|escape}"/>
@@ -13,20 +13,20 @@
diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/filerevision.tmpl
--- a/mercurial/templates/gitweb/filerevision.tmpl	Sun May 12 15:35:53 2013 +0400
+++ b/mercurial/templates/gitweb/filerevision.tmpl	Tue May 14 23:04:23 2013 +0400
@@ -1,9 +1,9 @@
 {header}
 {repo|escape}: {file|escape}@{node|short}
+ href="{url|urlescape}atom-log" title="Atom feed for {repo|escape}"/>
+ href="{url|urlescape}rss-log" title="RSS feed for {repo|escape}"/>
@@ -13,23 +13,23 @@
@@ -46,7 +46,7 @@
 {branch%filerevbranch}
 changeset {rev}
- {node|short}
+ {node|short}
 {parent%filerevparent}
 {child%filerevchild}
@@ -56,7 +56,7 @@
-{desc|strip|escape|addbreaks|nonempty}
+{desc|strip|escape|websub|addbreaks|nonempty}
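The description filter chain runs left to right, so the new websub step sits after escape and before addbreaks: the text is stripped of surrounding whitespace, HTML-escaped, rewritten by the [websub] rules (whose replacements may legitimately contain markup), then given <br/> tags at line breaks, with nonempty supplying a placeholder only when the result is empty. A rough trace, assuming the illustrative [websub] rule sketched earlier:

    desc           'fix issue42 & tidy up'
    strip|escape   'fix issue42 &amp; tidy up'
    websub         'fix <a href="https://bts.example.org/issue42">issue42</a> &amp; tidy up'
    addbreaks      same text, with <br/> inserted at any line breaks
    nonempty       unchanged (substitutes a placeholder only for an empty description)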
diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/footer.tmpl
--- a/mercurial/templates/gitweb/footer.tmpl	Sun May 12 15:35:53 2013 +0400
+++ b/mercurial/templates/gitweb/footer.tmpl	Tue May 14 23:04:23 2013 +0400
@@ -2,8 +2,8 @@
-
+ {sessionvars%hiddenformentry}
@@ -89,7 +89,7 @@
 }
 var item = '• ';
- item += '' + cur[3] + '';
+ item += '' + cur[3] + '';
 item += ' ' + tagspan + '';
 item += '' + cur[5] + ', by ' + cur[4] + '• ';
@@ -103,8 +103,8 @@
diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/header.tmpl
--- a/mercurial/templates/gitweb/header.tmpl	Sun May 12 15:35:53 2013 +0400
+++ b/mercurial/templates/gitweb/header.tmpl	Tue May 14 23:04:23 2013 +0400
@@ -2,7 +2,7 @@
-
+
-
-
+
+
diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/help.tmpl
--- a/mercurial/templates/gitweb/help.tmpl	Sun May 12 15:35:53 2013 +0400
+++ b/mercurial/templates/gitweb/help.tmpl	Tue May 14 23:04:23 2013 +0400
@@ -1,9 +1,9 @@
 {header}
 {repo|escape}: Branches
+ href="{url|urlescape}atom-tags" title="Atom feed for {repo|escape}"/>
+ href="{url|urlescape}rss-tags" title="RSS feed for {repo|escape}"/>
@@ -13,22 +13,22 @@
     
-
-{doc|escape}
-
+
+{rstdoc(doc, "html")}
+
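Replacing {doc|escape} with {rstdoc(doc, "html")} means the help text is no longer dumped as escaped plain text: rstdoc runs the reStructuredText help source through Mercurial's minirst formatter and emits HTML. A rough illustration of the difference (the rendered markup below is approximate, not minirst's exact output):

    doc                   = 'add the specified files on the next commit'  (plus the reST body)
    {doc|escape}          -> that text, HTML-escaped but otherwise shown verbatim
    {rstdoc(doc, "html")} -> <p>add the specified files on the next commit</p> and the body rendered as HTML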
 {footer}
diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/helptopics.tmpl
--- a/mercurial/templates/gitweb/helptopics.tmpl	Sun May 12 15:35:53 2013 +0400
+++ b/mercurial/templates/gitweb/helptopics.tmpl	Tue May 14 23:04:23 2013 +0400
@@ -1,9 +1,9 @@
 {header}
 {repo|escape}: Branches
+ href="{url|urlescape}atom-tags" title="Atom feed for {repo|escape}"/>
+ href="{url|urlescape}rss-tags" title="RSS feed for {repo|escape}"/>
@@ -13,14 +13,14 @@
diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/manifest.tmpl
--- a/mercurial/templates/gitweb/manifest.tmpl	Sun May 12 15:35:53 2013 +0400
+++ b/mercurial/templates/gitweb/manifest.tmpl	Tue May 14 23:04:23 2013 +0400
@@ -1,9 +1,9 @@
 {header}
 {repo|escape}: files
+ href="{url|urlescape}atom-log" title="Atom feed for {repo|escape}"/>
+ href="{url|urlescape}rss-log" title="RSS feed for {repo|escape}"/>
@@ -13,16 +13,16 @@
@@ -32,7 +32,7 @@
 drwxr-xr-x
-[up]
+[up]  
 {dentries%direntry}
diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/map
--- a/mercurial/templates/gitweb/map	Sun May 12 15:35:53 2013 +0400
+++ b/mercurial/templates/gitweb/map	Tue May 14 23:04:23 2013 +0400
@@ -11,35 +11,42 @@
 help = help.tmpl
 helptopics = helptopics.tmpl
-helpentry = '{topic|escape}{summary|escape}'
+helpentry = ' + + + {topic|escape} + + + {summary|escape} + '
-naventry = '{label|escape} '
-navshortentry = '{label|escape} '
-navgraphentry = '{label|escape} '
-filenaventry = '{label|escape} '
-filedifflink = '{file|escape} '
+naventry = '{label|escape} '
+navshortentry = '{label|escape} '
+navgraphentry = '{label|escape} '
+filenaventry = '{label|escape} '
+filedifflink = '{file|escape} '
 filenodelink = ' - {file|escape} + {file|escape} - file | - annotate | - diff | - comparison | - revisions + file | + annotate | + diff | + comparison | + revisions '
 filenolink = ' - {file|escape} + {file|escape} file | annotate | - diff | - comparison | - revisions + diff | + comparison | + revisions '
@@ -59,11 +66,11 @@
 - {basename|escape} - {emptydirs|escape} + {basename|escape} + {emptydirs|escape} - files + files '
 fileentry = '
@@ -72,12 +79,12 @@
 {date|isodate} {size} - {basename|escape} + {basename|escape} - file | - revisions | - annotate + file | + revisions | + annotate '
 filerevision = filerevision.tmpl
@@ -92,7 +99,7 @@
 annotateline = ' - {author|user}@{rev}
 {linenumber}
@@ -117,34 +124,34 @@
 parent {rev}: - {node|short} + {node|short} '
-changesetbranch = 'branch{name}'
+changesetbranch = 'branch{name|escape}'
 changesetparent = ' parent {rev} - {node|short} + {node|short} '
-filerevbranch = 'branch{name}'
+filerevbranch = 'branch{name|escape}'
 filerevparent = ' parent {rev} - + {rename%filerename}{node|short} '
 filerename = '{file|escape}@'
-filelogrename = '| base'
+filelogrename = '| base'
 fileannotateparent = ' parent {rev} - + {rename%filerename}{node|short}
@@ -152,59 +159,59 @@
 changelogchild = ' child {rev}: - {node|short} + {node|short} '
 changesetchild = ' child {rev} - {node|short} + {node|short} '
 filerevchild = ' child {rev} - {node|short} + {node|short} '
 fileannotatechild = ' child {rev} - {node|short} + {node|short} '
 tags = tags.tmpl
 tagentry = ' {date|rfc822date} - {tag|escape} + {tag|escape} - changeset | - changelog | - files + changeset | + changelog | + files '
 bookmarks = bookmarks.tmpl
 bookmarkentry = ' {date|rfc822date} - {bookmark|escape} + {bookmark|escape} - changeset | - changelog | - files + changeset | + changelog | + files '
 branches = branches.tmpl
 branchentry = ' {date|rfc822date} - {node|short} + {node|short} {branch|escape} - changeset | - changelog | - files + changeset | + changelog | + files '
 diffblock = '
 {lines}
 '
@@ -212,7 +219,7 @@
 parent {rev} - + {node|short}
@@ -221,7 +228,7 @@
 parent {rev} - + {node|short}
@@ -229,64 +236,64 @@
 filelogparent = ' parent {rev}:  - {node|short} + {node|short} '
 filediffchild = ' child {rev} - {node|short} + {node|short} '
 filecompchild = ' child {rev} - {node|short} + {node|short} '
 filelogchild = ' child {rev}:  - {node|short} + {node|short} '
 shortlog = shortlog.tmpl
 graph = graph.tmpl
-tagtag = '{name} '
-branchtag = '{name} '
-inbranchtag = '{name} '
-bookmarktag = '{name} '
+tagtag = '{name|escape} '
+branchtag = '{name|escape} '
+inbranchtag = '{name|escape} '
+bookmarktag = '{name|escape} '
 shortlogentry = ' {date|rfc822date} {author|person} - + {desc|strip|firstline|escape|nonempty} {inbranch%inbranchtag}{branches%branchtag}{tags%tagtag}{bookmarks%bookmarktag} - changeset | - files + changeset | + files '
 filelogentry = ' {date|rfc822date} - + {desc|strip|firstline|escape|nonempty} - file | diff | annotate {rename%filelogrename} + file | diff | annotate {rename%filelogrename} '
-archiveentry = ' | {type|escape} '
+archiveentry = ' | {type|escape} '
 indexentry = ' - + {name|escape}
@@ -296,13 +303,13 @@
 {archives%indexarchiveentry} {if(isdirectory, '', '' )} \n'
-indexarchiveentry = ' {type|escape} '
+indexarchiveentry = ' {type|escape} '
 index = index.tmpl
 urlparameter = '{separator}{name}={value|urlescape}'
 hiddenformentry = ''
-breadcrumb = '> {name} '
+breadcrumb = '> {name|escape} '
diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/notfound.tmpl
--- a/mercurial/templates/gitweb/notfound.tmpl	Sun May 12 15:35:53 2013 +0400
+++ b/mercurial/templates/gitweb/notfound.tmpl	Tue May 14 23:04:23 2013 +0400
@@ -12,7 +12,7 @@
 The specified repository "{repo|escape}" is unknown, sorry.

-Please go back to the main repository list page.
+Please go back to the main repository list page.
 {footer}
diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/search.tmpl
--- a/mercurial/templates/gitweb/search.tmpl	Sun May 12 15:35:53 2013 +0400
+++ b/mercurial/templates/gitweb/search.tmpl	Tue May 14 23:04:23 2013 +0400
@@ -1,9 +1,9 @@
 {header}
 {repo|escape}: Search
+ href="{url|urlescape}atom-log" title="Atom feed for {repo|escape}"/>
+ href="{url|urlescape}rss-log" title="RSS feed for {repo|escape}"/>
@@ -11,7 +11,7 @@
 Mercurial
 Mercurial {pathdef%breadcrumb} / search
-
+ {sessionvars%hiddenformentry}
diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/shortlog.tmpl
--- a/mercurial/templates/gitweb/shortlog.tmpl	Sun May 12 15:35:53 2013 +0400
+++ b/mercurial/templates/gitweb/shortlog.tmpl	Tue May 14 23:04:23 2013 +0400
@@ -1,9 +1,9 @@
 {header}
 {repo|escape}: Shortlog
+ href="{url|urlescape}atom-log" title="Atom feed for {repo|escape}"/>
+ href="{url|urlescape}rss-log" title="RSS feed for {repo|escape}"/>
@@ -12,22 +12,22 @@
 Mercurial {pathdef%breadcrumb} / shortlog
-
+ {sessionvars%hiddenformentry}
diff -r 0890e6fd3e00 -r 838c6b72928d mercurial/templates/gitweb/summary.tmpl
--- a/mercurial/templates/gitweb/summary.tmpl	Sun May 12 15:35:53 2013 +0400
+++ b/mercurial/templates/gitweb/summary.tmpl	Tue May 14 23:04:23 2013 +0400
@@ -1,16 +1,16 @@
 {header}
 {repo|escape}: Summary
+ href="{url|urlescape}atom-log" title="Atom feed for {repo|escape}"/>
+ href="{url|urlescape}rss-log" title="RSS feed for {repo|escape}"/>