# HG changeset patch
# User Yuya Nishihara
# Date 1565742174 -32400
# Node ID 7013c7ce987fc8f5b97c772b6e68e1e83f8c5c1b
# Parent  d684449eef67a296ffb4f0bdf83849b9bef796d1
# Parent  f59f8a5e90969153d94216720650d59e0eebfa1a
merge with stable

diff -r f59f8a5e9096 -r 7013c7ce987f contrib/automation/hgautomation/aws.py --- a/contrib/automation/hgautomation/aws.py Mon Aug 12 14:00:19 2019 -0400 +++ b/contrib/automation/hgautomation/aws.py Wed Aug 14 09:22:54 2019 +0900 @@ -970,7 +970,7 @@ 'DeviceName': image.block_device_mappings[0]['DeviceName'], 'Ebs': { 'DeleteOnTermination': True, - 'VolumeSize': 8, + 'VolumeSize': 12, 'VolumeType': 'gp2', }, } diff -r f59f8a5e9096 -r 7013c7ce987f contrib/automation/hgautomation/linux.py --- a/contrib/automation/hgautomation/linux.py Mon Aug 12 14:00:19 2019 -0400 +++ b/contrib/automation/hgautomation/linux.py Wed Aug 14 09:22:54 2019 +0900 @@ -28,11 +28,11 @@ INSTALL_PYTHONS = r''' PYENV2_VERSIONS="2.7.16 pypy2.7-7.1.1" -PYENV3_VERSIONS="3.5.7 3.6.8 3.7.3 3.8-dev pypy3.5-7.0.0 pypy3.6-7.1.1" +PYENV3_VERSIONS="3.5.7 3.6.9 3.7.4 3.8-dev pypy3.5-7.0.0 pypy3.6-7.1.1" git clone https://github.com/pyenv/pyenv.git /hgdev/pyenv pushd /hgdev/pyenv -git checkout 3faeda67bb33e07750d1a104271369a7384ca45c +git checkout 17f44b7cd6f58ea2fa68ec0371fb9e7a826b8be2 popd export PYENV_ROOT="/hgdev/pyenv" @@ -65,6 +65,18 @@ '''.lstrip().replace('\r\n', '\n') +INSTALL_RUST = r''' +RUSTUP_INIT_SHA256=a46fe67199b7bcbbde2dcbc23ae08db6f29883e260e23899a88b9073effc9076 +wget -O rustup-init --progress dot:mega https://static.rust-lang.org/rustup/archive/1.18.3/x86_64-unknown-linux-gnu/rustup-init +echo "${RUSTUP_INIT_SHA256} rustup-init" | sha256sum --check - + +chmod +x rustup-init +sudo -H -u hg -g hg ./rustup-init -y +sudo -H -u hg -g hg /home/hg/.cargo/bin/rustup install 1.31.1 1.34.2 +sudo -H -u hg -g hg /home/hg/.cargo/bin/rustup component add clippy +''' + + BOOTSTRAP_VIRTUALENV = r''' /usr/bin/virtualenv /hgdev/venv-bootstrap @@ -286,6 +298,8 @@ # Will be normalized to hg:hg later.
sudo chown `whoami` /hgdev +{install_rust} + cp requirements-py2.txt /hgdev/requirements-py2.txt cp requirements-py3.txt /hgdev/requirements-py3.txt @@ -309,6 +323,7 @@ sudo chown -R hg:hg /hgdev '''.lstrip().format( + install_rust=INSTALL_RUST, install_pythons=INSTALL_PYTHONS, bootstrap_virtualenv=BOOTSTRAP_VIRTUALENV ).replace('\r\n', '\n') diff -r f59f8a5e9096 -r 7013c7ce987f contrib/automation/linux-requirements-py2.txt --- a/contrib/automation/linux-requirements-py2.txt Mon Aug 12 14:00:19 2019 -0400 +++ b/contrib/automation/linux-requirements-py2.txt Wed Aug 14 09:22:54 2019 +0900 @@ -2,7 +2,7 @@ # This file is autogenerated by pip-compile # To update, run: # -# pip-compile -U --generate-hashes --output-file contrib/automation/linux-requirements-py2.txt contrib/automation/linux-requirements.txt.in +# pip-compile --generate-hashes --output-file=contrib/automation/linux-requirements-py2.txt contrib/automation/linux-requirements.txt.in # astroid==1.6.6 \ --hash=sha256:87de48a92e29cedf7210ffa853d11441e7ad94cb47bacd91b023499b51cbc756 \ @@ -22,10 +22,10 @@ --hash=sha256:509f9419ee91cdd00ba34443217d5ca51f5a364a404e1dce9e8979cea969ca48 \ --hash=sha256:f5260a6e679d2ff42ec91ec5252f4eeffdcf21053db9113bd0a8e4d953769c00 \ # via vcrpy -docutils==0.14 \ - --hash=sha256:02aec4bd92ab067f6ff27a38a38a41173bf01bed8f89157768c1573f53e474a6 \ - --hash=sha256:51e64ef2ebfb29cae1faa133b3710143496eca21c530f3f71424d77687764274 \ - --hash=sha256:7a4bd47eaf6596e1295ecb11361139febe29b084a87bf005bf899f9a42edc3c6 +docutils==0.15.2 \ + --hash=sha256:6c4f696463b79f1fb8ba0c594b63840ebd41f059e92b31957c46b74a4599b6d0 \ + --hash=sha256:9e4d7ecfc600058e07ba661411a2b7de2fd0fafa17d1a7f7361cd47b1175c827 \ + --hash=sha256:a2aeea129088da402665e92e0b25b04b073c04b2dce4ab65caaa38b7ce2e1a99 enum34==1.1.6 \ --hash=sha256:2d81cbbe0e73112bdfe6ef8576f2238f2ba27dd0d55752a776c41d38b7da2850 \ --hash=sha256:644837f692e5f550741432dd3f223bbb9852018674981b1664e5dc339387588a \ @@ -36,83 +36,70 @@ --hash=sha256:330cc27ccbf7f1e992e69fef78261dc7c6569012cf397db8d3de0234e6c937ca \ --hash=sha256:a7bb0f2cf3a3fd1ab2732cb49eba4252c2af4240442415b4abce3b87022a8f50 \ # via mock -futures==3.2.0 \ - --hash=sha256:9ec02aa7d674acb8618afb127e27fde7fc68994c0437ad759fa094a574adb265 \ - --hash=sha256:ec0a6cb848cc212002b9828c3e34c675e0c9ff6741dc445cab6fdd4e1085d1f1 \ +futures==3.3.0 \ + --hash=sha256:49b3f5b064b6e3afc3316421a3f25f66c137ae88f068abbf72830170033c5e16 \ + --hash=sha256:7e033af76a5e35f58e56da7a91e687706faf4e7bdfb2cbc3f2cca6b9bcda9794 \ # via isort fuzzywuzzy==0.17.0 \ --hash=sha256:5ac7c0b3f4658d2743aa17da53a55598144edbc5bee3c6863840636e6926f254 \ --hash=sha256:6f49de47db00e1c71d40ad16da42284ac357936fa9b66bea1df63fed07122d62 -isort==4.3.17 \ - --hash=sha256:01cb7e1ca5e6c5b3f235f0385057f70558b70d2f00320208825fa62887292f43 \ - --hash=sha256:268067462aed7eb2a1e237fcb287852f22077de3fb07964e87e00f829eea2d1a \ +isort==4.3.21 \ + --hash=sha256:54da7e92468955c4fceacd0c86bd0ec997b0e1ee80d97f67c35a78b719dccab1 \ + --hash=sha256:6e811fcb295968434526407adb8796944f1988c5b65e8139058f2014cbe100fd \ # via pylint -lazy-object-proxy==1.3.1 \ - --hash=sha256:0ce34342b419bd8f018e6666bfef729aec3edf62345a53b537a4dcc115746a33 \ - --hash=sha256:1b668120716eb7ee21d8a38815e5eb3bb8211117d9a90b0f8e21722c0758cc39 \ - --hash=sha256:209615b0fe4624d79e50220ce3310ca1a9445fd8e6d3572a896e7f9146bbf019 \ - --hash=sha256:27bf62cb2b1a2068d443ff7097ee33393f8483b570b475db8ebf7e1cba64f088 \ - --hash=sha256:27ea6fd1c02dcc78172a82fc37fcc0992a94e4cecf53cb6d73f11749825bd98b \ - 
--hash=sha256:2c1b21b44ac9beb0fc848d3993924147ba45c4ebc24be19825e57aabbe74a99e \ - --hash=sha256:2df72ab12046a3496a92476020a1a0abf78b2a7db9ff4dc2036b8dd980203ae6 \ - --hash=sha256:320ffd3de9699d3892048baee45ebfbbf9388a7d65d832d7e580243ade426d2b \ - --hash=sha256:50e3b9a464d5d08cc5227413db0d1c4707b6172e4d4d915c1c70e4de0bbff1f5 \ - --hash=sha256:5276db7ff62bb7b52f77f1f51ed58850e315154249aceb42e7f4c611f0f847ff \ - --hash=sha256:61a6cf00dcb1a7f0c773ed4acc509cb636af2d6337a08f362413c76b2b47a8dd \ - --hash=sha256:6ae6c4cb59f199d8827c5a07546b2ab7e85d262acaccaacd49b62f53f7c456f7 \ - --hash=sha256:7661d401d60d8bf15bb5da39e4dd72f5d764c5aff5a86ef52a042506e3e970ff \ - --hash=sha256:7bd527f36a605c914efca5d3d014170b2cb184723e423d26b1fb2fd9108e264d \ - --hash=sha256:7cb54db3535c8686ea12e9535eb087d32421184eacc6939ef15ef50f83a5e7e2 \ - --hash=sha256:7f3a2d740291f7f2c111d86a1c4851b70fb000a6c8883a59660d95ad57b9df35 \ - --hash=sha256:81304b7d8e9c824d058087dcb89144842c8e0dea6d281c031f59f0acf66963d4 \ - --hash=sha256:933947e8b4fbe617a51528b09851685138b49d511af0b6c0da2539115d6d4514 \ - --hash=sha256:94223d7f060301b3a8c09c9b3bc3294b56b2188e7d8179c762a1cda72c979252 \ - --hash=sha256:ab3ca49afcb47058393b0122428358d2fbe0408cf99f1b58b295cfeb4ed39109 \ - --hash=sha256:bd6292f565ca46dee4e737ebcc20742e3b5be2b01556dafe169f6c65d088875f \ - --hash=sha256:cb924aa3e4a3fb644d0c463cad5bc2572649a6a3f68a7f8e4fbe44aaa6d77e4c \ - --hash=sha256:d0fc7a286feac9077ec52a927fc9fe8fe2fabab95426722be4c953c9a8bede92 \ - --hash=sha256:ddc34786490a6e4ec0a855d401034cbd1242ef186c20d79d2166d6a4bd449577 \ - --hash=sha256:e34b155e36fa9da7e1b7c738ed7767fc9491a62ec6af70fe9da4a057759edc2d \ - --hash=sha256:e5b9e8f6bda48460b7b143c3821b21b452cb3a835e6bbd5dd33aa0c8d3f5137d \ - --hash=sha256:e81ebf6c5ee9684be8f2c87563880f93eedd56dd2b6146d8a725b50b7e5adb0f \ - --hash=sha256:eb91be369f945f10d3a49f5f9be8b3d0b93a4c2be8f8a5b83b0571b8123e0a7a \ - --hash=sha256:f460d1ceb0e4a5dcb2a652db0904224f367c9b3c1470d5a7683c0480e582468b \ +lazy-object-proxy==1.4.1 \ + --hash=sha256:159a745e61422217881c4de71f9eafd9d703b93af95618635849fe469a283661 \ + --hash=sha256:23f63c0821cc96a23332e45dfaa83266feff8adc72b9bcaef86c202af765244f \ + --hash=sha256:3b11be575475db2e8a6e11215f5aa95b9ec14de658628776e10d96fa0b4dac13 \ + --hash=sha256:3f447aff8bc61ca8b42b73304f6a44fa0d915487de144652816f950a3f1ab821 \ + --hash=sha256:4ba73f6089cd9b9478bc0a4fa807b47dbdb8fad1d8f31a0f0a5dbf26a4527a71 \ + --hash=sha256:4f53eadd9932055eac465bd3ca1bd610e4d7141e1278012bd1f28646aebc1d0e \ + --hash=sha256:64483bd7154580158ea90de5b8e5e6fc29a16a9b4db24f10193f0c1ae3f9d1ea \ + --hash=sha256:6f72d42b0d04bfee2397aa1862262654b56922c20a9bb66bb76b6f0e5e4f9229 \ + --hash=sha256:7c7f1ec07b227bdc561299fa2328e85000f90179a2f44ea30579d38e037cb3d4 \ + --hash=sha256:7c8b1ba1e15c10b13cad4171cfa77f5bb5ec2580abc5a353907780805ebe158e \ + --hash=sha256:8559b94b823f85342e10d3d9ca4ba5478168e1ac5658a8a2f18c991ba9c52c20 \ + --hash=sha256:a262c7dfb046f00e12a2bdd1bafaed2408114a89ac414b0af8755c696eb3fc16 \ + --hash=sha256:acce4e3267610c4fdb6632b3886fe3f2f7dd641158a843cf6b6a68e4ce81477b \ + --hash=sha256:be089bb6b83fac7f29d357b2dc4cf2b8eb8d98fe9d9ff89f9ea6012970a853c7 \ + --hash=sha256:bfab710d859c779f273cc48fb86af38d6e9210f38287df0069a63e40b45a2f5c \ + --hash=sha256:c10d29019927301d524a22ced72706380de7cfc50f767217485a912b4c8bd82a \ + --hash=sha256:dd6e2b598849b3d7aee2295ac765a578879830fb8966f70be8cd472e6069932e \ + --hash=sha256:e408f1eacc0a68fed0c08da45f31d0ebb38079f043328dce69ff133b95c29dc1 \ # via astroid mccabe==0.6.1 \ 
--hash=sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42 \ --hash=sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f \ # via pylint -mock==2.0.0 \ - --hash=sha256:5ce3c71c5545b472da17b72268978914d0252980348636840bd34a00b5cc96c1 \ - --hash=sha256:b158b6df76edd239b8208d481dc46b6afd45a846b7812ff0ce58971cf5bc8bba \ +mock==3.0.5 \ + --hash=sha256:83657d894c90d5681d62155c82bda9c1187827525880eda8ff5df4ec813437c3 \ + --hash=sha256:d157e52d4e5b938c550f39eb2fd15610db062441a9c2747d3dbfa9298211d0f8 \ # via vcrpy -pbr==5.1.3 \ - --hash=sha256:8257baf496c8522437e8a6cfe0f15e00aedc6c0e0e7c9d55eeeeab31e0853843 \ - --hash=sha256:8c361cc353d988e4f5b998555c88098b9d5964c2e11acf7b0d21925a66bb5824 \ - # via mock pyflakes==2.1.1 \ --hash=sha256:17dbeb2e3f4d772725c777fabc446d5634d1038f234e77343108ce445ea69ce0 \ --hash=sha256:d976835886f8c5b31d47970ed689944a0262b5f3afa00a5a7b4dc81e5449f8a2 -pygments==2.3.1 \ - --hash=sha256:5ffada19f6203563680669ee7f53b64dabbeb100eb51b61996085e99c03b284a \ - --hash=sha256:e8218dd399a61674745138520d0d4cf2621d7e032439341bc3f647bff125818d -pylint==1.9.4 \ - --hash=sha256:02c2b6d268695a8b64ad61847f92e611e6afcff33fd26c3a2125370c4662905d \ - --hash=sha256:ee1e85575587c5b58ddafa25e1c1b01691ef172e139fc25585e5d3f02451da93 +pygments==2.4.2 \ + --hash=sha256:71e430bc85c88a430f000ac1d9b331d2407f681d6f6aec95e8bcfbc3df5b0127 \ + --hash=sha256:881c4c157e45f30af185c1ffe8d549d48ac9127433f2c380c24b84572ad66297 +pylint==1.9.5 \ + --hash=sha256:367e3d49813d349a905390ac27989eff82ab84958731c5ef0bef867452cfdc42 \ + --hash=sha256:97a42df23d436c70132971d1dcb9efad2fe5c0c6add55b90161e773caf729300 python-levenshtein==0.12.0 \ --hash=sha256:033a11de5e3d19ea25c9302d11224e1a1898fe5abd23c61c7c360c25195e3eb1 -pyyaml==5.1 \ - --hash=sha256:1adecc22f88d38052fb787d959f003811ca858b799590a5eaa70e63dca50308c \ - --hash=sha256:436bc774ecf7c103814098159fbb84c2715d25980175292c648f2da143909f95 \ - --hash=sha256:460a5a4248763f6f37ea225d19d5c205677d8d525f6a83357ca622ed541830c2 \ - --hash=sha256:5a22a9c84653debfbf198d02fe592c176ea548cccce47553f35f466e15cf2fd4 \ - --hash=sha256:7a5d3f26b89d688db27822343dfa25c599627bc92093e788956372285c6298ad \ - --hash=sha256:9372b04a02080752d9e6f990179a4ab840227c6e2ce15b95e1278456664cf2ba \ - --hash=sha256:a5dcbebee834eaddf3fa7366316b880ff4062e4bcc9787b78c7fbb4a26ff2dd1 \ - --hash=sha256:aee5bab92a176e7cd034e57f46e9df9a9862a71f8f37cad167c6fc74c65f5b4e \ - --hash=sha256:c51f642898c0bacd335fc119da60baae0824f2cde95b0330b56c0553439f0673 \ - --hash=sha256:c68ea4d3ba1705da1e0d85da6684ac657912679a649e8868bd850d2c299cce13 \ - --hash=sha256:e23d0cc5299223dcc37885dae624f382297717e459ea24053709675a976a3e19 \ +pyyaml==5.1.2 \ + --hash=sha256:0113bc0ec2ad727182326b61326afa3d1d8280ae1122493553fd6f4397f33df9 \ + --hash=sha256:01adf0b6c6f61bd11af6e10ca52b7d4057dd0be0343eb9283c878cf3af56aee4 \ + --hash=sha256:5124373960b0b3f4aa7df1707e63e9f109b5263eca5976c66e08b1c552d4eaf8 \ + --hash=sha256:5ca4f10adbddae56d824b2c09668e91219bb178a1eee1faa56af6f99f11bf696 \ + --hash=sha256:7907be34ffa3c5a32b60b95f4d95ea25361c951383a894fec31be7252b2b6f34 \ + --hash=sha256:7ec9b2a4ed5cad025c2278a1e6a19c011c80a3caaac804fd2d329e9cc2c287c9 \ + --hash=sha256:87ae4c829bb25b9fe99cf71fbb2140c448f534e24c998cc60f39ae4f94396a73 \ + --hash=sha256:9de9919becc9cc2ff03637872a440195ac4241c80536632fffeb6a1e25a74299 \ + --hash=sha256:a5a85b10e450c66b49f98846937e8cfca1db3127a9d5d1e31ca45c3d0bef4c5b \ + --hash=sha256:b0997827b4f6a7c286c01c5f60384d218dca4ed7d9efa945c3e1aa623d5709ae \ + 
--hash=sha256:b631ef96d3222e62861443cc89d6563ba3eeb816eeb96b2629345ab795e53681 \ + --hash=sha256:bf47c0607522fdbca6c9e817a6e81b08491de50f3766a7a0e6a5be7905961b41 \ + --hash=sha256:f81025eddd0327c7d4cfe9b62cf33190e1e736cc6e97502b3ec425f574b3e7a8 \ # via vcrpy singledispatch==3.4.0.3 \ --hash=sha256:5b06af87df13818d14f08a028e42f566640aef80805c3b50c5056b086e3c2b9c \ @@ -125,6 +112,10 @@ vcrpy==2.0.1 \ --hash=sha256:127e79cf7b569d071d1bd761b83f7b62b2ce2a2eb63ceca7aa67cba8f2602ea3 \ --hash=sha256:57be64aa8e9883a4117d0b15de28af62275c001abcdb00b6dc2d4406073d9a4f -wrapt==1.11.1 \ - --hash=sha256:4aea003270831cceb8a90ff27c4031da6ead7ec1886023b80ce0dfe0adf61533 \ +wrapt==1.11.2 \ + --hash=sha256:565a021fd19419476b9362b05eeaa094178de64f8361e44468f9e9d7843901e1 \ # via astroid, vcrpy + +# WARNING: The following packages were not pinned, but pip requires them to be +# pinned when the requirements file includes hashes. Consider using the --allow-unsafe flag. +# setuptools==41.0.1 # via python-levenshtein diff -r f59f8a5e9096 -r 7013c7ce987f contrib/automation/linux-requirements-py3.txt --- a/contrib/automation/linux-requirements-py3.txt Mon Aug 12 14:00:19 2019 -0400 +++ b/contrib/automation/linux-requirements-py3.txt Wed Aug 14 09:22:54 2019 +0900 @@ -2,16 +2,16 @@ # This file is autogenerated by pip-compile # To update, run: # -# pip-compile -U --generate-hashes --output-file contrib/automation/linux-requirements-py3.txt contrib/automation/linux-requirements.txt.in +# pip-compile --generate-hashes --output-file=contrib/automation/linux-requirements-py3.txt contrib/automation/linux-requirements.txt.in # astroid==2.2.5 \ --hash=sha256:6560e1e1749f68c64a4b5dee4e091fce798d2f0d84ebe638cf0e0585a343acf4 \ --hash=sha256:b65db1bbaac9f9f4d190199bb8680af6f6f84fd3769a5ea883df8a91fe68b4c4 \ # via pylint -docutils==0.14 \ - --hash=sha256:02aec4bd92ab067f6ff27a38a38a41173bf01bed8f89157768c1573f53e474a6 \ - --hash=sha256:51e64ef2ebfb29cae1faa133b3710143496eca21c530f3f71424d77687764274 \ - --hash=sha256:7a4bd47eaf6596e1295ecb11361139febe29b084a87bf005bf899f9a42edc3c6 +docutils==0.15.2 \ + --hash=sha256:6c4f696463b79f1fb8ba0c594b63840ebd41f059e92b31957c46b74a4599b6d0 \ + --hash=sha256:9e4d7ecfc600058e07ba661411a2b7de2fd0fafa17d1a7f7361cd47b1175c827 \ + --hash=sha256:a2aeea129088da402665e92e0b25b04b073c04b2dce4ab65caaa38b7ce2e1a99 fuzzywuzzy==0.17.0 \ --hash=sha256:5ac7c0b3f4658d2743aa17da53a55598144edbc5bee3c6863840636e6926f254 \ --hash=sha256:6f49de47db00e1c71d40ad16da42284ac357936fa9b66bea1df63fed07122d62 @@ -19,40 +19,29 @@ --hash=sha256:c357b3f628cf53ae2c4c05627ecc484553142ca23264e593d327bcde5e9c3407 \ --hash=sha256:ea8b7f6188e6fa117537c3df7da9fc686d485087abf6ac197f9c46432f7e4a3c \ # via yarl -isort==4.3.17 \ - --hash=sha256:01cb7e1ca5e6c5b3f235f0385057f70558b70d2f00320208825fa62887292f43 \ - --hash=sha256:268067462aed7eb2a1e237fcb287852f22077de3fb07964e87e00f829eea2d1a \ +isort==4.3.21 \ + --hash=sha256:54da7e92468955c4fceacd0c86bd0ec997b0e1ee80d97f67c35a78b719dccab1 \ + --hash=sha256:6e811fcb295968434526407adb8796944f1988c5b65e8139058f2014cbe100fd \ # via pylint -lazy-object-proxy==1.3.1 \ - --hash=sha256:0ce34342b419bd8f018e6666bfef729aec3edf62345a53b537a4dcc115746a33 \ - --hash=sha256:1b668120716eb7ee21d8a38815e5eb3bb8211117d9a90b0f8e21722c0758cc39 \ - --hash=sha256:209615b0fe4624d79e50220ce3310ca1a9445fd8e6d3572a896e7f9146bbf019 \ - --hash=sha256:27bf62cb2b1a2068d443ff7097ee33393f8483b570b475db8ebf7e1cba64f088 \ - --hash=sha256:27ea6fd1c02dcc78172a82fc37fcc0992a94e4cecf53cb6d73f11749825bd98b \ - 
--hash=sha256:2c1b21b44ac9beb0fc848d3993924147ba45c4ebc24be19825e57aabbe74a99e \ - --hash=sha256:2df72ab12046a3496a92476020a1a0abf78b2a7db9ff4dc2036b8dd980203ae6 \ - --hash=sha256:320ffd3de9699d3892048baee45ebfbbf9388a7d65d832d7e580243ade426d2b \ - --hash=sha256:50e3b9a464d5d08cc5227413db0d1c4707b6172e4d4d915c1c70e4de0bbff1f5 \ - --hash=sha256:5276db7ff62bb7b52f77f1f51ed58850e315154249aceb42e7f4c611f0f847ff \ - --hash=sha256:61a6cf00dcb1a7f0c773ed4acc509cb636af2d6337a08f362413c76b2b47a8dd \ - --hash=sha256:6ae6c4cb59f199d8827c5a07546b2ab7e85d262acaccaacd49b62f53f7c456f7 \ - --hash=sha256:7661d401d60d8bf15bb5da39e4dd72f5d764c5aff5a86ef52a042506e3e970ff \ - --hash=sha256:7bd527f36a605c914efca5d3d014170b2cb184723e423d26b1fb2fd9108e264d \ - --hash=sha256:7cb54db3535c8686ea12e9535eb087d32421184eacc6939ef15ef50f83a5e7e2 \ - --hash=sha256:7f3a2d740291f7f2c111d86a1c4851b70fb000a6c8883a59660d95ad57b9df35 \ - --hash=sha256:81304b7d8e9c824d058087dcb89144842c8e0dea6d281c031f59f0acf66963d4 \ - --hash=sha256:933947e8b4fbe617a51528b09851685138b49d511af0b6c0da2539115d6d4514 \ - --hash=sha256:94223d7f060301b3a8c09c9b3bc3294b56b2188e7d8179c762a1cda72c979252 \ - --hash=sha256:ab3ca49afcb47058393b0122428358d2fbe0408cf99f1b58b295cfeb4ed39109 \ - --hash=sha256:bd6292f565ca46dee4e737ebcc20742e3b5be2b01556dafe169f6c65d088875f \ - --hash=sha256:cb924aa3e4a3fb644d0c463cad5bc2572649a6a3f68a7f8e4fbe44aaa6d77e4c \ - --hash=sha256:d0fc7a286feac9077ec52a927fc9fe8fe2fabab95426722be4c953c9a8bede92 \ - --hash=sha256:ddc34786490a6e4ec0a855d401034cbd1242ef186c20d79d2166d6a4bd449577 \ - --hash=sha256:e34b155e36fa9da7e1b7c738ed7767fc9491a62ec6af70fe9da4a057759edc2d \ - --hash=sha256:e5b9e8f6bda48460b7b143c3821b21b452cb3a835e6bbd5dd33aa0c8d3f5137d \ - --hash=sha256:e81ebf6c5ee9684be8f2c87563880f93eedd56dd2b6146d8a725b50b7e5adb0f \ - --hash=sha256:eb91be369f945f10d3a49f5f9be8b3d0b93a4c2be8f8a5b83b0571b8123e0a7a \ - --hash=sha256:f460d1ceb0e4a5dcb2a652db0904224f367c9b3c1470d5a7683c0480e582468b \ +lazy-object-proxy==1.4.1 \ + --hash=sha256:159a745e61422217881c4de71f9eafd9d703b93af95618635849fe469a283661 \ + --hash=sha256:23f63c0821cc96a23332e45dfaa83266feff8adc72b9bcaef86c202af765244f \ + --hash=sha256:3b11be575475db2e8a6e11215f5aa95b9ec14de658628776e10d96fa0b4dac13 \ + --hash=sha256:3f447aff8bc61ca8b42b73304f6a44fa0d915487de144652816f950a3f1ab821 \ + --hash=sha256:4ba73f6089cd9b9478bc0a4fa807b47dbdb8fad1d8f31a0f0a5dbf26a4527a71 \ + --hash=sha256:4f53eadd9932055eac465bd3ca1bd610e4d7141e1278012bd1f28646aebc1d0e \ + --hash=sha256:64483bd7154580158ea90de5b8e5e6fc29a16a9b4db24f10193f0c1ae3f9d1ea \ + --hash=sha256:6f72d42b0d04bfee2397aa1862262654b56922c20a9bb66bb76b6f0e5e4f9229 \ + --hash=sha256:7c7f1ec07b227bdc561299fa2328e85000f90179a2f44ea30579d38e037cb3d4 \ + --hash=sha256:7c8b1ba1e15c10b13cad4171cfa77f5bb5ec2580abc5a353907780805ebe158e \ + --hash=sha256:8559b94b823f85342e10d3d9ca4ba5478168e1ac5658a8a2f18c991ba9c52c20 \ + --hash=sha256:a262c7dfb046f00e12a2bdd1bafaed2408114a89ac414b0af8755c696eb3fc16 \ + --hash=sha256:acce4e3267610c4fdb6632b3886fe3f2f7dd641158a843cf6b6a68e4ce81477b \ + --hash=sha256:be089bb6b83fac7f29d357b2dc4cf2b8eb8d98fe9d9ff89f9ea6012970a853c7 \ + --hash=sha256:bfab710d859c779f273cc48fb86af38d6e9210f38287df0069a63e40b45a2f5c \ + --hash=sha256:c10d29019927301d524a22ced72706380de7cfc50f767217485a912b4c8bd82a \ + --hash=sha256:dd6e2b598849b3d7aee2295ac765a578879830fb8966f70be8cd472e6069932e \ + --hash=sha256:e408f1eacc0a68fed0c08da45f31d0ebb38079f043328dce69ff133b95c29dc1 \ # via astroid mccabe==0.6.1 \ 
--hash=sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42 \ @@ -92,57 +81,54 @@ pyflakes==2.1.1 \ --hash=sha256:17dbeb2e3f4d772725c777fabc446d5634d1038f234e77343108ce445ea69ce0 \ --hash=sha256:d976835886f8c5b31d47970ed689944a0262b5f3afa00a5a7b4dc81e5449f8a2 -pygments==2.3.1 \ - --hash=sha256:5ffada19f6203563680669ee7f53b64dabbeb100eb51b61996085e99c03b284a \ - --hash=sha256:e8218dd399a61674745138520d0d4cf2621d7e032439341bc3f647bff125818d +pygments==2.4.2 \ + --hash=sha256:71e430bc85c88a430f000ac1d9b331d2407f681d6f6aec95e8bcfbc3df5b0127 \ + --hash=sha256:881c4c157e45f30af185c1ffe8d549d48ac9127433f2c380c24b84572ad66297 pylint==2.3.1 \ --hash=sha256:5d77031694a5fb97ea95e828c8d10fc770a1df6eb3906067aaed42201a8a6a09 \ --hash=sha256:723e3db49555abaf9bf79dc474c6b9e2935ad82230b10c1138a71ea41ac0fff1 python-levenshtein==0.12.0 \ --hash=sha256:033a11de5e3d19ea25c9302d11224e1a1898fe5abd23c61c7c360c25195e3eb1 -pyyaml==5.1 \ - --hash=sha256:1adecc22f88d38052fb787d959f003811ca858b799590a5eaa70e63dca50308c \ - --hash=sha256:436bc774ecf7c103814098159fbb84c2715d25980175292c648f2da143909f95 \ - --hash=sha256:460a5a4248763f6f37ea225d19d5c205677d8d525f6a83357ca622ed541830c2 \ - --hash=sha256:5a22a9c84653debfbf198d02fe592c176ea548cccce47553f35f466e15cf2fd4 \ - --hash=sha256:7a5d3f26b89d688db27822343dfa25c599627bc92093e788956372285c6298ad \ - --hash=sha256:9372b04a02080752d9e6f990179a4ab840227c6e2ce15b95e1278456664cf2ba \ - --hash=sha256:a5dcbebee834eaddf3fa7366316b880ff4062e4bcc9787b78c7fbb4a26ff2dd1 \ - --hash=sha256:aee5bab92a176e7cd034e57f46e9df9a9862a71f8f37cad167c6fc74c65f5b4e \ - --hash=sha256:c51f642898c0bacd335fc119da60baae0824f2cde95b0330b56c0553439f0673 \ - --hash=sha256:c68ea4d3ba1705da1e0d85da6684ac657912679a649e8868bd850d2c299cce13 \ - --hash=sha256:e23d0cc5299223dcc37885dae624f382297717e459ea24053709675a976a3e19 \ +pyyaml==5.1.2 \ + --hash=sha256:0113bc0ec2ad727182326b61326afa3d1d8280ae1122493553fd6f4397f33df9 \ + --hash=sha256:01adf0b6c6f61bd11af6e10ca52b7d4057dd0be0343eb9283c878cf3af56aee4 \ + --hash=sha256:5124373960b0b3f4aa7df1707e63e9f109b5263eca5976c66e08b1c552d4eaf8 \ + --hash=sha256:5ca4f10adbddae56d824b2c09668e91219bb178a1eee1faa56af6f99f11bf696 \ + --hash=sha256:7907be34ffa3c5a32b60b95f4d95ea25361c951383a894fec31be7252b2b6f34 \ + --hash=sha256:7ec9b2a4ed5cad025c2278a1e6a19c011c80a3caaac804fd2d329e9cc2c287c9 \ + --hash=sha256:87ae4c829bb25b9fe99cf71fbb2140c448f534e24c998cc60f39ae4f94396a73 \ + --hash=sha256:9de9919becc9cc2ff03637872a440195ac4241c80536632fffeb6a1e25a74299 \ + --hash=sha256:a5a85b10e450c66b49f98846937e8cfca1db3127a9d5d1e31ca45c3d0bef4c5b \ + --hash=sha256:b0997827b4f6a7c286c01c5f60384d218dca4ed7d9efa945c3e1aa623d5709ae \ + --hash=sha256:b631ef96d3222e62861443cc89d6563ba3eeb816eeb96b2629345ab795e53681 \ + --hash=sha256:bf47c0607522fdbca6c9e817a6e81b08491de50f3766a7a0e6a5be7905961b41 \ + --hash=sha256:f81025eddd0327c7d4cfe9b62cf33190e1e736cc6e97502b3ec425f574b3e7a8 \ # via vcrpy six==1.12.0 \ --hash=sha256:3350809f0555b11f552448330d0b52d5f24c91a322ea4a15ef22629740f3761c \ --hash=sha256:d16a0141ec1a18405cd4ce8b4613101da75da0e9a7aec5bdd4fa804d0e0eba73 \ # via astroid, vcrpy -typed-ast==1.3.4 ; python_version >= "3.0" and platform_python_implementation != "PyPy" \ - --hash=sha256:04894d268ba6eab7e093d43107869ad49e7b5ef40d1a94243ea49b352061b200 \ - --hash=sha256:16616ece19daddc586e499a3d2f560302c11f122b9c692bc216e821ae32aa0d0 \ - --hash=sha256:252fdae740964b2d3cdfb3f84dcb4d6247a48a6abe2579e8029ab3be3cdc026c \ - 
--hash=sha256:2af80a373af123d0b9f44941a46df67ef0ff7a60f95872412a145f4500a7fc99 \ - --hash=sha256:2c88d0a913229a06282b285f42a31e063c3bf9071ff65c5ea4c12acb6977c6a7 \ - --hash=sha256:2ea99c029ebd4b5a308d915cc7fb95b8e1201d60b065450d5d26deb65d3f2bc1 \ - --hash=sha256:3d2e3ab175fc097d2a51c7a0d3fda442f35ebcc93bb1d7bd9b95ad893e44c04d \ - --hash=sha256:4766dd695548a15ee766927bf883fb90c6ac8321be5a60c141f18628fb7f8da8 \ - --hash=sha256:56b6978798502ef66625a2e0f80cf923da64e328da8bbe16c1ff928c70c873de \ - --hash=sha256:5cddb6f8bce14325b2863f9d5ac5c51e07b71b462361fd815d1d7706d3a9d682 \ - --hash=sha256:644ee788222d81555af543b70a1098f2025db38eaa99226f3a75a6854924d4db \ - --hash=sha256:64cf762049fc4775efe6b27161467e76d0ba145862802a65eefc8879086fc6f8 \ - --hash=sha256:68c362848d9fb71d3c3e5f43c09974a0ae319144634e7a47db62f0f2a54a7fa7 \ - --hash=sha256:6c1f3c6f6635e611d58e467bf4371883568f0de9ccc4606f17048142dec14a1f \ - --hash=sha256:b213d4a02eec4ddf622f4d2fbc539f062af3788d1f332f028a2e19c42da53f15 \ - --hash=sha256:bb27d4e7805a7de0e35bd0cb1411bc85f807968b2b0539597a49a23b00a622ae \ - --hash=sha256:c9d414512eaa417aadae7758bc118868cd2396b0e6138c1dd4fda96679c079d3 \ - --hash=sha256:f0937165d1e25477b01081c4763d2d9cdc3b18af69cb259dd4f640c9b900fe5e \ - --hash=sha256:fb96a6e2c11059ecf84e6741a319f93f683e440e341d4489c9b161eca251cf2a \ - --hash=sha256:fc71d2d6ae56a091a8d94f33ec9d0f2001d1cb1db423d8b4355debfe9ce689b7 +typed-ast==1.4.0 ; python_version >= "3.0" and platform_python_implementation != "PyPy" \ + --hash=sha256:18511a0b3e7922276346bcb47e2ef9f38fb90fd31cb9223eed42c85d1312344e \ + --hash=sha256:262c247a82d005e43b5b7f69aff746370538e176131c32dda9cb0f324d27141e \ + --hash=sha256:2b907eb046d049bcd9892e3076c7a6456c93a25bebfe554e931620c90e6a25b0 \ + --hash=sha256:354c16e5babd09f5cb0ee000d54cfa38401d8b8891eefa878ac772f827181a3c \ + --hash=sha256:4e0b70c6fc4d010f8107726af5fd37921b666f5b31d9331f0bd24ad9a088e631 \ + --hash=sha256:630968c5cdee51a11c05a30453f8cd65e0cc1d2ad0d9192819df9978984529f4 \ + --hash=sha256:66480f95b8167c9c5c5c87f32cf437d585937970f3fc24386f313a4c97b44e34 \ + --hash=sha256:71211d26ffd12d63a83e079ff258ac9d56a1376a25bc80b1cdcdf601b855b90b \ + --hash=sha256:95bd11af7eafc16e829af2d3df510cecfd4387f6453355188342c3e79a2ec87a \ + --hash=sha256:bc6c7d3fa1325a0c6613512a093bc2a2a15aeec350451cbdf9e1d4bffe3e3233 \ + --hash=sha256:cc34a6f5b426748a507dd5d1de4c1978f2eb5626d51326e43280941206c209e1 \ + --hash=sha256:d755f03c1e4a51e9b24d899561fec4ccaf51f210d52abdf8c07ee2849b212a36 \ + --hash=sha256:d7c45933b1bdfaf9f36c579671fec15d25b06c8398f113dab64c18ed1adda01d \ + --hash=sha256:d896919306dd0aa22d0132f62a1b78d11aaf4c9fc5b3410d3c666b818191630a \ + --hash=sha256:ffde2fbfad571af120fcbfbbc61c72469e72f550d676c3342492a9dfdefb8f12 vcrpy==2.0.1 \ --hash=sha256:127e79cf7b569d071d1bd761b83f7b62b2ce2a2eb63ceca7aa67cba8f2602ea3 \ --hash=sha256:57be64aa8e9883a4117d0b15de28af62275c001abcdb00b6dc2d4406073d9a4f -wrapt==1.11.1 \ - --hash=sha256:4aea003270831cceb8a90ff27c4031da6ead7ec1886023b80ce0dfe0adf61533 \ +wrapt==1.11.2 \ + --hash=sha256:565a021fd19419476b9362b05eeaa094178de64f8361e44468f9e9d7843901e1 \ # via astroid, vcrpy yarl==1.3.0 \ --hash=sha256:024ecdc12bc02b321bc66b41327f930d1c2c543fa9a561b39861da9388ba7aa9 \ @@ -157,3 +143,7 @@ --hash=sha256:c9bb7c249c4432cd47e75af3864bc02d26c9594f49c82e2a28624417f0ae63b8 \ --hash=sha256:e060906c0c585565c718d1c3841747b61c5439af2211e185f6739a9412dfbde1 \ # via vcrpy + +# WARNING: The following packages were not pinned, but pip requires them to be +# pinned when the requirements file includes 
hashes. Consider using the --allow-unsafe flag. +# setuptools==41.0.1 # via python-levenshtein diff -r f59f8a5e9096 -r 7013c7ce987f contrib/byteify-strings.py --- a/contrib/byteify-strings.py Mon Aug 12 14:00:19 2019 -0400 +++ b/contrib/byteify-strings.py Wed Aug 14 09:22:54 2019 +0900 @@ -78,23 +78,69 @@ already been done. """ - st = tokens[j] - if st.type == token.STRING and st.string.startswith(("'", '"')): - sysstrtokens.add(st) + k = j + currtoken = tokens[k] + while currtoken.type in (token.STRING, token.NEWLINE, tokenize.NL): + k += 1 + if ( + currtoken.type == token.STRING + and currtoken.string.startswith(("'", '"')) + ): + sysstrtokens.add(currtoken) + try: + currtoken = tokens[k] + except IndexError: + break + + def _isitemaccess(j): + """Assert the next tokens form an item access on `tokens[j]` and that + `tokens[j]` is a name. + """ + try: + return ( + tokens[j].type == token.NAME + and _isop(j + 1, '[') + and tokens[j + 2].type == token.STRING + and _isop(j + 3, ']') + ) + except IndexError: + return False + + def _ismethodcall(j, *methodnames): + """Assert the next tokens form a call to `methodname` with a string + as first argument on `tokens[j]` and that `tokens[j]` is a name. + """ + try: + return ( + tokens[j].type == token.NAME + and _isop(j + 1, '.') + and tokens[j + 2].type == token.NAME + and tokens[j + 2].string in methodnames + and _isop(j + 3, '(') + and tokens[j + 4].type == token.STRING + ) + except IndexError: + return False coldelta = 0 # column increment for new opening parens coloffset = -1 # column offset for the current line (-1: TBD) - parens = [(0, 0, 0)] # stack of (line, end-column, column-offset) + parens = [(0, 0, 0, -1)] # stack of (line, end-column, column-offset, type) + ignorenextline = False # don't transform the next line + insideignoreblock = False # don't transform until turned off for i, t in enumerate(tokens): # Compute the column offset for the current line, such that # the current line will be aligned to the last opening paren # as before. if coloffset < 0: - if t.start[1] == parens[-1][1]: - coloffset = parens[-1][2] - elif t.start[1] + 1 == parens[-1][1]: + lastparen = parens[-1] + if t.start[1] == lastparen[1]: + coloffset = lastparen[2] + elif ( + t.start[1] + 1 == lastparen[1] + and lastparen[3] not in (token.NEWLINE, tokenize.NL) + ): # fix misaligned indent of s/util.Abort/error.Abort/ - coloffset = parens[-1][2] + (parens[-1][1] - t.start[1]) + coloffset = lastparen[2] + (lastparen[1] - t.start[1]) else: coloffset = 0 @@ -103,11 +149,26 @@ yield adjusttokenpos(t, coloffset) coldelta = 0 coloffset = -1 + if not insideignoreblock: + ignorenextline = ( + tokens[i - 1].type == token.COMMENT + and tokens[i - 1].string == "# no-py3-transform" + ) + continue + + if t.type == token.COMMENT: + if t.string == "# py3-transform: off": + insideignoreblock = True + if t.string == "# py3-transform: on": + insideignoreblock = False + + if ignorenextline or insideignoreblock: + yield adjusttokenpos(t, coloffset) continue # Remember the last paren position. if _isop(i, '(', '[', '{'): - parens.append(t.end + (coloffset + coldelta,)) + parens.append(t.end + (coloffset + coldelta, tokens[i + 1].type)) elif _isop(i, ')', ']', '}'): parens.pop() @@ -129,8 +190,10 @@ # components touching docstrings need to handle unicode, # unfortunately. 
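# A hedged illustration, not part of this changeset: how the transform
# controls added to contrib/byteify-strings.py above could be used in a
# source file. Only the control comments and the command-line flags come
# from the patch; the module contents and paths below are invented.

# py3-transform: off
LEGACY = 'string literals in this block are left untouched by the converter'
# py3-transform: on

# no-py3-transform
ALSO_SKIPPED = 'only the single line following that comment is skipped'

CONVERTED = 'outside the markers, string literals are byteified as usual'

# A possible invocation (paths are examples only):
#   python3 contrib/byteify-strings.py -i --allow-attr-methods \
#       --treat-as-kwargs kwargs opts -- mercurial/example.py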
if s[0:3] in ("'''", '"""'): - yield adjusttokenpos(t, coloffset) - continue + # If it's assigned to something, it's not a docstring + if not _isop(i - 1, '='): + yield adjusttokenpos(t, coloffset) + continue # If the first character isn't a quote, it is likely a string # prefixing character (such as 'b', 'u', or 'r'. Ignore. @@ -149,8 +212,10 @@ fn = t.string # *attr() builtins don't accept byte strings to 2nd argument. - if (fn in ('getattr', 'setattr', 'hasattr', 'safehasattr') and - not _isop(i - 1, '.')): + if fn in ( + 'getattr', 'setattr', 'hasattr', 'safehasattr', 'wrapfunction', + 'wrapclass', 'addattr' + ) and (opts['allow-attr-methods'] or not _isop(i - 1, '.')): arg1idx = _findargnofcall(1) if arg1idx is not None: _ensuresysstr(arg1idx) @@ -169,6 +234,12 @@ yield adjusttokenpos(t._replace(string=fn[4:]), coloffset) continue + if t.type == token.NAME and t.string in opts['treat-as-kwargs']: + if _isitemaccess(i): + _ensuresysstr(i + 2) + if _ismethodcall(i, 'get', 'pop', 'setdefault', 'popitem'): + _ensuresysstr(i + 4) + # Looks like "if __name__ == '__main__'". if (t.type == token.NAME and t.string == '__name__' and _isop(i + 1, '==')): @@ -207,14 +278,23 @@ def main(): ap = argparse.ArgumentParser() + ap.add_argument('--version', action='version', + version='Byteify strings 1.0') ap.add_argument('-i', '--inplace', action='store_true', default=False, help='edit files in place') ap.add_argument('--dictiter', action='store_true', default=False, help='rewrite iteritems() and itervalues()'), + ap.add_argument('--allow-attr-methods', action='store_true', + default=False, + help='also handle attr*() when they are methods'), + ap.add_argument('--treat-as-kwargs', nargs="+", default=[], + help="ignore kwargs-like objects"), ap.add_argument('files', metavar='FILE', nargs='+', help='source file') args = ap.parse_args() opts = { 'dictiter': args.dictiter, + 'treat-as-kwargs': set(args.treat_as_kwargs), + 'allow-attr-methods': args.allow_attr_methods, } for fname in args.files: if args.inplace: diff -r f59f8a5e9096 -r 7013c7ce987f contrib/import-checker.py --- a/contrib/import-checker.py Mon Aug 12 14:00:19 2019 -0400 +++ b/contrib/import-checker.py Wed Aug 14 09:22:54 2019 +0900 @@ -31,6 +31,7 @@ 'mercurial.node', # for revlog to re-export constant to extensions 'mercurial.revlogutils.constants', + 'mercurial.revlogutils.flagutil', # for cffi modules to re-export pure functions 'mercurial.pure.base85', 'mercurial.pure.bdiff', diff -r f59f8a5e9096 -r 7013c7ce987f contrib/python3-whitelist --- a/contrib/python3-whitelist Mon Aug 12 14:00:19 2019 -0400 +++ b/contrib/python3-whitelist Wed Aug 14 09:22:54 2019 +0900 @@ -124,6 +124,7 @@ test-convert-hg-sink.t test-convert-hg-source.t test-convert-hg-startrev.t +test-convert-identity.t test-convert-mtn.t test-convert-splicemap.t test-convert-svn-sink.t diff -r f59f8a5e9096 -r 7013c7ce987f hgext/fix.py --- a/hgext/fix.py Mon Aug 12 14:00:19 2019 -0400 +++ b/hgext/fix.py Wed Aug 14 09:22:54 2019 +0900 @@ -102,6 +102,13 @@ mapping fixer tool names to lists of metadata values returned from executions that modified a file. This aggregates the same metadata previously passed to the "postfixfile" hook. + +Fixer tools are run the in repository's root directory. This allows them to read +configuration files from the working copy, or even write to the working copy. +The working copy is not updated to match the revision being fixed. In fact, +several revisions may be fixed in parallel. 
Writes to the working copy are not +amended into the revision being fixed; fixer tools should always write fixed +file content back to stdout as documented above. """ from __future__ import absolute_import @@ -152,7 +159,6 @@ FIXER_ATTRS = { 'command': None, 'linerange': None, - 'fileset': None, 'pattern': None, 'priority': 0, 'metadata': False, @@ -233,7 +239,7 @@ for rev, path in items: ctx = repo[rev] olddata = ctx[path].data() - metadata, newdata = fixfile(ui, opts, fixers, ctx, path, + metadata, newdata = fixfile(ui, repo, opts, fixers, ctx, path, basectxs[rev]) # Don't waste memory/time passing unchanged content back, but # produce one result per item either way. @@ -530,7 +536,7 @@ basectxs[rev].add(pctx) return basectxs -def fixfile(ui, opts, fixers, fixctx, path, basectxs): +def fixfile(ui, repo, opts, fixers, fixctx, path, basectxs): """Run any configured fixers that should affect the file in this context Returns the file content that results from applying the fixers in some order @@ -539,7 +545,8 @@ (i.e. they will only avoid lines that are common to all basectxs). A fixer tool's stdout will become the file's new content if and only if it - exits with code zero. + exits with code zero. The fixer tool's working directory is the repository's + root. """ metadata = {} newdata = fixctx[path].data() @@ -553,7 +560,7 @@ proc = subprocess.Popen( procutil.tonativestr(command), shell=True, - cwd=procutil.tonativestr(b'/'), + cwd=repo.root, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE) @@ -702,14 +709,18 @@ for name in fixernames(ui): fixers[name] = Fixer() attrs = ui.configsuboptions('fix', name)[1] - if 'fileset' in attrs and 'pattern' not in attrs: - ui.warn(_('the fix.tool:fileset config name is deprecated; ' - 'please rename it to fix.tool:pattern\n')) - attrs['pattern'] = attrs['fileset'] for key, default in FIXER_ATTRS.items(): setattr(fixers[name], pycompat.sysstr('_' + key), attrs.get(key, default)) fixers[name]._priority = int(fixers[name]._priority) + # Don't use a fixer if it has no pattern configured. It would be + # dangerous to let it affect all files. It would be pointless to let it + # affect no files. There is no reasonable subset of files to use as the + # default. + if fixers[name]._pattern is None: + ui.warn( + _('fixer tool has no pattern configuration: %s\n') % (name,)) + del fixers[name] return collections.OrderedDict( sorted(fixers.items(), key=lambda item: item[1]._priority, reverse=True)) @@ -727,7 +738,8 @@ def affects(self, opts, fixctx, path): """Should this fixer run on the file at the given path and context?""" - return scmutil.match(fixctx, [self._pattern], opts)(path) + return (self._pattern is not None and + scmutil.match(fixctx, [self._pattern], opts)(path)) def shouldoutputmetadata(self): """Should the stdout of this fixer start with JSON and a null byte?""" diff -r f59f8a5e9096 -r 7013c7ce987f hgext/fsmonitor/__init__.py --- a/hgext/fsmonitor/__init__.py Mon Aug 12 14:00:19 2019 -0400 +++ b/hgext/fsmonitor/__init__.py Wed Aug 14 09:22:54 2019 +0900 @@ -112,6 +112,7 @@ import os import stat import sys +import tempfile import weakref from mercurial.i18n import _ @@ -175,6 +176,23 @@ # and will disable itself when encountering one of these: _blacklist = ['largefiles', 'eol'] +def debuginstall(ui, fm): + fm.write("fsmonitor-watchman", + _("fsmonitor checking for watchman binary... 
(%s)\n"), + ui.configpath("fsmonitor", "watchman_exe")) + root = tempfile.mkdtemp() + c = watchmanclient.client(ui, root) + err = None + try: + v = c.command("version") + fm.write("fsmonitor-watchman-version", + _(" watchman binary version %s\n"), v["version"]) + except watchmanclient.Unavailable as e: + err = str(e) + fm.condwrite(err, "fsmonitor-watchman-error", + _(" watchman binary missing or broken: %s\n"), err) + return 1 if err else 0 + def _handleunavailable(ui, state, ex): """Exception handler for Watchman interaction exceptions""" if isinstance(ex, watchmanclient.Unavailable): @@ -780,7 +798,7 @@ return try: - client = watchmanclient.client(repo) + client = watchmanclient.client(repo.ui, repo._root) except Exception as ex: _handleunavailable(ui, fsmonitorstate, ex) return diff -r f59f8a5e9096 -r 7013c7ce987f hgext/fsmonitor/watchmanclient.py --- a/hgext/fsmonitor/watchmanclient.py Mon Aug 12 14:00:19 2019 -0400 +++ b/hgext/fsmonitor/watchmanclient.py Wed Aug 14 09:22:54 2019 +0900 @@ -33,12 +33,12 @@ super(WatchmanNoRoot, self).__init__(msg) class client(object): - def __init__(self, repo, timeout=1.0): + def __init__(self, ui, root, timeout=1.0): err = None if not self._user: err = "couldn't get user" warn = True - if self._user in repo.ui.configlist('fsmonitor', 'blacklistusers'): + if self._user in ui.configlist('fsmonitor', 'blacklistusers'): err = 'user %s in blacklist' % self._user warn = False @@ -47,8 +47,8 @@ self._timeout = timeout self._watchmanclient = None - self._root = repo.root - self._ui = repo.ui + self._root = root + self._ui = ui self._firsttime = True def settimeout(self, timeout): diff -r f59f8a5e9096 -r 7013c7ce987f hgext/remotefilelog/remotefilelog.py --- a/hgext/remotefilelog/remotefilelog.py Mon Aug 12 14:00:19 2019 -0400 +++ b/hgext/remotefilelog/remotefilelog.py Wed Aug 14 09:22:54 2019 +0900 @@ -324,6 +324,9 @@ text, verifyhash = self._processflags(rawtext, flags, 'read') return text + def rawdata(self, node): + return self.revision(node, raw=False) + def _processflags(self, text, flags, operation, raw=False): # mostly copied from hg/mercurial/revlog.py validatehash = True diff -r f59f8a5e9096 -r 7013c7ce987f hgext/sqlitestore.py --- a/hgext/sqlitestore.py Mon Aug 12 14:00:19 2019 -0400 +++ b/hgext/sqlitestore.py Wed Aug 14 09:22:54 2019 +0900 @@ -549,6 +549,9 @@ return fulltext + def rawdata(self, *args, **kwargs): + return self.revision(*args, **kwargs) + def read(self, node): return storageutil.filtermetadata(self.revision(node)) diff -r f59f8a5e9096 -r 7013c7ce987f hgext/transplant.py --- a/hgext/transplant.py Mon Aug 12 14:00:19 2019 -0400 +++ b/hgext/transplant.py Wed Aug 14 09:22:54 2019 +0900 @@ -734,6 +734,13 @@ if cleanupfn: cleanupfn() +def continuecmd(ui, repo): + """logic to resume an interrupted transplant using + 'hg continue'""" + with repo.wlock(): + tp = transplanter(ui, repo, {}) + return tp.resume(repo, repo, {}) + revsetpredicate = registrar.revsetpredicate() @revsetpredicate('transplanted([set])') @@ -760,6 +767,7 @@ def extsetup(ui): statemod.addunfinished ( 'transplant', fname='transplant/journal', clearable=True, + continuefunc=continuecmd, statushint=_('To continue: hg transplant --continue\n' 'To abort: hg update'), cmdhint=_("use 'hg transplant --continue' or 'hg update' to abort") diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/bundlerepo.py --- a/mercurial/bundlerepo.py Mon Aug 12 14:00:19 2019 -0400 +++ b/mercurial/bundlerepo.py Wed Aug 14 09:22:54 2019 +0900 @@ -146,6 +146,9 @@ self._revisioncache = (node, rev, rawtext) 
return text + def rawdata(self, nodeorrev, _df=None): + return self.revision(nodeorrev, _df=_df, raw=True) + def baserevision(self, nodeorrev): # Revlog subclasses may override 'revision' method to modify format of # content retrieved from revlog. To use bundlerevlog with such class one diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/commands.py --- a/mercurial/commands.py Mon Aug 12 14:00:19 2019 -0400 +++ b/mercurial/commands.py Wed Aug 14 09:22:54 2019 +0900 @@ -1872,6 +1872,7 @@ for section, name, value in ui.walkconfig(untrusted=untrusted): source = ui.configsource(section, name, untrusted) value = pycompat.bytestr(value) + defaultvalue = ui.configdefault(section, name) if fm.isplain(): source = source or 'none' value = value.replace('\n', '\\n') @@ -1885,6 +1886,7 @@ fm.write('value', '%s\n', value) else: fm.write('name value', '%s=%s\n', entryname, value) + fm.data(defaultvalue=defaultvalue) matched = True fm.end() if matched: diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/context.py --- a/mercurial/context.py Mon Aug 12 14:00:19 2019 -0400 +++ b/mercurial/context.py Wed Aug 14 09:22:54 2019 +0900 @@ -24,6 +24,7 @@ wdirhex, ) from . import ( + copies, dagop, encoding, error, @@ -274,23 +275,7 @@ @propertycache def _copies(self): - p1copies = {} - p2copies = {} - p1 = self.p1() - p2 = self.p2() - narrowmatch = self._repo.narrowmatch() - for dst in self.files(): - if not narrowmatch(dst) or dst not in self: - continue - copied = self[dst].renamed() - if not copied: - continue - src, srcnode = copied - if src in p1 and p1[src].filenode() == srcnode: - p1copies[dst] = src - elif src in p2 and p2[src].filenode() == srcnode: - p2copies[dst] = src - return p1copies, p2copies + return copies.computechangesetcopies(self) def p1copies(self): return self._copies[0] def p2copies(self): @@ -474,24 +459,14 @@ (source == 'compatibility' and self._changeset.filesadded is not None)): return self._changeset.filesadded or [] - - added = [] - for f in self.files(): - if not any(f in p for p in self.parents()): - added.append(f) - return added + return scmutil.computechangesetfilesadded(self) def filesremoved(self): source = self._repo.ui.config('experimental', 'copies.read-from') if (source == 'changeset-only' or (source == 'compatibility' and self._changeset.filesremoved is not None)): return self._changeset.filesremoved or [] - - removed = [] - for f in self.files(): - if f not in self: - removed.append(f) - return removed + return scmutil.computechangesetfilesremoved(self) @propertycache def _copies(self): diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/copies.py --- a/mercurial/copies.py Mon Aug 12 14:00:19 2019 -0400 +++ b/mercurial/copies.py Wed Aug 14 09:22:54 2019 +0900 @@ -809,3 +809,28 @@ continue if dst in wctx: wctx[dst].markcopied(src) + +def computechangesetcopies(ctx): + """return the copies data for a changeset + + The copies data are returned as a pair of dictionnary (p1copies, p2copies). 
+ + Each dictionnary are in the form: `{newname: oldname}` + """ + p1copies = {} + p2copies = {} + p1 = ctx.p1() + p2 = ctx.p2() + narrowmatch = ctx._repo.narrowmatch() + for dst in ctx.files(): + if not narrowmatch(dst) or dst not in ctx: + continue + copied = ctx[dst].renamed() + if not copied: + continue + src, srcnode = copied + if src in p1 and p1[src].filenode() == srcnode: + p1copies[dst] = src + elif src in p2 and p2[src].filenode() == srcnode: + p2copies[dst] = src + return p1copies, p2copies diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/debugcommands.py --- a/mercurial/debugcommands.py Mon Aug 12 14:00:19 2019 -0400 +++ b/mercurial/debugcommands.py Wed Aug 14 09:22:54 2019 +0900 @@ -1383,6 +1383,11 @@ fm.condwrite(err, 'usernameerror', _("checking username...\n %s\n" " (specify a username in your configuration file)\n"), err) + for name, mod in extensions.extensions(): + handler = getattr(mod, 'debuginstall', None) + if handler is not None: + problems += handler(ui, fm) + fm.condwrite(not problems, '', _("no problems detected\n")) if not problems: diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/filelog.py --- a/mercurial/filelog.py Mon Aug 12 14:00:19 2019 -0400 +++ b/mercurial/filelog.py Wed Aug 14 09:22:54 2019 +0900 @@ -90,6 +90,9 @@ def revision(self, node, _df=None, raw=False): return self._revlog.revision(node, _df=_df, raw=raw) + def rawdata(self, node, _df=None): + return self._revlog.rawdata(node, _df=_df) + def emitrevisions(self, nodes, nodesorder=None, revisiondata=False, assumehaveparentrevisions=False, deltamode=repository.CG_DELTAMODE_STD): diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/localrepo.py --- a/mercurial/localrepo.py Mon Aug 12 14:00:19 2019 -0400 +++ b/mercurial/localrepo.py Wed Aug 14 09:22:54 2019 +0900 @@ -1942,6 +1942,12 @@ **pycompat.strkwargs(tr.hookargs)) def releasefn(tr, success): repo = reporef() + if repo is None: + # If the repo has been GC'd (and this release function is being + # called from transaction.__del__), there's not much we can do, + # so just leave the unfinished transaction there and let the + # user run `hg recover`. + return if success: # this should be explicitly invoked here, because # in-memory changes aren't written out at closing @@ -2214,6 +2220,16 @@ self.tags() self.filtered('served').tags() + # The `full` arg is documented as updating even the lazily-loaded + # caches immediately, so we're forcing a write to cause these caches + # to be warmed up even if they haven't explicitly been requested + # yet (if they've never been used by hg, they won't ever have been + # written, even if they're a subset of another kind of cache that + # *has* been used). 
+ for filt in repoview.filtertable.keys(): + filtered = self.filtered(filt) + filtered.branchmap().write(filtered) + def invalidatecaches(self): if r'_tagscache' in vars(self): diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/manifest.py --- a/mercurial/manifest.py Mon Aug 12 14:00:19 2019 -0400 +++ b/mercurial/manifest.py Wed Aug 14 09:22:54 2019 +0900 @@ -1620,6 +1620,9 @@ def revision(self, node, _df=None, raw=False): return self._revlog.revision(node, _df=_df, raw=raw) + def rawdata(self, node, _df=None): + return self._revlog.rawdata(node, _df=_df) + def revdiff(self, rev1, rev2): return self._revlog.revdiff(rev1, rev2) diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/repository.py --- a/mercurial/repository.py Mon Aug 12 14:00:19 2019 -0400 +++ b/mercurial/repository.py Wed Aug 14 09:22:54 2019 +0900 @@ -597,6 +597,10 @@ consumers should use ``read()`` to obtain the actual file data. """ + def rawdata(node): + """Obtain raw data for a node. + """ + def read(node): """Resolve file fulltext data. @@ -1164,6 +1168,9 @@ def revision(node, _df=None, raw=False): """Obtain fulltext data for a node.""" + def rawdata(node, _df=None): + """Obtain raw data for a node.""" + def revdiff(rev1, rev2): """Obtain a delta between two revision numbers. diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/revlog.py --- a/mercurial/revlog.py Mon Aug 12 14:00:19 2019 -0400 +++ b/mercurial/revlog.py Wed Aug 14 09:22:54 2019 +0900 @@ -38,13 +38,6 @@ from .revlogutils.constants import ( FLAG_GENERALDELTA, FLAG_INLINE_DATA, - REVIDX_DEFAULT_FLAGS, - REVIDX_ELLIPSIS, - REVIDX_EXTSTORED, - REVIDX_FLAGS_ORDER, - REVIDX_ISCENSORED, - REVIDX_KNOWN_FLAGS, - REVIDX_RAWTEXT_CHANGING_FLAGS, REVLOGV0, REVLOGV1, REVLOGV1_FLAGS, @@ -54,6 +47,14 @@ REVLOG_DEFAULT_FORMAT, REVLOG_DEFAULT_VERSION, ) +from .revlogutils.flagutil import ( + REVIDX_DEFAULT_FLAGS, + REVIDX_ELLIPSIS, + REVIDX_EXTSTORED, + REVIDX_FLAGS_ORDER, + REVIDX_ISCENSORED, + REVIDX_RAWTEXT_CHANGING_FLAGS, +) from .thirdparty import ( attr, ) @@ -70,6 +71,7 @@ ) from .revlogutils import ( deltas as deltautil, + flagutil, ) from .utils import ( interfaceutil, @@ -94,7 +96,6 @@ REVIDX_EXTSTORED REVIDX_DEFAULT_FLAGS REVIDX_FLAGS_ORDER -REVIDX_KNOWN_FLAGS REVIDX_RAWTEXT_CHANGING_FLAGS parsers = policy.importmod(r'parsers') @@ -108,11 +109,6 @@ _maxinline = 131072 _chunksize = 1048576 -# Store flag processors (cf. 'addflagprocessor()' to register) -_flagprocessors = { - REVIDX_ISCENSORED: None, -} - # Flag processors for REVIDX_ELLIPSIS. def ellipsisreadprocessor(rl, text): return text, False @@ -129,45 +125,6 @@ ellipsisrawprocessor, ) -def addflagprocessor(flag, processor): - """Register a flag processor on a revision data flag. - - Invariant: - - Flags need to be defined in REVIDX_KNOWN_FLAGS and REVIDX_FLAGS_ORDER, - and REVIDX_RAWTEXT_CHANGING_FLAGS if they can alter rawtext. - - Only one flag processor can be registered on a specific flag. - - flagprocessors must be 3-tuples of functions (read, write, raw) with the - following signatures: - - (read) f(self, rawtext) -> text, bool - - (write) f(self, text) -> rawtext, bool - - (raw) f(self, rawtext) -> bool - "text" is presented to the user. "rawtext" is stored in revlog data, not - directly visible to the user. - The boolean returned by these transforms is used to determine whether - the returned text can be used for hash integrity checking. For example, - if "write" returns False, then "text" is used to generate hash. 
If - "write" returns True, that basically means "rawtext" returned by "write" - should be used to generate hash. Usually, "write" and "read" return - different booleans. And "raw" returns a same boolean as "write". - - Note: The 'raw' transform is used for changegroup generation and in some - debug commands. In this case the transform only indicates whether the - contents can be used for hash integrity checks. - """ - _insertflagprocessor(flag, processor, _flagprocessors) - -def _insertflagprocessor(flag, processor, flagprocessors): - if not flag & REVIDX_KNOWN_FLAGS: - msg = _("cannot register processor on unknown flag '%#x'.") % (flag) - raise error.ProgrammingError(msg) - if flag not in REVIDX_FLAGS_ORDER: - msg = _("flag '%#x' undefined in REVIDX_FLAGS_ORDER.") % (flag) - raise error.ProgrammingError(msg) - if flag in flagprocessors: - msg = _("cannot register multiple processors on flag '%#x'.") % (flag) - raise error.Abort(msg) - flagprocessors[flag] = processor - def getoffset(q): return int(q >> 16) @@ -175,7 +132,7 @@ return int(q & 0xFFFF) def offset_type(offset, type): - if (type & ~REVIDX_KNOWN_FLAGS) != 0: + if (type & ~flagutil.REVIDX_KNOWN_FLAGS) != 0: raise ValueError('unknown revlog index flags') return int(int(offset) << 16 | type) @@ -384,7 +341,7 @@ # Make copy of flag processors so each revlog instance can support # custom flags. - self._flagprocessors = dict(_flagprocessors) + self._flagprocessors = dict(flagutil.flagprocessors) # 2-tuple of file handles being used for active writing. self._writinghandles = None @@ -442,7 +399,7 @@ # revlog v0 doesn't have flag processors for flag, processor in opts.get(b'flagprocessors', {}).iteritems(): - _insertflagprocessor(flag, processor, self._flagprocessors) + flagutil.insertflagprocessor(flag, processor, self._flagprocessors) if self._chunkcachesize <= 0: raise error.RevlogError(_('revlog chunk cache size %r is not ' @@ -687,7 +644,7 @@ # fast path: if no "read" flag processor could change the content, # size is rawsize. note: ELLIPSIS is known to not change the content. flags = self.flags(rev) - if flags & (REVIDX_KNOWN_FLAGS ^ REVIDX_ELLIPSIS) == 0: + if flags & (flagutil.REVIDX_KNOWN_FLAGS ^ REVIDX_ELLIPSIS) == 0: return self.rawsize(rev) return len(self.revision(rev, raw=False)) @@ -1651,6 +1608,9 @@ treated as raw data when applying flag transforms. 'raw' should be set to True when generating changegroups or in debug commands. """ + return self._revisiondata(nodeorrev, _df, raw=raw) + + def _revisiondata(self, nodeorrev, _df=None, raw=False): if isinstance(nodeorrev, int): rev = nodeorrev node = self.node(rev) @@ -1717,6 +1677,13 @@ return text + def rawdata(self, nodeorrev, _df=None, raw=False): + """return an uncompressed raw data of a given node or revision number. + + _df - an existing file handle to read from. (internal-only) + """ + return self._revisiondata(nodeorrev, _df, raw=True) + def hash(self, text, p1, p2): """Compute a node hash. @@ -1754,9 +1721,9 @@ raise error.ProgrammingError(_("invalid '%s' operation") % operation) # Check all flags are known. - if flags & ~REVIDX_KNOWN_FLAGS: + if flags & ~flagutil.REVIDX_KNOWN_FLAGS: raise error.RevlogError(_("incompatible revision flag '%#x'") % - (flags & ~REVIDX_KNOWN_FLAGS)) + (flags & ~flagutil.REVIDX_KNOWN_FLAGS)) validatehash = True # Depending on the operation (read or write), the order might be # reversed due to non-commutative transforms. @@ -2461,7 +2428,8 @@ # the revlog chunk is a delta. 
cachedelta = None rawtext = None - if destrevlog._lazydelta: + if (deltareuse != self.DELTAREUSEFULLADD + and destrevlog._lazydelta): dp = self.deltaparent(rev) if dp != nullrev: cachedelta = (dp, bytes(self._chunk(rev))) @@ -2614,7 +2582,7 @@ # # L1 should be equal to L2. L3 could be different from them. # "text" may or may not affect commit hash depending on flag - # processors (see revlog.addflagprocessor). + # processors (see flagutil.addflagprocessor). # # | common | rename | meta | ext # ------------------------------------------------- diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/revlogutils/constants.py --- a/mercurial/revlogutils/constants.py Mon Aug 12 14:00:19 2019 -0400 +++ b/mercurial/revlogutils/constants.py Wed Aug 14 09:22:54 2019 +0900 @@ -11,7 +11,6 @@ from .. import ( repository, - util, ) # revlog header flags @@ -48,7 +47,7 @@ REVIDX_ELLIPSIS, REVIDX_EXTSTORED, ] -REVIDX_KNOWN_FLAGS = util.bitsfrom(REVIDX_FLAGS_ORDER) + # bitmark for flags that could cause rawdata content change REVIDX_RAWTEXT_CHANGING_FLAGS = REVIDX_ISCENSORED | REVIDX_EXTSTORED diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/revlogutils/flagutil.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/mercurial/revlogutils/flagutil.py Wed Aug 14 09:22:54 2019 +0900 @@ -0,0 +1,80 @@ +# flagutils.py - code to deal with revlog flags and their processors +# +# Copyright 2016 Remi Chaintron +# Copyright 2016-2019 Pierre-Yves David +# +# This software may be used and distributed according to the terms of the +# GNU General Public License version 2 or any later version. + +from __future__ import absolute_import + +from ..i18n import _ + +from .constants import ( + REVIDX_DEFAULT_FLAGS, + REVIDX_ELLIPSIS, + REVIDX_EXTSTORED, + REVIDX_FLAGS_ORDER, + REVIDX_ISCENSORED, + REVIDX_RAWTEXT_CHANGING_FLAGS, +) + +from .. import ( + error, + util +) + +# blanked usage of all the name to prevent pyflakes constraints +# We need these name available in the module for extensions. +REVIDX_ISCENSORED +REVIDX_ELLIPSIS +REVIDX_EXTSTORED +REVIDX_DEFAULT_FLAGS +REVIDX_FLAGS_ORDER +REVIDX_RAWTEXT_CHANGING_FLAGS + +REVIDX_KNOWN_FLAGS = util.bitsfrom(REVIDX_FLAGS_ORDER) + +# Store flag processors (cf. 'addflagprocessor()' to register) +flagprocessors = { + REVIDX_ISCENSORED: None, +} + +def addflagprocessor(flag, processor): + """Register a flag processor on a revision data flag. + + Invariant: + - Flags need to be defined in REVIDX_KNOWN_FLAGS and REVIDX_FLAGS_ORDER, + and REVIDX_RAWTEXT_CHANGING_FLAGS if they can alter rawtext. + - Only one flag processor can be registered on a specific flag. + - flagprocessors must be 3-tuples of functions (read, write, raw) with the + following signatures: + - (read) f(self, rawtext) -> text, bool + - (write) f(self, text) -> rawtext, bool + - (raw) f(self, rawtext) -> bool + "text" is presented to the user. "rawtext" is stored in revlog data, not + directly visible to the user. + The boolean returned by these transforms is used to determine whether + the returned text can be used for hash integrity checking. For example, + if "write" returns False, then "text" is used to generate hash. If + "write" returns True, that basically means "rawtext" returned by "write" + should be used to generate hash. Usually, "write" and "read" return + different booleans. And "raw" returns a same boolean as "write". + + Note: The 'raw' transform is used for changegroup generation and in some + debug commands. In this case the transform only indicates whether the + contents can be used for hash integrity checks. 
+ """ + insertflagprocessor(flag, processor, flagprocessors) + +def insertflagprocessor(flag, processor, flagprocessors): + if not flag & REVIDX_KNOWN_FLAGS: + msg = _("cannot register processor on unknown flag '%#x'.") % (flag) + raise error.ProgrammingError(msg) + if flag not in REVIDX_FLAGS_ORDER: + msg = _("flag '%#x' undefined in REVIDX_FLAGS_ORDER.") % (flag) + raise error.ProgrammingError(msg) + if flag in flagprocessors: + msg = _("cannot register multiple processors on flag '%#x'.") % (flag) + raise error.Abort(msg) + flagprocessors[flag] = processor diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/revset.py --- a/mercurial/revset.py Mon Aug 12 14:00:19 2019 -0400 +++ b/mercurial/revset.py Wed Aug 14 09:22:54 2019 +0900 @@ -1695,7 +1695,7 @@ parent. (EXPERIMENTAL) """ if x is None: - stacks = stackmod.getstack(repo, x) + stacks = stackmod.getstack(repo) else: stacks = smartset.baseset([]) for revision in getset(repo, fullreposet(repo), x): diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/scmutil.py --- a/mercurial/scmutil.py Mon Aug 12 14:00:19 2019 -0400 +++ b/mercurial/scmutil.py Wed Aug 14 09:22:54 2019 +0900 @@ -1984,3 +1984,21 @@ "ancestors(head() and not bookmark(%s)) - " "ancestors(bookmark() and not bookmark(%s))", mark, mark, mark) + +def computechangesetfilesadded(ctx): + """return the list of files added in a changeset + """ + added = [] + for f in ctx.files(): + if not any(f in p for p in ctx.parents()): + added.append(f) + return added + +def computechangesetfilesremoved(ctx): + """return the list of files removed in a changeset + """ + removed = [] + for f in ctx.files(): + if f not in ctx: + removed.append(f) + return removed diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/shelve.py --- a/mercurial/shelve.py Mon Aug 12 14:00:19 2019 -0400 +++ b/mercurial/shelve.py Wed Aug 14 09:22:54 2019 +0900 @@ -177,6 +177,7 @@ _nokeep = 'nokeep' # colon is essential to differentiate from a real bookmark name _noactivebook = ':no-active-bookmark' + _interactive = 'interactive' @classmethod def _verifyandtransform(cls, d): @@ -247,6 +248,7 @@ obj.activebookmark = '' if d.get('activebook', '') != cls._noactivebook: obj.activebookmark = d.get('activebook', '') + obj.interactive = d.get('interactive') == cls._interactive except (error.RepoLookupError, KeyError) as err: raise error.CorruptedState(pycompat.bytestr(err)) @@ -254,7 +256,7 @@ @classmethod def save(cls, repo, name, originalwctx, pendingctx, nodestoremove, - branchtorestore, keep=False, activebook=''): + branchtorestore, keep=False, activebook='', interactive=False): info = { "name": name, "originalwctx": nodemod.hex(originalwctx.node()), @@ -267,6 +269,8 @@ "keep": cls._keep if keep else cls._nokeep, "activebook": activebook or cls._noactivebook } + if interactive: + info['interactive'] = cls._interactive scmutil.simplekeyvaluefile( repo.vfs, cls._filename).write(info, firstline=("%d" % cls._version)) @@ -694,11 +698,12 @@ if shfile.exists(): shfile.movetobackup() cleanupoldbackups(repo) -def unshelvecontinue(ui, repo, state, opts, basename=None): +def unshelvecontinue(ui, repo, state, opts): """subcommand to continue an in-progress unshelve""" # We're finishing off a merge. First parent is our original # parent, second is the temporary "fake" commit we're unshelving. 
- interactive = opts.get('interactive') + interactive = state.interactive + basename = state.name with repo.lock(): checkparents(repo, state) ms = merge.mergestate.read(repo) @@ -721,15 +726,8 @@ with repo.ui.configoverride(overrides, 'unshelve'): with repo.dirstate.parentchange(): repo.setparents(state.parents[0], nodemod.nullid) - if not interactive: - ispartialunshelve = False - newnode = repo.commit(text=shelvectx.description(), - extra=shelvectx.extra(), - user=shelvectx.user(), - date=shelvectx.date()) - else: - newnode, ispartialunshelve = _dounshelveinteractive(ui, - repo, shelvectx, basename, opts) + newnode, ispartialunshelve = _createunshelvectx(ui, + repo, shelvectx, basename, interactive, opts) if newnode is None: # If it ended up being a no-op commit, then the normal @@ -749,11 +747,11 @@ mergefiles(ui, repo, state.wctx, shelvectx) restorebranch(ui, repo, state.branchtorestore) + if not phases.supportinternal(repo): + repair.strip(ui, repo, state.nodestoremove, backup=False, + topic='shelve') + shelvedstate.clear(repo) if not ispartialunshelve: - if not phases.supportinternal(repo): - repair.strip(ui, repo, state.nodestoremove, backup=False, - topic='shelve') - shelvedstate.clear(repo) unshelvecleanup(ui, repo, state.name, opts) _restoreactivebookmark(repo, state.activebookmark) ui.status(_("unshelve of '%s' complete\n") % state.name) @@ -804,14 +802,37 @@ return repo, shelvectx -def _dounshelveinteractive(ui, repo, shelvectx, basename, opts): - """The user might want to unshelve certain changes only from the stored - shelve. So, we would create two commits. One with requested changes to - unshelve at that time and the latter is shelved for future. +def _createunshelvectx(ui, repo, shelvectx, basename, interactive, opts): + """Handles the creation of unshelve commit and updates the shelve if it + was partially unshelved. + + If interactive is: + + * False: Commits all the changes in the working directory. + * True: Prompts the user to select changes to unshelve and commit them. + Update the shelve with remaining changes. + + Returns the node of the new commit formed and a bool indicating whether + the shelve was partially unshelved.Creates a commit ctx to unshelve + interactively or non-interactively. + + The user might want to unshelve certain changes only from the stored + shelve in interactive. So, we would create two commits. One with requested + changes to unshelve at that time and the latter is shelved for future. + + Here, we return both the newnode which is created interactively and a + bool to know whether the shelve is partly done or completely done. 
""" opts['message'] = shelvectx.description() opts['interactive-unshelve'] = True pats = [] + if not interactive: + newnode = repo.commit(text=shelvectx.description(), + extra=shelvectx.extra(), + user=shelvectx.user(), + date=shelvectx.date()) + return newnode, False + commitfunc = getcommitfunc(shelvectx.extra(), interactive=True, editor=True) newnode = cmdutil.dorecord(ui, repo, commitfunc, None, False, @@ -819,10 +840,9 @@ **pycompat.strkwargs(opts)) snode = repo.commit(text=shelvectx.description(), extra=shelvectx.extra(), - user=shelvectx.user(), - date=shelvectx.date()) - m = scmutil.matchfiles(repo, repo[snode].files()) + user=shelvectx.user()) if snode: + m = scmutil.matchfiles(repo, repo[snode].files()) _shelvecreatedcommit(repo, snode, basename, m) return newnode, bool(snode) @@ -854,22 +874,16 @@ nodestoremove = [repo.changelog.node(rev) for rev in pycompat.xrange(oldtiprev, len(repo))] shelvedstate.save(repo, basename, pctx, tmpwctx, nodestoremove, - branchtorestore, opts.get('keep'), activebookmark) + branchtorestore, opts.get('keep'), activebookmark, + interactive) raise error.InterventionRequired( _("unresolved conflicts (see 'hg resolve', then " "'hg unshelve --continue')")) with repo.dirstate.parentchange(): repo.setparents(tmpwctx.node(), nodemod.nullid) - if not interactive: - ispartialunshelve = False - newnode = repo.commit(text=shelvectx.description(), - extra=shelvectx.extra(), - user=shelvectx.user(), - date=shelvectx.date()) - else: - newnode, ispartialunshelve = _dounshelveinteractive(ui, repo, - shelvectx, basename, opts) + newnode, ispartialunshelve = _createunshelvectx(ui, repo, + shelvectx, basename, interactive, opts) if newnode is None: # If it ended up being a no-op commit, then the normal @@ -928,7 +942,7 @@ if opts.get("name"): shelved.append(opts["name"]) - if abortf or continuef and not interactive: + if abortf or continuef: if abortf and continuef: raise error.Abort(_('cannot use both abort and continue')) if shelved: @@ -940,6 +954,8 @@ state = _loadshelvedstate(ui, repo, opts) if abortf: return unshelveabort(ui, repo, state) + elif continuef and interactive: + raise error.Abort(_('cannot use both continue and interactive')) elif continuef: return unshelvecontinue(ui, repo, state, opts) elif len(shelved) > 1: @@ -950,11 +966,8 @@ raise error.Abort(_('no shelved changes to apply!')) basename = util.split(shelved[0][1])[1] ui.status(_("unshelving change '%s'\n") % basename) - elif shelved: + else: basename = shelved[0] - if continuef and interactive: - state = _loadshelvedstate(ui, repo, opts) - return unshelvecontinue(ui, repo, state, opts, basename) if not shelvedfile(repo, basename, patchextension).exists(): raise error.Abort(_("shelved change '%s' not found") % basename) @@ -990,11 +1003,10 @@ with ui.configoverride(overrides, 'unshelve'): mergefiles(ui, repo, pctx, shelvectx) restorebranch(ui, repo, branchtorestore) + shelvedstate.clear(repo) + _finishunshelve(repo, oldtiprev, tr, activebookmark) + _forgetunknownfiles(repo, shelvectx, addedbefore) if not ispartialunshelve: - _forgetunknownfiles(repo, shelvectx, addedbefore) - - shelvedstate.clear(repo) - _finishunshelve(repo, oldtiprev, tr, activebookmark) unshelvecleanup(ui, repo, basename, opts) finally: if tr: diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/stack.py --- a/mercurial/stack.py Mon Aug 12 14:00:19 2019 -0400 +++ b/mercurial/stack.py Wed Aug 14 09:22:54 2019 +0900 @@ -22,7 +22,7 @@ if rev is None: rev = '.' 
- revspec = 'reverse(only(%s) and not public() and not ::merge())' + revspec = 'only(%s) and not public() and not ::merge()' revset = revsetlang.formatspec(revspec, rev) revisions = scmutil.revrange(repo, [revset]) revisions.sort() diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/ui.py --- a/mercurial/ui.py Mon Aug 12 14:00:19 2019 -0400 +++ b/mercurial/ui.py Wed Aug 14 09:22:54 2019 +0900 @@ -783,6 +783,17 @@ return None return default + def configdefault(self, section, name): + """returns the default value of the config item""" + item = self._knownconfig.get(section, {}).get(name) + itemdefault = None + if item is not None: + if callable(item.default): + itemdefault = item.default() + else: + itemdefault = item.default + return itemdefault + def hasconfig(self, section, name, untrusted=False): return self._data(untrusted).hasitem(section, name) diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/unionrepo.py --- a/mercurial/unionrepo.py Mon Aug 12 14:00:19 2019 -0400 +++ b/mercurial/unionrepo.py Wed Aug 14 09:22:54 2019 +0900 @@ -116,6 +116,9 @@ # already cached return text + def rawdata(self, nodeorrev, _df=None): + return self.revision(nodeorrev, _df=_df, raw=True) + def baserevision(self, nodeorrev): # Revlog subclasses may override 'revision' method to modify format of # content retrieved from revlog. To use unionrevlog with such class one diff -r f59f8a5e9096 -r 7013c7ce987f mercurial/upgrade.py --- a/mercurial/upgrade.py Mon Aug 12 14:00:19 2019 -0400 +++ b/mercurial/upgrade.py Wed Aug 14 09:22:54 2019 +0900 @@ -533,7 +533,55 @@ #reverse of "/".join(("data", path + ".i")) return filelog.filelog(repo.svfs, path[5:-2]) -def _copyrevlogs(ui, srcrepo, dstrepo, tr, deltareuse, forcedeltabothparents): +def _copyrevlog(tr, destrepo, oldrl, unencodedname): + """copy all relevant files for `oldrl` into `destrepo` store + + Files are copied "as is" without any transformation. The copy is performed + without extra checks. Callers are responsible for making sure the copied + content is compatible with format of the destination repository. 
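Since the new ui.configdefault() helper above is easiest to understand by example, here is a small sketch of the intended behaviour; the item names are assumptions and any registered config item would do.

    from mercurial import ui as uimod

    u = uimod.ui.load()
    # A registered item hands back whatever default was declared for it; when
    # the default was declared as a callable (the usual idiom for mutable
    # defaults such as lists), configdefault() invokes it and returns the result.
    print(u.configdefault(b'ui', b'timeout'))
    # Items that were never registered simply yield None instead of raising.
    print(u.configdefault(b'made-up-section', b'made-up-item'))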
+ """ + oldrl = getattr(oldrl, '_revlog', oldrl) + newrl = _revlogfrompath(destrepo, unencodedname) + newrl = getattr(newrl, '_revlog', newrl) + + oldvfs = oldrl.opener + newvfs = newrl.opener + oldindex = oldvfs.join(oldrl.indexfile) + newindex = newvfs.join(newrl.indexfile) + olddata = oldvfs.join(oldrl.datafile) + newdata = newvfs.join(newrl.datafile) + + newdir = newvfs.dirname(newrl.indexfile) + newvfs.makedirs(newdir) + + util.copyfile(oldindex, newindex) + if oldrl.opener.exists(olddata): + util.copyfile(olddata, newdata) + + if not (unencodedname.endswith('00changelog.i') + or unencodedname.endswith('00manifest.i')): + destrepo.svfs.fncache.add(unencodedname) + +UPGRADE_CHANGELOG = object() +UPGRADE_MANIFEST = object() +UPGRADE_FILELOG = object() + +UPGRADE_ALL_REVLOGS = frozenset([UPGRADE_CHANGELOG, + UPGRADE_MANIFEST, + UPGRADE_FILELOG]) + +def matchrevlog(revlogfilter, entry): + """check is a revlog is selected for cloning + + The store entry is checked against the passed filter""" + if entry.endswith('00changelog.i'): + return UPGRADE_CHANGELOG in revlogfilter + elif entry.endswith('00manifest.i'): + return UPGRADE_MANIFEST in revlogfilter + return UPGRADE_FILELOG in revlogfilter + +def _clonerevlogs(ui, srcrepo, dstrepo, tr, deltareuse, forcedeltabothparents, + revlogs=UPGRADE_ALL_REVLOGS): """Copy revlogs between 2 repos.""" revcount = 0 srcsize = 0 @@ -554,9 +602,11 @@ crawsize = 0 cdstsize = 0 + alldatafiles = list(srcrepo.store.walk()) + # Perform a pass to collect metadata. This validates we can open all # source files and allows a unified progress bar to be displayed. - for unencoded, encoded, size in srcrepo.store.walk(): + for unencoded, encoded, size in alldatafiles: if unencoded.endswith('.d'): continue @@ -607,12 +657,11 @@ # Do the actual copying. # FUTURE this operation can be farmed off to worker processes. seen = set() - for unencoded, encoded, size in srcrepo.store.walk(): + for unencoded, encoded, size in alldatafiles: if unencoded.endswith('.d'): continue oldrl = _revlogfrompath(srcrepo, unencoded) - newrl = _revlogfrompath(dstrepo, unencoded) if isinstance(oldrl, changelog.changelog) and 'c' not in seen: ui.write(_('finished migrating %d manifest revisions across %d ' @@ -651,11 +700,19 @@ progress = srcrepo.ui.makeprogress(_('file revisions'), total=frevcount) + if matchrevlog(revlogs, unencoded): + ui.note(_('cloning %d revisions from %s\n') + % (len(oldrl), unencoded)) + newrl = _revlogfrompath(dstrepo, unencoded) + oldrl.clone(tr, newrl, addrevisioncb=oncopiedrevision, + deltareuse=deltareuse, + forcedeltabothparents=forcedeltabothparents) + else: + msg = _('blindly copying %s containing %i revisions\n') + ui.note(msg % (unencoded, len(oldrl))) + _copyrevlog(tr, dstrepo, oldrl, unencoded) - ui.note(_('cloning %d revisions from %s\n') % (len(oldrl), unencoded)) - oldrl.clone(tr, newrl, addrevisioncb=oncopiedrevision, - deltareuse=deltareuse, - forcedeltabothparents=forcedeltabothparents) + newrl = _revlogfrompath(dstrepo, unencoded) info = newrl.storageinfo(storedsize=True) datasize = info['storedsize'] or 0 @@ -715,7 +772,8 @@ before the new store is swapped into the original location. """ -def _upgraderepo(ui, srcrepo, dstrepo, requirements, actions): +def _upgraderepo(ui, srcrepo, dstrepo, requirements, actions, + revlogs=UPGRADE_ALL_REVLOGS): """Do the low-level work of upgrading a repository. 
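To make the selective-clone plumbing above concrete, a short sketch of how matchrevlog() splits store entries between re-cloning and the blind _copyrevlog() path; the store paths are examples, not real repository contents.

    from mercurial import upgrade

    # Keep the changelog and the filelogs selected; the manifest revlog would
    # then be copied byte-for-byte by _copyrevlog() instead of being re-cloned.
    revlogfilter = {upgrade.UPGRADE_CHANGELOG, upgrade.UPGRADE_FILELOG}

    for entry in (b'00changelog.i', b'00manifest.i', b'data/some-file.txt.i'):
        if upgrade.matchrevlog(revlogfilter, entry):
            print('%s: clone' % entry)
        else:
            print('%s: blind copy' % entry)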
The upgrade is effectively performed as a copy between a source @@ -743,8 +801,8 @@ deltareuse = revlog.revlog.DELTAREUSEALWAYS with dstrepo.transaction('upgrade') as tr: - _copyrevlogs(ui, srcrepo, dstrepo, tr, deltareuse, - 're-delta-multibase' in actions) + _clonerevlogs(ui, srcrepo, dstrepo, tr, deltareuse, + 're-delta-multibase' in actions, revlogs=revlogs) # Now copy other files in the store directory. # The sorted() makes execution deterministic. diff -r f59f8a5e9096 -r 7013c7ce987f tests/flagprocessorext.py --- a/tests/flagprocessorext.py Mon Aug 12 14:00:19 2019 -0400 +++ b/tests/flagprocessorext.py Wed Aug 14 09:22:54 2019 +0900 @@ -12,6 +12,9 @@ revlog, util, ) +from mercurial.revlogutils import ( + flagutil, +) # Test only: These flags are defined here only in the context of testing the # behavior of the flag processor. The canonical way to add flags is to get in @@ -58,7 +61,7 @@ class wrappedfile(obj.__class__): def addrevision(self, text, transaction, link, p1, p2, cachedelta=None, node=None, - flags=revlog.REVIDX_DEFAULT_FLAGS): + flags=flagutil.REVIDX_DEFAULT_FLAGS): if b'[NOOP]' in text: flags |= REVIDX_NOOP @@ -102,7 +105,7 @@ # Teach revlog about our test flags flags = [REVIDX_NOOP, REVIDX_BASE64, REVIDX_GZIP, REVIDX_FAIL] - revlog.REVIDX_KNOWN_FLAGS |= util.bitsfrom(flags) + flagutil.REVIDX_KNOWN_FLAGS |= util.bitsfrom(flags) revlog.REVIDX_FLAGS_ORDER.extend(flags) # Teach exchange to use changegroup 3 @@ -110,7 +113,7 @@ exchange._bundlespeccontentopts[k][b"cg.version"] = b"03" # Register flag processors for each extension - revlog.addflagprocessor( + flagutil.addflagprocessor( REVIDX_NOOP, ( noopdonothing, @@ -118,7 +121,7 @@ validatehash, ) ) - revlog.addflagprocessor( + flagutil.addflagprocessor( REVIDX_BASE64, ( b64decode, @@ -126,7 +129,7 @@ bypass, ), ) - revlog.addflagprocessor( + flagutil.addflagprocessor( REVIDX_GZIP, ( gzipdecompress, diff -r f59f8a5e9096 -r 7013c7ce987f tests/simplestorerepo.py --- a/tests/simplestorerepo.py Mon Aug 12 14:00:19 2019 -0400 +++ b/tests/simplestorerepo.py Wed Aug 14 09:22:54 2019 +0900 @@ -42,6 +42,9 @@ interfaceutil, storageutil, ) +from mercurial.revlogutils import ( + flagutil, +) # Note for extension authors: ONLY specify testedwith = 'ships-with-hg-core' for # extensions which SHIP WITH MERCURIAL. 
Non-mainline extensions should @@ -262,9 +265,9 @@ if flags == 0: return text, True - if flags & ~revlog.REVIDX_KNOWN_FLAGS: + if flags & ~flagutil.REVIDX_KNOWN_FLAGS: raise simplestoreerror(_("incompatible revision flag '%#x'") % - (flags & ~revlog.REVIDX_KNOWN_FLAGS)) + (flags & ~flagutil.REVIDX_KNOWN_FLAGS)) validatehash = True # Depending on the operation (read or write), the order might be @@ -326,6 +329,9 @@ return text + def rawdata(self, nodeorrev): + return self.revision(raw=True) + def read(self, node): validatenode(node) diff -r f59f8a5e9096 -r 7013c7ce987f tests/test-bookmarks-corner-case.t --- a/tests/test-bookmarks-corner-case.t Mon Aug 12 14:00:19 2019 -0400 +++ b/tests/test-bookmarks-corner-case.t Wed Aug 14 09:22:54 2019 +0900 @@ -119,7 +119,7 @@ > import atexit > import os > import time - > from mercurial import error, extensions, bookmarks + > from mercurial import bookmarks, error, extensions > > def wait(repo): > if not os.path.exists('push-A-started'): diff -r f59f8a5e9096 -r 7013c7ce987f tests/test-byteify-strings.t --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/tests/test-byteify-strings.t Wed Aug 14 09:22:54 2019 +0900 @@ -0,0 +1,266 @@ +#require py3 + + $ byteify_strings () { + > $PYTHON "$TESTDIR/../contrib/byteify-strings.py" "$@" + > } + +Test version + + $ byteify_strings --version + Byteify strings * (glob) + +Test in-place + + $ cat > testfile.py < obj['test'] = b"1234" + > mydict.iteritems() + > EOF + $ byteify_strings testfile.py -i + $ cat testfile.py + obj[b'test'] = b"1234" + mydict.iteritems() + +Test with dictiter + + $ cat > testfile.py < obj['test'] = b"1234" + > mydict.iteritems() + > EOF + $ byteify_strings testfile.py --dictiter + obj[b'test'] = b"1234" + mydict.items() + +Test kwargs-like objects + + $ cat > testfile.py < kwargs['test'] = "123" + > kwargs[test['testing']] + > kwargs[test[[['testing']]]] + > kwargs[kwargs['testing']] + > kwargs.get('test') + > kwargs.pop('test') + > kwargs.get('test', 'testing') + > kwargs.pop('test', 'testing') + > kwargs.setdefault('test', 'testing') + > + > opts['test'] = "123" + > opts[test['testing']] + > opts[test[[['testing']]]] + > opts[opts['testing']] + > opts.get('test') + > opts.pop('test') + > opts.get('test', 'testing') + > opts.pop('test', 'testing') + > opts.setdefault('test', 'testing') + > + > commitopts['test'] = "123" + > commitopts[test['testing']] + > commitopts[test[[['testing']]]] + > commitopts[commitopts['testing']] + > commitopts.get('test') + > commitopts.pop('test') + > commitopts.get('test', 'testing') + > commitopts.pop('test', 'testing') + > commitopts.setdefault('test', 'testing') + > EOF + $ byteify_strings testfile.py --treat-as-kwargs kwargs opts commitopts + kwargs['test'] = b"123" + kwargs[test[b'testing']] + kwargs[test[[[b'testing']]]] + kwargs[kwargs['testing']] + kwargs.get('test') + kwargs.pop('test') + kwargs.get('test', b'testing') + kwargs.pop('test', b'testing') + kwargs.setdefault('test', b'testing') + + opts['test'] = b"123" + opts[test[b'testing']] + opts[test[[[b'testing']]]] + opts[opts['testing']] + opts.get('test') + opts.pop('test') + opts.get('test', b'testing') + opts.pop('test', b'testing') + opts.setdefault('test', b'testing') + + commitopts['test'] = b"123" + commitopts[test[b'testing']] + commitopts[test[[[b'testing']]]] + commitopts[commitopts['testing']] + commitopts.get('test') + commitopts.pop('test') + commitopts.get('test', b'testing') + commitopts.pop('test', b'testing') + commitopts.setdefault('test', b'testing') + +Test attr*() as methods + 
+ $ cat > testfile.py < setattr(o, 'a', 1) + > util.setattr(o, 'ae', 1) + > util.getattr(o, 'alksjdf', 'default') + > util.addattr(o, 'asdf') + > util.hasattr(o, 'lksjdf', 'default') + > util.safehasattr(o, 'lksjdf', 'default') + > @eh.wrapfunction(func, 'lksjdf') + > def f(): + > pass + > @eh.wrapclass(klass, 'lksjdf') + > def f(): + > pass + > EOF + $ byteify_strings testfile.py --allow-attr-methods + setattr(o, 'a', 1) + util.setattr(o, 'ae', 1) + util.getattr(o, 'alksjdf', b'default') + util.addattr(o, 'asdf') + util.hasattr(o, 'lksjdf', b'default') + util.safehasattr(o, 'lksjdf', b'default') + @eh.wrapfunction(func, 'lksjdf') + def f(): + pass + @eh.wrapclass(klass, 'lksjdf') + def f(): + pass + +Test without attr*() as methods + + $ cat > testfile.py < setattr(o, 'a', 1) + > util.setattr(o, 'ae', 1) + > util.getattr(o, 'alksjdf', 'default') + > util.addattr(o, 'asdf') + > util.hasattr(o, 'lksjdf', 'default') + > util.safehasattr(o, 'lksjdf', 'default') + > @eh.wrapfunction(func, 'lksjdf') + > def f(): + > pass + > @eh.wrapclass(klass, 'lksjdf') + > def f(): + > pass + > EOF + $ byteify_strings testfile.py + setattr(o, 'a', 1) + util.setattr(o, b'ae', 1) + util.getattr(o, b'alksjdf', b'default') + util.addattr(o, b'asdf') + util.hasattr(o, b'lksjdf', b'default') + util.safehasattr(o, b'lksjdf', b'default') + @eh.wrapfunction(func, b'lksjdf') + def f(): + pass + @eh.wrapclass(klass, b'lksjdf') + def f(): + pass + +Test ignore comments + + $ cat > testfile.py < # py3-transform: off + > "none" + > "of" + > 'these' + > s = """should""" + > d = '''be''' + > # py3-transform: on + > "this should" + > 'and this also' + > + > # no-py3-transform + > l = "this should be ignored" + > l2 = "this shouldn't" + > + > EOF + $ byteify_strings testfile.py + # py3-transform: off + "none" + "of" + 'these' + s = """should""" + d = '''be''' + # py3-transform: on + b"this should" + b'and this also' + + # no-py3-transform + l = "this should be ignored" + l2 = b"this shouldn't" + +Test triple-quoted strings + + $ cat > testfile.py < """This is ignored + > """ + > + > line = """ + > This should not be + > """ + > line = ''' + > Neither should this + > ''' + > EOF + $ byteify_strings testfile.py + """This is ignored + """ + + line = b""" + This should not be + """ + line = b''' + Neither should this + ''' + +Test prefixed strings + + $ cat > testfile.py < obj['test'] = b"1234" + > obj[r'test'] = u"1234" + > EOF + $ byteify_strings testfile.py + obj[b'test'] = b"1234" + obj[r'test'] = u"1234" + +Test multi-line alignment + + $ cat > testfile.py <<'EOF' + > def foo(): + > error.Abort(_("foo" + > "bar" + > "%s") + > % parameter) + > { + > 'test': dict, + > 'test2': dict, + > } + > [ + > "thing", + > "thing2" + > ] + > ( + > "tuple", + > "tuple2", + > ) + > {"thing", + > } + > EOF + $ byteify_strings testfile.py + def foo(): + error.Abort(_(b"foo" + b"bar" + b"%s") + % parameter) + { + b'test': dict, + b'test2': dict, + } + [ + b"thing", + b"thing2" + ] + ( + b"tuple", + b"tuple2", + ) + {b"thing", + } diff -r f59f8a5e9096 -r 7013c7ce987f tests/test-config.t --- a/tests/test-config.t Mon Aug 12 14:00:19 2019 -0400 +++ b/tests/test-config.t Wed Aug 14 09:22:54 2019 +0900 @@ -57,11 +57,13 @@ $ hg showconfig Section -Tjson [ { + "defaultvalue": null, "name": "Section.KeY", "source": "*.hgrc:*", (glob) "value": "Case Sensitive" }, { + "defaultvalue": null, "name": "Section.key", "source": "*.hgrc:*", (glob) "value": "lower case" @@ -70,14 +72,15 @@ $ hg showconfig Section.KeY -Tjson [ { + "defaultvalue": null, "name": 
"Section.KeY", "source": "*.hgrc:*", (glob) "value": "Case Sensitive" } ] $ hg showconfig -Tjson | tail -7 - }, { + "defaultvalue": null, "name": "*", (glob) "source": "*", (glob) "value": "*" (glob) @@ -102,6 +105,7 @@ $ hg config empty.source -Tjson [ { + "defaultvalue": null, "name": "empty.source", "source": "", "value": "value" diff -r f59f8a5e9096 -r 7013c7ce987f tests/test-debugcommands.t --- a/tests/test-debugcommands.t Mon Aug 12 14:00:19 2019 -0400 +++ b/tests/test-debugcommands.t Wed Aug 14 09:22:54 2019 +0900 @@ -546,7 +546,12 @@ .hg/cache/rbc-revs-v1 .hg/cache/rbc-names-v1 .hg/cache/hgtagsfnodes1 + .hg/cache/branch2-visible-hidden + .hg/cache/branch2-visible + .hg/cache/branch2-served.hidden .hg/cache/branch2-served + .hg/cache/branch2-immutable + .hg/cache/branch2-base Test debugcolor diff -r f59f8a5e9096 -r 7013c7ce987f tests/test-fix.t --- a/tests/test-fix.t Mon Aug 12 14:00:19 2019 -0400 +++ b/tests/test-fix.t Wed Aug 14 09:22:54 2019 +0900 @@ -215,6 +215,13 @@ executions that modified a file. This aggregates the same metadata previously passed to the "postfixfile" hook. + Fixer tools are run the in repository's root directory. This allows them to + read configuration files from the working copy, or even write to the working + copy. The working copy is not updated to match the revision being fixed. In + fact, several revisions may be fixed in parallel. Writes to the working copy + are not amended into the revision being fixed; fixer tools should always write + fixed file content back to stdout as documented above. + list of commands: fix rewrite file content in changesets or working directory @@ -439,6 +446,18 @@ $ printf "a\nb\nc\nd\ne\nf\ng\n" > foo.changed $ hg commit -Aqm "foo" $ printf "zz\na\nc\ndd\nee\nff\nf\ngg\n" > foo.changed + + $ hg fix --working-dir + $ cat foo.changed + ZZ + a + c + DD + EE + FF + f + GG + $ hg fix --working-dir --whole $ cat foo.changed ZZ @@ -526,6 +545,21 @@ $ cd .. +If we try to fix a missing file, we still fix other files. + + $ hg init fixmissingfile + $ cd fixmissingfile + + $ printf "fix me!\n" > foo.whole + $ hg add + adding foo.whole + $ hg fix --working-dir foo.whole bar.whole + bar.whole: $ENOENT$ + $ cat *.whole + FIX ME! + + $ cd .. + Specifying a directory name should fix all its files and subdirectories. $ hg init fixdirectory @@ -1161,28 +1195,6 @@ $ cd .. -The :fileset subconfig was a misnomer, so we renamed it to :pattern. We will -still accept :fileset by itself as if it were :pattern, but this will issue a -warning. - - $ hg init filesetispattern - $ cd filesetispattern - - $ printf "foo\n" > foo.whole - $ printf "first\nsecond\n" > bar.txt - $ hg add -q - $ hg fix -w --config fix.sometool:fileset=bar.txt \ - > --config fix.sometool:command="sort -r" - the fix.tool:fileset config name is deprecated; please rename it to fix.tool:pattern - - $ cat foo.whole - FOO - $ cat bar.txt - second - first - - $ cd .. - The execution order of tools can be controlled. This example doesn't work if you sort after truncating, but the config defines the correct order while the definitions are out of order (which might imply the incorrect order given the @@ -1264,3 +1276,83 @@ $ cd .. +We run fixer tools in the repo root so they can look for config files or other +important things in the working directory. This does NOT mean we are +reconstructing a working copy of every revision being fixed; we're just giving +the tool knowledge of the repo's location in case it can do something +reasonable with that. 
+ + $ hg init subprocesscwd + $ cd subprocesscwd + + $ cat >> .hg/hgrc < [fix] + > printcwd:command = pwd + > printcwd:pattern = path:foo/bar + > EOF + + $ mkdir foo + $ printf "bar\n" > foo/bar + $ hg commit -Aqm blah + + $ hg fix -w -r . foo/bar + $ hg cat -r tip foo/bar + $TESTTMP/subprocesscwd + $ cat foo/bar + $TESTTMP/subprocesscwd + + $ cd foo + + $ hg fix -w -r . bar + $ hg cat -r tip bar + $TESTTMP/subprocesscwd + $ cat bar + $TESTTMP/subprocesscwd + + $ cd ../.. + +Tools configured without a pattern are ignored. It would be too dangerous to +run them on all files, because this might happen while testing a configuration +that also deletes all of the file content. There is no reasonable subset of the +files to use as a default. Users should be explicit about what files are +affected by a tool. This test also confirms that we don't crash when the +pattern config is missing, and that we only warn about it once. + + $ hg init nopatternconfigured + $ cd nopatternconfigured + + $ printf "foo" > foo + $ printf "bar" > bar + $ hg add -q + $ hg fix --debug --working-dir --config "fix.nopattern:command=echo fixed" + fixer tool has no pattern configuration: nopattern + $ cat foo bar + foobar (no-eol) + + $ cd .. + +Test that we can configure a fixer to affect all files regardless of the cwd. +The way we invoke matching must not prohibit this. + + $ hg init affectallfiles + $ cd affectallfiles + + $ mkdir foo bar + $ printf "foo" > foo/file + $ printf "bar" > bar/file + $ printf "baz" > baz_file + $ hg add -q + + $ cd bar + $ hg fix --working-dir --config "fix.cooltool:command=echo fixed" \ + > --config "fix.cooltool:pattern=rootglob:**" + $ cd .. + + $ cat foo/file + fixed + $ cat bar/file + fixed + $ cat baz_file + fixed + + $ cd .. diff -r f59f8a5e9096 -r 7013c7ce987f tests/test-flagprocessor.t --- a/tests/test-flagprocessor.t Mon Aug 12 14:00:19 2019 -0400 +++ b/tests/test-flagprocessor.t Wed Aug 14 09:22:54 2019 +0900 @@ -205,9 +205,9 @@ extsetup(ui) File "*/tests/flagprocessorext.py", line *, in extsetup (glob) validatehash, - File "*/mercurial/revlog.py", line *, in addflagprocessor (glob) - _insertflagprocessor(flag, processor, _flagprocessors) - File "*/mercurial/revlog.py", line *, in _insertflagprocessor (glob) + File "*/mercurial/revlogutils/flagutil.py", line *, in addflagprocessor (glob) + insertflagprocessor(flag, processor, flagprocessors) + File "*/mercurial/revlogutils/flagutil.py", line *, in insertflagprocessor (glob) raise error.Abort(msg) mercurial.error.Abort: b"cannot register multiple processors on flag '0x8'." (py3 !) Abort: cannot register multiple processors on flag '0x8'. (no-py3 !) diff -r f59f8a5e9096 -r 7013c7ce987f tests/test-install.t --- a/tests/test-install.t Mon Aug 12 14:00:19 2019 -0400 +++ b/tests/test-install.t Wed Aug 14 09:22:54 2019 +0900 @@ -153,6 +153,16 @@ 1 problems detected, please check your install! [1] +debuginstall extension support + $ hg debuginstall --config extensions.fsmonitor= --config fsmonitor.watchman_exe=false | grep atchman + fsmonitor checking for watchman binary... (false) + watchman binary missing or broken: warning: Watchman unavailable: watchman exited with code 1 +Verify the json works too: + $ hg debuginstall --config extensions.fsmonitor= --config fsmonitor.watchman_exe=false -Tjson | grep atchman + "fsmonitor-watchman": "false", + "fsmonitor-watchman-error": "warning: Watchman unavailable: watchman exited with code 1", + + #if test-repo $ . 
"$TESTDIR/helpers-testrepo.sh" diff -r f59f8a5e9096 -r 7013c7ce987f tests/test-revlog-raw.py --- a/tests/test-revlog-raw.py Mon Aug 12 14:00:19 2019 -0400 +++ b/tests/test-revlog-raw.py Wed Aug 14 09:22:54 2019 +0900 @@ -16,6 +16,7 @@ from mercurial.revlogutils import ( deltas, + flagutil, ) # TESTTMP is optional. This makes it convenient to run without run-tests.py @@ -56,7 +57,7 @@ # can be used to verify hash. return False -revlog.addflagprocessor(revlog.REVIDX_EXTSTORED, +flagutil.addflagprocessor(revlog.REVIDX_EXTSTORED, (readprocessor, writeprocessor, rawprocessor)) # Utilities about reading and appending revlog diff -r f59f8a5e9096 -r 7013c7ce987f tests/test-rust-discovery.py --- a/tests/test-rust-discovery.py Mon Aug 12 14:00:19 2019 -0400 +++ b/tests/test-rust-discovery.py Wed Aug 14 09:22:54 2019 +0900 @@ -1,16 +1,9 @@ from __future__ import absolute_import import unittest -try: - from mercurial import rustext - rustext.__name__ # trigger immediate actual import -except ImportError: - rustext = None -else: - # this would fail already without appropriate ancestor.__package__ - from mercurial.rustext.discovery import ( - PartialDiscovery, - ) +from mercurial import policy + +PartialDiscovery = policy.importrust('discovery', member='PartialDiscovery') try: from mercurial.cext import parsers as cparsers @@ -39,7 +32,7 @@ ) -@unittest.skipIf(rustext is None or cparsers is None, +@unittest.skipIf(PartialDiscovery is None or cparsers is None, "rustext or the C Extension parsers module " "discovery relies on is not available") class rustdiscoverytest(unittest.TestCase): diff -r f59f8a5e9096 -r 7013c7ce987f tests/test-server-view.t --- a/tests/test-server-view.t Mon Aug 12 14:00:19 2019 -0400 +++ b/tests/test-server-view.t Wed Aug 14 09:22:54 2019 +0900 @@ -50,7 +50,12 @@ $ hg -R test --config experimental.extra-filter-revs='not public()' debugupdatecache $ ls -1 test/.hg/cache/ branch2-base%89c45d2fa07e + branch2-immutable%89c45d2fa07e branch2-served + branch2-served%89c45d2fa07e + branch2-served.hidden%89c45d2fa07e + branch2-visible%89c45d2fa07e + branch2-visible-hidden%89c45d2fa07e hgtagsfnodes1 rbc-names-v1 rbc-revs-v1 diff -r f59f8a5e9096 -r 7013c7ce987f tests/test-shelve.t --- a/tests/test-shelve.t Mon Aug 12 14:00:19 2019 -0400 +++ b/tests/test-shelve.t Wed Aug 14 09:22:54 2019 +0900 @@ -1239,6 +1239,7 @@ > y > EOF unshelving change 'default' + temporarily committing pending changes (restore with 'hg unshelve --abort') rebasing shelved changes diff --git a/d b/d new file mode 100644 @@ -1250,6 +1251,10 @@ record this change to 'd'? (enter ? for help) [Ynesfdaq?] y + + $ hg status -v + A c + A d $ ls b c @@ -1267,15 +1272,21 @@ > B > C > EOF - $ hg shelve + $ echo > garbage + $ hg st + M foo + ? garbage + $ hg shelve --unknown shelved as default - 1 files updated, 0 files merged, 0 files removed, 0 files unresolved + 1 files updated, 0 files merged, 1 files removed, 0 files unresolved $ cat foo B $ hg unshelve -i < y > y > n + > y + > y > EOF unshelving change 'default' rebasing shelved changes @@ -1287,15 +1298,28 @@ @@ -1,1 +1,2 @@ +A B - record change 1/2 to 'foo'? + record change 1/3 to 'foo'? (enter ? for help) [Ynesfdaq?] y @@ -1,1 +2,2 @@ B +C - record change 2/2 to 'foo'? + record change 2/3 to 'foo'? (enter ? for help) [Ynesfdaq?] n + diff --git a/garbage b/garbage + new file mode 100644 + examine changes to 'garbage'? + (enter ? for help) [Ynesfdaq?] y + + @@ -0,0 +1,1 @@ + + + record change 3/3 to 'garbage'? + (enter ? for help) [Ynesfdaq?] y + + $ hg st + M foo + ? 
garbage $ cat foo A B @@ -1347,17 +1371,44 @@ $ hg resolve -m bar1 bar2 (no more unresolved files) continue: hg unshelve --continue + +-- using --continue with --interactive should throw an error + $ hg unshelve --continue -i + abort: cannot use both continue and interactive + [255] + $ cat bar1 A B C - $ hg unshelve --continue -i < y > y > y - > y + > n > EOF - unshelving change 'default-01' diff --git a/bar1 b/bar1 1 hunks, 1 lines changed examine changes to 'bar1'? @@ -1380,6 +1431,47 @@ +B C record change 2/2 to 'bar2'? + (enter ? for help) [Ynesfdaq?] n + + unshelve of 'default-01' complete + +#if stripbased + $ hg log -r 3:: -G + @ changeset: 4:fe451a778c81 + | tag: tip + | user: test + | date: Thu Jan 01 00:00:00 1970 +0000 + | summary: add C to bars + | + o changeset: 3:e28fd7fa7938 + | user: test + ~ date: Thu Jan 01 00:00:00 1970 +0000 + summary: add A to bars + +#endif + + $ hg unshelve --continue + abort: no unshelve in progress + [255] + + $ hg shelve --list + default-01 (*)* changes to: add A to bars (glob) + default (*)* changes to: add B to foo (glob) + $ hg unshelve -n default-01 -i < y + > y + > EOF + temporarily committing pending changes (restore with 'hg unshelve --abort') + rebasing shelved changes + diff --git a/bar2 b/bar2 + 1 hunks, 1 lines changed + examine changes to 'bar2'? (enter ? for help) [Ynesfdaq?] y - unshelve of 'default-01' complete + @@ -1,2 +1,3 @@ + A + +B + C + record this change to 'bar2'? + (enter ? for help) [Ynesfdaq?] y + diff -r f59f8a5e9096 -r 7013c7ce987f tests/test-transplant.t --- a/tests/test-transplant.t Mon Aug 12 14:00:19 2019 -0400 +++ b/tests/test-transplant.t Wed Aug 14 09:22:54 2019 +0900 @@ -1,8 +1,16 @@ +#testcases commandmode continueflag $ cat <> $HGRCPATH > [extensions] > transplant= > EOF +#if continueflag + $ cat >> $HGRCPATH < [alias] + > continue = transplant --continue + > EOF +#endif + $ hg init t $ cd t $ hg transplant @@ -424,8 +432,9 @@ updated to "e8643552fde5: foobar" 1 other heads for branch "default" $ rm added - $ hg transplant --continue - abort: no transplant to continue + $ hg continue + abort: no transplant to continue (continueflag !) + abort: no operation in progress (no-continueflag !) [255] $ hg transplant 1 applying 46ae92138f3c @@ -492,7 +501,7 @@ # To abort: hg update $ echo fixed > baz - $ hg transplant --continue + $ hg continue 9d6d6b5a8275 transplanted as d80c49962290 applying 1dab759070cf 1dab759070cf transplanted to aa0ffe6bd5ae @@ -881,7 +890,7 @@ [255] $ hg status ? b.rej - $ hg transplant --continue + $ hg continue 645035761929 skipped due to empty diff $ cd ..
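Finally, a brief usage sketch for the computechangesetfilesadded() and computechangesetfilesremoved() helpers added to mercurial/scmutil.py earlier in this patch; the repository path and revision symbol below are placeholders, and any changectx works.

    from mercurial import hg, scmutil, ui as uimod

    repo = hg.repository(uimod.ui.load(), b'.')   # any local repository
    ctx = repo[b'tip']                            # any changectx works here

    # Both helpers derive their answer from the changectx and its parents
    # rather than from any precomputed file metadata.
    print(scmutil.computechangesetfilesadded(ctx))
    print(scmutil.computechangesetfilesremoved(ctx))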