encoding: use s.decode to trigger UnicodeDecodeError
When calling encode on a str, the string is first decoded using the
default encoding and then encoded, so:

    s.encode('ascii') == s.decode().encode('ascii')
We don't care about the encode step here -- we're just after the
UnicodeDecodeError raised by decode if it finds a non-ASCII character.
This way is also marginally faster since it saves the construction of
the extra str object.
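
For illustration, a minimal sketch of the pattern, assuming Python 2
semantics (str is a byte string and the default codec is ASCII); the
isascii() helper name is hypothetical and not part of this change:

    # sketch only: check whether a byte string is pure ASCII
    def isascii(s):
        try:
            # decode alone is enough -- it raises UnicodeDecodeError as
            # soon as it hits a non-ASCII byte
            s.decode('ascii')
            return True
        except UnicodeDecodeError:
            return False

    # By contrast, s.encode('ascii') would first decode s with the default
    # codec and then re-encode it, constructing a str object that is
    # immediately thrown away.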
# this file holds the definitions that are used in various bzr tests
"$TESTDIR/hghave" bzr || exit 80
TERM=dumb; export TERM
echo '[extensions]' >> $HGRCPATH
echo 'convert = ' >> $HGRCPATH
echo 'hgext.graphlog = ' >> $HGRCPATH
# graph log with a compact one-line template: rev, branch, summary, files
glog()
{
    hg glog --template '{rev}@{branch} "{desc|firstline}" files: {files}\n' "$@"
}
# print the verbose manifest of repository $1 at revision $2
manifest()
{
    echo "% manifest of $2"
    hg -R "$1" manifest -v -r "$2"
}