quaint-telephone-89068
02/08/2023, 5:04 PM
// This lockfile was autogenerated by Pants. To regenerate, run:
//
// /Users/my_user_name/Projects/my_project_name/scie-pants-macos-aarch64 generate-lockfiles --resolve=python-default
//
// --- BEGIN PANTS LOCKFILE METADATA: DO NOT EDIT OR REMOVE ---
I want the lock file to be the same no matter which machine created it, and I don't want to store my local machine paths in the git repo.
I've tried with:
./scie-pants-macos-aarch64 generate-lockfiles --generate-lockfiles-custom-command=./pants
But that produces an instruction that is not valid:
// This lockfile was autogenerated by Pants. To regenerate, run:
//
// ./pants
//
// --- BEGIN PANTS LOCKFILE METADATA: DO NOT EDIT OR REMOVE ---
Trying to fix it using --generate-lockfiles-custom-command would lead to something like --generate-lockfiles-custom-command="./pants generate-lockfiles --generate-lockfiles-custom-command='recursion here?'"
pantsbuild/pants
quaint-telephone-89068
02/08/2023, 11:28 PM
git clone https://gist.github.com/huonw/95a15c5cac3c5fcaf0aee3f8b63f8116
cd 95a15c5cac3c5fcaf0aee3f8b63f8116
./run.sh
# run.sh
#!/bin/bash
pkill -f pantsd
pants version
#> 2.15.0rc4
echo "# $(date +%s)" > x.py
pants package :pex
#> ... Wrote dist/pex.pex
docker stop $(docker ps -aq)
echo "# $(date +%s)" > x.py
pants package :pex
#> ... Failed to create Docker execution in container: DockerResponseServerError { status_code: 409, message: "Container 3be6e437f53c48f195f699ea0193914c68632773765a4b6d9a1d6b8d78190d7e is not running" }
docker rm $(docker ps -aq)
echo "# $(date +%s)" > x.py
pants package :pex
#> ... Failed to create Docker execution in container: DockerResponseServerError { status_code: 404, message: "No such container: 3be6e437f53c48f195f699ea0193914c68632773765a4b6d9a1d6b8d78190d7e" }
Pants version
2.15.0rc4
OS
macOS
Additional info
https://gist.github.com/huonw/95a15c5cac3c5fcaf0aee3f8b63f8116
pantsbuild/pants
quaint-telephone-89068
02/08/2023, 11:42 PM
When an experimental_shell_command exceeds its timeout (potentially any command with a timeout?) while running in a docker environment, the underlying process is left running. This may lead to resource exhaustion.
Reproducer: https://gist.github.com/huonw/d2dea90e86c2c4b16ce5d8ff21cebcdb
git clone https://gist.github.com/huonw/d2dea90e86c2c4b16ce5d8ff21cebcdb
cd d2dea90e86c2c4b16ce5d8ff21cebcdb
./run.sh
# BUILD
docker_environment(
    name="docker",
    image="python:3.9.10-bullseye",
)
experimental_shell_command(
    name="esc",
    command="sleep 100000",
    tools=["sleep"],
    timeout=1,
    environment="docker",
)
#!/bin/bash
pkill -f pantsd
docker stop $(docker ps -aq)
pants version
#> 2.15.0rc4
pants export-codegen :esc
#> ... Exceeded timeout of 1.0 seconds when executing local process: Running experimental_shell_command //:esc
docker top $(docker ps -q)
#> ...
#> root ... sleep 100000
Pants version
2.15.0rc4
OS
macOS
Additional info
https://gist.github.com/huonw/d2dea90e86c2c4b16ce5d8ff21cebcdb
pantsbuild/pants
quaint-telephone-89068
02/09/2023, 1:13 AM
--pantsd mode) in a subprocess with stdout piped for test use in assertions. This code here: https://github.com/pantsbuild/scie-pants/blob/3d269eb21ab7d764054496cd33530aaa364c6252/package/src/main.rs#L863-L892 The hang happens in the assert_pants_bin_name calls. The links above are on the 1st or 2nd call, the example below on the 3rd or 4th call (I can't say for sure whether the hang is on pantsd going down or coming up, but I think it's on the bring-down, which would make these the 1st and 3rd).
I used this ssh rig to debug: pantsbuild/scie-pants#114
The commands run in the ssh session were:
$ sudo rm -rf target/
$ cargo run -p package -- test --tools-pex-mismatch-warn
<hang>
ctrl-Z
$ sudo apt install gdb
$ sudo gdb -p 16881
log-slim.txt
log-slim2.txt
pantsbuild/pants
quaint-telephone-89068
02/09/2023, 2:10 AM
quaint-telephone-89068
02/09/2023, 11:03 AM
In pants.toml:
[test]
extra_env_vars = [
    "ENV_TOML=toml",
    "ENV_ALL=toml"
]
In the docker_environment:
docker_environment(
    name="dev_docker",
    platform="linux_x86_64",
    image="python:3.10.4-slim-bullseye",
    python_bootstrap_search_path=["<PATH>"],
    test_extra_env_vars=[
        "ENV_DOCKER=docker",
        "ENV_ALL=docker"
    ],
)
In the test target:
python_tests(
    environment="dev",
    extra_env_vars=[
        "ENV_TARGET=target",
        "ENV_ALL=target"
    ],
)
My assumption would be that the following test passes when executed in the docker_environment:
import os

def test_env_var():
    envs = {k: v for k, v in os.environ.items() if k.startswith('ENV_')}
    assert envs["ENV_TARGET"] == 'target'
    assert envs["ENV_ALL"] == 'target'
    assert envs["ENV_TOML"] == 'toml'
    assert envs["ENV_DOCKER"] == 'docker'
But I get the following errors:
➜ pants test tests/python/hello_world/messages/test_env_vars.py
11:52:25.24 [ERROR] Completed: Run Pytest - (environment:dev, tests/python/hello_world/messages/test_env_vars.py) - failed (exit code 1).
============================= test session starts ==============================
platform linux -- Python 3.10.4, pytest-7.0.1, pluggy-1.0.0
rootdir: /pants-sandbox/pants-sandbox-uh2dCY
plugins: cov-3.0.0, forked-1.4.0, xdist-2.5.0
collected 1 item
tests/python/hello_world/messages/test_env_vars.py F [100%]
=================================== FAILURES ===================================
_________________________________ test_env_var _________________________________
def test_env_var():
envs = {k: v for k,v in os.environ.items() if k.startswith('ENV_')}
assert envs["ENV_TARGET"] == 'target'
assert envs["ENV_ALL"] == 'target'
> assert envs["ENV_TOML"] == 'toml'
E KeyError: 'ENV_TOML'
When I run the same test with the --debug
flag I get a different error:
➜ pants test --debug tests/python/hello_world/messages/test_env_vars.py
========================================== test session starts ==========================================
platform darwin -- Python 3.10.9, pytest-7.0.1, pluggy-1.0.0
rootdir: /private/var/folders/8b/0w6lmjjx7fsdm_6hhcrjfdfw0000gq/T/pants-sandbox-0SBEZ4
plugins: xdist-2.5.0, forked-1.4.0, cov-3.0.0
collected 1 item
tests/python/hello_world/messages/test_env_vars.py F [100%]
=============================================== FAILURES ================================================
_____________________________________________ test_env_var ______________________________________________
def test_env_var():
envs = {k: v for k,v in os.environ.items() if k.startswith('ENV_')}
assert envs["ENV_TARGET"] == 'target'
assert envs["ENV_ALL"] == 'target'
assert envs["ENV_TOML"] == 'toml'
> assert envs["ENV_DOCKER"] == 'docker'
E KeyError: 'ENV_DOCKER'
How is pants supposed to handle the env vars that are configured on the different levels? Are the lists supposed to be merged together, or should they be completely replaced by the deepest configuration?
Pants version
2.16.0.dev5, we are using the new scie-pants binary in version 0.3.2
OS
MacOS
Additional info
I have created a reproducible example here https://github.com/psontag/pexample/tree/pants-env-vars
pantsbuild/pants
quaint-telephone-89068
02/09/2023, 2:44 PM
The path of the pants wrapper script that was used is exposed, including the username and also the repository name.
Pants version
2.11.1rc2, but I think it has been like this for a long time.
OS
This was run on macOS, but I think the same will happen on other platforms.
pantsbuild/pants
quaint-telephone-89068
02/09/2023, 5:27 PM
Pex could use the prepare-metadata-for-build-wheel PEP 517 hook instead:
pex/pex/build_system/pep_517.py
Lines 79 to 136 in pantsbuild/pex@4fd613a
Unfortunately, this hook is not mandatory though; so a fallback to building a wheel will need to be maintained.
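For reference, a minimal sketch of what "prefer the hook, fall back to building a wheel" looks like when driving a PEP 517 backend directly (illustrative only; this is not the Pex code linked above, and error handling and config_settings are omitted):
import importlib

def metadata_via_pep517(build_backend: str, out_dir: str) -> str:
    """Return the basename of the produced .dist-info directory or wheel."""
    backend = importlib.import_module(build_backend)
    hook = getattr(backend, "prepare_metadata_for_build_wheel", None)
    if hook is not None:
        # Cheap path: the backend writes just the .dist-info directory.
        return hook(out_dir)
    # Mandatory fallback: build a full wheel and read metadata out of it afterwards.
    return backend.build_wheel(out_dir)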
pantsbuild/pex
quaint-telephone-89068
02/09/2023, 6:35 PM
quaint-telephone-89068
02/09/2023, 7:42 PM
A files target with a generic glob importing all files for Docker image targets appears to make tailor ignore all new package files and not create BUILD files when running tailor ::.
I have a small fresh repo that I have created to reproduce this so you can test (link in additional info and more detail in the Readme).
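For illustration, the kind of catch-all target meant here looks roughly like this (the name and glob are invented, not taken from the linked repo):
files(
    name="docker_context",
    # A broad glob like this appears to make tailor treat new package files as
    # already owned, so it skips creating targets for them.
    sources=["**/*"],
)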
Pants version
2.15.0rc4 (tested with 2.14.1 also)
OS
WSL 2, Ubuntu 20.04 LTS
Additional info
https://github.com/CerberusQc/pants-issue
pantsbuild/pants
quaint-telephone-89068
02/10/2023, 2:09 AM
--resolve-local-platforms, pex will still build dependencies using the local interpreter matching the --python flag, even when binary wheels exist.
When using rosetta Python (x86 emulation), this issue does not exist.
Reproduction - I have two venvs, one for M1 native (arm) and one for Rosetta (x86):
Note the binary wheel exists for grpcio for the requested --platform. Using --complete-platform does not help.
# x86 works as expected - pulls the binary wheel and makes the pex
% pex --platform=manylinux2014_x86_64-cp-38-cp38 --python=python3.8 --pip-version=22.2.2 --resolver-version=pip-legacy-resolver grpcio==1.47.2 -o grpcio.pex
(x86-python) ~/src/issues/pex-python-flag
% unzip -l grpcio.pex | grep 'whl/$'
0 01-01-1980 00:00 .deps/grpcio-1.47.2-cp38-cp38-macosx_10_10_x86_64.whl/
0 01-01-1980 00:00 .deps/grpcio-1.47.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl/
0 01-01-1980 00:00 .deps/six-1.16.0-py2.py3-none-any.whl/
# same command in arm just starts building a wheel
pex --platform=manylinux2014_x86_64-cp-38-cp38 --python=python3.8 --pip-version=22.2.2 --resolver-version=pip-legacy-resolver grpcio==1.47.2 -o grpcio.pex -v
pex: Building pex :: Resolving distributions (grpcio==1.47.2) :: Resolving requirements. :: Resolving for:
/Users/shalabh/src/issues/pex-python-flag/arm-python/bin/python3.8
pex: Hashing pex pex: Hashing pex: 90.3ms
pex: Isolating pex: 0.1ms
pex: Building pex :: Resolving distributions (grpcio==1.47.2) :: Resolving requirements. :: Building distributions for:
BuildRequest(target=LocalInterpreter(id='Users.shalabh.src.issues.pex-python-flag.arm-python.bin.python3.8', platform=Platform(platform='macosx_12_0_arm64', impl='cp', version='3.8.16', version_info=(3, 8, 16), abi='cp38'), marker_environment=MarkerEnvironment(implementation_name='cpython', implementation_version='3.8.16', os_name='posix', platform_machine='arm64', platform_python_implementation='CPython', platform_release='21.6.0', platform_system='Darwin', platform_version='Darwin Kernel Version 21.6.0: Mon Aug 22 20:20:05 PDT 2022; root:xnu-8020.140.49~2/RELEASE_ARM64_T8101', python_full_version='3.8.16', python_version='3.8', sys_platform='darwin'), interpreter=PythonInterpreter('/Users/shalabh/src/issues/pex-python-flag/arm-python/bin/python3.8', PythonIdentity('/Users/shalabh/src/issues/pex-python-flag/arm-python/bin/python3.8', 'cp38', 'cp38', 'macosx_12_0_arm64', (3, 8, 16)))), source_path='/Users/shalabh/.pex/downloads/resolver_download.gmefuolk/Users.spex: Building /Users/shalabh/.pex/downloads/resolver_download.gmefuolk/Users.shalabh.src.issues.pex-python-flag.arm-python.bin.python3.8/grpcio-1.47.2.tar.gz to /Users/shalabh/.pex/built_wheels/sdists/grpcio-1.47.2.tar.gz/c7f888b067f2b1833a5670f20eeac2c6063039a23a99b799929de065c1b18692/cp38-cp38-macosx_12_0_arm64
<CTRL-C>
This has something to do with the --python flag, because if I remove that, even arm quickly builds the pex from the PyPI wheel.
pantsbuild/pex
quaint-telephone-89068
02/10/2023, 5:39 AM
ruff does more than formatting, and since there's only one command to fix things, by default it should be a fixer.
Pants version
2.16
OS
Additional info
pantsbuild/pants
quaint-telephone-89068
02/10/2023, 4:35 PM
Create resources targets for any py.typed files encountered.
Adding these resources to python_distributions will enable them to be used when running mypy for typechecking.
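A sketch of what such a generated target could look like (the target name is illustrative), with the owning python_distribution then listing it as a dependency:
resources(
    name="py_typed",
    # The PEP 561 marker file that tells type checkers the package ships inline types.
    sources=["py.typed"],
)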
This is a follow up issue from #12751
pantsbuild/pants
quaint-telephone-89068
02/12/2023, 6:50 AM
quaint-telephone-89068
02/12/2023, 8:58 PM
But that would use more passes over the data than we strictly need. We do need to re-compute and validate the Digest of the data (because we don't immediately trust the data that we get over the wire), but the Digest could be computed while writing the data to the temporary file, rather than via reading it in two passes as fn store will do.
Describe alternatives you've considered
N/A
Additional context
#18054 is related.
pants/src/rust/engine/fs/store/src/remote.rs
Lines 458 to 469 in pantsbuild/pants@7631913
could be adjusted to feed a hashing writer down into load_monomorphic.
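A minimal illustration of the single-pass idea (written in Python purely for brevity; the actual change would live in the Rust store code referenced above):
import hashlib

def write_and_digest(chunks, out_file):
    """Stream chunks to a temporary file while computing the digest in the same pass."""
    hasher = hashlib.sha256()
    size = 0
    for chunk in chunks:      # bytes arriving over the wire
        out_file.write(chunk)
        hasher.update(chunk)
        size += len(chunk)
    return hasher.hexdigest(), size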
pantsbuild/pants
quaint-telephone-89068
02/12/2023, 10:50 PM
quaint-telephone-89068
02/13/2023, 7:29 PM
goal: export-codegen
At first blush, codegen seems ideal for making generated files. But codegen in pants is not designed for version controlled files.
When pants generates code, it is considered an internal by-product, as described in the PR that introduced the export-codegen goal:
It's important that we write to dist/, rather than saving to the build root, as these are generated files that should be kept out of version control. Also, we'd have major issues if Pants started thinking those generated files were actual production files, e.g. a [target]'s default sources field claiming ownership of those files.
The export-codegen goal makes these generated files available in dist/ for those who want to:
use the results of codegen outside of Pants, including:
1. Inspect the files for debugging purposes.
2. Save the files for IDEs to use.
So, we need a different form of codegen in pants.
goals: lint, fmt, fix
Version controlled files are the bread and butter of lint, fmt, and fix, which is the opposite of export-codegen, which is for files that are not version controlled.
Fixers and formatters also run during the lint goal, so that lint fails if they need to fix or format a file.
Code generation of files that are version controlled is remarkably similar to fmt and fix. Files that need to be changed are advertised via the lint goal. Logically, fmt and fix are very different from codegen - but the process, the UX, and the underlying rules seem quite similar to me.
In the StackStorm project, I've shoved my (version controlled) file generation under the fmt umbrella even though it's not really a formatter. Logically this generation is also not a fixer. But one of the benefits of doing this is that lint will fail if the generator needs to regenerate the file(s), and it will provide an error message that explains how to run ./pants fmt ... to resolve the lint error.
I've heard of others that create a "test" that fails if the generated file(s) were not properly regenerated. I really like using lint for this to take advantage of the fine-grained caching and reusing the cache for both lint and fmt/fix.
goal: generate-lockfiles
Another goal that is very similar to the file generation feature I'm looking for is the generate-lockfiles goal. The warnings/errors saying to regenerate can be triggered in many places where rules need to use the lockfiles. So, there is no need to integrate that warning in a lint backend/rule. The goal also provides a fairly ergonomic interface for regenerating those.
generate-lockfiles looks up the resolves, translates them into lockfile paths, and then generates the lockfile at that path.
Proposal for a new "generate" goal
If there were a generic "generate" goal, it could go in the opposite direction of `generate-lockfiles`: given the path of a lockfile, look up the resolve associated with that lockfile, and then generate the lockfile at that path.
Similarly, that "generate" goal could take a path to another generated file, look up the owning target for it, and if that target is explicitly for generated files, run the plugin/subsystem/tool that encompasses the codegen logic for that target.
This goal could be only for non-lockfile generated files. If we did include lockfile generation in this goal, then the owning target would probably be a synthetic target synthesized from the config in pants.toml -- so it is explicitly defined (even though not defined in a BUILD file).
This goal should be modeled on fix/fmt (not modeled on export-codegen). This would extend the lint request/result classes (like fix/fmt do) so that file generation happens in a sandbox during lint, failing lint if something should be regenerated. When lint fails here, the error message would prompt the user to run ./pants generate path/to/generated/file/or/dir.
For a plugin to provide this generate rule, it would either have to add a custom target, or add a field to an existing target to allow flagging it as owning generated files. This would be a codified assumption built into the generate goal: all generated files must be owned by exactly one file-generating target. If unowned, they will not be materialized in the workspace; if owned by more than one file-generating target, raise an error. Generating a file that matches a file glob for that target counts for this.
Then, if any other linters can work with the generated files, the standard skip_tool fields provide an escape hatch if the generated code does not need to meet the same standards as other files. Or, for StackStorm, I can add an extra linter specific to the generated file that checks more than just whether or not it needs to be regenerated.
Goal: run
A lot of work has gone into runnable targets like adhoc_tool and shell_command. If combined with the generate goal, those could allow repos to use light-weight run targets and file-generator targets instead of writing medium- or heavy-weight pants plugins.
So, a runnable target could be tied to the file-generating target such that the adhoc_tool or pex_binary is used to actually do the generation when a user does ./pants generate path/to/generated/file.
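For illustration, the BUILD-file shape might look something like this (both the target type and its fields are invented for this proposal; they are not an existing Pants API):
generated_files(
    name="openapi_client",
    # The version-controlled outputs this target owns.
    sources=["client/**/*.py"],
    # The adhoc_tool / pex_binary / shell_command that actually produces them.
    generator=":generate_client",
)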
For the run goal, the UX focuses on the generator script/target. For this "generate" goal, the UX focuses on the files to generate. So they are complementary.
goal: go-generate
#16909
This seems very similar and could perhaps be rolled into the new "generate" goal. go-generate references the go modules that define the generation. I'm looking at something that targets the generated files. But, often go:generate commands work within the same module where they are defined, so maybe it is effectively the same thing.
But, this is something that would not run via lint. So, that would probably be a go backend option to skip running for lint if the goals were combined.
discussion
Actually naming the new "generate" goal is a different discussion. When we discuss that, we'll need to make sure it can be distinguished from the codegen that already exists in pants.
So, would a generic generate goal be helpful for anyone else? Does a generate goal that follows a process similar to fix/fmt make sense for other projects?
pantsbuild/pants
quaint-telephone-89068
02/13/2023, 8:14 PM
Some users may want a different set of tools to run in the fmt and fix goals than what the default behavior configures. Consider a configuration feature to allow users to specify which tools run in each of the goals.
pantsbuild/pants
quaint-telephone-89068
02/13/2023, 8:28 PM
There is a root_output_directory field, which allows for the output digest from a process to be rooted somewhere other than the working directory.
Similarly, output_files and output_directories can be specified as relative to the working directory, which may end up being ../-ed locations (as long as they do not jump out of the build root).
Frustratingly, Process provides its outputs relative to the workdir that is provided to Process, which means that capturing files relative to the build root involves a cumbersome hack: the Process is modified to be a bash process that calls cd into the original workdir, the workdir is set to None, and output_files and output_directories are mangled to compensate. The argv sequence is shlex.join()-ed into a shell argument string, which limits the effective maximum length of the arguments somewhat.
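A rough sketch of that hack, using a stand-in dataclass for the handful of Process fields involved (illustrative pseudocode; the real Process has many more fields, and the exact output-path mangling depends on the caller):
import dataclasses
import os
import shlex
from typing import Optional, Tuple

@dataclasses.dataclass(frozen=True)
class Proc:  # stand-in for the relevant Process fields
    argv: Tuple[str, ...]
    working_directory: Optional[str]
    output_files: Tuple[str, ...]
    output_directories: Tuple[str, ...]

def rebase_outputs_to_build_root(p: Proc) -> Proc:
    wd = p.working_directory
    return Proc(
        # Re-enter the old workdir inside a shell so the command still runs where it expects.
        argv=("/bin/bash", "-c", f"cd {shlex.quote(wd)} && {shlex.join(p.argv)}"),
        working_directory=None,  # outputs are now captured relative to the sandbox root
        output_files=tuple(os.path.normpath(os.path.join(wd, f)) for f in p.output_files),
        output_directories=tuple(os.path.normpath(os.path.join(wd, d)) for d in p.output_directories),
    )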
It would be handy if it were possible to make Process output files relative to the build root automatically.
pantsbuild/pants
quaint-telephone-89068
02/14/2023, 8:57 AM
Failed to resolve requirements from PEX environment @ /root/.cache/pants/named_caches/pex_root/unzipped_pexes/03edf50c1d8897371819584d4094814692505ea8.
Needed {{ ENVIRONMENT }} compatible dependencies for:
1: {{ requirement_name }}
Required by:
{{ my_package }}
But this pex had no ProjectName(raw='{{ requirement_name }}', normalized='{{ requirement_name }}') distributions.
...
I have tried several environments (Linux ARM, M1 Mac, x86_64 Linux).
The command that fails is the equivalent of ./pants test ./libs/my_package/tests/test_something.py. The BUILD file located in libs/my_package looks like this:
python_sources(
    name = "lib",
)
python_requirements(
    name = "req",
)
python_distribution(
    name = "pkg",
    provides = python_artifact(
        name = "my_library_name",
        version = "0.0.0",
    ),
    dependencies = [
        ":lib",
        ":req",
    ]
)
python_tests(
    name = "tests",
    sources = ["./tests/*"],
    dependencies = [
        ":pkg"
    ],
)
The issue appears for a library with a large set of requirements:
aiohttp~=3.8.3
aiosignal~=1.2.0
anyio~=3.6.2
async-timeout~=4.0.2
atomicwrites~=1.4.1
attrs~=22.1.0
boltons~=21.0.0
brotli~=1.0.9
certifi~=2022.9.24
cffi~=1.15.1
charset-normalizer~=2.1.1
classes~=0.4.1
click~=8.1.3
colorama~=0.4.5
cryptography~=38.0.1
ecdsa~=0.18.0
environs~=9.5.0
face~=22.0.0
faker-commerce~=1.0.3
faker~=15.3.2
flatten-dict~=0.4.2
frozenlist~=1.3.1
furl~=2.1.3
glom~=22.1.0
idna~=3.4
iniconfig~=1.1.1
marshmallow~=3.18.0
msal~=1.20.0
multidict~=6.0.2
openapi-schema-pydantic~=1.2.4
orderedmultidict~=1.0.1
packaging~=21.3
pluggy~=1.0.0
py~=1.11.0
pyasn1~=0.4.8
pycparser~=2.21
pydantic~=1.10.2
pyjwt[crypto]~=2.6.0
pyparsing~=3.0.9
pytest-mock~=3.10.0
pytest~=7.1.2
python-dateutil~=2.8.2
python-dotenv~=0.21.0
python-jose~=3.3.0
redis~=4.3.5
requests~=2.28.1
returns~=0.19.0
rsa~=4.9
six~=1.16.0
sniffio~=1.3.0
starlette~=0.20.4
tabulate~=0.8.10
tomli~=1.2.3
toolz~=0.12.0
typing-extensions~=4.4.0
urllib3~=1.26.12
yarl~=1.8.1
Pants version
My Pants version is 2.14.1.
OS
I am encountering this issue on Mac and Linux.
pantsbuild/pants
quaint-telephone-89068
02/14/2023, 3:40 PM
test is not afflicted.
Pants version
2.15.0rc3
OS
Linux
Additional info
N/A
pantsbuild/pants
quaint-telephone-89068
02/14/2023, 8:41 PM
package but not run goal (yet). I have defined a local_environment with subprocess_environment_env_vars. Now, when I run a python file, e.g. ./pants run src/main.py, I can see that those environment variables are present in this sandbox.
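A sketch of the kind of configuration being described (the name and value are illustrative, not taken from the actual repo):
local_environment(
    name="local_default",
    # Expected to reach subprocesses; the question is which goals it applies to.
    subprocess_environment_env_vars=["DJANGO_SETTINGS_MODULE=settings.dev"],
)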
Pants version
2.15.0rc4
OS
Linux (WSL2)
Additional info
It should be noted that I was expecting and hoping for this to happen, as I have a specific use case. I would like a set of envs to be present in both run and test, but specific envs to be different during each goal. I have a macro where I just merge those lists of envs and call one macro function in the local environment with subprocess_environment_env_vars=env_vars_run_django() and the other in the python tests target with extra_env_vars=env_vars_test_django().
I came across this as I noticed that running a pex_binary target does not propagate the envs.
pantsbuild/pants
quaint-telephone-89068
02/14/2023, 8:42 PM
$ git grep "__hash__ = "
src/python/pants/engine/collection.py: __hash__ = Tuple.__hash__
src/python/pants/engine/target.py: __hash__ = Tuple.__hash__
With some instrumentation:
diff --git a/src/python/pants/engine/collection.py b/src/python/pants/engine/collection.py
index c340fa2e6..cceaae611 100644
--- a/src/python/pants/engine/collection.py
+++ b/src/python/pants/engine/collection.py
@@ -52,7 +52,16 @@ class Collection(Tuple[T, ...]):
# Unlike in Python 2 we must explicitly implement __hash__ since we explicitly implement __eq__
# per the Python 3 data model.
# See: <https://docs.python.org/3/reference/datamodel.html#object.__hash__>
- __hash__ = Tuple.__hash__
+ def __hash__(self):
+ import inspect
+ import sys
+ print(f">>> Tuple.__hash__ code:\n{inspect.getsource(Tuple.__hash__)}", file=sys.stderr)
+ print(f">>> Tuple.__origin__: {Tuple.__origin__}", file=sys.stderr)
+ print(f">>> Tuple.__args__: {Tuple.__args__}", file=sys.stderr)
+ print(f">>> {self} delegating __hash__ to super ...", file=sys.stderr)
+ print(f">>> Tuple.__hash__ value: {Tuple.__hash__()}", file=sys.stderr)
+ # return super().__hash__()
+ return Tuple.__hash__()
def __repr__(self) -> str:
return f"{self.__class__.__name__}({list(self)})"
That reveals:
>>> Tuple.__hash__ code:
def __hash__(self):
if self.__origin__ is Union:
return hash((Union, frozenset(self.__args__)))
return hash((self.__origin__, self.__args__))
>>> Tuple.__origin__: <class 'tuple'>
>>> Tuple.__args__: ()
>>> Tuple.__hash__ value: -9020823266885220566
...
>>> Tuple.__hash__ value: -9020823266885220566
>>> Tuple.__origin__: <class 'tuple'>
>>> Tuple.__args__: ()
>>> Tuple.__hash__ code:
def __hash__(self):
if self.__origin__ is Union:
return hash((Union, frozenset(self.__args__)))
return hash((self.__origin__, self.__args__))
...
So the same hash value for every instance of pants.engine.collection.Collection. Not good if we have any large collection of these.
Adding in a real call of `tuple.__hash__`:
$ git diff
diff --git a/src/python/pants/engine/collection.py b/src/python/pants/engine/collection.py
index c340fa2e6..b58e13069 100644
--- a/src/python/pants/engine/collection.py
+++ b/src/python/pants/engine/collection.py
@@ -52,7 +52,8 @@ class Collection(Tuple[T, ...]):
# Unlike in Python 2 we must explicitly implement __hash__ since we explicitly implement __eq__
# per the Python 3 data model.
# See: <https://docs.python.org/3/reference/datamodel.html#object.__hash__>
- __hash__ = Tuple.__hash__
+ def __hash__(self):
+ return super().__hash__()
Worse, we're attempting to hash unhashable contained things:
$ ./pants check src/python/pants/engine::
124508.69 [ERROR] 1 Exception encountered:
Engine traceback:
in select
..
in pants.core.util_rules.environments.determine_local_environment
..
in pants.core.util_rules.environments.determine_all_environments
..
Traceback (most recent call last):
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/util/frozendict.py", line 91, in _calculate_hash
return hash(tuple(self._data.items()))
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/collection.py", line 56, in hash
return super().__hash__()
File "<string>", line 3, in hash
TypeError: unhashable type: 'dict'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/selectors.py", line 623, in native_engine_generator_send
res = rule.send(arg) if err is None else rule.throw(throw or err)
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/core/util_rules/environments.py", line 592, in determine_all_environments
Get(EnvironmentTarget, EnvironmentName(name)) for name in environments_subsystem.names
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/selectors.py", line 358, in MultiGet
return await _MultiGet(tuple(__arg0))
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/selectors.py", line 165, in await
result = yield self.gets
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/selectors.py", line 623, in native_engine_generator_send
res = rule.send(arg) if err is None else rule.throw(throw or err)
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/core/util_rules/environments.py", line 827, in get_target_for_environment_name
WrappedTargetRequest(address, description_of_origin=_description_of_origin),
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/selectors.py", line 118, in await
result = yield self
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/selectors.py", line 623, in native_engine_generator_send
res = rule.send(arg) if err is None else rule.throw(throw or err)
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/graph.py", line 401, in resolve_target_for_bootstrapping
description_of_origin=request.description_of_origin,
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/graph.py", line 180, in _determine_target_adaptor_and_type
TargetAdaptorRequest(address, description_of_origin=description_of_origin),
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/selectors.py", line 118, in await
result = yield self
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/selectors.py", line 623, in native_engine_generator_send
res = rule.send(arg) if err is None else rule.throw(throw or err)
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/build_files.py", line 395, in find_target_adaptor
address_family = await Get(AddressFamily, AddressFamilyDir(address.spec_path))
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/selectors.py", line 118, in await
result = yield self
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/selectors.py", line 623, in native_engine_generator_send
res = rule.send(arg) if err is None else rule.throw(throw or err)
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/build_files.py", line 395, in find_target_adaptor
address_family = await Get(AddressFamily, AddressFamilyDir(address.spec_path))
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/selectors.py", line 118, in await
result = yield self
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/selectors.py", line 623, in native_engine_generator_send
res = rule.send(arg) if err is None else rule.throw(throw or err)
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/build_files.py", line 260, in parse_address_family
Get(SyntheticAddressMaps, SyntheticAddressMapsRequest(directory.path)),
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/selectors.py", line 509, in MultiGet
return await _MultiGet((__arg0, __arg1))
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/selectors.py", line 165, in await
result = yield self.gets
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/selectors.py", line 623, in native_engine_generator_send
res = rule.send(arg) if err is None else rule.throw(throw or err)
File "/home/jsirois/dev/pantsbuild/pants/src/python/pants/engine/internals/synthetic_targets.py", line 314, in all_synthetic_targets
spec_pat…
pantsbuild/pants
quaint-telephone-89068
02/15/2023, 2:14 AM
quaint-telephone-89068
02/15/2023, 7:00 AM
When locking a direct-URL requirement like pip @ https://github.com/pypa/pip/archive/22.0.2.zip, Pex fails with a very opaque error from an uncaught `ValueError`: not enough values to unpack.
Potentially I'm doing something that's not supported. If so, I'd be hoping for a more explanatory error message than the raw exception.
Workaround
For a GitHub archive URL like that, it seems like a git requirement like pip @ git+https://github.com/pypa/pip@22.0.2 works instead.
Reproducer
pex3 --version # 2.1.122
PEX_VERBOSE=1 pex3 -v lock create 'pip @ https://github.com/pypa/pip/archive/22.0.2.zip' --output=lock.json
(That requirement spec is taken from https://pip.pypa.io/en/stable/reference/requirement-specifiers/#examples, and indeed pip install 'pip @ https://github.com/pypa/pip/archive/22.0.2.zip' seems to work.)
Output (running under 3.9, on macOS):
pex: Resolving for:
pex: Hashing pex: 82.0ms
pex: Isolating pex: 0.0ms
Traceback (most recent call last):
File "/Users/huon/.pyenv/versions/3.9.10/lib/python3.9/site-packages/pex/result.py", line 105, in catch
return func(*args, **kwargs)
File "/Users/huon/.pyenv/versions/3.9.10/lib/python3.9/site-packages/pex/cli/command.py", line 84, in run
return subcommand_func(self)
File "/Users/huon/.pyenv/versions/3.9.10/lib/python3.9/site-packages/pex/cli/commands/lock.py", line 448, in _create
create(
File "/Users/huon/.pyenv/versions/3.9.10/lib/python3.9/site-packages/pex/resolve/lockfile/create.py", line 378, in create
downloaded = resolver.download(
File "/Users/huon/.pyenv/versions/3.9.10/lib/python3.9/site-packages/pex/resolver.py", line 1232, in download
build_requests, download_results = _download_internal(
File "/Users/huon/.pyenv/versions/3.9.10/lib/python3.9/site-packages/pex/resolver.py", line 1117, in _download_internal
download_results = download_request.download_distributions(
File "/Users/huon/.pyenv/versions/3.9.10/lib/python3.9/site-packages/pex/resolver.py", line 111, in download_distributions
return list(
File "/Users/huon/.pyenv/versions/3.9.10/lib/python3.9/site-packages/pex/jobs.py", line 586, in execute_parallel
yield spawn_result.spawned_job.await_result()
File "/Users/huon/.pyenv/versions/3.9.10/lib/python3.9/site-packages/pex/jobs.py", line 222, in await_result
job.wait()
File "/Users/huon/.pyenv/versions/3.9.10/lib/python3.9/site-packages/pex/jobs.py", line 81, in wait
self._check_returncode(stderr)
File "/Users/huon/.pyenv/versions/3.9.10/lib/python3.9/site-packages/pex/pip/log_analyzer.py", line 104, in _check_returncode
result = analyzer.analyze(line)
File "/Users/huon/.pyenv/versions/3.9.10/lib/python3.9/site-packages/pex/resolve/locker.py", line 412, in analyze
project_name_and_version, partial_artifact = self._extract_resolve_data(url)
File "/Users/huon/.pyenv/versions/3.9.10/lib/python3.9/site-packages/pex/resolve/locker.py", line 264, in _extract_resolve_data
ProjectNameAndVersion.from_filename(unquote(urlparse.urlparse(url).path))
File "/Users/huon/.pyenv/versions/3.9.10/lib/python3.9/site-packages/pex/dist_metadata.py", line 294, in from_filename
project_name, version = fname.rsplit("-", 1)
ValueError: not enough values to unpack (expected 2, got 1)
not enough values to unpack (expected 2, got 1)
• The default output (without PEX_VERBOSE) is just the exception message not enough values to unpack (expected 2, got 1).
• Running under other Python versions (e.g. 3.10) and other systems (e.g. a Linux docker image) gives similar output.
pantsbuild/pex
quaint-telephone-89068
02/15/2023, 4:05 PM
The repository has multiple python_distribution targets. Each python_distribution has its own README.md, which gets included in the generated sdist but not the generated wheel - which is as things should be. The issue comes from Pants going further than just adding the wheel as a dependency. Instead it goes the extra mile to add in all sources depended on by the python_distribution that are not included by the wheel itself:
pants/src/python/pants/backend/python/util_rules/pex_from_targets.py
Lines 495 to 507 in pantsbuild/pants@9f2acf3
This has ill-effects in this case since there are two README.md files, each with different content:
$ pants package cmd:main
08:06:34.40 [ERROR] 1 Exception encountered:
Engine traceback:
in `package` goal
Exception: Can only merge Directories with no duplicates, but found 2 duplicate entries in :
`README.md`: 1.) file digest=96d7f2c61ad10f92e449136b94433c482ad88ef52f8d9513dd1426c4fb3ea17e size=44:
# B's README
Dummy placeholder lorem ipsum
`README.md`: 2.) file digest=028373478b36dc388b32c231aacb2962b6530618c38ae32af9aad1e7ace29323 size=44:
# A's README
Dummy placeholder lorem ipsum
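A hypothetical minimal layout that would produce this kind of collision (paths, names, and the resource wiring are invented for illustration, not taken from the affected repo):
# a/BUILD
resource(name="readme", source="README.md")
python_distribution(
    name="dist",
    dependencies=[":readme"],
    provides=python_artifact(name="pkg-a", version="0.0.0"),
)
# b/BUILD mirrors this with its own, different README.md; a pex_binary at cmd:main that
# depends on both distributions then hits the duplicate-entry merge error above.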
pantsbuild/pants
quaint-telephone-89068
02/15/2023, 4:20 PM
requirements.pex due to missing site-packages #2066
☐ Add support for Pip 23.0.1. #2072
pantsbuild/pex
quaint-telephone-89068
02/15/2023, 6:17 PM
$ python -mpex.cli lock create lexid --style universal --resolver-version pip-2020-resolver --interpreter-constraint ">=3.7,<4"
pid 24464 -> /home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/bin/python -sE /home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/pex --disable-pip-version-check --no-python-version-warning --exists-action a --no-input --isolated -q --cache-dir /home/jsirois/.pex/pip_cache --log /tmp/pex-pip-log.n0c8ko88/pip.log download --dest /tmp/tmp5ga3zanm/usr.bin.python3.10 lexid --index-url <https://pypi.org/simple> --retries 5 --timeout 15 exited with 2 and STDERR:
>>> pyversion: pypy27
>>> pyversions: ['py38', 'pypy27']
>>> self: <pip._internal.models.wheel.Wheel object at 0x7f7917559de0>
ERROR: Exception:
Traceback (most recent call last):
File "/home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/lib/python3.10/site-packages/pip/_vendor/resolvelib/resolvers.py", line 171, in _merge_into_criterion
crit = self.state.criteria[name]
KeyError: 'lexid'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/lib/python3.10/site-packages/pip/_internal/cli/base_command.py", line 223, in _main
status = self.run(options, args)
File "/home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/lib/python3.10/site-packages/pip/_internal/cli/req_command.py", line 180, in wrapper
return func(self, options, args)
File "/home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/lib/python3.10/site-packages/pip/_internal/commands/download.py", line 130, in run
requirement_set = resolver.resolve(
File "/home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/lib/python3.10/site-packages/pip/_internal/resolution/resolvelib/resolver.py", line 121, in resolve
self._result = resolver.resolve(
File "/home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/lib/python3.10/site-packages/pip/_vendor/resolvelib/resolvers.py", line 453, in resolve
state = resolution.resolve(requirements, max_rounds=max_rounds)
File "/home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/lib/python3.10/site-packages/pip/_vendor/resolvelib/resolvers.py", line 318, in resolve
name, crit = self._merge_into_criterion(r, parent=None)
File "/home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/lib/python3.10/site-packages/pip/_vendor/resolvelib/resolvers.py", line 173, in _merge_into_criterion
crit = Criterion.from_requirement(self._p, requirement, parent)
File "/home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/lib/python3.10/site-packages/pip/_vendor/resolvelib/resolvers.py", line 82, in from_requirement
if not cands:
File "/home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/lib/python3.10/site-packages/pip/_vendor/resolvelib/structs.py", line 124, in __bool__
return bool(self._sequence)
File "/home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/lib/python3.10/site-packages/pip/_internal/resolution/resolvelib/found_candidates.py", line 99, in __bool__
return any(self)
File "/home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/lib/python3.10/site-packages/pip/_internal/resolution/resolvelib/factory.py", line 220, in iter_index_candidates
result = self._finder.find_best_candidate(
File "/home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/lib/python3.10/site-packages/pip/_internal/index/package_finder.py", line 882, in find_best_candidate
candidates = self.find_all_candidates(project_name)
File "/home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/lib/python3.10/site-packages/pip/_internal/index/package_finder.py", line 825, in find_all_candidates
package_links = self.process_project_url(
File "/home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/lib/python3.10/site-packages/pip/_internal/index/package_finder.py", line 796, in process_project_url
package_links = self.evaluate_links(
File "/home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/lib/python3.10/site-packages/pip/_internal/index/package_finder.py", line 778, in evaluate_links
candidate = self.get_install_candidate(link_evaluator, link)
File "/home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/lib/python3.10/site-packages/pip/_internal/index/package_finder.py", line 757, in get_install_candidate
is_candidate, result = link_evaluator.evaluate_link(link)
File "/home/jsirois/.pex/venvs/2e77638397a501cfdb2b7c33808a3a7f5fe57413/5985ed09b49a653d6596b0e14d134c5456cf1a9f/lib/python3.10/site-packages/pip/_internal/index/package_finder.py", line 197, in evaluate_link
if not wheel.supported(supported_tags):
File "/tmp/tmplos6r6ee/_pex_pip_patches.py", line 158, in <lambda>
Wheel.supported = lambda *args, **kwargs: all(
File "/tmp/tmplos6r6ee/_pex_pip_patches.py", line 159, in <genexpr>
check(*args, **kwargs) for check in supported_checks
File "/tmp/tmplos6r6ee/_pex_pip_patches.py", line 127, in supported_version
major = int(match.group("major"))
AttributeError: 'NoneType' object has no attribute 'group'
The issue here is that the two oldest releases of lexid have wheels with pypy27 pyver tags. For example: https://pypi.org/project/lexid/2020.1003/#files which has https://files.pythonhosted.org/packages/d0/29/c6ca27464a82787ffd9be6f19be6c2715cc660957a39ec689bc1654fd89d/lexid-2020.1003-py38.pypy27-none-any.whl.
This patching code here is both incorrect and unhelpful when it is incorrect:
pex/pex/resolve/locker_patches.py
Lines 114 to 122 in pantsbuild/pex@06570eb
pantsbuild/pex
quaint-telephone-89068
02/15/2023, 9:07 PM
I set the url_template for each tool to use a file:// URL. The URL-paths I ended up using in the image were "simplified" from the defaults - for example, I have:
[shfmt]
url_template = "file:///opt/pants-tools/shfmt/{version}/shfmt"
When CI runs with this config, it fails with:
Error launching process: Os { code: 2, kind: NotFound, message: "No such file or directory" }
I `ssh`'d into one of the executors that hit this failure, and looked inside the failing sandbox. There I saw:
1. The shfmt binary was in the sandbox, and runnable
2. According to __run.sh, Pants was trying to invoke ./shfmt_v3.2.4_linux_amd64 instead of plain ./shfmt
I believe this is happening because the shfmt subsystem defines generate_exe to hard-code the same naming pattern as is used in the default url_template:
pants/src/python/pants/backend/shell/lint/shfmt/subsystem.py
Lines 56 to 58 in pantsbuild/pants@ac9e27b
I think things would operate as expected if we deleted that generate_exe override, since the shfmt download is the executable itself.
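For illustration, the hard-coded naming the override reproduces is essentially this (a paraphrase of the pattern, not a verbatim copy of the linked lines):
def shfmt_exe_name(version: str, plat_str: str) -> str:
    # Mirrors the default url_template naming, e.g. "./shfmt_v3.2.4_linux_amd64",
    # which no longer matches when a custom url_template downloads a file named just "shfmt".
    return f"./shfmt_{version}_{plat_str}"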
Pants version
2.15.0rc4
OS
Observed on Linux
Additional info
https://app.toolchain.com/organizations/color/repos/color/builds/pants_run_2023_02_15_12_48_26_897_660d20c55cc041fbb63374c79a4402b0/
pantsbuild/pants
quaint-telephone-89068
02/16/2023, 6:49 AM
• git clone https://github.com/lablup/backend.ai
• cd backend.ai
• ./scripts/install-dev.sh
It was successfully installed.
After the installation completes, delete Backend.AI and reinstall it:
• ./scripts/delete-dev.sh
• rm -r .tmp .pants.d .pants.env pants-local ~/.cache/pants
• ./scripts/install-dev.sh
• the error shows up
[INFO] Installing Python...
✓ Python 3.9.16 as the Pants runtime is already installed.
✓ Python 3.10.9 as the Backend.AI runtime is already installed.
[INFO] Checking Python features...
SSL support: ok
LZMA support: ok
[INFO] Bootstrapping the Pants build system...
Chosen Python 3.9.16 (from pyenv) as the local Pants interpreter
2.16.0.dev5
[INFO] Using the current working-copy directory as the installation path...
[INFO] Creating the unified virtualenv for IDEs...
14:42:06.06 [INFO] Completed: Build pex for resolve `python-kernel`
14:42:06.06 [INFO] Completed: Build pex for resolve `python-default`
14:42:06.06 [ERROR] 1 Exception encountered:
Engine traceback:
in `export` goal
IntrinsicError: No such file or directory (os error 2)
When I add the --print-stacktrace option, the reinstall completes successfully, with a change such as:
diff --git a/scripts/install-dev.sh b/scripts/install-dev.sh
index b8f3b489..a4282552 100755
--- a/scripts/install-dev.sh
+++ b/scripts/install-dev.sh
@@ -645,7 +645,7 @@ setup_environment() {
show_info "Using the current working-copy directory as the installation path..."
show_info "Creating the unified virtualenv for IDEs..."
- $PANTS export \
+ $PANTS --print-stacktrace export \
--resolve=python-default \
--resolve=python-kernel \
--resolve=pants-plugins \
I don't understand why the Backend.AI installation succeeds only after adding the additional debug argument '--print-stacktrace' to 'install-dev.sh' when reinstalling after removing Backend.AI.
After the reinstall failed, I ran ./pants export.
https://user-images.githubusercontent.com/20002/218385067-2b5b8e58-2377-4fb0-9894-89e3a3632c61.png