lively-gpu-26436
11/26/2023, 9:10 PM
abundant-tent-27407
11/27/2023, 2:54 PM
lively-gpu-26436
11/27/2023, 5:38 PM
With this in pants.toml, it works fine:
[python.resolves_to_only_binary]
python-default = ["ruff"]
ancient-lawyer-12557
11/27/2023, 5:49 PM
generate-lockfiles
Hello! I'm trying to add boto3 to a project set up with pants 2.18.0 - this is a PEX binary. I'm running on ARM macOS Sonoma (14), with python 3.11.6 set via asdf.
17:27:06.55 [INFO] Completed: Building 1 requirement for runner.pex from the requirements.lock resolve: boto3==1.29.7
17:27:06.55 [ERROR] 1 Exception encountered:
Engine traceback:
in `run` goal - environment:mac
ProcessExecutionFailure: Process 'Building 1 requirement for runner.pex from the requirements.lock resolve: boto3==1.29.7' failed with exit code 1.
stdout:
stderr:
Failed to resolve compatible artifacts from lock requirements.lock for 1 target:
1. /Users/raph/.asdf/installs/python/3.11.6/bin/python3.11:
Failed to resolve all requirements for cp311-cp311-macosx_14_0_arm64 interpreter at /Users/raph/.asdf/installs/python/3.11.6/bin/python3.11 from requirements.lock:
Configured with:
build: True
use_wheel: True
Dependency on boto3 (via: boto3==1.29.7) not satisfied, no candidates found.
My pants.toml contains this interpreter constraint:
interpreter_constraints = ["CPython==3.11.*"]
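In case it's relevant, the resolve wiring looks roughly like this (the resolve name here is my assumption; requirements.lock is the lockfile from the error above):
[python]
enable_resolves = true

[python.resolves]
python-default = "requirements.lock"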
What confuses me is that there seems to be a wheel on PyPI. Shouldn't it be able to use that wheel on my platform? Any ideas why this might be failing?
rich-london-74860
11/28/2023, 3:04 AM
Suppose there are these files in the directory src:
src/BUILD
foo.py
bar.py
and inside BUILD I have the target generator
python_sources(
name="lib",
sources=["*.py"]
)
In this case, src:lib depends on src/foo.py:lib and src/bar.py:lib.
Intuitively, it makes sense that a generator has a dependency on the individual targets. A change to any individual target means a change within the generator. However, there are undesirable consequences.
Extending that example, suppose there are also these files in the directory `test`:
test/BUILD
test_foo.py
test_bar.py
and inside BUILD I have the target generator
python_tests(
name="test_lib",
sources=["test_*.py"]
)
Like before, test:test_lib depends on test/test_foo.py:test_lib and test/test_bar.py:test_lib.
Now suppose test/test_foo.py:test_lib exclusively depends on src/foo.py, test/test_bar.py:test_lib exclusively depends on src/bar.py, and neither src/foo.py depends on src/bar.py nor the reverse. If I modify src/foo.py on HEAD, and only that file, then I would like to run all tests dependent on src/foo.py and all tests dependent on any dependents of src/foo.py, at any depth. It seems like the right choice for that should be pants test --changed-since=HEAD --changed-dependents=transitive
but as I noted earlier:
• test/test_foo.py:test_lib is dependent on src/foo.py - this is correct, we want to run this test
• test:test_lib depends on test/test_foo.py:test_lib
• Running test on test:test_lib also runs test/test_bar.py:test_lib - this is incorrect
In other words, when using pants test --changed-since=HEAD --changed-dependents=transitive, tests will always run at the granularity of the python_tests target generator.
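Concretely, what I observe versus what I expect (a sketch of the run):
pants test --changed-since=HEAD --changed-dependents=transitive
# actually runs: test/test_foo.py:test_lib and test/test_bar.py:test_lib (pulled in via test:test_lib)
# what I want:   test/test_foo.py:test_lib only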
Is there a way to make this more granular?
Is there a way to exclude generators when recursively finding transitive dependencies?
Is there something wrong with how I have my unit tests set up?
lively-gpu-26436
11/28/2023, 1:10 PM
ripe-battery-94204
11/28/2023, 1:18 PM
When I run pants fmt :: it says the following:
14:12:16.85 [INFO] Completed: Format with Black - black made no changes.
14:12:16.85 [INFO] Completed: Format with isort - isort made no changes.
14:12:16.85 [INFO] Completed: Format with Black - black made no changes.
14:12:16.85 [INFO] Completed: Format with Black - black made no changes.
14:12:16.85 [INFO] Completed: Format with isort - isort made no changes.
✓ black made no changes.
✓ isort made no changes.
However when I then run pants lint ::, isort gives me an error message (or a warning):
14:12:22.84 [INFO] Completed: Format with Black - black made no changes.
14:12:22.85 [INFO] Completed: Format with Black - black made no changes.
14:12:22.86 [WARN] Completed: Format with isort - isort made changes.
paths/to/files.py
paths/to/files2.py
paths/to/files3.py
…
14:12:22.86 [INFO] Completed: Lint with Flake8 - flake8 succeeded.
Partition: ['CPython==3.10.*']
14:12:22.86 [INFO] Completed: Lint with Flake8 - flake8 succeeded.
Partition: ['CPython==3.10.*']
✓ black succeeded.
✓ flake8 succeeded.
✕ isort failed.
My `mypy.ini`:
[mypy]
warn_return_any = False
warn_unused_configs = False
packages=package1, package2
# Sentence splitter is untyped, ignore imports
[mypy-sentence_splitter.*]
ignore_missing_imports = True
[mypy-scipy.*]
ignore_missing_imports = True
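(For completeness, in case it turns out to be a black/isort profile mismatch, the isort config I'd assume in pyproject.toml is just:)
[tool.isort]
profile = "black"
line_length = 88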
Does anyone know where this discrepancy could come from? I'd like to include pants lint in CI but this stops me from using it. Any help is greatly appreciated!
swift-river-73520
11/28/2023, 4:49 PM
IntrinsicError: Failed to pull Docker image `sha256:f84b1f824c06320292fe5a0049a13c3b90e72c47bd4a9518e661ac93fef30a0f`: DockerResponseServerError { status_code: 404, message: "pull access denied for sha256, repository does not exist or may require 'docker login': denied: requested access to the resource is denied" }
Here's my `docker_environment`:
docker_environment(
name="container_py",
platform="linux_x86_64",
image="python:3.9",
python_bootstrap_search_path = ['<PATH>']
)
I'm able to pull python:3.9 x86_64 manually like this:
docker pull python:3.9 --platform x86_64
Any ideas?
numerous-apartment-3919
11/28/2023, 5:19 PM
lively-gpu-26436
11/28/2023, 6:26 PM
pip_version = "20.3.4-patched" in pants.toml
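i.e. the full stanza would be something like:
[python]
pip_version = "20.3.4-patched"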
https://pantsbuild.slack.com/archives/C046T6T9U/p1701195732423829?thread_ts=1701106692.980099&cid=C046T6T9U
powerful-scooter-95162
11/28/2023, 8:16 PM
little-pilot-11155
11/29/2023, 11:26 AM
PATH not being populated on remote exec requests, as well as download_python_binary not working, in different versions of Pants. Since 2.17.1, I cannot get Remote Execution working.
Hey Pants community 🙂
I am having a few issues with Pants and the Remote Execution feature (through BuildBarn Remote Execution) when upgrading from 2.14.0 to 2.17.1 (I actually tried a few intermediate versions in between). I am wondering if anyone has had similar issues with Remote Execution. The issues that I faced:
• In 2.15.x, global remote execution mode is broken, with PATH not being set when a remote exec request is sent, resulting in a Cannot find executable "env" in search paths "" error (related to #17262)
  ◦ I believe this could be related to this call not setting environment
  ◦ I can enforce PATH on a worker, but I believe this should be set from the client
  ◦ I also recall issue #19375, which was adding a system-binaries subsystem that might address this, but I couldn't see it released
• Since 2.16.0, having a single remote_environment and setting remote_execution = true in pants.toml doesn't result in targets running remotely. It can be enabled by adding __defaults__(all=dict(environment="remote")) to the root BUILD file and giving the environment fallback=local
  ◦ If I understand correctly, this is unnecessary, as I should be able to "globally" turn remote exec on and off through remote_execution; at least that was the behavior in 2.15.2
• Since 2.17.0, the download_python_binary function fails with cp: cannot create directory '.python-build-standalone/c12164f0e9228ec20704c1aba97eb31b8e2a482d41943d541cc8e3a9e84f7349': No such file or directory
  ◦ I had a look around the problematic code and it looks like the "installation_root" directory was never created, but it could be that the digest is incorrect
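For reference, this is the shape of the setup I'm currently using to force things remote (addresses and target names are placeholders):
pants.toml:
[GLOBAL]
remote_execution = true
remote_store_address = "grpc://buildbarn.example:8980"
remote_execution_address = "grpc://buildbarn.example:8980"

[environments-preview.names]
local = "//:local_env"
remote = "//:remote_env"

root BUILD file:
local_environment(name="local_env")
remote_environment(
    name="remote_env",
    fallback_environment="local",
)
__defaults__(all=dict(environment="remote"))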
I created this repository to replicate all the above issues. There is a chance that my config is missing a setting where I can set the remote environment variables. I will see if I can patch the mentioned code and run it again but I would love to hear your thoughts 🙂
nutritious-sunset-29053
11/29/2023, 1:27 PM
When I run pants export, it exports a venv plus all my tools too. Is there a way to choose which tools get exported? Or rather, to not export the tools and only export a venv with all 3rd party dependencies.
ripe-gigabyte-88964
11/29/2023, 5:25 PM
pants.toml? Ideally as JSON.
silly-queen-7197
11/29/2023, 5:26 PM
When I run pants list --changed-since=origin/main, it seems to list all of our 3rd party dependencies
...
//:reqs#tensorflow
//:reqs#tensorflow-io
//:reqs#typer
...
which is causing our CI to build a lot of unnecessary docker images. Is this expected behavior? Scanning through the git diff, I don't see things like tensorflow anywhere.
fresh-continent-76371
11/30/2023, 2:06 AM
We have added a second pyproject.toml project in our repo (alongside other things: docker, the original pyproject.toml project, etc.).
The two python projects both use numpy, and their pyproject.toml files pin the same version.
Both BUILD files load the requirements from poetry:
poetry_requirements(
name = "reqs",
resolve = "python-default",
source = "pyproject.toml",
)
and both use the same resolve.
I am now seeing that pants complains about being unable to resolve numpy in both projects. One is noisier than the other; numpy is the only library used by both.
It seems wrong to need to explicitly declare in each project which of the pair to use:
# in projecta's BUILD
dependencies = [ "lib/projecta:reqs#numpy" ]
# in projectb's BUILD
dependencies = [ "lib/projectb:reqs#numpy" ]
Should I be using a separate resolve for each project?
victorious-zebra-49010
11/30/2023, 3:10 AM
[tool.poetry]
name = "main"
version = "0.1.0"
description = ""
[tool.poetry.dependencies]
python = "^3.11,<3.13"
urllib3 = "^2.0.6"
pandas = "2.1.3"
boto3 = "1.33.2"
loguru = "0.7.2"
psycopg2-binary = "2.9.9"
pydantic = "2.5.2"
s3transfer = "0.8.1"
setuptools = "68.2.2"
tqdm = "4.66.1"
pre-commit = "3.5.0"
python-dotenv = "1.0.0"
sqlalchemy = "2.0.23"
datadog = "0.47.0"
fastapi = "0.104.1"
gunicorn = "21.2.0"
uvicorn = "0.24.0.post1"
starlette = "0.27.0"
typing-extensions = "4.8.0"
furl = "2.1.3"
ddtrace = "2.3.1"
python-json-logger = "2.0.7"
pgvector = "0.2.4"
sentence-transformers = "2.2.2"
spacy-legacy = "3.0.12"
tokenizers = "0.15.0"
transformers = "4.35.2"
tiktoken = "0.5.1"
openai = "0.28.1"
langchain = "0.0.343"
numpy = "1.26.2"
unicodedata2 = "15.1.0"
anthropic = "0.7.7"
toolz = "0.11.1"
[tool.poetry.group.dev.dependencies]
pytest = "7.4.3"
pytest-mock = "3.12.0"
gitpython = "3.1.40"
ipdb = "0.13.13"
httpx = "0.25.2"
docker = "6.1.3"
[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
[tool.isort]
profile = "black"
line_length = 88
[tool.black]
line-length = 88
some file sizes:
➜ du -h -d 1 ~/.cache/pants/named_caches/pex_root/
3.7G /Users/andrea.hutchinson/.cache/pants/named_caches/pex_root//installed_wheels
18M /Users/andrea.hutchinson/.cache/pants/named_caches/pex_root//bootstraps
1.7M /Users/andrea.hutchinson/.cache/pants/named_caches/pex_root//interpreters
32K /Users/andrea.hutchinson/.cache/pants/named_caches/pex_root//user_code
164K /Users/andrea.hutchinson/.cache/pants/named_caches/pex_root//built_wheels
1.9M /Users/andrea.hutchinson/.cache/pants/named_caches/pex_root//unzipped_pexes
745M /Users/andrea.hutchinson/.cache/pants/named_caches/pex_root//venvs
4.7M /Users/andrea.hutchinson/.cache/pants/named_caches/pex_root//pip-20.3.4-patched.pex
4.7M /Users/andrea.hutchinson/.cache/pants/named_caches/pex_root//pip-23.1.2.pex
2.6G /Users/andrea.hutchinson/.cache/pants/named_caches/pex_root//pip_cache
4.7M /Users/andrea.hutchinson/.cache/pants/named_caches/pex_root//pip-23.0.1.pex
2.6G /Users/andrea.hutchinson/.cache/pants/named_caches/pex_root//downloads
433M /Users/andrea.hutchinson/.cache/pants/named_caches/pex_root//installed_wheel_zips
2.8M /Users/andrea.hutchinson/.cache/pants/named_caches/pex_root//isolated
1.7M /Users/andrea.hutchinson/.cache/pants/named_caches/pex_root//bootstrap_zips
10G /Users/andrea.hutchinson/.cache/pants/named_caches/pex_root/
➜ du -h -d 1 ~/.pex
850M /Users/andrea.hutchinson/.pex/installed_wheels
13M /Users/andrea.hutchinson/.pex/bootstraps
24K /Users/andrea.hutchinson/.pex/interpreters
6.4M /Users/andrea.hutchinson/.pex/user_code
1.9M /Users/andrea.hutchinson/.pex/unzipped_pexes
1.9M /Users/andrea.hutchinson/.pex/isolated
873M /Users/andrea.hutchinson/.pex
➜ du -sh dist/export/python/virtualenvs/main_resolver
989M dist/export/python/virtualenvs/main_resolver
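(If "busting caches" just means deleting these directories, I assume it would be something like the following, run with pantsd stopped:)
rm -rf ~/.cache/pants/named_caches/pex_root
rm -rf ~/.pex
rm -rf dist/export/python/virtualenvs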
any recommendations on what can be done to speed up our workflow? does it make sense to bust caches? i see that https://pantsbuild.slack.com/archives/C046T6T9U/p1686754599977459 feels related
quiet-dentist-42775
11/30/2023, 10:01 AM
1. dependencies = [
":auth_requirements",
2. dependencies = [
":platform_requirements",
2. The 2nd error - I don't understand why I am getting it, but when I mention any single module ('fastapi', 'flask', ...) I don't get it.
For example, creating a single python_requirement:
python_requirement(
name = "requirements",
requirements = [
"python-dotenv",
"fastapi",
"uvicorn",
"pydantic",
],
)
and then using this everywhere solves both my problems, like:
python_sources(
name = "auth_main",
dependencies = [
":requirements",
],
)
python_sources(
name = "platform_main",
dependencies = [
"apps/auth:requirements",
],
)
but I want each BUILD file to only take those python modules which are mentioned in that same BUILD file, as in the sketch above. How do I do that? Can anyone help me out here...
brash-glass-61350
11/30/2023, 5:53 PM
For torch, I get some linux_x86_64.whl wheels, which don't work for my colleagues' machines when they are running on docker, which expects manylinux_2_31_aarch64.whl wheels. What can we do?
white-appointment-6770
11/30/2023, 6:04 PM
purple-plastic-57801
12/01/2023, 3:49 AM
adhoc_tool(
name="sphinx-build",
runnable="//third_party/sphinx:build",
args=["-M", "html", "docs/", "_build"],
execution_dependencies=[
":sphinx-apidoc",
"//enclosure/docs",
"//enclosure/docs:index",
"//enclosure/enclosure",
],
output_directories=["_build"],
root_output_directory="./_build",
log_output=True,
)
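As a point of reference, the full set I have in mind is whatever the recursive wildcard matches:
pants list enclosure/enclosure::    # every target under enclosure/enclosure, recursively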
But the //enclosure/enclosure dependency only covers that level; is there a way I can say "give me all dependencies under //enclosure/enclosure" without specifying each subdir?
refined-hydrogen-47967
12/01/2023, 8:05 AM
Exception in thread "main" java.lang.Exception: Error while getting https://github.com/coursier/jvm-index/raw/master/index.json: download error: Caught java.net.ConnectException (Connection timed out) while downloading https://github.com/coursier/jvm-index/raw/master/index.json
I've tried setting coursier-url-template but it only seems to work for the initial coursier binary download, and the advice at https://get-coursier.io/docs/other-proxy doesn't seem to work when coursier is being run by pants.
adamant-wolf-51914
12/01/2023, 10:19 AM
I need to add a .so file as a dependency for one of the python_sources. The closest I could find is https://www.pantsbuild.org/docs/adhoc-tool#using-externally-managed-tools `system_binary`, but it seems it will only work with adhoc_tool. Is there a way to do that? The binary currently lives in the user's `.pyenv` and should be the same for everyone. Any hint would be appreciated!
breezy-apple-27122
12/01/2023, 10:47 AM
Is there a way to use --cache-from in a docker_image when using pants package?
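(i.e. the equivalent of this plain docker invocation, where the image name is just an example:)
docker build --cache-from registry.example.com/myapp:latest .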
Reference: https://docs.docker.com/engine/reference/commandline/build/#cache-from
hundreds-carpet-28072
12/01/2023, 11:41 AM
10:58:41.40 [INFO] Initializing scheduler...
Traceback (most recent call last):
File "/home/dev/.cache/nce/3d6643e46b53e4cc0b2a0d5c768866226ddce3de1f57f80c4a02d8d39800fa8e/bindings/venvs/2.16.0/bin/pants", line 8, in <module>
sys.exit(main())
File "/home/dev/.cache/nce/3d6643e46b53e4cc0b2a0d5c768866226ddce3de1f57f80c4a02d8d39800fa8e/bindings/venvs/2.16.0/lib/python3.9/site-packages/pants/bin/pants_loader.py", line 123, in main
PantsLoader.main()
File "/home/dev/.cache/nce/3d6643e46b53e4cc0b2a0d5c768866226ddce3de1f57f80c4a02d8d39800fa8e/bindings/venvs/2.16.0/lib/python3.9/site-packages/pants/bin/pants_loader.py", line 110, in main
cls.run_default_entrypoint()
File "/home/dev/.cache/nce/3d6643e46b53e4cc0b2a0d5c768866226ddce3de1f57f80c4a02d8d39800fa8e/bindings/venvs/2.16.0/lib/python3.9/site-packages/pants/bin/pants_loader.py", line 92, in run_default_entrypoint
exit_code = runner.run(start_time)
File "/home/dev/.cache/nce/3d6643e46b53e4cc0b2a0d5c768866226ddce3de1f57f80c4a02d8d39800fa8e/bindings/venvs/2.16.0/lib/python3.9/site-packages/pants/bin/pants_runner.py", line 89, in run
return remote_runner.run(start_time)
File "/home/dev/.cache/nce/3d6643e46b53e4cc0b2a0d5c768866226ddce3de1f57f80c4a02d8d39800fa8e/bindings/venvs/2.16.0/lib/python3.9/site-packages/pants/bin/remote_pants_runner.py", line 123, in run
return self._connect_and_execute(pantsd_handle, executor, start_time)
File "/home/dev/.cache/nce/3d6643e46b53e4cc0b2a0d5c768866226ddce3de1f57f80c4a02d8d39800fa8e/bindings/venvs/2.16.0/lib/python3.9/site-packages/pants/bin/remote_pants_runner.py", line 165, in _connect_and_execute
return PyNailgunClient(port, executor).execute(command, args, modified_env)
native_engine.PantsdClientException: The pantsd process was killed during the run.
If this was not intentionally done by you, Pants may have been killed by the operating system due to memory overconsumption (i.e. OOM-killed). If you keep seeing this error message, try the troubleshooting steps below.
...
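If it is the OOM killer, I assume the relevant knob is the pantsd memory limit in pants.toml, something like:
[GLOBAL]
pantsd_max_memory_usage = "4GiB"  # or a plain byte count, depending on the Pants version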
lively-garden-66504
12/01/2023, 9:11 PM
purple-plastic-57801
12/01/2023, 10:16 PM
I'm trying to get experimental_wrap_as_python_sources to work. I have an adhoc tool that is outputting dist/codegen/blah, where blah is a python package.
Blah is generated with the openapi-generator. The code looks good.
I have set the experimental_wrap_as_python_sources target as a dependency of the python_sources, but I'm seeing:
NoSourceRootError: No source root found for blah
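My current guess is that it's a source-roots problem; this is what I was going to try in pants.toml (the root pattern is a guess on my part):
[source]
root_patterns = ["/", "dist/codegen"]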
Any advice to debug this?
wide-midnight-78598
12/02/2023, 12:12 AM
Is there a way to make pants aware of my aliases? E.g. I alias docker to podman, but pants doesn't pick that up natively.
purple-plastic-57801
12/02/2023, 1:41 AM
manual, and they never run unless explicitly invoked directly, or indirectly through a dependee.
Can I do something similar in pants?
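What I was imagining is tagging the target and then excluding that tag, something like this (no idea if that's the intended approach):
# BUILD
adhoc_tool(
    name="sphinx-build",
    # ... existing fields ...
    tags=["manual"],
)
# then on the command line:
pants --tag=-manual export-codegen ::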
In particular, I have a sphinx-build target that I never want to run in export-codegen.
purple-plastic-57801
12/02/2023, 3:43 AM
file(
name="downloaded-sass",
source=per_platform(
macos_x86_64=http_source(
url="<https://github.com/sass/dart-sass/releases/download/1.69.5/dart-sass-1.69.5-macos-x64.tar.gz>",
len=3457742,
sha256="75e29a5bd76069abf2532c9a4b36b164f1c91a461452f0fbdc4167fd1685550c",
filename="sass.tar.gz",
),
linux_x86_64=http_source(
url="<https://github.com/sass/dart-sass/releases/download/1.69.5/dart-sass-1.69.5-linux-x64.tar.gz>",
len=3697303,
sha256="42b3d7b82098432f80d057da5e7464e96e6cd8d90827a9e3a47974f39c930218",
filename="sass.tar.gz",
),
),
)
shell_command(
name="extracted-sass",
command="tar -zxvf sass.tar.gz",
tools=["tar", "gzip"],
execution_dependencies=[":downloaded-sass"],
output_directories=["dart-sass"],
root_output_directory="./dart-sass"
)
run_shell_command(
name="sass",
command="XDG_CONFIG_DIRS=${CHROOT}:$XDG_CONFIG_DIRS {chroot}/sass $@",
workdir="/",
execution_dependencies=[":extracted-sass"],
)
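For reference, I invoke it like this (the file paths are just an example):
pants run :sass -- styles/main.scss styles/main.css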