modern-wolf-36228
12/16/2021, 2:18 PM
secrets on docker_image() is not on 2.9.0.dev4?

best-florist-45041
12/16/2021, 6:25 PM
pants tailor will create both python_sources() and docker_image(), thus creating a name conflict if it sees a Dockerfile and Python code in the same location.
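(A minimal sketch, not from the thread, of the usual workaround: give each target an explicit name so both can sit next to the Dockerfile. The path and target names here are hypothetical.)
# src/app/BUILD -- hypothetical location and names
python_sources(name="lib")      # the Python code in this directory

docker_image(name="docker")     # builds the Dockerfile in this directory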
1) I have [python] interpreter_constraints = [">=3.9"], but somehow the pex_binary target, when placed inside the docker image, calls python3.8 (and fails, because I'm using a 3.9 image).
2) In my situation the entrypoint is defined through a third-party Helm chart, which needs to know about the target Python package to import. Hence I only need a Python env with my code in the image. I found that using PEX_TOOLS=1 ./archive.pex venv ... works perfectly with the corresponding include_tools option in pex_binary (sketched below). Except that https://pex.readthedocs.io/en/v2.1.57/recipes.html#pex-app-in-a-container refers to env vars like PEX_PYTHON=3.9, which appear to not do anything (e.g. to patch issue (1)).
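(A minimal sketch of the pex_binary referenced in point 2; the target name and entry point are assumptions. include_tools=True is what enables the PEX_TOOLS venv conversion inside the image.)
# BUILD -- hypothetical names
pex_binary(
    name="archive",
    entry_point="myapp.main",   # assumed entry point
    include_tools=True,         # allows: PEX_TOOLS=1 ./archive.pex venv <dir>
)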
brief-dinner-12483
12/16/2021, 8:15 PM
origin/master?

brief-dinner-12483
12/16/2021, 8:15 PM

happy-kitchen-89482
12/16/2021, 8:21 PM

best-florist-45041
12/17/2021, 1:04 AM
I have a 3rdparty/requirements.txt file with python_requirements() in the sister BUILD file.
So, for example:
❯ ./pants dependencies 3rdparty:boto3
3rdparty:requirements.txt
Now suppose I change the version of boto3 in requirements.txt.
./pants list --changed-since=HEAD --changed-dependees=transitive
will return, well, basically everything. On some level this makes sense (everything has a 3rdparty requirement), but how can I filter on just those files that are affected by the change to that single requirement?
echoing-farmer-15630
12/17/2021, 8:25 PM
I have this in pants.toml:
[docker]
env_vars=["DOCKER_BUILDKIT=1", "SSH_AUTH_SOCK"]
...so that I could get mount=type=ssh working for ssh authentication, and that's where we go a bit wrong.
Running under pants, Dockerfiles with RUN --mount=type=ssh don't seem to work, even though the generated __run.sh includes the required:
#!/bin/bash
# This command line should execute the same process as pants did internally.
export DOCKER_BUILDKIT=1 SSH_AUTH_SOCK=/tmp/path/to/ssh/agent/sock
cd /tmp/process-directory-tmp
/usr/bin/docker build -t $'my-tag' -f my-dockerfile .
If I cd to that directory and run __run.sh, everything builds. If I pants package my-docker-target, which is what generated that file, it doesn't build and gives an ssh key error when downloading custom requirements from a private GitHub repo.
Anything we can do about that? Am I missing a flag, or is this a problem with the docker backend?

polite-garden-50641
12/18/2021, 6:06 AM
~/.cache/pants/setup doesn't cache any external Pants plugins (global.plugins), and those get downloaded on every CI run. Any idea which directory should be cached in order to avoid the re-download on every CI run?
I found no reference to that in https://www.pantsbuild.org/docs/using-pants-in-ci (note that we use remote caching, so we nuke lmdb_store and named_caches before uploading the data to the CI cache).
polite-garden-50641
12/18/2021, 9:07 PM

fresh-cat-90827
12/19/2021, 11:03 AM

billions-spring-9652
12/19/2021, 4:44 PM
I set interpreter_constraints = ["CPython==3.10.*"].
I'm on Ubuntu 18.04 (bionic) and my python3.10 binary is from the deadsnakes PPA.
This is what I get:
ProcessExecutionFailure: Process 'Find interpreter for constraints: CPython==3.10.*' failed with exit code 102.
stdout:
stderr:
Could not find a compatible interpreter.
Examined the following interpreters:
1.) /usr/bin/python2.7 CPython==2.7.17
2.) /usr/bin/python3.6 CPython==3.6.9
3.) /usr/bin/python3.8 CPython==3.8.0
4.) /usr/bin/python3.9 CPython==3.9.9
(That 3.9.9 version is from deadsnakes as well. If I specify CPython==3.9.*, it's picked up and everything's fine).
So, is my python3.10 installation broken somehow, or not supported?

bitter-ability-32190
12/20/2021, 2:44 PM
1. First is what tailor would have me use: one BUILD file in each directory that matters.
◦ Pros: Granular. Pants internals kind of assume this pattern (so target names are sane, etc...).
◦ Cons: Still boilerplate-y (MUCH less so than, say, Bazel, but still N files which just repeat python_sources(), python_test_utils(), python_tests() over and over).
2. Second is one BUILD file in the "root" which has nested globs for each (e.g. sources=["**/*.py", "!**/conftest.py", ...], sketched below), and then I can override when necessary at the directory level (assuming that works 🤔).
◦ Pros: Minimal boilerplate.
◦ Cons: Less granular. Pants internals don't assume this (the target name of the generator is funny looking when printed).
Q: Is Pants smart enough for option 2 to not have double ownership? Can it be taught to be?
Q: What considerations should be made about cache invalidation? If I edit the top-level file in #2, do I bust any caching?
Q: For option 2, can I improve the names that get spit out? Does the "generators-are-not-targets" proposal help this case?
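(A minimal sketch of option 2, assuming a root-level BUILD file; the target names and test-file glob patterns are assumptions. Targets in a root-level BUILD must be named explicitly.)
# BUILD at the repo root -- hypothetical names and globs
python_sources(
    name="root",
    sources=["**/*.py", "!**/*_test.py", "!**/test_*.py", "!**/conftest.py"],
)

python_tests(
    name="root-tests",
    sources=["**/*_test.py", "**/test_*.py"],
)

python_test_utils(
    name="root-test-utils",
    sources=["**/conftest.py"],
)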
hundreds-salesmen-11510
12/20/2021, 5:41 PM

modern-wolf-36228
12/21/2021, 9:25 AM
--cache-from on the docker_image?

ambitious-petabyte-59095
12/21/2021, 3:10 PM
# src/python/projectA/BUILD
python_sources(
    dependencies=[
        "//:nltk",
    ]
)
python_distribution(
    name="wheel",
    dependencies=[":projectA"],
    provides=setup_py(
        name="projectA",
    ),
    wheel=True,
)
Can someone help point me in the right direction? Thanks so much.
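(For comparison, a minimal sketch of a python_distribution BUILD file; the explicit source target name and the version are assumptions, not the asker's actual setup.)
# src/python/projectA/BUILD -- a sketch under assumed names
python_sources(
    name="projectA",
    dependencies=["//:nltk"],
)

python_distribution(
    name="wheel",
    dependencies=[":projectA"],
    provides=setup_py(
        name="projectA",
        version="0.1.0",   # hypothetical; setup_py usually needs a version
    ),
    wheel=True,
)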
rhythmic-battery-45198
12/21/2021, 3:36 PM
I'd like to tag docker images with "$(git log -1 --date=unix --format=%cd)-$(git rev-parse --short HEAD)". I see there is a note about this requiring a custom plugin implementation (https://www.pantsbuild.org/docs/tagging-docker-images#tagging-images). Are there any existing examples or other places I should look before attempting to implement something? Thanks!
rhythmic-battery-45198
12/21/2021, 7:13 PM
I'm trying to use a .whl file which we have added to our repo because it is particularly complex to build from source. This works fine if I use an absolute path as the requirement, but I can not find a way to make it work with relative paths.
python_requirement(
    name="assimulo",
    requirements=[
        "assimulo@ file:////path/to/3rdparty/python/wheel/Assimulo-trunk-cp38-cp38-linux_x86_64.whl"
    ],
)
python_requirement(
    name="assimulo",
    requirements=[
        "assimulo@ wheel/Assimulo-trunk-cp38-cp38-linux_x86_64.whl"
    ],
)
Is there a known path forward to use a relative path to a .whl as a requirement?
better-church-90898
12/21/2021, 7:49 PM
We use pyximport to build the simple library rather than using setuptools - is it possible to configure Pants to support this workflow without a custom plugin?
fresh-cat-90827
12/21/2021, 11:05 PM
Requires-Dist section that may confuse pip; thread has details.

hundreds-father-404
12/21/2021, 11:40 PM
resolve="data_science", for example.
A. Abbreviate for less verbosity: {"py-default": "3rdparty/py/default_lock.txt"}
B. Use full name for clarity: {"python-default": "3rdparty/python/default_lock.txt"}

witty-crayon-22786
12/22/2021, 2:25 AM

melodic-thailand-99227
12/22/2021, 6:37 AM

curved-television-6568
12/22/2021, 8:27 AM

plain-fireman-49959
12/22/2021, 1:50 PM
I'm using python-magic, and the module it exposes is just magic. I checked the Module Mapping docs; however, I don't understand how to use it. I expect to declare in the root BUILD file that such a mapping exists:
poetry_requirements()
python_requirement(
    module_mapping={"python-magic": ["magic"]}
)
But I get a MappingError: Targets in root-level BUILD files must be named explicitly.
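(A minimal sketch, assuming Pants 2.8/2.9-era syntax: the mapping is usually passed to the requirements generator itself, and root-level targets need an explicit name. The target name here is hypothetical.)
# BUILD at the repo root -- hypothetical target name
poetry_requirements(
    name="reqs",
    module_mapping={"python-magic": ["magic"]},
)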
loud-stone-83419
12/22/2021, 4:40 PM

loud-stone-83419
12/22/2021, 4:40 PM

echoing-london-29138
12/22/2021, 5:27 PM
I ran ./pants test :: but I got the following error:
18:27:01.07 [ERROR] 1 Exception encountered:
ProcessExecutionFailure: Process 'Find interpreter for constraints: CPython>=3.8' failed with exit code 1.
stdout:
stderr:
Traceback (most recent call last):
File "/Users/arth/.cache/pants/named_caches/pex_root/unzipped_pexes/01a8d22400b26b1d132a9849fb6e28962f1ecf9e/.bootstrap/pex/pex.py", line 476, in execute
exit_value = self._wrap_coverage(self._wrap_profiling, self._execute)
File "/Users/arth/.cache/pants/named_caches/pex_root/unzipped_pexes/01a8d22400b26b1d132a9849fb6e28962f1ecf9e/.bootstrap/pex/pex.py", line 401, in _wrap_coverage
return runner(*args)
File "/Users/arth/.cache/pants/named_caches/pex_root/unzipped_pexes/01a8d22400b26b1d132a9849fb6e28962f1ecf9e/.bootstrap/pex/pex.py", line 432, in _wrap_profiling
return runner(*args)
File "/Users/arth/.cache/pants/named_caches/pex_root/unzipped_pexes/01a8d22400b26b1d132a9849fb6e28962f1ecf9e/.bootstrap/pex/pex.py", line 532, in _execute
return self.execute_entry(self._pex_info.entry_point)
File "/Users/arth/.cache/pants/named_caches/pex_root/unzipped_pexes/01a8d22400b26b1d132a9849fb6e28962f1ecf9e/.bootstrap/pex/pex.py", line 668, in execute_entry
return self.execute_pkg_resources(entry_point)
File "/Users/arth/.cache/pants/named_caches/pex_root/unzipped_pexes/01a8d22400b26b1d132a9849fb6e28962f1ecf9e/.bootstrap/pex/pex.py", line 699, in execute_pkg_resources
runner = entry.resolve()
File "/Users/arth/.cache/pants/named_caches/pex_root/unzipped_pexes/01a8d22400b26b1d132a9849fb6e28962f1ecf9e/.bootstrap/pex/vendor/_vendored/setuptools/pkg_resources/__init__.py", line 2481, in resolve
module = __import__(self.module_name, fromlist=['__name__'], level=0)
File "/Users/arth/.cache/pants/named_caches/pex_root/installed_wheels/7fea57708be4117c2d962e01abd1585a8846dc06/pex-2.1.54-py2.py3-none-any.whl/pex/bin/pex.py", line 30, in <module>
from pex.pex import PEX
File "/Users/arth/.cache/pants/named_caches/pex_root/installed_wheels/7fea57708be4117c2d962e01abd1585a8846dc06/pex-2.1.54-py2.py3-none-any.whl/pex/pex.py", line 12, in <module>
from distutils import sysconfig
File "/Users/arth/.pyenv/versions/3.8.10/lib/python3.8/site-packages/_distutils_hack/__init__.py", line 92, in create_module
return importlib.import_module('setuptools._distutils')
File "/Users/arth/.pyenv/versions/3.8.10/lib/python3.8/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
ModuleNotFoundError: No module named 'setuptools'
Use --no-process-execution-local-cleanup to preserve process chroots for inspection.
echoing-london-29138
12/22/2021, 5:29 PM

glamorous-tiger-7918
12/22/2021, 8:41 PM
We upgraded from 2.8.1rc1 to 2.9.0rc0 yesterday. Noticed today that the pex files that end up getting deployed do not have executable permissions. The permissions are:
• 2.8.1rc1 -> -rwxr-xr-x
• 2.9.0rc0 -> -rwxr--r--
I've tried packaging locally and we end up with the correct permissions; it seems the pex files created by our CI (GitHub, ubuntu-latest) are missing those two executable permission bits.
Any idea on what might cause this or how to remedy it?

bitter-ability-32190
12/23/2021, 4:30 PM