rhythmic-morning-87313
09/28/2022, 6:29 AM
ripe-cpu-85141
09/28/2022, 2:52 PM
When running ./pants package ::, the build fails saying it can't find the executable 'newuidmap' on $PATH.
I tried to add the following to the pants.toml file:
[docker]
tools = [
"newuidmap",
"newgidmap"
]
and sometimes it works, sometimes it doesn't. When it starts to work, I can remove the section and it continues to work. Sometimes the problem comes back.
I'm not sure how to reproduce that easily and I can't guarantee that nothing changed in my environment.
I tried removing the caches, but it doesn't seem to have an effect.
Any idea on what's going on or what I should do to dig deeper?
clever-hamburger-59716
09/28/2022, 3:12 PM
resources(
name="test-data",
sources=["./examples/data/*"],
)
pex_binary(
name="service",
entry_point="service.py",
platforms=[
"current",
"manylinux2014-x86_64-cp-39-cp39",
],
dependencies=[":test-data"],
)
test/sub_dir_2/BUILD has this:
pex_binary(
name="test_service",
entry_point="test_service.py",
dependencies=["service:test-data"],
)
I am trying to package everything as a pex file. A test in test_service.py uses the files like this:
path = str(pathlib.Path(__file__, "../../../examples/data/data.csv").resolve())
do_something_with(path)
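As an aside, the way pathlib joins the extra segment before resolving can be sketched in isolation (the /sandbox prefix below is a made-up stand-in for the real sandbox path):

```python
import pathlib

# Extra arguments to pathlib.Path are joined onto the first one, so the
# '../../..' climbs up from the test file's location before .resolve()
# collapses the dot segments. '/sandbox/...' is a hypothetical layout.
test_file = "/sandbox/test/sub_dir_2/test_service.py"
p = pathlib.Path(test_file, "../../../examples/data/data.csv")
print(p.resolve())  # /sandbox/examples/data/data.csv
```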
Checking the sandbox, indeed, the files don't exist (actually, no resources exist at all). However, I do see the files in the packaged .pex archives.
I am trying to run the test like this:
./pants --keep-sandboxes=on_failure test service/test/sub_dir_2/test_service.py -- -k test_csv
I am using pants 2.13.0. I would really appreciate any pointers on how to go about including files in tests.
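For what it's worth, a likely cause (a guess from the snippets above, assuming the tests are owned by a python_tests target): dependencies of a pex_binary don't flow into a test's sandbox, so the test target itself needs to depend on the resources. A hypothetical test/sub_dir_2/BUILD sketch:

```python
python_tests(
    name="tests",
    # Depend on the resources directly so the data files land in the
    # test sandbox; depending on the pex_binary alone is not enough.
    dependencies=["service:test-data"],
)
```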
TIA,
CSN
careful-address-89803
09/28/2022, 3:49 PM
careful-address-89803
09/28/2022, 3:51 PM
hundreds-father-404
09/28/2022, 4:51 PM
careful-address-89803
09/28/2022, 5:15 PM
hundreds-father-404
09/28/2022, 5:20 PM
high-yak-85899
09/28/2022, 5:27 PM
high-yak-85899
09/28/2022, 6:05 PM
generate-lockfiles lets you supply a --custom-command to appear in the lockfile header. I'd like developers to use that. However, if they see an invalid-lockfile error, they are presented with a command that I don't want them to run.
plain-carpet-73994
09/28/2022, 7:53 PM
--extra-index-urls each uses (because pytorch has CUDA and non-CUDA libs which are differentiated only by the index URL). But it appears that python-repos.indexes is a global property and can't be set per-resolve. Is there a way to do this?
(see https://pantsbuild.slack.com/archives/C046T6T9U/p1664233267501069 for some more context if necessary)
brash-student-40401
09/29/2022, 1:38 PM
setup.py? The use case here is moving code to a monorepo, and trying to use code that was previously installed with a pip install. From some old GitHub conversations, I believe this should be possible, but I couldn't find documentation for how to set things up. Details in 🧵.
refined-addition-53644
09/29/2022, 3:59 PM
15:49:36.77 [WARN] Pants cannot infer owners for the following imports in the target src/package-local/tests/utils/test_nlp.py:
* tests.resources.fake (line: 5)
In this case src/package-local has a structure like the one below. We have multiple such local packages, each with their own tests.
src/package-local
- package_local
- tests
high-yak-85899
09/29/2022, 5:17 PM
export use to make a virtual environment?
cold-vr-15232
09/29/2022, 6:31 PM
bumpy-noon-80834
09/29/2022, 10:38 PM
$ ./pants test ::
00:31:26.68 [ERROR] Completed: Run Pytest - packages/libhello/tests/test_hello.py failed (exit code 2).
============================= test session starts ==============================
platform linux -- Python 3.9.14, pytest-7.0.1, pluggy-1.0.0
rootdir: /tmp/pants-sandbox-8eRczK
plugins: cov-3.0.0
collected 0 items / 1 error
==================================== ERRORS ====================================
____________ ERROR collecting packages/libhello/tests/test_hello.py ____________
ImportError while importing test module '/tmp/pants-sandbox-8eRczK/packages/libhello/tests/test_hello.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/usr/lib/python3.9/importlib/__init__.py:127: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
packages/libhello/tests/test_hello.py:1: in <module>
from libhello import hello
E ModuleNotFoundError: No module named 'libhello'
- generated xml file: /tmp/pants-sandbox-8eRczK/packages.libhello.tests.test_hello.py.xml -
=========================== short test summary info ============================
ERROR packages/libhello/tests/test_hello.py
!!!!!!!!!!!!!!!!!!!! Interrupted: 1 error during collection !!!!!!!!!!!!!!!!!!!!
=============================== 1 error in 0.06s ===============================
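A common cause of this particular ModuleNotFoundError (a hypothesis, not a confirmed diagnosis) is a missing source root: if packages/libhello is not a source root, `libhello` is not importable from the sandbox. A pants.toml sketch, assuming each directory under packages/ should be a root:

```toml
[source]
root_patterns = ["/packages/*"]
```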
What am I getting wrong?
high-yak-85899
09/29/2022, 11:56 PM
.cache/pants/setup. https://www.pantsbuild.org/docs/using-pants-in-ci recommends a simple script for nuking cache directories when they get too large. It recommends pants/setup as one of the directories to monitor. When we do that, builds that kick off afterwards start seeing errors like this:
/home/buildbot/.cache/pants/setup/bootstrap-Linux-x86_64/2.13.0_py38/bin/python: No such file or directory
Any recommendations on what to do?
rough-vase-83553
09/30/2022, 2:16 PM
python_requirements. I'm aware that you can create an arbitrarily large lockfile without having to worry about everything in the lockfile getting dragged into each Python target. But do you have to also create a separate python_requirements for each binary/library if you want to restrict the number of dependencies they pull in? Or can Pants still resolve which ones to pull in via import statements?
high-energy-55500
09/30/2022, 4:13 PM
python_requirement target? e.g.
python_requirement(
name="apache-airflow",
requirements=[
"apache-airflow[amazon]==2.2.2",
],
modules=["airflow"],
)
this doesn’t seem to include the amazon extra (if I run pylint against code that uses this as a dependency, it fails to import from amazon.providers.amazon).
however, including the extra as a separate package seems to work, e.g.
python_requirement(
name="apache-airflow",
requirements=[
"apache-airflow==2.2.2",
"apache-airflow-providers-amazon==2.4.0",
],
modules=["airflow"],
)
the downside is that I have to manually specify the compatible version of the package, which I’d like to avoid.
wide-midnight-78598
09/30/2022, 8:18 PM
dist/... - however, to deploy them from my ansible role, I need them in specific locations. I haven't been able to resolve symlinks (it just uploads the symlink to the target). Is there a Pantsian way to manipulate the host filesystem?
bumpy-noon-80834
09/30/2022, 9:27 PM
bumpy-noon-80834
10/01/2022, 12:01 AM
When I run ./pants lint ::, everything is fine. But when I run pylint through VSCode using python.linting.pylintPath=./dist/export/python/virtualenvs/tools/pylint/bin/pylint, I get pylint's import-error because the tools/pylint apparently doesn't understand my imports, despite PYTHONPATH being defined in .env. BTW, the .env looks fine, as VSCode itself has no issue resolving those imports. Any idea how I could troubleshoot this further?
worried-painter-31382
10/01/2022, 2:52 PM
bored-energy-25252
10/02/2022, 4:33 PM
high-magician-46188
10/02/2022, 4:50 PM
BUILD files.
Hi,
I’m trying to set up Pantsbuild (2.13) for the first time in an existing Python monorepo.
At the root of the repo, I’ve added an empty BUILD_ROOT
file and a pants.toml
file with:
[GLOBAL]
backend_packages = ["pants.backend.python"]
[source]
root_patterns = ["/X_*/"] # "X" is the name of the company and "X_" is the prefix of each project
There are about 50 projects in the repo.
I’ve run pants tailor :: and then git status | wc -l and got 644.
Do I really need to commit 640 BUILD files?
Is there any way around it? (maybe creating one per project?)
Thanks in advance 🙂
gentle-sugar-52379
10/02/2022, 7:01 PM
run_goal_use_sandbox
is there a way to make restartable usable too? atm it only works on pex_binary, but that's slowing down the dev iterations massively.
gentle-sugar-52379
10/03/2022, 7:10 AM
I have a deployments/project_name/main.py script and made it invokable with python_source. I'm able to run it with ./pants run deployments/project_name with and without the sandboxing feature. Pretty nice!
But ansible-runner wants to call ansible-playbook via subprocess. I tried to include it as a requirement in my python_source, but without any luck.
The next try was: make a fake ansible-playbook executable inside my ~/.local/bin folder. Result: my deployments/project_name/main.py is able to find and execute it.
Is it the right solution to globally install ansible so it's findable by my deployments/project_name/main.py script, or is there something I could do to include it into the path to be able to rely on the automatic fetching of ansible?
fresh-cat-90827
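One way to make the subprocess call explicit about where the binary comes from (a sketch; find_tool is a made-up helper, not a Pants or ansible-runner API) is to resolve the executable up front instead of relying on whatever PATH the sandbox happens to provide:

```python
import os
import shutil
from typing import Optional

def find_tool(name: str, fallback: Optional[str] = None) -> str:
    """Locate an executable on PATH, optionally falling back to an explicit path."""
    found = shutil.which(name)
    if found:
        return found
    if fallback and os.access(fallback, os.X_OK):
        return fallback
    raise FileNotFoundError(f"{name} not found on PATH and no usable fallback")

# Hypothetical usage from main.py:
# subprocess.run([find_tool("ansible-playbook"), "site.yml"], check=True)
```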
10/03/2022, 11:58 AM
resources(
name="testdata",
sources=["tests/testdata/**/*", "!tests/testdata/**/*.py"],
)
?
worried-painter-31382
10/03/2022, 7:45 PM
19:41:23.94 [INFO] Long running tasks:
  643.75s Building docker image amramedical.com/client-session-tagger:latest +2 additional tags.
These are small services, with 4 requirements (largest being boto3). The time spent is mostly focused on building "requirements" pexes, meaning pex_binary
targets in the container with only requirements included. We use a lockfile, with_tools=True and layout="packed". We use only wheels.
#13 [source 4/4] RUN PEX_TOOLS=1 /usr/local/bin/python client-session-tagger-handler.pex venv --compile /app
#13 sha256:3dcefd9a9645feb44367da0c45b3370685d8cc30427c1a07e8d99fe9111ab559
#13 DONE 3.7s
#9 [dependencies 4/4] RUN PEX_TOOLS=1 /usr/local/bin/python client-session-tagger-requirements.pex venv --scope=deps --compile /app
#9 sha256:ef99e535e560ee6d3decd7805cb3c81c0d17b05123d0f2b0505b157e6efb1697
#9 DONE 637.8s
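One knob worth timing in isolation (a guess, not a confirmed diagnosis): `--compile` byte-compiles every module in the venv at image-build time, which can add up for larger dependency sets. The pex-tools venv command also works without it, trading image build time for first-import time:

```
RUN PEX_TOOLS=1 /usr/local/bin/python client-session-tagger-requirements.pex venv --scope=deps /app
```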
rhythmic-glass-66959
10/03/2022, 8:09 PM
flake8-requirements plugin. Since my requirements.txt is not in the project's root directory, I need to pass the --requirements-file option pointing to the requirements in 3rdparty/python. So I added this to `[flake8].args`:
[flake8]
args = [
...,
"--requirements-file=3rdparty/python/requirements.txt",
]
However, the plugin doesn't seem to be able to read the text file. I suspect an issue with the sandbox. Any ideas?