thousands-plumber-33255
04/19/2023, 2:41 PM
__defaults__(
    {
        python_tests: dict(
            dependencies=[
                "django/django_core/settings.py",
                "django:pytest-django",
            ],
            extra_env_vars=env_vars_test_django(),
        )
    }
)
My goal here is to set a list of environment variables as defined in a macro. Now I would like to run `pants run django/manage.py -- makemigrations --check`, but I am not able to set this list of environment variables, since the field is unrecognized for a `pex_binary` target.
1. How can I address this? Is `env` in 2.16 the equivalent?
2. Can I pass the macro function in a CLI argument as well, or only in a BUILD file? E.g. `pants run --env=env_vars_test_django() django/manage.py -- makemigrations --check`
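(For readers following along: `env_vars_test_django()` is a BUILD-file macro. A minimal sketch of what such a macro might look like - the file path and values are assumptions, not from the question - registered via `[GLOBAL].build_file_prelude_globs` in pants.toml:)
# pants-plugins/macros.py (hypothetical path), exposed to BUILD files via
# [GLOBAL].build_file_prelude_globs = ["pants-plugins/macros.py"]
def env_vars_test_django():
    # Illustrative values only: NAME=value sets a variable,
    # a bare NAME passes it through from the local environment.
    return [
        "DJANGO_SETTINGS_MODULE=django_core.settings",
        "DATABASE_URL",
    ]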
jolly-kite-53356
04/19/2023, 4:50 PM
docker_environment(
    name="python_bullseye",
    platform="linux_x86_64",
    image="base_image:latest",
    python_bootstrap_search_path=["<PATH>"],
)
The `base_image` is a customized base image I build locally. It seems like it didn't pick up the new build even though I set the tag to `latest`. Can someone help me understand if I'm doing anything wrong here?
rich-london-74860
04/19/2023, 5:22 PM
We have a project called `shared` that's a dependency for two projects called `A` and `B`. The downstream dependent projects `A` and `B` also have conflicting third-party dependencies (for the sake of this example, suppose `A` depends on `pandas<2.0`, `B` depends on `pandas>=2.0`, and `shared` does not use `pandas` at all). This means that `A` and `B` must use different resolves (for the sake of this example, let's say those are called `resolve_A` and `resolve_B`), and `shared` must use multiple resolves with `resolve=parameterize('resolve_A', 'resolve_B')`. The `python_sources` target should wind up looking something like:
python_sources(
    name="lib",
    sources=["**/*.py"],
    resolve=parameterize("resolve_A", "resolve_B"),
)
Now let's say `shared` should also be distributed as a `python_distribution`. `resolve` is not a parameter for `python_distribution`, and the `python_sources` require a `resolve` value. What should this be? Should it look something like this?
python_distribution(
    name="dist",
    dependencies=[
        ":lib@resolve=resolve_A",
        ":lib@resolve=resolve_B",
    ],
)
That seems to work for our purposes, but it's strange to duplicate the same item in `dependencies` for different resolves.
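(For reference, the resolves behind an example like this are declared in pants.toml; a sketch, with the lockfile paths assumed:)
[python]
enable_resolves = true

[python.resolves]
resolve_A = "3rdparty/resolve_A.lock"
resolve_B = "3rdparty/resolve_B.lock"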
tall-battery-32825
04/19/2023, 6:58 PM
file(
    name="dockerfile",
    source="Dockerfile",
    dependencies=["src:pex1"],
)

archive(
    name="archive",
    files=[":dockerfile"],
    packages=["src:pex2"],
    format="zip",
)
`pants package path/to:archive` yields an archive that does not contain `src:pex1`. Is this by design? Would I have to list every dependency under `files` or `packages` for it to be included in the archive? If so, is the `dependencies` list on `file` targets used for anything?
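(A sketch of the explicit listing the question describes, assuming both pexes really should ship in the archive:)
archive(
    name="archive",
    files=[":dockerfile"],
    packages=["src:pex1", "src:pex2"],  # each built artifact listed explicitly
    format="zip",
)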
jolly-kite-53356
04/19/2023, 7:35 PM
I have a `files` target where one file is about 3 GB+, and I'm getting this error:
Snapshot failed: Failed to digest inputs: Throw { val: Error storing Digest { hash: Fingerprint<4728718212c19c56cdcb11074aa5ef1e939461df8de8e5998130854af496d427>, size_bytes: 3395621743 }: Invalid argument, python_traceback:
Not sure what's the best way to handle large files with Pants.
swift-river-73520
04/19/2023, 10:58 PM
swift-river-73520
04/20/2023, 12:55 AM
Has anyone seen this error with `pex_binary` targets that use a `docker_environment`? I initially thought it was something to do with pulling from ECR, but I'm running into it with an environment that uses a simple python:3.9-slim image too (or any of my `docker_environment`s, for that matter):
Exception: Failed to start Docker container `650937fe9f4829927de230dbdb32011b2a3c1d1107e70f1cc26f082a4dab9bc0` for image `sha256:24654773aa1e8ddd331c7fc2de5fd8c0000e1c0b0578967be3dc2d0d2b997980`: RequestTimeoutError
future-oxygen-10553
04/20/2023, 12:29 PM
My `dev` resolve pins
pytest==7.2.2
pytest-cov==4.0.0
pytest-xdist==3.2.1
and my pants.toml has
...
[python.resolves]
python-default = "3rdparty/python-lock.txt"
dev = "utils/lock.txt"
...
[pytest]
args = ["--no-header", "-vv"]
install_from_resolve = "dev"
However, running `pants test ::` gives me
1. /opt/homebrew/Cellar/python@3.10/3.10.11/Frameworks/Python.framework/Versions/3.10/bin/python3.10:
Failed to resolve all requirements for cp310-cp310-macosx_13_0_arm64 interpreter at /opt/homebrew/Cellar/python@3.10/3.10.11/Frameworks/Python.framework/Versions/3.10/bin/python3.10 from utils/lock.txt:
Configured with:
build: True
use_wheel: True
Dependency on pytest-cov not satisfied, 1 incompatible candidate found:
1.) pytest-cov 4 does not satisfy the following requirements:
!=2.12.1,<3.1,>=2.12 (via: pytest-cov!=2.12.1,<3.1,>=2.12)
Dependency on pytest-xdist not satisfied, 1 incompatible candidate found:
1.) pytest-xdist 3.2.1 does not satisfy the following requirements:
<3,>=2.5 (via: pytest-xdist<3,>=2.5)
Dependency on pytest not satisfied, 1 incompatible candidate found:
1.) pytest 7.2.2 does not satisfy the following requirements:
==7.0.1 (via: pytest==7.0.1)
Those versions are indeed hard-coded in the pytest subsystem, but the docs say that the method I'm trying is the right way to override things.
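(For reference, the 2.16-era docs pair `install_from_resolve` with an explicit `requirements` list naming which projects to take from that resolve; whether it is needed varies by Pants version, so treat this as a sketch:)
[pytest]
args = ["--no-header", "-vv"]
install_from_resolve = "dev"
requirements = ["pytest", "pytest-cov", "pytest-xdist"]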
wonderful-boots-93625
04/20/2023, 1:18 PM
Using a `python_distribution` doesn't really help here because installed packages are not editable. That approach requires having a duplicate setup.py or pyproject.toml in the monorepo modules, as well as several steps to install dependent modules in editable mode. Additionally, the exported venv doesn't have pip, so that has to be added before doing any of the above. Any other approaches I'm missing?
high-yak-85899
04/20/2023, 6:06 PM
Is there a way to set `--test-extra-env-vars` on the command line such that it appends to/modifies what's specified in pants.toml?
wonderful-boots-93625
04/20/2023, 11:37 PM
Is it possible to `export` a venv without enabling resolves? So far it doesn't seem so.
happy-kitchen-89482
04/20/2023, 11:47 PM
swift-river-73520
04/21/2023, 12:25 AM
gray-shoe-19951
04/21/2023, 1:07 AM
Our CI checks out each branch into its own directory: branch AB would be under /tmp/workspace/AB, and branch CD would be under /tmp/workspace/CD. We noticed that Pants is not able to share the mypy cache across branches after 2.15. I believe the issue comes from this pull request: https://github.com/pantsbuild/pants/pull/18061. It seems that Pants now relies on the buildroot (i.e. a path) to differentiate repos. This won't work in the setup I mentioned. I am wondering if there is a better way to implement it, or is there any workaround? Thanks!
plain-librarian-97976
04/21/2023, 2:39 AM
I need to set the `--platform` option of the docker build command. How do I specify `--platform` in the `docker_image` directive?
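(If no Pants-level field is available on your version, one workaround is plain Dockerfile syntax, independent of Pants; the image name here is an assumption:)
# Pin the platform in the FROM line instead of on the docker build command line
FROM --platform=linux/amd64 python:3.11-slim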
fresh-continent-76371
04/21/2023, 5:16 AM
1. `vcs_version(..)` creates somefolder/VERSION (`name="version_txt_file"`)
2. The `python_distribution` depends upon that target:
python_distribution(
    name="jlab",
    dependencies=[
        ":pyproject",
        ":version_txt_file",
        "//:my_sources",
        "//:reqs#setuptools",
        "//:reqs#wheel",
    ],
    generate_setup=True,
    provides=python_artifact(
        name="mylib",
    ),
    ...
)
3. I have a custom plugin which registers the custom setup_kwargs for the python distribution. The germane point is that it looks for the file created in 1 (the changelog is the same scenario, except a `shell_command(..)` target produces it):
from pants.backend.python.goals.setup_py import SetupKwargs
from pants.backend.python.target_types import PythonProvidesField
from pants.engine.fs import DigestContents, GlobMatchErrorBehavior, PathGlobs
from pants.engine.rules import Get, rule

@rule
async def setup_kwargs_plugin(request: CustomSetupKwargsRequest) -> SetupKwargs:
    async def get_version() -> str:
        """Returns the VERSION file contents."""
        digest_contents = await Get(
            DigestContents,
            PathGlobs(
                # Derive the VERSION path from the provides= field;
                # this needs to be more robust later.
                [f"{request.target.get(PythonProvidesField).value}/VERSION"],
                description_of_origin=f"{__name__}: get_version(), from `python_artifact()` plugin",
                glob_match_error_behavior=GlobMatchErrorBehavior.error,
            ),
        )
        return digest_contents[0].content.decode().strip()

    async def get_changelog() -> str:
        ...

    version = await get_version()
    changelog = await get_changelog()
    return SetupKwargs(
        {
            **request.explicit_kwargs,
            "version": version,
            "long_description": changelog,
        },
        address=request.target.address,
    )
So how do I "add" the target `vcs_version(..)` somefolder/VERSION (`name="version_txt_file"`) as a dependency of the macro call `python_artifact(...)`?
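(For anyone reconstructing this pattern: the documented SetupKwargs plugin shape also needs the request class and a UnionRule registration; a minimal sketch, reusing the CustomSetupKwargsRequest name from the rule above:)
# register.py (sketch of the standard SetupKwargs plugin wiring)
from pants.backend.python.goals.setup_py import SetupKwargsRequest
from pants.engine.rules import collect_rules
from pants.engine.target import Target
from pants.engine.unions import UnionRule

class CustomSetupKwargsRequest(SetupKwargsRequest):
    @classmethod
    def is_applicable(cls, _: Target) -> bool:
        return True  # apply to every python_distribution in the repo

def rules():
    return [
        *collect_rules(),
        UnionRule(SetupKwargsRequest, CustomSetupKwargsRequest),
    ]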
average-breakfast-91545
04/21/2023, 9:43 AM
I'm trying to use `pandas` in a Lambda layer. I've reproduced my issue locally by running the AWS Lambda docker images with my layer and pex unzipped to /opt/ and /var/task respectively.
I have the following code in my pex:
# src/dz/anomaly_flagger/handler.py
import pandas as pd

def handle(event, context):
    df = pd.DataFrame()
    print('yo', df.shape)
    print(df)
    print('whoa')
If I run the handler directly, everything executes:
START RequestId: 454cf516-811d-460f-93c2-01e5b63159d4 Version: $LATEST
yo (0, 0)
Empty DataFrame
Columns: []
Index: []
whoa
END RequestId: 454cf516-811d-460f-93c2-01e5b63159d4
BUT if I run the handler bootstrapped with pex (i.e. invoking __pex__.src.dz.anomaly_flagger/handler.py), then printing the dataframe causes pex to start an interactive console:
START RequestId: 831783bd-dc6a-43d0-8a41-a07594a55244 Version: $LATEST
yo (0, 0)
yo (0, 0)
>>> Python 3.9.16 (main, Dec 24 2022, 07:02:54)
[GCC 7.3.1 20180712 (Red Hat 7.3.1-15)] on linux
Type "help", "copyright", "credits" or "license" for more information.
(InteractiveConsole)
now exiting InteractiveConsole...
21 Apr 2023 09:40:51,423 [WARNING] (rapid) First fatal error stored in appctx: Runtime.ExitError
21 Apr 2023 09:40:51,423 [WARNING] (rapid) Process 14(bootstrap) exited: Runtime exited without providing a reason
END RequestId: 831783bd-dc6a-43d0-8a41-a07594a55244
I've got a trace which I'll include in 🧵 but I'm unsure how to interpret what's happening. My guess is that the bootstrapped process is failing for some weird library-related reason, and the console is a fallback behaviour, but I can't understand why printing a dataframe would cause a hiccup.
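(Tangent for readers: Pants also ships a dedicated target for packaging Lambda handlers; a minimal sketch of that shape, with the handler path taken from the snippet above and the target name invented:)
python_awslambda(
    name="anomaly_flagger_lambda",
    handler="src/dz/anomaly_flagger/handler.py:handle",
    runtime="python3.9",
)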
brash-area-42337
04/21/2023, 11:48 AM
powerful-eye-58407
04/21/2023, 12:02 PM
The docs say: "If you modify the third-party requirements of a resolve then you must regenerate its lockfile by running the generate-lockfiles goal. Pants will display an error if a lockfile is no longer compatible with its updated requirements."
How do I see/trigger this warning? I have pandas==1.4.3 in the requirements, and generated a lockfile that includes this version:
"project_name": "pandas",
"version": "1.4.3"
Now, if I modify requirements.txt to bump the pandas version to pandas==1.5.3 and run `pants test ::`, there's no warning that the requirements do not match the resolve - is there a way to check this that's built into Pants? I want - maybe in a CI check - to make sure that when requirements.txt gets upgraded, developers generate lockfiles and commit them to the repository as well. Thanks!
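(One way to enforce this in CI - not a built-in Pants check, just a common pattern - is to regenerate and fail on any diff; the lockfile path is an assumption:)
# CI sketch: fail if committed lockfiles are stale w.r.t. requirements.txt
pants generate-lockfiles
git diff --exit-code -- 3rdparty/python-lock.txt  # adjust to your [python.resolves] paths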
ancient-terabyte-9085
04/22/2023, 12:35 AM
We're trying to get `flake8` and `yapf` running with Pants on our code, but I have some questions and am wondering what the best approach is. `flake8` is handling configuration well, but `yapf` is giving me some trouble. In particular, we have some autogenerated files that we don't want to run `yapf` on. If I use a .yapfignore, however, `yapf` will return
input filenames did not match any python files
and an exit code of 1 when called against a set of files that are all excluded by the .yapfignore. This makes `./pants lint` fail when run against such a set of files.
I did find a fairly reasonable workaround, using either python_sources(skip_yapf=True) or
python_sources(
    overrides={
        "*_pattern_common_to_generated_files.py": {
            "skip_yapf": True,
        },
    },
)
I'm wondering if this is the best way to handle this. `./pants tailor ::` won't know to include this line, so developers will probably have some confusion when they write something that produces a new folder of generated files. I could mitigate that with a unit test that inspects the BUILD files, but that also seems a little gross. Is there a way to make that *_pattern_common_to_generated_files.py override implicit in each python_sources?
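(One possibility, sketched with the `__defaults__` BUILD construct seen earlier in this log and a hypothetical directory layout: a BUILD file at the root of the generated tree defaults `skip_yapf` on, so new folders of generated files underneath need no per-target overrides:)
# gen/BUILD (hypothetical location covering the generated code)
__defaults__(
    {python_sources: dict(skip_yapf=True)},
    extend=True,  # merge with inherited defaults instead of replacing them
)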
lively-gpu-26436
04/22/2023, 7:26 AM
If you hit
Exception: Failed to obtain version from local Docker: error trying to connect: No such file or directory (os error 2)
when working with environments on macOS, the "Enable default Docker socket" option in Docker Desktop > Advanced must be checked ✅
brash-area-42337
04/22/2023, 8:28 PM
busy-vase-39202
04/22/2023, 8:57 PM
brave-hair-402
04/23/2023, 12:20 AM
I'm trying to use Python 3.11 (running `pants generate-lockfiles`), where I get this error:
writing manifest file '/tmp/pants-sandbox-2f5z6H/.tmp/tmpjnm_zjwe/build/jax.egg-info/SOURCES.txt'
creating '/tmp/pants-sandbox-2f5z6H/.tmp/tmpjnm_zjwe/build/jax-0.4.8.dist-info'
Could not gather lock metadata for 1 project with source artifacts:
1. /tmp/pants-sandbox-2f5z6H/.tmp/tmpm7asbr4g/usr.bin.python3.11/multiprocess-0.70.14.tar.gz: Executing /home/sigurdur/.cache/pants/named_caches/pex_root/venvs/6d9a4230883386f8bc74f0f14c403de3660d52be/73c7d7c618fac01449fa6e57a21210c0eb8a8f21/bin/python -sE /home/sigurdur/.cache/pants/named_caches/pex_root/venvs/6d9a4230883386f8bc74f0f14c403de3660d52be/73c7d7c618fac01449fa6e57a21210c0eb8a8f21/pex -c import sys
import setuptools.build_meta
if not hasattr(setuptools.build_meta, 'prepare_metadata_for_build_wheel'):
    sys.exit(75)
result = setuptools.build_meta.prepare_metadata_for_build_wheel(*('/tmp/pants-sandbox-2f5z6H/.tmp/tmp4ofkpkjq/build',), **{})
with open('/tmp/pants-sandbox-2f5z6H/.tmp/tmpg61tovek', "w") as fp:
    fp.write(result)
failed with 1
But it seems like I can manually generate a venv and install all the packages without issues that way; it just doesn't work with Pants. Everything works great with Python 3.10. Any ideas?
wonderful-boots-93625
04/24/2023, 2:45 PM
I'm trying to use `adhoc_tool` to generate documentation with `pdoc`. pdoc needs the target distribution installed along with its deps (on top of pdoc itself) to work. pex_binary seemed like a good way to do this, but I'm probably misunderstanding how to configure it. Example:
python_requirement(
    name='pdoc_req',
    requirements=[
        'pdoc',
    ],
)

pex_binary(
    name='pdoc_sdk',
    dependencies=[
        ':pdoc_req',
        ':dist',
    ],
    entry_point='pdoc',
)

adhoc_tool(
    name="build-pdoc",
    runnable=":pdoc_sdk",
    args=["pdoc", "-d", "google", "-o", "docs/", "src/my_package/"],
    execution_dependencies=[":dist"],
    output_files=["docs"],
    root_output_directory=".",
)

run_shell_command(
    name="run-build-pdoc",
    command="ls {chroot}/docs",
    execution_dependencies=[":build-pdoc"],
)
Doing `./pants run my_package:run-build-pdoc` yields
Engine traceback:
in `run` goal
ProcessExecutionFailure: Process 'the `adhoc_tool` at my_package:build-pdoc' failed with exit code 127.
stdout:
stderr:
env: python3.9: No such file or directory
(`:dist` is a python_distribution.)
gentle-painting-24549
04/24/2023, 3:58 PM
I've been looking at the `experimental_run_command` target as a solution for this but have been struggling. Does `experimental_run_command` support a workflow that performs the following steps (sketched below)?
1. aws s3 sync s3://bucket/dir/ ignored_path_in_monorepo/ (or a shell script that runs this)
2. Make those files available to the `docker_image` sandbox
3. A plain old COPY ignored_path_in_monorepo/ /path_in_docker_image/ in the Dockerfile
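(A rough sketch of that shape, assuming a shell_command-style target whose outputs join the docker build context via dependencies; field names vary across Pants versions, and the sync step needs network access and AWS credentials in the sandbox:)
shell_command(
    name="fetch_assets",
    command="aws s3 sync s3://bucket/dir/ ignored_path_in_monorepo/",
    tools=["aws"],
    output_directories=["ignored_path_in_monorepo"],
)

docker_image(
    name="image",
    dependencies=[":fetch_assets"],
    # the Dockerfile can then COPY ignored_path_in_monorepo/ /path_in_docker_image/
)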
average-flag-94768
04/24/2023, 6:42 PM
lemon-noon-33245
04/24/2023, 8:17 PM
We're using `adhoc_tool` to install our node dependencies using `pnpm` while javascript support is not yet ready. I'm having some issues with post-install scripts in `core-js` and `cpu-features`. Has anyone faced anything like that, and do you have any tips for me?
high-yak-85899
04/24/2023, 9:05 PM
Does Pants do anything to make `.so` files importable? My thinking is "no", since this stuff worked before we were using Pants, so perhaps there's some kind of venv setup that makes that magic happen.
future-oxygen-10553
04/24/2023, 9:14 PM