brave-hair-402
03/31/2023, 3:35 PM
ripe-gigabyte-88964
03/31/2023, 3:39 PM
A distribution for orjson could not be resolved for /usr/local/bin/python3.8.
Found 1 distribution for orjson that do not apply:
1.) The wheel tags for orjson 3.6.9 are cp38-cp38-macosx_11_0_arm64, cp38-cp38-macosx_10_9_universal2, cp38-cp38-macosx_10_9_x86_64 which do not match the supported tags of /usr/local/bin/python3.8:
cp38-cp38-manylinux_2_31_x86_64
... 598 more ...
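The error above can be illustrated with a small sketch (the tag strings are copied from the error): a wheel is usable only if at least one of its tags is among the tags the target interpreter supports, and macOS tags never intersect a manylinux tag.

```python
# Minimal sketch of pip/pex wheel-tag matching: a wheel "applies" only if its
# tag set intersects the interpreter's supported tags.
wheel_tags = {
    "cp38-cp38-macosx_11_0_arm64",
    "cp38-cp38-macosx_10_9_universal2",
    "cp38-cp38-macosx_10_9_x86_64",
}
# The Linux interpreter inside the container supports manylinux tags, not macosx ones.
supported_tags = {"cp38-cp38-manylinux_2_31_x86_64"}

applies = bool(wheel_tags & supported_tags)
print(applies)  # False: no overlap, so the distribution "does not apply"
```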
I'm assuming this is because I am building the pex on my Mac but the container is a Linux container. Is there a way to get around this?
swift-river-73520
03/31/2023, 4:34 PM
wide-midnight-78598
03/31/2023, 4:41 PM
pydantic
in Python, but feels heavy for what is a pretty "simple" task generally
loud-spring-35539
03/31/2023, 5:23 PM
swift-river-73520
03/31/2023, 9:03 PM
dazzling-dress-95246
03/31/2023, 10:39 PM
/usr/src/app:Foo
/usr/src/app/bar:Bar
/usr/src/baz:Baz
and then we set the PYTHONPATH within the image to:
/usr/src/app:/usr/src/baz
Now Foo's source root is "/". Bar gets mounted within Foo and so python modules in Foo can import python modules within Bar like so:
import bar.something
Python modules in Foo can also import python modules within Baz like so:
import baz.something
I'm currently keeping pants.toml
within Foo.
Questions:
1. what's the recommended setup here?
a. I'm currently thinking of mounting all the above within a container just like our current env and then installing pants within it.
2. should I set the pythonpath env var within the container or given that pants uses its own python binary, should I define it in pants.toml somewhere?
a. I do see this link but it refers to plugins which I don't think is right.
3. I'd like to begin by testing pants on a single file/module, let's say located at Foo/module/submodule
(which can import from across the pythonpath). However, setting the source root as "/" and then calling pants tailor ::
adds BUILD files all over the repo. How can I limit it to only the one submodule?
Feel free to have me read/grep documentation if the above is covered somewhere already.
gray-shoe-19951
04/01/2023, 12:23 AM
jolly-kite-53356
04/02/2023, 2:53 AM
platforms=["manylinux2014_x86_64-cp-310-cp310"]
to pex or use the new
docker_environment(
    name="python_bullseye",
    platform="linux_x86_64",
    image="python:3.10.9-slim-bullseye",
    python_bootstrap_search_path=["<PATH>"],
)
but it seems like the docker image it built is still in arm architecture. Could someone help me figure this out?
brave-hair-402
04/02/2023, 10:12 AM
proud-dentist-22844
04/02/2023, 5:26 PM
brave-hair-402
04/02/2023, 8:29 PM
crooked-lawyer-77407
04/03/2023, 4:00 AM
pants test
gives a Module not found error. How do I get tests to find their core module when using import <core_module>
in the tests? We've had this issue for a while but have circumvented it by just running pytest directly in an exported environment; would be nice to not have to do that.
happy-family-315
04/03/2023, 9:06 AM
ruff
with the lint
goal I don't get any output when ruff fails. How can I get an output of what's wrong?
nice-park-16693
04/03/2023, 10:34 AM
root_patterns
?
brave-hair-402
04/03/2023, 11:33 AM
powerful-eye-58407
04/03/2023, 3:21 PM
common_library
BUILD
dir1
BUILD
file1.py
file2.py
dir2
file1.py
dir3
...
project1
(python sources, some depending on common_library)
project2
(python sources, some depending on common_library)
3rdparty
BUILD
requirements.txt
pants.toml
I want to be able to additionally package just the common_library and distribute it as a wheel/sdist to be used in jupyter notebooks.
I ended up with BUILD of common_library containing sth like this:
python_distribution(
    name="common_library",
    dependencies=[
        # here, in setup.py, we had find_packages() to get all src files from dir1..dirN...
    ],
    provides=python_artifact(
        name="common",
        #...
    ),
)
The thing is I don't know how to specify that the common library depends on all subpackages within the common_library package - I don't want to list all files manually of course, as the real structure is much more complex.
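One workaround (a sketch, not the only way): since the dependencies field has no globbing, list the per-directory python_sources targets rather than individual files, and let dependency inference pull in whatever those modules import. The addresses below are hypothetical.

```python
# BUILD of common_library -- hypothetical target addresses for illustration.
python_distribution(
    name="common_library",
    dependencies=[
        # One address per subpackage's python_sources target (not per file);
        # transitively imported modules are picked up by dependency inference.
        "common_library/dir1",
        "common_library/dir2",
        "common_library/dir3",
    ],
    provides=python_artifact(
        name="common",
    ),
)
```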
I also like the fact that BUILD files are provided for each subdir of common lib, and I don't want to have a python_sources(sources=["**/*.py"])
at the top level BUILD of the lib instead - is there any other way to approach this?
limited-advantage-17894
04/03/2023, 4:02 PM
projectname
|-- src
| |-- python
| |-- data-models
| |-- src
| |-- data_models
| |-- __init__.py
| |-- module1.py
| |-- tests
| |-- BUILD
| |-- pyproject.toml
| |-- README.md
The BUILD
is like this
python_sources(
    name="data_models",
    sources=[
        "src/data_models/**/*.py",
    ],
    dependencies=["3rdparty/python#pydantic"],
)
resource(name="pyproject", source="pyproject.toml")
python_distribution(
    name="dist",
    dependencies=[
        ":pyproject",
    ],
    provides=python_artifact(
        name="data_models",
        version="0.1.0",
    ),
    wheel_config_settings={"--build-option": ["--python-tag", "py39"]},
    generate_setup=False,
)
And the pyproject.toml
is like this
[tool.poetry]
name = "data-models"
version = "0.1.0"
readme = "README.md"
packages = [{include = "data_models", from = "src"}]
classifiers = [
"Intended Audience :: Developers",
"Programming Language :: Python :: 3.9",
]
[tool.poetry.dependencies]
python = "^3.9"
pydantic = "^1.10.0"
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
When I tried to build the package with pants package
, it complained that src/python/data-models/src/data_models does not contain any element
(the full errors are in the thread)
However, when I ran poetry build
inside data-models
, everything works well. I might be missing something when it comes to configuring Pants to use poetry as its build backend.
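One thing worth checking (an assumption, not a confirmed diagnosis): with the layout above, the source roots may not match where poetry-core expects to find the package, since packages = [{include = "data_models", from = "src"}] is resolved relative to pyproject.toml inside the build sandbox. A pants.toml sketch with a hypothetical pattern for this layout:

```toml
[source]
# Hypothetical pattern matching src/python/data-models/src; adjust to your repo.
root_patterns = ["/src/python/*/src"]
```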
Any help is greatly appreciated!
swift-river-73520
04/03/2023, 6:21 PM
high-yak-85899
04/03/2023, 7:09 PM
grpcio-tools
(and them properly showing up in our lock file), it seems export
causes us to build that from source which takes a really long time.
future-oxygen-10553
04/03/2023, 8:06 PM
SourceField
for TOML files. This works great when I have a toml_source
target. Now, of course, I want to use the python_requirements
target with a pyproject.toml
file. Is there a way to configure the FieldSet
to have a union for the sources, so it pulls both `TomlSourceField`s and `PythonRequirementsSourceField`s? Assume for now that I’m fine with hard crashes if the PythonRequirementsSourceField
points to a requirements.txt
-format file 🙂
nice-park-16693
04/03/2023, 8:17 PM
docker_image
target to support non-Docker image builders such as Podman. I have written a plugin which provides an oci_image
target, delegating to Podman under the covers, by basically just mercilessly extending from a handful of the Pants standard docker backend classes. But I'm wondering if there might be more robust approaches which could be contributed back to Pants core -- for example, having pants.toml
able to specify which binary should be used in place of docker
(noting this would only work for those cases in which a given container builder is compatible with the docker
command-line, and indeed with `Dockerfile`s, like podman
is). What's the best forum to have that conversation -- should I open an issue?
big-xylophone-43403
04/03/2023, 8:17 PM
wonderful-boots-93625
04/03/2023, 8:30 PM
files
to be included not just resources
? we kinda abuse MANIFEST.in to put files into the distribution, and I'm trying to replicate it.
gray-shoe-19951
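For the MANIFEST.in question above, a hedged sketch of the usual pattern: model the extra files as resources (which travel alongside Python code into built packages) rather than files. The names and paths below are hypothetical.

```python
# BUILD -- hypothetical names and paths for illustration.
resources(
    name="package_data",
    sources=["py.typed", "templates/**/*.html"],
)

python_distribution(
    name="dist",
    dependencies=[":package_data"],
    provides=python_artifact(name="my-lib"),
)
```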
04/03/2023, 9:41 PM
[2023-04-03T19:05:54.375Z] 19:05:51.57 [INFO] Long running tasks:
[2023-04-03T19:05:54.375Z] 863.93s Determine Python dependencies for X1.py
[2023-04-03T19:05:54.375Z] 864.06s Determine Python dependencies for X2.py
[2023-04-03T19:05:54.375Z] 865.21s Determine Python dependencies for X3.py
[2023-04-03T19:05:54.375Z] 865.24s Determine Python dependencies for X4.py
[2023-04-03T19:05:54.375Z] 867.84s Determine Python dependencies for X5.py
[2023-04-03T19:05:54.375Z] 867.85s Determine Python dependencies for X6.py
[2023-04-03T19:05:54.375Z] 867.97s Determine Python dependencies for X7.py
2. if I restart the builds, sometimes it goes away
11:58:10 360.38s Test binary /bin/python.
11:58:10 360.38s Test binary /data/env/py3.9.13/bin/python.
11:58:10 360.38s Test binary /opt/conda/bin/python.
For both scenarios, I saw that multiple pantsd processes exist. For example:
sh-4.2# ps -ef
UID PID PPID C STIME TTY TIME CMD
root 1 0 0 17:08 pts/0 00:00:00 /usr/bin/dumb-init -- /usr/local/bin/run-jnlp-client 03564869d53ea68cd383a448680f9abfa4cc44fcd3f0480712dad2283953ec15 pan
root 7 1 2 17:08 ? 00:01:25 java -XX:+UseParallelGC -XX:MinHeapFreeRatio=5 -XX:MaxHeapFreeRatio=10 -XX:GCTimeRatio=4 -XX:AdaptiveSizePolicyWeight=90
root 803 1 0 17:10 ? 00:00:00 sh -c ({ while [ -d '/tmp/workspace/script@tmp/durable-1914e334' -a \! -f '/tmp/workspace/ar_AT
root 804 803 0 17:10 ? 00:00:01 sh -c ({ while [ -d '/tmp/workspace/script@tmp/durable-1914e334' -a \! -f '/tmp/workspace/ar_AT
root 805 803 0 17:10 ? 00:00:00 sh -xe /tmp/workspace/ar_ATOMFM-327_single_eval_script@tmp/durable-1914e334/script.sh
root 820 805 0 17:10 ? 00:00:00 /home/jenkins/.pex/venvs/0bd641e3a90c5dabea350e64a646029c08613838/779eb2cc0ca9e2fdd204774cbc41848e4e7c5055/bin/python /tm
root 822 820 0 17:10 ? 00:00:00 [/home/jenkins/.] <defunct>
root 823 1 16 17:10 ? 00:10:40 pantsd [/tmp/workspace/ar_ATOMFM-327_single_eval_script]
root 1118 823 0 17:11 ? 00:00:00 pantsd [/tmp/workspace/ar_ATOMFM-327_single_eval_script]
root 1119 823 0 17:11 ? 00:00:00 pantsd [/tmp/workspace/ar_ATOMFM-327_single_eval_script]
root 1120 823 0 17:11 ? 00:00:00 pantsd [/tmp/workspace/ar_ATOMFM-327_single_eval_script]
root 1121 823 0 17:11 ? 00:00:00 pantsd [/tmp/workspace/ar_ATOMFM-327_single_eval_script]
root 4778 0 0 18:14 pts/1 00:00:00 sh
root 4785 804 0 18:14 ? 00:00:00 sleep 3
root 4786 4778 0 18:14 pts/1 00:00:00 ps -ef
I tried disabling pantsd in Jenkins; it does not help. Instead of multiple pantsd processes, I see the following, for example:
root 798 0.0 0.0 0 0 ? Z 15:03 0:00 [python] <defunct>
root 799 0.0 0.0 0 0 ? Z 15:03 0:00 [python] <defunct>
My feeling is that somehow the child processes get stuck. I am wondering how I can troubleshoot this, since I cannot reproduce it locally. Your advice will be greatly appreciated.
swift-river-73520
04/03/2023, 10:59 PM
pytest
folders for different subprojects which share conftest.py
fixtures? Right now I'm putting shared test utilities into a common non-conftest.py module, then having a conftest.py file local to each subproject which imports the utilities and applies the pytest fixture
wrapper. This seems to work alright and eliminates duplication of the fixture logic, but the fixtures themselves still need to be duplicated to apply the fixture
wrapper. Feels like I'm probably missing a way to share the actual conftest.py
between tests in various directories / subprojects
curved-manchester-66006
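For the conftest-sharing question above: pytest's documented pytest_plugins hook lets a conftest.py load fixtures from any importable module, so the fixture definitions (decorator included) can live in one shared module instead of being re-wrapped per subproject. Module paths below are hypothetical.

```python
# shared/testing/fixtures.py -- an ordinary importable module (hypothetical path).
import pytest

@pytest.fixture
def api_client():
    # shared fixture logic lives here, decorator and all
    yield "client"

# Each subproject's conftest.py then only needs one line:
# pytest_plugins = ["shared.testing.fixtures"]
```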
04/04/2023, 12:44 AM
pants run
the app though and ran into some overlapping problems.
• (A) If I pants run src/py/foo:the-docker-target, then AFAIK there isn't any way to pass along docker options (in the sense of docker run [OPTIONS] IMAGE [COMMAND] [ARG...]
). The app takes its configuration from environment variables that it needs at run (not build) time.
• (B) With pants run src/py/foo:the-pex-target
I get "The run
goal only runs in the local environment. You may experience unexpected behavior.", which I presume is a pending feature and not a fundamental limitation.
• (C) Just invoke docker run
directly. That works, but the target is a little webapp and this would leave no path towards restartable=True
or better workflows <https://github.com/pantsbuild/pants/issues/17414>
fresh-continent-76371
04/04/2023, 3:06 AM
vcs_version
file that is generated, and drop it into the
python_artifact(
    name="my_lib",
    version=...,  # <-- HERE
),
field.
what I can't work out is:
• do I write a plugin? (I have tried many a time to write one, but seem to get trapped on the documentation)
• do I write a Field? (same question: how can I read the version file and feed it into the field?)
context:
• are using conventional commits
• using convco
to manage the version tagging and bumping (basically, if it's main, bump the version in the CI script)
but I am stuck on how to get that setuptools_scm version into the version field for the python artifact (knowing full well that the pyproject.toml itself is told to use setuptools_scm, which also generates the same 🙂)
breezy-apple-27122
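One documented route for the plugin question above is Pants' SetupKwargsRequest plugin hook, which lets a rule compute python_artifact kwargs (including version) at package time. A sketch, assuming Pants ~2.15 module paths and a VERSION file at the build root; treat the file name and class names as assumptions to verify against the "custom python_artifact kwargs" docs:

```python
# pants-plugins/dynamic_version/register.py -- hedged sketch, verify against the docs.
from pants.backend.python.goals.setup_py import SetupKwargs, SetupKwargsRequest
from pants.engine.fs import DigestContents, PathGlobs
from pants.engine.rules import Get, collect_rules, rule
from pants.engine.target import Target
from pants.engine.unions import UnionRule


class VersionFileSetupKwargsRequest(SetupKwargsRequest):
    @classmethod
    def is_applicable(cls, _: Target) -> bool:
        return True  # apply to every python_distribution in the repo


@rule
async def setup_kwargs_with_version(request: VersionFileSetupKwargsRequest) -> SetupKwargs:
    # Read the version written by the CI tagging step from a file (assumed to
    # be named VERSION here) and inject it into the python_artifact kwargs.
    contents = await Get(DigestContents, PathGlobs(["VERSION"]))
    version = contents[0].content.decode().strip()
    return SetupKwargs(
        {**request.explicit_kwargs, "version": version},
        address=request.target.address,
    )


def rules():
    return [
        *collect_rules(),
        UnionRule(SetupKwargsRequest, VersionFileSetupKwargsRequest),
    ]
```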
04/04/2023, 8:57 AM
careful-mechanic-89327
04/04/2023, 10:32 AM
12:24:56.74 [ERROR] 1 Exception encountered:
Engine traceback:
in select
in pants.core.goals.check.check
in pants.backend.scala.goals.check.scalac_check (scalac)
in pants.backend.scala.compile.scalac.compile_scala_source
in pants.jvm.compile.compile_classpath_entries
in pants.jvm.resources.assemble_resources_jar
in pants.engine.process.fallible_to_exec_result_or_raise
Traceback (most recent call last):
File "/Users/jbenito/.cache/pants/setup/bootstrap-Darwin-x86_64/pants.1Nnv7r/install/lib/python3.9/site-packages/pants/engine/process.py", line 275, in fallible_to_exec_result_or_raise
raise ProcessExecutionFailure(
pants.engine.process.ProcessExecutionFailure: Process 'Build resources JAR for sdk/transport-security-web-lib/src/test/resources:resources' failed with exit code 1.
stdout:
stderr:
/usr/bin/touch: illegal option -- d
usage:
touch [-A [-][[hh]mm]SS] [-acfhm] [-r file] [-t [[CC]YY]MMDDhhmm[.SS]] file ...
Do you have any idea why this could be happening?
Thanks in advance