polite-journalist-87208
12/06/2022, 10:11 PM
ImportError: Error while finding module specification for 'foo.d.bar' (ModuleNotFoundError: No module named 'foo')
stocky-helmet-22655
12/06/2022, 10:38 PM
./pants generate-artifacts is saying it can’t find artifacts which install fine with pipenv install.
faint-dress-64989
12/06/2022, 10:44 PM
./pants run src/python/maergo:main works. When attempting to run the application via docker with ./pants run src/docker/maergo:python, I encounter the following:
Failed to find compatible interpreter on path /usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin.
Examined the following interpreters:
1.) /usr/local/bin/python3.10 CPython==3.10.7
2.) /usr/bin/python3.9 CPython==3.9.2
No interpreter compatible with the requested constraints was found:
Failed to resolve requirements from PEX environment @ /root/.pex/unzipped_pexes/dbe958d0761ce1dbfdab65e69c2ae9a4658f75b5.
Needed cp310-cp310-manylinux_2_31_aarch64 compatible dependencies for:
1: pydantic!=1.7,!=1.7.1,!=1.7.2,!=1.7.3,!=1.8,!=1.8.1,<2.0.0,>=1.6.2
Required by:
fastapi 0.85.2
But this pex had no ProjectName(raw='pydantic', normalized='pydantic') distributions.
2: pydantic<2.0.0,>=1.10.2
But this pex had no ProjectName(raw='pydantic', normalized='pydantic') distributions.
This is a link to my test repository. Is there some other dependency I need to add to my Dockerfile to enable the execution of pex files?
nutritious-hair-72580
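For the interpreter error above: the missing cp310/aarch64 pydantic distributions usually mean the pex was resolved without wheels for the container's platform. A hedged sketch of one way to address that with pex_binary's complete_platforms field (target and file names here are illustrative assumptions, not from the repo; the JSON file would be generated inside the image with pex3 interpreter inspect --markers --tags):

```python
# src/python/maergo/BUILD — sketch; names are assumptions
file(name="aarch64_platform", source="aarch64_cp310.json")

pex_binary(
    name="main",
    entry_point="main.py",
    # also resolve wheels for the docker image's interpreter/platform
    complete_platforms=[":aarch64_platform"],
)
```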
12/07/2022, 12:46 AM
bitter-ability-32190
12/07/2022, 1:53 AM
With pex lock create, I want something halfway between --style universal and (no args): multi-platform, but only for a select few platforms. Passing an assortment of --platform manylinuxXXX-cp-38-cp38 gets me going until it hits timeout-decorator~=0.4, which is only published as an sdist:
ERROR: Could not find a version that satisfies the requirement timeout-decorator~=0.4
ERROR: No matching distribution found for timeout-decorator~=0.4
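For the sdist-only failure above: abbreviated --platform values make the resolve wheel-only, which is why timeout-decorator (published only as an sdist) cannot be matched. A hedged sketch of an alternative worth trying (the flags are real pex options, but whether they fit this lock workflow is an assumption to verify): describe each target platform completely instead of abbreviating it.

```shell
# On (or for) each target machine, dump a complete platform description:
pex3 interpreter inspect --markers --tags --indent 2 > cp38_linux.json

# Then lock against the complete platform rather than an abbreviated one:
pex3 lock create \
  --complete-platform cp38_linux.json \
  -r requirements.txt \
  -o lock.json
```

A complete platform carries the interpreter's environment markers and compatibility tags, which gives the resolver more to work with than the abbreviated triple.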
ambitious-actor-36781
12/07/2022, 3:33 AM
tar is failing with ./lmdb_store/files/e/data.mdb: file changed as we read it errors.
What might be triggering this? My suspicion is pantsd is flushing changes while tar is running.
Is there a way to force pantsd to cleanly flush and exit?
lemon-oxygen-72498
12/07/2022, 9:25 AM
I have interpreter_constraints = ["==3.8.*", "==3.9.*"] in the top-level pants.toml file, and in one library I have interpreter_constraints=parametrize(py38=["==3.8.*"], py39=["==3.9.*"]) to generate targets for the two interpreters.
My problem is that pants doesn't want to run black anymore; ./pants lint --only=black :: fails with:
InvalidLockfileError: You are using the lockfile at pants-utils/3rdparty/python/black_lockfile.lock to install the tool `black`, but it is not compatible with your configuration:
- You have set interpreter constraints (`CPython<4,>=3.7`) that are not compatible with those used to generate the lockfile (`CPython==3.8.* OR CPython==3.9.*`).
You can fix this by not setting `[black].interpreter_constraints`, or by using a new custom lockfile.
But ./pants lint --only=isort :: works 🤯 Running ./pants generate-lockfiles --resolve=black doesn't solve the issue. What is the root explanation, and what can I do?
gifted-agency-25998
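For the black lockfile mismatch above, the error text itself names the knob: the tool's own interpreter constraints differ from those the lockfile was generated with. A hedged sketch of the alignment it suggests (option name taken from the error message; whether this resolves the parametrize interaction is an assumption):

```toml
# pants.toml — align black's tool constraints with the repo-wide ones
[black]
interpreter_constraints = ["CPython==3.8.*", "CPython==3.9.*"]
```

followed by regenerating the tool lockfile, e.g. ./pants generate-lockfiles --resolve=black.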
12/07/2022, 3:30 PM
With pants being installed in the root of the repo, I find it a little more effort to go to the repo root before running it (or having to count the number of ../ I need to do). Is there some way to be able to call pants from any subdirectory of the repo?
stocky-helmet-22655
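For running pants from a subdirectory, one lightweight workaround (a sketch, assuming the repo is a git checkout with the ./pants launcher script at its root) is a shell function that jumps to the repo root first:

```shell
# Run the repo-root ./pants from any subdirectory of a git checkout.
# Assumes git is available and an executable ./pants script sits at the root.
pants() {
  (cd "$(git rev-parse --show-toplevel)" && ./pants "$@")
}
```

Dropped into ~/.bashrc or ~/.zshrc, pants invocations then work from any subdirectory; the parenthesized subshell keeps your current working directory unchanged.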
12/07/2022, 5:12 PM
With pipenv install it works fine, but when I run ./pants generate-lockfiles it claims numpy cannot be found.
rapid-bird-79300
12/07/2022, 7:09 PM
I want to prevent skip_black=True from being set on any target. Also considering a simple plugin for this.
glamorous-accountant-97217
12/07/2022, 11:11 PM
I ran ./pants export :: and got some interpreters for tools under virtualenvs, as well as a virtualenv that comes with the message Wrote virtualenv (using Python 3.10.8) to dist/export/python/virtualenv. The pytest env had pytest and some dependencies on my project’s sources, but it was missing numpy, which is a transitively required third-party dependency. Am I doing something wrong?
gentle-gigabyte-52115
12/08/2022, 8:45 AM
I'm looking at --tailor-ignore-adding-targets, as this doesn’t seem to support glob patterns. Is there any logic to prevent tailor from generating a specific target type (e.g. python_sources / python_requirements), or is there an option to add globs to tailor-ignore-adding-targets? I have thought about adding it to pants_ignore, but pants actually needs to be aware of these files outside of the tailor context.
thousands-plumber-33255
12/08/2022, 8:48 AM
pants.toml
/services
  /serviceA
    BUILD
    /tests
      BUILD
      example_test.py
In pants.toml I have added this entry:
[environments-preview.names]
linux_docker = "//:local_qgis"
In `serviceA/BUILD`:
docker_environment(
name="local_qgis",
image="qgis/qgis:final-3_22_9",
)
In `serviceA/tests/BUILD`:
python_tests(
environment="local_qgis"
)
I am getting the following error with `./pants test services/serviceA/tests/example_test.py`:
ResolveError: The address //:local_qgis from the option [environments-preview].names does not exist.
The target name ':local_qgis' is not defined in the directory . Did you mean one of these target names?
* :root
Quite sure I have missed something fundamental, but I don't get it. Can someone help? 🙂
worried-painter-31382
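The ResolveError above says pants looked for :local_qgis at the repo root (//), but the docker_environment target is declared in services/serviceA/BUILD. A sketch of the likely fix, based on that layout:

```toml
# pants.toml — point the environment name at where the target actually lives
[environments-preview.names]
linux_docker = "services/serviceA:local_qgis"
```

And in services/serviceA/tests/BUILD, use python_tests(environment="linux_docker"): the environment field consumes the configured name from [environments-preview.names], not the target name.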
12/08/2022, 9:21 AM
We use
ARG SOME_IMAGE="path/to/some-image/target"
FROM $SOME_IMAGE
for dev builds, letting pants build the image dependency.
However, in our CI pipeline we often host the FROM images in our internal registry, so we use --docker-build-arg "SOME_IMAGE=<some-image-uri>". This works for building the image, but pants also still builds the path/to/some-image/target dependency, only to then not use it. Is this a bug or a feature? 🙂 It's problematic for us because we do not have internet access in our builds, but some-image uses apt. We would prefer if pants inspected the build args before resolving image dependencies, to not trigger docker builds that are not going to be used.
freezing-lamp-12123
12/08/2022, 10:42 AM
Process … 🪚
loud-laptop-89838
12/08/2022, 1:04 PM
I have a conftest file that has from delta import configure_spark_with_delta_pip. That package is actually delta-spark. When I run ./pants dependencies tests/conftest, that dependency isn't inferred. So in tests/BUILD I added
python_requirement(
name="delta",
requirements=["delta-spark==2.2.0"]
)
But when I run the dependency check again, delta still isn't there. Am I putting the requirement in the wrong place?
aloof-appointment-30987
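For the inference question above: pants maps imported module names to requirements, and it may not connect the module delta to the project name delta-spark on its own. A hedged sketch using the python_requirement modules field, which exists for exactly this mapping (version pin copied from the message):

```python
# tests/BUILD — tell dependency inference which module(s) this requirement provides
python_requirement(
    name="delta",
    requirements=["delta-spark==2.2.0"],
    modules=["delta"],
)
```

With the mapping declared, ./pants dependencies on the conftest target should then pick up the requirement via the from delta import inference.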
12/08/2022, 2:58 PM
How do I connect a vcs_version target to the version property when producing a python_artifact (python_distribution) without hand-coding a setup.py?
lemon-oxygen-72498
12/08/2022, 3:16 PM
Is there a way to control interpreter_constraints on python_distribution? Globally I have interpreter_constraints = ["==3.8.*", "==3.9.*"], which pants refuses when `package`ing:
SetupPyError: Expected a single interpreter constraint for libs/image_augmentation:kaiko-image-augmentation-dist, got: CPython==3.8.* OR CPython==3.9.*.
Makes sense, so I set this default in the top-level BUILD file:
__defaults__(
{
python_distribution: dict(interpreter_constraints=["==3.8.*"]),
}
)
I obtain SetupPyError: Expected a single interpreter constraint for libs/image_ml:kaiko-image-ml-dist, got: CPython==3.8.* OR CPython==3.8.*,==3.9.*.
This looks weird to me, because it means the two fields are ANDed together, whereas I'm expecting __defaults__ to override the value.
At this point there's a nice hint saying to translate the conjunction using exclusions, so I changed the __defaults__ above to !=3.9.*. Then it fails with:
Expected a single interpreter constraint for libs/image_augmentation:kaiko-image-augmentation-dist, got: CPython!=3.9.*,==3.8.* OR CPython!=3.9.*,==3.9.*.
It looks like I'd almost be there if pants trimmed unsatisfiable disjunctions 😉
What else can I do 🙏 ?
lemon-oxygen-72498
12/08/2022, 4:44 PM
brash-student-40401
12/08/2022, 4:55 PM
I changed my requirements.txt and tried updating my lockfile. This is consistently throwing an error now:
ERROR: THESE PACKAGES DO NOT MATCH THE HASHES FROM THE REQUIREMENTS FILE. If you have updated the package versions, please update the hashes. Otherwise, examine the package contents carefully; someone may have tampered with them.
dandelion from <https://XXX.d.codeartifact.us-west-2.amazonaws.com/pypi/XXX/simple/app/0.0.1/app-0.0.1.tar.gz#sha256=hash>:
Expected sha256 hash
Got different hash
The expected hash is the one I see on CodeArtifact. My guess is that the "Got" one is from an earlier version that is cached; how can I do something like pip's --no-cache-dir to get around this?
faint-dress-64989
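For the stale-hash question above, a hedged sketch (the path is pants' default named-caches location and worth verifying for your setup): pants keeps its pip/PEX state under a named caches directory, and removing the pex_root portion forces the next resolve to re-download and re-verify hashes.

```shell
# blow away the PEX/pip caches that pants maintains (default location)
rm -rf ~/.cache/pants/named_caches/pex_root
```

This is the closest analogue to pip's --no-cache-dir; it is coarse, but safe in the sense that the cache is rebuilt on the next resolve.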
12/08/2022, 5:08 PM
When I run ./pants package src/python/maergo:main, only main.py is added to the pex file and the rest of the source files are ignored. Does every python package need its own BUILD file that is then referenced in the main BUILD of the application, or is there a simpler solution?
fresh-cat-90827
12/08/2022, 5:12 PM
powerful-florist-1807
12/08/2022, 6:29 PM
I use the logging library to log messages in tests, but I didn't see any output messages even though I applied the -ldebug option, e.g. "./pants -ldebug test ...". Does anyone know how to see the logging output? Thanks!
faint-dress-64989
12/08/2022, 6:45 PM
The pex_binary target has a param restartable that can be set to true so an application is restarted automatically when input files change. This seems to be integrated with the pants run goal. I want to get this same functionality when running applications using docker-compose. Generally I would do this with a volume plus some kind of reload setting (e.g. reload with fastapi). I'm not sure how to propagate pex changes to docker-compose to restart my container. I was wondering if anyone has a solution to this, more specifically with a container running fastapi.
brash-student-40401
12/08/2022, 7:53 PM
src
└── clients
    └── app
        ├── BUILD
        └── app
            ├── BUILD
            ├── __init__.py
            └── app.py
The first BUILD contains the packaging target:
python_distribution(
name="app_dist",
dependencies=["src/clients/app/app"],
....
This successfully builds the distribution, but then when I import it, I have to import clients.app.app. How do I set it so that the generated packages ignore the top-level directories? Ultimately, I want users to be able to import app.
ambitious-actor-36781
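For the import-path question above: the import prefix of a distribution follows pants' source roots. A hedged sketch (the pattern value is an assumption based on the tree shown): marking src/clients/app as a source root makes app the top-level package.

```toml
# pants.toml — make the inner directory a source root so imports start at `app`
[source]
root_patterns = ["src/clients/app"]
```

With that root in place, src/clients/app/app/__init__.py is importable as app rather than clients.app.app, both in the repo and in the built distribution.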
12/08/2022, 9:48 PM
Is there a way to keep output around after ./pants test? We want to keep the pytest-captured logging in failed tests. --no-???-cleanup or whatever sort of works, but seems heavy-handed.
powerful-florist-1807
12/08/2022, 10:46 PM
22:36:33.91 [DEBUG] Running Run Pytest for airflow2/test/python/data_dag_tests/dag_tests/test_dags_shard_6.py:tests under semaphore with concurrency id: 8, and concurrency: 1
22:36:33.91 [DEBUG] Starting: Run Pytest for airflow2/test/python/data_dag_tests/dag_tests/test_dags_shard_6.py:tests
22:36:33.91 [DEBUG] Starting: setup_sandbox
The message `concurrency: 1` makes me think there is actually no concurrency. Am I correct? Thanks!
brief-ambulance-96324
12/09/2022, 12:21 AM
brief-ambulance-96324
12/09/2022, 12:21 AM
happy-kitchen-89482
12/09/2022, 4:17 AM