quaint-oyster-63068
03/22/2023, 11:12 PM
…lambda.zip files, then upload each zip as a separate lambda layer?
jolly-kite-53356
03/22/2023, 11:55 PM
requirements.txt is updated with itsdangerous and I have run generate-lockfiles, and I can confirm I see itsdangerous in the lock files. However, when I run the program, it keeps saying ModuleNotFoundError: No module named 'itsdangerous'
witty-laptop-37343
03/23/2023, 3:18 AM
I'm using unittest.mock.patch to mock, but it works when testing with pytest and not with pants. I am wondering if there's extra setup required to mock external APIs with pants.
In conftest.py:
@pytest.fixture(scope="session")
def mock_api():
    mock_get_response_patcher = patch("some-api")
    mock_get_response = mock_get_response_patcher.start()
    mock_get_response.return_value = {....}
Then in my test, in A.py, I have:
def test_A(..., mock_api):
    .....
    # irrelevant code
    .....
I have verified the mock_api fixture is invoked correctly, but when I test it with pants, the mock function is called, yet the mock_get_response value is not what I expected, and it actually calls the underlying function instead of returning the value I set it to be.
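For reference, a minimal self-contained sketch of the session-scoped start/stop pattern; the patch target here is `json.loads`, a hypothetical stand-in (in practice you patch the dotted path where your API callable is looked up, not where it is defined):

```python
# conftest.py -- a sketch, not the poster's actual code. The patch
# target must be the dotted path where the callable is *looked up*;
# json.loads is only a self-contained stand-in for "some-api".
from unittest.mock import patch

import pytest


@pytest.fixture(scope="session")
def mock_api():
    patcher = patch("json.loads")
    mock_get_response = patcher.start()
    mock_get_response.return_value = {"status": "ok"}
    yield mock_get_response
    patcher.stop()  # undo the patch when the test session ends
```

One thing worth checking: pants runs each test file in its own pytest process by default, so a session-scoped fixture is re-created per process. That alone shouldn't break the mock, but it can make behavior differ from a single bare pytest run.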
However, when testing it with pytest, it works perfectly (it actually mocks), so I wonder if I missed anything?
big-agency-50572
03/23/2023, 11:59 AM
…~/.gitconfig. I have a rule to add my repo as a safe directory (git config --global --add safe.directory '/my/git/repo'), but when I run Pants (./pants --changed-since 1.2.3) I'm receiving this error:
[INFO] No git repository at /my/git/repo: GitBinaryException("fatal: detected dubious ownership in repository at '/my/git/repo'\nTo add an exception for this directory, call:\n\n\tgit config --global --add safe.directory /my/git/repo\n")
Does Pants use some other Git config?
average-breakfast-91545
03/23/2023, 12:24 PM
…entr, but wondered if pants can do this for me.
average-breakfast-91545
03/23/2023, 12:43 PM
…requests and structlog to a layer, I need to build a zip with the following structure:
python/
    requests/
    structlog/
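For comparison, outside of Pants this layout is usually assembled by pip-installing into a staging directory and zipping it under the python/ prefix. A sketch of the zip step (the function name and paths are illustrative, not an existing API):

```python
# Sketch: zip an already-populated site-packages-style directory under
# the python/ prefix that the Lambda runtime adds to sys.path.
# Assumes the deps were first installed with e.g.
#   pip install requests structlog --target ./staging
import os
import zipfile


def build_layer_zip(site_dir: str, out_path: str) -> None:
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(site_dir):
            for name in files:
                full = os.path.join(root, name)
                # Every archive entry is prefixed with python/.
                arcname = os.path.join(
                    "python", os.path.relpath(full, site_dir)
                )
                zf.write(full, arcname)
```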
Looks like the archive target only supports packages and files. Do I need to create a custom target for this? Is that something you'd like contributed back?
loud-laptop-17949
03/23/2023, 5:22 PM
…--mypy-args to pants check? @sparse-lifeguard-95737 and I have reproduced an issue where, if we use that parameter on the command line, pants does not appear to cache the results. We believe caching is not being used because subsequent runs with no changes do not get faster, as is usual with pants.
ripe-gigabyte-88964
03/23/2023, 8:15 PM
…pex_binary target to the extra_build_args field of a docker_image target?
few-arm-93065
03/23/2023, 9:45 PM
witty-laptop-37343
03/23/2023, 11:36 PM
…mocker, and I have added
python_tests(
    dependencies=[.., "//:root#pytest-mock"],
)
but getting
A distribution for pytest-mock could not be resolved for /opt/homebrew/Caskroom/miniconda/base/envs/sample/bin/python3.9
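Because `mocker` is a fixture that test code never imports, dependency inference cannot see pytest-mock, which is why a manual dependency is needed. One alternative sometimes used for such pytest plugins is to supply them to the pytest runner itself; a hedged pants.toml sketch (verify this option exists in your Pants version with `./pants help-advanced pytest`):

```toml
# pants.toml -- a sketch; the unpinned requirement is illustrative,
# not a recommendation.
[pytest]
extra_requirements.add = ["pytest-mock"]
```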
What's the correct way for pants to infer this pytest-mock dependency? Am I missing something?
chilly-holiday-77415
03/24/2023, 11:56 AM
11:53:26.12 [ERROR] 1 Exception encountered:
Engine traceback:
  in `run` goal
IntrinsicError: No such file or directory (os error 2)
using
pants_version = "2.16.0.dev5"
Appreciate this might be a sharp edge of not being on a release version, but thought I'd ask since I didn't spot any related issues. It seems to happen intermittently without an obvious trigger, and it doesn't happen with scie_pants, which I'm probably going to push everyone on the team to use instead to solve it 🙂
swift-river-73520
03/24/2023, 4:35 PM
…src/python/projectA
src/python/projectB
src/python/pipeline
and projectB is dependent on projectA. Both projects are using the poetry_requirements target and pyproject.toml files to specify their dependencies. Eventually pipeline will be deployed as a docker image, and it has dependencies on both projectB and projectA.
How do I indicate these internal dependencies to Pants to ensure that projectB and projectA sources are packaged and included in the docker image deployed from pipeline (or even when running the tests in pipeline)?
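For first-party code in the same (default) resolve, Pants' dependency inference normally discovers these relationships from the import statements alone. A hedged sketch of making them explicit in the pipeline BUILD file, in case inference cannot see them (target names and the entry-point path are assumptions about this layout):

```python
# src/python/pipeline/BUILD -- a sketch. Explicit dependencies are a
# fallback here, not a requirement; inference usually finds
# projectA/projectB from the pipeline code's imports.
pex_binary(
    name="bin",
    entry_point="pipeline/main.py",  # hypothetical entry point
    dependencies=[
        "src/python/projectA",
        "src/python/projectB",
    ],
)

docker_image(name="docker")
```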
It feels like I would maybe want to use parameterized resolves on each target within projectA and projectB?
ripe-gigabyte-88964
03/24/2023, 7:24 PM
bitter-ability-32190
03/24/2023, 7:53 PM
alert-dawn-51425
03/24/2023, 8:45 PM
| => ./pants check ::
16:44:27.01 [INFO] Initializing scheduler...
16:44:27.03 [WARN] DEPRECATED: Setting `Goal.environment_behavior=EnvironmentBehavior.UNMIGRATED` for `Goal` `list-docker-build` is scheduled to be removed in version 2.17.0.dev0.
See <https://www.pantsbuild.org/v2.15/docs/plugin-upgrade-guide>
16:44:27.03 [WARN] DEPRECATED: Setting `Goal.environment_behavior=EnvironmentBehavior.UNMIGRATED` for `Goal` `dc` is scheduled to be removed in version 2.17.0.dev0.
See <https://www.pantsbuild.org/v2.15/docs/plugin-upgrade-guide>
16:44:28.42 [INFO] Scheduler initialized.
16:44:39.36 [INFO] Canceled: Building extra_type_stubs.pex from 3rdparty/python/mypy_lockfile.txt
16:44:39.47 [ERROR] 1 Exception encountered:
Engine traceback:
  in `check` goal
  in Typecheck using MyPy - (environment:local, mypy)
InvalidLockfileError: You are using the lockfile at build_support/plugins/deps_lock.txt to install the resolve `pants-plugins` (from `[python].resolves`). However, it is not compatible with the current targets because:
- The targets use interpreter constraints (`CPython==3.10.*`) that are not a subset of those used to generate the lockfile (`CPython<3.10,>=3.7`).
The lockfile's interpreter constraints are set by the option `[python].resolves_to_interpreter_constraints`, which determines how the lockfile is generated. Note that that option only changes how the lockfile is generated; you must still set interpreter constraints for targets via `[python].interpreter_constraints` and the `interpreter_constraints` field (<https://www.pantsbuild.org/v2.15/docs/python-interpreter-compatibility>). All targets must have interpreter constraints that are a subset of their resolve's constraints.
To fix this, you can either adjust the interpreter constraints of the targets which use the resolve 'pants-plugins', or adjust `[python].resolves_to_interpreter_constraints` then run `generate-lockfiles`.
To regenerate your lockfile, run `./pants generate-lockfiles --resolve=pants-plugins`.
Relevant pants.toml file values:
[python]
enable_resolves = true
interpreter_constraints = ["==3.10.*"]
[python.resolves_to_interpreter_constraints]
# Pants can run with 3.7-3.9, so this lets us
# use different interpreter constraints when
# generating the lockfile than the rest of our project.
#
# Warning: it's still necessary to set the `interpreter_constraints`
# field on each `python_sources` and `python_tests` target in
# our plugin! This only impacts how the lockfile is generated.
pants-plugins = [">=3.7,<3.10"]
[python.resolves]
python-default = "3rdparty/python/python3-deps_lock.txt"
pants-plugins = "build_support/plugins/deps_lock.txt"
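Reading the error together with the comment already in this pants.toml, the missing piece appears to be the per-target constraints that the comment itself warns about: the plugin targets are inheriting the repo-wide `==3.10.*` constraint, which is not a subset of the lockfile's `>=3.7,<3.10`. A hedged sketch of what the plugin targets' BUILD file would declare (the path and field values are inferred from the error message, not taken from the poster's repo):

```python
# build_support/plugins/BUILD -- a sketch. Declaring the narrower range
# on the targets themselves makes their constraints a subset of the
# pants-plugins lockfile's, which is what InvalidLockfileError demands.
python_sources(
    resolve="pants-plugins",
    interpreter_constraints=[">=3.7,<3.10"],
)
```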
swift-river-73520
03/24/2023, 11:17 PM
…src/python/sirch/BUILD
src/python/sirch/sirch/*.py
src/python/sirch/sirch/BUILD
In the src/python/sirch/BUILD file I have the following:
python_sources(
    name="sirch",
    sources=["sirch/*.py"],
)
docker_image(
    name="docker",
)
pex_binary(name="bin")
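One hedged observation about the snippet above: a `pex_binary` infers its dependencies by walking the imports of its entry point, and this one declares neither `entry_point`/`main` nor explicit `dependencies`, so inference has nothing to walk. A sketch of the likely fix (the module path is an assumption):

```python
# src/python/sirch/BUILD -- a sketch. Without an entry point or explicit
# dependencies, the pex has no imports to trace, so it comes out empty.
pex_binary(
    name="bin",
    entry_point="sirch/main.py",  # hypothetical entry module
)
```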
When I run pants package src/python/etxpipelines/sirch:bin it spits out a pex file super fast with no warnings, but when I go to unzip the pex file to inspect what ended up in it, I don't see any of the first-party code from src/python/sirch/*.py, or any of the third-party dependencies I would expect either. What am I missing here? I also tried putting a pex_binary target in src/python/sirch/sirch/BUILD and building the pex from there, but that didn't seem to change anything. It seems like the pex_binary target isn't inferring that there are any dependencies?
high-yak-85899
03/24/2023, 11:21 PM
263.97s Scheduling: Building dockerfile_parser.pex from dockerfile-parser_default.lock
average-breakfast-91545
03/25/2023, 10:56 AM
…dirutil module that will create me a temp dir under the build root, but the docs are full of terrifying warnings about IO in rules. Is there something I need to be aware of here in order not to enrage the Cache Gods?
From a thread in #general
proud-dentist-22844
03/25/2023, 9:27 PM
…the export goal: if I want to create a virtualenv with wheels built by pants, but use the versions locked in the pex lockfile, how might I do that?
I assume there's something in PEX to do that? I couldn't find anything about lockfiles in the pex docs.
witty-laptop-37343
03/26/2023, 3:16 PM
…PANTS_NAMED_CACHES_DIR="/myProject/pants/named_caches"
when running pytest, I see that the libraries are loaded in
"/myProject/pants/named_caches/pex_root/venvs/s/4d6744fa/venv/lib/python3.9/....."
Is there a way I could get the value "4d6744fa"?
broad-dentist-80514
03/27/2023, 2:08 PM
…pip install the .whl into a given env, the process gets stuck resolving deps. It installs fine when I pass pip the --use-deprecated=legacy-resolver flag.
My question: Is there a way to pass --use-deprecated=legacy-resolver into pants?
Because currently, pants (like pip) seems to just time out at Building requirements.pex with 1 requirement: ...@ file://
high-magician-46188
03/27/2023, 3:00 PM
….py files in another package.
I thought to solve this by adding the following field to the target in the first BUILD file:
dependencies=[
    ":second-package-sources",  # this refers to all of the '.py' files in the second package
],
average-father-89924
03/27/2023, 3:15 PM
pants fmt file.py runs a different isort than pants fmt ::. The latter adds additional line breaks, while the first one removes them. To me, both commands should do the same thing, or am I wrong? We have also set up black, but this problem is isort-specific.
isort config from pants.toml:
[isort]
version = "isort==5.11.4"
lockfile = "3rdparty/python/isort.lock"
interpreter_constraints = ["CPython==3.11.2"]
rich-london-74860
03/27/2023, 3:16 PM
1. …--changed-since
2. Find the python_distribution those python files belong to (if any)
3. Verify that the version used in the python_distribution has changed. Since we use bumpversion, this is very easy to do by checking if the setup.cfg file has changed, which we can assume lives in the same directory as the BUILD file where a python_distribution is defined.
The purpose of this plug-in is to prevent code from getting merged that changes a distribution without also changing and releasing a new version. I should note that we are using Artifactory to serve a private PyPI.
As I mention, 1 and 3 are easy, but 2 has a gotcha. By itself --changed-since will only list the specific targets that have changed. In the primary use case we are considering (a developer changed a few python files without changing anything else), this will only include python source targets. Adding --changed-dependees=transitive will recursively include every dependee. This does include the python distribution target that the changed python source files live in, but for python distributions that are dependencies of other python distributions, this will include those other python distribution targets as well.
This is correct and desirable behavior in some cases. For example, for testing, if a deep dependency has changed, all of its dependees should be tested. However, this is not desirable for deploying python packages. When a deep dependency has changed, it is not necessary to deploy its dependees, because those dependees do not package and include that dependency. An app that uses the deep dependency and one or some of its dependees only needs to install a new version of that deep dependency.
For example, suppose in a pants repo we have python distributions A and B, where B depends on A. A includes file a.py and B includes file b.py. If file a.py changed, then we want to make sure that distribution A has also changed. --changed-since will get us a.py only, and adding --changed-dependees=transitive gets us A and B. Outside of the pants repo, suppose there is an app C that installs A and B. Even though distribution B is a dependee of distribution A, app C only needs to install a new version of A.
There are a few fixes and workarounds that I have considered:
1. Forget about doing all of this and instead prevent overwrites on our artifactory pypi host. However, there are other use cases where we do want to overwrite existing versions and finding this problem after merging is annoying.
2. Do not express dependencies between distributions in pants at all. In this example, this would mean B has a dependency on requirement A instead of on the python sources directly. However, this would mean a different resolve for every python distribution, and it seems to defeat the point of having everything in pants to take advantage of those other use cases where we do want full recursive dependencies.
3. Use 2 dependees commands to get the next python distribution dependency, without getting deeper distribution dependencies: ./pants list --changed-since=other | xargs ./pants dependees | xargs ./pants dependees
Option 3 is the current plan, but this feels like a hack. Ideally, I would like to be able to traverse the DAG of dependencies inside the plugin code. Is this possible?
proud-dentist-22844
03/27/2023, 4:13 PM
…setup.py for wheels (python_distribution).
In an exported venv (./pants export ...), I would like to use editable-mode installations of all the exported resolve's code (i.e. the code that will end up in that resolve's `python_distribution`s).
So, is there a way to get the pants-generated setup.py and do something like setup.py develop? Or perhaps a PEP 660 install?
One constraint on the solution: Setuptools gained support for PEP 660 in v64, but that is python 3.7+, and I need python 3.6 for the StackStorm project (for now; I need to figure out editable installs to unblock some other things before we can work on finally dropping 3.6).
proud-dentist-22844
03/27/2023, 6:04 PM
…PEX_TOOLS=1 reqs.pex venv --pip some/path, is there a way to control which version of pip+setuptools gets added to the venv?
I'm guessing I need to modify creation of the reqs.pex to ensure it includes the right versions. And if it is simply the EntireLockfile, like pants does here, then I would need to somehow get those pip+setuptools requirements into the lockfile, right?
enough-analyst-54434
03/27/2023, 6:05 PM
swift-river-73520
03/27/2023, 6:21 PM
…__init__.py file.
For example, given a project layout like this:
src/python/pylibrary/sub1/*.py   # also has BUILD
src/python/pylibrary/sub2/*.py   # also has BUILD
src/python/pylibrary/__init__.py # <- has some imports from sub2
src/python/BUILD
one way I thought I'd be able to force pants to include all the files in pylibrary was to explicitly list the target dependencies:
python_distribution(
    dependencies=["pylibrary/sub1:src", "pylibrary/sub2:src"],
)
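A hedged aside on the snippet above: dependency addresses in a BUILD file are spelled from the build root (unless written with relative-address syntax), so from `src/python/BUILD` the subpackage targets would normally be given in full. A sketch with the full addresses (the target and artifact names are assumptions):

```python
# src/python/BUILD -- a sketch. python_distribution also requires a
# provides= field; one is shown to keep the example well-formed.
python_distribution(
    name="pylibrary-dist",
    dependencies=[
        "src/python/pylibrary/sub1:src",
        "src/python/pylibrary/sub2:src",
    ],
    provides=python_artifact(name="pylibrary", version="0.1.0"),
)
```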
but this seems to still only include files referenced directly or transitively from src/python/pylibrary/__init__.py.
So what's the canonical way to build a wheel distribution which includes all files in a subpackage, with the goal of being able to install this library in a consumer's environment? Right now it feels like the only way would be to specify a public API in src/python/pylibrary/__init__.py (which maybe I should be doing anyway).
jolly-kite-53356
03/27/2023, 7:37 PM
…Could not find a compatible interpreter. when running a python source. It seems like pants did find the interpreters defined in pyenv, but skipped them for some reason:
Examined the following working interpreters:
1.) /Users/xx/.pyenv/versions/3.7.16/bin/python3.7 CPython==3.7.16
...
Skipped the following broken interpreters:
1.) /Users/xx/.pyenv/shims/python3.7:
pyenv: python3.7: command not found
The `python3.7' command exists in these Python versions:
3.7.16
Note: See 'pyenv help global' for tips on allowing both
python2 and python3 to be found.
swift-river-73520
03/27/2023, 8:34 PM
src/python/pipelines/pipe1/pipe1/*.py
src/python/pipelines/pipe1/pipe1/BUILD  # contains pex_binary target
When I run pants package src/python/pipelines/pipe1/pipe1:bin I get a pex file that includes all the appropriate first- and third-party deps (awesome), but the import path for pipe1 is not quite what I'm expecting. When I launch an interpreter using the generated pex file, the required import path is
from pipe1.pipe1 import main
whereas I was expecting it to be
from pipe1 import main
Any ideas on what I'd need to do to get the latter import path? It's odd to me that I'm even able to do from pipe1.pipe1 import ... as there's no __init__.py file at src/python/pipelines/pipe1/
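The doubled path segment is consistent with how Pants computes module names: a file's import path is its path relative to the nearest source root, and importing without `__init__.py` works because Python 3 treats such directories as implicit namespace packages (PEP 420). A hedged pants.toml sketch (the glob pattern is an assumption about this layout):

```toml
# pants.toml -- a sketch. With src/python/pipelines as the source root,
# src/python/pipelines/pipe1/pipe1/main.py imports as pipe1.pipe1.main;
# making each project directory its own root shortens that to pipe1.main.
[source]
root_patterns = ["/src/python/pipelines/*"]
```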