lemon-oxygen-72498
01/24/2023, 7:54 PM
The bump plugin we have written is now having hiccups with messages of the form:
Filesystem changed during run: retrying in 500ms...
Because the bump task bumps version numbers in pyproject.toml files, the retry is causing multiple bumps to happen with one command. As in, we request bumping the patch number once, and 0.0.1 gets bumped to 0.0.3 or 0.0.4. It's not deterministic and I have not yet understood the trigger 🤷
My question is: would --no-watch-filesystem --no-pantsd be a quick workaround until I figure things out?
refined-addition-53644
01/24/2023, 7:56 PM
curved-manchester-66006
01/24/2023, 8:30 PM
foo[bar], other code to depend on foo[bar,bax], and yet other code to depend on foo[qux], then those dependencies need to be in separate resolves?
crooked-country-1937
01/25/2023, 2:58 AM
silly-queen-7197
01/25/2023, 6:28 AM
requirements.txt and requirements-dev.txt. The first is for 3rd-party dependencies we use in our application, while the second is limited to pytest, black, flake8, and the like. When developing we have one venv that we can point VS Code at. In addition to the code-completion benefits, we can use VS Code's Testing extension with Python to easily set breakpoints when debugging tests.
I ran ./pants export :: but pytest isn't included as a dependency in the main venv it exported, and I'm not sure how to use the exported pytest venv as it's not able to import any of my code (I'm using a src-style layout). I could just include pytest as a 3rd-party dependency and specify export = false in my pants.toml, but I don't want to ship pytest to production.
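The prod/dev split described above maps naturally onto Pants' multiple resolves: dev tools live in their own resolve, so they can still be exported for VS Code but never enter the application's dependency graph. A hedged sketch; the resolve name and lockfile paths are illustrative, not from this thread:

```toml
# pants.toml -- sketch; lockfile paths are placeholders
[python]
enable_resolves = true

[python.resolves]
# application (production) dependencies
default = "3rdparty/python/default.lock"
# dev-only tooling: pytest, black, flake8, stubs, ...
dev-tools = "3rdparty/python/dev-tools.lock"
```

A python_requirements(source="requirements-dev.txt", resolve="dev-tools") target would then keep those pins out of the default resolve, and ./pants export --resolve=dev-tools gives a venv to point VS Code at.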
How can I set up Pants so I can maintain dev dependencies (e.g. pytest, typeshed, etc.) to facilitate a good developer experience while ensuring I don't ship them to production?
lemon-oxygen-72498
01/25/2023, 8:37 AM
chilly-holiday-77415
01/25/2023, 11:13 AM
asdf over pyenv, but either, once set up, makes that easy), just do `./pants run x`” and skip any conversation about venvs when onboarding a non-Python colleague to a codebase
• ./pants repl :: with ipython set in the config is really great
chilly-holiday-77415
01/25/2023, 11:16 AM
11:11:33.65 [WARN] Pants cannot infer owners for the following imports in the target <..>/stepfn.py:
  * mypy_boto3_stepfunctions.client.SFNClient (line: 7)
I’m using type-stubs mapping here, which makes mypy happy but still emits the above warning, and I can’t also use module_mapping. How should I avoid that warning? Everything works at runtime, but I’d rather not go and comment each import line if there’s a better way 🙂
type_stubs_module_mapping={
"boto3-stubs": ["boto3"],
},
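If the stub mapping itself works and only the log noise is the problem, newer Pants versions have a knob for unowned imports. A hedged sketch; note this quiets all unowned-import warnings repo-wide, not just this one:

```toml
# pants.toml -- downgrade "cannot infer owners" messages
[python-infer]
unowned_dependency_behavior = "ignore"  # other accepted values: "warning", "error"
```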
cold-branch-54016
01/25/2023, 12:08 PM
docker_environment(
    name="dev_docker",
    platform="linux_x86_64",
    image="test",  # python:3.10.4-slim-bullseye docker image
    python_bootstrap_search_path=["<PATH>"],
)
On the test target I have set the environment:
python_tests(
    name="tests",
    environment="dev_docker",
)
When I now try to run that test I get the following FileNotFoundError:
./pants test ::
13:06:26.16 [ERROR] 1 Exception encountered:
Engine traceback:
in `test` goal
ProcessExecutionFailure: Process 'Find interpreter for constraints: CPython>=3.10.*' failed with exit code 1.
stdout:
stderr:
Traceback (most recent call last):
File "/usr/local/lib/python3.10/runpy.py", line 189, in _run_module_as_main
mod_name, mod_spec, code = _get_main_module_details(_Error)
File "/usr/local/lib/python3.10/runpy.py", line 223, in _get_main_module_details
return _get_module_details(main_name)
File "/usr/local/lib/python3.10/runpy.py", line 129, in _get_module_details
spec = importlib.util.find_spec(mod_name)
File "/usr/local/lib/python3.10/importlib/util.py", line 103, in find_spec
return _find_spec(fullname, parent_path)
File "<frozen importlib._bootstrap>", line 945, in _find_spec
File "<frozen importlib._bootstrap_external>", line 1439, in find_spec
File "<frozen importlib._bootstrap_external>", line 1411, in _get_spec
File "<frozen zipimport>", line 170, in find_spec
File "<frozen importlib._bootstrap>", line 431, in spec_from_loader
File "<frozen importlib._bootstrap_external>", line 741, in spec_from_file_location
File "<frozen zipimport>", line 229, in get_filename
File "<frozen zipimport>", line 752, in _get_module_code
File "<frozen zipimport>", line 586, in _get_data
FileNotFoundError: [Errno 2] No such file or directory: '/pants-sandbox/pants-sandbox-kEtOGg/./pex'
Does anyone know what the problem might be?
polite-garden-50641
01/25/2023, 3:54 PM
flat-zoo-31952
01/25/2023, 8:31 PM
ERROR: Could not find a version that satisfies the requirement pantsbuild.pants.testutil<2.15,>=2.14.0a0
ERROR: No matching distribution found for pantsbuild.pants.testutil<2.15,>=2.14.0a0
Seems like there are definitely distributions that match that.
curved-manchester-66006
01/25/2023, 9:42 PM
loud-laptop-17949
01/26/2023, 12:11 AM
nutritious-hair-72580
01/26/2023, 12:08 PM
Regarding pants.ci.toml, the docs say: "Then, in your CI script or config, set the environment variable PANTS_CONFIG_FILES=pants.ci.toml to use this new config file, in addition to pants.toml." In my testing, I seem to need to specify PANTS_CONFIG_FILES="['pants.toml', 'pants.ci.toml']" for the options to get merged.
famous-architect-76219
01/26/2023, 1:05 PM
pydantic library.
I'm building a PEX locally with the following snippet in the BUILD file, and it runs fine on my laptop:
pex_binary(
    name="pydantic-cli",
    entry_point="main.py",
    dependencies=[
        "projects/py/project_2:poetry#pydantic",
    ],
)
Now I'm trying to pack this PEX into a Docker image, so I've added the following snippet to the BUILD file:
docker_image(
    name="pydantic-cli-docker",
    dependencies=[
        ":pydantic-cli",
    ],
)
I'm able to build the Docker image, but when I run the container it fails:
Failed to find compatible interpreter on path /usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin.
Examined the following interpreters:
1.) /usr/local/bin/python3.9 CPython==3.9.16
No interpreter compatible with the requested constraints was found:
A distribution for pydantic could not be resolved for /usr/local/bin/python3.9.
Found 2 distributions for pydantic that do not apply:
1.) The wheel tags for pydantic 1.10.4 are cp39-cp39-macosx_10_9_x86_64 which do not match the supported tags of /usr/local/bin/python3.9:
cp39-cp39-manylinux_2_28_aarch64
... 304 more ...
2.) The wheel tags for pydantic 1.10.4 are cp39-cp39-macosx_11_0_arm64 which do not match the supported tags of /usr/local/bin/python3.9:
cp39-cp39-manylinux_2_28_aarch64
... 304 more ...
Now pydantic comes with pre-compiled binaries in the wheel, but for some reason it finds them incompatible.
Note: my laptop is an M1 MacBook.
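Reading the tags in the error: the PEX was built with macOS wheels (resolved on the M1 host), while the container's CPython 3.9 only accepts manylinux aarch64 wheels. One fix is telling pex which foreign platform to resolve for. A hedged sketch; the platform string is inferred from the error output, and complete_platforms (with a platform JSON generated inside the image) is the more robust variant:

```python
# BUILD -- sketch: resolve linux/aarch64 wheels instead of host (macOS) wheels
pex_binary(
    name="pydantic-cli",
    entry_point="main.py",
    dependencies=[
        "projects/py/project_2:poetry#pydantic",
    ],
    # Abbreviated pex platform: <platform>-<impl>-<pyver>-<abi>,
    # matching the cp39 manylinux_2_28_aarch64 tags in the error above.
    platforms=["manylinux_2_28_aarch64-cp-39-cp39"],
)
```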
I would like to be able to debug such scenarios on my own, but I'm not sure where to start here; please advise.
some-book-89972
01/26/2023, 4:28 PM
loud-laptop-17949
01/26/2023, 7:20 PM
pants export --resolve=pylint and getting a venv; is there a way to get a pex instead?
stocky-helmet-22655
01/26/2023, 7:26 PM
--process-execution-local-parallelism=1. That causes building requirements.pex, pytest.pex, etc. to be done serially, and I would like to parallelize this first part, then run the tests serially.
fancy-daybreak-62348
01/26/2023, 8:32 PM
thousands-france-27863
01/26/2023, 10:15 PM
poetry version xxx (patch, minor, major). So each published library has its own cadence depending on the changes.
I don’t see any option in Pants to allow versioning (especially individually), but I might be missing something. Is this, or another similar option, supported?
As an alternative, I was thinking it might just be OK to add a macro that runs just before the package stage (but only if the system has detected changes).
Having a single version and increasing the version number for all packages is also an option, but that means we deploy all libraries all the time (even if there aren’t changes), which isn’t ideal.
Other options/ideas welcome!
Thanks in advance!
miniature-table-61213
01/27/2023, 8:24 AM
resolve for a bunch of python_sources calls. I have a repo with a couple of different projects which have conflicting numpy version dependencies, among other things. Based on this guide I was able to set up different resolves and set those resolves for the python_requirements calls that I have for these projects. However, when trying to set up the resolves for my python_sources, I am having to add resolve="something" to a bunch of different BUILD files since I have a bunch of subdirectories within the project. I was wondering if there is an easy way to ask pants to consider the same resolve for all BUILD files within a subdirectory, so I can just keep python_sources() in my BUILD files for these subdirectories. Hopefully this makes sense; definitely let me know if I am misunderstanding anything here. Thanks!
curved-microphone-39455
01/27/2023, 5:29 PM
.pantd.d/pants.log -> .pants.d/pants.log on the latest version of https://www.pantsbuild.org/v2.15/docs/using-pants-in-ci#tip-store-pants-logs-as-artifacts
plain-night-51324
01/27/2023, 7:27 PM
Building 47 requirements for requirements.pex from the thirdparty/python/default.lock resolve: alembic==1.4.3 ... is taking forever. Any docs/tips on CI integrations? All I am running is ./pants test ::
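One common CI speedup is persisting Pants' local caches between runs, so requirements.pex and friends are rebuilt only when the lockfile changes. A hedged sketch for GitHub Actions; the lockfile path comes from the log line above, and the cache paths are the Pants defaults:

```yaml
# CI workflow fragment (GitHub Actions assumed)
- uses: actions/cache@v3
  with:
    path: |
      ~/.cache/pants/setup
      ~/.cache/pants/named_caches
      ~/.cache/pants/lmdb_store
    key: pants-${{ hashFiles('thirdparty/python/default.lock') }}
```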
curved-manchester-66006
01/27/2023, 7:44 PM
dry-analyst-73584
01/27/2023, 8:30 PM
fresh-continent-76371
01/27/2023, 9:39 PM
loud-laptop-17949
01/27/2023, 10:37 PM
export goal that I can't seem to figure out. Earlier today I deleted my local Pants caches, and now when I run ./pants export --resolve=default the symlink in the dist directory links to a non-existent directory in ~/.caches.
Running ./pants export :: to export all resolves does not seem to have this problem.
alert-psychiatrist-14102
01/28/2023, 7:02 AM
crooked-country-1937
01/28/2023, 3:40 PM
#3 [internal] load metadata for docker.io/library/python:3.8
#3 sha256:edad251955f644c6004999f0af04035912392fa02db26821676452becbc715fb
[2023-01-28T15:31:23.094439000Z][docker-credential-desktop][F] get system info: exit status 127
[goroutine 1 [running, locked to thread]:
[common/pkg/system.init.0()
[ common/pkg/system/os_info.go:32 +0x29d
#3 ERROR: rpc error: code = Unknown desc = error getting credentials - err: exit status 1, out: ``
------
> [internal] load metadata for docker.io/library/python:3.8:
------
failed to solve with frontend dockerfile.v0: failed to create LLB definition: rpc error: code = Unknown desc = error getting credentials - err: exit status 1, out: ``
proud-dentist-22844
01/28/2023, 5:31 PM
target() with dependencies on the subdirs. That should ensure the glob is satisfied for the tests.
Here are the directory and the target() I’m adding for it:
https://github.com/StackStorm/st2/tree/master/st2tests/st2tests/fixtures/packs
https://github.com/StackStorm/st2/pull/5874/files#diff-0a765a0e1409bcce280bee085bd5903e219b4c63a442696f8d23216a525d882a
But I am afraid (and I’ve got feedback from others who are afraid) of not keeping the target() dependencies up to date as people add new directories. Can I use globs in dependencies to avoid that? And if yes, can I use ! to ignore a couple of directories so they are not direct dependencies (but transitive is fine)?