loud-spring-35539
04/13/2023, 3:18 AM

bored-energy-25252
04/13/2023, 4:58 AM

quiet-painter-18838
04/13/2023, 10:51 AM

powerful-eye-58407
04/13/2023, 4:30 PM
$ pants update-build-files --check ::
18:29:33.87 [ERROR] 1 Exception encountered:
Engine traceback:
in Update all BUILD files
InvalidLockfileError: You are using the `python-default` lockfile at lockfiles/default.lock with incompatible inputs.
- The inputs use interpreter constraints (`CPython<4,>=3.7`) that are not a subset of those used to generate the lockfile (`CPython==3.8.*`).
- The input interpreter constraints are specified by your code, using the `[python].interpreter_constraints` option and the `interpreter_constraints` target field.
- To create a lockfile with new interpreter constraints, update the option `[python].resolves_to_interpreter_constraints`, and then generate the lockfile (see below).
See https://www.pantsbuild.org/v2.16/docs/python-interpreter-compatibility for details.
To regenerate your lockfile, run `pants generate-lockfiles --resolve=python-default`.
See https://www.pantsbuild.org/v2.16/docs/python-third-party-dependencies for details.
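A sketch of the two ways out of this error, assuming the default `python-default` resolve: either narrow the code's constraints to match the existing lockfile, or keep the wider constraints and regenerate the lockfile as the error suggests.

```toml
# pants.toml - hedged sketch; pick ONE direction.
# (a) Narrow the code to match the existing CPython==3.8.* lockfile:
[python]
interpreter_constraints = ["CPython==3.8.*"]

# (b) Or keep the wider constraints and regenerate the lockfile instead:
#     pants generate-lockfiles --resolve=python-default
```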
loud-spring-35539
04/13/2023, 5:38 PM
pex_binary target is not adding any of my actual code (other than `__init__.py`) to the pex file. Any tips on debugging this?

abundant-autumn-67998
04/13/2023, 6:00 PM
pex3 lock create . fails in a folder with only a pyproject.toml when it includes editable dependencies, e.g.:
[tool.poetry.dependencies]
python = "^3.9"
requests = "^2.28.2"
project2 = {path = "../project2", develop = true}
Without the project2 line the lock works. However, with that line it fails with FileNotFoundError: [Errno 2] No such file or directory: '/Users/shalabh/.pex/pip_cache/.tmp/project2'. Is there an easy way to exclude these path-based dependencies?
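One workaround, sketched below on the assumption that you first export with poetry (e.g. `poetry export -f requirements.txt`): strip the path-based entries out of the exported requirements before handing them to `pex3 lock create` or `pex`. The helper name and the `project2` entry are just this thread's example, not a real API.

```python
# Hedged sketch: drop local-path requirements from a poetry-exported
# requirements list so only PyPI-resolvable entries remain.
def strip_path_requirements(lines):
    """Keep only requirement lines that are not local/editable path deps."""
    kept = []
    for line in lines:
        req = line.strip()
        if not req or req.startswith("#"):
            continue  # skip blanks and comments
        # poetry exports path deps as "name @ file:///..." or "-e <path>"
        if " @ file://" in req or req.startswith("-e "):
            continue
        kept.append(req)
    return kept

reqs = [
    "requests==2.28.2",
    "project2 @ file:///Users/shalabh/project2",
    "-e ../project2",
]
print(strip_path_requirements(reqs))
# -> ['requests==2.28.2']
```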
Alternatively, what's the best way to create a dependencies pex from such a project? I'm thinking we could export this into a requirements.txt and then feed that to pex - that might be a better option?

gray-shoe-19951
04/14/2023, 12:53 AM
http_proxy, https_proxy, and no_proxy env vars. However, it seems pants is not able to download the binary from our internal hosting, and the reason appears to be that it does not use no_proxy. In other words, with http_proxy and https_proxy set, it won't be able to download the binary from internal hosting, since internal hosting is not publicly accessible. Extra info: if I do not override the url_template, pants can download the external binary with http_proxy and https_proxy. Is that correct?

ripe-gigabyte-88964
04/14/2023, 2:03 AM

ripe-gigabyte-88964
04/14/2023, 2:27 PM
IntrinsicError: Side-effects are not allowed in this context: SideEffecting types must be acquired via parameters to `@rule`s.
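For context, this error usually means a side-effecting type (Console, Workspace, InteractiveRunner) was obtained indirectly instead of being declared as a rule parameter. A minimal sketch of the sanctioned shape, assuming the Pants 2.x plugin API (goal and target names here are illustrative; this requires the pants engine and is not runnable standalone):

```python
# Hedged sketch: side-effecting types like Workspace must be declared as
# @goal_rule parameters so the engine injects them.
from pants.engine.fs import CreateDigest, Digest, Directory, Workspace
from pants.engine.goal import Goal, GoalSubsystem
from pants.engine.rules import Get, collect_rules, goal_rule

class ExampleSubsystem(GoalSubsystem):
    name = "example"  # illustrative goal name
    help = "Example goal."

class Example(Goal):
    subsystem_cls = ExampleSubsystem
    environment_behavior = Goal.EnvironmentBehavior.LOCAL_ONLY  # newer Pants versions

@goal_rule
async def example(workspace: Workspace) -> Example:  # injected, not fetched via Get
    digest = await Get(Digest, CreateDigest([Directory("out")]))
    workspace.write_digest(digest)  # side effect is now permitted
    return Example(exit_code=0)

def rules():
    return collect_rules()
```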
Has something changed since the guide here was written? Alternatively, is this another case of pants obfuscating the real error message?

alert-dawn-51425
04/14/2023, 6:17 PM
SetupPyError: Expected a single interpreter constraint for python/org/domain/library:library, got: CPython==3.10.* OR CPython==3.10.*,==3.9.*.
I am not sure where the interpreter constraints are generated from. For the python_distribution, we have
kwargs["interpreter_constraints"] = ["==3.10.*"]
defined in a macro. However, in pants.toml we have it declared as
interpreter_constraints = ["==3.10.*", "==3.9.*"]
so that the plugin code can work. I thought that adding the interpreter constraint on all the pex_binary, python_sources, python_tests, python_test_utils, and python_distribution targets would ensure that this would only use Python 3.10.
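One thing worth checking, sketched here with an illustrative target name: the "OR" pair in the SetupPyError suggests some target in the distribution's transitive closure still carries the global default ["==3.10.*", "==3.9.*"], so every python_sources (and anything else the distribution reaches) needs the field, not just the targets your macro touches.

```python
# BUILD - hedged sketch; ":lib" is illustrative. Any dependency left on
# the pants.toml default re-introduces 3.9 into the closure.
python_sources(
    name="lib",
    interpreter_constraints=["==3.10.*"],
)
```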
Why is it still bringing in 3.9?

loud-spring-35539
04/14/2023, 6:56 PM
import numpy as np
ModuleNotFoundError: No module named 'numpy'
----------------------------------------
ERROR: Failed building wheel for nmslib
ERROR: Failed to build one or more wheels
Confirmed the deps exist in the requirements file and what not. Tips for debugging?

busy-vase-39202
04/14/2023, 9:18 PM

high-yak-85899
04/14/2023, 11:21 PM
23:06:17.95 [INFO] Long running tasks:
  3567.56s Scheduling: Determine Python dependencies for astranis-python/astranis/gnc/odet/gmat_common.py:sources
  3567.68s Scheduling: Determine Python dependencies for astranis-python/astranis/gnc/gmat/queried_data_files.py
  3568.93s Scheduling: Parse Dockerfile.
Basically it ran for an hour and timed out the build. I wasn't able to reproduce locally, but this did fail when we attempted a rebuild. We're just running a ./pants package :: command with some tag filtering. Any ideas of where to go chasing for what could be holding up this scheduling? The logs show it makes it through building one of our artifacts and then hangs on this for the remainder of the time.

silly-queen-7197
04/14/2023, 11:48 PM
ray.init(runtime_env=...)
I have an "entrypoint" item-rank/src/item_rank/publish.py, so I can do something like ./pants dependencies --transitive item-rank/src/item_rank/publish.py. If I keep all of my requirements in a single requirements.txt file, I can parse the output and get something like
"db-dtypes~=1.0.4",
"gcsfs~=2022.10.0",
"google-cloud-bigquery-storage==2.16.0",
from
//:reqs#db-dtypes
//:reqs#gcsfs
//:reqs#google-cloud-bigquery
with a little bit of bash golf.
How might I translate
archipelago/src/archipelago/foo.py
capstan/src/capstan/bar.py
into env vars like PYTHONPATH=$PYTHONPATH:archipelago/src:capstan/src, i.e. the relevant source roots (I suppose this is just "./pants roots")? I've done this manually and it seems to work (I'm running into permission issues, so there are still errors that will take until Monday to resolve, but it seems like this is a viable route).
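For the source-root step, a small sketch of deriving the PYTHONPATH entries from the output of `./pants roots`. The paths come from this thread, the helper name is made up, and POSIX ":" separators are assumed.

```python
# Hedged sketch: select the source roots (as printed by `./pants roots`)
# that contain the given files, preserving order, then join them
# PYTHONPATH-style.
def pythonpath_for(files, source_roots):
    needed = []
    for f in files:
        for root in source_roots:
            if f.startswith(root + "/") and root not in needed:
                needed.append(root)
    return ":".join(needed)  # POSIX separator assumed

roots = ["archipelago/src", "capstan/src", "other/src"]
files = ["archipelago/src/archipelago/foo.py", "capstan/src/capstan/bar.py"]
print(pythonpath_for(files, roots))
# -> archipelago/src:capstan/src
```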
Edit - I never really asked a question; this comment is really more that I'm curious how other folks have approached integrating with Ray. I feel like I'm traveling down the wrong path here, yet I should be able to query pants about everything I need to construct an environment for my Ray workers. A better solution would be if Ray understood pex, but that doesn't appear to be a supported feature yet.

high-yak-85899
04/15/2023, 12:05 AM
"True" and "False"?

happy-kitchen-89482
04/15/2023, 12:08 AM

bitter-ability-32190
04/17/2023, 4:51 PM
docker_environment could be a docker_image target (or at least use a series of docker instructions), but it looks like I'm mistaken?

high-yak-85899
04/17/2023, 5:15 PM
./pants test :: wouldn't include certain tests unless you specifically called out their tags. I think in Python it's likely recommended to use pytest markers, but that can cause issues, since ./pants test :: with a pytest marker added would include a lot of modules that don't end up running any tests and then report a false failure.

polite-art-32636
04/17/2023, 5:51 PM
Using pants tailor :: to bootstrap my repo, I end up with a plethora of BUILD files in every subfolder of all the apps we have (our devs love folders for module separation). Is this really best practice? How do I convince my devs that all these files are necessary and the best practice, especially when they will never be customized or touched after they're created? Can I create one or two per service that would contain all the information necessary for building pexes, environments, docker images, etc.?

gifted-autumn-61196
04/17/2023, 6:13 PM
replace found in the go.mod? We have a separate shared module that several systems depend on, and currently we use replace to point at the local version instead of the GitHub module, so tests are run with any changes done to the shared module.

ripe-gigabyte-88964
04/17/2023, 7:58 PM
Filesystem changed during run: retrying, when trying to call workspace.write_digest from a goal rule. Looking through .pants.d/pants.log, but nothing is popping out at me as the issue. The digest I am trying to write is just an empty directory:
dagster_home_path = dist_dir.relpath / "dagster_home"
dagster_home_digest = await Get(Digest, CreateDigest([Directory(str(dagster_home_path))]))
logger.debug(f"Creating dagster home directory at {dagster_home_path}")
workspace.write_digest(dagster_home_digest)
average-father-89924
04/18/2023, 9:24 AM
ModuleNotFoundError: No module named 'azure.cosmos'
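One thing that often resolves this class of error, sketched here as an assumption rather than a confirmed fix: when the import path ("azure.cosmos") differs from the distribution name ("azure-cosmos"), spelling out the `modules` field helps pants' dependency inference map the import to the requirement.

```python
# BUILD - hedged sketch: tell dependency inference which module the
# azure-cosmos distribution provides.
python_requirement(
    name="azure-cosmos",
    requirements=["azure-cosmos"],
    modules=["azure.cosmos"],
)
```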
I added the package, generated the locks, and exported (azure-cosmos exists in pyproject.toml, the lock files, and also in dist/.../python/../bin), so I am pretty sure it is there. I searched here on Slack and read the documentation saying that I need to add python_requirement(name="azure-cosmos", requirements=["azure-cosmos"]) in my BUILD file, which I did, but to no avail. The azure-cosmos package is used by a submodule that I am importing. I added the python_requirement line to both BUILD files, in the submodule as well as in my main module. I added the stack trace in the thread. Any help is appreciated!

famous-xylophone-36532
04/18/2023, 1:38 PM

curved-manchester-66006
04/18/2023, 6:31 PM

wonderful-boots-93625
04/18/2023, 6:31 PM
Can package output parseable results of what was built? Because of --changed-since, only a subset of targets will be built; they'll have new names because of versioning, and downstream tooling needs to publish or run deployments on those built artifacts.

swift-river-73520
04/18/2023, 9:01 PM
docker_image targets:
Failed to fire hook: while creating logrus local file hook: user: Current requires cgo or $USER, $HOME set in environment
[2023-04-18T20:58:17.637886000Z][docker-credential-desktop][F] get system info: exec: "sw_vers": executable file not found in $PATH
goroutine 1 [running, locked to thread]:
common/pkg/system.init.0()
        common/pkg/system/os_info.go:32 +0x1bc
#3 ERROR: rpc error: code = Unknown desc = error getting credentials - err: exit status 1, out: ``
#5 [auth] sharing credentials for 123456789.dkr.ecr.us-east-1.amazonaws.com
#5 sha256:00484c1754ca42cdfd60e0asdf234asdf8a41a33b1f30d70412c4254
#5 ERROR: error getting credentials - err: exit status 1, out: ``
------
 > [internal] load metadata for 123456789.dkr.ecr.us-east-1.amazonaws.com/databricks-base:0.0.2:
------
------
 > [auth] sharing credentials for 123456789.dkr.ecr.us-east-1.amazonaws.com:
------
failed to solve with frontend dockerfile.v0: failed to create LLB definition: rpc error: code = Unknown desc = error getting credentials - err: exit status 1, out: ``
I've run into this error before and found that adding
[docker]
env_vars = [
    "HOME",
    "USER",
    "PATH",
]
fixed the error, but now the error has reappeared despite the fact that I still have that [docker] env_vars section configured in pants.toml. Any ideas on how I can squash this?

gray-shoe-19951
04/18/2023, 10:01 PM

broad-processor-92400
04/18/2023, 10:40 PM

high-yak-85899
04/18/2023, 11:13 PM
pip install, or does it do manual downloads with the file URLs listed in the lockfile?

refined-hydrogen-47967
04/19/2023, 3:40 AM
jvm_artifact(
    name="hadoop-common_test",
    group="org.apache.hadoop",
    artifact="hadoop-common",
    version="2.7.3:test",
    resolve="jvm-default",
)
Where the classifier "test" is appended to the version string. Is this the recommended method for specifying a classifier?
In Pants v1 there's a classifier field for jars in a jar_library, so I'm also wondering if there was a reason for that not being supported in jvm_artifact.