high-yak-85899
12/13/2022, 11:40 PM
protobuf<3.18. There have been some important security vulnerabilities fixed since then, so we have to restrict to protobuf>3.19 in our repo. The result is that there is no way for me to add the package to my project. The risk of not being able to resolve dependencies is that the project may not work, but I'm fairly confident it will, based on some tickets I've seen. So I'm wondering: is it possible to ignore this dependency mismatch while I wait for an upstream fix? The only alternative I can think of is to fork the dependency and bump it myself, but that seems like a large undertaking.
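A possible workaround sketch, assuming the mismatch really can't be ignored (pip/pex resolution is strict): give the package that still needs protobuf<3.18 its own resolve so it never has to co-resolve with the repo-wide pin. The target and resolve names below are hypothetical, and the extra resolve would also have to be declared under [python].resolves in pants.toml.
# Hypothetical 3rdparty/python/BUILD sketch, not the asker's actual layout
python_requirement(
    name="legacy-proto-consumer",                           # made-up target name
    requirements=["the-package-that-needs-old-protobuf"],   # placeholder requirement string
    resolve="legacy",                                        # made-up resolve name, declared in pants.toml
)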
gray-shoe-19951
12/14/2022, 12:32 AM
@testbook(PATH_NOTEBOOKS / "prep.ipynb", execute=True)
def test_prep_notebook(tb: TestbookNotebookClient):
    print_notebook_output(tb)
The error is like:
/.cache/pants/named_caches/pex_root/venvs/s/86f08450/venv/lib/python3.9/site-packages/jupyter_client/client.py:202: in _async_wait_for_ready
raise RuntimeError("Kernel died before replying to kernel_info")
E RuntimeError: Kernel died before replying to kernel_info
How can I solve this issue? Thank you!
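One common cause of "Kernel died before replying to kernel_info" under Pants is that the Jupyter kernel (ipykernel) and the notebook's own imports are not inferred as dependencies of the test, so they are missing from the sandboxed venv. A minimal BUILD sketch, assuming a 3rdparty requirements target exists; the addresses here are hypothetical:
python_tests(
    name="tests",
    dependencies=[
        "3rdparty/python:ipykernel",  # hypothetical address of the ipykernel requirement
        # plus whatever the notebook itself imports, since Pants cannot infer
        # dependencies from inside .ipynb files
    ],
)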
gray-shoe-19951
12/14/2022, 1:50 AM
cold-vr-15232
12/14/2022, 9:49 AM
aloof-appointment-30987
12/14/2022, 6:10 PM
./nectl/neuroedge_cli/BUILD
python_sources(name = "neuroedge_cli")
pex_binary(
    name = "nectl",
    dependencies = [":neuroedge_cli"],
    entry_point = "./nectl.py",
)
If I run
./pants run nectl/neuroedge_cli/nectl.py
Everything is fine. If I instead run
./pants run nectl/neuroedge_cli:nectl
It fails with:
12:58:13.40 [ERROR] 1 Exception encountered:
ProcessExecutionFailure: Process 'Building nectl.neuroedge_cli/nectl.pex with 10 requirements: click==8.1.3, dvg-ringbuffer==1.0.3, keyring==23.5.1, numba==0.56.0, numpy==1.22.4, pandas~=1.4, requests==2.25.1, rich==12.4.1, scipy==1.8.1, tables==3.7.0' failed with exit code 1.
stdout:
stderr:
pid 28706 -> /Users/russellzarse/.cache/pants/named_caches/pex_root/venvs/cac1718c056bb509f51fcdcc0c376b33deaaa8ec/e6831f8bef1e0125d178dbca2c603370d4eeae8a/bin/python -sE /Users/russellzarse/.cache/pants/named_caches/pex_root/venvs/cac1718c056bb509f51fcdcc0c376b33deaaa8ec/e6831f8bef1e0125d178dbca2c603370d4eeae8a/pex --disable-pip-version-check --no-python-version-warning --exists-action a --no-input --isolated -q --cache-dir /Users/russellzarse/.cache/pants/named_caches/pex_root/pip_cache --log /private/var/folders/_d/zt8y2x055597l63bhdztp82c0000gn/T/pants-sandbox-8W6RvT/.tmp/pex-pip-log.ooax9v3d/pip.log download --dest /Users/russellzarse/.cache/pants/named_caches/pex_root/downloads/resolver_download.hetwm61n/usr.local.Cellar.python@3.11.3.11.0.Frameworks.Python.framework.Versions.3.11.bin.python3.11 click==8.1.3 dvg-ringbuffer==1.0.3 keyring==23.5.1 numba==0.56.0 numpy==1.22.4 pandas~=1.4 requests==2.25.1 rich==12.4.1 scipy==1.8.1 tables==3.7.0 --index-url <https://pypi.org/simple/> --retries 5 --timeout 15 exited with 1 and STDERR:
WARNING: Discarding <https://files.pythonhosted.org/packages/b3/23/fd8e7aa70f6c0b41c99de6aae7afc6850ebac2477687e68c6529bfaa41ba/numba-0.56.0.tar.gz#sha256=87a647dd4b8fce389869ff71f117732de9a519fe07663d9a02d75724eb8e244d> (from <https://pypi.org/simple/numba/>) (requires-python:>=3.7). Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
ERROR: Could not find a version that satisfies the requirement numba==0.56.0
ERROR: No matching distribution found for numba==0.56.0
If I add interpreter_constraints to the pex_binary target:
pex_binary(
    name = "nectl",
    interpreter_constraints = ['CPython==3.8.*'],
    dependencies = [":neuroedge_cli"],
    entry_point = "./nectl.py",
)
The run succeeds. However...
pex_binary(
    name = "nectl",
    interpreter_constraints = ['CPython>=3.8,<4'],
    dependencies = [":neuroedge_cli"],
    entry_point = "./nectl.py",
)
fails in the same way. Why does the pex_binary require explicit constraints? Shouldn't the CPython>=3.8,<4 configuration work?
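The pip error above hints at the likely cause: numba 0.56.0 only supports CPython up to 3.10, and the failing run resolved against the local Python 3.11 (visible in the download path), so pip fell back to the sdist, whose setup.py rejects 3.11. With CPython==3.8.* an older interpreter is selected; with CPython>=3.8,<4 the newest available one (3.11) wins. A sketch of a cap that should avoid this, assuming nothing else needs 3.11:
pex_binary(
    name = "nectl",
    interpreter_constraints = ["CPython>=3.8,<3.11"],  # exclude 3.11, which numba 0.56 doesn't support
    dependencies = [":neuroedge_cli"],
    entry_point = "./nectl.py",
)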
curved-microphone-39455
12/14/2022, 7:16 PM
project/BUILD looks like this:
python_sources(
    name="project",
    dependencies = [
        "src/libraries/lib1:poetry",
        "src/libraries/lib2:poetry",
        "src/libraries/lib3:poetry",
    ]
)

poetry_requirements(
    name="poetry",
)
and each lib's BUILD looks like this:
poetry_requirements(
    name="poetry",
    module_mapping={
        [...]  # multiple module mappings
    },
    overrides={
        [...]  # multiple overrides
    },
)
It looks like it is getting the local dependencies, since it's throwing errors from inside the library:
src/libraries/lib1/lib1/v2/enums/base_format.py:5: in <module>
import yaml
E ModuleNotFoundError: No module named 'yaml'
But if I run the tests of lib1, everything is OK. The local dependencies are already declared as editable in the pyproject.toml.
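If the ModuleNotFoundError only appears when lib1 is pulled in transitively, one thing worth checking is whether dependency inference can find an unambiguous owner for import yaml (for example, if several poetry_requirements targets all provide PyYAML, inference may decline to pick one). A hedged sketch of an explicit workaround; the generated-target address below is hypothetical and depends on how PyYAML is named in the lib's pyproject.toml:
python_sources(
    name="project",
    dependencies = [
        "src/libraries/lib1:poetry",
        "src/libraries/lib2:poetry",
        "src/libraries/lib3:poetry",
        "src/libraries/lib1:poetry#pyyaml",  # hypothetical: pin the owner of `import yaml` explicitly
    ]
)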
abundant-autumn-67998
12/14/2022, 7:56 PM
Is there a way to create a venv from a pex that wasn't built with --include-tools or --venv? I found a roundabout way to do this; is there something simpler?
pex --include-tools -o just-tools.pex
PEX_TOOLS=1 PEX_PATH=the-pex-without-tools.pex ./just-tools.pex venv my-venv
While we can build our users' newer pexes with --include-tools, we also need to support their older pexes, which were not built with tools.
proud-dentist-22844
12/14/2022, 10:12 PM
I have my [pylint].source_plugins in a separate resolve from the rest of the code. That has been working very nicely.
When I run pylint, the venv includes pylint, the source_plugins, and the code that pylint is inspecting. So far so good.
But, now I want to test code in my source_plugins directory, and đź’Ą.
For my “fixtures” I have the test write a dummy Python file from a string to a temporary directory. That dummy file imports a module that is in a different resolve. If I were running pylint, this module would be part of the code that pylint inspects, which works just fine. For testing, how do I make that module available to import during the test?
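One hedged option, since a target can normally only be depended on from within its own resolve: parametrize the imported module's sources across both resolves, so the test's resolve can also provide it. The resolve names below are made up.
python_sources(
    name="lib",
    resolve=parametrize("python-default", "pylint-plugins"),  # hypothetical resolve names
)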
ambitious-actor-36781
12/14/2022, 10:40 PM
cold-vr-15232
12/15/2022, 10:34 AM
brash-student-40401
12/15/2022, 9:47 PM
shutil.copytree(src/services/app/static, etc). This works fine when I run the pex target, but once I build a Docker image from it I get FileNotFoundError: [Errno 2] No such file or directory: 'src/services/app/static'. I tried adding the directory as both a files and a resources target and having my docker target depend on it, but that didn't help. What's the magic I'm looking for here? Am I not referencing it right in the Docker image, or are those files not going to be there at all without some extra work? For reference, my Dockerfile itself is simply:
FROM python:3.8
ENTRYPOINT ["/bin/app.pex"]
COPY src.services.app.scripts/app.pex /bin
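For what it's worth, loose files declared as files() are not packaged into a pex, so a path like src/services/app/static won't exist inside the image unless it is copied in separately. A sketch of one way to do that, with hypothetical target names: make the docker_image depend on the files so they land in the Docker build context, then COPY them in the Dockerfile (e.g. COPY src/services/app/static /app/static).
# Hypothetical src/services/app/BUILD additions
files(
    name="static",
    sources=["static/**/*"],
)

docker_image(
    name="app-image",                      # made-up target name
    dependencies=[
        "src/services/app/scripts:app",    # the pex_binary (hypothetical address)
        ":static",                         # puts the static files into the build context
    ],
)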
cool-yacht-37128
12/15/2022, 11:52 PM
Is there a way to use a vcs_version target as a tag for the `docker_image` target? Do I need to write a small plugin?
curved-farmer-66180
12/16/2022, 1:01 AM
Run mkdir .tmp
mkdir .tmp
./pants --no-verify-config version
shell: /usr/bin/bash -e {0}
env:
pythonLocation: /opt/hostedtoolcache/Python/3.9.16/x64
LD_LIBRARY_PATH: /opt/hostedtoolcache/Python/3.9.16/x64/lib
./pants: line 400: /home/runner/.cache/pants/setup/bootstrap-Linux-x86_64/2.13.0_py39/bin/python: No such file or directory
Error: Process completed with exit code 127.
curved-farmer-66180
12/16/2022, 3:25 AM
12:21:12.39 [INFO] Completed: Building docker image docker:latest
12:21:12.40 [ERROR] 1 Exception encountered:
ProcessExecutionFailure: Process 'Building docker image docker:latest' failed with exit code 1.
stdout:
stderr:
#1 [internal] load build definition from Dockerfile.superman
#1 sha256:d00f80a10014e5eb26d752fef9acfd03ed0d23487d4e93688ad0a6753a8a42cf
#1 transferring dockerfile: 144B 0.0s done
#1 DONE 0.0s
#2 [internal] load .dockerignore
#2 sha256:c686b491462bcaba2ca2d35328c1948309e89ce47a5300a88f8b18edf0a79466
#2 transferring context: 2B done
#2 DONE 0.0s
#3 [internal] load metadata for docker.io/library/python:3.9
#3 sha256:5ceb849adf4ca2eed629b09d8add870d381c45f6f40f947be68f6595c97a8af8
Failed to fire hook: while creating logrus local file hook: user: Current requires cgo or $USER, $HOME set in environment
[2022-12-16T03:21:12.378018000Z][docker-credential-desktop][F] get system info: exec: "sw_vers": executable file not found in $PATH
[goroutine 1 [running, locked to thread]:
[common/pkg/system.init.0()
[ common/pkg/system/os_info.go:32 +0x29d
#3 ERROR: rpc error: code = Unknown desc = error getting credentials - err: exit status 1, out: ``
------
> [internal] load metadata for docker.io/library/python:3.9:
------
failed to solve with frontend dockerfile.v0: failed to create LLB definition: rpc error: code = Unknown desc = error getting credentials - err: exit status 1, out: ``
Use `--keep-sandboxes=on_failure` to preserve the process chroot for inspection.
modern-monkey-78364
12/16/2022, 7:38 AM
python_sources()

python_tests(
    name="tests",
)
And Pants used to figure out all the dependencies. But now, after the upgrade, all the test cases with the above BUILD files are failing since Pants is not able to figure out the dependencies. Is there a setting I am missing, or is this no longer supported?
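A stop-gap sketch while the inference regression is tracked down (the address below is made up): spell out the dependencies explicitly on the test target.
python_tests(
    name="tests",
    dependencies=[
        "src/myproject/mypackage",  # hypothetical address of the code under test
    ],
)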
billions-agency-3905
12/16/2022, 1:07 PM
ERROR: Could not find a version that satisfies the requirement pantsbuild.pants==2.14.0 (from versions: 0.0.17, 0.0.18, 0.0.20, 0.0.21, 0.0.22, 0.0.23, 0.0.24, 0.0.25, 0.0.26, 0.0.27, 0.0.28, 0.0.29, 0.0.30, 0.0.31, 0.0.32, 0.0.33, 0.0.34, 0.0.35, 0.0.36, 0.0.37, 0.0.38, 0.0.39, 0.0.40, 0.0.41, 0.0.42, 0.0.43, 0.0.44, 0.0.45, 0.0.46, 0.0.47, 0.0.48, 0.0.49, 0.0.50, 0.0.51, 0.0.52, 0.0.53, 0.0.54, 0.0.55, 0.0.56, 0.0.57, 0.0.58, 0.0.59, 0.0.60, 0.0.61, 0.0.62, 0.0.63, 0.0.64, 0.0.65, 0.0.66, 0.0.67, 0.0.68, 0.0.69, 0.0.70, 0.0.71, 0.0.72, 0.0.73, 0.0.74, 0.0.75, 0.0.76, 0.0.77, 0.0.79, 0.0.80, 0.0.81, 0.0.82, 1.0.0, 1.0.1, 1.1.0, 1.2.0, 1.2.1, 1.3.0, 1.4.0, 1.5.0, 1.6.0, 1.7.0)
ERROR: No matching distribution found for pantsbuild.pants==2.14.0
chilly-holiday-77415
12/16/2022, 1:11 PM
./pants check :: - specifically using mypy. It seems that Pants runs mypy per-file, which as far as I understand ignores any exclude arg passed to mypy. I don't want to typecheck my tests - is that something I can do?
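A hedged sketch of one way to do this: rather than relying on mypy's own exclude (which doesn't apply when files are passed explicitly), skip the check on the test targets themselves.
python_tests(
    name="tests",
    skip_mypy=True,  # assumes the skip_mypy field provided by the mypy backend
)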
big-xylophone-43403
12/16/2022, 3:49 PM
gorgeous-eve-12553
12/16/2022, 3:58 PM
sparse-lifeguard-95737
12/16/2022, 6:12 PM
After an export-codegen run today that generated a ton of small files, I started getting “pantsd OOMKilled” errors on every following run, even on other commands. Running with --no-pantsd now shows:
/Users/runner/.cargo/git/checkouts/lmdb-rs-369bfd26153a2575/6ae7a55/lmdb-sys/lmdb/libraries/liblmdb/:2126: Assertion 'rc == 0' failed in mdb_page_dirty()
Abort trap: 6
Has anyone hit this before & know what it means?
rich-kite-32423
12/16/2022, 9:27 PM
average-sugar-68948
12/16/2022, 9:35 PM
broad-processor-92400
12/17/2022, 1:04 AM
pantsd is restarting often, presumably because it's hitting the memory limit. We can just increase the memory limit, but is there a way to understand more about what's going on (e.g. in case we have some simple misconfiguration)? I tried ./pants --stats-memory-summary ..., but that only reports 10MB (summing all rows), very different to htop reporting 400MB-500MB resident.
proud-dentist-22844
12/17/2022, 3:16 AM
I have a pants-plugins resolve that matches Pants' interpreter constraints. When I started trying to run some tests, I ran into some conflicting constraints (InvalidLockfileError), so I tried making the default constraints a superset of all the resolves:
https://github.com/StackStorm/st2/pull/5847/commits/9f26453cb0ff86ff6007a8acc093673431916efc
But now I'm getting different constraints-conflict (InvalidLockfileError) issues:
https://github.com/StackStorm/st2/actions/runs/3718138874/jobs/6306123192#step:5:32
What am I missing? (I'm on Pants 2.14.0.)
glamorous-oxygen-87874
12/17/2022, 7:43 AM
crooked-country-1937
12/18/2022, 1:56 PM
Is there a way to run ./pants generate-lockfiles for a specific requirements.txt? I need some guidance on how to set up codebases requiring multiple Python versions. We have 2 big codebases: 1. py3.7, used in Airflow; another py3.10, used in dockerized apps. To support these, I went with 2 requirements.txt files: 3rdparty/airflow/requirements.txt and 3rdparty/python/requirements.txt. I added python_sources(resolve="airflow-default", interpreter_constraints=["CPython>=3.7.10, <3.8"],) to all BUILD files for the Airflow codebases. I also changed my pants.toml to this:
[python]
interpreter_constraints = ["CPython>=3.7.10, <3.8", "==3.10.*"]
My local Python is version 3.10.8. The issue is that ./pants generate-lockfiles, when resolving packages for Airflow, seems to be looking for packages which support 3.10.8, which causes resolution issues.
billions-agency-3905
12/19/2022, 12:25 PM
Is there a way to build images without the docker binary? I can't use dind since it needs privileged access. I usually use kaniko, but that does not have the same CLI as docker. I tried Podman, but it complains about nsenter when run from Pants.
adamant-magazine-16751
12/19/2022, 1:35 PM
curved-microphone-39455
12/19/2022, 2:53 PM
I'm looking at the resources and the files targets and wondering how I can define these dependencies correctly. I have already done it for a single file with resource() and added it to the python_sources dependencies, but now I have a folder where I want to include everything. Thanks in advance 🙂
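A minimal sketch of one way to capture a whole folder (target and path names are made up): use a recursive glob on a resources() target and depend on it.
resources(
    name="package-data",
    sources=["data/**/*"],  # hypothetical folder; pulls in everything under it
)

python_sources(
    name="lib",
    dependencies=[":package-data"],
)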
cold-branch-54016
12/19/2022, 3:28 PM
When running with the pytest-xdist plugin and 8 workers, each worker ran these fixtures once and the tests were evenly distributed across the workers. I am also open to other suggestions on how to potentially solve this issue.
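A hedged option, assuming the underlying issue is expensive session-scoped fixtures being re-run because each test file runs in its own pytest process under Pants: newer Pants versions (2.15+) can batch compatible test targets into one process via batch_compatibility_tag, so the fixtures run once per batch rather than once per file.
python_tests(
    name="tests",
    batch_compatibility_tag="shares-expensive-fixtures",  # made-up tag; targets with the same tag may be batched together
)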