aloof-appointment-30987
12/26/2022, 8:01 PM
# Works inside Docker
pex_binary(
    name="eeg_processor",
    entry_point="app.py",
    complete_platforms=[
        "lib_third_party/platforms:linux-py38",
    ],
)
# Does not work: A distribution for confluent-kafka could not be resolved
pex_binary(
    name="eeg_processor_mac",
    entry_point="app.py",
    complete_platforms=[
        "lib_third_party/platforms:macos_x86_64",
    ],
)
pex_binary(
    name="eeg_processor_default",
    entry_point="app.py",
)
docker_image(
    name="neuroedge_eeg_processor",
    image_tags=[
        "neuroedge_eeg_processor",
        # TODO: is this tag useful? "optios.jfrog.io/default-docker-local/neuroedge_eeg_processor:0.1.0"
    ],
)
The `macos_x86_64` platform file was produced using `pex3 interpreter inspect --markers --tags`; it does not produce a working PEX file.
The `linux-py38` platform file was produced using `docker run --rm -it python:3.8 sh -c 'python -mvenv venv && venv/bin/pip -q install pex && venv/bin/pex3 interpreter inspect --markers --tags -i2'` and works inside a simple Linux-based Docker container.
`linux-py38` works correctly when run from inside a Docker container:
./pants run realtime/eeg_processor:neuroedge_eeg_processor
Using `macos_x86_64`:
./pants run realtime/eeg_processor:eeg_processor_mac
fails with: A distribution for confluent-kafka could not be resolved...
./pants run realtime/eeg_processor:eeg_processor_default
works without issue.
Is there some way to produce a multi-platform PEX distributable that can be executed on both macOS and Linux?
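A hedged sketch (not from the thread) of one direction to try: `complete_platforms` accepts multiple platform files, so a single `pex_binary` can in principle target both platforms at once. This only resolves if every requirement, including `confluent-kafka` (which ships native code), publishes a compatible wheel for each listed platform; the target name below is hypothetical.

```python
# Hypothetical multi-platform target: one PEX covering both platforms.
# Resolution fails unless a prebuilt wheel exists for every requirement
# on every listed platform.
pex_binary(
    name="eeg_processor_multi",
    entry_point="app.py",
    complete_platforms=[
        "lib_third_party/platforms:linux-py38",
        "lib_third_party/platforms:macos_x86_64",
    ],
)
```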
Example project attached
hundreds-father-404
12/27/2022, 6:06 PM
busy-vase-39202
12/27/2022, 7:12 PM
rich-kite-32423
12/27/2022, 9:42 PM
pytest.ini
in the root directory contains:
[pytest]
minversion = 7.0.1
python_files = *.py
addopts = -rA
The `python_files = *.py` setting should make pytest look for tests in all files, according to https://docs.pytest.org/en/7.1.x/example/pythoncollection.html#customizing-test-collection, but it doesn't seem to; it only seems to look in `foo_test.py`:
bruce@groot:~/RethinkingObjects$ ./pants test ::
14:40:42.23 [ERROR] Completed: Run Pytest - test_experiments/foo_test.py:tests failed (exit code 5).
============================= test session starts ==============================
platform linux -- Python 3.10.6, pytest-7.0.1, pluggy-1.0.0
rootdir: /tmp/pants-sandbox-pDDAT5, configfile: pytest.ini
plugins: forked-1.4.0, cov-3.0.0, xdist-2.5.0
collected 0 items
- generated xml file: /tmp/pants-sandbox-pDDAT5/test_experiments.foo_test.py.tests.xml -
============================ no tests ran in 0.02s =============================
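One possible explanation (an assumption, not confirmed in the thread): pytest's exit code 5 means no tests were collected, and Pants invokes pytest once per `python_test` target, where the default `python_tests` sources glob only matches `test_*.py` and `*_test.py`. A `python_files` setting in pytest.ini cannot add files that Pants never generated targets for. A hedged BUILD sketch to widen the glob instead:

```python
# BUILD — hypothetical: create python_test targets for every .py file in
# this directory, instead of only the default test_*.py / *_test.py names.
python_tests(
    name="tests",
    sources=["*.py"],
)
```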
rich-kite-32423
12/27/2022, 9:44 PM
(Not sure that `addopts = -rA` is having any effect, either.)
glamorous-nail-59544
12/27/2022, 11:41 PMpackage
on all docker_image
targets in my repo?shy-advantage-49800
12/28/2022, 11:03 AM
shy-advantage-49800
12/28/2022, 2:10 PM
rapid-bird-79300
12/28/2022, 5:09 PM
We hit the error below when running ./pants peek :: --peek-output-file=/dev/null with the cache. When we regenerated the cache, this was fixed. Is there any reason this would happen, or do we have a way to prevent it (obviously remote caching would, but we're not there yet)?
[2022-12-28T14:18:45Z] Exception: Could not identify a process to backtrack to for: Missing digest: Couldn't find file contents for "app/something/__init__.py": Was not present in the local store: Digest { hash: Fingerprint<5c836f5086164c2048428ca39322f7d6ae372bedf67de8f0788c61f5ee365499>, size_bytes: 35 }
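A possible recovery sketch (an assumption, not something confirmed in the thread): when the local store has a missing or corrupt digest, wiping it forces Pants to repopulate it on the next run. This assumes the default cache location; adjust if `local_store_dir` is set in pants.toml.

```shell
# Remove the local content-addressed store so Pants rebuilds it.
# WARNING: discards all locally cached build results.
rm -rf ~/.cache/pants/lmdb_store
```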
shy-advantage-49800
12/29/2022, 4:20 PM
When a package is missing from the pyproject.toml dependencies, I don't see that package being added to the wheel metadata. Example: I have both packages `data-types` and `entities`, and `entities` depends on `data-types`; if I remove `data-types` from the pyproject.toml, the dependency is not inferred anyway.
I'm using the following macro:
def poetry_distribution(name, package, **kwargs):
    resources(name="package_data", sources=["pyproject.toml"])
    python_distribution(
        name="dist",
        dependencies=[":package_data", f"src/python/{name}/{package}"],
        provides=python_artifact(name=name),
        generate_setup=False,
    )
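For reference, a hypothetical invocation of the macro above in a package-level BUILD file (names taken from the example). One assumption worth checking: with `generate_setup=False`, Pants ships the provided pyproject.toml as-is, so the wheel's declared requirements come only from what pyproject.toml lists; inferred source dependencies are not written back into that metadata.

```python
# Hypothetical BUILD for the entities package, using the macro above.
poetry_distribution(name="entities", package="entities")
```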
Does anyone know what I'm doing wrong?
shy-advantage-49800
12/29/2022, 4:46 PM
curved-television-6568
12/29/2022, 9:13 PM
straight-farmer-24902
12/29/2022, 10:39 PM
careful-address-89803
12/30/2022, 12:45 AM
(I tried `pex -r requirements.txt --lock requirements.lock`), but it didn't like the header pants put on that file.
happy-kitchen-89482
12/30/2022, 1:52 PM
rich-kite-32423
12/30/2022, 8:08 PM
conftest.py file, or any part of it? Thanks...
busy-vase-39202
12/30/2022, 11:01 PM
rich-kite-32423
12/31/2022, 6:16 PM
"…a `python_test` target for each file in the `sources` field."
Should that be `sources` "argument" rather than "field"? I've seen this in a number of places, and it had me hunting for a "sources field" until I guessed that it probably meant the function argument.
proud-dentist-22844
01/02/2023, 6:56 AM
`[python-infer].string_imports` and `[python-infer].assets`: https://www.pantsbuild.org/docs/reference-python-infer#string_imports
These flags work one way, but I would like to invert the signals given to the python dep inference system: the flag enables checking all strings for possible imports or assets, and then you use code comments or a `!` (negative) dependency to exclude false positives.
My code base has a lot of strings that look similar to an import but aren't, so turning string import inference on globally sounds very problematic to me.
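To make the existing escape hatch concrete (target paths hypothetical): with `string_imports` enabled, a false positive is currently suppressed with a `!` ignore in the BUILD file's `dependencies` field, not with a source comment:

```python
# BUILD — hypothetical: "!" drops a falsely inferred dependency.
python_sources(
    name="lib",
    dependencies=["!src/python/app/config"],
)
```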
So, instead of excluding false positives, could I use comments to say `# pants: string_imports on` and `# pants: string_imports off`? In other words, I want to enable string import inference selectively with code comments. I think something similar would be great for assets too.
creamy-truck-92790
01/02/2023, 1:15 PM
pants.toml
...
[GLOBAL]
backend_packages = [
    "pants.backend.python",
    "pants.backend.python.mixed_interpreter_constraints",
    "pants.backend.project_info",
]

[python-bootstrap]
search_path = ["<PYENV>", "<PATH>"]

[python]
interpreter_constraints = ["CPython==3.10.2"]

.python-version
3.9.13 # This is only set because pants doesn't work with 3.10.*
pyenv versions
system
3.9.13 (set by .python-version)
3.10.2
3.10.6
Now if I run anything with pants, it only finds my system interpreter and 3.9.13, but not the other two. Basically it doesn't matter what I set under search_path, I always get the same result:
Examined the following interpreters:
1.) /usr/bin/python3.10 CPython==3.10.8
2.) ~/.pyenv/versions/3.9.13/bin/python3.9 CPython==3.9.13
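For debugging, a rough sketch (an assumption about the mechanism, not Pants' actual code) of the directories a `<PYENV>` entry should scan, namely everything under `$PYENV_ROOT/versions` (default `~/.pyenv/versions`):

```python
import os

def pyenv_interpreter_dirs():
    """List the pyenv version directories a <PYENV> search-path entry scans."""
    root = os.environ.get("PYENV_ROOT", os.path.expanduser("~/.pyenv"))
    versions = os.path.join(root, "versions")
    return sorted(os.listdir(versions)) if os.path.isdir(versions) else []

print(pyenv_interpreter_dirs())
```

If 3.10.2 and 3.10.6 appear in this listing but Pants still doesn't offer them, the directory layout is probably not the problem.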
Any ideas, someone?
creamy-truck-92790
01/02/2023, 1:36 PM
happy-kitchen-89482
01/02/2023, 1:36 PM
happy-kitchen-89482
01/02/2023, 1:38 PM
`<PYENV>` is supposed to see all the pyenv-installed interpreters in the versions dir, regardless of which one is set as the current global.
creamy-truck-92790
01/02/2023, 1:38 PM
creamy-truck-92790
01/02/2023, 1:39 PM
creamy-truck-92790
01/02/2023, 1:39 PM
happy-kitchen-89482
01/02/2023, 1:39 PM
creamy-truck-92790
01/02/2023, 2:51 PM
creamy-truck-92790
01/02/2023, 3:15 PM
creamy-truck-92790
01/02/2023, 3:42 PM