wide-midnight-78598
12/02/2023, 12:12 AMpants
aware of my aliases? E.g. I alias docker to podman, but pants doesn’t pick that up nativelypurple-plastic-57801
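Pants won't see shell aliases, since it invokes tools directly rather than through your shell. One workaround, sketched under the assumption that your Pants version exposes `[docker].executable_search_paths` (check `pants help docker`), is to put a symlink named `docker` that points at podman on the search path:

```toml
# pants.toml -- hypothetical sketch; the option name comes from the docker
# subsystem and may differ across Pants versions.
[docker]
# A directory containing a symlink named `docker` pointing at podman,
# e.g. created with: ln -s "$(command -v podman)" ~/.pants-docker/docker
executable_search_paths = ["%(env.HOME)s/.pants-docker", "<PATH>"]
```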
12/02/2023, 1:41 AMmanual
and they never run unless explicitly invoked directly, or indirectly through a dependee.
Can I do something similar in pants?
In particular, I have a sphinx-build target that I never want to run in export-codegen.purple-plastic-57801
12/02/2023, 3:43 AMfile(
name="downloaded-sass",
source=per_platform(
macos_x86_64=http_source(
url="https://github.com/sass/dart-sass/releases/download/1.69.5/dart-sass-1.69.5-macos-x64.tar.gz",
len=3457742,
sha256="75e29a5bd76069abf2532c9a4b36b164f1c91a461452f0fbdc4167fd1685550c",
filename="sass.tar.gz",
),
linux_x86_64=http_source(
url="https://github.com/sass/dart-sass/releases/download/1.69.5/dart-sass-1.69.5-linux-x64.tar.gz",
len=3697303,
sha256="42b3d7b82098432f80d057da5e7464e96e6cd8d90827a9e3a47974f39c930218",
filename="sass.tar.gz",
),
),
)
shell_command(
name="extracted-sass",
command="tar -zxvf sass.tar.gz",
tools=["tar", "gzip"],
execution_dependencies=[":downloaded-sass"],
output_directories=["dart-sass"],
root_output_directory="./dart-sass"
)
run_shell_command(
name="sass",
command="XDG_CONFIG_DIRS=${CHROOT}:$XDG_CONFIG_DIRS {chroot}/sass $@",
workdir="/",
execution_dependencies=[":extracted-sass"],
)
clever-gigabyte-29368
12/02/2023, 6:04 PMpants run my_pex_binary_target
will rebuild from scratch. Could someone suggest how to troubleshoot?powerful-scooter-95162
12/04/2023, 2:13 AMCompleted: Typecheck using MyPy - mypy - mypy failed (exit code 2).
mypy: Invalid python executable './requirements_venv.pex_bin_python_shim.sh': Exec format error
alert-psychiatrist-14102
12/04/2023, 10:06 AMpants run my_target
, and then created the AMI with the relevant caches. However, when we create a new machine and run the same pants run my_target
it still spends a lot of time on building the requirements.
Does anyone know how to improve this? (We have huge requirements, and this stage can take up to 20 mins.)modern-manchester-33562
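One thing worth checking, as a sketch: whether the AMI actually captures the cache directories Pants reads at runtime. Built requirement wheels live in the named caches (the pex_root), not only in the LMDB store, so both locations need to survive into the new machine at the same absolute paths. The values below are the defaults spelled out for a hypothetical `ci` user; confirm the option names with `pants help global`:

```toml
# pants.toml -- cache locations to bake into the AMI (paths are examples).
[GLOBAL]
local_store_dir = "/home/ci/.cache/pants/lmdb_store"      # process/result cache
named_caches_dir = "/home/ci/.cache/pants/named_caches"   # pex_root lives here
```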
12/04/2023, 1:22 PMisort
but are experiencing issues when doing so. As many before me have reported, there seems to be confusion about which packages are 1st and 3rd party. I've tried many of the proposed solutions here and in the issue tracker, e.g. setting src = ["src/*"]
but that does not identify all 1st-party packages. The only thing that does exactly what is expected is to list all packages manually under:
[tool.ruff.lint.isort]
known-first-party = [
"my_package1",
"my_package2",
"my_packageN",
]
But surely that can't be the solution, can it? Is there anything else I've missed that I can try, or is this still an open issue?ripe-gigabyte-88964
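One possible gotcha, offered as a guess: in many versions, ruff's `src` setting expects source-root directories rather than glob patterns, so `src = ["src/*"]` may silently match nothing while listing the roots directly works:

```toml
# pyproject.toml -- sketch; directory paths, not globs.
[tool.ruff]
src = ["src"]
```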
12/04/2023, 6:53 PMlively-garden-66504
12/04/2023, 8:43 PMmodule_mapping
issue. I'm trying to use the pytest-json-report
package which has a nonstandard import format, but even module_mapping
seems insufficient to infer deps; I have to declare it explicitly as a dependency.lively-garden-66504
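For reference, a hedged sketch of the mapping (the `pytest-json-report` distribution exposes the module `pytest_jsonreport`, if memory serves). Note that even a correct mapping can't help dependency inference when the package is a pytest plugin loaded via entry points and never imported by your code; in that case an explicit dependency on the requirement target is the expected route:

```python
# BUILD -- sketch; the module name pytest_jsonreport is an assumption.
python_requirements(
    name="reqs",
    module_mapping={"pytest-json-report": ["pytest_jsonreport"]},
)
```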
12/04/2023, 10:20 PMpytest.main('./tests')
where it's located under dir/
from the repo root. I tried to get this to work via pants run dir:runner-target
but it doesn't work because pytest can't find that directory. cd dir && pants run :runner-target
also doesn't work.
So I included files(sources=["**"])
as a dependency of the Pex target. pants dependencies dir/tests_runner.py
shows that all the file targets are included as deps. But running the Pex still doesn't work in any case. So I ran with --preserve-sandboxes=always
and went inside the sandbox. I see dir/test_runner.py
instead, but nothing else in the directory.breezy-mouse-20493
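Separate from the sandbox-contents question, one bug in the runner itself is worth ruling out: modern pytest's `pytest.main` expects a list of arguments, not a string, so `pytest.main('./tests')` can fail regardless of which files are present. A minimal sketch of a list-based runner (file and directory names are hypothetical; wire it up with something like `pex_binary(entry_point="dir/tests_runner.py:main")`):

```python
# tests_runner.py -- sketch of a pytest runner for a pex target.
import sys


def build_pytest_args(extra: list[str]) -> list[str]:
    # pytest.main expects a list of CLI arguments, not a string.
    return ["tests", *extra]


def main() -> int:
    import pytest  # assumed to be a dependency of the pex

    return pytest.main(build_pytest_args(sys.argv[1:]))
```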
12/04/2023, 11:45 PMpants package
-- is to add a text resource
that has a version string inside it. (It's not something that I want to do long term, but it has gotten me this far...).
Today I'm looking at the pants package
goal and have added this to the build file:
python_sources(
dependencies=[":version"]
)
resource(
# Adding a version file as a package resource
name="version",
source="VERSION"
)
python_distribution(
name="dist",
...
provides=python_artifact(
...
# Adding a version here too, as required by `provides=`?
version="1.2.3+what",
),
)
So now I have two ways of declaring a package version. And two ways of inspecting the version at runtime.
In the first version, I am able to do version = pkgutil.get_data(__name__, "VERSION").decode().strip()
from the python runtime to get the package version using the text file. On the other hand, this is not guaranteed to coincide with the package metadata's notion of a version, which is given by the provides
argument of python_distribution.
Which versioning mechanism is correct? Also, what alternatives exist?
As for why we want to inspect the package version at runtime: Some of our CLI tools provide a --version option, and to accomplish this we inspect the package version and echo that back to the user in the terminal.lively-garden-66504
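The package-metadata version (the one from `provides=`) can also be read at runtime via the standard library, which avoids carrying a second source of truth in a resource file. A sketch, with the distribution name hypothetical:

```python
# version.py -- reads the version that provides=python_artifact(version=...)
# bakes into the installed distribution's metadata.
from importlib.metadata import PackageNotFoundError, version


def get_version(dist_name: str = "my-dist") -> str:
    try:
        return version(dist_name)
    except PackageNotFoundError:
        # Not installed (e.g. running straight from a source checkout).
        return "0.0.0+unknown"
```

A caveat: this only works once the distribution is installed, so the resource-file approach may still be the pragmatic choice for code run directly from sources.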
12/05/2023, 12:55 AM❯ dist/dir/tests_runner.pex
env: python3.9: No such file or directory
Can try to come up with a repro if needed. Docs and intuition suggest this should respect interpreter constraints so I'm fairly surprised.most-beach-54881
12/05/2023, 11:52 AMabundant-tent-27407
12/05/2023, 12:39 PMpants.toml
file I have the following:
find_links = [
"file://%(buildroot)s/wheels"
]
This then results in urls in the lockfile with "url": "file:///Users/scottmelhop/...../wheels/...
Obviously when I push to CI I get errors. Any ideas how to get around this without having to generate lockfiles each time I run CI?handsome-dusk-11158
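One hedged workaround, assuming a reasonably recent Pants version: the `[python-repos].path_mappings` option exists precisely so that machine-specific prefixes in `file://` find-links URLs are replaced by a symbolic name in the lockfile (check `pants help python-repos` for your version):

```toml
# pants.toml -- sketch; requires a Pants version that supports path_mappings.
[python-repos]
find_links = ["file://%(buildroot)s/wheels"]
# Lockfile URLs then contain "file://${WHEELS_DIR}/..." instead of an
# absolute local path, so the same lockfile works locally and in CI.
path_mappings = ["WHEELS_DIR|%(buildroot)s/wheels"]
```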
12/05/2023, 1:04 PMpex_binary(
name="bin",
script="streamlit",
args=["run", "projects/test/main.py", "--server.port", "8501", "--global.developmentMode", "false"],
dependencies=[":source"],
execution_mode="venv",
restartable=True,
)
However, because the path in the run command points to the local filesystem, this does not work when I use the pex in a Docker container. To make it work in the container, I have to change the args to point to the main file in the python env site packages like so:
args=["run", "lib/python3.11/site-packages/test/main.py", "--server.port", "8501", "--global.developmentMode", "false", "--server.address", "0.0.0.0"]
Is there a better way to accomplish this? I imagine people using gunicorn or something similar where a python file needs to be passed as an argument to the actual program would run into the same issue?rich-london-74860
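One way to sidestep filesystem paths entirely, sketched under the assumption that `streamlit.web.cli` is importable in your streamlit version: use a tiny wrapper as the pex `entry_point` and resolve the app path relative to the installed package at runtime, so the same pex works locally and in Docker. All names below (`run_app.py`, the `test` package, the port) are hypothetical:

```python
# run_app.py -- wrapper sketch; set pex_binary(entry_point="run_app.py:main").
import sys
from pathlib import Path


def build_argv(app_dir: str) -> list[str]:
    # The path is computed at runtime, so it is valid wherever the
    # pex's venv happens to live (locally or inside a container).
    app = str(Path(app_dir) / "main.py")
    return ["streamlit", "run", app,
            "--server.port", "8501",
            "--global.developmentMode", "false"]


def main() -> None:
    from streamlit.web import cli  # assumed import path; varies by version

    sys.argv = build_argv(str(Path(__file__).parent / "test"))
    sys.exit(cli.main())
```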
12/05/2023, 5:36 PMpants
, we publish packages using twine
(which I believe pants publish
uses as well), which uses the environment variables TWINE_REPOSITORY_URL
, TWINE_USERNAME
, TWINE_PASSWORD
.
When using pants
, it’s not possible to use environment variables in the repositories parameter of python_distribution, so instead we dynamically generate a ~/.pypirc
file with values pulled from environment variables. In the .pypirc
file, we defined an index alias:
[distutils]
index-servers = artifacts
[artifactory]
username = xxxx
password = xxxx
repository = https://artifacts.our-host.com/some/path/pypi/pypi-local
Therefore, in the python_distribution
target, repositories=["@artifacts"]
.
With this setup pants publish
publishes packages to our private pypi repository as expected. Recently a bug in CI stopped generating this .pypirc
file. Now there is no .pypirc
file, but the TWINE_*
environment variables are still defined, but somehow pants publish
still works?
If there is no .pypirc
file, then how does pants
know what @artifacts
is?
Is twine
defaulting to the environment variables?
If that is the case, then could any dummy alias be set for repositories
- e.g. repositories=["@null"]
(pants publish
will skip a package entirely if repositories
is not set at all)ripe-gigabyte-88964
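Twine's own documentation describes TWINE_REPOSITORY_URL as overriding the repository name/alias, which would explain the behavior: with the env vars set, the `.pypirc` lookup is effectively bypassed. If that's right, the alias mostly just needs to be non-empty so `pants publish` doesn't skip the package. A sketch of the env-only setup (values hypothetical; verify the precedence against your twine version):

```shell
# With these set, twine resolves the upload target from the environment,
# so a missing ~/.pypirc should not matter.
export TWINE_REPOSITORY_URL="https://artifacts.our-host.com/some/path/pypi/pypi-local"
export TWINE_USERNAME="xxxx"
export TWINE_PASSWORD="xxxx"
```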
12/05/2023, 6:57 PMboundless-zebra-79556
12/05/2023, 7:06 PMtorch
like documented and discussed many times before. Is there any reason not to make --style configurable for generating lockfiles here https://github.com/pantsbuild/pants/blob/6c90b26f62c108d5a8724cd92db3b69887f8f431/src/python/pants/backend/python/goals/lockfile.py#L120 ? Maybe we could drop the target-system advisory if the style is strict, assuming the developer knows what they are doing generating a machine-specific lock file?curved-television-6568
12/05/2023, 7:28 PMbreezy-mouse-20493
12/05/2023, 9:56 PMpants repl
. Given a repo that has multiple projects, what addresses are valid to use with pants repl
?
src
└── python
├── BUILD
├── project1
├── project2
├── project3
.
.
.
pants repl src/python/project1
works as expected. However if project1
and project2
both have a common third-party dependency, then I see Pex `ResolveError`s. So apparently it's not okay to do this.
Which is a bummer because I wanted to pants repl ::
. What should I be doing instead? This is sort of a hypothetical... Not sure who would actually want to open a REPL with the entire universe in it, but someone on my team might ask to do this and I want to know what the story is 😄breezy-mouse-20493
12/05/2023, 10:25 PMpants export --resolve=default
and then in VSCode I find that venv with "> python: Select interpreter". However, this doesn't install first-party code into the environment. Without first-party code installed in "editable" mode, I can't have VSCode launch and debug a Python module, as none of my first-party code is in the venv.alert-psychiatrist-14102
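Newer Pants versions have an export option for exactly this. A sketch, assuming Pants 2.18+ (check `pants help export` for the exact option name in your version):

```toml
# pants.toml -- installs first-party code into the exported venv
# as editable (PEP 660) installs.
[export]
py_editable_in_resolve = ["default"]
```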
12/06/2023, 1:14 PMcurved-manchester-66006
12/06/2023, 1:47 PMcold-vr-15232
12/06/2023, 3:32 PMbreezy-apple-27122
12/06/2023, 3:35 PMpants test
). When I run pants test, the files in the folder where I'm adding the tests are imported automatically even if my test file is empty. Is this common behavior? Is there a way to tweak it?
My BUILD file in the folder:
python_tests(name="tests")
__defaults__({"python_tests": {"sources": ["**/test_*.py"]}})
famous-kilobyte-26155
12/06/2023, 4:26 PMbrash-glass-61350
12/06/2023, 4:54 PMpants run
, I want to add the flag --filter-address-regex='.*@resolve=XXX'
. I imagine there's some way to do this in pants.toml
? In particular, can XXX
be an environment variable?
Essentially, I want to specify that when I call pants run path/to/my:file
it should call pants run path/to/my:file@resolve=XXX
, where XXX
is set according to some environment variable (or some other mechanism)fresh-cat-90827
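Any flag can be set in pants.toml under its scope, and pants.toml supports `%(env.NAME)s` interpolation, which covers the environment-variable part. A sketch, assuming the filter options are registered in the global scope in your version (older versions used a `[filter]` scope), with `MY_RESOLVE` a hypothetical env var:

```toml
# pants.toml -- hypothetical; MY_RESOLVE is an env var you define.
[GLOBAL]
filter_address_regex = [".*@resolve=%(env.MY_RESOLVE)s"]
```

Note this filters candidate targets; whether it also disambiguates a parametrized target for `pants run` is worth verifying.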
12/07/2023, 12:08 AMhelpful-rocket-63313
12/07/2023, 12:21 AMai-cpu
and ai-gpu
). My resolves look like:
python_requirements(
name="reqs",
resolve=parametrize("python-default", "ai-gpu", "ai-cpu"),
)
python_requirements(
name="ai-gpu",
source="ai-gpu-requirements.txt",
resolve="ai-gpu",
)
python_requirements(
name="ai-cpu",
source="ai-cpu-requirements.txt",
resolve="ai-cpu",
)
Based on the docs ("If a first-party target is compatible with multiple resolves, e.g., shared utility code, you can use the parametrize mechanism with the resolve= field."),
I should use `python_sources(resolve=parametrize('ai-gpu', 'ai-cpu', 'python-default'))`. Given my shared code is spread across multiple directories, it will get unmaintainable to always set these resolves for all the targets. So I am thinking of setting the defaults at the root of the python source code, like:
__defaults__(
{(python_sources, python_source): dict(resolve=parametrize("python-default", "ai-gpu", "ai-cpu"))}
)
and for the two main files (pex_binary targets) that I am interested in, I will set the resolve ai-cpu
, ai-gpu
. Do you think this is a good idea? Should __defaults__
be used/exploited like this? Does it form a good practice, or is there a better way of doing this?white-twilight-61019
12/07/2023, 4:05 AMcelery[sqs]
and celery[redis]
package in poetry (pyproject.toml
) config. I am able to generate the python_default.lock
file. However I'm not sure how to add these extras as a dependency in the BUILD file. Can someone please guide me on how to configure this?
# pyproject.toml
[tool.poetry.group.pkg-celery_worker.dependencies]
celery = {extras = ["redis", "sqs"], version = "^5.3.6"}
# BUILD file
python_sources(
name="pkg",
sources=["celery/**/*.py"],
dependencies=[
"3rdparty/python:poetry#boto3",
"3rdparty/python:poetry#celery",
"3rdparty/python:poetry#redis",
],
)
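For what it's worth, a hedged sketch: with `poetry_requirements`/`python_requirements` reading the pyproject.toml, the generated `#celery` target should already carry the `[redis, sqs]` extras, so depending on it pulls the extras' transitive deps from the lockfile. If you need to spell it out manually instead, `python_requirement` accepts the extras syntax directly:

```python
# BUILD -- sketch; only needed if you want an explicit target rather
# than the one generated from pyproject.toml.
python_requirement(
    name="celery-with-extras",
    requirements=["celery[redis,sqs]==5.3.6"],
)
```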