Hi! I'm trying to have my Python code communicate (via subprocesses) with:
• binaries (possibly built from source), and
• other Python virtual envs (e.g. installed via conda).

I'd like this to happen during development, (CI) testing, and deployment via PEX in Docker images (using `pex venv`). How do I best organise this with Pants?

Via `adhoc_tool`, I'm able to compile the binaries or install the venv via conda. These can then be used as dependencies for my `python_source` targets, which makes sense for `pants test` usage.
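For concreteness, my current setup looks roughly like this (just a sketch — target names, paths, and the build script are placeholders, and I'm assuming the experimental adhoc and shell backends are enabled):

```python
# BUILD — sketch only; names and paths are placeholders.
# Assumes pants.backend.experimental.adhoc and pants.backend.shell
# are in backend_packages in pants.toml.

# Build script checked into the repo, runnable via the shell backend.
shell_source(name="build_script", source="build.sh")

# Runs the script in a sandbox and captures the compiled binary as output.
adhoc_tool(
    name="compiled_tool",
    runnable=":build_script",
    output_files=["bin/mytool"],
)

# My Python code: depending on the adhoc_tool makes the binary available
# in the sandbox, so `pants test` can invoke it as a subprocess.
python_sources(
    name="lib",
    dependencies=[":compiled_tool"],
)
```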
However, when deploying to a Docker image, it seems to make more sense to me to move the built artifacts into the image separately from the PEX, in separate instructions (in which case we don't want to bundle the artifacts inside the PEX); see the Dockerfile sketch below. It seems quite annoying/unintuitive that Pants can tie those things together nicely for `pants test`, but when I'm building a Docker image I need an entirely different approach to bundle the same artifacts.
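What I mean is roughly this (paths and names are hypothetical; I'm assuming the PEX is built with `include_tools=True` so the `pex venv` tool is available at runtime):

```dockerfile
# Dockerfile — sketch; paths and names are hypothetical.
FROM python:3.11-slim

# The built artifacts, copied in separately from the PEX...
COPY dist/bin/mytool /usr/local/bin/mytool

# ...and the PEX itself, unpacked into a venv at image build time
# (requires the PEX to have been built with include_tools=True).
COPY dist/app.pex /tmp/app.pex
RUN PEX_TOOLS=1 /tmp/app.pex venv /app/venv && rm /tmp/app.pex

# The venv's `pex` script re-exposes the original PEX entry point.
ENTRYPOINT ["/app/venv/pex"]
```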
Is there a better way, where we can sensibly have Pants produce a single target that both runs with `pants test` and produces an artifact I can deploy directly into a Docker image? Thanks!