narrow-activity-17405
08/31/2020, 8:58 AM
narrow-activity-17405
08/31/2020, 9:36 AM
happy-kitchen-89482
09/01/2020, 4:22 AM
setup.py. Depending on it doesn't do anything. If you want to depend on a published version of B, then you do so like any other "3rdparty" requirement (but you have to have published it first). But I'm guessing that's not what you want? Typically, since A/tests and B are in the same repo, you would have A/tests depend on B as an internal dependency. That is, A/tests's python_tests should depend on B's python_library (not its python_distribution). Then the test can invoke B's code directly (rather than spawning a process to run it).
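For example, the BUILD files could look roughly like this sketch (the paths and target names are made up for illustration):
```
# B/src/python/b/BUILD  (hypothetical path)
python_library(
    name="b",
    sources=["*.py"],
)

# A/tests/BUILD  (hypothetical path)
python_tests(
    name="tests",
    sources=["test_*.py"],
    dependencies=["B/src/python/b"],  # B's python_library target, not its python_distribution
)
```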
happy-kitchen-89482
09/01/2020, 4:23 AM
_dist target is interesting, but there
happy-kitchen-89482
09/01/2020, 4:23 AM
setup.py into a dist (for example, should it build a wheel or an sdist?)
happy-kitchen-89482
09/01/2020, 4:28 AM
narrow-activity-17405
09/01/2020, 7:49 AM
narrow-activity-17405
09/01/2020, 7:58 AM
narrow-activity-17405
09/01/2020, 8:22 AM
hundreds-father-404
09/01/2020, 8:46 AM
narrow-activity-17405
09/02/2020, 6:44 AM
09/02/2020, 6:44 AMrun_pants(["run", "src/python/package_A/script"])
Using this approach, is it possible to run multiple binaries at the same time? I will give it a try, thanks for the tip!hundreds-father-404
09/02/2020, 7:12 AM
> Initially, I thought it is only for testing of plugins but it can probably be used to run any script within the project, right?
Typically, it's designed for running on synthetic test projects created via setup_tmpdir(). I imagine you want to run on your actual project, rather than some contrived example, right?
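For reference, that synthetic-project style usually looks something like this sketch (the import path, BUILD contents, and assertion are illustrative and may differ between Pants versions):
```python
from pants.testutil.pants_integration_test import run_pants, setup_tmpdir

def test_run_script() -> None:
    # A tiny project created just for this test.
    files = {
        "project/script.py": "print('hello from the synthetic project')",
        "project/BUILD": "python_binary(name='script', sources=['script.py'])",
    }
    with setup_tmpdir(files) as tmpdir:
        result = run_pants(["run", f"{tmpdir}/project:script"])
        assert result.exit_code == 0
```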
This is where things would start to fall apart. Tests run in a chroot (tmpdir) in Pants, meaning that they will only copy over files that are known about through the dependencies() field. (Why? This allows us to safely run tests in parallel and with remote execution.) So, your test would not have access to src/python/package_A/script.
You could put a dependency on the python_binary target at src/python/package_A/script, but that won't do what you're expecting. That will copy over the Python files with the source roots stripped, like package_A/script. It won't copy over the raw file, and it won't copy over the BUILD file either. So, run_pants(["run", "src/python/package_A/script"]) would complain "No BUILD file in src/python/package_A/script".
You could work around that by declaring a files() target that owns the script and its original BUILD file. A files() target doesn't strip source roots - the files get copied exactly as-is. Then, add a dependency on that files() target to the python_tests().
That works fine, and we do that for several Pants integration tests. Unless... `src/python/package_A/script` depends on a bunch of other files, like most projects do. You would need a files() target owning all of its transitive dependencies... which we've found is not feasible to do.
So, my suggestion is only helpful if either a) you're okay with creating a synthetic target, or b) "src/python/package_A/script" has very few dependencies. I suspect neither of those is true, unfortunately.
--
Given the above, Benjy is thinking through more robust ways to do what you're asking.
narrow-activity-17405
09/03/2020, 2:49 PM
> I imagine you want to run on your actual project, rather than some contrived example, right?
Exactly. Unfortunately, the scripts have quite a lot of dependencies. Looking forward to hearing about some more robust way 🙂 At the moment I had to disable integration tests. Many thanks!
narrow-activity-17405
09/03/2020, 3:02 PM
ModuleNotFoundError: No module named 'A'
I tried running S1 with ./pants run S1 as well as with "./dist/S1.pex", with the same result. How can this be solved? Maybe it is just a matter of PYTHONPATH or something like that? I thought that a pex is "just" a virtual environment, so anything that runs within it should be able to import anything that is there 🙂
This can be "solved" by deploying packages through PyPI, but we also need to build Docker images for our scripts, and it would be nice if we could just copy one pex file into each Docker image. Also, being able to easily test this functionality during development would be nice.
narrow-activity-17405
09/04/2020, 2:00 PM
import asyncio
import os
import sys

# Propagate the parent's sys.path (e.g. the contents of the running pex) to the child via PYTHONPATH.
pypath = ":".join(sys.path)
myenv = os.environ.copy()
myenv["PYTHONPATH"] = pypath

# Inside an async function:
PROCESS = await asyncio.create_subprocess_exec(
    script_path,
    stdin=asyncio.subprocess.PIPE,
    stdout=asyncio.subprocess.PIPE,
    stderr=asyncio.subprocess.STDOUT,
    env=myenv,
)