early-businessperson-60137
01/10/2025, 8:01 PM
• pip install the same version of playwright depended on by the test
• Run as root: `playwright install-deps chromium`
• Run as the test-runner user: `playwright install chromium`
It's possible we could get the `playwright install chromium` step to work via a `shell_command` dependency, but the fact that `playwright install-deps chromium` needs to run as root makes doing everything strictly through Pants dependencies tough. If we could do something like
COPY /playwright_test.py:thirdparty_deps.xyz
RUN pip install thirdparty_deps.xyz && playwright install-deps chromium
it would be super convenient.
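Spelled out a little, the wished-for flow might look like the Dockerfile below. This is only a sketch: `thirdparty_deps.xyz` stands in for whatever artifact Pants would export, and the `testrunner` user is hypothetical.

```dockerfile
FROM python:3.11-slim

# Install the test's third-party deps, including the same
# playwright version the test depends on.
COPY thirdparty_deps.xyz /tmp/thirdparty_deps.xyz
RUN pip install /tmp/thirdparty_deps.xyz

# As root: install Chromium's system-level libraries.
RUN playwright install-deps chromium

# As the unprivileged test-runner user: download the browser itself.
RUN useradd -m testrunner
USER testrunner
RUN playwright install chromium
```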
Example 2:
GCP Dataflow (an Apache Beam executor) has a weird execution model: it pickles Python code on the launcher, ships it to the workers, and expects the workers to have an environment that can run it. I haven't had a chance to try using a PEX yet, but I'm a bit nervous. However, they do support supplying the docker container for the worker, where people typically pip install the necessary deps and then either pip install -e or PYTHONPATH the source code. If we could instead package up our deps and source code and install them into the worker, it would be super convenient.
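For reference, the typical custom-worker image follows exactly that pattern. A sketch, assuming Beam's published SDK base images; the tag and paths are illustrative:

```dockerfile
# Beam publishes SDK base images for custom Dataflow workers.
FROM apache/beam_python3.11_sdk:2.53.0

# The usual approach: pip install the deps...
COPY requirements.txt /tmp/requirements.txt
RUN pip install -r /tmp/requirements.txt

# ...then make the pickled code's imports resolvable via PYTHONPATH.
COPY src/ /app/src/
ENV PYTHONPATH=/app/src
```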
I get the sense that the Multi-Stage Build in the python docker blog post is doing something kind of similar, but I couldn't quite grok what's going on there.
curved-manchester-66006
01/10/2025, 8:10 PM
RUN --mount=type=bind,target=/tmp/bound.pex,source=$PEX_FILE,ro PEX_TOOLS=1 python3.11 /tmp/bound.pex venv --compile $VENV_DIR
to get a "regular" venv in the docker image. The blog post is about optimizations on top of that.
early-businessperson-60137
01/10/2025, 8:44 PM
RUN $VENV_DIR/bin/activate
after that?
curved-manchester-66006
01/10/2025, 8:46 PM
early-businessperson-60137
01/10/2025, 8:48 PM
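[Editor's note] Putting the bind-mount trick from the 8:10 PM message together, a sketch only: the ARG/ENV names are illustrative, and the final PATH line is one common answer to the activation question, not something stated in this thread.

```dockerfile
# syntax=docker/dockerfile:1
FROM python:3.11-slim
ARG PEX_FILE=dist/app.pex
ENV VENV_DIR=/opt/venv

# Bind-mount the PEX read-only and ask pex-tools to expand it
# into an ordinary, compiled virtualenv inside the image.
RUN --mount=type=bind,target=/tmp/bound.pex,source=${PEX_FILE},ro \
    PEX_TOOLS=1 python3.11 /tmp/bound.pex venv --compile ${VENV_DIR}

# No "activate" needed: each RUN is a fresh shell anyway, so
# prepending the venv's bin/ to PATH has the same effect.
ENV PATH="${VENV_DIR}/bin:${PATH}"
```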