It probably goes against the entire ethos of pex files, but is there a way to dynamically add something to a pex file? I have a situation where developers launch Dagster jobs whose steps run remotely on Databricks and another platform. Those steps run from a Docker image that uses a PEX to package the Python code being executed. I'd like to support local development by shipping the code a developer is currently working on to the remote execution container and using it to update the PEX there with whatever source changes have been made (this would only ever be internal source code; dependencies wouldn't be upgraded), so the remote environment stays in sync with the developer's local environment. I've looked into building a new pex file and shipping it to the remote execution environment on every change, but that essentially rebuilds the pex from scratch each time, which can take multiple minutes. I'm looking for something that can ideally package and ship in under a minute.
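For a sense of scale, the fast path I have in mind is roughly the sketch below: a pex that bundles only first-party sources and passes no requirements, so there's no dependency resolution at all (the src/ layout, project name, and entry point are made up purely for illustration):

```bash
# Hypothetical layout: first-party code under src/, entry point my_project.main.
# With no requirements passed, pex has nothing to resolve, so this is
# essentially just zipping up the sources and finishes in seconds.
pex -D src -o my_project_sources.pex -e my_project.main
```

Building something like that is quick; the hard part is getting its contents into the full PEX that already lives on the remote side.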
So it'd be awesome if I could do something like ship a no-dependency pex file to the remote environment and merge it with the pex that's already there. I got close to something workable by unpacking the remote pex into a venv and then installing the updated source code at runtime, but the issue is that I don't have a good way to package up only the sources for a given target (corresponding to a Dagster user code location). I could package up a no-dep pex, ship it to the remote location, unpack the remote pex into a venv, and then manually install the sources into that venv. I've gone down this rabbit hole, and while it looks like it would work, it's super messy and requires manually manipulating the remote venv, which is obviously far from ideal. Any other suggestions for dynamically updating a pex file at runtime or merging two pex files?
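For reference, the manual venv route I described looks roughly like this (remote paths and artifact names are hypothetical, and it assumes the base pex was built with --include-tools so the pex venv tool is available):

```bash
# One-time on the remote host: expand the existing full pex into a venv.
# Requires the base pex to have been built with --include-tools (or --venv).
PEX_TOOLS=1 ./dagster_job.pex venv /opt/job_venv

# Per iteration: overlay only the updated first-party code shipped from the
# developer's machine (packaged as a wheel here, purely for illustration),
# leaving all third-party dependencies untouched.
/opt/job_venv/bin/pip install --no-deps --force-reinstall \
    ./my_project-0.0.0-py3-none-any.whl
```

It seems to work, but at that point the remote side is no longer really running "the pex", just a venv I keep poking at by hand.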