# general
r
To temporarily install locally built wheels for local testing, I've set up a workflow like: https://github.com/lablup/backend.ai/tree/main/wheelhouse . The pain point is that
pants generate-lockfiles --resolve=python-default
takes too long (more than a minute), and I need to rerun it whenever I modify even a single package referenced this way. Could I scan and update the lockfile for only the relevant package names? (e.g.,
pants generate-lockfiles --resolve=python-default --include-only=aiodocker
)
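(For context, the wheelhouse approach is configured roughly like this in pants.toml; this is a sketch, and the actual setup in the linked repo may differ:)

```toml
# pants.toml -- sketch; the real backend.ai config may differ
[python-repos]
# Let the resolver pick up locally built wheels from the
# wheelhouse/ directory in addition to PyPI.
find_links = ["file://%(buildroot)s/wheelhouse"]
```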
h
Curious about why you need to install locally built wheels, rather than consume the code directly from the repo at HEAD?
r
Want to modify a third-party package and lint/check/test against it on the fly. I'm the maintainer of aiodocker, and I'd like to confirm that the changes have minimal impact on existing codebases that use it, like Backend.AI and pantsbuild.
Regenerating lockfiles works, but it just takes TOO LONG to do whenever I change a single line of code in the third-party package. I also don't want to commit & push & trigger CI & pull & etc. just to change and verify a few lines of code.
One of the motivations here is that I'm trying to add type annotations to the overall public API of aiodocker and check the impact on the downstream...
h
So aiodocker lives in some other repo? Or in the same repo?
r
h
Ah, that makes sense
So your process today is - publish in that other repo, bump the version in your monorepo's requirements.txt, and regenerate the lockfile?
r
yes, and I’d like to reflect the local changes of aiodocker without taking too much time (regenerating lockfiles…)
actually putting “aiodocker @ file:///…” in requirements.txt works with latest pex and pip
aiodocker uses hatch to automatically version with the commit distance to the latest release tag
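(A minimal sketch of hatch's VCS-based versioning; aiodocker's actual pyproject.toml may differ:)

```toml
# pyproject.toml -- minimal hatch-vcs sketch, not aiodocker's actual config
[build-system]
requires = ["hatchling", "hatch-vcs"]
build-backend = "hatchling.build"

[tool.hatch.version]
# Derive the version from git tags, producing a dev version that
# encodes the commit distance, e.g. 0.22.2.dev3+g1a2b3c4 when
# there are 3 commits past the last release tag.
source = "vcs"
```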
these make the lockfile's aiodocker version auto-update when i regenerate the lockfile after a local commit, without modifying requirements.txt, but i still need to regenerate it anyway to make the changes take effect when i run other pants commands
if this is just a caching problem, it could be worked around by deleting only the part of the pants local cache related to aiodocker. would this be possible?
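(The direct-reference requirements.txt line mentioned above looks roughly like this; the checkout path is hypothetical:)

```
# requirements.txt -- the checkout path is hypothetical
aiodocker @ file:///home/me/src/aiodocker
```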
after making local changes:
• having to make a local commit - acceptable, but i'd like to skip this if possible
• having to bump a local version - this can be eliminated by using a file:// url
• having to regenerate the lockfile - NOT ok because it takes too long…
for the backend.ai monorepo, regenerating python.lock takes more than 90 seconds usually…
h
I see, so you want an auto-refreshing editable install, something like that?
Or that file:/// hack to work...
r
yes!
h
So there is no way that I can think of to get this sort of thing working. How would you do this outside of Pants? With file:///?
You couldn't then check that in, since it points to a local file
r
For projects not using pants, I just
pip install -e /path/to/local/checkout
in the venv.
I can do this as well for the exported venvs, but the problem is the lint/check/test environments that pants "hermetically" generates.
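(A sketch of the exported-venv variant; the export path is illustrative and depends on the pants version and interpreter:)

```shell
# Export the resolve's venv, then editable-install the local checkout into it.
# Recent pants versions place exports under dist/export/python/virtualenvs/.
pants export --resolve=python-default
dist/export/python/virtualenvs/python-default/*/bin/pip install -e /path/to/local/checkout
```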
h
Right, so this is just for local debugging, it sounds like. You don't ever check in any state that refers to the local checkout.
Pants doesn't really like having local state that isn't reconstructible from checked-in state
So this is an interesting use case
r
well, essentially what i'm going to do is break pants' abstraction of hermetic environments by punching a hole with editable packages, but if this became possible, it would be easier to develop across multiple repositories.
we are doing a kind of semi-monorepo or hybrid approach: most first-party code is in a monorepo, while multiple packages are developed together as external dependencies. (enterprise-only plugins, open-source packages in other organizations like aio-libs, etc.)
h
Yeah, would be good to have a robust solution for this
Maybe this is one for the new Python backend