adventurous-rain-63973
01/24/2024, 10:14 PM

wide-midnight-78598
01/25/2024, 3:12 PM

adventurous-rain-63973
01/25/2024, 3:15 PM

wide-midnight-78598
01/25/2024, 3:17 PM

adventurous-rain-63973
01/25/2024, 3:31 PM
We have some libs that are shared and some projects that use them. Everything works well, and at the current status everything matches the requirements. I'm explicitly distinguishing between libs (shared) and projects.
However, there are two cases we have to deal with, which might break things.
1 - A developer is working on a new project that is still in development. They would like to keep integrating all the transitive dependencies compatible with the main resolve, but they also want some freedom, like in a sandbox.
2 - We have a project that will use some of the libs, will potentially never be a dependency of any other project, and introduces a third-party dep that, for whatever reason, conflicts with the set currently in the resolve.
In both situations we currently force it to work. For instance, if that library has to enter the requirements and fails because of a strict or misconfigured setup.py, we fork the project and relax those strict requirements to build a modified wheel. The same happens in case (1), when the developers are still unsure whether that third-party dependency will ever be adopted but in the meantime struggle to make it work with the monorepo. This is not effortless, and it is happening more often.
Therefore, I'm trying to identify a way to manage this. I see that there is a minimum set of deps (which is actually big) that has to be global; I would identify this with the libs resolve. Then I would like any other project to be part of other resolves (even one each), but all inheriting the same set of dependencies and pinned versions of the third-party deps from libs.
I don't know if I made myself clear better this time 😕
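
A minimal sketch of what the shared half of that setup could look like as a Pants target; the file path, target name, and resolve name here are hypothetical, and the libs resolve itself would still have to be declared in pants.toml under [python.resolves]:

# 3rdparty/python/BUILD (hypothetical path and names)
python_requirements(
    name="libs_reqs",
    source="requirements-libs.txt",
    resolve="libs",
)

Running pants generate-lockfiles --resolve=libs against that target would then produce the pinned set that the rest of the thread refers to as libs.lockfile.
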
wide-midnight-78598
01/25/2024, 4:03 PM
We have a monorepo, and we are religiously keeping one single resolve..... Whelp, I see a problem 🙂 What's the reason for this? It sounds pretty explicitly like sticking to a single resolve is the underlying problem, no? And so, because everything needs to be in a single resolve, something like this won't fly?
python_requirements(
    name="webapps_shared",
    source="webapps-shared-requirements.txt",
    resolve=parametrize("webapps_django3", "webapps_django4"),
)
I guess I'm not understanding how what you were proposing above would solve the problem with transitive dependencies. Like, you have a large resolve file, but then want to override a dep - that would have to transitively override dependencies all the way down, no? So aren't you just at multiple resolves with more steps?
Is there any way to make relaxed constraints files work? https://www.pantsbuild.org/2.18/reference/subsystems/python#resolves_to_constraints_file

adventurous-rain-63973
01/25/2024, 4:11 PM
The reason is to guarantee, across the libs, consistency with the third-party libraries with exact versions (security policies, for instance). At this point, let's call that set libs.
It's not really that I have a big resolve that I want to override. It's more that I want to extend it by adding something for a single project. If I simply replicate the libs-requirements.txt file, how can I guarantee that all the extended resolves pick the same versions of the common libraries (the ones in the libs lockfile)?

adventurous-rain-63973
01/25/2024, 4:13 PM
I could use the libs lockfile as constraints to generate the extended one.
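
One rough way to do that is sketched below: a small script that extracts the pins from the libs lockfile into a pip-style constraints file. This is not a Pants command; it assumes the lockfile is the PEX-style JSON that recent Pants versions generate (a header of "//" comment lines followed by JSON whose locked_resolves[*].locked_requirements[*] entries carry project_name and version), and the script name and output path are made up, so check the field names against your own lockfile.

# lockfile_to_constraints.py (hypothetical helper, not part of Pants)
import json
import sys

def lockfile_to_constraints(lockfile_path, constraints_path):
    with open(lockfile_path) as f:
        # Skip the "//" comment header that precedes the JSON body.
        body = "".join(line for line in f if not line.lstrip().startswith("//"))
    lock = json.loads(body)
    pins = set()
    # Collect one "name==version" pin per locked requirement.
    for locked_resolve in lock.get("locked_resolves", []):
        for req in locked_resolve.get("locked_requirements", []):
            pins.add(f'{req["project_name"]}=={req["version"]}')
    with open(constraints_path, "w") as out:
        out.write("\n".join(sorted(pins)) + "\n")

if __name__ == "__main__":
    # e.g. python lockfile_to_constraints.py libs.lockfile libs-constraints.txt
    lockfile_to_constraints(sys.argv[1], sys.argv[2])

The resulting constraints file could then be mapped to the per-project resolves through the resolves_to_constraints_file option linked above, so that generating their lockfiles keeps the versions already locked for libs.
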
wide-midnight-78598
01/25/2024, 4:25 PM
overrides

adventurous-rain-63973
01/25/2024, 4:41 PM
With libs I'm referring to Python projects that are supposed to be imported. They have a requirements-libs.txt file; let's say it contains requests>1.0.0. I generate a libs.lockfile that pins the versions of the third-party libraries and picks requests==1.0.1.
Then I have projects A and B (this is a problem we actually have). They both use some of the libs. A adds batchgenerators to its requirements list, which has the transitive requirement unittest2. B instead wants to use dbt, which is incompatible with unittest2. To be more precise, just having unittest2 present makes it fail when launched. In this situation libs cannot work as a global resolve.
What I would like to do is create a new resolve and its own requirements.txt file for each of them, but inheriting the requirements and exact versions from libs. If I simply tell the target to use requirements-libs.txt, I don't think I can guarantee ending up with the exact same versions as in libs.lockfile, like requests==1.0.1. Essentially, I need those exact versions to be used as constraints when generating A.lockfile and B.lockfile.
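
A sketch of how that could be wired up, with made-up paths and resolve names: each project gets its own requirements file and resolve, the resolves (libs, project-a, project-b) are declared in pants.toml under [python.resolves], and, per the resolves_to_constraints_file option linked earlier, the project resolves can point at a constraints file derived from libs.lockfile so that shared pins like requests==1.0.1 carry over when A.lockfile and B.lockfile are generated.

# projects/a/BUILD (hypothetical layout)
python_requirements(
    name="reqs",
    source="requirements.txt",  # the libs requirements plus batchgenerators
    resolve="project-a",
)

# projects/b/BUILD
python_requirements(
    name="reqs",
    source="requirements.txt",  # the libs requirements plus dbt
    resolve="project-b",
)

Only the leaf projects opt into their own resolve here; everything under libs stays on the shared libs resolve, so its lockfile remains the single source of truth for the common pins.
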
curved-manchester-66006
01/25/2024, 6:11 PM

adventurous-rain-63973
01/25/2024, 7:59 PM

curved-manchester-66006
01/25/2024, 8:21 PM
If you could take the existing lockfile for foo, cp it, and then add new projects to it without changing the existing pins, would that help?

adventurous-rain-63973
01/25/2024, 8:26 PM

curved-manchester-66006
01/25/2024, 8:33 PM

curved-manchester-66006
01/25/2024, 8:34 PM
wc -l 3rdparty/py/requirements.txt
134 3rdparty/py/requirements.txt
and I'd be curious at what order of magnitude you are running into a wall for "one big shared lockfile"

01/25/2024, 8:38 PM