# general
Just some thoughts on making working with resolves easier and giving developers a bit more flexibility: does Pants have (or will it have) a feature such that, if `A` is the set of requirements input into the resolve `python-default`, and `B` is a derivative set of requirements in the sense that the resolve of `python-default-B` is compatible with the sub-resolve of the closure of `./pants dependencies proj/lib::` (filtered `A`), then the `python_sources` in that glob can be assumed compatible with `python-default-B` without having to declare it and maintain a list?

The idea is that as more and more people add to `proj/app` while `proj/lib` stays relatively stable with a minimal set of third-party dependencies, people can start defining deltas `B, C, D, ...` to the baseline requirements/resolve and not have to worry about maintaining the `BUILD` files of `proj/lib/**`. It would just automatically work, because Pants would detect that `python-default-B` is compatible with `python-default` as far as `proj/lib/**` is concerned (or, in reality, the closure of whatever the app is depending on in first-party code). Basically, I'm wondering if Pants does, can, or even should have the ability to infer resolve compatibility (similar to how it does dependency inference).

Furthermore, can we imagine a project in `proj/app` pinning its requirements in a `pyproject.toml` and asking Pants to take care of the rest, creating an implied `python_requirements(name="someapp")` and a corresponding resolve `python-default-someapp`, perhaps by just adding a directive in its `BUILD` files (or in `pyproject.toml`, with Pants inferring that it applies to all `BUILD` file targets below it) such as `auto_resolve_with_base_requirements = "3rdparty/python:A"`?
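As a sketch, such a directive might read like this in the app's `BUILD` file (everything here is hypothetical: `auto_resolve_with_base_requirements` is the directive proposed in this post, not an existing Pants field):

```python
# proj/app/someapp/BUILD -- hypothetical sketch of the proposed directive;
# this is NOT existing Pants syntax.

# The app pins its own requirement deltas in its pyproject.toml; Pants would
# then create an implied python_requirements(name="someapp") target and a
# derived resolve "python-default-someapp" layered on the baseline inputs A.
auto_resolve_with_base_requirements = "3rdparty/python:A"
```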
Maintaining requirements/resolves in the top-level `3rdparty` package and then updating heavily depended-on targets throughout the repo (lib/util code) for the highly individual requirement deltas of apps/services feels wrong (but see the philosophical point below). I think as of now you can place the new `python_requirements` in the project's subdirectory, declare the new resolve, etc., but it still seems you would have to go through all the first-party deps and add your resolve to their lists, which feels a bit unsure to me. Also, the author of the app would have to make sure their `python_requirements` includes all the stuff their first-party dependencies care about (not just their app). It's easy enough to write a script that forms the right input requirements from just their own requirements, but it could still be error-prone or a frustrating experience for someone not used to working with Pants or a monorepo.

At least initially, Pants could error if the requirements delta does not result in a resolve compatible with the default resolve of the closure of its dependencies. But even this might be too restrictive. For example, if
`python-default` has `pandas==1.1.0` but the new app changes the requirements to `pandas==1.2.0` and depends on some first-party `python_sources` that depend on `3rdparty/python:A#pandas`, then this is a conflict. But probably the individual app does not care, and the author does not have time to initiate a repo-wide upgrade of pandas (and all the governance/testing/signoff that would be needed for that). So, given some option like `override_autoresolve_conflicts = true`, Pants could assume that the dependency is also compatible with `python-default-B` and use `3rdparty/python:B#pandas` instead.

The philosophical point is that app/service builders should be able to (if they want to take the risk) override the pinned requirements more easily, and without having to maintain compatible-resolve lists in the shared first-party code. Most of that code has very flexible requirements anyway, but they are nonetheless pinned down from Pants's perspective (unless we make the inputs `A` have more open constraints, but then you are making apps have loose constraints, and their dependencies can all of a sudden change if someone runs `pants generate-lockfiles` again). On the other hand, with rampant project-specific version pinning, the repo gets harder to maintain, people are less likely to test their apps against upgrades, and those apps will likely stop working completely in the future when one of the overrides actually results in a crash in one of their dependencies. Some warning, or a configurable limit on the number of builds of such a project, could be an idea to manage this, though?
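To make the "script that forms the right input requirements" idea above concrete, here is a minimal sketch (not part of Pants; the names are invented, and requirements are assumed to be simple `name==version` pins): it overlays an app's delta `B` onto the baseline inputs `A` and reports which pins conflict with the baseline, which is exactly the pandas situation above.

```python
# Hypothetical helper, not part of Pants: merge an app's requirement delta
# onto the baseline resolve inputs, tracking conflicting pins.

def parse_pin(req: str) -> tuple[str, str]:
    """Split a 'name==version' pin into (normalized name, version)."""
    name, _, version = req.partition("==")
    return name.strip().lower(), version.strip()

def merge_requirements(baseline: list[str], delta: list[str]):
    """Overlay `delta` onto `baseline`; return (merged pins, conflicts).

    A conflict is a package pinned to different versions in the baseline
    and the delta; the delta wins (the `override_autoresolve_conflicts`
    behavior sketched above).
    """
    merged = dict(parse_pin(req) for req in baseline)
    conflicts = []
    for req in delta:
        name, version = parse_pin(req)
        if name in merged and merged[name] != version:
            conflicts.append((name, merged[name], version))
        merged[name] = version
    return [f"{n}=={v}" for n, v in sorted(merged.items())], conflicts

merged, conflicts = merge_requirements(
    ["pandas==1.1.0", "requests==2.28.1"],   # baseline inputs A
    ["pandas==1.2.0", "uvicorn==0.18.3"],    # app delta B
)
print(merged)     # pandas overridden to 1.2.0, uvicorn added
print(conflicts)  # [('pandas', '1.1.0', '1.2.0')]
```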
regarding the first half of this (sorry, Slack isn’t great for long format posts like this): “being compatible with a sub-resolve” is effectively equivalent to just using the super/larger-resolve. only the relevant portion of a resolve is used with each consumer. but if you defined a “delta”, then you would no longer necessarily be consuming a subset of the larger resolve… unless the delta only broadened/loosened ranges, i suppose? but i’m not sure how useful that is.
> Maintaining requirements / resolves in the top-level `3rdparty` package and then updating targets throughout the repo that are heavily depended on (lib/util code) for the highly individual requirement deltas of apps/services feels wrong (but see philosophical point below).
although multiple named resolves enable a lot of use cases that a single global resolve does not, the implicit assumption is still that you will only end up with a handful, which are maintained by a smaller number of folks in your repository. my suggestion is that per-project resolves should be pretty rare, and treated as tech debt (for cases like importing a new project into the monorepo, etc.) which is eventually unified
but to be clear: we want to reduce boilerplate significantly, and `__defaults__` will go a long way there in `2.14.x`, since you’ll be able to adjust the resolves for entire subdirectories quickly
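For reference, the `__defaults__` mechanism mentioned here looks roughly like this in a `BUILD` file (a sketch based on the `2.14.x` feature; the target types and resolve name are illustrative):

```python
# proj/lib/BUILD -- sketch: set the default resolve for every python_sources
# and python_tests target in this subtree at once, instead of repeating
# `resolve=...` on each target.
__defaults__(
    {
        (python_sources, python_tests): {"resolve": "python-default"},
    }
)
```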
it’s just not necessarily a goal to make it easy to do resolve-per-project
and yea, you acknowledge all of this in your post (“On the other hand, by having…“)
it’s a spectrum, and we want to make it easy to live closer to one end, while still enabling you to fluctuate toward the middle