# general
l
Re: the multiple resolves feature in Pants 2.10 beta - https://www.pantsbuild.org/v2.10/docs/reference-python#section-enable-resolves - I see the statement about this feature not working if you have VCS requirements. There's also mention of a workaround. Is that workaround specific to the issues with Poetry's lockfile generation, or does it also apply to VCS requirements? And if it doesn't apply to VCS requirements, it seems safe to conclude that if you have VCS requirements (I have 2), then at least for 2.10, you cannot use the multiple resolves feature?
h
Ah, that should maybe be a new paragraph re: the workaround. The workaround is indeed not for VCS requirements, nor for `[python-repos]`; it only covers the issue with transitive deps and bad environment markers. However, there is a way to try out multiple resolves with some compromises. The key issue for VCS is that pip requires `--hash` for every single entry if any single entry uses it, but there is no `--hash` for VCS requirements. So you can't use `--hash` at all, which is a bummer for supply-chain safety, but you can still at least use multiple resolves.

Two ideas:
1. Still use `generate-lockfiles`, but then manually strip the hashes.
2. Don't use Pants to generate the lockfiles. Use a technique like the one from https://www.pantsbuild.org/docs/python-third-party-dependencies#user-lockfile. You'll need to disable Pants's lockfile staleness checks by setting `[python].invalid_lockfile_behavior = 'ignore'`.

Does that make sense?
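Idea 1 can be automated. Here is a minimal sketch (not from the thread, and the lockfile layout is an assumption) of a script that strips pip's `--hash=...` options and their now-dangling line-continuation backslashes from a requirements-style lockfile:

```python
import re


def strip_hashes(lockfile_text: str) -> str:
    """Remove pip --hash options from a requirements-style lockfile so that
    VCS requirements (which have no hash) can coexist with the rest."""
    result = []
    for line in lockfile_text.splitlines():
        # Drop any --hash=<algo>:<digest> tokens on the line.
        cleaned = re.sub(r"\s*--hash=\S+", "", line)
        # Drop a trailing line-continuation backslash left behind.
        cleaned = re.sub(r"\s*\\\s*$", "", cleaned)
        # Skip lines that contained only hash entries.
        if cleaned.strip():
            result.append(cleaned)
    return "\n".join(result) + "\n"


if __name__ == "__main__":
    sample = (
        "flask==2.0.1 \\\n"
        "    --hash=sha256:deadbeef \\\n"
        "    --hash=sha256:cafebabe\n"
        "requests==2.27.1\n"
    )
    print(strip_hashes(sample))
```

This assumes the common pip layout where hashes continue a requirement line via trailing backslashes; a lockfile formatted differently would need the regexes adjusted.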
l
Interesting - I've been generating a lockfile via `pip freeze`, and hadn't checked out the `invalid_lockfile_behavior` option yet. Is that specific to 2.10?
h
That option was added in Pants 2.7, when we added tool lockfiles. A major change with multiple resolves is that Pants should be generating lockfiles for you, not only for tools but also for your user code. We hate that users have had to generate them manually. The catch is that the automatic generation doesn't work if you have `[python-repos]`, VCS requirements, or hit that transitive-deps and environment-markers issue mentioned in the docs. So you won't yet benefit from Pants doing generation, but you can still benefit from Pants being able to consume multiple lockfiles. Then, hopefully in Pants 2.11, we'll be able to switch to Pex doing lockfile generation a la pip, which will address those limitations.
l
I think idea 1 above seems reasonable - generate lockfiles and strip hashes; that's a simple one-time task. And that will work with multiple resolves in 2.10, even with a couple of VCS requirements?
h
Under the hood, Pants converts your requirements into a Poetry pyproject.toml and then calls Poetry to generate the lockfile. I am not 100% certain that Pants properly converts the VCS requirement for Poetry, so it is possible that you would also need to manually fix up the generated lockfile to switch back to the VCS version rather than the normal one from PyPI. But yes, once you can get valid lockfiles generated (however you get there), you should be good to go, as long as `[python].resolves` points to the right paths and you follow the instructions about the `resolve` field etc. Lockfile consumption is somewhat decoupled from lockfile generation, to your benefit.
l
@hundreds-father-404 thanks for this guidance - wanted a quick sanity check. I generated a lockfile via `generate-lockfiles`, then kept the metadata in place and replaced everything else with the contents of `pip freeze`. I then replaced my Flask requirement with a VCS path like this:
```
flask@git+ssh://git@github.com/redacted/flask.git@rr/click8 ; platform_system=='Linux'
```
And it appears to work. Oh - I'm on 2.10.0rc2, using `enable_resolves = true`, and this is in `3rdparty/python/default_lock.txt`. Is it expected that the above works? I'm certainly glad it does! I haven't fully tested, but it seems good so far; I just wanted to verify I'm not doing something wacky that happens to work.
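As an aside, a quick way to sanity-check that a VCS requirement string like the one above is well-formed PEP 508 is to parse it with the `packaging` library (a sketch, using the redacted URL exactly as given in the message):

```python
from packaging.requirements import Requirement

# The VCS requirement string from the message above, with its env marker.
req = Requirement(
    "flask@git+ssh://git@github.com/redacted/flask.git@rr/click8"
    " ; platform_system=='Linux'"
)
print(req.name)    # package name
print(req.url)     # the git+ssh URL
print(req.marker)  # the environment marker
```

If `Requirement(...)` raises `InvalidRequirement`, the string would likely also confuse pip or Pants.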
h
Excellent! Yes, it is expected to work.
Hey @lively-exabyte-12840 check out https://github.com/pantsbuild/pants/pull/14675 😄 Feedback definitely appreciated!