# general
r
Hello team 👋 I have a situation where for one particular developer on one particular machine, `pants` on v2.16 (yes, I know it's deprecated) keeps using the wrong interpreter constraints for resolves, instead of what's in `pants.toml`, when running `export` or `generate-lockfiles`. This generally results in `export` or `generate-lockfiles` failing. Other developers on different machines cannot replicate the issue. I've verified that the user does not have a `~/.pants.rc` file, and the same error appears when using `--no-pantsrc`. If I use the `--keep-sandboxes=on_failure` parameter, then once the command fails, if I remove the `--python` parameter from the `__run.sh` script, the script succeeds. Is there anything else I should be checking?
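For anyone following along, this is roughly the debugging loop being described, using only the flags mentioned in this thread; the sandbox path below is illustrative, since Pants logs the real preserved directory when the run fails:

```sh
# Re-run with the sandbox preserved on failure.
pants --keep-sandboxes=on_failure generate-lockfiles

# On failure, Pants logs the preserved sandbox directory; the glob below is
# only an illustration of where those directories typically land.
cat /tmp/pants-sandbox-*/__run.sh

# In __run.sh, check which interpreter the --python argument points at and
# compare it with the interpreter constraints declared in pants.toml.
```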
b
What python versions do people have installed? Any notable differences?
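A quick, Pants-agnostic way to compare interpreter inventories between a working machine and the failing one (the pyenv check only applies if pyenv happens to be installed):

```sh
# Enumerate every CPython the shell can see, with full paths and versions.
for py in python3.8 python3.9 python3.10 python3.11 python3.12; do
  for path in $(which -a "$py" 2>/dev/null); do
    echo "$path -> $("$path" --version 2>&1)"
  done
done

# If pyenv is in use, its shims can shadow the system interpreters.
command -v pyenv >/dev/null 2>&1 && pyenv versions
```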
r
Everyone basically has 3.8 and 3.9 installed
b
Can you confirm specifically what a working machine has and the failing machine has?
r
Everyone has 3.8.10 and 3.9.6
But I don't believe the exact Python versions matter here; `pants` is using 3.8 on resolves where we specifically set `interpreter_constraints>=3.9`. And on `generate-lockfiles` specifically, we get this error:
```
The conflict is caused by:
     The user requested pendulum<3
     dagster 1.7.9 depends on pendulum<4 and >=3; python_version >= "3.12"
```
but there is no `interpreter_constraints` setting where we even allow Python 3.12, nor is Python 3.12 installed on this user's laptop
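For context, a sketch of how that kind of per-resolve setup is usually expressed in `pants.toml`; the resolve name, lockfile path, and constraint strings here are placeholders, not this repo's actual config:

```toml
[python]
enable_resolves = true
# Repo-wide default; placeholder values.
interpreter_constraints = [">=3.8,<3.10"]

[python.resolves]
# Placeholder resolve name and lockfile path.
data-pipeline = "3rdparty/python/data-pipeline.lock"

[python.resolves_to_interpreter_constraints]
# Per-resolve override, like the ">=3.9" constraint mentioned above.
data-pipeline = [">=3.9,<3.10"]
```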
b
Thanks for the detail, and sorry for the trouble. Can you share your pants.toml and/or any relevant interpreter constraints? I'm not sure how you'll best narrow this down… some ideas:
• use a fresh cache by redirecting the caches to temporary dirs (https://www.pantsbuild.org/2.21/reference/global-options#local_store_dir and https://www.pantsbuild.org/2.21/reference/global-options#named_caches_dir) — see the sketch below
• compare the __run scripts in detail, especially any paths to Python interpreters
• ???
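A sketch of the fresh-cache idea from the first bullet, using the two global options linked above (the temp-dir layout is a placeholder, not a required structure):

```sh
# Point Pants at throwaway cache dirs so stale cached resolves can't interfere.
TMP_CACHE="$(mktemp -d)"

pants \
  --local-store-dir="$TMP_CACHE/lmdb_store" \
  --named-caches-dir="$TMP_CACHE/named_caches" \
  --keep-sandboxes=on_failure \
  generate-lockfiles

# Clean up afterwards.
rm -rf "$TMP_CACHE"
```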
r
Sorry, the user who is having trouble with this is a little busy this week, but I will get back to you on this thread as soon as I can
👍 1