# general
a
Hello 👋! Question about rebuilding invocations if no code has changed. It seems that for me, `pants package` caches the state of the environment to decide when to re-run. The following holds:
```
./pants package <targetX>  # proper building happens
./pants package <targetX>  # no building happens
# close and open terminal, no code changes
./pants package <targetX>  # again rebuilding
```
Is that expected? I'm on 2.16.0dev7 and quite an old `./pants` launcher script (no scie-pants yet). Thanks!
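One way to narrow down what differs between the two terminal sessions (file paths here are illustrative) is to dump and diff the environment, since the code itself hasn't changed:

```shell
# Dump the environment in the first terminal session:
env | sort > /tmp/env_first

# ...then run the same thing in the freshly opened terminal:
env | sort > /tmp/env_second

# Any line printed here is a candidate for what invalidated the cache:
diff /tmp/env_first /tmp/env_second
```

Run in a single session (as here), the two dumps are identical and `diff` prints nothing; across a terminal restart, any session-specific variable will show up.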
h
What are the packageable targets you're building in this example? `pex_binary`? `docker_image`?
a
Hello @happy-kitchen-89482, in this case it is `pex_binary`.
h
Hmm, this shouldn't be happening. Are you sure it's rebuilding fully, or are you seeing some work happening because pantsd restarted?
Can you show more detailed logs?
a
Thanks! Actually, after increasing the log verbosity I'm starting to think I might be onto something: the `fnm_multishells` entries in `PATH` change on every invocation (this is an artifact of https://github.com/Schniz/fnm, the Node version manager...). Here are the logs: first build, second build (noop), and third build after reopening the terminal (I redacted some unimportant paths, sorry). Can it be that it gets re-run because of an unrelated change in `PATH`? In `bazel` I believe `build --incompatible_strict_action_env=true` is responsible for making builds less sensitive to env changes... Thanks!
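To make the suspicion concrete: fnm prepends a uniquely named per-session "multishell" directory to `PATH`, so two shell sessions never agree on it. A sketch with invented paths (the numeric suffixes below are made up, not real fnm output):

```shell
# PATH as it might look in two different terminal sessions; the
# fnm_multishells component gets a fresh name each session (values invented).
path_session_1="/home/u/.fnm/fnm_multishells/81021_1670000000000/bin:/usr/local/bin:/usr/bin"
path_session_2="/home/u/.fnm/fnm_multishells/81055_1670000000123/bin:/usr/local/bin:/usr/bin"

if [ "$path_session_1" != "$path_session_2" ]; then
  echo "PATH differs between sessions, even though nothing relevant changed"
fi
```

So reopening the terminal is enough to change `PATH`, with no code change involved.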
h
That absolutely could be the reason: the PATH can get mixed into process invocation cache keys (if it's exposed to the underlying process). We are looking into mitigating this for multiple reasons, this being one. Another is getting more remote cache hits when two clients differ only in their PATH, which is a common case.
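As an illustration of why that invalidates the cache (a toy model, not Pants' actual cache-key computation): if the key is a hash over the command line plus the environment the process sees, any `PATH` change yields a new key, and thus a cache miss.

```shell
# Toy cache key: hash of the command line plus the visible environment.
# The fnm_multishells suffixes below are invented for the example.
cache_key() {
  printf '%s\n' "$@" | sha256sum | cut -d' ' -f1
}

key_1=$(cache_key './pants package targetX' 'PATH=/x/fnm_multishells/81021_170/bin:/usr/bin')
key_2=$(cache_key './pants package targetX' 'PATH=/x/fnm_multishells/81055_171/bin:/usr/bin')

if [ "$key_1" != "$key_2" ]; then
  echo "different PATH -> different cache key -> rebuild"
fi
```

This is also why two otherwise identical CI or developer machines can miss each other's remote cache entries if their PATHs differ.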
a
Thanks @happy-kitchen-89482, and I must say I totally bombed this question 🤦 🤦: while searching for `incompatible_strict_action_env` in this Slack, I found, guess what, https://pantsbuild.slack.com/archives/C046T6T9U/p1647260136815439 😄 It's like I returned to the topic and completely forgot what I myself had said about it 😄 Thanks once again!
h
hahaha, no worries. we've all done this...