# general
I had a question about starting to use remote caching (e.g. https://www.toolchain.com). I like the idea and we'd like to adopt it, BUT we do something odd with our dependencies. Per a previous conversation on this Slack channel, we're using PyTorch and sometimes need the GPU version of it and sometimes the CPU version. Annoyingly, the CPU and GPU builds carry the same pip version number, and the only difference between them is which `--extra-index-url` you pass to `pip`. To keep things relatively easy, we control this with an env variable and a wrapper script: if the variable is set, we set the `PANTS_PYTHON_REPOS_INDEXES` and `PANTS_PYTHON_RESOLVES` environment variables to pull the right dependencies. So... given our funky setup, can we still use remote caching, or is it going to be unaware of those variables and thus deliver the wrong dependencies? (E.g. if I build with CPU and my colleague sets those env vars expecting a GPU build, will they end up pulling the CPU dependencies from the cache?)