# general
r
Hey friends, how are you all doing? I hope you're all well. First things first, happy new year! :) So, I have a big `requirements.txt` file that's shared between lots of projects, and this file is managed by Pants. One issue we're facing is that when a project needs to test a new library, it takes a very long time for Pants to generate a new lockfile, since it checks that all dependencies are compatible with each other. I'm sure I'm not the only one facing this, and I wanted to ask if you can share some suggestions on how to deal with it. Currently, our projects are separated by contexts, so I'm experimenting with creating a `requirements.txt` file for each context, to give each context a smaller subset of dependencies instead of having only one `requirements.txt` for the entire repo. This makes the issue more manageable, but we lose the central dependency management that Pants gives us. One thing I also want to try is to keep a single `requirements.txt` file for the entire repo, but create different resolves for each context with their subset of dependencies. I'm not sure this solves my issue, but if it could simplify and speed up lockfile generation, that would be great. So, what are your thoughts on these approaches, and could you share any suggestions on how to deal with a single `requirements.txt` but many projects? Also, is there any way to create a resolve from dependencies that Pants detected by itself? I think that would be a cool experiment to try. Thanks in advance!
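[Editor's note: a minimal sketch of what the multiple-resolves idea could look like in `pants.toml`. The resolve names (`ml`, `web`) and lockfile paths are made-up placeholders, not this repo's actual layout.]

```toml
# pants.toml -- hedged sketch: one resolve (and lockfile) per context,
# so regenerating one context's lockfile doesn't re-solve the whole repo.
[python]
enable_resolves = true

[python.resolves]
ml = "3rdparty/python/ml.lock"    # placeholder path
web = "3rdparty/python/web.lock"  # placeholder path
```

With this, `pants generate-lockfiles --resolve=ml` only re-solves that context's subset.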
c
One issue we're facing is that when a project needs to test a new library, it takes a very long time for pants to generate a new lockfile as it tries to check if all dependencies are compatible with each other.
Is the time for `generate-lockfiles` the problem, or is the fact that it updates "a bunch of other stuff" a problem in its own right?
r
Hey @curved-manchester-66006, both are big issues. If we had some way to install just the new lib while conforming to the already-locked dependencies, we would solve both issues, and we'd be able to pick a better time to update all the dependencies together.
c
Makes sense. I think that sounds a lot like https://github.com/pantsbuild/pants/issues/15704
👍 1
r
That issue relates very closely to what we're seeing. While it's still in progress, do you have any suggestions for a not-perfect-but-good-enough workaround?
c
Some approaches we have tried (we are currently trying One Big Resolve):
• Have someone update "everything" frequently so there is less drift.
• We found that adding an extra constraints file with `>some_release_in_the_past_2_years` floors could speed things up a lot. There are a bunch of things you could try to make pip resolve dependencies faster, but it's all spooky heuristics.
• Hand-edit the lockfile :-(
❤️ 1
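[Editor's note: a hedged sketch of the constraints-file idea from the second bullet. The package names and version floors below are examples only, not recommendations for any specific repo.]

```text
# constraints.txt -- example floors that stop pip from backtracking
# through years-old releases during resolution. Pick floors that match
# releases from the past year or two of the packages pip struggles with.
numpy>=1.24
botocore>=1.29
grpcio>=1.50
```

Plain pip consumes this via `pip install -c constraints.txt ...`; recent Pants versions can also attach a constraints file to a resolve (check your version's `[python]` options, as the exact setting has varied).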
r
Thank you so much for the answer, @curved-manchester-66006.
p
Our lockfile generation was taking a very long time too. The reason was that pip was downloading multiple versions of various packages to determine compatibility. There is a way to look at the raw `pip.log` file that records pip activity. In that log file you can see pip saying things like "looking at multiple versions... this could take a while". Implementing cburrough's 2nd suggestion, adding tighter constraints for the packages pip complains about, would help speed pip up. More conversation in this thread
❤️ 2
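[Editor's note: a small sketch of mining the pip log for backtracking culprits. The log lines below are fabricated samples in the shape pip's 2020 resolver emits, so the snippet is self-contained; point the `grep` at your real log instead.]

```shell
# Fabricated sample of the warnings pip prints while backtracking:
cat > /tmp/pip.log <<'EOF'
INFO: pip is looking at multiple versions of botocore to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of boto3 to determine which version is compatible with other requirements. This could take a while.
EOF

# Extract the package names pip is backtracking on -- these are the
# candidates for tighter floors in a constraints file:
grep -o 'multiple versions of [a-zA-Z0-9_-]*' /tmp/pip.log | awk '{print $NF}'
```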
r
@polite-angle-82480 Thanks! I'll check your solution! :^)
f
Hey! I am also facing this issue and am curious whether the multiple-resolve option is a good way to reduce the wait time when testing out dependency compatibility?
p
Absolutely! I’d definitely recommend you get a 2nd opinion, but in general splitting up resolves will by definition remove some packages from each resolve, leading to a lower chance of version incompatibilities and faster pip install times.
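[Editor's note: a hedged sketch of how requirements get assigned to a resolve in a `BUILD` file, to complement the `pants.toml` side. The target name, file name, and resolve name here are assumptions for illustration.]

```python
# BUILD -- hedged sketch (Pants BUILD-file syntax, not runnable Python):
# generate python_requirement targets from a per-context requirements file
# and assign them to the "ml" resolve defined in pants.toml.
python_requirements(
    name="ml-reqs",                  # hypothetical target name
    source="ml-requirements.txt",    # hypothetical per-context file
    resolve="ml",                    # must match a [python.resolves] key
)
```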
f
awesome thanks i will play around with this and report my findings...