# general
w
Is there a concept of "scoping" python requirements to a specific python_library and its dependencies? I want to include a large list of requirements to match an environment, but not pollute dependency inference for all other libraries in the project.
h
Not really - but what is the nature of the pollution you're concerned about? Dep inference will make sure that only the true deps of those other libraries are used, no matter how large the "universe" of requirements known to the repo is
I think I might be missing the point
w
Mostly duplicate-dependency issues. I want to run and test code against a bunch of old versions of requests, the Google libs, etc., but don't want those to be used by any other lib (which would make dep inference ambiguous).
Is that kind of a weird proposition? I was hoping maybe there'd be a way to declare a requirement but only "scope" it to the current build file for dep inference purposes. Either that or somehow specify a "default" requirement/dependency for the repo and other declarations of the same library aren't used for dep inference. Only used if specified directly.
This may be a bad idea! I don't fully understand what impact that may have design-wise.
h
How many of these requirements are there, roughly?
like 3-5, or more than 10?
w
The list is much larger than 10. https://cloud.google.com/composer/docs/concepts/versioning/composer-versions#images You can expand one of the "packages" sections to see what I mean. I ran some tests against that code and we used around 10 of those. I'm sure there are more dependencies that our current tests don't catch.
h
This kind of thing will be possible with the upcoming multiple lockfile support
Where you can give a set of requirements a logical name, and depend on them via that name
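For reference, a rough sketch of what that configuration could look like (the resolve names and lockfile paths here are hypothetical, and the exact syntax may differ from what ships):

```toml
# pants.toml -- sketch; resolve names and paths are made up
[python]
enable_resolves = true

[python.resolves]
default = "3rdparty/python/default.lock"
composer-legacy = "3rdparty/python/composer_legacy.lock"
```

```python
# BUILD -- pin the old requirements into their own named resolve,
# so they don't participate in dep inference for the default one
python_requirement(
    name="requests-old",
    requirements=["requests==2.24.0"],
    resolve="composer-legacy",
)
```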
w
Oh neat! This will probably also enable lockfiles for requirements that have python version-dependent libraries, right? I think I found the proposal doc and issues, so I'll keep an eye out. Thanks again. You're the best 👏
h
Actually I think you can have python version-dependent libraries even in a single lockfile. @hundreds-father-404 is that right?
h
Yeah, for example in your requirements.txt:

```
Django==2.0 ; python_version < '3'
Django==3.1 ; python_version > '3.5'
```

Pants will combine those into one `python_requirement` target, and whenever `Django` is used, both requirements get used; pip/pex determines which version to install based on the environment (these conditions are "environment markers")
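If you want to see which marker wins for a given interpreter, the `packaging` library (which pip vendors) can evaluate PEP 508 markers directly. A quick check, passing an explicit `python_version` so the result doesn't depend on the interpreter running it:

```python
from packaging.markers import Marker

old = Marker("python_version < '3'")
new = Marker("python_version > '3.5'")

# Evaluate both Django markers against explicit environments: at most
# one is true for any given interpreter, so pip/pex installs one pin.
for pyver in ("2.7", "3.9"):
    env = {"python_version": pyver}
    print(pyver, old.evaluate(env), new.evaluate(env))
# -> 2.7 True False
# -> 3.9 False True
```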
w
Huh, when I tried that for dataclasses and generated the lockfiles (constraints?), it just saved it without version information. What I used was:

```
dataclasses==0.8.0; python_version == '3.6'
```

Maybe I misread the output.
Actually, when I used that and the Python version Pants uses to generate the constraints file is > 3.6, there's no dataclasses entry in the constraints file at all. Am I conflating concepts (lockfiles vs. constraints file)?
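That behavior is consistent with markers being evaluated at resolve time: the resolver checks each marker against the interpreter doing the resolve, so resolving under anything newer than 3.6 drops the `dataclasses` pin before it ever reaches the constraints file. A minimal sketch (a toy model, not pip's real marker engine):

```python
# Toy model of marker evaluation during constraint generation.
# Each requirement is (pin, marker_version); marker_version of None
# means "no environment marker", i.e. always included.
def resolve_constraints(requirements, python_version):
    constraints = []
    for pin, marker_version in requirements:
        if marker_version is None or marker_version == python_version:
            constraints.append(pin)
    return constraints

reqs = [
    ("dataclasses==0.8", "3.6"),   # python_version == '3.6'
    ("requests==2.25.1", None),    # no marker
]

print(resolve_constraints(reqs, "3.6"))  # ['dataclasses==0.8', 'requests==2.25.1']
print(resolve_constraints(reqs, "3.8"))  # ['requests==2.25.1']
```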