Is there a way to configure mypy to build per package, instead of globally?
# general
h
Is there a way to configure mypy to build per package, instead of globally? Similar to how unit tests work?
c
can you elaborate a bit..? you can run mypy over a subset of your sources.. but isn't that what you want?
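(For illustration, pointing check at a single directory instead of the whole repo; the path below is just a placeholder:)
pants check path/to/package::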
h
If I run
pants check ::
it tries to build a single env for all my packages (seemingly)
Maybe let me try again with more specific targets
c
Ah.. as far as I know, there is a single env for mypy to execute in, but I'm not sure how it works with pulling in the application's 3rd-party dependencies for typing purposes..
h
Yea, it seems to build a single env, which is tough when you have 300+ python packages in a repo, usually with conflicting deps šŸ˜…
s
If you have conflicting deps, you will probably need to use different resolves for that
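(A minimal pants.toml sketch of what multiple resolves look like; the resolve names and lockfile paths here are placeholders, not taken from the repo:)
[python]
enable_resolves = true

[python.resolves]
# one resolve (and lockfile) per set of conflicting dependencies
project-a = "project-a/python.lock"
project-b = "project-b/python.lock"
Each resolve then gets its own lockfile via pants generate-lockfiles.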
h
Right. I guess my comment is,
pants test ::
already works and figures this out. But it seems like more extensive work is needed to make this work for mypy
s
idk, mypy in our repo does the partitioning by resolve and interpreter_constraints out of the box
Partition #1 - auctions, ['CPython==3.11.*']:
Success: no issues found in 15 source files

Partition #2 - auctions, ['CPython==3.11.*', 'CPython==3.12.*']:
Success: no issues found in 917 source files

Partition #3 - black, ['CPython==3.11.*']:
Success: no issues found in 1 source file

Partition #4 - buildkite, ['CPython==3.11.*', 'CPython==3.12.*']:
Success: no issues found in 2 source files

Partition #5 - default, ['CPython==3.11.*']:
Success: no issues found in 568 source files

Partition #6 - default, ['CPython==3.12.*']:
Success: no issues found in 26 source files

Partition #7 - default, ['CPython==3.11.*', 'CPython==3.12.*']:
Success: no issues found in 946 source files
šŸ¤” 1
āœ… 1
h
Interesting... I'll have to fiddle around with this more.
s
are you sure you have resolves defined for python_sources? not just in pants.toml
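(For reference, the resolve is assigned on the target itself; a hypothetical BUILD file, with placeholder names:)
python_sources(
    name="lib",
    resolve="project-a",  # must match a key under [python.resolves] in pants.toml
    interpreter_constraints=["CPython==3.11.*"],  # optional per-package override
)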
h
So the thing is, I have 400+ python packages. Each package has
• its own pyproject.toml (some of these will have conflicting python package dependencies)
• sometimes interpreter constraints, if the global constraint is invalid
And
pants test ::
just kind of works with this setup, builds an env for each package, and then runs tests successfully. If I need to define resolves for 400+ packages in order to use
pants check
... I'm probably not going to get that done anytime soon lol.
The repo is open source, if you were curious. https://github.com/run-llama/llama_index/blob/main/pants.toml
s
I don't see resolves enabled, so you're using a single resolve
Here is my take on this: https://github.com/grihabor/llama_index/pull/1/files#diff-0e52f2670837f305c9a0d8be0f90156bf366431563506c21584fa2ea4f42ba1fR19 It's not working yet; I'm trying
pants generate-lockfiles
, and it finishes but doesn't write any files, which looks like a bug. But the idea is to create a resolve per project
ok, I managed to generate lockfiles, and now
pants check ::
eats 50GB of memory during
ā ¤ 15.39s Resolve coarsened targets
and crashes with OOM
I'm trying
pants --pantsd-max-memory-usage=20GiB check ::
but it looks like this
Resolve coarsened targets
stage ignores the flag, I think it is another bug
Even
pants --pantsd-max-memory-usage=20GiB check llama-index-core/:: llama-index-cli/::
fails with OOM, it looks like a serious bug
h
Oh wow, thanks for taking a stab at this!! That's a ton of work šŸ˜…
Yea, seems like memory usage is a bit of a blocker then šŸ¤”
s
Maybe there are ways to make it work, but I've run out of ideas
h
I really appreciate the help! I know I'm supposed to be using lockfiles to begin with, so it's good to know how to get even that far šŸ™
For mypy, I might have to figure out how to get pre-commit to work properly again šŸ˜… or just write some shell script to run it on changed packages in CI/CD
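(One possible shape for that script: Pants can compute the changed targets itself, so the shell part stays small; exact flag spellings depend on the Pants version and are worth double-checking:)
# run mypy only on targets changed relative to the main branch
pants --changed-since=origin/main check
# pulling in dependents of the changed files is also possible via
# --changed-dependents (spelled --changed-dependees in older Pants versions)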
s
I've created an issue about the high memory consumption https://github.com/pantsbuild/pants/issues/20568
āœ… 1
šŸ’Ŗ 2
h
Yikes, thanks for the issue, that will require some urgent examination
w
Did anything move forward on this?
s
AFAIK nope
😟 1
šŸ”„ 1