# development
w
hi friends! we're up to 1.28.0 in prod, and quickly working on 1.29.0. Seems like a lot of our python targets are still defaulting to `v1` goals. Is there a good place to go to see what `v2` goals are enabled on each version?
h
That’s awesome! You can run `./pants --v1 --no-v2 goals` and then `./pants --no-v1 --v2 goals` to see what is offered by each. Many goals like `fmt` and `lint` have both a v1 and v2 implementation, so when you run `./pants --v1 --v2 fmt`, it will attempt to use both.
Is it ever using v2 for Python, other than `fmt` and `lint`? You might be missing adding `pants.backend.python` to `backend_packages2`
w
it is not, which was surprising to me but this makes a lot of sense, thanks
h
Ah, makes sense. The reason you were using v2 in some places is that you had activated the v2 linters, which work stand-alone, even if `pants.backend.python` is not activated for v2. When adding `pants.backend.python` to v2, you’ll want to remove it from `backend_packages`. See https://pants.readme.io/docs/how-to-upgrade-to-the-v2-engine for some new docs I wrote last week on how to upgrade to v2
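For the record, a sketch of what that swap might look like in pants.toml (the backend names beyond `pants.backend.python` are illustrative, not a complete list):

```toml
[GLOBAL]
# v1 backends: remove "pants.backend.python" from here once it moves to v2
backend_packages = []

# v2 backends
backend_packages2 = [
  "pants.backend.python",
  # plus any v2 linter backends you already had activated
]
```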
w
excellent thanks
very eager to start moving to v2
🔥 1
@hundreds-father-404 back to bother y'all, we had the following for v1 pytest in our toml file:
```toml
[test.pytest]
extra_pythonpath = "%(buildroot)s
fail_fast = true
chroot = true
```
doesn't seem to be a valid v2 block
or supported in the regular `pytest` block
how do i specify this in v2 going forward
also, perhaps there's a way to limit the concurrency? `./pants test src::` just spun up like 67 pytest processes, which is excellent, but some of these are also using pyspark executions
so my computer is taking a dive right now
WHEEEEE!
w
yikes!
so, for local execution, the number of parallel processes is based on the detected CPU cores
on my machine it’s:
```
--process-execution-local-parallelism=<int>
      default: 16
      Number of concurrent processes that may be executed locally.
```
in some environments (containerized, etc) you might need to adjust that
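if you need to pin it, a minimal pants.toml sketch (assuming the option lives in the global scope, with the flag name converted to underscores):

```toml
[GLOBAL]
process_execution_local_parallelism = 4
```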
w
gotcha, this is a pants option?
w
it is
w
cool
w
there is one other setting that might be affecting you, which is the concurrency of pex resolves:
```
--python-setup-resolver-jobs=<int>
      default: None
      The maximum number of concurrent jobs to resolve wheels with.
```
see https://github.com/pantsbuild/pants/issues/9964 for more info, but you’ll probably want to lower it to 2 or 4
(and… we should fix that by default. apologies.)
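in pants.toml that would be something like (scope and option name inferred from the flag, so worth double-checking via `./pants help-advanced`):

```toml
[python-setup]
resolver_jobs = 2
```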
w
no worries! any insight into the deprecated config block above
w
yea: in v2, the nested “goals contain tasks” idiom is gone, and so `pytest` is configured directly as `pytest`
see `./pants help pytest`
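so the shape of the move is roughly this (no real option names shown, since `./pants help pytest` on your version is the source of truth for what's valid):

```toml
# v1: options nested under the goal's task
# [test.pytest]
# fail_fast = true

# v2: a flat subsystem scope
[pytest]
# only options listed by `./pants help pytest` are valid here
```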
w
right i tried moving those settings there, but one of them gave me some issue, let me reproduce
w
@wonderful-iron-54019: some of them are no longer valid
as you can see, v2 is parallel… but it is also sandboxed by default
so the `--chroot` flag is the baked-in default now
w
ah gotcha
w
the `extra_pythonpath = "%(buildroot)s` bit … might be problematic
w
was about to ask
w
are you including a specific blob of code in “all targets” with that?
w
sorry im not sure i understand the question
we have a few testing helpers outside our source root that we were including, i believe
w
it looks like that line is truncated… is it specifying a particular directory?
got it
w
oh no , i was manually transcribing, i think i forgot to end the string
w
ah.
so, what we’d recommend would be to leave that setting out, and add dependencies to any targets that needed the helpers…
👍 1
if that ends up being too much boilerplate for your tests, i’d recommend creating a macro that adds the dependency: https://pants.readme.io/docs/macros
💯 1
w
oh lovely
this is all super helpful, thanks
w
strawname:
python_tests_with_lovely_helpers
sure thing!
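a sketch of what that strawname could look like in a BUILD prelude file (registered via `build_file_prelude_globs`); the `python_tests` stub below only stands in for the real Pants symbol so the macro's shape can be shown outside of Pants, and the helper path is hypothetical:

```python
# stand-in for the real Pants `python_tests` symbol, for illustration only
def python_tests(**kwargs):
    return kwargs

def python_tests_with_lovely_helpers(**kwargs):
    # always depend on the shared testing helpers (path is made up)
    deps = list(kwargs.pop("dependencies", []))
    deps.append("src/python/testing_helpers")
    return python_tests(dependencies=deps, **kwargs)
```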
w
qq re: macros, there's no way to extend a symbol, is there? would be excellent to just introduce default behavior on, say, all declared `python_library` targets without having to modify build files
it actually seems like:
```python
def python_library(**kwargs):
    # do stuff
    python_library(**kwargs)
```
doesn't infinitely recurse as i would expect!
i don't know how or why but that's pretty cool!
scratch that. it worked once and then failed, hitting infinite recursion on subsequent runs
ok, more investigation. It works if i define a "recursive" `python_library` or a "recursive" `python_tests`, but if i define both i hit maximum recursion depth limits 🤔
a
Weird question: are you defining them in the same file, or different files?
w
same file
a
I'm pretty sure redefining symbols wasn't intended to be supported, but it's probably not hard to support reasonably...
w
seems to kind of work
still kind of shocked about that
w
You can extend a symbol with the target API, but it's in alpha
Can ping Eric about it in an hour or so when he's more likely to be awake =)
w
sure
we're likely stuck at v1 for everything but formatting without that or substantial reworking of our build definitions
w
But, one thing to note: adding a dep to all targets won't quite work, because one of them will end up with a dep on itself.
And that holds regardless of macros or using the target API
w
this was to add a source exclusion, but sure that makes sense
should be easy to inspect the target name to prevent a self-add tho right?
w
Yea.
What kind of source exclusion...? You can also ask pants to globally ignore things
w
we have schema migration scripts throughout our repo. for some reason `pylint src` does not pick them up since they don't have an `__init__.py` defined in their folder, however, `./pants lint src::` is surfacing a whole bunch of lint errors in there we didn't know were there
in general, observing a lot more lint errors from `./pants lint` than from `pylint`, even tho AFAIK i've included our config and pinned the version
w
If you'd like to ignore a filename globally, you can add it to the `--pants-ignore` option: see `./pants help-advanced`.
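for example, in pants.toml (the `.add` syntax appends to the built-in ignore list; the folder name here is illustrative):

```toml
[GLOBAL]
pants_ignore.add = ["db_migrations/"]
```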
w
a super weird one includes it finding `similar lines` across different targets, even tho that doesn't happen in `pylint src`, and it's my understanding that since those targets are isolated in different runtimes it should never trigger something like that
w
cc @hundreds-father-404
w
another thing i'd like to highlight is we have dependencies that are 'dropped in' to our code post deployment. Airflow and pyspark being the two primary ones
we install them as dev-dependencies in our run environment so that the linter recognizes them
but don't want to add them as dependencies to the target
tried specifying them as `extra_requirements` but that led to some resolution errors
w
We should try to have a thread per topic if possible, heh
w
agreed sorry
think we can close out the macros
looks like i can get desired results with a global ignore
👍 1
h
With extending a target, that only allows you to add a new field. You can’t modify a pre-existing field, so you couldn’t change the dependencies of `python_library`. You would either need a new field or to use a macro, both of which require using a new symbol name.
w
y'all might want to look into that because i am definitely adding an exclusion to the sources array without using a new symbol name
```python
def python_library(**kwargs):
    sources = kwargs.get("sources", [])
    sources.append("!**/db_migrations")
    kwargs["sources"] = sources
    return python_library(**kwargs)
```
👀 1
works like a charm