# general
r
Is there a way to let Pants treat pytest's exit code 5 as a successful run?
i'm skipping some tests using markers (e.g.,
-m 'not integration'
)
a
Shitty solution: a pytest “plugin” that just provides an empty test if none have been collected
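A related, slightly less hacky sketch (my assumption of how it could look, not a tested plugin): instead of injecting an empty test, a `conftest.py` hook can coerce pytest's "no tests collected" exit code to 0 after the session finishes, via the documented `pytest_sessionfinish` hook:

```python
# conftest.py -- sketch: coerce pytest's "no tests collected" exit code to 0.
# The numeric value 5 corresponds to pytest.ExitCode.NO_TESTS_COLLECTED;
# verify against your pytest version before relying on this.

EXIT_NO_TESTS_COLLECTED = 5


def pytest_sessionfinish(session, exitstatus):
    # pytest calls this hook after the whole run; overwriting
    # session.exitstatus changes the exit code the pytest process returns.
    if exitstatus == EXIT_NO_TESTS_COLLECTED:
        session.exitstatus = 0
```

Dropping this in the repo root's `conftest.py` would apply it to every per-file run Pants spawns, while real failures (exit codes 1-4) pass through untouched.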
b
h
I'm still open to the idea of changing Pants to tolerate it, though. We've gone back and forth on what to do and have tended toward the conservative approach of mirroring the underlying tool. The argument for keeping the behavior is that it's a useful warning when you forgot to implement any tests. But then it gets annoying in other cases, especially
-m
and
-k
like you say. We could add an option, but what should the default be? And ugh, options bloat.
h
I think an option to coerce this to 0 makes sense
h
defaulting to what?
This is one of our more frequently asked questions over the past 2-3 years, so I think I'd bias towards defaulting to the coercion, i.e. we tolerate exit code 5. I suspect it happens far more frequently w/ Pants than non-Pants repos because we run test files one-process-per-file
b
I'm personally conflicted here 😞
h
how so?
b
We'd be coercing the behavior of the underlying tool, and that error exists for a reason (although I don't completely agree with it). Ideally, Pants sits on top of the tools without much in between: the less we change, the more users can intuit. OTOH our execution strategy is not what the tools are built to expect, and having users fall into failure isn't good.
h
We'd be coercing the behavior of the underlying tool, and that error exists for a reason (although I don't completely agree with it). Ideally, Pants sits on top of the tools without much in between: the less we change, the more users can intuit.
Yeah, that's my general philosophy. Pants v1 did much more customization of tools, e.g. it had a custom built-in implementation of Flake8, lol
b
Instead, we could nudge people towards the pytest way of solving this and treat it as an opportunity to educate users on how Pants operates under the hood
r
I suspect it happens far more frequently w/ Pants than non-Pants repos because we run test files one-process-per-file
Yes, I think this is the main reason why the Pants team is facing this question...
I put those integration tests into separate test modules, so setting
-m 'not integration'
makes those modules collect no tests.
For now, I'm going to use
pytest-custom_exit_code
...
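For reference, the marker setup described above could look like this (the marker name is just an example; registering it also silences pytest's unknown-marker warnings):

```ini
; pytest.ini -- register the marker used with -m 'not integration'
[pytest]
markers =
    integration: tests that talk to real external services
```

With the marker applied module-wide (e.g. `pytestmark = pytest.mark.integration` at the top of each integration module), `-m 'not integration'` deselects every test in those modules, so the per-file runs collect nothing and exit with code 5.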
h
Oh, does pytest have the option to turn this off?
r
pytest itself does not, but we can install the
custom_exit_code
plugin.
It works well.
h
We can document that as a workaround
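A sketch of what that documented workaround might look like in `pants.toml` (hedged: option names and plugin-installation mechanics vary across Pants versions, so treat this as illustrative, not authoritative):

```toml
# pants.toml -- illustrative only; check your Pants version's [pytest] options.
[pytest]
# Make the plugin available alongside pytest itself.
extra_requirements.add = ["pytest-custom_exit_code"]
# The flag pytest-custom_exit_code provides to turn exit code 5 into 0.
args = ["--suppress-no-test-exit-code"]
```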