# general
g
Hey all, we have been battling a strange sort of bug in our software that exists somewhere at the intersection of `pants`, `pytest`, `docker`, and `alembic`.
The top-level symptom is that different tests seem to somehow "find" the nominally isolated database testing containers of other tests, which leads to a strange variety of errors. Most tests follow a recipe where they (by means of pytest fixtures):
1. Spin up a `testcontainers` database instance
2. Migrate the bare database up to the current state (alembic)
3. Run a test
4. Migrate the database back down
5. Close out the container

What is confusing is that occasionally in CI a large subset of tests will fail with errors reporting non-existent tables, which looks like a mismatched down-migration. It feels like tests within a batch are polluting each other somehow. Is there a way to determine which tests are batched together for a particular run?
I should note that steps 1-5 and 2-4 are tied together by pretty minimal context managers, but are also pytest fixtures, so it feels like something there isn't acting as we hope. We never see this issue when running everything serially with raw pytest.
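For concreteness, the five-step recipe above can be sketched roughly like this. `FakeContainer` and `migrate` are stand-ins (not the real testcontainers or alembic APIs) just to show how the context manager nests the lifecycle around the test body:

```python
from contextlib import contextmanager

events = []  # records the lifecycle order, purely for illustration

class FakeContainer:
    """Stand-in for a testcontainers database container (hypothetical, not the real API)."""
    def start(self):
        events.append("container up")      # step 1: spin up the container
        return self
    def stop(self):
        events.append("container down")    # step 5: close out the container

def migrate(direction):
    """Stand-in for alembic upgrade/downgrade commands."""
    events.append(f"migrate {direction}")  # steps 2 and 4

@contextmanager
def database():
    container = FakeContainer().start()
    try:
        migrate("up")
        try:
            yield container                # step 3: the test body runs here
        finally:
            migrate("down")
    finally:
        container.stop()

with database():
    events.append("test runs")

# events is now:
# ["container up", "migrate up", "test runs", "migrate down", "container down"]
```

If two tests in the same process ever reach the same database, the "migrate down" of one would run against tables the other still expects, which matches the non-existent-table symptom described above.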
h
Pants usually runs multiple pytest processes concurrently, unless you took pains to make it not do that. Depending on how truly isolated the databases are, that could be the cause.
When you say "batch" do you mean that you are using Pants's test batching feature?
Either way, you can use the `execution_slot_var` option to get a unique integer for each concurrent pytest invocation that you can mix into the database name, or whatever determines uniqueness
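As a minimal sketch of that idea, assuming `[pytest].execution_slot_var = "PANTS_EXECUTION_SLOT"` is set in `pants.toml`, a fixture or helper could fold the slot into the database name like this (`unique_db_name` is a hypothetical helper, not a Pants API):

```python
import os

def unique_db_name(base: str = "testdb") -> str:
    """Derive a per-slot database name. Pants sets PANTS_EXECUTION_SLOT in the
    test environment when the execution_slot_var option names that variable;
    we default to "0" for plain serial pytest runs outside Pants."""
    slot = os.environ.get("PANTS_EXECUTION_SLOT", "0")
    return f"{base}_{slot}"

os.environ["PANTS_EXECUTION_SLOT"] = "3"  # simulate what Pants would set
print(unique_db_name())
```

Even with per-test containers on random ports, mixing the slot into names is a cheap extra guard against cross-process collisions.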
g
That much I know, and each of the tests in question is nominally spinning up a container on a random port etc., so they really shouldn't be able to see each other across processes. What I guess I am asking after is twofold:
• Is there a way to get a list of the tests/files that will have the same PANTS_EXECUTION_SLOT?
• Is there anything interesting/different that pants does with pytest's fixtures that could be causing unexpected behaviour?
Alembic has some known concurrency issues w.r.t. migrations, but my assessment was that everything within a test batch should be run serially, basically just as pytest would
h
Re the first point - there isn't, but you could have your tests log their PANTS_EXECUTION_SLOT and gather that info?
Re the second point - I don't think so? Pants doesn't do anything special with fixtures. Within a batch, yeah, it's just executing pytest one time in a sandbox
you can run with `--keep-sandboxes=always` to preserve those sandboxes, and then you can play around inside them directly, without Pants in the mix, to see what's going on
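Concretely, that workflow might look something like this (the sandbox path below is just an illustration; Pants logs the real preserved-sandbox path when the run finishes):

```shell
# Re-run the failing tests and keep their sandboxes on disk
pants test --keep-sandboxes=always path/to/tests::

# Pants logs a line pointing at each preserved sandbox, e.g.
#   Preserved local process execution dir /tmp/pants-sandbox-XXXXXX
cd /tmp/pants-sandbox-XXXXXX

# Recent Pants versions write a __run.sh script into the sandbox that
# replays the exact pytest invocation, without Pants in the mix
./__run.sh
```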