# general
Hi all, I'm trying to set up the following integration testing environment using Pants and I'm getting stuck. We are currently using docker-compose
to create environments for running integration tests. We might want to move to using Kubernetes pods in the future. In both cases we run the tests from a container which is created by the test framework. This is different from the way Pants runs tests: they are executed in the same environment as Pants itself. Ideally I would like
./pants test ./test/my-integration-test.py
to do the following:
1. Set up a docker-compose test environment. The necessary data for it should be collected from the target.
2. Build the necessary pex files for the test.
3. Copy the pex files to a container in the environment.
4. Execute the test from the environment and collect the results.
5. Tear down the test environment.
Do you think that something like this is possible to achieve? I'm open to writing plugins as long as I don't need to rewrite the whole pytest rule to get this to work.
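For step 1, a minimal docker-compose file for such an environment might look like the sketch below. The service names, images, and commands are illustrative assumptions, not taken from the thread:

```yaml
# Hypothetical compose file for the integration-test environment.
# Service names and images are assumptions for illustration only.
services:
  test-runner:
    image: python:3.11-slim   # container the pex files get copied into (step 3)
    command: sleep infinity   # keep it alive until the test is executed (step 4)
  db:
    image: postgres:16        # example backing service the tests depend on
    environment:
      POSTGRES_PASSWORD: test
```

Steps 3–4 would then amount to `docker compose cp` and `docker compose exec` against the `test-runner` service.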
Interesting case!
In #4, does "execute the test" mean running pytest, but in one of the containers in the environment?
yes, that's how I was imagining it
I created a POC which works, but it turned out to be very slow. Basically I made a shell script
which creates a test environment for each test, containing the whole repository in a container, and then runs
./pants test ...
in the created environment.
@curved-television-6568 interested in your thoughts here. We don't support docker-compose yet, right?
Basically, I think Pants has rules to create all the legos here (other than docker-compose, but I can't imagine that is super hard); it just requires a plugin to fit them all together
@eager-lamp-68986 are there multiple containers in this environment?
Yes, there are
Yea, no docker-compose support, yet. I've been toying with that a bit in my mind, but nothing concrete yet..
Perhaps a detailed use case such as this may help drive what features are needed to get a good experience for managing a set of services with a docker compose file in tandem with Pants targets.
This is a little different because it's around running tests
@eager-lamp-68986 what about setting up the environment once for all tests and then running
./pants test ...
inside the environment? Or is it important for each test to have a separate standalone environment?
For now that can work. But if we are going to bring all of our Python projects into a single monorepo, that approach will be limiting, since different projects require different test environments. Also we'll have difficulties running tests in parallel, because they might make the same changes in the test databases. And we will need to make sure that we clean the database state properly after each integration test. So overall this will require a bit more effort from our developers than if we could just run separate tests in separate environments. Thanks for the suggestion @happy-kitchen-89482. This is probably what I'll do for now.
This should all be doable, but may require some plugin work
Regarding the database concurrency problem, one way around it is this option: https://www.pantsbuild.org/docs/reference-pytest#section-execution-slot-var
So to run, say, 8 tests concurrently you set up 8 databases and ensure that each running test uses the database corresponding to the value in that env var (and cleans it up afterwards)
That saves you having to set up and tear down fresh databases for every test
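The pattern above can be sketched in a small test helper. Assuming `[pytest].execution_slot_var` is set to a name like `TEST_EXECUTION_SLOT` in `pants.toml` (the variable name and the database naming scheme are illustrative assumptions), each concurrent test picks the database for its slot:

```python
import os


def database_for_slot(base: str = "testdb") -> str:
    """Map the Pants execution slot to a dedicated database name.

    Assumes [pytest].execution_slot_var = "TEST_EXECUTION_SLOT" was
    configured in pants.toml; falls back to slot 0 when the tests run
    outside Pants.
    """
    slot = os.environ.get("TEST_EXECUTION_SLOT", "0")
    return f"{base}_{slot}"
```

With 8 pre-created databases `testdb_0` … `testdb_7`, concurrent tests never touch each other's data, and each test only needs to clean its own slot's database afterwards.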
Another related thing that comes up a lot, and we are thinking about, is to have Pants itself execute processes in a docker container. Today there are two "executors" - local and remote, and this would add a third, "local, in a container"
@eager-lamp-68986 could you open a feature request ticket for this at https://github.com/pantsbuild/pants/issues/new/choose ? It would be great to document all this great info in a less ephemeral place than Slack...
Hey all 👋 I'm setting up a similar environment for integration tests, where we use a docker-compose file to start up some containers (e.g. localstack) before executing the integration tests. In our case, Pants doesn't need to be executed inside any container, but the containers are necessary for the integration tests to succeed, and they need to start up every time the integration tests run. I'm wondering if there is anything available out of the box in Pants to accomplish that?
☝🏻 1
@able-school-92027 I have a similar use case to this. Have you found any solution to support this workflow?
Hey @lemon-eye-70471 We ended up writing an integration-test target in our custom Pants plugin that calls the docker-compose commands prior to the test command.
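As a rough illustration of that pre-test step (not the plugin's actual code; the helper names, compose-file path, and flags are assumptions), the wrapper can shell out to `docker compose` before and after the test run:

```python
import subprocess


def compose_command(action: str, compose_file: str = "docker-compose.yml") -> list[str]:
    # Build a docker compose invocation for the given action.
    # "--wait" blocks `up` until containers report healthy.
    base = ["docker", "compose", "-f", compose_file]
    if action == "up":
        return base + ["up", "-d", "--wait"]
    return base + [action]


def run_with_environment(test_cmd: list[str]) -> int:
    # Spin the environment up, run the tests, and always tear down,
    # even when the tests fail.
    subprocess.run(compose_command("up"), check=True)
    try:
        return subprocess.run(test_cmd).returncode
    finally:
        subprocess.run(compose_command("down"), check=True)
```

The `try/finally` mirrors the tear-down requirement from earlier in the thread: the containers are removed regardless of the test outcome.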
Thanks for sharing! Does the test command then run pytest inside containers started with docker compose? I'm not sure I entirely understand.
In our case we're executing .NET tests, but they are not executed inside the container. We use the docker-compose commands to spin up the containers we need, but the
dotnet test ...
command is executed inside the sandbox but outside any container.
✔️ 1