# general
b
Hi Team, we are using Pants for PySpark apps. Recently we faced an issue where the test suite started failing because it could not find enough cores. We realised this was happening because Pants started executing all the PySpark test cases in parallel. As a workaround we are using the
--debug
flag to run the PySpark tests in sequence. Is it possible to control the number of tests that can run in parallel?
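(For reference: Pants 2.x has a global option that caps how many local processes run concurrently, which also caps concurrent test processes. A minimal pants.toml sketch; the value 2 is just an illustrative choice, and note this limits all local process execution, not only tests:)

```toml
[GLOBAL]
# Cap the number of processes (including test runs) Pants executes
# concurrently on this machine. Defaults to the number of CPU cores.
process_execution_local_parallelism = 2
```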
h
It'll slow things down
But I don't understand the failure - what is happening when your tests run in parallel?
Are the tests colliding on a resource? If so you can use https://www.pantsbuild.org/v2.15/docs/reference-pytest#execution_slot_var to point each test at a different instance of the resource.
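(To illustrate the execution_slot_var approach: if pants.toml sets `[pytest] execution_slot_var = "TEST_SLOT"` — the variable name here is just an example — each concurrently running test process sees a distinct small integer in that environment variable, which tests can use to derive per-process resources. A minimal sketch:)

```python
import os

# Pants exports the execution slot number in the variable named by
# [pytest].execution_slot_var (assumed here to be "TEST_SLOT").
# Falling back to "0" lets the same code run outside Pants.
slot = int(os.environ.get("TEST_SLOT", "0"))

# Derive per-slot resources so parallel tests don't collide,
# e.g. a distinct Spark UI port for each test process.
spark_ui_port = 4040 + slot
```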
Or is it that the spark library expects to have a lot of cores available to itself...
b
In the test suites we are explicitly setting the number of cores to 1 for the Spark lib. Our tests run on GitHub runners, and the issue doesn't happen all the time, but sometimes, if the GitHub runner doesn't have enough cores, it fails
Also, we are still on Pants 2.10