# general
r
Hi, I got this message for some tests:
`[WARN] Failed to generate JUnit XML data for test/python/...tests.`
Do you know how I can troubleshoot this problem in Pants? I tried to find some logs and used the flag `--keep-sandboxes=always`, but without success... Note that the tests are successful. Thank you.
e
It would be good to know the Pants version (`./pants -V`), and in what way do you mean `--keep-sandboxes=always` was without success? Were you able to find the appropriate sandbox and replicate the failure message using the `__run.sh` script in that sandbox?
r
Sure, I'm running `2.14.0.dev6`.
e
So presumably you're running Pants with some command-line flags. If you could run with `-ldebug` included in those flags as well as `test --force`, and share your full command line and full output, that might be enlightening.
The `-ldebug` will give more log info, and `test --force` will force the tests to re-run even if they have a cached success.
r
About the sandbox, I'm pretty new to Pants and I thought that maybe there were some additional logs or something...
e
Aha. There are, but just in the form of log lines to the console that print out the location of each saved sandbox.
So, at a high level, Pants runs nothing in your code tree. It builds a sandbox in a /tmp dir with just what's needed (including tools) and runs from there.
This option keeps the sandboxes from getting cleaned up so you can inspect them.
There will be a `__run.sh` script at the root of each sandbox that simulates exactly what Pants actually runs using its Rust engine code.
It can be useful to poke around the sandbox to see the file layout, and to examine and run `__run.sh` to debug.
And, correcting my earlier suggestion that this has to do with coverage collection: I read wrongly, and it does not: https://github.com/pantsbuild/pants/blob/a69b8f0ef4e14451dbf7f39fd23adaf3f2700535/src/python/pants/backend/python/goals/pytest_runner.py#L370-L376
r
Thanks for the explanation. I ran the test target with `-ldebug` and `--test-force`, but I see nothing more about JUnit XML reports.
e
So a heisenbug. To clarify, was the logged warning causing you material issues or were you just trying to understand the warning and fix it to have a clean build / a bit of sanity?
Do you use remote caching?
r
No material issue. Like I said, the tests are successful. But I need the reports in my CI pipeline to be analyzed by SonarQube... So I'm trying to understand the warning and what went wrong when generating the JUnit XML reports...
e
Ah, I'd call that a material issue. You need the reports!
😉 1
Yeah, so without a sandbox that reproduces the issue, this is hard to diagnose. When Pants runs pytest, it adds a command-line flag to have it emit the JUnit XML, and then snapshots all the files output during the pytest run. For the snapshot to exist (there would have been an ugly backtrace if it did not) but the JUnit XML file not to be in the snapshot is unexpected.
From my debug advice above, the `-ldebug` would have proved / disproved that the pytest JUnit XML flags were being passed, and the sandbox would have allowed further diagnosis.
I just want to step back and confirm this is flaky generally. Is that true? I.e.: before adding the extra flags to debug all this, would this happen sporadically or all the time?
r
All the time.
Ok, is there a doc on how to use the sandbox and `__run.sh` to troubleshoot?
e
There is not. But it's just a bash script in a tree of files, so it's like troubleshooting any other program. At this point, Pants is out of your way.
r
Ok, I see. I'll try that.
e
So you'd want to start with `__run.sh`: run it, and see if the JUnit file is actually output where the script asks pytest to output it.
👍 1
r
So, using the sandbox, the JUnit XML reports should be generated in the `extra-output` folder?
e
I'm not sure; what is the option set to in `__run.sh`? The full command line should be in there.
If you want to share the `__run.sh`, or even the full sandbox tarball, I'm happy to take a look.
r
```shell
./pytest_runner.pex_pex_shim.sh -c build-support/python/tools/pytest.ini -vv --no-header -s $'--cov-report=' $'--cov-config=build-support/python/tools/.coveragerc' $'--cov=.' $'--cov=src/python' tests/python/nautobot_tests/nautobot_animal_sounds_tests/test_api.py
```
I don't see `--junitxml=...`
e
Ok, that's unexpected, I think. Let me look at the 2.14.0.dev6 code a bit closer here...
r
But what's weird is that it only happens for some tests, the ones with Nautobot (a Django application)...
e
That should mean the request was "debug" (https://github.com/pantsbuild/pants/blob/release_2.14.0.dev6/src/python/pants/backend/python/goals/pytest_runner.py#L287), but you're not, to your knowledge, running in that mode.
r
Let me re-run the test for sure...
No, I'm not running in debug mode...
e
Ok. Yeah, that would be `--test-debug` or `--test-debug-adapter`, and it would apply to all tests. You should not see the behavior vary amongst different tests like you do.
r
Ok, I think I found something...
My Nautobot tests set the `PYTEST_ADDOPTS` environment variable via `extra_env_vars` in the `python_tests` target. It seems to override the content of this variable set by Pants...
This variable is used by Pants to pass the options for junitxml.
Working report:
```shell
❯ cat __run.sh
#!/bin/bash
# This command line should execute the same process as pants did internally.

export PANTS_EXECUTION_SLOT=1 PEX_EXTRA_SYS_PATH=$'.:src/python' PYTEST_ADDOPTS=$'--color=yes --junitxml=tests.python.nornir_tests.nornir_greet_tests.test_greeting_tasks.py@tests.xml -o junit_family=xunit2'

cd /tmp/pants-sandbox-vjs2JG

./pytest_runner.pex_pex_shim.sh -c build-support/python/tools/pytest.ini -vv --no-header -s $'--cov-report=' $'--cov-config=build-support/python/tools/.coveragerc' $'--cov=.' $'--cov=src/python' tests/python/nornir_tests/nornir_greet_tests/test_greeting_tasks.py
```
Not working:
```shell
❯ cat __run.sh
#!/bin/bash
# This command line should execute the same process as pants did internally.

export DJANGO_SETTINGS_MODULE=nautobot.core.tests.nautobot_config NAUTOBOT_CONFIG=./src/python/netops_nautobot/nautobot_config.py NAUTOBOT_REDIS_PASSWORD=decinablesprewad PANTS_EXECUTION_SLOT=7 PEX_EXTRA_SYS_PATH=$'.:src/python' PYTEST_ADDOPTS=$'-p pytest_nautobot --reuse-db'

cd /tmp/pants-sandbox-SHdrIJ

./pytest_runner.pex_pex_shim.sh -c build-support/python/tools/pytest.ini -vv --no-header -s $'--cov-report=' $'--cov-config=build-support/python/tools/.coveragerc' $'--cov=.' $'--cov=src/python' tests/python/nautobot_tests/nautobot_animal_sounds_tests/test_api.py
```
My BUILD:
```python
python_tests(
    name="tests",
    dependencies=[
        "3rdparty/python:nautobot-plugin-reqs",
        "3rdparty/python:reqs#pytest-django",
        "src/python/nautobot_animal_sounds",
        "src/python/netops_nautobot",
        "src/python/pytest_nautobot",
    ],
    extra_env_vars=[
        "DJANGO_SETTINGS_MODULE=nautobot.core.tests.nautobot_config",
        "NAUTOBOT_CONFIG=./src/python/netops_nautobot/nautobot_config.py",
        "NAUTOBOT_REDIS_PASSWORD=decinablesprewad",
        "PYTEST_ADDOPTS=-p pytest_nautobot --reuse-db",
    ],
    sources=["**/test_*.py"],
)
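That lines up: pytest reads `PYTEST_ADDOPTS` from the environment and treats its contents as extra command-line options, so whichever single value that variable ends up with decides whether `--junitxml` is passed at all. A minimal sketch of that behavior (`effective_pytest_argv` is a hypothetical helper for illustration, not pytest's or Pants's actual code):

```python
import shlex

def effective_pytest_argv(cli_args, env):
    """Approximate how pytest assembles its arguments: the contents of
    PYTEST_ADDOPTS are split shell-style and added to the command line."""
    extra = shlex.split(env.get("PYTEST_ADDOPTS", ""))
    return extra + list(cli_args)

# Simulating the failing sandbox: the target's value replaced Pants's,
# so no --junitxml flag ever reaches pytest.
argv = effective_pytest_argv(
    ["tests/python/nautobot_tests"],
    {"PYTEST_ADDOPTS": "-p pytest_nautobot --reuse-db"},
)
assert not any(a.startswith("--junitxml") for a in argv)
```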
e
Alrighty, I'm looking back at the code here, but this looks like a bug. Thanks for digging in!
Definitely a straight-up bug: https://github.com/pantsbuild/pants/blob/a69b8f0ef4e14451dbf7f39fd23adaf3f2700535/src/python/pants/backend/python/goals/pytest_runner.py#L315-L322 Your BUILD env vars are that last splat, which overwrites our internal use. We do not try to merge.
🙌 1
I think merging is the right thing to do, but this needs experimentation to see what happens when you specify conflicting pytest options via the merge, etc.
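The overwrite-vs-merge distinction in plain Python (an illustration with abbreviated values from the two sandboxes above, not the actual Pants code):

```python
# Abbreviated PYTEST_ADDOPTS values from the working and failing sandboxes.
pants_env = {"PYTEST_ADDOPTS": "--color=yes --junitxml=tests.xml -o junit_family=xunit2"}
target_env = {"PYTEST_ADDOPTS": "-p pytest_nautobot --reuse-db"}

# Today: the target's extra_env_vars are splatted last, so they clobber
# the internally set PYTEST_ADDOPTS and the --junitxml flag is lost.
env = {**pants_env, **target_env}
assert "--junitxml" not in env["PYTEST_ADDOPTS"]

# One possible merge instead: concatenate the two option strings, keeping
# Pants's flags while still honoring the target's additions.
merged = dict(pants_env)
for key, value in target_env.items():
    merged[key] = f"{merged[key]} {value}" if key in merged else value
assert "--junitxml" in merged["PYTEST_ADDOPTS"]
assert "--reuse-db" in merged["PYTEST_ADDOPTS"]
```

The open question noted above is exactly what this sketch glosses over: simple concatenation can produce conflicting pytest options, and pytest's last-one-wins handling of repeated flags would need verifying.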
@rhythmic-glass-66959 I assume only applying these pytest options for these tests is very intentional, and that's why they're not configured globally via https://www.pantsbuild.org/docs/reference-pytest#section-args
I think the explicit `"3rdparty/python:nautobot-plugin-reqs"` dep pretty much answers my question.
r
Yes, these options are specific to these tests (nautobot).
e
Alrighty. Well, that issue is nice and self-contained, and probably amenable to a contribution without exorbitant effort. I'm not sure if that's something you're interested in?
r
Sure, I'll take that one!
Thank you for your help, very appreciated!
e
Excellent. The pantsbuild.org contributor docs should get you down the road, but please speak up if you run into road bumps and need guidance.
👍 1
h
Thanks for reporting @rhythmic-glass-66959, and thanks for offering to fix!