
Terraform Test: add JUNIT output #34264

Open

rvdouderaa opened this issue Nov 17, 2023 · 16 comments · Fixed by #34973, #36324, #36316 or #36315
Labels
active-experiment (request has an active experiment that's welcoming testing and feedback) · enhancement · experiment/test-junit-xml (feedback about the terraform test experimental JUnit XML output) · new (new issue not yet triaged) · terraform test

Comments

@rvdouderaa

Terraform Version

terraform version
Terraform v1.6.3
on darwin_arm64

Your version of Terraform is out of date! The latest version
is 1.6.4. You can update by downloading from https://www.terraform.io/downloads.html

Use Cases

Tools like tfsec and tflint can output their test results as JUnit XML, which can then, for example, be published and shown in the Tests tab of Azure DevOps pipeline runs.

Attempted Solutions

n/a; this functionality is not in the documentation.

Proposal

Add the functionality to export test results (or at least failures) as JUnit XML files, so they can be used in, for example, Azure DevOps to show results in the Tests tab, as tfsec and tflint do.
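
To illustrate how this could fit into a pipeline, here is a minimal sketch of an Azure DevOps job consuming such a file. The -junit-xml flag and the results.xml path are assumptions for illustration (no such option existed when this issue was opened); PublishTestResults@2 and its inputs are existing Azure DevOps features.

steps:
  # Hypothetical flag: assumed here only to show the intended workflow.
  - script: terraform test -junit-xml=results.xml
    displayName: Run terraform test
    continueOnError: true          # keep going so the results can still be published

  - task: PublishTestResults@2
    inputs:
      testResultsFormat: JUnit
      testResultsFiles: results.xml
      failTaskOnFailedTests: true  # fail the pipeline after publishing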

References

No response

@rvdouderaa rvdouderaa added enhancement new new issue not yet triaged labels Nov 17, 2023
@apparentlymart
Contributor

Thanks for this feedback, @rvdouderaa!

Earlier iterations of the terraform test experiment did have some JUnit XML support, but we found two challenges along the way:

  1. The JUnit XML format was originally tailored for JUnit itself, and it's close enough to be useful for various other test frameworks in other programming languages that use similar programming paradigms (object-oriented / imperative), but it's not so clear how best to map Terraform's quite different concepts onto its model.
  2. JUnit XML is not actually formally specified anywhere and so supporting it seems to be a matter of just finding and studying all implementations of it to learn what subset of the format is reasonable to use, how to use that subset so that the test result output is actually useful, and working around bugs and quirks in the implementations.

We also got very, very little feedback on that part of the experiment and so we didn't feel confident enough in our guesses at problem 1 to warrant spending all the time working through problem 2.

However, now that you're here with a request specifically for this, we can potentially make some progress on problem 1!

To help with that, I wonder if you'd be willing to create and share a practical example of a JUnit XML file reporting the results of a realistic terraform test run you've done where there were some interesting failures to report, and/or where the success case produces something useful for you in the Azure DevOps UI.

That would help by illustrating one possible mapping from Terraform's concepts to the JUnit concepts that is at least useful for Azure DevOps in particular, and hopefully also useful in some other similar test orchestrators that support JUnit XML. The idea here would be to figure out a mapping that is useful in practice with JUnit XML parser implementations that are in real use, as opposed to a mapping which is theoretically plausible but causes the result to be less useful in real existing test reporting software.

Thanks again!

@rvdouderaa
Author

Thanks for the response @apparentlymart

I found the following page, which describes the format
https://github.com/testmoapp/junitxml / https://windyroad.com.au/dl/Open%20Source/JUnit.xsd

There may be no official documentation, but the Apache Ant implementation seems to be the de facto standard.

As for use cases: we want to add Terraform Test to our CI/CD pipelines, as we are already using tfsec and tflint. Results should be exported in a format readable by Azure DevOps, so test results can be published and the pipeline then failed. This would also need a way to continue when the test step fails, so the error can be handled by the next step.

It doesn't really matter on which test it would fail, the tests are there for a reason.

For example:

run "input_validation" {
    command = plan

    variables {
        name = "asdfaslkdjfkasjdflkasjdf!@$@$"
        resource_group_name = "core-services-rg"
    }

    expect_failures = [
        var.name
    ]
}
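
(For context, a minimal sketch of the module-side validation this test exercises; the exact rule is an assumption, since the module itself isn't shown here.)

variable "name" {
    type = string

    validation {
        # Hypothetical rule: accept only lowercase letters and digits.
        condition     = can(regex("^[a-z0-9]+$", var.name))
        error_message = "The name may contain only lowercase letters (a-z) and digits (0-9)."
    }
}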

This name should not be accepted by the module (it allows only a-z and 0-9), so the input should fail. If it doesn't, something is wrong with the module's validation check and we should not be able to publish the module. And we would like it to be displayed like this tfsec error:

[screenshot: tfsec failure shown in the Azure DevOps Tests tab]

Hope this makes our use case(s) clear.

@apparentlymart
Contributor

Thanks for that extra context, @rvdouderaa!

If you're able to do so, it would help to see an example of exactly what JUnit XML tfsec and tflint are creating for you, since presumably in your case mimicking how those tools use JUnit XML would give a result that fits well with how you're already using it with those other tools.

Of course if you can't share it then we ought to be able to test with those tools ourselves at some point to find out, but seeing an example from your already-running system will make it easier to quickly see if mimicking how those tools use the format is likely to be a viable strategy for terraform test.

If you would like to share those files, I'd suggest creating a GitHub Gist with the two (or more) files inside it and linking to it here, just because GitHub comments don't work very well for sharing longer code examples.

Thanks!

@rvdouderaa
Author

@apparentlymart
I created a gist with a tfsec and a tflint xml output
https://gist.github.com/rvdouderaa/40821f63aa1407279a3e29292f34ce0c

@apparentlymart
Contributor

Thanks for sharing those, @rvdouderaa.

It seems that both of these tools have made the decision that the file containing the problem should be treated as the "test class name" in JUnit terms. However, it also seems from your screenshot of Azure DevOps that it doesn't actually pay attention to that at all; I don't see the filenames appearing anywhere in the UI.

It also seems that tfsec elected to use "tfsec" as the name of the entire suite, while tflint did not name the test suite at all.

For naming the test cases themselves, tflint used some sort of identifier terraform_typed_variables -- I guess that's the machine-readable name for one of its lint rules? -- while tfsec seems to have just chosen a human-readable description of the problem itself (duplicating the text from the failure message inside) rather than of what was being tested.

For terraform test we have some additional concepts that would need to be mapped onto the JUnit XML model:

  • The name of the test scenario (the .tftest.hcl file) that was being evaluated.
  • The name of the test step (the label of the run block) that was being evaluated.
  • The address of each checkable object, when reporting the outcomes of preconditions, postconditions, variable validation rules, and check blocks.

Given that, I suppose one possible way to map it would be as follows (sketched in XML after this list):

  • The overall file is rooted with a testsuites element, so that the report can include multiple suites.

  • Each test scenario is a separate testsuite element, named after the basename of the .tftest.hcl file it came from.

  • Inside each suite, each run block is a testcase, whose name is the label in the run block header. class would not be present at all, since Terraform doesn't have anything analogous to "classes" in JUnit, and the tool you are using seems to ignore it anyway.

  • If a test step encounters an error or warning, the usual diagnostics rendering (with color codes omitted) would be placed in a system-err element beneath that test step's testcase.

  • If at least one checkable object fails any of its checks, the testcase contains a failure element whose body contains some textual representation of all of the failures.

    (It isn't clear to me whether multiple failure elements are supported; ideally there'd be a separate failure for each checkable object that failed so that each one can be presented separately, but I've assumed here that tools would only accept one since I can't see any example of multiple failures in the docs you linked.)
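
Concretely, that mapping might produce something like the following sketch; the scenario name, run block names, and failure text are all hypothetical:

<testsuites>
  <testsuite name="validation.tftest.hcl" tests="2" failures="1" errors="0">
    <testcase name="input_validation" />
    <testcase name="create_resources">
      <failure>var.name: value must contain only a-z and 0-9</failure>
      <!-- had the step errored, the rendered diagnostics (with color codes
           omitted) would appear in a system-err element here -->
    </testcase>
  </testsuite>
</testsuites>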

Does that seem plausible to you as a way to populate the JUnit XML format based on a terraform test outcome?

@rvdouderaa
Author

@apparentlymart you can see the testsuite name in the Azure DevOps screenshot (the first line with a cross, tfsec (1/1)).
That's the only reference. In the screenshot above, there were no tflint findings.

The proposed solution seems plausible. However, I tried this with multiple findings in tflint, and it creates two separate testcase entries. I updated the gist with an example.

@mefarazahmad

Looking forward to this functionality being enabled. When can we expect it to go live?

@crw
Contributor

crw commented Jan 3, 2024

@mefarazahmad There is no commitment. I would recommend following the PR: #34291

@apparentlymart
Contributor

Hi all,

Today's alpha release of Terraform CLI includes an early experimental option to generate JUnit XML output.

For those who are interested in seeing this move forward, it would be much appreciated if you could share feedback in the community forum topic about the experiment.

I must forewarn that I'm currently focused primarily on a different project, so this JUnit XML experiment is a bit of a "background task" for me and I might be slow to respond, but I do intend to collect up all the feedback and act on it later.

Thanks!


(We use the community forum for this sort of early feedback, rather than discussion here in GitHub, because the Discourse forum does at least have some support for tracking which comments are in reply to which other comments, whereas unstructured GitHub issue discussions tend to just become a confusing pile! We'll transition back to discussing in GitHub once it seems clearer what design we're going to move forward with and we become more concerned with the details of getting it implemented "for real".)

@apparentlymart
Contributor

apparentlymart commented Mar 4, 2024

Hi again!

After some discussion in the community forum topic I linked earlier, there were some conclusions I wanted to bring back in here to inform subsequent rounds of experiment:

  • The way we’re describing the test scenarios (each separate .tftest.hcl file) doesn’t seem to match what these tools are expecting: testsuite names didn't appear anywhere in the UI of either Azure DevOps or GitLab CI/CD. It seems like we should try moving the test scenario name into the “classname” attribute instead, to see if that makes it visible to these tools.

  • Test time durations are effectively mandatory in this format, because tools assume that if they are absent then the test took zero seconds to run, rather than (what we had hoped for) treating it as “test duration unknown”.

    This one is trickier because the test harness doesn’t currently measure the total duration of each test step and scenario at all, so we’ll need to add that to the test harness itself before we could include that information in the JUnit XML output.
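
Taken together, those two points suggest a revised sketch along these lines (names and durations are hypothetical; whether these tools really surface classname this way is exactly what the next round of experiment needs to confirm):

<testsuites>
  <testsuite name="validation.tftest.hcl" tests="1" failures="1" time="4.21">
    <!-- scenario name repeated as classname, plus a measured duration -->
    <testcase name="input_validation"
              classname="validation.tftest.hcl"
              time="4.21">
      <failure message="failed" />
    </testcase>
  </testsuite>
</testsuites>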

Thanks to everyone for sharing their feedback and screenshots!

I'm going to be away from this issue for at least a little while since my attention is needed elsewhere, but hopefully the above will be useful for either future-me or someone else working on a second round of experiment soon.

@apparentlymart
Contributor

apparentlymart commented Apr 9, 2024

Some subsequent discussion in the community forum topic led to an additional idea:

  • It would be nice if the test harness treated test failures as distinct from normal errors, so that we could place the failure message text in the single-line "failure message" (the message attribute of the failure element), which today is always set to one of a small set of hard-coded messages reporting the test status.

    Much like the feedback about durations, this seems to require some changes to the test runner itself -- to report test failures as a separate signal to errors -- rather than just a change to the JUnit renderer.
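
    (A before/after sketch of where that text would land; both messages here are hypothetical:)

    today:    <failure message="failed" />
    proposed: <failure message="Invalid value for var.name: may contain only a-z and 0-9" />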

@apparentlymart
Contributor

apparentlymart commented May 7, 2024

Hi all,

(Sorry for accidentally closing this before. I accidentally triggered GitHub's automatic closing of this issue due to the wording I used in a pull request. 🤦‍♂️ )

The latest alpha release includes some updates responding to the two items of feedback I described two comments above this one. There's more information in the community forum topic requesting feedback, and I'd appreciate any efforts to re-test this (or to test it for the first time!) with your chosen JUnit XML-supporting software.

This doesn't include any changes for the third item I described in the comment directly above this one, because that seems to require more invasive changes. Although I hope to do it eventually, my focus for the moment is on figuring out the best way to map Terraform's testing model onto the JUnit XML format, and it already seems clear where the improved failure messages would be placed (in the message attribute of the failure element), so there's no strong need to immediately experiment further with that part.

Thanks!

@apparentlymart apparentlymart reopened this May 7, 2024
@apparentlymart apparentlymart added experiment/test-junit-xml Feedback about the terraform test experimental JUnit XML output active-experiment Request has an active experiment that's welcoming testing and feedback labels May 9, 2024
@Liam-Johnston

Awesome initiative! I have tested with the alpha build and it works very nicely. Is there any update on when this could get into the main build, @apparentlymart?

@apparentlymart
Contributor

Hi @Liam-Johnston,

I no longer work on the Terraform team at HashiCorp, so I cannot make any statement about how continued development of this feature is being prioritized.

@Liam-Johnston

If anyone does stumble across this thread and wants a quick and easy translator to get JUnit test results, this should help:

https://github.com/Liam-Johnston/tftest-to-junitxml
