Terraform Test: add JUNIT output #34264
Thanks for this feedback, @rvdouderaa! Earlier iterations of the test experiment touched on this, but we got very, very little feedback on that part of the experiment, and so we didn't feel confident enough in our guesses at problem 1 to warrant spending all the time working through problem 2.

However, now that you're here with a request specifically for this, we can potentially make some progress on problem 1! To help with that, I wonder if you'd be willing to create and share a practical example of a JUnit XML file reporting the results of a realistic `terraform test` run. That would help by illustrating one possible mapping from Terraform's concepts to the JUnit concepts that is at least useful for Azure DevOps in particular, and hopefully also useful in some other similar test orchestrators that support JUnit XML.

The idea here would be to figure out a mapping that is useful in practice with JUnit XML parser implementations that are in real use, as opposed to a mapping which is theoretically plausible but causes the result to be less useful in real existing test reporting software.

Thanks again!
Thanks for the response, @apparentlymart. I found the following page, which describes the format. There may be no official documentation, but the Apache Ant implementation seems to be the de facto standard.

As for use cases: we want to add Terraform Test to our CI/CD pipelines, as we are using Azure DevOps. It doesn't really matter which test would fail; the tests are there for a reason. For example:

```hcl
run "input_validation" {
  command = plan

  variables {
    name                = "asdfaslkdjfkasjdflkasjdf!@$@$"
    resource_group_name = "core-services-rg"
  }

  expect_failures = [
    var.name
  ]
}
```

This name should not be accepted by the module (only a-z and 0-9), so the input should fail. If it doesn't, something is wrong with the module's validation check and we should not be able to publish the module. And we would like it to be displayed like this (screenshot of the Azure DevOps Tests tab omitted).

Hope this makes our use case(s) clear.
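As an aside, the kind of check the test above exercises would live in the module itself as a `validation` block on the input variable. A minimal sketch is below; the exact regex and error message in rvdouderaa's module aren't shown in this thread, so these details are assumptions for illustration only:

```hcl
# Hypothetical input variable in the module under test; the real module's
# rule isn't shown here, so the pattern and message are illustrative only.
variable "name" {
  type        = string
  description = "Resource name; lowercase letters and digits only."

  validation {
    condition     = can(regex("^[a-z0-9]+$", var.name))
    error_message = "The name may contain only lowercase letters (a-z) and digits (0-9)."
  }
}
```

With a rule like this in place, the `expect_failures = [var.name]` assertion passes exactly when the invalid name is rejected during `terraform plan`.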
Addition: these are the test result formats supported by Azure DevOps: JUnit, NUnit, VSTest (TRX), xUnit, and CTest.
Thanks for that extra context, @rvdouderaa!

If you're able to do so, it would help to see an example of exactly what JUnit XML tfsec and tflint produce in your existing pipelines. Of course, if you can't share it then we ought to be able to test with those tools ourselves at some point to find out, but seeing an example from your already-running system will make it easier to quickly see whether mimicking how those tools use the format is likely to be a viable strategy for `terraform test` as well.

If you would like to share those files, I'd suggest creating a GitHub Gist with the two (or more) files inside it and linking to it here, just because GitHub comments don't work very well for sharing longer code examples.

Thanks!
@apparentlymart
Thanks for sharing those, @rvdouderaa.

It seems that both of these tools have made the decision that the file containing the problem should be treated as the "test class name" in JUnit terms. However, it also seems from your screenshot of Azure DevOps that it doesn't actually pay attention to that at all; I don't see the filenames appearing anywhere in the UI. It also seems that tfsec elected to use "tfsec" as the name of the entire suite, while tflint did not name the test suite at all. For naming the test cases themselves, tflint used some sort of identifier.
Given that, I suppose one possible way to map it would be:
Does that seem plausible to you as a way to populate the JUnit XML format based on a `terraform test` run?
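As one way to make such a mapping concrete, here is a hedged sketch of a JUnit XML report that assumes each test file becomes a `testsuite` and each `run` block becomes a `testcase`; the file name, run names, timings, and failure message below are invented purely for illustration and do not reflect any confirmed Terraform output:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical layout: one testsuite per test file, one testcase per run block. -->
<testsuites>
  <testsuite name="tests/naming.tftest.hcl" tests="2" failures="1" errors="0" skipped="0">
    <!-- A run block whose expect_failures assertion held, so it reports as passing. -->
    <testcase name="input_validation" classname="tests/naming.tftest.hcl" time="0.42"/>
    <!-- A run block with a failed assertion; the message would carry the diagnostic. -->
    <testcase name="resource_group_created" classname="tests/naming.tftest.hcl" time="1.08">
      <failure message="assertion failed: resource group name does not match the expected value"/>
    </testcase>
  </testsuite>
</testsuites>
```

A layout like this keeps the per-file grouping visible to tools that use `classname`, while still giving orchestrators that ignore it (as the Azure DevOps screenshot suggests) a usable flat list of named test cases.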
@apparentlymart you can see the testsuite name in the Azure DevOps screenshot (the first line with a cross). The proposed solution seems plausible. However, I tried this with multiple findings in tflint, and it creates 2 separate entries.
Looking forward to this functionality being enabled. When can we expect this to be live?
@mefarazahmad There is no commitment. I would recommend following the PR: #34291
Hi all,

Today's alpha release of Terraform CLI includes an early experimental option to generate JUnit XML output. For those who are interested in seeing this move forward, it would be much appreciated if you could share feedback in the community forum topic about the experiment.

I must forewarn that I'm currently focused primarily on a different project, so this JUnit XML experiment is a bit of a "background task" for me and I might be slow to respond, but I do intend to collect up all the feedback and act on it later. Thanks!

(We use the community forum for this sort of early feedback, rather than discussion here in GitHub, because the Discourse forum does at least have some support for tracking which comments are in reply to which other comments, whereas unstructured GitHub issue discussions tend to just become a confusing pile! We'll transition back to discussing in GitHub once it seems clearer what design we're going to move forward with and we become more concerned with the details of getting it implemented "for real".)
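For anyone who wants to try the experimental option, the invocation looked roughly like the snippet below; the exact flag name and gating varied across the alpha builds, so treat this as an assumption and check `terraform test -help` on the build you are running:

```sh
# Assumed invocation shape for the JUnit XML experiment; verify the flag name
# with `terraform test -help` on the specific alpha build in use.
terraform test -junit-xml=./test-results.xml
```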
Hi again! After some discussion in the community forum topic I linked earlier, there were some conclusions I wanted to bring back in here to inform subsequent rounds of experiment:
Thanks to everyone for sharing their feedback and screenshots! I'm going to be away from this issue for at least a little while since my attention is needed elsewhere, but hopefully the above will be useful for either future-me or someone else working on a second round of experiment soon.
Some subsequent discussion in the community forum topic led to an additional idea:
Hi all,

(Sorry for accidentally closing this before. I accidentally triggered GitHub's automatic closing of this issue due to the wording I used in a pull request. 🤦♂️)

The latest alpha release includes some updates responding to the two items of feedback I described two comments above this one. There's more information in the community forum topic requesting feedback, and I'd appreciate any efforts to re-test this (or to test it for the first time!) with your chosen JUnit XML-supporting software.

This doesn't include any changes for the third item I described in the comment directly above this one, because that seems to require more invasive changes. Although I hope to do it eventually, my focus for the moment is on figuring out the best way to map Terraform's testing model onto the JUnit XML format, and it already seems clear where the improved failure messages would be placed (in the `<failure>` elements).

Thanks!
Awesome initiative! I have tested with the alpha build and it works very nicely. Was there any update on when this could get into the main build, @apparentlymart?
Hi @Liam-Johnston, I no longer work on the Terraform team at HashiCorp, so I cannot make any statement about how continued development of this feature is being prioritized.
If anyone does stumble across this thread and wants a quick and easy translator to get JUnit test results, this should help: https://github.com/Liam-Johnston/tftest-to-junitxml
Terraform Version
```
terraform version
Terraform v1.6.3
on darwin_arm64

Your version of Terraform is out of date! The latest version is 1.6.4.
You can update by downloading from https://www.terraform.io/downloads.html
```
Use Cases
Tools like `tfsec` and `tflint` can output their test results as JUnit XML, which can then e.g. be published and shown in the Tests tab of Azure DevOps pipeline runs.

Attempted Solutions
n/a it's not in the documentation
Proposal
Add the functionality to export test results (or at least test failures) as JUnit XML files, so this can be used in e.g. Azure DevOps to show results in the Tests tab, like tfsec / tflint.
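As a sketch of the consuming side, publishing a JUnit XML file to the Azure DevOps Tests tab typically looks like the pipeline step below; the file pattern and run title are placeholders, and the step assumes the JUnit file has already been produced earlier in the job:

```yaml
# Hypothetical Azure DevOps pipeline step: publish JUnit XML produced earlier
# in the job (e.g. by terraform test or another tool) to the Tests tab.
steps:
  - task: PublishTestResults@2
    condition: succeededOrFailed()   # publish results even when tests failed
    inputs:
      testResultsFormat: 'JUnit'
      testResultsFiles: '**/terraform-test-results.xml'   # placeholder pattern
      testRunTitle: 'Terraform module tests'
```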
References
No response