What is JUnit?
JUnit is a widely used testing framework for Java applications, enabling developers to write and execute tests for their code. The JUnit framework provides a systematic way to verify that the code performs as expected, making it an essential tool in the software development lifecycle.
While the framework itself is rarely used in embedded system development, it produces a result format that has become a de facto standard: JUnit XML.
JUnit XML format
The JUnit XML format is a standardized schema for presenting test results. Originally created for the JUnit testing framework for Java, it has gained universal acceptance across various programming languages and frameworks.
The JUnit XML format is designed to present test results in a clear, structured way that's easily parsed and understood by continuous integration tools, build systems, and other automation platforms, allowing for seamless integration into automated workflows.
Today this format is used across testing environments, from web and application development to embedded systems. We integrate it to report on MCU and embedded Linux testing.
How to write JUnit XML test cases?
In a JUnit XML report, the top-level <testsuites> tag encompasses the overall test results. The <testcase> and <testsuite> elements are the fundamental building blocks, organizing and structuring the tests. <testsuite> elements act as nodes in the document and can contain nested <testsuite> elements or individual <testcase> elements.
These tags summarize their contents, including execution time, number of tests, and counts of failures and skipped tests. Thus, a quick look at the top <testsuites> tag provides a comprehensive overview of the report.
<testcase name="ExampleTest" time="15.03" assertions="2" classname="ExampleTestSuite">
  <properties>
    <property name="test_description"><![CDATA[This is an Example Test]]></property>
    <property name="step[failure]" value="wait_for_pattern" />
    <property name="step[passed]" value="log" />
  </properties>
  <system-out><![CDATA[- Action "wait_for_pattern" FAIL
- Action "log" PASS
None]]></system-out>
  <failure type="ConsoleCannotOpenError" message="" />
</testcase>
Within <testcase> elements, <failure> or <skipped> tags indicate the outcome of the test. The <failure> tag may also include message and type attributes for detailed error information. <testcase> elements can capture standard output using <system-out> tags and maintain <properties> tags to serialize additional metadata or contextual information in named <property> elements. Additionally, <testcase> elements are organized in the report hierarchy through the classname attribute, which denotes their parent class name.
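The hierarchy described above can also be walked programmatically. Below is a minimal Python sketch, using only the standard library, that counts testcase outcomes in a JUnit XML report; the embedded sample report is a trimmed-down, hypothetical variant of the example above.

```python
# Minimal sketch: summarizing testcase outcomes in a JUnit XML report.
import xml.etree.ElementTree as ET

def summarize(junit_xml: str) -> dict:
    """Count the outcome of every <testcase> element in a JUnit XML string."""
    root = ET.fromstring(junit_xml)
    summary = {"passed": 0, "failed": 0, "skipped": 0}
    for case in root.iter("testcase"):
        # A <failure> or <error> child marks a failed test; <skipped> marks
        # a skipped one; a testcase with neither is considered passed.
        if case.find("failure") is not None or case.find("error") is not None:
            summary["failed"] += 1
        elif case.find("skipped") is not None:
            summary["skipped"] += 1
        else:
            summary["passed"] += 1
    return summary

# Hypothetical sample report, shortened from the example above.
report = """<testsuites>
  <testsuite name="ExampleTestSuite" tests="2" failures="1">
    <testcase name="ExampleTest" classname="ExampleTestSuite" time="15.03">
      <failure type="ConsoleCannotOpenError" message=""/>
    </testcase>
    <testcase name="OtherTest" classname="ExampleTestSuite" time="0.1"/>
  </testsuite>
</testsuites>"""

print(summarize(report))  # -> {'passed': 1, 'failed': 1, 'skipped': 0}
```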
How to view JUnit XML reports?
Reporting is a crucial aspect of automated testing. JUnit provides a robust format for summarizing the results of executed test suites, while also allowing detailed examination of test outcomes. This capability facilitates in-depth investigation of issues or bugs identified during the tests.
JUnit XML reports can be accessed through various tools that parse and display XML files in a user-friendly format. Many CI systems, such as Jenkins and GitLab, offer built-in viewers for JUnit XML reports. These tools provide a summary of test results, including counts of passed, failed, and skipped tests, and offer detailed information about each test case. Let's explore the visualization possibilities:
Local files
One straightforward method for managing test results is using local files. For example, an Excel file can serve as a basic reporting format. JUnit XML reports can be visualized using a JUnit XML viewer, which makes the information more accessible.
Advantages:
- Easy to implement and use.
- Accessible to anyone within the company.
- Flexible format that allows for customization.
Limitations:
- Difficult to get a comprehensive view with multiple test configurations.
- Limited display capabilities.
- Requires manual management, storage, and archiving of files.
- Can become cumbersome with frequent test iterations.
- Challenging to track requirement coverage.
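As a middle ground, a short script can flatten a JUnit XML report into CSV rows that open directly in Excel or any spreadsheet. The following Python sketch is illustrative only; the sample report content and the column choices are assumptions.

```python
# Sketch: flattening a JUnit XML report into CSV rows for spreadsheet viewing.
import csv
import io
import xml.etree.ElementTree as ET

def to_csv_rows(junit_xml: str) -> list:
    """Return a header row plus one row per <testcase> element."""
    root = ET.fromstring(junit_xml)
    rows = [["classname", "name", "time", "outcome"]]
    for case in root.iter("testcase"):
        if case.find("failure") is not None:
            outcome = "failure"
        elif case.find("skipped") is not None:
            outcome = "skipped"
        else:
            outcome = "passed"
        rows.append([case.get("classname", ""), case.get("name", ""),
                     case.get("time", ""), outcome])
    return rows

# Hypothetical report with one passed and one skipped test.
report = """<testsuites><testsuite name="S">
  <testcase classname="S" name="t1" time="0.5"/>
  <testcase classname="S" name="t2" time="1.2"><skipped/></testcase>
</testsuite></testsuites>"""

buf = io.StringIO()
csv.writer(buf).writerows(to_csv_rows(report))
print(buf.getvalue())
```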
CI tools: GitLab / GitHub / Jenkins
Modern CI/CD tools like GitLab, GitHub, and Jenkins offer integrated test reporting features. They allow you to launch test suites and review summaries of all executions on a single page. GitLab, in particular, supports JUnit XML format natively and allows for custom properties to include test evidence.
Advantages:
- Fully integrated with test execution processes.
- Provides a web interface with all necessary information.
- Easily integrates with other tools.
Limitations:
- Integrating manual test results can be challenging.
- Feature sets vary across tools; sometimes plugins are required.
- Limited deep-dive analysis capabilities; proper management of test results as artifacts is crucial.
- Custom field management varies by tool.
Test management applications: Jira & XRay
Dedicated test management applications like Jira or XRay offer the most comprehensive approach to managing tests and tracking coverage. These tools can create detailed dashboards and support both automated and manual testing. JUnit XML is commonly used for updating and uploading test results in these applications.
Advantages:
- Extensive reporting and dashboard capabilities.
- Excellent support for both automated and manual testing.
- Facilitates easy tracking of requirement coverage.
- Native support for versioning.
Limitations:
- More complex to set up and manage.
- Typically involves additional costs.
- Custom field management varies by tool.
How to integrate JUnit XML in CI pipelines?
Integrating JUnit into CI pipelines involves configuring the CI system to run JUnit tests and generate XML reports. Most CI tools support this integration natively. For instance, in GitLab, you can configure your job to interpret JUnit XML files generated by the Pluma automated testing tool by adding them to the artifacts as follows:
artifacts:
  when: always
  paths:
    - pluma-results
  reports:
    junit: pluma-results/pluma-results-*.junit.xml
This setup ensures that test results are included in the CI pipeline build reports, enabling automated quality checks.
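The same format can also be produced by your own test harness, so that any script run in CI yields reports the pipeline can ingest. Here is a minimal Python sketch using only the standard library; the suite name, test names, and result tuples are hypothetical.

```python
# Sketch: emitting a JUnit XML report from a custom test harness so that a
# CI tool such as GitLab can pick it up as a report artifact.
import xml.etree.ElementTree as ET

def build_report(suite_name: str, results: list) -> str:
    """Build a JUnit XML string from (test_name, error_message_or_None) tuples."""
    failures = sum(1 for _, err in results if err is not None)
    suites = ET.Element("testsuites", tests=str(len(results)),
                        failures=str(failures))
    suite = ET.SubElement(suites, "testsuite", name=suite_name,
                          tests=str(len(results)), failures=str(failures))
    for name, err in results:
        case = ET.SubElement(suite, "testcase", name=name, classname=suite_name)
        if err is not None:
            # A <failure> child marks this testcase as failed.
            ET.SubElement(case, "failure", message=err)
    return ET.tostring(suites, encoding="unicode")

# Hypothetical results from an embedded board test run.
xml_text = build_report("BoardBootTests",
                        [("boot_ok", None),
                         ("serial_console", "no prompt seen")])
print(xml_text)
```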
JUnit XML pros & cons
Pros
- Interoperability: Widely recognized format supported by many tools.
- Readability: Human-readable and structured data.
- Extensibility: Supports custom tags and rich metadata.
- Automation-friendly: Easily parsed and processed programmatically.
Cons
- Verbosity: Large file sizes and processing overhead.
- Parsing performance: Slower parsing compared to lightweight formats like JSON.
- Complexity: Intricate structure for large test suites and a learning curve for unfamiliar developers.
JUnit XML best practices
- Consistent naming: Use clear and consistent names for test cases and suites.
- Metadata utilization: Include meaningful metadata such as error messages and test durations.
- Structured reporting: Maintain a well-structured hierarchy in the XML for easier parsing and reporting.
- Automated generation: Automate the generation and archiving of JUnit XML reports within CI pipelines.
- Regular cleanup: Periodically clean up old reports to manage storage and performance.
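The "regular cleanup" practice above is easy to automate. Below is a Python sketch that prunes reports older than a retention window; the directory layout and the `.junit.xml` suffix are assumptions.

```python
# Sketch: deleting JUnit XML reports older than a retention window.
import os
import time

def prune_reports(directory: str, max_age_days: int = 30) -> list:
    """Remove *.junit.xml files older than max_age_days; return removed paths."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for entry in os.listdir(directory):
        path = os.path.join(directory, entry)
        if entry.endswith(".junit.xml") and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(path)
    return removed
```

This could run as a scheduled CI job or a cron task alongside the report archive.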
An example of integration of JUnit XML with Pluma automated testing tool
In Pluma, a tool for embedded testing, we use the JUnit XML format to generate detailed test reports and facilitate thorough analysis. This integration is incorporated into our Welma Yocto Linux distribution and managed through GitLab CI/CD pipelines. The JUnit XML format allows us to automate test reporting across various hardware platforms more efficiently.
The following diagram demonstrates the architecture of our test bench, showing how JUnit XML is used to automate and manage testing processes across different hardware configurations. This setup ensures precise and scalable test automation, significantly improving our development and deployment workflows.