[clang] [llvm] [ci] New script to generate test reports as Buildkite Annotations (PR #113447)

via cfe-commits cfe-commits at lists.llvm.org
Thu Oct 24 02:41:21 PDT 2024


llvmbot wrote:



@llvm/pr-subscribers-clang

Author: David Spickett (DavidSpickett)

<details>
<summary>Changes</summary>

The CI builds now write the results of every lit run to a unique file.
This means we can read them all and produce a single combined report
covering all tests.
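
As a rough sketch of that combining step (the glob pattern and build path are illustrative; this assumes the junitparser package the script already depends on):

```python
# Sketch: gather every per-run JUnit file that lit wrote with
# --use-unique-output-file-name and total up the results.
from glob import glob

from junitparser import JUnitXml

reports = [JUnitXml.fromfile(p) for p in glob("build/test-results*.xml")]
total_tests = sum(suite.tests for report in reports for suite in report)
print(f"{total_tests} tests across {len(reports)} lit runs")
```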

This report will be shown as an "annotation" in the build results:
https://buildkite.com/docs/agent/v3/cli-annotate#creating-an-annotation

Here is an example: https://buildkite.com/llvm-project/github-pull-requests/builds/112660#_
(make sure it is showing "All" instead of "Failures")

This is an alternative to using the existing Buildkite plugin:
https://github.com/buildkite-plugins/junit-annotate-buildkite-plugin

We avoid that plugin because it:
* Is specific to Buildkite, and we may move away from Buildkite.
* Requires Docker, unless we were to fork it ourselves.
* Does not let you customise the report format, again unless we make
  our own fork.

Annotations use GitHub's flavour of Markdown, so the main code in the
script generates that text. The script also generates an extra "style"
argument to make the formatting nicer in Buildkite.
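
For illustration, the style selection mirrors a single decision in the script, keyed on the failure count:

```python
def annotation_style(tests_failed: int) -> str:
    # Buildkite renders "error" and "success" annotations differently,
    # which is the extra formatting nicety mentioned above.
    return "error" if tests_failed else "success"
```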

"context" is the name of the annotation that will be created. By using
different context names for Linux and Windows results we get 2 separate
annotations.

The script also handles calling buildkite-agent itself. This makes it
easier to pass extra arguments to the agent than piping this script's
output into it.
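
As a sketch of what that invocation amounts to (the contexts and report strings here are placeholders; the flags match the ones the script passes):

```python
# Sketch: one annotation per context. Different "--context" values key
# different annotations, so Linux and Windows results stay separate.
from subprocess import check_call

for context, report in [
    ("linux-x64-test-results", "# Linux x64 Test Results\n..."),
    ("windows-x64-test-results", "# Windows x64 Test Results\n..."),
]:
    check_call(
        ["buildkite-agent", "annotate", "--context", context,
         "--style", "success", report]
    )
```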

In the future we can remove the agent part and simply use the report
content, either printed to stdout or posted as a comment on the
GitHub PR.
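
For example, an agent-free use might look like this (hypothetical: it assumes the script is importable as a module and that a result file exists at the given path):

```python
# Hypothetical future use: build the Markdown report and print it,
# e.g. to post as a GitHub PR comment, with no buildkite-agent involved.
from generate_test_report import generate_report

report, style = generate_report(
    "Linux x64 Test Results", ["build/test-results.1.xml"]
)
print(report)
```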

---
Full diff: https://github.com/llvm/llvm-project/pull/113447.diff


7 Files Affected:

- (modified) .ci/generate-buildkite-pipeline-premerge (+2-2) 
- (added) .ci/generate_test_report.py (+328) 
- (modified) .ci/monolithic-linux.sh (+15-6) 
- (modified) .ci/monolithic-windows.sh (+7-3) 
- (modified) clang/README.md (+2) 
- (modified) clang/test/Driver/cl-pch.c (+1-1) 
- (modified) llvm/README.txt (+2) 


``````````diff
diff --git a/.ci/generate-buildkite-pipeline-premerge b/.ci/generate-buildkite-pipeline-premerge
index 7676ff716c4185..e52133751f09b1 100755
--- a/.ci/generate-buildkite-pipeline-premerge
+++ b/.ci/generate-buildkite-pipeline-premerge
@@ -272,7 +272,7 @@ if [[ "${linux_projects}" != "" ]]; then
   artifact_paths:
   - 'artifacts/**/*'
   - '*_result.json'
-  - 'build/test-results.xml'
+  - 'build/test-results*.xml'
   agents: ${LINUX_AGENTS}
   retry:
     automatic:
@@ -295,7 +295,7 @@ if [[ "${windows_projects}" != "" ]]; then
   artifact_paths:
   - 'artifacts/**/*'
   - '*_result.json'
-  - 'build/test-results.xml'
+  - 'build/test-results*.xml'
   agents: ${WINDOWS_AGENTS}
   retry:
     automatic:
diff --git a/.ci/generate_test_report.py b/.ci/generate_test_report.py
new file mode 100644
index 00000000000000..f2ae116ace99af
--- /dev/null
+++ b/.ci/generate_test_report.py
@@ -0,0 +1,328 @@
+# Script to parse many JUnit XML result files and send a report to the buildkite
+# agent as an annotation.
+#
+# To run the unittests:
+# python3 -m unittest discover -p generate_test_report.py
+
+import argparse
+import unittest
+from io import StringIO
+from junitparser import JUnitXml, Failure
+from textwrap import dedent
+from subprocess import check_call
+
+
+def junit_from_xml(xml):
+    return JUnitXml.fromfile(StringIO(xml))
+
+
+class TestReports(unittest.TestCase):
+    def test_title_only(self):
+        self.assertEqual(_generate_report("Foo", []), ("", None))
+
+    def test_no_tests_in_testsuite(self):
+        self.assertEqual(
+            _generate_report(
+                "Foo",
+                [
+                    junit_from_xml(
+                        dedent(
+                            """\
+          <?xml version="1.0" encoding="UTF-8"?>
+          <testsuites time="0.00">
+          <testsuite name="Empty" tests="0" failures="0" skipped="0" time="0.00">
+          </testsuite>
+          </testsuites>"""
+                        )
+                    )
+                ],
+            ),
+            ("", None),
+        )
+
+    def test_no_failures(self):
+        self.assertEqual(
+            _generate_report(
+                "Foo",
+                [
+                    junit_from_xml(
+                        dedent(
+                            """\
+          <?xml version="1.0" encoding="UTF-8"?>
+          <testsuites time="0.00">
+          <testsuite name="Passed" tests="1" failures="0" skipped="0" time="0.00">
+          <testcase classname="Bar/test_1" name="test_1" time="0.00"/>
+          </testsuite>
+          </testsuites>"""
+                        )
+                    )
+                ],
+            ),
+            (
+                dedent(
+                    """\
+              # Foo
+
+              * 1 test passed"""
+                ),
+                "success",
+            ),
+        )
+
+    def test_report_single_file_single_testsuite(self):
+        self.assertEqual(
+            _generate_report(
+                "Foo",
+                [
+                    junit_from_xml(
+                        dedent(
+                            """\
+          <?xml version="1.0" encoding="UTF-8"?>
+          <testsuites time="8.89">
+          <testsuite name="Bar" tests="4" failures="2" skipped="1" time="410.63">
+          <testcase classname="Bar/test_1" name="test_1" time="0.02"/>
+          <testcase classname="Bar/test_2" name="test_2" time="0.02">
+            <skipped message="Reason"/>
+          </testcase>
+          <testcase classname="Bar/test_3" name="test_3" time="0.02">
+            <failure><![CDATA[Output goes here]]></failure>
+          </testcase>
+          <testcase classname="Bar/test_4" name="test_4" time="0.02">
+            <failure><![CDATA[Other output goes here]]></failure>
+          </testcase>
+          </testsuite>
+          </testsuites>"""
+                        )
+                    )
+                ],
+            ),
+            (
+                dedent(
+                    """\
+          # Foo
+
+          * 1 test passed
+          * 1 test skipped
+          * 2 tests failed
+
+          ## Failed tests
+          (click to see output)
+
+          ### Bar
+          <details>
+          <summary>Bar/test_3/test_3</summary>
+
+          ```
+          Output goes here
+          ```
+          </details>
+          <details>
+          <summary>Bar/test_4/test_4</summary>
+
+          ```
+          Other output goes here
+          ```
+          </details>"""
+                ),
+                "error",
+            ),
+        )
+
+    MULTI_SUITE_OUTPUT = (
+        dedent(
+            """\
+        # ABC and DEF
+
+        * 1 test passed
+        * 1 test skipped
+        * 2 tests failed
+
+        ## Failed tests
+        (click to see output)
+
+        ### ABC
+        <details>
+        <summary>ABC/test_2/test_2</summary>
+
+        ```
+        ABC/test_2 output goes here
+        ```
+        </details>
+
+        ### DEF
+        <details>
+        <summary>DEF/test_2/test_2</summary>
+
+        ```
+        DEF/test_2 output goes here
+        ```
+        </details>"""
+        ),
+        "error",
+    )
+
+    def test_report_single_file_multiple_testsuites(self):
+        self.assertEqual(
+            _generate_report(
+                "ABC and DEF",
+                [
+                    junit_from_xml(
+                        dedent(
+                            """\
+          <?xml version="1.0" encoding="UTF-8"?>
+          <testsuites time="8.89">
+          <testsuite name="ABC" tests="2" failures="1" skipped="0" time="410.63">
+          <testcase classname="ABC/test_1" name="test_1" time="0.02"/>
+          <testcase classname="ABC/test_2" name="test_2" time="0.02">
+            <failure><![CDATA[ABC/test_2 output goes here]]></failure>
+          </testcase>
+          </testsuite>
+          <testsuite name="DEF" tests="2" failures="1" skipped="1" time="410.63">
+          <testcase classname="DEF/test_1" name="test_1" time="0.02">
+            <skipped message="reason"/>
+          </testcase>
+          <testcase classname="DEF/test_2" name="test_2" time="0.02">
+            <failure><![CDATA[DEF/test_2 output goes here]]></failure>
+          </testcase>
+          </testsuite>
+          </testsuites>"""
+                        )
+                    )
+                ],
+            ),
+            self.MULTI_SUITE_OUTPUT,
+        )
+
+    def test_report_multiple_files_multiple_testsuites(self):
+        self.assertEqual(
+            _generate_report(
+                "ABC and DEF",
+                [
+                    junit_from_xml(
+                        dedent(
+                            """\
+          <?xml version="1.0" encoding="UTF-8"?>
+          <testsuites time="8.89">
+          <testsuite name="ABC" tests="2" failures="1" skipped="0" time="410.63">
+          <testcase classname="ABC/test_1" name="test_1" time="0.02"/>
+          <testcase classname="ABC/test_2" name="test_2" time="0.02">
+            <failure><![CDATA[ABC/test_2 output goes here]]></failure>
+          </testcase>
+          </testsuite>
+          </testsuites>"""
+                        )
+                    ),
+                    junit_from_xml(
+                        dedent(
+                            """\
+          <?xml version="1.0" encoding="UTF-8"?>
+          <testsuites time="8.89">
+          <testsuite name="DEF" tests="2" failures="1" skipped="1" time="410.63">
+          <testcase classname="DEF/test_1" name="test_1" time="0.02">
+            <skipped message="reason"/>
+          </testcase>
+          <testcase classname="DEF/test_2" name="test_2" time="0.02">
+            <failure><![CDATA[DEF/test_2 output goes here]]></failure>
+          </testcase>
+          </testsuite>
+          </testsuites>"""
+                        )
+                    ),
+                ],
+            ),
+            self.MULTI_SUITE_OUTPUT,
+        )
+
+
+def _generate_report(title, junit_objects):
+    style = None
+
+    if not junit_objects:
+        return ("", style)
+
+    failures = {}
+    tests_run = 0
+    tests_skipped = 0
+    tests_failed = 0
+
+    for results in junit_objects:
+        for testsuite in results:
+            tests_run += testsuite.tests
+            tests_skipped += testsuite.skipped
+            tests_failed += testsuite.failures
+
+            for test in testsuite:
+                if (
+                    not test.is_passed
+                    and test.result
+                    and isinstance(test.result[0], Failure)
+                ):
+                    if failures.get(testsuite.name) is None:
+                        failures[testsuite.name] = []
+                    failures[testsuite.name].append(
+                        (test.classname + "/" + test.name, test.result[0].text)
+                    )
+
+    if not tests_run:
+        return ("", style)
+
+    style = "error" if tests_failed else "success"
+    report = [f"# {title}", ""]
+
+    tests_passed = tests_run - tests_skipped - tests_failed
+
+    def plural(num_tests):
+        return "test" if num_tests == 1 else "tests"
+
+    if tests_passed:
+        report.append(f"* {tests_passed} {plural(tests_passed)} passed")
+    if tests_skipped:
+        report.append(f"* {tests_skipped} {plural(tests_skipped)} skipped")
+    if tests_failed:
+        report.append(f"* {tests_failed} {plural(tests_failed)} failed")
+
+    if failures:
+        report.extend(["", "## Failed tests", "(click to see output)"])
+        for testsuite_name, testsuite_failures in failures.items():
+            report.extend(["", f"### {testsuite_name}"])
+            for name, output in testsuite_failures:
+                report.extend(
+                    [
+                        "<details>",
+                        f"<summary>{name}</summary>",
+                        "",
+                        "```",
+                        output,
+                        "```",
+                        "</details>",
+                    ]
+                )
+
+    return "\n".join(report), style
+
+
+def generate_report(title, junit_files):
+    return _generate_report(title, [JUnitXml.fromfile(p) for p in junit_files])
+
+
+if __name__ == "__main__":
+    parser = argparse.ArgumentParser()
+    parser.add_argument(
+        "title", help="Title of the test report, without Markdown formatting."
+    )
+    parser.add_argument("context", help="Annotation context to write to.")
+    parser.add_argument("junit_files", help="Paths to JUnit report files.", nargs="*")
+    args = parser.parse_args()
+
+    report, style = generate_report(args.title, args.junit_files)
+    # Skip annotating when there is nothing to report (style would be None).
+    if report:
+        check_call(
+            [
+                "buildkite-agent",
+                "annotate",
+                "--context", args.context,
+                "--style", style,
+                report,
+            ]
+        )
diff --git a/.ci/monolithic-linux.sh b/.ci/monolithic-linux.sh
index b78dc59432b65c..4c4c1c756ba98f 100755
--- a/.ci/monolithic-linux.sh
+++ b/.ci/monolithic-linux.sh
@@ -28,18 +28,24 @@ if [[ -n "${CLEAR_CACHE:-}" ]]; then
   ccache --clear
 fi
 
-function show-stats {
+function at-exit {
+  python3 "${MONOREPO_ROOT}"/.ci/generate_test_report.py ":linux: Linux x64 Test Results" \
+    "linux-x64-test-results" "${BUILD_DIR}"/test-results*.xml
+
   mkdir -p artifacts
   ccache --print-stats > artifacts/ccache_stats.txt
 }
-trap show-stats EXIT
+trap at-exit EXIT
 
 projects="${1}"
 targets="${2}"
 
+lit_args="-v --xunit-xml-output ${BUILD_DIR}/test-results.xml --use-unique-output-file-name --timeout=1200 --time-tests"
+
 echo "--- cmake"
 pip install -q -r "${MONOREPO_ROOT}"/mlir/python/requirements.txt
 pip install -q -r "${MONOREPO_ROOT}"/lldb/test/requirements.txt
+pip install -q junitparser==3.2.0
 cmake -S "${MONOREPO_ROOT}"/llvm -B "${BUILD_DIR}" \
       -D LLVM_ENABLE_PROJECTS="${projects}" \
       -G Ninja \
@@ -47,7 +53,7 @@ cmake -S "${MONOREPO_ROOT}"/llvm -B "${BUILD_DIR}" \
       -D LLVM_ENABLE_ASSERTIONS=ON \
       -D LLVM_BUILD_EXAMPLES=ON \
       -D COMPILER_RT_BUILD_LIBFUZZER=OFF \
-      -D LLVM_LIT_ARGS="-v --xunit-xml-output ${BUILD_DIR}/test-results.xml --timeout=1200 --time-tests" \
+      -D LLVM_LIT_ARGS="${lit_args}" \
       -D LLVM_ENABLE_LLD=ON \
       -D CMAKE_CXX_FLAGS=-gmlt \
       -D LLVM_CCACHE_BUILD=ON \
@@ -87,7 +93,8 @@ if [[ "${runtimes}" != "" ]]; then
       -D CMAKE_BUILD_TYPE=RelWithDebInfo \
       -D CMAKE_INSTALL_PREFIX="${INSTALL_DIR}" \
       -D LIBCXX_TEST_PARAMS="std=c++03" \
-      -D LIBCXXABI_TEST_PARAMS="std=c++03"
+      -D LIBCXXABI_TEST_PARAMS="std=c++03" \
+      -D LLVM_LIT_ARGS="${lit_args}"
 
   echo "--- ninja runtimes C++03"
 
@@ -104,7 +111,8 @@ if [[ "${runtimes}" != "" ]]; then
       -D CMAKE_BUILD_TYPE=RelWithDebInfo \
       -D CMAKE_INSTALL_PREFIX="${INSTALL_DIR}" \
       -D LIBCXX_TEST_PARAMS="std=c++26" \
-      -D LIBCXXABI_TEST_PARAMS="std=c++26"
+      -D LIBCXXABI_TEST_PARAMS="std=c++26" \
+      -D LLVM_LIT_ARGS="${lit_args}"
 
   echo "--- ninja runtimes C++26"
 
@@ -121,7 +129,8 @@ if [[ "${runtimes}" != "" ]]; then
       -D CMAKE_BUILD_TYPE=RelWithDebInfo \
       -D CMAKE_INSTALL_PREFIX="${INSTALL_DIR}" \
       -D LIBCXX_TEST_PARAMS="enable_modules=clang" \
-      -D LIBCXXABI_TEST_PARAMS="enable_modules=clang"
+      -D LIBCXXABI_TEST_PARAMS="enable_modules=clang" \
+      -D LLVM_LIT_ARGS="${lit_args}"
 
   echo "--- ninja runtimes clang modules"
   
diff --git a/.ci/monolithic-windows.sh b/.ci/monolithic-windows.sh
index 91e719c52d4363..303cfadfa917cc 100755
--- a/.ci/monolithic-windows.sh
+++ b/.ci/monolithic-windows.sh
@@ -27,16 +27,20 @@ if [[ -n "${CLEAR_CACHE:-}" ]]; then
 fi
 
 sccache --zero-stats
-function show-stats {
+function at-exit {
+  python "${MONOREPO_ROOT}"/.ci/generate_test_report.py ":windows: Windows x64 Test Results" \
+    "windows-x64-test-results" "${BUILD_DIR}"/test-results*.xml
+
   mkdir -p artifacts
   sccache --show-stats >> artifacts/sccache_stats.txt
 }
-trap show-stats EXIT
+trap at-exit EXIT
 
 projects="${1}"
 targets="${2}"
 
 echo "--- cmake"
+pip install junitparser==3.2.0
 pip install -q -r "${MONOREPO_ROOT}"/mlir/python/requirements.txt
 
 # The CMAKE_*_LINKER_FLAGS to disable the manifest come from research
@@ -53,7 +57,7 @@ cmake -S "${MONOREPO_ROOT}"/llvm -B "${BUILD_DIR}" \
       -D LLVM_ENABLE_ASSERTIONS=ON \
       -D LLVM_BUILD_EXAMPLES=ON \
       -D COMPILER_RT_BUILD_LIBFUZZER=OFF \
-      -D LLVM_LIT_ARGS="-v --xunit-xml-output ${BUILD_DIR}/test-results.xml --timeout=1200 --time-tests" \
+      -D LLVM_LIT_ARGS="-v --xunit-xml-output ${BUILD_DIR}/test-results.xml --use-unique-output-file-name --timeout=1200 --time-tests" \
       -D COMPILER_RT_BUILD_ORC=OFF \
       -D CMAKE_C_COMPILER_LAUNCHER=sccache \
       -D CMAKE_CXX_COMPILER_LAUNCHER=sccache \
diff --git a/clang/README.md b/clang/README.md
index b98182d8a3f684..94b1e1a7a07433 100644
--- a/clang/README.md
+++ b/clang/README.md
@@ -23,3 +23,5 @@ If you're interested in more (including how to build Clang) it is best to read t
 * If you find a bug in Clang, please file it in the LLVM bug tracker:
   
     https://github.com/llvm/llvm-project/issues
+
+test change
\ No newline at end of file
diff --git a/clang/test/Driver/cl-pch.c b/clang/test/Driver/cl-pch.c
index 36d83a11242dc6..1a9527458ebf4f 100644
--- a/clang/test/Driver/cl-pch.c
+++ b/clang/test/Driver/cl-pch.c
@@ -25,7 +25,7 @@
 // CHECK-YCTP-SAME: -o
 // CHECK-YCTP-SAME: pchfile.pch
 // CHECK-YCTP-SAME: -x
-// CHECK-YCTP-SAME: c++-header
+// CHECK-YCTP-SAME: c++-header -- this will fail tests on windows!
 
 // Except if a later /TC changes it back.
 // RUN: %clang_cl -Werror /Yc%S/Inputs/pchfile.h /FI%S/Inputs/pchfile.h /c /Fo%t/pchfile.obj /Fp%t/pchfile.pch -v -- %s 2>&1 \
diff --git a/llvm/README.txt b/llvm/README.txt
index b9b71a3b6daff1..ba60b8ffdd072c 100644
--- a/llvm/README.txt
+++ b/llvm/README.txt
@@ -15,3 +15,5 @@ documentation setup.
 
 If you are writing a package for LLVM, see docs/Packaging.rst for our
 suggestions.
+
+test change
\ No newline at end of file

``````````

</details>


https://github.com/llvm/llvm-project/pull/113447

