[clang] [llvm] CI report script testing! (PR #113447)
David Spickett via cfe-commits
cfe-commits at lists.llvm.org
Wed Oct 23 06:36:16 PDT 2024
https://github.com/DavidSpickett updated https://github.com/llvm/llvm-project/pull/113447
From ba99bed96bd69db7c86d43a2119255569872f41f Mon Sep 17 00:00:00 2001
From: David Spickett <david.spickett at linaro.org>
Date: Mon, 21 Oct 2024 12:34:17 +0000
Subject: [PATCH 1/3] [ci] Write test results to unique file names
In this patch I'm using a new lit option so that the pipeline
writes multiple result files, one for each time lit is run:
```
--use-unique-output-file-name
When enabled, lit will add a unique element to the output file name, before the extension. For example "results.xml" will become "results.<something>.xml".
The "<something>" is not ordered in any way and is chosen so that existing files are not overwritten. [Default: Off]
```
(I added this to lit recently)
Now if I run the Linux build:
$ bash ./.ci/monolithic-linux.sh "clang;lldb;lld" "check-lldb-shell check-lld" "libcxx;libcxxabi" "check-libcxx check-libcxxabi"
I get multiple test result files. In my case some tests fail, so the runtimes aren't checked, but all of the projects are,
so there is one file for lldb and one for lld:
$ ls build/*.xml
build/test-results.klc82utf.xml build/test-results.majylh73.xml
This change just collects the XML files as artifacts. Once I know that's
working, I can set up a test reporting plugin to build a summary from them.
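The kind of summary I have in mind would parse these JUnit XML files. A rough sketch of the idea, using junitparser (which a later patch in this PR also uses) — the glob pattern and output format here are just placeholders:
```python
# Rough sketch only; the real script added later in this PR
# (.ci/generate_test_report.py) does more than this.
# Assumes `pip install junitparser` and that lit wrote build/test-results.*.xml files.
import glob

from junitparser import Failure, JUnitXml

passed = skipped = failed = 0
for path in glob.glob("build/test-results.*.xml"):
    for suite in JUnitXml.fromfile(path):
        skipped += suite.skipped
        failed += suite.failures
        passed += suite.tests - suite.skipped - suite.failures
        # List each failing test case along with its suite.
        for case in suite:
            if not case.is_passed and case.result and isinstance(case.result[0], Failure):
                print(f"FAILED: {case.classname}/{case.name}")

print(f"{passed} passed, {skipped} skipped, {failed} failed")
```
The full script below additionally renders each failure as a collapsible Markdown section so it can be posted as a Buildkite annotation.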
---
.ci/generate-buildkite-pipeline-premerge | 4 ++--
.ci/monolithic-linux.sh | 13 +++++++++----
.ci/monolithic-windows.sh | 2 +-
3 files changed, 12 insertions(+), 7 deletions(-)
diff --git a/.ci/generate-buildkite-pipeline-premerge b/.ci/generate-buildkite-pipeline-premerge
index 7676ff716c4185..e52133751f09b1 100755
--- a/.ci/generate-buildkite-pipeline-premerge
+++ b/.ci/generate-buildkite-pipeline-premerge
@@ -272,7 +272,7 @@ if [[ "${linux_projects}" != "" ]]; then
artifact_paths:
- 'artifacts/**/*'
- '*_result.json'
- - 'build/test-results.xml'
+ - 'build/test-results*.xml'
agents: ${LINUX_AGENTS}
retry:
automatic:
@@ -295,7 +295,7 @@ if [[ "${windows_projects}" != "" ]]; then
artifact_paths:
- 'artifacts/**/*'
- '*_result.json'
- - 'build/test-results.xml'
+ - 'build/test-results*.xml'
agents: ${WINDOWS_AGENTS}
retry:
automatic:
diff --git a/.ci/monolithic-linux.sh b/.ci/monolithic-linux.sh
index b78dc59432b65c..17ea51c08fafd3 100755
--- a/.ci/monolithic-linux.sh
+++ b/.ci/monolithic-linux.sh
@@ -37,6 +37,8 @@ trap show-stats EXIT
projects="${1}"
targets="${2}"
+lit_args="-v --xunit-xml-output ${BUILD_DIR}/test-results.xml --use-unique-output-file-name --timeout=1200 --time-tests"
+
echo "--- cmake"
pip install -q -r "${MONOREPO_ROOT}"/mlir/python/requirements.txt
pip install -q -r "${MONOREPO_ROOT}"/lldb/test/requirements.txt
@@ -47,7 +49,7 @@ cmake -S "${MONOREPO_ROOT}"/llvm -B "${BUILD_DIR}" \
-D LLVM_ENABLE_ASSERTIONS=ON \
-D LLVM_BUILD_EXAMPLES=ON \
-D COMPILER_RT_BUILD_LIBFUZZER=OFF \
- -D LLVM_LIT_ARGS="-v --xunit-xml-output ${BUILD_DIR}/test-results.xml --timeout=1200 --time-tests" \
+ -D LLVM_LIT_ARGS="${lit_args}" \
-D LLVM_ENABLE_LLD=ON \
-D CMAKE_CXX_FLAGS=-gmlt \
-D LLVM_CCACHE_BUILD=ON \
@@ -87,7 +89,8 @@ if [[ "${runtimes}" != "" ]]; then
-D CMAKE_BUILD_TYPE=RelWithDebInfo \
-D CMAKE_INSTALL_PREFIX="${INSTALL_DIR}" \
-D LIBCXX_TEST_PARAMS="std=c++03" \
- -D LIBCXXABI_TEST_PARAMS="std=c++03"
+ -D LIBCXXABI_TEST_PARAMS="std=c++03" \
+ -D LLVM_LIT_ARGS="${lit_args}"
echo "--- ninja runtimes C++03"
@@ -104,7 +107,8 @@ if [[ "${runtimes}" != "" ]]; then
-D CMAKE_BUILD_TYPE=RelWithDebInfo \
-D CMAKE_INSTALL_PREFIX="${INSTALL_DIR}" \
-D LIBCXX_TEST_PARAMS="std=c++26" \
- -D LIBCXXABI_TEST_PARAMS="std=c++26"
+ -D LIBCXXABI_TEST_PARAMS="std=c++26" \
+ -D LLVM_LIT_ARGS="${lit_args}"
echo "--- ninja runtimes C++26"
@@ -121,7 +125,8 @@ if [[ "${runtimes}" != "" ]]; then
-D CMAKE_BUILD_TYPE=RelWithDebInfo \
-D CMAKE_INSTALL_PREFIX="${INSTALL_DIR}" \
-D LIBCXX_TEST_PARAMS="enable_modules=clang" \
- -D LIBCXXABI_TEST_PARAMS="enable_modules=clang"
+ -D LIBCXXABI_TEST_PARAMS="enable_modules=clang" \
+ -D LLVM_LIT_ARGS="${lit_args}"
echo "--- ninja runtimes clang modules"
diff --git a/.ci/monolithic-windows.sh b/.ci/monolithic-windows.sh
index 91e719c52d4363..9ec44c22442d06 100755
--- a/.ci/monolithic-windows.sh
+++ b/.ci/monolithic-windows.sh
@@ -53,7 +53,7 @@ cmake -S "${MONOREPO_ROOT}"/llvm -B "${BUILD_DIR}" \
-D LLVM_ENABLE_ASSERTIONS=ON \
-D LLVM_BUILD_EXAMPLES=ON \
-D COMPILER_RT_BUILD_LIBFUZZER=OFF \
- -D LLVM_LIT_ARGS="-v --xunit-xml-output ${BUILD_DIR}/test-results.xml --timeout=1200 --time-tests" \
+ -D LLVM_LIT_ARGS="-v --xunit-xml-output ${BUILD_DIR}/test-results.xml --use-unique-output-file-name --timeout=1200 --time-tests" \
-D COMPILER_RT_BUILD_ORC=OFF \
-D CMAKE_C_COMPILER_LAUNCHER=sccache \
-D CMAKE_CXX_COMPILER_LAUNCHER=sccache \
From 29b0d906440e9854164a0a6d915e4a7a8e8198f7 Mon Sep 17 00:00:00 2001
From: David Spickett <david.spickett at linaro.org>
Date: Wed, 23 Oct 2024 11:39:15 +0100
Subject: [PATCH 2/3] WIP annotations script
---
.ci/generate-buildkite-pipeline-premerge | 48 ++--
.ci/generate_test_report.py | 328 +++++++++++++++++++++++
.ci/monolithic-linux.sh | 8 +-
3 files changed, 358 insertions(+), 26 deletions(-)
create mode 100644 .ci/generate_test_report.py
diff --git a/.ci/generate-buildkite-pipeline-premerge b/.ci/generate-buildkite-pipeline-premerge
index e52133751f09b1..06defbb8d037f5 100755
--- a/.ci/generate-buildkite-pipeline-premerge
+++ b/.ci/generate-buildkite-pipeline-premerge
@@ -289,27 +289,27 @@ if [[ "${linux_projects}" != "" ]]; then
EOF
fi
-if [[ "${windows_projects}" != "" ]]; then
- cat <<EOF
-- label: ':windows: Windows x64'
- artifact_paths:
- - 'artifacts/**/*'
- - '*_result.json'
- - 'build/test-results*.xml'
- agents: ${WINDOWS_AGENTS}
- retry:
- automatic:
- - exit_status: -1 # Agent was lost
- limit: 2
- - exit_status: 255 # Forced agent shutdown
- limit: 2
- timeout_in_minutes: 150
- env:
- CC: 'cl'
- CXX: 'cl'
- LD: 'link'
- commands:
- - 'C:\\BuildTools\\Common7\\Tools\\VsDevCmd.bat -arch=amd64 -host_arch=amd64'
- - 'bash .ci/monolithic-windows.sh "$(echo ${windows_projects} | tr ' ' ';')" "$(echo ${windows_check_targets})"'
-EOF
-fi
+# if [[ "${windows_projects}" != "" ]]; then
+# cat <<EOF
+# - label: ':windows: Windows x64'
+# artifact_paths:
+# - 'artifacts/**/*'
+# - '*_result.json'
+# - 'build/test-results*.xml'
+# agents: ${WINDOWS_AGENTS}
+# retry:
+# automatic:
+# - exit_status: -1 # Agent was lost
+# limit: 2
+# - exit_status: 255 # Forced agent shutdown
+# limit: 2
+# timeout_in_minutes: 150
+# env:
+# CC: 'cl'
+# CXX: 'cl'
+# LD: 'link'
+# commands:
+# - 'C:\\BuildTools\\Common7\\Tools\\VsDevCmd.bat -arch=amd64 -host_arch=amd64'
+# - 'bash .ci/monolithic-windows.sh "$(echo ${windows_projects} | tr ' ' ';')" "$(echo ${windows_check_targets})"'
+# EOF
+# fi
diff --git a/.ci/generate_test_report.py b/.ci/generate_test_report.py
new file mode 100644
index 00000000000000..a68d8a8d7fae81
--- /dev/null
+++ b/.ci/generate_test_report.py
@@ -0,0 +1,328 @@
+# Script to parse many JUnit XML result files and produce a combined report
+# that CI jobs can show.
+#
+# To run the unittests:
+# python3 -m unittest discover -p generate_test_report.py
+
+import argparse
+import unittest
+from io import StringIO
+from junitparser import JUnitXml, Failure
+from textwrap import dedent
+from subprocess import check_call
+
+
+def junit_from_xml(xml):
+ return JUnitXml.fromfile(StringIO(xml))
+
+
+class TestReports(unittest.TestCase):
+ def test_title_only(self):
+ self.assertEqual(_generate_report("Foo", []), ("", None))
+
+ def test_no_tests_in_testsuite(self):
+ self.assertEqual(
+ _generate_report(
+ "Foo",
+ [
+ junit_from_xml(
+ dedent(
+ """\
+ <?xml version="1.0" encoding="UTF-8"?>
+ <testsuites time="0.00">
+ <testsuite name="Empty" tests="0" failures="0" skipped="0" time="0.00">
+ </testsuite>
+ </testsuites>"""
+ )
+ )
+ ],
+ ),
+ ("", None),
+ )
+
+ def test_no_failures(self):
+ self.assertEqual(
+ _generate_report(
+ "Foo",
+ [
+ junit_from_xml(
+ dedent(
+ """\
+ <?xml version="1.0" encoding="UTF-8"?>
+ <testsuites time="0.00">
+ <testsuite name="Passed" tests="1" failures="0" skipped="0" time="0.00">
+ <testcase classname="Bar/test_1" name="test_1" time="0.00"/>
+ </testsuite>
+ </testsuites>"""
+ )
+ )
+ ],
+ ),
+ (
+ dedent(
+ """\
+ # Foo
+
+ * 1 test passed"""
+ ),
+ "success",
+ ),
+ )
+
+ def test_report_single_file_single_testsuite(self):
+ self.assertEqual(
+ _generate_report(
+ "Foo",
+ [
+ junit_from_xml(
+ dedent(
+ """\
+ <?xml version="1.0" encoding="UTF-8"?>
+ <testsuites time="8.89">
+ <testsuite name="Bar" tests="4" failures="2" skipped="1" time="410.63">
+ <testcase classname="Bar/test_1" name="test_1" time="0.02"/>
+ <testcase classname="Bar/test_2" name="test_2" time="0.02">
+ <skipped message="Reason"/>
+ </testcase>
+ <testcase classname="Bar/test_3" name="test_3" time="0.02">
+ <failure><![CDATA[Output goes here]]></failure>
+ </testcase>
+ <testcase classname="Bar/test_4" name="test_4" time="0.02">
+ <failure><![CDATA[Other output goes here]]></failure>
+ </testcase>
+ </testsuite>
+ </testsuites>"""
+ )
+ )
+ ],
+ ),
+ (
+ dedent(
+ """\
+ # Foo
+
+ * 1 test passed
+ * 1 test skipped
+ * 2 tests failed
+
+ ## Failed tests
+ (click to see output)
+
+ ### Bar
+ <details>
+ <summary>Bar/test_3/test_3</summary>
+
+ ```
+ Output goes here
+ ```
+ </details>
+ <details>
+ <summary>Bar/test_4/test_4</summary>
+
+ ```
+ Other output goes here
+ ```
+ </details>"""
+ ),
+ "error",
+ ),
+ )
+
+ MULTI_SUITE_OUTPUT = (
+ dedent(
+ """\
+ # ABC and DEF
+
+ * 1 test passed
+ * 1 test skipped
+ * 2 tests failed
+
+ ## Failed tests
+ (click to see output)
+
+ ### ABC
+ <details>
+ <summary>ABC/test_2/test_2</summary>
+
+ ```
+ ABC/test_2 output goes here
+ ```
+ </details>
+
+ ### DEF
+ <details>
+ <summary>DEF/test_2/test_2</summary>
+
+ ```
+ DEF/test_2 output goes here
+ ```
+ </details>"""
+ ),
+ "error",
+ )
+
+ def test_report_single_file_multiple_testsuites(self):
+ self.assertEqual(
+ _generate_report(
+ "ABC and DEF",
+ [
+ junit_from_xml(
+ dedent(
+ """\
+ <?xml version="1.0" encoding="UTF-8"?>
+ <testsuites time="8.89">
+ <testsuite name="ABC" tests="2" failures="1" skipped="0" time="410.63">
+ <testcase classname="ABC/test_1" name="test_1" time="0.02"/>
+ <testcase classname="ABC/test_2" name="test_2" time="0.02">
+ <failure><![CDATA[ABC/test_2 output goes here]]></failure>
+ </testcase>
+ </testsuite>
+ <testsuite name="DEF" tests="2" failures="1" skipped="1" time="410.63">
+ <testcase classname="DEF/test_1" name="test_1" time="0.02">
+ <skipped message="reason"/>
+ </testcase>
+ <testcase classname="DEF/test_2" name="test_2" time="0.02">
+ <failure><![CDATA[DEF/test_2 output goes here]]></failure>
+ </testcase>
+ </testsuite>
+ </testsuites>"""
+ )
+ )
+ ],
+ ),
+ self.MULTI_SUITE_OUTPUT,
+ )
+
+ def test_report_multiple_files_multiple_testsuites(self):
+ self.assertEqual(
+ _generate_report(
+ "ABC and DEF",
+ [
+ junit_from_xml(
+ dedent(
+ """\
+ <?xml version="1.0" encoding="UTF-8"?>
+ <testsuites time="8.89">
+ <testsuite name="ABC" tests="2" failures="1" skipped="0" time="410.63">
+ <testcase classname="ABC/test_1" name="test_1" time="0.02"/>
+ <testcase classname="ABC/test_2" name="test_2" time="0.02">
+ <failure><![CDATA[ABC/test_2 output goes here]]></failure>
+ </testcase>
+ </testsuite>
+ </testsuites>"""
+ )
+ ),
+ junit_from_xml(
+ dedent(
+ """\
+ <?xml version="1.0" encoding="UTF-8"?>
+ <testsuites time="8.89">
+ <testsuite name="DEF" tests="2" failures="1" skipped="1" time="410.63">
+ <testcase classname="DEF/test_1" name="test_1" time="0.02">
+ <skipped message="reason"/>
+ </testcase>
+ <testcase classname="DEF/test_2" name="test_2" time="0.02">
+ <failure><![CDATA[DEF/test_2 output goes here]]></failure>
+ </testcase>
+ </testsuite>
+ </testsuites>"""
+ )
+ ),
+ ],
+ ),
+ self.MULTI_SUITE_OUTPUT,
+ )
+
+
+def _generate_report(title, junit_objects):
+ style = None
+
+ if not junit_objects:
+ return ("", style)
+
+ failures = {}
+ tests_run = 0
+ tests_skipped = 0
+ tests_failed = 0
+
+ for results in junit_objects:
+ for testsuite in results:
+ tests_run += testsuite.tests
+ tests_skipped += testsuite.skipped
+ tests_failed += testsuite.failures
+
+ for test in testsuite:
+ if (
+ not test.is_passed
+ and test.result
+ and isinstance(test.result[0], Failure)
+ ):
+ if failures.get(testsuite.name) is None:
+ failures[testsuite.name] = []
+ failures[testsuite.name].append(
+ (test.classname + "/" + test.name, test.result[0].text)
+ )
+
+ if not tests_run:
+ return ("", style)
+
+ style = "error" if tests_failed else "success"
+ report = [f"# {title}", ""]
+
+ tests_passed = tests_run - tests_skipped - tests_failed
+
+ def plural(num_tests):
+ return "test" if num_tests == 1 else "tests"
+
+ if tests_passed:
+ report.append(f"* {tests_passed} {plural(tests_passed)} passed")
+ if tests_skipped:
+ report.append(f"* {tests_skipped} {plural(tests_skipped)} skipped")
+ if tests_failed:
+ report.append(f"* {tests_failed} {plural(tests_failed)} failed")
+
+ if failures:
+ report.extend(["", "## Failed tests", "(click to see output)"])
+ for testsuite_name, failures in failures.items():
+ report.extend(["", f"### {testsuite_name}"])
+ for name, output in failures:
+ report.extend(
+ [
+ "<details>",
+ f"<summary>{name}</summary>",
+ "",
+ "```",
+ output,
+ "```",
+ "</details>",
+ ]
+ )
+
+ return "\n".join(report), style
+
+
+def generate_report(title, junit_files):
+ return _generate_report(title, [JUnitXml.fromfile(p) for p in junit_files])
+
+
+if __name__ == "__main__":
+ parser = argparse.ArgumentParser()
+ parser.add_argument(
+ "title", help="Title of the test report, without Markdown formatting."
+ )
+ parser.add_argument("context", help="Annotation context to write to.")
+ parser.add_argument("junit_files", help="Paths to JUnit report files.", nargs="*")
+ args = parser.parse_args()
+
+ report, style = generate_report(args.title, args.junit_files)
+ check_call(
+ [
+ "buildkite-agent",
+ "annotate",
+ "--context",
+ args.context,
+ "--style",
+ style,
+ report,
+ ]
+ )
diff --git a/.ci/monolithic-linux.sh b/.ci/monolithic-linux.sh
index 17ea51c08fafd3..4c4c1c756ba98f 100755
--- a/.ci/monolithic-linux.sh
+++ b/.ci/monolithic-linux.sh
@@ -28,11 +28,14 @@ if [[ -n "${CLEAR_CACHE:-}" ]]; then
ccache --clear
fi
-function show-stats {
+function at-exit {
+ python3 "${MONOREPO_ROOT}"/.ci/generate_test_report.py ":linux: Linux x64 Test Results" \
+ "linux-x64-test-results" "${BUILD_DIR}"/test-results*.xml
+
mkdir -p artifacts
ccache --print-stats > artifacts/ccache_stats.txt
}
-trap show-stats EXIT
+trap at-exit EXIT
projects="${1}"
targets="${2}"
@@ -42,6 +45,7 @@ lit_args="-v --xunit-xml-output ${BUILD_DIR}/test-results.xml --use-unique-outpu
echo "--- cmake"
pip install -q -r "${MONOREPO_ROOT}"/mlir/python/requirements.txt
pip install -q -r "${MONOREPO_ROOT}"/lldb/test/requirements.txt
+pip install -q junitparser==3.2.0
cmake -S "${MONOREPO_ROOT}"/llvm -B "${BUILD_DIR}" \
-D LLVM_ENABLE_PROJECTS="${projects}" \
-G Ninja \
From 1eed72398223ebe05133585c61c03a062815d7df Mon Sep 17 00:00:00 2001
From: David Spickett <david.spickett at linaro.org>
Date: Wed, 23 Oct 2024 14:35:57 +0100
Subject: [PATCH 3/3] changes to trigger builds
---
clang/README.md | 2 ++
llvm/README.txt | 2 ++
2 files changed, 4 insertions(+)
diff --git a/clang/README.md b/clang/README.md
index b98182d8a3f684..94b1e1a7a07433 100644
--- a/clang/README.md
+++ b/clang/README.md
@@ -23,3 +23,5 @@ If you're interested in more (including how to build Clang) it is best to read t
* If you find a bug in Clang, please file it in the LLVM bug tracker:
https://github.com/llvm/llvm-project/issues
+
+test change
\ No newline at end of file
diff --git a/llvm/README.txt b/llvm/README.txt
index b9b71a3b6daff1..ba60b8ffdd072c 100644
--- a/llvm/README.txt
+++ b/llvm/README.txt
@@ -15,3 +15,5 @@ documentation setup.
If you are writing a package for LLVM, see docs/Packaging.rst for our
suggestions.
+
+test change
\ No newline at end of file