[PATCH] D60337: [test-suite] litsupport/modules/microbenchmark.py: propagate perf file to the microbenchmarks
Roman Lebedev via Phabricator via llvm-commits
llvm-commits at lists.llvm.org
Fri Apr 5 13:03:47 PDT 2019
lebedev.ri created this revision.
lebedev.ri added reviewers: MatzeB, hfinkel, homerdin.
Herald added a project: LLVM.
Currently, for microbenchmarks, the `microbenchmark` module is enabled and the
`timeit` module is disabled. This means the main microbenchmark executable run
has no `exec_time` metric and therefore does not appear in the reports.
The `perf` module, however, only knows about that main run; it knows nothing
about the individual micro-benchmark results. So it sets the `profile` metric
only on the main run, which is absent from the reports, and it is therefore
impossible to view the perf profiles at all.
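For context, here is a minimal pure-Python model of the mechanism (this is NOT the real lit API; `MicroResult`, `Context`, and `record_microbenchmark` are stand-ins for lit's `Test.Result`, the litsupport test context, and the loop in `microbenchmark.py`):

```python
# Pure-Python sketch of how each micro-benchmark result gains both an
# exec_time metric and, with this patch, a profile metric.

class MicroResult:
    """Stand-in for lit.Test.Result, holding per-benchmark metrics."""
    def __init__(self):
        self.metrics = {}

    def addMetric(self, name, value):
        self.metrics[name] = value


class Context:
    """Stand-in for the litsupport test context."""
    def __init__(self, profilefile):
        self.profilefile = profilefile
        self.micro_results = {}


def record_microbenchmark(context, name, cpu_time):
    micro = MicroResult()
    micro.addMetric('exec_time', cpu_time)
    # The fix: also attach the perf profile path to each microbenchmark,
    # so the reports can locate a profile for the micro results too.
    if context.profilefile:
        micro.addMetric('profile', context.profilefile)
    context.micro_results[name] = micro


ctx = Context(profilefile='Output/bench.perf_data')
record_microbenchmark(ctx, 'BM_example', 0.0123)
print(sorted(ctx.micro_results['BM_example'].metrics))
```

The key point is that without the `if context.profilefile:` branch, only the top-level run carries a `profile` metric, and that run is the one filtered out of the reports.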
Depends on D60336 <https://reviews.llvm.org/D60336>
Repository:
rT test-suite
https://reviews.llvm.org/D60337
Files:
litsupport/modules/microbenchmark.py
Index: litsupport/modules/microbenchmark.py
===================================================================
--- litsupport/modules/microbenchmark.py
+++ litsupport/modules/microbenchmark.py
@@ -39,6 +39,12 @@
exec_time_metric = lit.Test.toMetricValue(benchmark['cpu_time'])
microBenchmark.addMetric('exec_time', exec_time_metric)
+ # Propagate the perf profile to the microbenchmark.
+ if context.profilefile:
+ microBenchmark.addMetric(
+ 'profile', lit.Test.toMetricValue(
+ context.profilefile))
+
# Add Micro Result
context.micro_results[name] = microBenchmark
-------------- next part --------------
A non-text attachment was scrubbed...
Name: D60337.193946.patch
Type: text/x-patch
Size: 704 bytes
Desc: not available
URL: <http://lists.llvm.org/pipermail/llvm-commits/attachments/20190405/aebed33a/attachment.bin>