<html>
<head>
<meta content="text/html; charset=utf-8" http-equiv="Content-Type">
</head>
<body bgcolor="#FFFFFF" text="#000000">
<br>
<br>
<div class="moz-cite-prefix">On 19/01/2016 17:34, David Blaikie
wrote:<br>
</div>
<blockquote
cite="mid:CAENS6Et=Zumz5Hm-u-WEb3Ey2e6mTq8ByRvwiPd1skvdgWaLXA@mail.gmail.com"
type="cite">
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<div dir="ltr">On Tue, Jan 19, 2016 at 6:34 AM, Kristof Beyls via
llvm-commits <span dir="ltr"><<a moz-do-not-send="true"
href="mailto:llvm-commits@lists.llvm.org" target="_blank">llvm-commits@lists.llvm.org</a>></span>
wrote:<br>
<div class="gmail_extra">
<div class="gmail_quote">
<blockquote class="gmail_quote" style="margin:0 0 0
.8ex;border-left:1px #ccc solid;padding-left:1ex">kristof.beyls
created this revision.<br>
kristof.beyls added a reviewer: cmatthews.<br>
kristof.beyls added a subscriber: llvm-commits.<br>
<br>
We've found it useful to have a table on the daily report
page which reports for each<br>
machine how many tests were seen on a given day. It has
helped us to quickly notice<br>
when a certain machine started producing fewer
test/benchmark results for some reason.<br>
<br>
AFAIK, there isn't another way to quickly notice if a
particular machine starts producing<br>
fewer-than-expected benchmark results.<br>
</blockquote>
<div><br>
</div>
<div>I imagine this metric would be confused if there was a
day with relatively few commits, no? (in that case you
wouldn't need to run benchmarks because you'd be up to
date)<br>
<br>
Perhaps a better/alternative metric would be "commits per
benchmark run"? (this has the opposite effect sometimes,
of course - if commit rate spikes it'll look like the bot
slowed down - so perhaps some metric about commit rate?)
This would help catch the underlying concern (or what I
assume is the underlying concern) - when data is
insufficiently granular for good analysis.<br>
<br>
(another alternative might be time taken for a report -
this will fluctuate if many new tests are added to the
repository, but would otherwise be independent of commit
rate (either too high, or too low))</div>
</div>
</div>
</div>
</blockquote>
Hi David,<br>
<br>
I probably should've explained the metric better.<br>
The daily report page takes a run for each machine, for each day,
and analyzes those.<br>
If it finds regressions or improvements in the benchmark results,
it'll show those.<br>
It'll also show if a particular benchmark program turned from
passing to failing, or from failing to passing.<br>
However, so far it doesn't indicate whether a machine has suddenly
stopped producing results for a particular benchmark program
altogether.<br>
The intent of this table is to provide a basic sanity check that, on
consecutive days, the machines keep reporting results for the same
number of benchmark programs. When the number of programs in the
test-suite changes, or when you add more proprietary/external
benchmarks, this number will change. But on the vast majority of
days, the number of benchmark programs reported on by a machine
should remain stable.<br>
<br>
We're actively using the daily report page as the first thing to
look at to get an impression of how ToT LLVM has evolved on the
machines and benchmarks we track.<br>
We've been using this table for about half a year downstream, and it
has helped us on a few occasions to quickly notice that a particular
machine had a problem and was no longer producing a full set of
results. There is no other way to detect this easily, which is why
the addition of this table is useful: a small amount of extra
information on the daily report page that makes it easier to get a
quick overview of today's ToT status.<br>
<br>
So, this table/metric isn't intended to indicate how quickly a
machine gives feedback, but rather whether a machine is still
producing a full set of results.<br>
Having a metric for how quickly a machine produces results may also
be useful, but I'm not sure it should be part of the daily report
page.<br>
<br>
Thanks,<br>
<br>
Kristof<br>
</body>
</html>