[LLVMdev] Why is the default LNT aggregation function min instead of mean
tobias at grosser.es
Thu Jan 16 17:03:49 PST 2014
I am currently investigating how to ensure that LNT only shows relevant
performance regressions for the -O3 performance tests I am running.
One question that came up here is why the default aggregation function for
LNT is 'min' instead of 'mean'. This is a little surprising from a
statistical point of view, and, looking at my test results, picking
'min' also seems to be the inferior choice.
For all the test runs I have looked at, picking 'mean' largely reduces the
number of run-over-run changes reported due to noise.
See this run e.g:
If we use the median, we get just one change reported:
If we use 'min', we get eight reports, one of them claiming an over 100%
performance regression for a case that is really just pure noise. I am
planning to look into using better statistical methods. However, as a
start, could we switch the default to 'mean'?
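To illustrate the effect described above, here is a minimal, hypothetical sketch (not LNT code; the sample counts, noise model, and occasional spuriously fast sample are all assumptions) showing why aggregating with 'min' can latch onto a single bogus measurement and report a huge run-over-run change, while 'mean' dampens it:

```python
import random
import statistics

random.seed(0)

def run_samples(n=5, true_time=1.0):
    # Assumed noise model: samples cluster around the true runtime, but
    # occasionally one sample is spuriously fast (e.g. a measurement glitch).
    samples = [true_time + random.gauss(0, 0.01) for _ in range(n)]
    if random.random() < 0.3:
        samples[random.randrange(n)] = true_time * random.uniform(0.3, 0.6)
    return samples

def reported_change(aggregate, run_a, run_b):
    # Run-over-run delta (percent) of the aggregated value.
    a, b = aggregate(run_a), aggregate(run_b)
    return 100.0 * (b - a) / a

# Compare the spurious changes each aggregation reports across many
# pairs of runs of an identical workload.
min_deltas, mean_deltas = [], []
for _ in range(200):
    a, b = run_samples(), run_samples()
    min_deltas.append(abs(reported_change(min, a, b)))
    mean_deltas.append(abs(reported_change(statistics.mean, a, b)))

print("avg |delta| with min: ", statistics.mean(min_deltas))
print("avg |delta| with mean:", statistics.mean(mean_deltas))
```

Under this (assumed) noise model, 'min' jumps to the glitched sample whenever one occurs, so its average reported change is far larger than that of 'mean', which dilutes a single outlier by the sample count.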