[llvm-testresults] lab-mini-01__O3-plain__clang_DEV__x86_64 test results

llvm-testresults at cs.uiuc.edu
Sun Apr 8 02:50:59 PDT 2012


http://llvm.org/perf/db_default/v4/nts/177?compare_to=176&baseline=6
Nickname: lab-mini-01__O3-plain__clang_DEV__x86_64:1
Comparing:
     Run: 177, Order:  154278, Start Time: 2012-04-08 08:55:08, End Time: 2012-04-08 09:51:03
      To: 176, Order:  154273, Start Time: 2012-04-08 00:59:54, End Time: 2012-04-08 01:55:35
Baseline: 6, Order:  153879, Start Time: 2012-04-02 17:48:39, End Time: 2012-04-02 18:44:53

===============
Tests Summary
===============

Performance Regressions: 0 (8 on baseline)
Performance Improvements: 1 (27 on baseline)
Unchanged Tests: 887 (853 on baseline)
Total Tests: 888

===========================
Run-Over-Run Changes Detail
===========================
Performance Improvements - Execution Time
-----------------------------------------
  MultiSource/Benchmarks/llubenchmark/llu: -2.52% (3.8687 => 3.7713, std. dev.: 0.0188)
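Each delta in the report is the relative change of the current run's sample against the comparison run, expressed in percent (negative means faster). A minimal sketch, using the llubenchmark row above as input; the `pct_change` helper name is illustrative, not LNT's actual code:

```python
def pct_change(old: float, new: float) -> float:
    """Relative change of `new` vs. `old`, in percent (negative = improvement)."""
    return (new - old) / old * 100.0

# MultiSource/Benchmarks/llubenchmark/llu: 3.8687 => 3.7713
delta = pct_change(3.8687, 3.7713)
print(f"{delta:.2f}%")  # -2.52%
```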

================================
Run-Over-Baseline Changes Detail
================================
Performance Regressions - Execution Time
----------------------------------------
  SingleSource/Benchmarks/Misc/richards_benchmark: 7.76% (1.1560 => 1.2457, std. dev.: 0.0068)
  MultiSource/Applications/hexxagon/hexxagon: 4.92% (12.0921 => 12.6875, std. dev.: 0.0017)
  MultiSource/Benchmarks/Trimaran/enc-pc1/enc-pc1: 4.41% (0.9692 => 1.0119, std. dev.: 0.0009)
  SingleSource/Benchmarks/Misc/salsa20: 3.66% (7.4271 => 7.6989, std. dev.: 0.0077)
  MultiSource/Benchmarks/ASC_Sequoia/IRSmk/IRSmk: 2.84% (8.8259 => 9.0762, std. dev.: 0.0061)
  MultiSource/Applications/lua/lua: 2.00% (26.5159 => 27.0450, std. dev.: 0.0169)
  MultiSource/Applications/lemon/lemon: 1.99% (1.4298 => 1.4582, std. dev.: 0.0010)
  MultiSource/Benchmarks/Ptrdist/ks/ks: 1.28% (2.0152 => 2.0409, std. dev.: 0.0001)

Performance Improvements - Compile Time
---------------------------------------
  MultiSource/Benchmarks/Trimaran/enc-md5/enc-md5: -6.44% (0.2670 => 0.2498, std. dev.: 0.0002)
  MultiSource/Applications/lua/lua: -2.43% (3.7467 => 3.6556, std. dev.: 0.0038)
  MultiSource/Benchmarks/Prolangs-C/unix-smail/unix-smail: -2.19% (0.5577 => 0.5455, std. dev.: 0.0025)
  MultiSource/Benchmarks/PAQ8p/paq8p: -1.61% (2.0176 => 1.9852, std. dev.: 0.0050)
  MultiSource/Applications/lemon/lemon: -1.58% (1.0313 => 1.0150, std. dev.: 0.0009)
  MultiSource/Benchmarks/MiBench/telecomm-gsm/telecomm-gsm: -1.34% (1.3819 => 1.3634, std. dev.: 0.0049)
  MultiSource/Benchmarks/mediabench/gsm/toast/toast: -1.33% (1.3813 => 1.3629, std. dev.: 0.0003)
  MultiSource/Applications/hexxagon/hexxagon: -1.33% (1.2292 => 1.2129, std. dev.: 0.0003)
  MultiSource/Applications/JM/ldecod/ldecod: -1.31% (6.5754 => 6.4892, std. dev.: 0.0063)
  MultiSource/Benchmarks/FreeBench/pifft/pifft: -1.19% (0.9108 => 0.9000, std. dev.: 0.0008)
  MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: -1.18% (4.4433 => 4.3909, std. dev.: 0.0082)
  MultiSource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: -1.17% (13.2842 => 13.1285, std. dev.: 0.0152)
  MultiSource/Applications/JM/lencod/lencod: -1.16% (14.8535 => 14.6815, std. dev.: 0.0142)
  MultiSource/Benchmarks/Prolangs-C/TimberWolfMC/timberwolfmc: -1.16% (7.2180 => 7.1346, std. dev.: 0.0074)
  MultiSource/Benchmarks/mafft/pairlocalalign: -1.15% (8.4961 => 8.3980, std. dev.: 0.0056)
  MultiSource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: -1.14% (1.1364 => 1.1235, std. dev.: 0.0010)
  MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -1.13% (6.2785 => 6.2073, std. dev.: 0.0016)
  MultiSource/Benchmarks/nbench/nbench: -1.12% (1.0594 => 1.0475, std. dev.: 0.0021)
  MultiSource/Benchmarks/MallocBench/espresso/espresso: -1.09% (3.9660 => 3.9226, std. dev.: 0.0052)
  MultiSource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: -1.09% (4.3801 => 4.3324, std. dev.: 0.0023)
  MultiSource/Benchmarks/Prolangs-C++/city/city: -1.06% (1.9828 => 1.9618, std. dev.: 0.0071)
  MultiSource/Applications/SIBsim4/SIBsim4: -1.04% (1.2749 => 1.2616, std. dev.: 0.0013)
  MultiSource/Benchmarks/Prolangs-C/bison/mybison: -1.03% (1.3245 => 1.3108, std. dev.: 0.0017)

Performance Improvements - Execution Time
-----------------------------------------
  SingleSource/Benchmarks/BenchmarkGame/spectral-norm: -6.20% (1.2870 => 1.2072, std. dev.: 0.0000)
  MultiSource/Benchmarks/Ptrdist/yacr2/yacr2: -3.42% (1.0460 => 1.0102, std. dev.: 0.0000)
  SingleSource/Benchmarks/Misc/fp-convert: -1.68% (3.4016 => 3.3445, std. dev.: 0.0000)
  MultiSource/Benchmarks/Bullet/bullet: -1.65% (7.3305 => 7.2096, std. dev.: 0.0087)

Report Time: 1.59s