[llvm-testresults] lab-mini-01__O3-plain__clang_DEV__x86_64 test results

llvm-testresults at cs.uiuc.edu
Fri Apr 6 22:55:17 PDT 2012


http://llvm.org/perf/db_default/v4/nts/168?compare_to=167&baseline=6
Nickname: lab-mini-01__O3-plain__clang_DEV__x86_64:1
Comparing:
     Run: 168, Order:  154249, Start Time: 2012-04-07 04:59:40, End Time: 2012-04-07 05:55:20
      To: 167, Order:  154248, Start Time: 2012-04-07 03:38:52, End Time: 2012-04-07 04:34:30
Baseline: 6, Order:  153879, Start Time: 2012-04-02 17:48:39, End Time: 2012-04-02 18:44:53

===============
Tests Summary
===============

Performance Regressions: 1 (9 on baseline)
Performance Improvements: 0 (33 on baseline)
Unchanged Tests: 887 (846 on baseline)
Total Tests: 888

===========================
Run-Over-Run Changes Detail
===========================
Performance Regressions - Execution Time
----------------------------------------
  SingleSource/Benchmarks/Dhrystone/fldry: 1.27% (1.9303 => 1.9549, std. dev.: 0.0016)
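
(For reference, the percentage in each entry is simply the relative change of the current sample over the previous one; the values in parentheses are the previous and current means. A minimal sketch of that arithmetic follows; the function and variable names are illustrative, not LNT's own code.)

    # Sketch only: reproduce a delta such as "1.27% (1.9303 => 1.9549)"
    # from the reported previous and current execution times.
    def percent_change(previous: float, current: float) -> float:
        """Relative change of the current sample over the previous one, in percent."""
        return (current - previous) / previous * 100.0

    prev, curr = 1.9303, 1.9549  # fldry execution times from runs 167 and 168
    print(f"{percent_change(prev, curr):.2f}%")  # -> 1.27%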

================================
Run-Over-Baseline Changes Detail
================================
Performance Regressions - Execution Time
----------------------------------------
  SingleSource/Benchmarks/Misc/richards_benchmark: 7.91% (1.1560 => 1.2474, std. dev.: 0.0022)
  MultiSource/Applications/hexxagon/hexxagon: 4.93% (12.0921 => 12.6878, std. dev.: 0.0007)
  MultiSource/Benchmarks/Trimaran/enc-pc1/enc-pc1: 4.36% (0.9692 => 1.0115, std. dev.: 0.0009)
  SingleSource/Benchmarks/Misc/salsa20: 3.69% (7.4271 => 7.7013, std. dev.: 0.0061)
  MultiSource/Benchmarks/ASC_Sequoia/IRSmk/IRSmk: 2.86% (8.8259 => 9.0782, std. dev.: 0.0071)
  MultiSource/Applications/lua/lua: 2.07% (26.5159 => 27.0638, std. dev.: 0.0269)
  MultiSource/Applications/lemon/lemon: 1.96% (1.4298 => 1.4578, std. dev.: 0.0062)
  SingleSource/Benchmarks/Dhrystone/fldry: 1.34% (1.9290 => 1.9549, std. dev.: 0.0016)
  MultiSource/Benchmarks/Ptrdist/ks/ks: 1.28% (2.0152 => 2.0409, std. dev.: 0.0001)

Performance Improvements - Compile Time
---------------------------------------
  MultiSource/Benchmarks/Trimaran/enc-md5/enc-md5: -6.18% (0.2670 => 0.2505, std. dev.: 0.0002)
  MultiSource/Applications/lua/lua: -2.64% (3.7467 => 3.6477, std. dev.: 0.0016)
  MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: -2.21% (0.8818 => 0.8623, std. dev.: 0.0001)
  MultiSource/Benchmarks/PAQ8p/paq8p: -1.64% (2.0176 => 1.9845, std. dev.: 0.0005)
  MultiSource/Applications/lemon/lemon: -1.50% (1.0313 => 1.0158, std. dev.: 0.0010)
  MultiSource/Applications/lambda-0_1_3/lambda: -1.45% (0.7038 => 0.6936, std. dev.: 0.0006)
  MultiSource/Applications/SIBsim4/SIBsim4: -1.32% (1.2749 => 1.2581, std. dev.: 0.0004)
  MultiSource/Benchmarks/FreeBench/pifft/pifft: -1.32% (0.9108 => 0.8988, std. dev.: 0.0046)
  MultiSource/Applications/JM/ldecod/ldecod: -1.30% (6.5754 => 6.4898, std. dev.: 0.0054)
  MultiSource/Benchmarks/MiBench/consumer-lame/consumer-lame: -1.26% (3.6692 => 3.6228, std. dev.: 0.0080)
  MultiSource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: -1.26% (13.2842 => 13.1171, std. dev.: 0.0203)
  MultiSource/Applications/JM/lencod/lencod: -1.21% (14.8535 => 14.6735, std. dev.: 0.0120)
  MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: -1.19% (4.4433 => 4.3904, std. dev.: 0.0006)
  MultiSource/Benchmarks/mafft/pairlocalalign: -1.17% (8.4961 => 8.3964, std. dev.: 0.0033)
  MultiSource/Benchmarks/nbench/nbench: -1.16% (1.0594 => 1.0471, std. dev.: 0.0004)
  MultiSource/Benchmarks/MiBench/telecomm-gsm/telecomm-gsm: -1.16% (1.3819 => 1.3659, std. dev.: 0.0004)
  MultiSource/Benchmarks/MallocBench/espresso/espresso: -1.16% (3.9660 => 3.9201, std. dev.: 0.0082)
  MultiSource/Benchmarks/mediabench/gsm/toast/toast: -1.14% (1.3813 => 1.3656, std. dev.: 0.0005)
  MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -1.13% (6.2785 => 6.2076, std. dev.: 0.0052)
  MultiSource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: -1.13% (4.3801 => 4.3307, std. dev.: 0.0007)
  MultiSource/Benchmarks/Prolangs-C/bison/mybison: -1.10% (1.3245 => 1.3099, std. dev.: 0.0006)
  SingleSource/Benchmarks/Adobe-C++/simple_types_loop_invariant: -1.10% (1.5188 => 1.5021, std. dev.: 0.0006)
  MultiSource/Applications/d/make_dparser: -1.10% (3.1889 => 3.1539, std. dev.: 0.0028)
  MultiSource/Benchmarks/Prolangs-C/TimberWolfMC/timberwolfmc: -1.08% (7.2180 => 7.1398, std. dev.: 0.0035)
  MultiSource/Applications/ClamAV/clamscan: -1.06% (14.5298 => 14.3758, std. dev.: 0.0116)
  MultiSource/Benchmarks/Prolangs-C/agrep/agrep: -1.04% (1.2193 => 1.2066, std. dev.: 0.0004)
  SingleSource/Benchmarks/Adobe-C++/loop_unroll: -1.03% (3.2742 => 3.2406, std. dev.: 0.0011)
  MultiSource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: -1.02% (1.1364 => 1.1248, std. dev.: 0.0006)
  MultiSource/Applications/hbd/hbd: -1.00% (1.3693 => 1.3556, std. dev.: 0.0017)

Performance Improvements - Execution Time
-----------------------------------------
  SingleSource/Benchmarks/BenchmarkGame/spectral-norm: -6.20% (1.2870 => 1.2072, std. dev.: 0.0000)
  MultiSource/Benchmarks/Ptrdist/yacr2/yacr2: -3.41% (1.0460 => 1.0103, std. dev.: 0.0000)
  MultiSource/Benchmarks/Bullet/bullet: -1.70% (7.3305 => 7.2059, std. dev.: 0.0236)
  SingleSource/Benchmarks/Misc/fp-convert: -1.68% (3.4016 => 3.3445, std. dev.: 0.0000)

Report Time: 1.92s