[llvm-testresults] lab-mini-01__O3-plain__clang_DEV__x86_64 test results

llvm-testresults at cs.uiuc.edu
Sat Apr 7 02:50:23 PDT 2012


http://llvm.org/perf/db_default/v4/nts/170?compare_to=168&baseline=6
Nickname: lab-mini-01__O3-plain__clang_DEV__x86_64:1
Comparing:
     Run: 170, Order:  154254, Start Time: 2012-04-07 08:54:37, End Time: 2012-04-07 09:50:27
      To: 168, Order:  154249, Start Time: 2012-04-07 04:59:40, End Time: 2012-04-07 05:55:20
Baseline: 6, Order:  153879, Start Time: 2012-04-02 17:48:39, End Time: 2012-04-02 18:44:53

===============
Tests Summary
===============

Performance Regressions: 0 (8 on baseline)
Performance Improvements: 1 (31 on baseline)
Unchanged Tests: 887 (849 on baseline)
Total Tests: 888

===========================
Run-Over-Run Changes Detail
===========================
Performance Improvements - Execution Time
-----------------------------------------
  SingleSource/Benchmarks/Dhrystone/fldry: -1.21% (1.9549 => 1.9312, std. dev.: 0.0076)
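(For readers checking the numbers by hand: each delta above is the plain relative change between the two reported timings, `(new - old) / old * 100`. A minimal sketch of that arithmetic, using values copied from this report — this is an illustration, not the LNT report generator itself:)

```python
def pct_change(old, new):
    """Relative change in percent: (new - old) / old * 100."""
    return (new - old) / old * 100.0

# Values taken from the lines in this report.
print(round(pct_change(1.9549, 1.9312), 2))  # fldry, run-over-run: -1.21
print(round(pct_change(1.1560, 1.2505), 2))  # richards_benchmark vs baseline: 8.17
```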

================================
Run-Over-Baseline Changes Detail
================================
Performance Regressions - Execution Time
----------------------------------------
  SingleSource/Benchmarks/Misc/richards_benchmark: 8.17% (1.1560 => 1.2505, std. dev.: 0.0022)
  MultiSource/Applications/hexxagon/hexxagon: 4.92% (12.0921 => 12.6872, std. dev.: 0.0007)
  MultiSource/Benchmarks/Trimaran/enc-pc1/enc-pc1: 4.48% (0.9692 => 1.0126, std. dev.: 0.0007)
  SingleSource/Benchmarks/Misc/salsa20: 3.70% (7.4271 => 7.7020, std. dev.: 0.0051)
  MultiSource/Benchmarks/ASC_Sequoia/IRSmk/IRSmk: 2.48% (8.8259 => 9.0448, std. dev.: 0.0198)
  MultiSource/Applications/lua/lua: 1.95% (26.5159 => 27.0322, std. dev.: 0.0148)
  MultiSource/Applications/lemon/lemon: 1.92% (1.4298 => 1.4572, std. dev.: 0.0011)
  MultiSource/Benchmarks/Ptrdist/ks/ks: 1.29% (2.0152 => 2.0411, std. dev.: 0.0000)

Performance Improvements - Compile Time
---------------------------------------
  MultiSource/Benchmarks/Trimaran/enc-md5/enc-md5: -6.25% (0.2670 => 0.2503, std. dev.: 0.0004)
  MultiSource/Applications/lua/lua: -2.66% (3.7467 => 3.6472, std. dev.: 0.0032)
  MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: -2.18% (0.8818 => 0.8626, std. dev.: 0.0010)
  MultiSource/Applications/lemon/lemon: -1.57% (1.0313 => 1.0151, std. dev.: 0.0010)
  MultiSource/Applications/lambda-0_1_3/lambda: -1.42% (0.7038 => 0.6938, std. dev.: 0.0012)
  MultiSource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: -1.32% (13.2842 => 13.1091, std. dev.: 0.0035)
  MultiSource/Applications/JM/ldecod/ldecod: -1.28% (6.5754 => 6.4912, std. dev.: 0.0020)
  MultiSource/Benchmarks/FreeBench/pifft/pifft: -1.27% (0.9108 => 0.8992, std. dev.: 0.0000)
  MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: -1.27% (4.4433 => 4.3868, std. dev.: 0.0023)
  MultiSource/Applications/SIBsim4/SIBsim4: -1.26% (1.2749 => 1.2589, std. dev.: 0.0009)
  MultiSource/Benchmarks/mafft/pairlocalalign: -1.24% (8.4961 => 8.3911, std. dev.: 0.0055)
  MultiSource/Benchmarks/nbench/nbench: -1.19% (1.0594 => 1.0468, std. dev.: 0.0013)
  MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -1.15% (6.2785 => 6.2063, std. dev.: 0.0035)
  MultiSource/Benchmarks/mediabench/gsm/toast/toast: -1.13% (1.3813 => 1.3657, std. dev.: 0.0012)
  MultiSource/Benchmarks/Prolangs-C/TimberWolfMC/timberwolfmc: -1.11% (7.2180 => 7.1376, std. dev.: 0.0059)
  MultiSource/Applications/JM/lencod/lencod: -1.11% (14.8535 => 14.6884, std. dev.: 0.0073)
  MultiSource/Applications/ClamAV/clamscan: -1.09% (14.5298 => 14.3707, std. dev.: 0.0216)
  MultiSource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: -1.09% (1.1364 => 1.1240, std. dev.: 0.0005)
  MultiSource/Benchmarks/Prolangs-C/bison/mybison: -1.07% (1.3245 => 1.3103, std. dev.: 0.0022)
  SingleSource/Benchmarks/Adobe-C++/simple_types_loop_invariant: -1.07% (1.5188 => 1.5026, std. dev.: 0.0013)
  MultiSource/Benchmarks/MiBench/consumer-lame/consumer-lame: -1.07% (3.6692 => 3.6301, std. dev.: 0.0062)
  MultiSource/Benchmarks/Prolangs-C/agrep/agrep: -1.05% (1.2193 => 1.2065, std. dev.: 0.0026)
  SingleSource/Benchmarks/Adobe-C++/loop_unroll: -1.05% (3.2742 => 3.2399, std. dev.: 0.0017)
  MultiSource/Benchmarks/MallocBench/gs/gs: -1.04% (3.8646 => 3.8246, std. dev.: 0.0083)
  MultiSource/Applications/treecc/treecc: -1.02% (1.9270 => 1.9074, std. dev.: 0.0015)
  MultiSource/Benchmarks/MiBench/office-ispell/office-ispell: -1.00% (1.4563 => 1.4417, std. dev.: 0.0024)
  MultiSource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: -1.00% (4.3801 => 4.3362, std. dev.: 0.0051)

Performance Improvements - Execution Time
-----------------------------------------
  SingleSource/Benchmarks/BenchmarkGame/spectral-norm: -6.20% (1.2870 => 1.2072, std. dev.: 0.0000)
  MultiSource/Benchmarks/Ptrdist/yacr2/yacr2: -3.42% (1.0460 => 1.0102, std. dev.: 0.0000)
  SingleSource/Benchmarks/Misc/fp-convert: -1.68% (3.4016 => 3.3446, std. dev.: 0.0000)
  MultiSource/Benchmarks/Bullet/bullet: -1.67% (7.3305 => 7.2081, std. dev.: 0.0015)

Report Time: 1.40s

