[llvm-testresults] lab-mini-01__O3-plain__clang_DEV__x86_64 test results

llvm-testresults at cs.uiuc.edu
Fri Apr 27 05:43:34 PDT 2012


http://llvm.org/perf/db_default/v4/nts/433?compare_to=432&baseline=6
Nickname: lab-mini-01__O3-plain__clang_DEV__x86_64:1
Comparing:
     Run: 433, Order: 155703, Start Time: 2012-04-27 11:48:16, End Time: 2012-04-27 12:44:02
      To: 432, Order: 155700, Start Time: 2012-04-27 10:27:09, End Time: 2012-04-27 11:22:50
Baseline: 6, Order: 153879, Start Time: 2012-04-02 17:48:39, End Time: 2012-04-02 18:44:53
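
For reference, the comparison link at the top of this report is built from three run IDs: the run under test (433), the run it is compared against (432), and the baseline run (6). Below is a minimal sketch of how such a URL can be assembled; only the URL layout is taken from the link above, and the helper name and server base are assumptions for illustration.

  # Illustrative only: rebuilds the comparison URL shown above from its
  # three run IDs. The query parameter names come straight from the link
  # in this report; the function name is made up.
  def lnt_compare_url(run_id, compare_to, baseline,
                      base="http://llvm.org/perf/db_default/v4/nts"):
      return "%s/%d?compare_to=%d&baseline=%d" % (base, run_id, compare_to, baseline)

  print(lnt_compare_url(433, 432, 6))
  # -> http://llvm.org/perf/db_default/v4/nts/433?compare_to=432&baseline=6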

===============
Tests Summary
===============

Performance Regressions: 0 (33 on baseline)
Performance Improvements: 1 (26 on baseline)
Added Tests: 0 (2 on baseline)
Unchanged Tests: 889 (829 on baseline)
Total Tests: 890

===========================
Run-Over-Run Changes Detail
===========================
Performance Improvements - Compile Time
---------------------------------------
  SingleSource/UnitTests/ObjC/block-byref-aggr: -11.35% (0.4272 => 0.3787, std. dev.: 0.0165)
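
Each entry in this report gives the relative change between the old and new sample values, e.g. 0.4272 => 0.3787 works out to (0.3787 - 0.4272) / 0.4272, about -11.35%. A small sketch of that arithmetic follows; the delta-over-old formula is inferred from the reported numbers, not taken from LNT's source.

  # Percent-change arithmetic implied by the figures in this report
  # (assumed formula: delta relative to the old value).
  def pct_change(old, new):
      return (new - old) / old * 100.0

  print("%.2f%%" % pct_change(0.4272, 0.3787))  # -11.35%  (block-byref-aggr, compile time)
  print("%.2f%%" % pct_change(1.2193, 1.2462))  #   2.21%  (agrep, run-over-baseline compile time)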

================================
Run-Over-Baseline Changes Detail
================================
Performance Regressions - Compile Time
--------------------------------------
  MultiSource/Benchmarks/Prolangs-C/agrep/agrep: 2.21% (1.2193 => 1.2462, std. dev.: 0.0006)
  MultiSource/Benchmarks/Prolangs-C/assembler/assembler: 2.07% (0.5265 => 0.5374, std. dev.: 0.0011)
  MultiSource/Benchmarks/Prolangs-C/football/football: 1.87% (0.8310 => 0.8465, std. dev.: 0.0003)
  MultiSource/Benchmarks/Prolangs-C/unix-tbl/unix-tbl: 1.60% (0.9199 => 0.9346, std. dev.: 0.0003)
  MultiSource/Benchmarks/MiBench/office-ispell/office-ispell: 1.39% (1.4563 => 1.4766, std. dev.: 0.0018)
  MultiSource/Applications/hexxagon/hexxagon: 1.35% (1.2292 => 1.2458, std. dev.: 0.0043)
  MultiSource/Applications/spiff/spiff: 1.35% (0.7506 => 0.7607, std. dev.: 0.0014)
  MultiSource/Benchmarks/mediabench/gsm/toast/toast: 1.34% (1.3813 => 1.3998, std. dev.: 0.0009)
  MultiSource/Benchmarks/MiBench/telecomm-gsm/telecomm-gsm: 1.28% (1.3819 => 1.3996, std. dev.: 0.0014)
  MultiSource/Applications/treecc/treecc: 1.24% (1.9270 => 1.9508, std. dev.: 0.0008)
  MultiSource/Benchmarks/Bullet/bullet: 1.22% (33.2944 => 33.7012, std. dev.: 0.0107)
  MultiSource/Benchmarks/Prolangs-C/bison/mybison: 1.16% (1.3245 => 1.3399, std. dev.: 0.0012)
  MultiSource/Applications/Burg/burg: 1.16% (1.1061 => 1.1189, std. dev.: 0.0003)
  MultiSource/Benchmarks/MallocBench/gs/gs: 1.15% (3.8646 => 3.9092, std. dev.: 0.0012)
  MultiSource/Applications/SPASS/SPASS: 1.14% (13.8202 => 13.9778, std. dev.: 0.0054)
  MultiSource/Benchmarks/Prolangs-C++/city/city: 1.10% (1.9828 => 2.0047, std. dev.: 0.0066)
  SingleSource/Benchmarks/Adobe-C++/simple_types_constant_folding: 1.10% (2.1007 => 2.1239, std. dev.: 0.0021)
  MultiSource/Benchmarks/nbench/nbench: 1.09% (1.0594 => 1.0709, std. dev.: 0.0004)
  MultiSource/Applications/hbd/hbd: 1.07% (1.3693 => 1.3840, std. dev.: 0.0024)

Performance Regressions - Execution Time
----------------------------------------
  SingleSource/Benchmarks/Misc/fbench: 24.45% (1.8175 => 2.2618, std. dev.: 0.0310)
  SingleSource/Benchmarks/Adobe-C++/stepanov_vector: 6.51% (2.5629 => 2.7298, std. dev.: 0.0000)
  MultiSource/Benchmarks/Fhourstones-3_1/fhourstones3_1: 5.88% (1.5824 => 1.6754, std. dev.: 0.0011)
  SingleSource/Benchmarks/Adobe-C++/stepanov_abstraction: 4.79% (4.5947 => 4.8146, std. dev.: 0.0001)
  MultiSource/Benchmarks/Fhourstones/fhourstones: 3.73% (1.7688 => 1.8347, std. dev.: 0.0007)
  SingleSource/Benchmarks/Misc/salsa20: 3.65% (7.4271 => 7.6979, std. dev.: 0.0062)
  MultiSource/Benchmarks/Trimaran/enc-pc1/enc-pc1: 3.25% (0.9692 => 1.0007, std. dev.: 0.0002)
  MultiSource/Applications/hexxagon/hexxagon: 3.00% (12.0921 => 12.4546, std. dev.: 0.0006)
  MultiSource/Benchmarks/ASC_Sequoia/IRSmk/IRSmk: 2.90% (8.8259 => 9.0822, std. dev.: 0.0066)
  MultiSource/Benchmarks/ASC_Sequoia/CrystalMk/CrystalMk: 2.53% (7.9275 => 8.1284, std. dev.: 0.0094)
  SingleSource/Benchmarks/Misc/ffbench: 2.20% (0.8273 => 0.8455, std. dev.: 0.0002)
  SingleSource/Benchmarks/BenchmarkGame/recursive: 1.30% (2.0200 => 2.0462, std. dev.: 0.0001)
  MultiSource/Benchmarks/sim/sim: 1.25% (5.0168 => 5.0797, std. dev.: 0.0135)
  SingleSource/Benchmarks/Shootout/sieve: 1.17% (5.1531 => 5.2136, std. dev.: 0.0001)
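
The standard deviation printed with each entry gives a rough sense of measurement noise; the fbench regression above, for instance, moved by roughly 0.44 s against a std. dev. of 0.031 s, far outside run-to-run jitter. Below is a sketch of a simple noise filter along those lines; the 3-sigma threshold is an arbitrary illustration, not the significance test LNT itself applies.

  # Hypothetical noise filter: keep only changes whose absolute delta
  # exceeds some multiple of the reported standard deviation. The 3x
  # multiplier is an assumption for illustration.
  def looks_significant(old, new, stddev, sigmas=3.0):
      return abs(new - old) > sigmas * stddev

  print(looks_significant(1.8175, 2.2618, 0.0310))  # True  (Misc/fbench)
  print(looks_significant(5.0168, 5.0797, 0.0135))  # True  (sim/sim, smaller margin)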

Performance Improvements - Compile Time
---------------------------------------
  SingleSource/UnitTests/ObjC/block-byref-aggr: -18.68% (0.4657 => 0.3787, std. dev.: 0.0165)
  MultiSource/Benchmarks/Trimaran/enc-md5/enc-md5: -6.22% (0.2670 => 0.2504, std. dev.: 0.0002)
  MultiSource/Benchmarks/Trimaran/enc-3des/enc-3des: -5.61% (0.3511 => 0.3314, std. dev.: 0.0000)
  MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: -2.13% (0.8818 => 0.8630, std. dev.: 0.0036)
  MultiSource/Applications/lua/lua: -1.08% (3.7467 => 3.7064, std. dev.: 0.0011)

Performance Improvements - Execution Time
-----------------------------------------
  SingleSource/Benchmarks/Misc/richards_benchmark: -12.89% (1.1560 => 1.0070, std. dev.: 0.0039)
  SingleSource/Benchmarks/McGill/chomp: -10.46% (1.2308 => 1.1021, std. dev.: 0.0059)
  MultiSource/Benchmarks/Ptrdist/yacr2/yacr2: -10.12% (1.0460 => 0.9401, std. dev.: 0.0000)
  MultiSource/Benchmarks/Olden/perimeter/perimeter: -6.59% (0.3139 => 0.2932, std. dev.: 0.0001)
  SingleSource/Benchmarks/BenchmarkGame/nsieve-bits: -4.97% (1.3780 => 1.3095, std. dev.: 0.0037)
  SingleSource/Benchmarks/Misc/perlin: -4.84% (6.2497 => 5.9472, std. dev.: 0.0001)
  MultiSource/Benchmarks/BitBench/drop3/drop3: -4.67% (0.4366 => 0.4162, std. dev.: 0.0000)
  MultiSource/Benchmarks/FreeBench/neural/neural: -4.33% (0.2333 => 0.2232, std. dev.: 0.0005)
  MultiSource/Benchmarks/FreeBench/fourinarow/fourinarow: -3.70% (0.3458 => 0.3330, std. dev.: 0.0001)
  MultiSource/Benchmarks/Olden/bh/bh: -3.66% (1.9762 => 1.9038, std. dev.: 0.0012)
  SingleSource/Benchmarks/Shootout/lists: -2.08% (6.3560 => 6.2239, std. dev.: 0.0089)
  MultiSource/Applications/minisat/minisat: -2.07% (9.3358 => 9.1424, std. dev.: 0.0054)
  MultiSource/Benchmarks/Bullet/bullet: -2.05% (7.3305 => 7.1802, std. dev.: 0.0039)
  MultiSource/Benchmarks/VersaBench/8b10b/8b10b: -1.80% (6.6320 => 6.5126, std. dev.: 0.0055)
  SingleSource/Benchmarks/Misc/fp-convert: -1.68% (3.4016 => 3.3445, std. dev.: 0.0000)
  SingleSource/Benchmarks/Misc/ReedSolomon: -1.67% (7.1097 => 6.9912, std. dev.: 0.0003)
  SingleSource/Benchmarks/CoyoteBench/huffbench: -1.53% (20.3410 => 20.0290, std. dev.: 0.0713)
  MultiSource/Applications/lua/lua: -1.27% (26.5159 => 26.1788, std. dev.: 0.0308)
  SingleSource/Benchmarks/Shootout/heapsort: -1.26% (3.9173 => 3.8680, std. dev.: 0.0132)
  MultiSource/Applications/spiff/spiff: -1.24% (3.0263 => 2.9889, std. dev.: 0.0022)
  MultiSource/Benchmarks/nbench/nbench: -1.01% (11.2619 => 11.1482, std. dev.: 0.0023)

Added Tests - Compile Time
--------------------------
  SingleSource/Regression/C/compare

Added Tests - Execution Time
----------------------------
  SingleSource/Regression/C/compare

Report Time: 1.09s