[llvm-testresults] lab-mini-01__O3-plain__clang_DEV__x86_64 test results

llvm-testresults at cs.uiuc.edu
Tue Apr 24 17:41:10 PDT 2012


http://llvm.org/perf/db_default/v4/nts/401?compare_to=400&baseline=6
Nickname: lab-mini-01__O3-plain__clang_DEV__x86_64:1
Comparing:
     Run: 401, Order: 155502, Start Time: 2012-04-24 23:45:43, End Time: 2012-04-25 00:41:34
      To: 400, Order: 155492, Start Time: 2012-04-24 22:24:20, End Time: 2012-04-24 23:20:17
Baseline: 6, Order: 153879, Start Time: 2012-04-02 17:48:39, End Time: 2012-04-02 18:44:53

===============
Tests Summary
===============

Performance Regressions: 1 (30 on baseline)
Performance Improvements: 2 (25 on baseline)
Added Tests: 2 (2 on baseline)
Unchanged Tests: 885 (833 on baseline)
Total Tests: 890

===========================
Run-Over-Run Changes Detail
===========================
Performance Regressions - Execution Time
----------------------------------------
  SingleSource/Benchmarks/Misc/fbench: 24.15% (1.8218 => 2.2618, std. dev.: 0.0365)

Performance Improvements - Execution Time
-----------------------------------------
  SingleSource/Benchmarks/Misc/perlin: -6.73% (6.3761 => 5.9473, std. dev.: 0.0000)
  MultiSource/Benchmarks/FreeBench/neural/neural: -4.33% (0.2333 => 0.2232, std. dev.: 0.0000)

Added Tests - Compile Time
--------------------------
  SingleSource/Regression/C/compare

Added Tests - Execution Time
----------------------------
  SingleSource/Regression/C/compare
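
For reference, the percentage figures above are simply the relative change between the two timing samples shown in parentheses (previous run => current run). A minimal Python sketch reproducing two of the run-over-run figures above (the helper name pct_delta is illustrative, not code taken from LNT itself):

  def pct_delta(old, new):
      # Relative change from the previous sample to the current one,
      # expressed as a percentage, matching the report lines above.
      return (new - old) / old * 100.0

  # fbench regression: 1.8218 => 2.2618
  print("%.2f%%" % pct_delta(1.8218, 2.2618))  # 24.15%
  # perlin improvement: 6.3761 => 5.9473
  print("%.2f%%" % pct_delta(6.3761, 5.9473))  # -6.73%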

================================
Run-Over-Baseline Changes Detail
================================
Performance Regressions - Compile Time
--------------------------------------
  MultiSource/Benchmarks/Prolangs-C/agrep/agrep: 2.04% (1.2193 => 1.2442, std. dev.: 0.0029)
  MultiSource/Benchmarks/Prolangs-C/assembler/assembler: 2.03% (0.5265 => 0.5372, std. dev.: 0.0021)
  MultiSource/Benchmarks/Prolangs-C/football/football: 1.76% (0.8310 => 0.8456, std. dev.: 0.0052)
  MultiSource/Benchmarks/MallocBench/cfrac/cfrac: 1.73% (0.7460 => 0.7589, std. dev.: 0.0046)
  MultiSource/Benchmarks/Prolangs-C/unix-tbl/unix-tbl: 1.54% (0.9199 => 0.9341, std. dev.: 0.0004)
  MultiSource/Benchmarks/mediabench/gsm/toast/toast: 1.29% (1.3813 => 1.3991, std. dev.: 0.0009)
  MultiSource/Benchmarks/MiBench/telecomm-gsm/telecomm-gsm: 1.29% (1.3819 => 1.3997, std. dev.: 0.0042)
  MultiSource/Benchmarks/MiBench/office-ispell/office-ispell: 1.27% (1.4563 => 1.4748, std. dev.: 0.0014)
  SingleSource/Benchmarks/Adobe-C++/loop_unroll: 1.19% (3.2742 => 3.3131, std. dev.: 0.0009)
  MultiSource/Applications/kimwitu++/kc: 1.18% (16.6129 => 16.8093, std. dev.: 0.0045)
  MultiSource/Benchmarks/Prolangs-C/bison/mybison: 1.13% (1.3245 => 1.3395, std. dev.: 0.0008)
  MultiSource/Applications/treecc/treecc: 1.12% (1.9270 => 1.9485, std. dev.: 0.0017)
  MultiSource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: 1.07% (1.1364 => 1.1486, std. dev.: 0.0009)
  MultiSource/Applications/Burg/burg: 1.07% (1.1061 => 1.1179, std. dev.: 0.0006)
  MultiSource/Benchmarks/MallocBench/gs/gs: 1.06% (3.8646 => 3.9056, std. dev.: 0.0049)
  MultiSource/Applications/SPASS/SPASS: 1.04% (13.8202 => 13.9645, std. dev.: 0.0017)

Performance Regressions - Execution Time
----------------------------------------
  SingleSource/Benchmarks/Misc/fbench: 24.45% (1.8175 => 2.2618, std. dev.: 0.0365)
  SingleSource/Benchmarks/Adobe-C++/stepanov_vector: 6.51% (2.5629 => 2.7298, std. dev.: 0.0000)
  MultiSource/Benchmarks/Fhourstones-3_1/fhourstones3_1: 5.87% (1.5824 => 1.6753, std. dev.: 0.0004)
  SingleSource/Benchmarks/Adobe-C++/stepanov_abstraction: 4.78% (4.5947 => 4.8145, std. dev.: 0.0001)
  SingleSource/Benchmarks/Misc/salsa20: 3.76% (7.4271 => 7.7064, std. dev.: 0.0019)
  MultiSource/Benchmarks/Fhourstones/fhourstones: 3.62% (1.7688 => 1.8329, std. dev.: 0.0009)
  MultiSource/Benchmarks/Trimaran/enc-pc1/enc-pc1: 3.27% (0.9692 => 1.0009, std. dev.: 0.0001)
  MultiSource/Benchmarks/ASC_Sequoia/IRSmk/IRSmk: 3.19% (8.8259 => 9.1075, std. dev.: 0.0110)
  MultiSource/Applications/hexxagon/hexxagon: 3.00% (12.0921 => 12.4549, std. dev.: 0.0003)
  MultiSource/Benchmarks/ASC_Sequoia/CrystalMk/CrystalMk: 2.30% (7.9275 => 8.1096, std. dev.: 0.0175)
  SingleSource/Benchmarks/Misc/ffbench: 2.25% (0.8273 => 0.8459, std. dev.: 0.0001)
  SingleSource/Benchmarks/BenchmarkGame/recursive: 1.27% (2.0200 => 2.0457, std. dev.: 0.0004)
  SingleSource/Benchmarks/Shootout/sieve: 1.17% (5.1531 => 5.2136, std. dev.: 0.0000)
  MultiSource/Benchmarks/sim/sim: 1.09% (5.0168 => 5.0713, std. dev.: 0.0057)

Performance Improvements - Compile Time
---------------------------------------
  MultiSource/Benchmarks/Trimaran/enc-md5/enc-md5: -6.89% (0.2670 => 0.2486, std. dev.: 0.0000)
  MultiSource/Benchmarks/Trimaran/enc-3des/enc-3des: -6.32% (0.3511 => 0.3289, std. dev.: 0.0000)
  MultiSource/Benchmarks/MiBench/security-rijndael/security-rijndael: -2.54% (0.6576 => 0.6409, std. dev.: 0.0010)
  MultiSource/Applications/lua/lua: -1.36% (3.7467 => 3.6959, std. dev.: 0.0095)

Performance Improvements - Execution Time
-----------------------------------------
  SingleSource/Benchmarks/Misc/richards_benchmark: -12.72% (1.1560 => 1.0089, std. dev.: 0.0049)
  MultiSource/Benchmarks/Ptrdist/yacr2/yacr2: -10.11% (1.0460 => 0.9402, std. dev.: 0.0000)
  SingleSource/Benchmarks/McGill/chomp: -10.07% (1.2308 => 1.1069, std. dev.: 0.0039)
  MultiSource/Benchmarks/Olden/perimeter/perimeter: -6.56% (0.3139 => 0.2933, std. dev.: 0.0003)
  SingleSource/Benchmarks/Misc/perlin: -4.84% (6.2497 => 5.9473, std. dev.: 0.0000)
  MultiSource/Benchmarks/BitBench/drop3/drop3: -4.67% (0.4366 => 0.4162, std. dev.: 0.0000)
  MultiSource/Benchmarks/FreeBench/neural/neural: -4.33% (0.2333 => 0.2232, std. dev.: 0.0000)
  SingleSource/Benchmarks/BenchmarkGame/nsieve-bits: -4.25% (1.3780 => 1.3195, std. dev.: 0.0030)
  MultiSource/Benchmarks/Olden/bh/bh: -3.70% (1.9762 => 1.9030, std. dev.: 0.0010)
  MultiSource/Benchmarks/FreeBench/fourinarow/fourinarow: -3.70% (0.3458 => 0.3330, std. dev.: 0.0004)
  SingleSource/Benchmarks/Shootout/lists: -2.11% (6.3560 => 6.2217, std. dev.: 0.0080)
  MultiSource/Applications/minisat/minisat: -2.04% (9.3358 => 9.1453, std. dev.: 0.0047)
  MultiSource/Benchmarks/VersaBench/8b10b/8b10b: -1.99% (6.6320 => 6.5003, std. dev.: 0.0056)
  MultiSource/Benchmarks/Bullet/bullet: -1.86% (7.3305 => 7.1945, std. dev.: 0.0003)
  SingleSource/Benchmarks/Misc/fp-convert: -1.68% (3.4016 => 3.3445, std. dev.: 0.0000)
  SingleSource/Benchmarks/Misc/ReedSolomon: -1.66% (7.1097 => 6.9918, std. dev.: 0.0002)
  SingleSource/Benchmarks/CoyoteBench/huffbench: -1.48% (20.3410 => 20.0390, std. dev.: 0.0404)
  SingleSource/Benchmarks/Shootout/heapsort: -1.47% (3.9173 => 3.8596, std. dev.: 0.0020)
  MultiSource/Applications/spiff/spiff: -1.23% (3.0263 => 2.9891, std. dev.: 0.0012)
  MultiSource/Applications/lua/lua: -1.01% (26.5159 => 26.2471, std. dev.: 0.0332)
  MultiSource/Benchmarks/nbench/nbench: -1.01% (11.2619 => 11.1487, std. dev.: 0.0029)

Added Tests - Compile Time
--------------------------
  SingleSource/Regression/C/compare

Added Tests - Execution Time
----------------------------
  SingleSource/Regression/C/compare
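
One way to read the "std. dev." column is as a rough noise estimate for each test: a delta that spans many standard deviations is unlikely to be measurement noise. A minimal sketch of such a sanity check (an illustrative heuristic with an assumed threshold, not LNT's actual significance test):

  def looks_significant(old, new, std_dev, threshold=3.0):
      # Treat a delta as interesting if it exceeds `threshold` standard
      # deviations; the threshold of 3.0 is an assumption for illustration.
      if std_dev == 0.0:
          return new != old
      return abs(new - old) > threshold * std_dev

  # fbench run-over-baseline regression: |2.2618 - 1.8175| = 0.4443
  # versus 3 * 0.0365 = 0.1095, so the change is well above the noise.
  print(looks_significant(1.8175, 2.2618, 0.0365))  # True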

Report Time: 1.61s