[llvm-testresults] lab-mini-01__O3-plain__clang_DEV__x86_64 test results

llvm-testresults at cs.uiuc.edu
Sat Apr 21 20:23:42 PDT 2012


http://llvm.org/perf/db_default/v4/nts/363?compare_to=362&baseline=6
Nickname: lab-mini-01__O3-plain__clang_DEV__x86_64:1
Comparing:
     Run: 363, Order: 155302, Start Time: 2012-04-22 02:28:08, End Time: 2012-04-22 03:24:02
      To: 362, Order: 155300, Start Time: 2012-04-22 01:06:39, End Time: 2012-04-22 02:02:26
Baseline: 6, Order: 153879, Start Time: 2012-04-02 17:48:39, End Time: 2012-04-02 18:44:53

===============
Tests Summary
===============

Performance Regressions: 1 (26 on baseline)
Performance Improvements: 0 (22 on baseline)
Unchanged Tests: 887 (840 on baseline)
Total Tests: 888

===========================
Run-Over-Run Changes Detail
===========================
Performance Regressions - Execution Time
----------------------------------------
  MultiSource/Benchmarks/llubenchmark/llu: 1.70% (3.7897 => 3.8540, std. dev.: 0.0169)
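For readers reproducing these figures: the percentage on each detail line is the relative change from the compared (older) run to the current run, reported alongside the two execution times and the sample standard deviation. A minimal Python sketch of that arithmetic, using the llubenchmark/llu numbers above (the helper name and framing are illustrative, not part of LNT itself):

  # Minimal sketch (not LNT code): reproduce the "% (old => new)" figure above.
  def percent_change(old: float, new: float) -> float:
      """Relative change of the current run versus the run it is compared to."""
      return (new - old) / old * 100.0

  old, new = 3.7897, 3.8540   # execution times (seconds) from runs 362 and 363
  print(f"{percent_change(old, new):.2f}%")  # prints 1.70%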

================================
Run-Over-Baseline Changes Detail
================================
Performance Regressions - Compile Time
--------------------------------------
  MultiSource/Benchmarks/Prolangs-C/assembler/assembler: 1.99% (0.5265 => 0.5370, std. dev.: 0.0003)
  MultiSource/Benchmarks/Prolangs-C/agrep/agrep: 1.92% (1.2193 => 1.2427, std. dev.: 0.0003)
  MultiSource/Benchmarks/Prolangs-C/football/football: 1.77% (0.8310 => 0.8457, std. dev.: 0.0005)
  MultiSource/Benchmarks/Prolangs-C/unix-tbl/unix-tbl: 1.41% (0.9199 => 0.9329, std. dev.: 0.0011)
  MultiSource/Applications/spiff/spiff: 1.41% (0.7506 => 0.7612, std. dev.: 0.0008)
  MultiSource/Benchmarks/MiBench/office-ispell/office-ispell: 1.38% (1.4563 => 1.4764, std. dev.: 0.0001)
  MultiSource/Applications/kimwitu++/kc: 1.36% (16.6129 => 16.8385, std. dev.: 0.0089)
  MultiSource/Applications/treecc/treecc: 1.22% (1.9270 => 1.9505, std. dev.: 0.0023)
  MultiSource/Benchmarks/MallocBench/gs/gs: 1.18% (3.8646 => 3.9101, std. dev.: 0.0108)
  MultiSource/Benchmarks/Prolangs-C/bison/mybison: 1.08% (1.3245 => 1.3388, std. dev.: 0.0008)
  MultiSource/Applications/hexxagon/hexxagon: 1.03% (1.2292 => 1.2418, std. dev.: 0.0002)

Performance Regressions - Execution Time
----------------------------------------
  SingleSource/Benchmarks/Adobe-C++/stepanov_vector: 6.51% (2.5629 => 2.7297, std. dev.: 0.0000)
  MultiSource/Benchmarks/Fhourstones-3_1/fhourstones3_1: 5.88% (1.5824 => 1.6754, std. dev.: 0.0010)
  SingleSource/Benchmarks/Adobe-C++/stepanov_abstraction: 4.78% (4.5947 => 4.8144, std. dev.: 0.0001)
  MultiSource/Benchmarks/Trimaran/enc-pc1/enc-pc1: 4.19% (0.9692 => 1.0098, std. dev.: 0.0000)
  SingleSource/Benchmarks/Misc/salsa20: 3.72% (7.4271 => 7.7034, std. dev.: 0.0010)
  MultiSource/Benchmarks/Fhourstones/fhourstones: 3.71% (1.7688 => 1.8345, std. dev.: 0.0005)
  MultiSource/Applications/hexxagon/hexxagon: 3.00% (12.0921 => 12.4544, std. dev.: 0.0013)
  MultiSource/Benchmarks/ASC_Sequoia/IRSmk/IRSmk: 2.85% (8.8259 => 9.0777, std. dev.: 0.0154)
  SingleSource/Benchmarks/Misc/ffbench: 2.19% (0.8273 => 0.8454, std. dev.: 0.0003)
  MultiSource/Benchmarks/ASC_Sequoia/CrystalMk/CrystalMk: 2.16% (7.9275 => 8.0986, std. dev.: 0.0185)
  MultiSource/Benchmarks/llubenchmark/llu: 2.14% (3.7731 => 3.8540, std. dev.: 0.0169)
  SingleSource/Benchmarks/Misc/perlin: 2.02% (6.2497 => 6.3762, std. dev.: 0.0003)
  SingleSource/Benchmarks/BenchmarkGame/recursive: 1.26% (2.0200 => 2.0455, std. dev.: 0.0003)
  SingleSource/Benchmarks/Shootout/sieve: 1.19% (5.1531 => 5.2142, std. dev.: 0.0001)
  MultiSource/Benchmarks/sim/sim: 1.10% (5.0168 => 5.0719, std. dev.: 0.0123)

Performance Improvements - Compile Time
---------------------------------------
  MultiSource/Benchmarks/Trimaran/enc-md5/enc-md5: -7.53% (0.2670 => 0.2469, std. dev.: 0.0003)
  MultiSource/Benchmarks/Trimaran/enc-3des/enc-3des: -6.61% (0.3511 => 0.3279, std. dev.: 0.0020)
  MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: -1.94% (0.8818 => 0.8647, std. dev.: 0.0005)
  MultiSource/Applications/lua/lua: -1.26% (3.7467 => 3.6996, std. dev.: 0.0010)

Performance Improvements - Execution Time
-----------------------------------------
  SingleSource/Benchmarks/Misc/richards_benchmark: -12.73% (1.1560 => 1.0088, std. dev.: 0.0064)
  SingleSource/Benchmarks/McGill/chomp: -10.42% (1.2308 => 1.1026, std. dev.: 0.0054)
  MultiSource/Benchmarks/Ptrdist/yacr2/yacr2: -10.11% (1.0460 => 0.9402, std. dev.: 0.0000)
  MultiSource/Benchmarks/Olden/perimeter/perimeter: -6.59% (0.3139 => 0.2932, std. dev.: 0.0001)
  SingleSource/Benchmarks/BenchmarkGame/nsieve-bits: -5.25% (1.3780 => 1.3056, std. dev.: 0.0196)
  MultiSource/Benchmarks/BitBench/drop3/drop3: -4.67% (0.4366 => 0.4162, std. dev.: 0.0000)
  MultiSource/Benchmarks/FreeBench/fourinarow/fourinarow: -3.64% (0.3458 => 0.3332, std. dev.: 0.0001)
  MultiSource/Benchmarks/Olden/bh/bh: -3.48% (1.9762 => 1.9074, std. dev.: 0.0006)
  SingleSource/Benchmarks/Shootout/lists: -2.27% (6.3560 => 6.2119, std. dev.: 0.0096)
  MultiSource/Benchmarks/VersaBench/8b10b/8b10b: -1.93% (6.6320 => 6.5038, std. dev.: 0.0023)
  MultiSource/Applications/minisat/minisat: -1.78% (9.3358 => 9.1699, std. dev.: 0.0454)
  SingleSource/Benchmarks/Misc/fp-convert: -1.68% (3.4016 => 3.3445, std. dev.: 0.0001)
  SingleSource/Benchmarks/Misc/ReedSolomon: -1.66% (7.1097 => 6.9915, std. dev.: 0.0002)
  SingleSource/Benchmarks/Shootout/heapsort: -1.37% (3.9173 => 3.8638, std. dev.: 0.0009)
  MultiSource/Benchmarks/Bullet/bullet: -1.30% (7.3305 => 7.2354, std. dev.: 0.0032)
  MultiSource/Applications/spiff/spiff: -1.26% (3.0263 => 2.9882, std. dev.: 0.0029)
  SingleSource/Benchmarks/CoyoteBench/huffbench: -1.14% (20.3410 => 20.1091, std. dev.: 0.0056)
  MultiSource/Benchmarks/nbench/nbench: -1.02% (11.2619 => 11.1469, std. dev.: 0.0011)

Report Time: 1.49s