[llvm-testresults] lab-mini-01__O3-plain__clang_DEV__x86_64 test results

llvm-testresults at cs.uiuc.edu
Fri Apr 20 17:41:15 PDT 2012


http://llvm.org/perf/db_default/v4/nts/348?compare_to=347&baseline=6
Nickname: lab-mini-01__O3-plain__clang_DEV__x86_64:1
Comparing:
     Run: 348, Order: 155253, Start Time: 2012-04-20 23:45:31, End Time: 2012-04-21 00:41:34
      To: 347, Order: 155239, Start Time: 2012-04-20 22:24:02, End Time: 2012-04-20 23:19:54
Baseline: 6, Order: 153879, Start Time: 2012-04-02 17:48:39, End Time: 2012-04-02 18:44:53

===============
Tests Summary
===============

Performance Regressions: 0 (35 on baseline)
Performance Improvements: 0 (23 on baseline)
Unchanged Tests: 888 (830 on baseline)
Total Tests: 888
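
For readers cross-checking the summary against the run-over-baseline detail
below: the detail lists contain 21 + 14 = 35 regressions and 4 + 19 = 23
improvements, which together with the 830 baseline-unchanged tests account for
all 888 tests. A minimal reader-side check in plain Python (not part of the
report; the names are illustrative):

  # Reconcile the baseline summary counts with the per-category detail lists.
  regressions  = 21 + 14   # compile-time + execution-time regression entries
  improvements = 4 + 19    # compile-time + execution-time improvement entries
  unchanged    = 830       # tests unchanged relative to the baseline
  assert regressions == 35
  assert improvements == 23
  assert regressions + improvements + unchanged == 888   # Total Tests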

===========================
Run-Over-Run Changes Detail
===========================

  No run-over-run changes to report (0 regressions and 0 improvements versus run 347).

================================
Run-Over-Baseline Changes Detail
================================
Performance Regressions - Compile Time
--------------------------------------
  MultiSource/Benchmarks/Prolangs-C/assembler/assembler: 2.20% (0.5265 => 0.5381, std. dev.: 0.0008)
  MultiSource/Benchmarks/Prolangs-C/agrep/agrep: 2.12% (1.2193 => 1.2452, std. dev.: 0.0080)
  MultiSource/Applications/spiff/spiff: 1.96% (0.7506 => 0.7653, std. dev.: 0.0004)
  MultiSource/Benchmarks/Prolangs-C/football/football: 1.91% (0.8310 => 0.8469, std. dev.: 0.0007)
  MultiSource/Applications/hbd/hbd: 1.86% (1.3693 => 1.3948, std. dev.: 0.0097)
  MultiSource/Benchmarks/MiBench/office-ispell/office-ispell: 1.72% (1.4563 => 1.4814, std. dev.: 0.0004)
  MultiSource/Benchmarks/Prolangs-C/unix-tbl/unix-tbl: 1.71% (0.9199 => 0.9356, std. dev.: 0.0005)
  MultiSource/Applications/treecc/treecc: 1.61% (1.9270 => 1.9581, std. dev.: 0.0002)
  MultiSource/Applications/kimwitu++/kc: 1.50% (16.6129 => 16.8625, std. dev.: 0.0210)
  MultiSource/Benchmarks/MallocBench/gs/gs: 1.45% (3.8646 => 3.9207, std. dev.: 0.0084)
  MultiSource/Benchmarks/VersaBench/dbms/dbms: 1.43% (0.7710 => 0.7820, std. dev.: 0.0010)
  MultiSource/Benchmarks/Bullet/bullet: 1.36% (33.2944 => 33.7469, std. dev.: 0.0118)
  MultiSource/Applications/hexxagon/hexxagon: 1.33% (1.2292 => 1.2455, std. dev.: 0.0010)
  MultiSource/Applications/Burg/burg: 1.31% (1.1061 => 1.1206, std. dev.: 0.0029)
  MultiSource/Benchmarks/Prolangs-C/bison/mybison: 1.24% (1.3245 => 1.3409, std. dev.: 0.0012)
  MultiSource/Applications/siod/siod: 1.22% (2.1791 => 2.2056, std. dev.: 0.0033)
  MultiSource/Applications/SPASS/SPASS: 1.17% (13.8202 => 13.9817, std. dev.: 0.0104)
  MultiSource/Applications/sqlite3/sqlite3: 1.13% (16.2614 => 16.4454, std. dev.: 0.0264)
  MultiSource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: 1.06% (13.2842 => 13.4255, std. dev.: 0.0045)
  MultiSource/Benchmarks/Prolangs-C++/city/city: 1.04% (1.9828 => 2.0034, std. dev.: 0.0009)
  MultiSource/Benchmarks/Prolangs-C/TimberWolfMC/timberwolfmc: 1.02% (7.2180 => 7.2919, std. dev.: 0.0080)

Performance Regressions - Execution Time
----------------------------------------
  SingleSource/Benchmarks/Adobe-C++/stepanov_vector: 6.51% (2.5629 => 2.7298, std. dev.: 0.0001)
  MultiSource/Benchmarks/Fhourstones-3_1/fhourstones3_1: 5.90% (1.5824 => 1.6757, std. dev.: 0.0000)
  SingleSource/Benchmarks/Adobe-C++/stepanov_abstraction: 4.79% (4.5947 => 4.8148, std. dev.: 0.0001)
  MultiSource/Benchmarks/Trimaran/enc-pc1/enc-pc1: 4.19% (0.9692 => 1.0098, std. dev.: 0.0000)
  MultiSource/Benchmarks/Fhourstones/fhourstones: 3.75% (1.7688 => 1.8352, std. dev.: 0.0043)
  SingleSource/Benchmarks/Misc/salsa20: 3.68% (7.4271 => 7.7006, std. dev.: 0.0026)
  MultiSource/Applications/hexxagon/hexxagon: 2.99% (12.0921 => 12.4537, std. dev.: 0.0011)
  MultiSource/Benchmarks/ASC_Sequoia/IRSmk/IRSmk: 2.58% (8.8259 => 9.0538, std. dev.: 0.0198)
  MultiSource/Benchmarks/ASC_Sequoia/CrystalMk/CrystalMk: 2.24% (7.9275 => 8.1054, std. dev.: 0.0074)
  SingleSource/Benchmarks/Misc/ffbench: 2.19% (0.8273 => 0.8454, std. dev.: 0.0003)
  SingleSource/Benchmarks/Misc/perlin: 2.03% (6.2497 => 6.3763, std. dev.: 0.0000)
  SingleSource/Benchmarks/BenchmarkGame/recursive: 1.27% (2.0200 => 2.0457, std. dev.: 0.0003)
  MultiSource/Benchmarks/sim/sim: 1.21% (5.0168 => 5.0774, std. dev.: 0.0143)
  SingleSource/Benchmarks/Shootout/sieve: 1.17% (5.1531 => 5.2136, std. dev.: 0.0000)

Performance Improvements - Compile Time
---------------------------------------
  MultiSource/Benchmarks/Trimaran/enc-md5/enc-md5: -7.15% (0.2670 => 0.2479, std. dev.: 0.0002)
  MultiSource/Benchmarks/Trimaran/enc-3des/enc-3des: -5.84% (0.3511 => 0.3306, std. dev.: 0.0005)
  MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: -2.19% (0.8818 => 0.8625, std. dev.: 0.0010)
  MultiSource/Applications/lua/lua: -1.08% (3.7467 => 3.7064, std. dev.: 0.0051)

Performance Improvements - Execution Time
-----------------------------------------
  SingleSource/Benchmarks/Misc/richards_benchmark: -12.60% (1.1560 => 1.0104, std. dev.: 0.0012)
  MultiSource/Benchmarks/Ptrdist/yacr2/yacr2: -10.12% (1.0460 => 0.9401, std. dev.: 0.0000)
  SingleSource/Benchmarks/McGill/chomp: -9.70% (1.2308 => 1.1114, std. dev.: 0.0006)
  MultiSource/Benchmarks/Olden/perimeter/perimeter: -6.56% (0.3139 => 0.2933, std. dev.: 0.0001)
  SingleSource/Benchmarks/BenchmarkGame/nsieve-bits: -5.09% (1.3780 => 1.3079, std. dev.: 0.0103)
  MultiSource/Benchmarks/BitBench/drop3/drop3: -4.67% (0.4366 => 0.4162, std. dev.: 0.0000)
  MultiSource/Benchmarks/FreeBench/fourinarow/fourinarow: -3.73% (0.3458 => 0.3329, std. dev.: 0.0002)
  MultiSource/Benchmarks/Olden/bh/bh: -3.52% (1.9762 => 1.9066, std. dev.: 0.0004)
  SingleSource/Benchmarks/Shootout/lists: -2.17% (6.3560 => 6.2181, std. dev.: 0.0117)
  MultiSource/Applications/minisat/minisat: -1.97% (9.3358 => 9.1516, std. dev.: 0.0286)
  MultiSource/Benchmarks/VersaBench/8b10b/8b10b: -1.88% (6.6320 => 6.5071, std. dev.: 0.0021)
  SingleSource/Benchmarks/Misc/fp-convert: -1.68% (3.4016 => 3.3445, std. dev.: 0.0000)
  SingleSource/Benchmarks/Misc/ReedSolomon: -1.66% (7.1097 => 6.9915, std. dev.: 0.0003)
  SingleSource/Benchmarks/Shootout/heapsort: -1.40% (3.9173 => 3.8626, std. dev.: 0.0071)
  SingleSource/Benchmarks/CoyoteBench/huffbench: -1.25% (20.3410 => 20.0866, std. dev.: 0.0448)
  MultiSource/Benchmarks/Bullet/bullet: -1.21% (7.3305 => 7.2415, std. dev.: 0.0234)
  MultiSource/Applications/spiff/spiff: -1.18% (3.0263 => 2.9906, std. dev.: 0.0017)
  MultiSource/Applications/lua/lua: -1.10% (26.5159 => 26.2253, std. dev.: 0.0544)
  MultiSource/Benchmarks/nbench/nbench: -1.02% (11.2619 => 11.1473, std. dev.: 0.0010)

Report Time: 1.03s
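
Note on the per-test figures above: each entry shows the percent change of the
new sample relative to the old one (old => new; the timing values appear to be
seconds), so positive percentages are slowdowns (regressions) and negative ones
are speedups (improvements). A minimal sketch of that arithmetic in plain
Python (not LNT's actual reporting code; the helper name is illustrative):

  def pct_change(old, new):
      """Percent change from old to new; positive = slower (regression)."""
      return (new - old) / old * 100.0

  # Worked checks against two entries from the report:
  print(f"{pct_change(0.5265, 0.5381):+.2f}%")   # +2.20%  (assembler, compile time)
  print(f"{pct_change(1.1560, 1.0104):+.2f}%")   # -12.60% (richards_benchmark, execution time)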