[llvm-testresults] lab-mini-01__O3-plain__clang_DEV__x86_64 test results

llvm-testresults at cs.uiuc.edu
Tue Apr 17 04:18:09 PDT 2012


http://llvm.org/perf/db_default/v4/nts/299?compare_to=298&baseline=6
Nickname: lab-mini-01__O3-plain__clang_DEV__x86_64:1
Comparing:
     Run: 299, Order: 154916, Start Time: 2012-04-17 10:22:24, End Time: 2012-04-17 11:18:23
      To: 298, Order: 154915, Start Time: 2012-04-17 09:00:32, End Time: 2012-04-17 09:56:32
Baseline: 6, Order:  153879, Start Time: 2012-04-02 17:48:39, End Time: 2012-04-02 18:44:53
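
The linked report page encodes the whole comparison in its query string (run 299 against run 298, with run 6 as the long-term baseline). A minimal sketch for building the same kind of link for other run IDs, assuming the server keeps this URL scheme and that plain HTTP access is enough (no LNT client library is used here):

    # Sketch only: rebuilds the comparison URL shown above for arbitrary run IDs.
    # The base URL and parameter names are taken from the link in this report;
    # the run IDs and output handling are illustrative.
    from urllib.request import urlopen
    from urllib.parse import urlencode

    def comparison_url(run_id, compare_to, baseline,
                       base="http://llvm.org/perf/db_default/v4/nts"):
        query = urlencode({"compare_to": compare_to, "baseline": baseline})
        return f"{base}/{run_id}?{query}"

    url = comparison_url(299, 298, 6)
    print(url)  # same link as at the top of this report
    # html = urlopen(url).read()  # uncomment to download the rendered report page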

===============
Tests Summary
===============

Performance Regressions: 0 (34 on baseline)
Performance Improvements: 0 (22 on baseline)
Unchanged Tests: 888 (832 on baseline)
Total Tests: 888
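
The per-test percentages in the detail sections below appear to be simple relative changes of the reported mean times (old => new); a quick check of that reading, using the first compile-time regression listed further down as the example:

    # Sketch only: assumes each percentage is (new - old) / old, where old and new
    # are the mean times printed as "old => new" for each test.
    def pct_change(old, new):
        return (new - old) / old * 100.0

    # security-blowfish compile time from the detail section below:
    print(round(pct_change(0.2721, 0.2830), 2))  # 4.01, matching the reported 4.01%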

===========================
Run-Over-Run Changes Detail
===========================
  (none -- run 299 shows no performance regressions or improvements relative to run 298)
================================
Run-Over-Baseline Changes Detail
================================
Performance Regressions - Compile Time
--------------------------------------
  MultiSource/Benchmarks/MiBench/security-blowfish/security-blowfish: 4.01% (0.2721 => 0.2830, std. dev.: 0.0002)
  MultiSource/Applications/spiff/spiff: 1.81% (0.7506 => 0.7642, std. dev.: 0.0006)
  MultiSource/Benchmarks/Prolangs-C/football/football: 1.78% (0.8310 => 0.8458, std. dev.: 0.0057)
  SingleSource/UnitTests/ObjC++/property-reference-object: 1.75% (0.5818 => 0.5920, std. dev.: 0.0003)
  SingleSource/UnitTests/ObjC++/property-reference: 1.71% (0.7061 => 0.7182, std. dev.: 0.0021)
  MultiSource/Applications/kimwitu++/kc: 1.65% (16.6129 => 16.8862, std. dev.: 0.0122)
  MultiSource/Benchmarks/MiBench/office-ispell/office-ispell: 1.59% (1.4563 => 1.4794, std. dev.: 0.0007)
  MultiSource/Applications/treecc/treecc: 1.57% (1.9270 => 1.9573, std. dev.: 0.0052)
  MultiSource/Benchmarks/Prolangs-C/unix-tbl/unix-tbl: 1.51% (0.9199 => 0.9338, std. dev.: 0.0007)
  MultiSource/Applications/Burg/burg: 1.47% (1.1061 => 1.1224, std. dev.: 0.0015)
  MultiSource/Benchmarks/Prolangs-C/agrep/agrep: 1.43% (1.2193 => 1.2367, std. dev.: 0.0006)
  MultiSource/Applications/siod/siod: 1.34% (2.1791 => 2.2084, std. dev.: 0.0035)
  MultiSource/Benchmarks/Bullet/bullet: 1.33% (33.2944 => 33.7359, std. dev.: 0.0128)
  MultiSource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: 1.21% (13.2842 => 13.4446, std. dev.: 0.0134)
  MultiSource/Applications/SPASS/SPASS: 1.16% (13.8202 => 13.9801, std. dev.: 0.0084)
  MultiSource/Benchmarks/MallocBench/gs/gs: 1.14% (3.8646 => 3.9088, std. dev.: 0.0074)
  MultiSource/Benchmarks/Prolangs-C/bison/mybison: 1.11% (1.3245 => 1.3392, std. dev.: 0.0007)
  MultiSource/Benchmarks/Prolangs-C/TimberWolfMC/timberwolfmc: 1.09% (7.2180 => 7.2968, std. dev.: 0.0077)
  MultiSource/Applications/hexxagon/hexxagon: 1.08% (1.2292 => 1.2425, std. dev.: 0.0004)
  MultiSource/Applications/sqlite3/sqlite3: 1.08% (16.2614 => 16.4373, std. dev.: 0.0176)

Performance Regressions - Execution Time
----------------------------------------
  SingleSource/Benchmarks/Adobe-C++/stepanov_vector: 6.51% (2.5629 => 2.7298, std. dev.: 0.0004)
  MultiSource/Benchmarks/Fhourstones-3_1/fhourstones3_1: 5.90% (1.5824 => 1.6757, std. dev.: 0.0002)
  SingleSource/Benchmarks/Adobe-C++/stepanov_abstraction: 4.79% (4.5947 => 4.8147, std. dev.: 0.0001)
  MultiSource/Benchmarks/Trimaran/enc-pc1/enc-pc1: 4.19% (0.9692 => 1.0098, std. dev.: 0.0000)
  MultiSource/Benchmarks/Fhourstones/fhourstones: 3.74% (1.7688 => 1.8350, std. dev.: 0.0059)
  SingleSource/Benchmarks/Misc/salsa20: 3.64% (7.4271 => 7.6974, std. dev.: 0.0037)
  MultiSource/Applications/hexxagon/hexxagon: 2.99% (12.0921 => 12.4534, std. dev.: 0.0003)
  MultiSource/Benchmarks/ASC_Sequoia/IRSmk/IRSmk: 2.78% (8.8259 => 9.0712, std. dev.: 0.0150)
  MultiSource/Benchmarks/ASC_Sequoia/CrystalMk/CrystalMk: 2.40% (7.9275 => 8.1179, std. dev.: 0.0144)
  SingleSource/Benchmarks/Misc/ffbench: 2.26% (0.8273 => 0.8460, std. dev.: 0.0000)
  SingleSource/Benchmarks/Misc/perlin: 2.02% (6.2497 => 6.3762, std. dev.: 0.0001)
  SingleSource/Benchmarks/BenchmarkGame/recursive: 1.29% (2.0200 => 2.0460, std. dev.: 0.0005)
  MultiSource/Applications/lambda-0_1_3/lambda: 1.22% (5.3167 => 5.3818, std. dev.: 0.0032)
  SingleSource/Benchmarks/Shootout/sieve: 1.17% (5.1531 => 5.2135, std. dev.: 0.0000)

Performance Improvements - Compile Time
---------------------------------------
  MultiSource/Benchmarks/Trimaran/enc-md5/enc-md5: -7.30% (0.2670 => 0.2475, std. dev.: 0.0002)
  MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: -1.91% (0.8818 => 0.8650, std. dev.: 0.0017)
  MultiSource/Applications/lua/lua: -1.08% (3.7467 => 3.7062, std. dev.: 0.0025)

Performance Improvements - Execution Time
-----------------------------------------
  SingleSource/Benchmarks/Misc/richards_benchmark: -12.06% (1.1560 => 1.0166, std. dev.: 0.0015)
  MultiSource/Benchmarks/Ptrdist/yacr2/yacr2: -10.12% (1.0460 => 0.9401, std. dev.: 0.0001)
  SingleSource/Benchmarks/McGill/chomp: -9.69% (1.2308 => 1.1115, std. dev.: 0.0002)
  MultiSource/Benchmarks/Olden/perimeter/perimeter: -6.53% (0.3139 => 0.2934, std. dev.: 0.0003)
  SingleSource/Benchmarks/BenchmarkGame/nsieve-bits: -4.72% (1.3780 => 1.3130, std. dev.: 0.0048)
  MultiSource/Benchmarks/BitBench/drop3/drop3: -4.67% (0.4366 => 0.4162, std. dev.: 0.0009)
  MultiSource/Benchmarks/FreeBench/fourinarow/fourinarow: -3.59% (0.3458 => 0.3334, std. dev.: 0.0000)
  MultiSource/Benchmarks/Olden/bh/bh: -3.51% (1.9762 => 1.9068, std. dev.: 0.0006)
  MultiSource/Applications/minisat/minisat: -2.09% (9.3358 => 9.1404, std. dev.: 0.0165)
  SingleSource/Benchmarks/Shootout/lists: -1.97% (6.3560 => 6.2309, std. dev.: 0.0055)
  MultiSource/Benchmarks/VersaBench/8b10b/8b10b: -1.95% (6.6320 => 6.5028, std. dev.: 0.0044)
  SingleSource/Benchmarks/Misc/fp-convert: -1.68% (3.4016 => 3.3445, std. dev.: 0.0000)
  SingleSource/Benchmarks/Misc/ReedSolomon: -1.66% (7.1097 => 6.9914, std. dev.: 0.0001)
  SingleSource/Benchmarks/Shootout/heapsort: -1.35% (3.9173 => 3.8643, std. dev.: 0.0162)
  SingleSource/Benchmarks/CoyoteBench/huffbench: -1.35% (20.3410 => 20.0666, std. dev.: 0.0470)
  MultiSource/Benchmarks/Bullet/bullet: -1.28% (7.3305 => 7.2368, std. dev.: 0.0199)
  MultiSource/Applications/lua/lua: -1.24% (26.5159 => 26.1875, std. dev.: 0.0405)
  MultiSource/Applications/spiff/spiff: -1.20% (3.0263 => 2.9900, std. dev.: 0.0019)
  MultiSource/Benchmarks/nbench/nbench: -1.01% (11.2619 => 11.1485, std. dev.: 0.0007)

Report Time: 1.46s