[llvm-testresults] lauro-clang-x86 i686 nightly tester results

Apache apache at cs.uiuc.edu
Mon Jul 14 22:26:27 PDT 2008


http://llvm.org/nightlytest/test.php?machine=294&night=6715
Name: laurov-desktop
Nickname: lauro-clang-x86
Buildstatus: OK

New Test Passes:
None

New Test Failures:
None

Added Tests:
Benchmarks/Misc/himenobmtxpa
UnitTests/2008-07-13-InlineSetjmp


Removed Tests:
Benchmarks/Himeno/himenobmtxpa


Significant changes in test results:
LLC:
 singlesource/Benchmarks/Dhrystone/fldry: 5.84% (9.41 => 8.86)
 singlesource/Benchmarks/McGill/chomp: -24.57% (2.89 => 3.60)
 singlesource/Benchmarks/Misc/ffbench: 14.92% (4.76 => 4.05)
 singlesource/Benchmarks/Shootout/methcall: -7.70% (6.36 => 6.85)
 multisource/Benchmarks/sim/sim: 13.12% (10.52 => 9.14)
CBE:
 singlesource/Benchmarks/McGill/chomp: -11.19% (2.95 => 3.28)
 singlesource/Benchmarks/Misc/ffbench: -30.14% (3.55 => 4.62)
 singlesource/Benchmarks/Shootout/objinst: -12.40% (7.34 => 8.25)
JIT:
 singlesource/Benchmarks/McGill/chomp: -12.20% (3.28 => 3.68)
 singlesource/Benchmarks/Misc/richards_benchmark: 10.55% (2.18 => 1.95)
 singlesource/Benchmarks/Shootout/heapsort: -5.04% (7.34 => 7.71)
 multisource/Benchmarks/sim/sim: -14.29% (9.31 => 10.64)
LLC-BETA:
 singlesource/Benchmarks/McGill/chomp: -11.76% (3.06 => 3.42)
 singlesource/Benchmarks/Misc/ffbench: -31.18% (3.40 => 4.46)
LLC compile:
 multisource/Applications/JM/ldecod/ldecod: 66.20% (4.5442 => 1.5360)
 multisource/Applications/JM/lencod/lencod: 64.69% (9.5045 => 3.3561)
 multisource/Applications/SIBsim4/SIBsim4: 69.64% (0.6720 => 0.2040)
 multisource/Applications/SPASS/SPASS: 64.83% (8.9845 => 3.1602)
 multisource/Applications/d/make_dparser: 59.96% (1.7281 => 0.6920)
 multisource/Applications/treecc/treecc: 58.69% (2.1881 => 0.9040)
 multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: 76.20% (3.9162 => 0.9320)
 multisource/Benchmarks/FreeBench/pifft/pifft: 65.36% (0.6120 => 0.2120)
 multisource/Benchmarks/MiBench/automotive-susan/automotive-susan: 73.56% (0.3480 => 0.0920)
 multisource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: 60.32% (1.2600 => 0.5000)
 multisource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: 57.71% (7.7084 => 3.2602)
 multisource/Benchmarks/MiBench/office-ispell/office-ispell: 60.14% (1.1440 => 0.4560)
 multisource/Benchmarks/Prolangs-C/TimberWolfMC/timberwolfmc: 65.16% (3.1001 => 1.0800)
 multisource/Benchmarks/Prolangs-C/agrep/agrep: 64.78% (0.6360 => 0.2240)
 multisource/Benchmarks/Prolangs-C/assembler/assembler: 59.81% (0.4280 => 0.1720)
 multisource/Benchmarks/Prolangs-C/bison/mybison: 68.22% (1.0320 => 0.3280)
 multisource/Benchmarks/Prolangs-C/compiler/compiler: 58.06% (0.3720 => 0.1560)
 multisource/Benchmarks/Prolangs-C/football/football: 59.41% (0.6800 => 0.2760)
 multisource/Benchmarks/Prolangs-C/unix-tbl/unix-tbl: 58.02% (0.5240 => 0.2200)
 multisource/Benchmarks/Ptrdist/yacr2/yacr2: 67.05% (0.3520 => 0.1160)
 multisource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: 65.60% (1.1280 => 0.3880)
 multisource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: 63.41% (0.6560 => 0.2400)
LLC-BETA compile:
 multisource/Applications/JM/ldecod/ldecod: 65.69% (4.6162 => 1.5840)
 multisource/Applications/JM/lencod/lencod: 65.83% (9.6686 => 3.3042)
 multisource/Applications/SIBsim4/SIBsim4: 65.12% (0.6880 => 0.2400)
 multisource/Applications/SPASS/SPASS: 65.11% (9.2285 => 3.2201)
 multisource/Applications/d/make_dparser: 63.21% (1.7721 => 0.6520)
 multisource/Applications/treecc/treecc: 59.26% (2.2681 => 0.9240)
 multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: 75.08% (4.0282 => 1.0040)
 multisource/Benchmarks/FreeBench/pifft/pifft: 67.50% (0.6400 => 0.2080)
 multisource/Benchmarks/MiBench/automotive-susan/automotive-susan: 64.77% (0.3520 => 0.1240)
 multisource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: 58.10% (1.3080 => 0.5480)
 multisource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: 57.92% (7.9284 => 3.3362)
 multisource/Benchmarks/MiBench/office-ispell/office-ispell: 59.66% (1.1800 => 0.4760)
 multisource/Benchmarks/MiBench/telecomm-gsm/telecomm-gsm: 61.63% (0.3440 => 0.1320)
 multisource/Benchmarks/Prolangs-C/TimberWolfMC/timberwolfmc: 64.38% (3.2002 => 1.1400)
 multisource/Benchmarks/Prolangs-C/agrep/agrep: 62.20% (0.6560 => 0.2480)
 multisource/Benchmarks/Prolangs-C/assembler/assembler: 64.22% (0.4360 => 0.1560)
 multisource/Benchmarks/Prolangs-C/bison/mybison: 68.82% (1.0520 => 0.3280)
 multisource/Benchmarks/Prolangs-C/compiler/compiler: 61.86% (0.3880 => 0.1480)
 multisource/Benchmarks/Prolangs-C/football/football: 58.38% (0.6920 => 0.2880)
 multisource/Benchmarks/Prolangs-C/simulator/simulator: 59.60% (0.3960 => 0.1600)
 multisource/Benchmarks/Prolangs-C/unix-tbl/unix-tbl: 52.90% (0.5520 => 0.2600)
 multisource/Benchmarks/Ptrdist/yacr2/yacr2: 69.57% (0.3680 => 0.1120)
 multisource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: 60.00% (1.1800 => 0.4720)
 multisource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: 62.28% (0.6680 => 0.2520)
GCCAS:
 multisource/Applications/SPASS/SPASS: -6.03% (9.8246 => 10.4166)
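The percentage figures above appear to be the change relative to the old timing, computed as (old - new) / old * 100, so a positive value means the new run was faster. A minimal sketch (the function name `pct_change` is illustrative, not part of the nightly tester):

```python
def pct_change(old: float, new: float) -> float:
    """Improvement relative to the old timing, in percent.

    Positive = new run faster (shorter time), negative = regression.
    """
    return (old - new) / old * 100


# Checked against two entries from the LLC section above:
print(round(pct_change(9.41, 8.86), 2))   # Dhrystone/fldry  -> 5.84
print(round(pct_change(2.89, 3.60), 2))   # McGill/chomp     -> -24.57
```

The same formula reproduces the large "LLC compile" improvements, e.g. (3.9162 - 0.9320) / 3.9162 * 100 ≈ 76.20 for smg2000.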
