[llvm-testresults] lauro-clang-x86 i686 nightly tester results

Apache apache at cs.uiuc.edu
Mon Feb 25 21:12:19 PST 2008


http://llvm.org/nightlytest/test.php?machine=245&night=5091
Name: laurov-desktop
Nickname: lauro-clang-x86
Buildstatus: OK

New Test Passes:
None

New Test Failures:
None

Added Tests:
None

Removed Tests:
None

Significant changes in test results:
CBE:
 singlesource/Benchmarks/CoyoteBench/lpbench: 17.73% (17.37 => 14.29)
 singlesource/Benchmarks/Dhrystone/dry: -352.44% (0.82 => 3.71)
 singlesource/Benchmarks/Dhrystone/fldry: 17.75% (13.80 => 11.35)
 singlesource/Benchmarks/Misc/flops: -5.57% (14.36 => 15.16)
 singlesource/Benchmarks/Shootout/methcall: 12.10% (11.82 => 10.39)
 multisource/Applications/viterbi/viterbi: 38.66% (23.54 => 14.44)
 multisource/Benchmarks/Fhourstones/fhourstones: 32.92% (4.07 => 2.73)
 multisource/Benchmarks/FreeBench/distray/distray: 27.85% (0.79 => 0.57)
 multisource/Benchmarks/FreeBench/fourinarow/fourinarow: 35.27% (2.41 => 1.56)
 multisource/Benchmarks/MiBench/telecomm-CRC32/telecomm-CRC32: -73.91% (0.46 => 0.80)
 multisource/Benchmarks/Olden/bisort/bisort: 29.73% (1.85 => 1.30)
LLC-BETA:
 singlesource/Benchmarks/CoyoteBench/lpbench: 16.34% (17.07 => 14.28)
 singlesource/Benchmarks/Dhrystone/dry: -55.21% (5.76 => 8.94)
 singlesource/Benchmarks/Dhrystone/fldry: -44.07% (8.69 => 12.52)
 singlesource/Benchmarks/Misc/flops: -6.83% (17.86 => 19.08)
 multisource/Applications/viterbi/viterbi: 43.89% (29.03 => 16.29)
 multisource/Benchmarks/Fhourstones/fhourstones: 37.85% (4.65 => 2.89)
 multisource/Benchmarks/FreeBench/fourinarow/fourinarow: 30.77% (2.60 => 1.80)
 multisource/Benchmarks/MiBench/telecomm-CRC32/telecomm-CRC32: -71.74% (0.46 => 0.79)
 multisource/Benchmarks/Olden/bisort/bisort: 28.50% (1.93 => 1.38)
 multisource/Benchmarks/Olden/em3d/em3d: 6.13% (4.73 => 4.44)
 multisource/Benchmarks/OptimizerEval/optimizer-eval: 6.45% (122.84 => 114.92)
 multisource/Benchmarks/Ptrdist/bc/bc: -60.00% (1.35 => 2.16)
 multisource/Benchmarks/SciMark2-C/scimark2: 6.23% (28.90 => 27.10)
 multisource/Benchmarks/llubenchmark/llu: 19.83% (11.80 => 9.46)
 multisource/Benchmarks/sim/sim: 32.63% (16.06 => 10.82)
LLC:
 singlesource/Benchmarks/Dhrystone/dry: 35.75% (8.95 => 5.75)
 singlesource/Benchmarks/Dhrystone/fldry: 23.35% (11.99 => 9.19)
 singlesource/Benchmarks/Shootout/lists: -59.57% (8.88 => 14.17)
 multisource/Applications/SIBsim4/SIBsim4: 51.44% (13.86 => 6.73)
 multisource/Applications/SPASS/SPASS: -15.61% (19.15 => 22.14)
 multisource/Applications/viterbi/viterbi: 9.48% (18.03 => 16.32)
 multisource/Benchmarks/Fhourstones/fhourstones: 37.58% (4.63 => 2.89)
 multisource/Benchmarks/FreeBench/fourinarow/fourinarow: 28.17% (2.52 => 1.81)
 multisource/Benchmarks/FreeBench/mason/mason: 52.38% (0.42 => 0.20)
 multisource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: -87.10% (0.31 => 0.58)
 multisource/Benchmarks/MiBench/telecomm-CRC32/telecomm-CRC32: -71.74% (0.46 => 0.79)
 multisource/Benchmarks/Olden/bisort/bisort: 20.93% (1.72 => 1.36)
 multisource/Benchmarks/Olden/em3d/em3d: 47.94% (8.72 => 4.54)
 multisource/Benchmarks/OptimizerEval/optimizer-eval: -5.00% (107.37 => 112.74)
 multisource/Benchmarks/Ptrdist/bc/bc: 42.73% (2.20 => 1.26)
 multisource/Benchmarks/SciMark2-C/scimark2: 6.92% (28.91 => 26.91)
 multisource/Benchmarks/VersaBench/8b10b/8b10b: -14.44% (8.38 => 9.59)
 multisource/Benchmarks/sim/sim: 39.53% (16.04 => 9.70)
JIT:
 multisource/Applications/JM/lencod/lencod: 31.29% (30.36 => 20.86)
 multisource/Applications/SIBsim4/SIBsim4: -12.08% (8.61 => 9.65)
 multisource/Applications/spiff/spiff: 35.65% (2.16 => 1.39)
 multisource/Applications/viterbi/viterbi: 34.32% (25.00 => 16.42)
 multisource/Benchmarks/FreeBench/fourinarow/fourinarow: 29.50% (2.78 => 1.96)
 multisource/Benchmarks/FreeBench/mason/mason: 48.94% (0.47 => 0.24)
 multisource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: -63.64% (0.55 => 0.90)
 multisource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: -60.03% (6.48 => 10.37)
 multisource/Benchmarks/MiBench/office-ispell/office-ispell: -64.29% (0.42 => 0.69)
 multisource/Benchmarks/MiBench/telecomm-CRC32/telecomm-CRC32: -92.68% (0.41 => 0.79)
 multisource/Benchmarks/Olden/bisort/bisort: 27.27% (1.98 => 1.44)
 multisource/Benchmarks/Olden/em3d/em3d: 12.52% (5.75 => 5.03)
 multisource/Benchmarks/Olden/perimeter/perimeter: 35.48% (0.62 => 0.40)
 multisource/Benchmarks/Prolangs-C/bison/mybison: -55.77% (1.04 => 1.62)
 multisource/Benchmarks/Ptrdist/bc/bc: 41.84% (3.37 => 1.96)
 multisource/Benchmarks/SciMark2-C/scimark2: 14.81% (29.17 => 24.85)
 multisource/Benchmarks/llubenchmark/llu: -8.93% (10.19 => 11.10)
 multisource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: -53.57% (0.56 => 0.86)
 multisource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: -58.49% (0.53 => 0.84)
 multisource/Benchmarks/sim/sim: 40.86% (16.57 => 9.80)
GCCAS:
 multisource/Applications/SIBsim4/SIBsim4: 41.38% (0.9280 => 0.5440)
 multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -67.55% (3.0201 => 5.0603)
 multisource/Benchmarks/MiBench/office-ispell/office-ispell: -52.60% (0.6920 => 1.0560)
 multisource/Benchmarks/MiBench/security-rijndael/security-rijndael: -80.93% (0.8600 => 1.5560)
 multisource/Benchmarks/mediabench/gsm/toast/toast: -60.42% (0.3840 => 0.6160)
 multisource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: -48.39% (1.3640 => 2.0241)
 multisource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: -51.22% (0.4920 => 0.7440)
LLC compile:
 multisource/Applications/SIBsim4/SIBsim4: 38.04% (1.1040 => 0.6840)
 multisource/Applications/spiff/spiff: 36.99% (0.5840 => 0.3680)
 multisource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: -60.09% (1.0520 => 1.6841)
 multisource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: -63.27% (7.1444 => 11.6647)
 multisource/Benchmarks/MiBench/office-ispell/office-ispell: -58.57% (1.0040 => 1.5920)
 multisource/Benchmarks/Prolangs-C/bison/mybison: -63.86% (0.9960 => 1.6320)
 multisource/Benchmarks/Ptrdist/bc/bc: 39.45% (1.1560 => 0.7000)
 multisource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: -58.91% (0.8080 => 1.2840)
 multisource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: -64.33% (0.6280 => 1.0320)
LLC-BETA compile:
 multisource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: -61.61% (1.0520 => 1.7001)
 multisource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: -46.98% (7.1604 => 10.5246)
 multisource/Benchmarks/Prolangs-C/bison/mybison: -58.70% (0.9880 => 1.5680)
 multisource/Benchmarks/Prolangs-C/football/football: 39.90% (0.7920 => 0.4760)
 multisource/Benchmarks/Ptrdist/bc/bc: -54.64% (0.7320 => 1.1320)
 multisource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: -45.05% (0.8880 => 1.2880)
 multisource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: -64.33% (0.6280 => 1.0320)
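
Judging from the numbers above, each delta appears to be computed as (old − new) / old × 100, so a positive percentage means the measured time dropped (an improvement) and a negative one means a regression. A minimal sketch of that arithmetic (the function name `pct_change` is hypothetical, not part of the nightly tester):

```python
def pct_change(old: float, new: float) -> float:
    """Percent improvement: positive when the new measurement is lower (faster)."""
    return (old - new) / old * 100.0

# Spot-checks against the CBE table above:
print(round(pct_change(17.37, 14.29), 2))  # lpbench  -> 17.73 (faster)
print(round(pct_change(0.82, 3.71), 2))    # dry      -> -352.44 (regression)
```

Note that very small baselines (fractions of a second) inflate the percentages, which is why entries like dry swing by hundreds of percent on sub-second changes.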