[llvm-testresults] lauro-clang-x86 i686 nightly tester results

Apache apache at cs.uiuc.edu
Fri Apr 25 22:12:22 PDT 2008


http://llvm.org/nightlytest/test.php?machine=245&night=5853
Name: laurov-desktop
Nickname: lauro-clang-x86
Buildstatus: OK

New Test Passes:
None

New Test Failures:
UnitTests/printargs [GCCAS, Bytecode, LLC compile, LLC-BETA compile, JIT codegen, CBE, LLC, LLC-BETA, JIT] 


Added Tests:
None

Removed Tests:
None

Significant changes in test results:
CBE:
 singlesource/Benchmarks/BenchmarkGame/fannkuch: 8.84% (6.11 => 5.57)
 singlesource/Benchmarks/BenchmarkGame/recursive: 37.28% (4.56 => 2.86)
 singlesource/Benchmarks/CoyoteBench/lpbench: 8.86% (16.37 => 14.92)
 singlesource/Benchmarks/McGill/chomp: 46.99% (5.64 => 2.99)
 singlesource/Benchmarks/McGill/queens: 27.98% (6.04 => 4.35)
 singlesource/Benchmarks/Misc/ReedSolomon: 15.33% (12.39 => 10.49)
 singlesource/Benchmarks/Misc/ffbench: 29.87% (6.16 => 4.32)
 singlesource/Benchmarks/Misc/flops-8: 23.72% (5.86 => 4.47)
 singlesource/Benchmarks/Misc/oourafft: 12.84% (15.81 => 13.78)
 singlesource/Benchmarks/Shootout/ary3: -5.16% (6.01 => 6.32)
 singlesource/Benchmarks/Shootout/hash: 6.78% (7.38 => 6.88)
 singlesource/Benchmarks/Shootout/lists: 5.14% (11.28 => 10.70)
 singlesource/Benchmarks/Shootout/methcall: 5.29% (10.96 => 10.38)
 singlesource/Benchmarks/Shootout/objinst: 14.01% (8.71 => 7.49)
 singlesource/Benchmarks/Shootout/sieve: 14.21% (11.89 => 10.20)
 multisource/Applications/JM/lencod/lencod: 7.83% (16.98 => 15.65)
 multisource/Applications/SIBsim4/SIBsim4: 10.99% (4.55 => 4.05)
 multisource/Applications/aha/aha: 13.04% (4.60 => 4.00)
 multisource/Applications/viterbi/viterbi: 27.35% (19.49 => 14.16)
 multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: 20.50% (1.61 => 1.28)
 multisource/Benchmarks/Fhourstones/fhourstones: 30.54% (4.65 => 3.23)
 multisource/Benchmarks/McCat/12-IOtest/iotest: 40.30% (0.67 => 0.40)
 multisource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: 40.29% (1.39 => 0.83)
 multisource/Benchmarks/MiBench/consumer-lame/consumer-lame: 36.63% (1.01 => 0.64)
 multisource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: 42.86% (0.56 => 0.32)
 multisource/Benchmarks/MiBench/network-patricia/network-patricia: 54.35% (0.46 => 0.21)
 multisource/Benchmarks/Olden/bisort/bisort: 28.99% (2.07 => 1.47)
 multisource/Benchmarks/Olden/em3d/em3d: 40.81% (7.67 => 4.54)
 multisource/Benchmarks/Olden/power/power: 15.38% (4.55 => 3.85)
 multisource/Benchmarks/Olden/treeadd/treeadd: 16.55% (7.01 => 5.85)
 multisource/Benchmarks/Olden/tsp/tsp: 37.22% (4.54 => 2.85)
 multisource/Benchmarks/Ptrdist/bc/bc: 22.15% (1.49 => 1.16)
LLC-BETA:
 singlesource/Benchmarks/BenchmarkGame/fannkuch: 11.25% (5.51 => 4.89)
 singlesource/Benchmarks/BenchmarkGame/n-body: 10.54% (4.27 => 3.82)
 singlesource/Benchmarks/BenchmarkGame/recursive: 10.99% (2.73 => 2.43)
 singlesource/Benchmarks/McGill/chomp: 40.08% (4.99 => 2.99)
 singlesource/Benchmarks/McGill/queens: 32.02% (6.34 => 4.31)
 singlesource/Benchmarks/Misc/fbench: 14.64% (8.88 => 7.58)
 singlesource/Benchmarks/Misc/ffbench: 48.35% (6.37 => 3.29)
 singlesource/Benchmarks/Misc/flops-1: 10.43% (4.22 => 3.78)
 singlesource/Benchmarks/Misc/flops-3: 27.45% (7.25 => 5.26)
 singlesource/Benchmarks/Misc/flops-5: 5.88% (7.65 => 7.20)
 singlesource/Benchmarks/Shootout/ary3: -7.46% (5.90 => 6.34)
 singlesource/Benchmarks/Shootout/fib2: 5.35% (4.30 => 4.07)
 singlesource/Benchmarks/Shootout/matrix: 15.63% (4.35 => 3.67)
 singlesource/Benchmarks/Shootout/sieve: 12.79% (11.88 => 10.36)
 multisource/Applications/JM/lencod/lencod: 29.62% (20.02 => 14.09)
 multisource/Applications/SIBsim4/SIBsim4: 24.63% (9.50 => 7.16)
 multisource/Applications/viterbi/viterbi: 7.12% (17.69 => 16.43)
 multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: 14.56% (1.58 => 1.35)
 multisource/Benchmarks/Fhourstones/fhourstones: 32.91% (4.77 => 3.20)
 multisource/Benchmarks/McCat/12-IOtest/iotest: 41.03% (0.78 => 0.46)
 multisource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: 34.81% (1.35 => 0.88)
 multisource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: 43.86% (0.57 => 0.32)
 multisource/Benchmarks/MiBench/network-patricia/network-patricia: 55.32% (0.47 => 0.21)
 multisource/Benchmarks/Olden/bisort/bisort: 27.36% (2.12 => 1.54)
 multisource/Benchmarks/Olden/em3d/em3d: 16.13% (5.27 => 4.42)
 multisource/Benchmarks/Olden/power/power: 26.69% (5.02 => 3.68)
 multisource/Benchmarks/Ptrdist/ft/ft: 22.02% (2.77 => 2.16)
 multisource/Benchmarks/SciMark2-C/scimark2: -29.13% (24.37 => 31.47)
JIT:
 singlesource/Benchmarks/BenchmarkGame/n-body: 7.65% (3.79 => 3.50)
 singlesource/Benchmarks/BenchmarkGame/nsieve-bits: 9.97% (3.11 => 2.80)
 singlesource/Benchmarks/BenchmarkGame/recursive: 11.47% (2.79 => 2.47)
 singlesource/Benchmarks/CoyoteBench/lpbench: 19.92% (18.57 => 14.87)
 singlesource/Benchmarks/McGill/chomp: 42.99% (5.49 => 3.13)
 singlesource/Benchmarks/McGill/queens: 17.86% (5.32 => 4.37)
 singlesource/Benchmarks/Misc/fbench: 14.76% (9.08 => 7.74)
 singlesource/Benchmarks/Misc/ffbench: 38.71% (6.69 => 4.10)
 singlesource/Benchmarks/Misc/flops-1: 31.16% (5.52 => 3.80)
 singlesource/Benchmarks/Misc/flops-3: 7.05% (5.67 => 5.27)
 singlesource/Benchmarks/Misc/flops-6: 6.12% (6.05 => 5.68)
 singlesource/Benchmarks/Misc/flops-8: 16.54% (5.32 => 4.44)
 singlesource/Benchmarks/Misc/oourafft: 11.52% (13.97 => 12.36)
 singlesource/Benchmarks/Misc/whetstone: 6.10% (6.07 => 5.70)
 singlesource/Benchmarks/Shootout/fib2: 26.56% (5.61 => 4.12)
 singlesource/Benchmarks/Shootout/hash: 5.97% (7.37 => 6.93)
 singlesource/Benchmarks/Shootout/sieve: 12.03% (12.80 => 11.26)
 multisource/Applications/JM/lencod/lencod: 13.63% (28.17 => 24.33)
 multisource/Applications/SIBsim4/SIBsim4: 18.10% (10.94 => 8.96)
 multisource/Applications/viterbi/viterbi: 32.01% (24.27 => 16.50)
 multisource/Benchmarks/McCat/12-IOtest/iotest: 45.24% (0.84 => 0.46)
 multisource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: 39.60% (1.49 => 0.90)
 multisource/Benchmarks/MiBench/automotive-susan/automotive-susan: 41.46% (0.82 => 0.48)
 multisource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: 39.12% (12.55 => 7.64)
 multisource/Benchmarks/MiBench/network-patricia/network-patricia: 52.73% (0.55 => 0.26)
 multisource/Benchmarks/MiBench/office-ispell/office-ispell: 41.49% (0.94 => 0.55)
 multisource/Benchmarks/Olden/bisort/bisort: 29.91% (2.24 => 1.57)
 multisource/Benchmarks/Olden/em3d/em3d: 48.97% (8.76 => 4.47)
 multisource/Benchmarks/Olden/power/power: 26.86% (5.10 => 3.73)
 multisource/Benchmarks/Olden/treeadd/treeadd: 34.50% (8.87 => 5.81)
 multisource/Benchmarks/Olden/tsp/tsp: 37.99% (5.66 => 3.51)
LLC:
 singlesource/Benchmarks/BenchmarkGame/partialsums: 13.11% (2.44 => 2.12)
 singlesource/Benchmarks/CoyoteBench/lpbench: 8.04% (16.17 => 14.87)
 singlesource/Benchmarks/McGill/chomp: 45.86% (5.80 => 3.14)
 singlesource/Benchmarks/Misc/fbench: 15.50% (8.97 => 7.58)
 singlesource/Benchmarks/Misc/ffbench: 31.05% (5.54 => 3.82)
 singlesource/Benchmarks/Misc/flops-1: 31.52% (5.52 => 3.78)
 singlesource/Benchmarks/Misc/flops-2: 7.29% (2.88 => 2.67)
 singlesource/Benchmarks/Misc/flops-3: 13.91% (6.11 => 5.26)
 singlesource/Benchmarks/Misc/oourafft: 14.45% (14.60 => 12.49)
 singlesource/Benchmarks/Misc/richards_benchmark: 45.61% (2.85 => 1.55)
 singlesource/Benchmarks/Shootout/hash: 10.78% (7.61 => 6.79)
 singlesource/Benchmarks/Shootout/objinst: 16.55% (8.94 => 7.46)
 singlesource/Benchmarks/Shootout/sieve: 17.86% (12.54 => 10.30)
 singlesource/UnitTests/Vector/multiplies: -13.30% (6.69 => 7.58)
 multisource/Applications/aha/aha: 8.56% (4.79 => 4.38)
 multisource/Applications/viterbi/viterbi: 30.87% (23.84 => 16.48)
 multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: 15.79% (1.52 => 1.28)
 multisource/Benchmarks/Fhourstones/fhourstones: 28.15% (4.44 => 3.19)
 multisource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: 36.03% (1.36 => 0.87)
 multisource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: 42.86% (0.56 => 0.32)
 multisource/Benchmarks/Olden/em3d/em3d: 48.05% (8.72 => 4.53)
 multisource/Benchmarks/Olden/power/power: 26.69% (5.02 => 3.68)
 multisource/Benchmarks/Olden/treeadd/treeadd: 33.22% (8.67 => 5.79)
 multisource/Benchmarks/Olden/tsp/tsp: 36.91% (5.50 => 3.47)
 multisource/Benchmarks/Ptrdist/yacr2/yacr2: 30.67% (2.25 => 1.56)
GCCAS:
 multisource/Applications/Burg/burg: 37.55% (1.0760 => 0.6720)
 multisource/Applications/ClamAV/clamscan: 22.32% (9.6405 => 7.4884)
 multisource/Applications/JM/ldecod/ldecod: 10.14% (3.8642 => 3.4722)
 multisource/Applications/SPASS/SPASS: 13.20% (5.9403 => 5.1563)
 multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: 5.27% (6.8364 => 6.4764)
 multisource/Benchmarks/MiBench/automotive-susan/automotive-susan: 42.68% (0.6560 => 0.3760)
 multisource/Benchmarks/MiBench/consumer-lame/consumer-lame: 41.87% (3.0762 => 1.7881)
 multisource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: 42.08% (16.7890 => 9.7246)
 multisource/Benchmarks/MiBench/office-ispell/office-ispell: 38.80% (1.1960 => 0.7320)
 multisource/Benchmarks/MiBench/security-rijndael/security-rijndael: 34.01% (1.3880 => 0.9160)
LLC-BETA compile:
 multisource/Applications/JM/ldecod/ldecod: 10.67% (5.5883 => 4.9923)
 multisource/Applications/JM/lencod/lencod: 39.38% (17.9611 => 10.8886)
 multisource/Applications/SPASS/SPASS: 8.63% (10.7046 => 9.7806)
 multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: 39.38% (7.9644 => 4.8283)
 multisource/Benchmarks/MiBench/automotive-susan/automotive-susan: 35.37% (0.5880 => 0.3800)
 multisource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: 25.19% (11.5767 => 8.6605)
 multisource/Benchmarks/MiBench/office-ispell/office-ispell: 15.58% (1.5920 => 1.3440)
LLC compile:
 multisource/Applications/JM/lencod/lencod: 26.35% (14.8009 => 10.9006)
 multisource/Applications/SPASS/SPASS: 6.13% (10.4406 => 9.8006)
 multisource/Applications/treecc/treecc: 26.22% (3.3722 => 2.4881)
 multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: 37.47% (7.7404 => 4.8402)
 multisource/Benchmarks/MiBench/automotive-susan/automotive-susan: 43.11% (0.6680 => 0.3800)
 multisource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: 39.23% (14.3248 => 8.7045)
 multisource/Benchmarks/MiBench/office-ispell/office-ispell: 40.18% (2.2801 => 1.3640)
 multisource/Benchmarks/Prolangs-C/agrep/agrep: 23.63% (0.9480 => 0.7240)
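
For reference, the percentages above appear to be the relative reduction
from the old value to the new one in each "old => new" pair, so positive
numbers are improvements (the metric went down) and negative numbers are
regressions. A minimal Python sketch, assuming that reading of the data:

    def percent_change(old: float, new: float) -> float:
        # Relative reduction from old to new, in percent.
        return (old - new) / old * 100.0

    # CBE fannkuch above, 6.11 => 5.57:
    print(f"{percent_change(6.11, 5.57):.2f}%")  # 8.84%
    # CBE Shootout/ary3 above, 6.01 => 6.32 (a regression):
    print(f"{percent_change(6.01, 6.32):.2f}%")  # -5.16%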



