[llvm-testresults] cfarm x86_64 nightly tester results

Apache apache at cs.uiuc.edu
Fri Dec 19 07:26:14 PST 2008


http://llvm.org/nightlytest/test.php?machine=302&night=8979
Name: gcc11
Nickname: cfarm
Buildstatus: OK

New Test Passes:
None

New Test Failures:
None

Added Tests:
test/Analysis/BasicAA/nocapture.ll
test/CodeGen/X86/widen_arith-1.ll
test/CodeGen/X86/widen_arith-2.ll
test/CodeGen/X86/widen_arith-3.ll
test/CodeGen/X86/widen_arith-4.ll
test/CodeGen/X86/widen_arith-5.ll
test/CodeGen/X86/widen_arith-6.ll
test/CodeGen/X86/widen_cast-1.ll
test/CodeGen/X86/widen_cast-2.ll
test/CodeGen/X86/widen_cast-3.ll
test/CodeGen/X86/widen_cast-4.ll
test/CodeGen/X86/widen_cast-5.ll
test/CodeGen/X86/widen_conv-1.ll
test/CodeGen/X86/widen_conv-2.ll
test/CodeGen/X86/widen_conv-3.ll
test/CodeGen/X86/widen_conv-4.ll
test/CodeGen/X86/widen_select-1.ll
test/CodeGen/X86/widen_shuffle-1.ll
test/CodeGen/X86/widen_shuffle-2.ll


Removed Tests:
None

Significant changes in test results (positive percentages mean the measured value decreased, negative percentages mean it increased; the calculation is sketched after the tables):
JIT:
 singlesource/Benchmarks/CoyoteBench/fftbench: -27.29% (4.14 => 5.27)
 singlesource/Benchmarks/CoyoteBench/lpbench: -7.20% (15.13 => 16.22)
 singlesource/Benchmarks/Misc/flops: -15.37% (12.88 => 14.86)
 singlesource/Benchmarks/Shootout-C++/ackermann: -8.98% (3.23 => 3.52)
 singlesource/Benchmarks/Shootout/methcall: -5.15% (8.55 => 8.99)
 singlesource/Benchmarks/Shootout/objinst: 8.94% (7.05 => 6.42)
 multisource/Applications/Burg/burg: 6.57% (4.72 => 4.41)
 multisource/Applications/aha/aha: 10.31% (5.82 => 5.22)
 multisource/Benchmarks/Olden/treeadd/treeadd: 6.91% (7.96 => 7.41)
 multisource/Benchmarks/Prolangs-C++/life/life: -13.36% (2.62 => 2.97)
LLC:
 singlesource/Benchmarks/CoyoteBench/fftbench: 22.11% (4.84 => 3.77)
 singlesource/Benchmarks/Misc/ffbench: 12.57% (3.74 => 3.27)
 singlesource/Benchmarks/Shootout-C++/hash2: 9.33% (8.57 => 7.77)
 multisource/Applications/lambda-0.1.3/lambda: -5.72% (7.69 => 8.13)
LLC-BETA:
 singlesource/Benchmarks/Misc/ffbench: 8.52% (3.99 => 3.65)
 singlesource/Benchmarks/Misc/flops-1: 8.39% (4.29 => 3.93)
 multisource/Benchmarks/Olden/treeadd/treeadd: -8.01% (7.24 => 7.82)
 multisource/Benchmarks/VersaBench/bmm/bmm: 8.11% (8.38 => 7.70)
CBE:
 singlesource/Benchmarks/Shootout/methcall: 13.41% (9.84 => 8.52)
 multisource/Benchmarks/Olden/treeadd/treeadd: -8.26% (7.14 => 7.73)
 multisource/Benchmarks/VersaBench/bmm/bmm: 5.10% (8.83 => 8.38)
GCCAS:
 multisource/Applications/Burg/burg: 5.03% (4.0562 => 3.8522)
 multisource/Applications/JM/ldecod/ldecod: 5.73% (23.0294 => 21.7093)
 multisource/Applications/JM/lencod/lencod: 5.91% (50.9631 => 47.9510)
 multisource/Applications/SIBsim4/SIBsim4: 6.84% (3.7402 => 3.4842)
 multisource/Applications/d/make_dparser: 5.31% (10.3886 => 9.8366)
 multisource/Applications/lua/lua: 5.54% (14.3809 => 13.5848)
 multisource/Applications/oggenc/oggenc: 5.99% (12.8287 => 12.0607)
 multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: 7.92% (25.0895 => 23.1014)
 multisource/Benchmarks/MallocBench/espresso/espresso: 7.00% (11.3126 => 10.5206)
 multisource/Benchmarks/MallocBench/gs/gs: 5.10% (10.3526 => 9.8246)
 multisource/Benchmarks/MiBench/automotive-susan/automotive-susan: 8.29% (2.5561 => 2.3441)
 multisource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: 6.34% (10.7966 => 10.1126)
 multisource/Benchmarks/MiBench/consumer-lame/consumer-lame: 6.26% (10.4246 => 9.7725)
 multisource/Benchmarks/MiBench/office-ispell/office-ispell: 5.55% (4.4002 => 4.1562)
 multisource/Benchmarks/Prolangs-C/TimberWolfMC/timberwolfmc: 6.56% (15.6809 => 14.6529)
 multisource/Benchmarks/Prolangs-C/agrep/agrep: 6.90% (3.0161 => 2.8081)
 multisource/Benchmarks/Prolangs-C/bison/mybison: 7.68% (2.7081 => 2.5001)
 multisource/Benchmarks/Ptrdist/bc/bc: 5.98% (3.4122 => 3.2082)
 multisource/Benchmarks/Trimaran/enc-3des/enc-3des: 8.07% (2.8761 => 2.6441)
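
The percentages above appear to be derived from the old and new measurements
shown in parentheses (old => new). A minimal Python sketch of that calculation,
assuming the delta is taken relative to the old value; the helper name
percent_change is ours for illustration and is not part of the nightly tester:

    def percent_change(old, new):
        # Positive when the new measurement is smaller than the old one,
        # negative when it grew (a regression for time-based metrics).
        return (old - new) / old * 100.0

    # Reproduces the fftbench rows above:
    print("%.2f%%" % percent_change(4.14, 5.27))  # JIT: -27.29%
    print("%.2f%%" % percent_change(4.84, 3.77))  # LLC:  22.11%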