[llvm-testresults] cfarm x86_64 nightly tester results
From: Apache <apache at cs.uiuc.edu>
Date: Thu Nov 6 06:52:38 PST 2008
http://llvm.org/nightlytest/test.php?machine=302&night=8398

Name: gcc11
Nickname: cfarm
Buildstatus: OK

New Test Passes:
None

New Test Failures:
None

Added Tests:
test/CodeGen/PowerPC/delete-node.ll
test/DebugInfo/2008-11-05-InlinedFuncStart.ll

Removed Tests:
None

Significant changes in test results:
CBE:
singlesource/Benchmarks/CoyoteBench/fftbench: -15.32% (3.59 => 4.14)
singlesource/Benchmarks/Misc/ffbench: -8.33% (3.36 => 3.64)
singlesource/Benchmarks/Shootout/methcall: -51.39% (6.46 => 9.78)
multisource/Applications/lambda-0.1.3/lambda: 5.21% (7.48 => 7.09)
multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: 11.71% (11.87 => 10.48)
multisource/Benchmarks/Olden/treeadd/treeadd: 14.32% (8.38 => 7.18)

JIT:
singlesource/Benchmarks/CoyoteBench/fftbench: 10.66% (5.16 => 4.61)
singlesource/Benchmarks/Misc/ffbench: -8.24% (3.52 => 3.81)
multisource/Applications/lambda-0.1.3/lambda: 19.78% (13.70 => 10.99)
multisource/Benchmarks/MiBench/consumer-lame/consumer-lame: -5.70% (6.84 => 7.23)
multisource/Benchmarks/Olden/treeadd/treeadd: -7.23% (7.75 => 8.31)
multisource/Benchmarks/VersaBench/bmm/bmm: -5.71% (8.41 => 8.89)

LLC-BETA:
singlesource/Benchmarks/CoyoteBench/fftbench: 10.90% (4.77 => 4.25)
singlesource/Benchmarks/Misc/ffbench: -6.90% (3.48 => 3.72)
singlesource/Benchmarks/Shootout-C++/lists: -11.61% (8.44 => 9.42)
singlesource/Benchmarks/Shootout/methcall: 5.01% (9.38 => 8.91)
multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -5.36% (11.57 => 12.19)
multisource/Benchmarks/Olden/treeadd/treeadd: 13.78% (8.42 => 7.26)
multisource/Benchmarks/OptimizerEval/optimizer-eval: -5.64% (110.13 => 116.34)
multisource/Benchmarks/SciMark2-C/scimark2: 11.24% (34.97 => 31.04)
multisource/Benchmarks/VersaBench/bmm/bmm: -11.44% (7.69 => 8.57)

LLC:
singlesource/Benchmarks/Misc/ffbench: 10.88% (3.86 => 3.44)
singlesource/Benchmarks/Misc/flops-1: 5.50% (4.36 => 4.12)
singlesource/Benchmarks/Shootout-C++/lists: -8.43% (8.42 => 9.13)
singlesource/Benchmarks/Shootout/hash: -17.00% (4.94 => 5.78)
singlesource/Benchmarks/Shootout/methcall: -13.76% (8.14 => 9.26)
multisource/Applications/siod/siod: 7.73% (6.08 => 5.61)
multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -19.22% (9.99 => 11.91)
multisource/Benchmarks/SciMark2-C/scimark2: 11.22% (34.95 => 31.03)

GCCAS:
multisource/Applications/Burg/burg: 14.42% (7.0484 => 6.0323)
multisource/Applications/JM/ldecod/ldecod: 12.12% (31.7779 => 27.9257)
multisource/Applications/JM/lencod/lencod: 9.04% (59.6357 => 54.2473)
multisource/Applications/SIBsim4/SIBsim4: 7.70% (4.1042 => 3.7882)
multisource/Applications/SPASS/SPASS: 8.36% (57.7356 => 52.9113)
multisource/Applications/d/make_dparser: 11.46% (13.1648 => 11.6567)
multisource/Applications/lemon/lemon: 6.48% (3.2122 => 3.0041)
multisource/Applications/lua/lua: 7.39% (15.5849 => 14.4329)
multisource/Applications/oggenc/oggenc: 5.75% (12.8768 => 12.1367)
multisource/Applications/sqlite3/sqlite3: 6.58% (37.4983 => 35.0302)
multisource/Applications/treecc/treecc: 5.48% (6.8644 => 6.4884)
multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: 17.08% (51.5952 => 42.7826)
multisource/Benchmarks/MallocBench/espresso/espresso: 5.17% (11.5967 => 10.9966)
multisource/Benchmarks/MallocBench/gs/gs: 5.52% (10.8606 => 10.2606)
multisource/Benchmarks/MiBench/automotive-susan/automotive-susan: 12.95% (3.9522 => 3.4402)
multisource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: 6.49% (10.9766 => 10.2646)
multisource/Benchmarks/MiBench/office-ispell/office-ispell: 11.00% (5.2723 => 4.6922)
multisource/Benchmarks/Prolangs-C/TimberWolfMC/timberwolfmc: 9.41% (19.5532 => 17.7131)
multisource/Benchmarks/Prolangs-C/agrep/agrep: 10.66% (3.4522 => 3.0841)
multisource/Benchmarks/Prolangs-C/bison/mybison: 10.71% (2.8761 => 2.5681)
multisource/Benchmarks/Trimaran/enc-3des/enc-3des: 14.41% (3.1361 => 2.6841)
multisource/Benchmarks/mediabench/gsm/toast/toast: 7.10% (2.9281 => 2.7201)
multisource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: 6.99% (11.2247 => 10.4406)

LLC-BETA compile:
multisource/Applications/treecc/treecc: 7.89% (2.7881 => 2.5681)
multisource/Benchmarks/MiBench/consumer-lame/consumer-lame: -11.07% (1.9881 => 2.2081)

LLC compile:
multisource/Benchmarks/MallocBench/espresso/espresso: 5.97% (3.6162 => 3.4002)
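
How to read the entries above: the two numbers in parentheses appear to be the
previous and current measured times in seconds (run times for the benchmark
tables, compile times for the GCCAS and "compile" tables), and the leading
percentage is the relative change, so positive values are improvements and
negative values are regressions. A minimal Python sketch (not part of the
nightly tester scripts; the function name is illustrative) that reproduces the
figures:

    def percent_change(old, new):
        """Relative change of `new` versus `old`, in percent.
        Positive means the new time is smaller (an improvement)."""
        return (old - new) / old * 100.0

    # Example entry from the CBE table above:
    #   singlesource/Benchmarks/CoyoteBench/fftbench: -15.32% (3.59 => 4.14)
    assert round(percent_change(3.59, 4.14), 2) == -15.32

    # Example entry from the GCCAS table above:
    #   multisource/Applications/Burg/burg: 14.42% (7.0484 => 6.0323)
    assert round(percent_change(7.0484, 6.0323), 2) == 14.42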