[llvm-testresults] cfarm-x86-64 x86_64 nightly tester results

Apache apache at cs.uiuc.edu
Sat Mar 28 12:08:04 PDT 2009


http://llvm.org/nightlytest/test.php?machine=394&night=10217
Name: gcc11
Nickname: cfarm-x86-64
Buildstatus: OK

New Test Passes:
test/CodeGen/PowerPC/int-fp-conv-0.ll [DEJAGNU]
Applications/JM/lencod/lencod [LLC compile, LLC-BETA compile, CBE, LLC, LLC-BETA]
Applications/minisat/minisat [LLC compile, LLC-BETA compile, CBE, LLC]
Applications/obsequi/Obsequi [LLC compile, LLC-BETA compile, LLC]
Benchmarks/FreeBench/analyzer/analyzer [LLC compile, LLC-BETA compile, CBE, LLC, LLC-BETA]
Benchmarks/FreeBench/distray/distray [LLC compile, LLC-BETA compile, CBE, LLC, LLC-BETA]
Benchmarks/MallocBench/espresso/espresso [LLC compile, LLC-BETA compile, CBE, LLC, LLC-BETA]
Benchmarks/McCat/18-imp/imp [LLC compile, LLC-BETA compile, CBE, LLC, LLC-BETA]
Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg [LLC compile, LLC-BETA compile, CBE, LLC, LLC-BETA]
Benchmarks/MiBench/consumer-jpeg/consumer-jpeg [LLC compile, LLC-BETA compile, CBE, LLC, LLC-BETA]
Benchmarks/Trimaran/enc-rc4/enc-rc4 [LLC compile, LLC-BETA compile, CBE, LLC, LLC-BETA]


New Test Failures:
None

Added Tests:
test/CodeGen/X86/imul-lea-2.ll
test/CodeGen/X86/tailcall-i1.ll
test/CodeGen/X86/tailcall-structret.ll
test/CodeGen/X86/tailcall-void.ll


Removed Tests:
None

Significant changes in test results (percent change, old => new; positive = improvement):
GCCAS:
 singlesource/Benchmarks/Adobe-C++/loop_unroll: 8.30% (8.3365 => 7.6444)
 singlesource/Benchmarks/Adobe-C++/stepanov_abstraction: 11.23% (2.1721 => 1.9281)
 singlesource/Benchmarks/Adobe-C++/stepanov_vector: 8.55% (2.4321 => 2.2241)
 multisource/Applications/Burg/burg: 13.23% (4.9003 => 4.2522)
 multisource/Applications/ClamAV/clamscan: 10.07% (49.8071 => 44.7908)
 multisource/Applications/JM/ldecod/ldecod: 9.29% (25.3295 => 22.9774)
 multisource/Applications/JM/lencod/lencod: 9.91% (54.8394 => 49.4070)
 multisource/Applications/SIBsim4/SIBsim4: 11.65% (4.2562 => 3.7602)
 multisource/Applications/SPASS/SPASS: 9.89% (59.9837 => 54.0513)
 multisource/Applications/d/make_dparser: 12.78% (12.3647 => 10.7846)
 multisource/Applications/hbd/hbd: 10.92% (2.6001 => 2.3161)
 multisource/Applications/kimwitu++/kc: 8.44% (42.3386 => 38.7664)
 multisource/Applications/lemon/lemon: 9.25% (3.2882 => 2.9841)
 multisource/Applications/lua/lua: 9.57% (16.1690 => 14.6209)
 multisource/Applications/minisat/minisat: 11.30% (2.3001 => 2.0401)
 multisource/Applications/obsequi/Obsequi: 9.56% (3.7642 => 3.4042)
 multisource/Applications/oggenc/oggenc: 9.95% (14.4328 => 12.9968)
 multisource/Applications/siod/siod: 6.65% (7.1004 => 6.6284)
 multisource/Applications/sqlite3/sqlite3: 9.00% (38.1543 => 34.7221)
 multisource/Applications/treecc/treecc: 11.64% (7.5604 => 6.6804)
 multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: 14.07% (35.0821 => 30.1458)
 multisource/Benchmarks/FreeBench/pifft/pifft: 11.70% (2.7001 => 2.3841)
 multisource/Benchmarks/MallocBench/espresso/espresso: 8.93% (12.4127 => 11.3047)
 multisource/Benchmarks/MallocBench/gs/gs: 9.60% (12.0007 => 10.8486)
 multisource/Benchmarks/MiBench/automotive-susan/automotive-susan: 14.60% (2.8761 => 2.4561)
 multisource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: 8.41% (12.1247 => 11.1046)
 multisource/Benchmarks/MiBench/consumer-lame/consumer-lame: 10.11% (11.9527 => 10.7446)
 multisource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: 9.96% (66.5961 => 59.9637)
 multisource/Benchmarks/MiBench/office-ispell/office-ispell: 8.27% (4.7882 => 4.3922)
 multisource/Benchmarks/MiBench/telecomm-gsm/telecomm-gsm: 12.40% (3.4202 => 2.9961)
 multisource/Benchmarks/PAQ8p/paq8p: 7.75% (8.1565 => 7.5244)
 multisource/Benchmarks/Prolangs-C/TimberWolfMC/timberwolfmc: 10.28% (18.7171 => 16.7930)
 multisource/Benchmarks/Prolangs-C/agrep/agrep: 8.16% (3.4322 => 3.1521)
 multisource/Benchmarks/Prolangs-C/bison/mybison: 8.75% (3.0641 => 2.7961)
 multisource/Benchmarks/Prolangs-C/football/football: 10.08% (3.6122 => 3.2482)
 multisource/Benchmarks/Prolangs-C/unix-tbl/unix-tbl: 8.54% (2.5761 => 2.3561)
 multisource/Benchmarks/Ptrdist/bc/bc: 10.63% (3.8002 => 3.3962)
 multisource/Benchmarks/Trimaran/enc-3des/enc-3des: 18.64% (3.4122 => 2.7761)
 multisource/Benchmarks/mafft/pairlocalalign: 7.57% (29.4778 => 27.2457)
 multisource/Benchmarks/mediabench/gsm/toast/toast: 12.37% (3.3962 => 2.9761)
 multisource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: 10.24% (12.4607 => 11.1847)
 multisource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: 7.38% (4.0122 => 3.7162)
 multisource/Benchmarks/tramp3d-v4/tramp3d-v4: 8.38% (73.3165 => 67.1721)
JIT:
 singlesource/Benchmarks/Adobe-C++/loop_unroll: -6.75% (16.16 => 17.25)
 singlesource/Benchmarks/BenchmarkGame/nsieve-bits: 18.25% (2.85 => 2.33)
 singlesource/Benchmarks/CoyoteBench/almabench: 10.61% (28.76 => 25.71)
 singlesource/Benchmarks/CoyoteBench/fftbench: 22.73% (5.28 => 4.08)
 singlesource/Benchmarks/Misc/ffbench: -8.43% (3.56 => 3.86)
 singlesource/Benchmarks/Shootout/lists: 5.06% (9.49 => 9.01)
 multisource/Applications/aha/aha: 72.98% (25.24 => 6.82)
 multisource/Applications/lambda-0.1.3/lambda: 11.67% (12.08 => 10.67)
 multisource/Applications/viterbi/viterbi: 11.66% (25.56 => 22.58)
CBE:
 singlesource/Benchmarks/BenchmarkGame/nsieve-bits: 10.79% (2.78 => 2.48)
 singlesource/Benchmarks/CoyoteBench/almabench: 10.90% (29.37 => 26.17)
 singlesource/Benchmarks/CoyoteBench/fftbench: -28.85% (3.64 => 4.69)
 singlesource/Benchmarks/Misc/ffbench: 16.39% (3.66 => 3.06)
 singlesource/Benchmarks/Shootout/hash: -31.94% (9.30 => 12.27)
 singlesource/Benchmarks/Shootout/methcall: 21.07% (7.50 => 5.92)
 multisource/Applications/lambda-0.1.3/lambda: 11.36% (7.92 => 7.02)
 multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: 9.10% (11.98 => 10.89)
LLC:
 singlesource/Benchmarks/BenchmarkGame/nsieve-bits: -11.89% (2.44 => 2.73)
 singlesource/Benchmarks/BenchmarkGame/puzzle: -20.30% (1.33 => 1.60)
 singlesource/Benchmarks/CoyoteBench/almabench: 9.90% (27.97 => 25.20)
 singlesource/Benchmarks/CoyoteBench/fftbench: 23.14% (4.97 => 3.82)
 singlesource/Benchmarks/Misc/ffbench: 6.01% (3.66 => 3.44)
 singlesource/Benchmarks/Shootout/hash: 24.12% (12.15 => 9.22)
 multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: 9.21% (11.94 => 10.84)
LLC-BETA:
 singlesource/Benchmarks/BenchmarkGame/nsieve-bits: -24.09% (2.20 => 2.73)
 singlesource/Benchmarks/CoyoteBench/almabench: 9.36% (27.88 => 25.27)
 singlesource/Benchmarks/Misc/ffbench: -7.82% (3.58 => 3.86)
 multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: 9.81% (11.72 => 10.57)
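The percentages above appear to be the relative change from the old value to the new one, with a positive percentage meaning the new number is smaller (an improvement) and a negative one a regression. A minimal sketch of that calculation, spot-checked against entries from the report (the function name is ours, not part of the tester):

```python
def percent_change(old: float, new: float) -> float:
    # Relative improvement from old to new: positive when the new
    # value is smaller (faster), negative when it regressed.
    return (old - new) / old * 100.0

# Spot-checks against lines in this report:
# GCCAS loop_unroll: 8.30% (8.3365 => 7.6444)
assert abs(percent_change(8.3365, 7.6444) - 8.30) < 0.01
# JIT loop_unroll: -6.75% (16.16 => 17.25), a regression
assert abs(percent_change(16.16, 17.25) - (-6.75)) < 0.01
# JIT aha: 72.98% (25.24 => 6.82)
assert abs(percent_change(25.24, 6.82) - 72.98) < 0.01
```

Note that the formula divides by the old value, so a slowdown cannot reach -100% symmetrically with a speedup; the sign alone indicates direction.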
