[llvm-testresults] lauro-thumbv5te-softfloat i686 nightly tester results

Apache apache at cs.uiuc.edu
Thu Jun 21 07:57:43 PDT 2007


http://llvm.org/nightlytest/test.php?machine=143&night=3042
Name: laurov-desktop
Nickname: lauro-thumbv5te-softfloat
Buildstatus: OK

New Test Passes:
None

New Test Failures:
None

Added Tests:
None

Removed Tests:
None

Significant changes in test results:
CBE:
 singlesource/Benchmarks/Shootout-C++/lists1: 21.38% (2.76 => 2.17)
 singlesource/Benchmarks/Shootout/objinst: 9.36% (56.52 => 51.23)
 multisource/Applications/d/make_dparser: 5.90% (3.39 => 3.19)
 multisource/Applications/spiff/spiff: -10671.43% (0.07 => 7.54)
 multisource/Benchmarks/llubenchmark/llu: 5.92% (383.02 => 360.33)
GCCAS:
 multisource/Applications/Burg/burg: -2062.50% (0.0320 => 0.6920)
 multisource/Applications/JM/ldecod/ldecod: -2311.57% (0.1400 => 3.3762)
 multisource/Applications/JM/lencod/lencod: -9037.59% (0.3240 => 29.6058)
 multisource/Applications/SIBsim4/SIBsim4: -2350.00% (0.0240 => 0.5880)
 multisource/Applications/SPASS/SPASS: -2886.98% (0.3640 => 10.8726)
 multisource/Applications/d/make_dparser: -2341.32% (0.0680 => 1.6601)
 multisource/Applications/hbd/hbd: -2125.00% (0.0160 => 0.3560)
 multisource/Applications/kimwitu++/kc: -1965.60% (0.4520 => 9.3365)
 multisource/Applications/lambda-0.1.3/lambda: -1266.67% (0.0240 => 0.3280)
 multisource/Applications/obsequi/Obsequi: -1975.00% (0.0160 => 0.3320)
 multisource/Applications/oggenc/oggenc: -1580.08% (0.1200 => 2.0161)
 multisource/Applications/siod/siod: -1269.23% (0.1040 => 1.4240)
 multisource/Applications/spiff/spiff: -1575.00% (0.0160 => 0.2680)
 multisource/Applications/treecc/treecc: -1800.00% (0.0520 => 0.9880)
 multisource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -2728.10% (0.1000 => 2.8281)
 multisource/Benchmarks/FreeBench/pifft/pifft: -1850.00% (0.0160 => 0.3120)
 multisource/Benchmarks/MallocBench/cfrac/cfrac: -1180.00% (0.0200 => 0.2560)
 multisource/Benchmarks/MallocBench/espresso/espresso: -1462.50% (0.0960 => 1.5000)
 multisource/Benchmarks/MallocBench/gs/gs: -1820.94% (0.0960 => 1.8441)
 multisource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: -1900.00% (0.0720 => 1.4400)
 multisource/Benchmarks/MiBench/consumer-lame/consumer-lame: -2281.25% (0.0640 => 1.5240)
 multisource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: -6051.54% (0.3440 => 21.1613)
 multisource/Benchmarks/MiBench/office-ispell/office-ispell: -2625.00% (0.0320 => 0.8720)
 multisource/Benchmarks/MiBench/telecomm-gsm/telecomm-gsm: -2350.00% (0.0160 => 0.3920)
 multisource/Benchmarks/Prolangs-C/TimberWolfMC/timberwolfmc: -2617.95% (0.1120 => 3.0441)
 multisource/Benchmarks/Prolangs-C/agrep/agrep: -2220.00% (0.0200 => 0.4640)
 multisource/Benchmarks/Prolangs-C/archie-client/archie: -1733.33% (0.0120 => 0.2200)
 multisource/Benchmarks/Prolangs-C/assembler/assembler: -1300.00% (0.0160 => 0.2240)
 multisource/Benchmarks/Prolangs-C/bison/mybison: -1437.50% (0.0320 => 0.4920)
 multisource/Benchmarks/Prolangs-C/compiler/compiler: -2800.00% (0.0080 => 0.2320)
 multisource/Benchmarks/Prolangs-C/football/football: -1700.00% (0.0200 => 0.3600)
 multisource/Benchmarks/Prolangs-C/simulator/simulator: -1340.00% (0.0200 => 0.2880)
 multisource/Benchmarks/Prolangs-C/unix-tbl/unix-tbl: -2400.00% (0.0160 => 0.4000)
 multisource/Benchmarks/Ptrdist/bc/bc: -1585.71% (0.0280 => 0.4720)
 multisource/Benchmarks/mediabench/gsm/toast/toast: -1780.00% (0.0200 => 0.3760)
 multisource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: -2237.50% (0.0640 => 1.4960)
 multisource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: -3080.00% (0.0200 => 0.6360)
 multisource/Benchmarks/tramp3d-v4/tramp3d-v4: -4422.89% (0.3360 => 15.1969)
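
The percentages in the report above appear to be computed relative to the old value, so a positive figure means the metric (a time, in seconds) went down and a negative figure means it went up. A minimal sketch of that arithmetic, assuming that convention (the function name `change_pct` is illustrative, not part of the nightly tester):

```python
def change_pct(old: float, new: float) -> float:
    """Percent change relative to the old value; positive = improvement
    (the measured time decreased)."""
    return (old - new) / old * 100.0

# Values taken from the report above.
print(round(change_pct(2.76, 2.17), 2))   # lists1 (CBE)
print(round(change_pct(0.07, 7.54), 2))   # spiff (CBE)
```

This matches the listed figures of 21.38% and -10671.43%; note that very small baselines (like spiff's 0.07s) produce the extreme percentages seen in the GCCAS section.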



