[llvm-testresults] lauro-armv5te-softfloat i686 nightly tester results

From: Apache (apache at cs.uiuc.edu)
Date: Tue Feb 26 01:27:25 PST 2008


http://llvm.org/nightlytest/test.php?machine=246&night=5094
Name: laurov-desktop
Nickname: lauro-armv5te-softfloat
Buildstatus: OK

New Test Passes:
None

New Test Failures:
None

Added Tests:
None

Removed Tests:
None

Significant changes in test results (see the note on the percentage figures after the lists):
GCCAS:
 multisource/Applications/ClamAV/clamscan: 30.52% (8.8485 => 6.1483)
 multisource/Applications/JM/ldecod/ldecod: -5.42% (3.9842 => 4.2002)
 multisource/Applications/JM/lencod/lencod: -12.67% (23.9655 => 27.0016)
 multisource/Applications/d/make_dparser: 19.40% (2.5361 => 2.0441)
 multisource/Applications/oggenc/oggenc: -11.02% (2.3241 => 2.5801)
LLC:
 multisource/Applications/JM/ldecod/ldecod: -14.99% (4.87 => 5.60)
 multisource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -9.02% (35.13 => 38.30)
 multisource/Benchmarks/Ptrdist/anagram/anagram: 21.71% (114.03 => 89.27)
 multisource/Benchmarks/SciMark2-C/scimark2: -13.95% (28.82 => 32.84)
 multisource/Benchmarks/llubenchmark/llu: -11.52% (303.78 => 338.77)
LLC compile:
 multisource/Applications/SIBsim4/SIBsim4: 25.10% (0.9880 => 0.7400)
 multisource/Applications/SPASS/SPASS: 16.13% (11.9567 => 10.0286)
 multisource/Applications/d/make_dparser: 12.50% (2.1441 => 1.8761)
 multisource/Applications/siod/siod: 23.22% (5.0123 => 3.8482)
LLC-BETA compile:
 multisource/Applications/SIBsim4/SIBsim4: 33.33% (1.0680 => 0.7120)
 multisource/Applications/SPASS/SPASS: -10.22% (10.7606 => 11.8607)
 multisource/Applications/siod/siod: -14.18% (4.2322 => 4.8323)
CBE:
 multisource/Benchmarks/FreeBench/mason/mason: -7.45% (13.96 => 15.00)
 multisource/Benchmarks/McCat/12-IOtest/iotest: -6.26% (11.03 => 11.72)
 multisource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -6.83% (35.12 => 37.52)
 multisource/Benchmarks/Prolangs-C++/life/life: -17.84% (8.97 => 10.57)
 multisource/Benchmarks/llubenchmark/llu: -7.06% (302.06 => 323.38)
LLC-BETA:
 multisource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -9.36% (34.52 => 37.75)
 multisource/Benchmarks/MiBench/automotive-bitcount/automotive-bitcount: -11.55% (5.28 => 5.89)
 multisource/Benchmarks/MiBench/security-rijndael/security-rijndael: -11.05% (3.80 => 4.22)
 multisource/Benchmarks/Trimaran/enc-3des/enc-3des: 7.18% (7.66 => 7.11)
 multisource/Benchmarks/llubenchmark/llu: -8.16% (299.82 => 324.29)
 multisource/Benchmarks/sim/sim: 9.17% (484.76 => 440.33)
 multisource/Benchmarks/tramp3d-v4/tramp3d-v4: -11.88% (72.07 => 80.63)
Bytecode:
 multisource/Benchmarks/VersaBench/bmm/bmm: 8.94% (3400 => 3096)
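
A note on the percentage figures: the report does not state how they are computed, but every entry above is consistent with the relative change against the previous night's value, where a positive percentage means the metric shrank (an improvement for compile times, run times, and bytecode size) and a negative one means a regression. A minimal sketch of that calculation in Python, using values taken from the GCCAS list above:

def percent_change(old: float, new: float) -> float:
    # Relative change versus the previous night's value.
    # Positive = metric got smaller (improvement); negative = regression.
    return (old - new) / old * 100.0

# multisource/Applications/ClamAV/clamscan, reported as 30.52% (8.8485 => 6.1483)
print(f"{percent_change(8.8485, 6.1483):.2f}%")    # 30.52%

# multisource/Applications/JM/lencod/lencod, reported as -12.67% (23.9655 => 27.0016)
print(f"{percent_change(23.9655, 27.0016):.2f}%")  # -12.67%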
