[llvm-testresults] smoosh-01 nightly tester results
daniel_dunbar at apple.com
Wed Mar 31 00:33:39 PDT 2010
http://smooshlab.apple.com/perf/db_nt_internal/nightlytest/6867/
Nickname: smoosh-01:1
Name: smoosh-01
Run: 6867, Start Time: 2010-03-30 23:00:39, End Time: 2010-03-31 00:30:48
Comparing To: 6865, Start Time: 2010-03-30 21:19:47, End Time: 2010-03-30 22:48:17
--- Changes Summary ---
New Test Passes: 0
New Test Failures: 0
Added Tests: 0
Removed Tests: 0
Significant Changes: 27
--- Tests Summary ---
Total Tests: 3582
Total Test Failures: 624
Total Test Failures By Type:
  Bitcode: 66
  CBE: 84
  GCCAS: 66
  JIT: 72
  JIT codegen: 72
  LLC: 66
  LLC compile: 66
  LLC-BETA: 66
  LLC-BETA compile: 66
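As a quick sanity check (the report does not state this explicitly, but the numbers work out), the per-type failure counts above sum to the overall "Total Test Failures" figure, which suggests each failing program is counted once per failing test type. A minimal sketch in Python:

    # Hedged sanity check: the per-type failure counts copied from this report
    # sum to the "Total Test Failures" figure above (624).
    failures_by_type = {
        "Bitcode": 66,
        "CBE": 84,
        "GCCAS": 66,
        "JIT": 72,
        "JIT codegen": 72,
        "LLC": 66,
        "LLC compile": 66,
        "LLC-BETA": 66,
        "LLC-BETA compile": 66,
    }
    assert sum(failures_by_type.values()) == 624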
--- Changes Detail ---
New Test Passes:
New Test Failures:
Added Tests:
Removed Tests:
Significant Changes in Test Results:
JIT codegen:
  MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: 8.37% (0.4768 => 0.5167)
  MultiSource/Benchmarks/MiBench/consumer-lame/consumer-lame: 7.59% (1.9356 => 2.0825)
  MultiSource/Benchmarks/mediabench/gsm/toast/toast: 5.18% (0.2451 => 0.2578)
CBE:
  MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: 8.61% (3.3700 => 3.6600)
LLC-BETA:
  MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: 9.17% (3.3800 => 3.6900)
  MultiSource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: 7.50% (0.4000 => 0.4300)
LLC compile:
  MultiSource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: 5.48% (1.0963 => 1.1564)
  MultiSource/Benchmarks/MiBench/consumer-lame/consumer-lame: 6.05% (2.2751 => 2.4128)
  MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: 8.16% (0.9492 => 1.0267)
GCCAS:
  MultiSource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: 6.73% (1.5920 => 1.6991)
  MultiSource/Benchmarks/mediabench/gsm/toast/toast: 5.45% (0.5409 => 0.5704)
  MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: 5.53% (1.6156 => 1.7049)
  MultiSource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: 9.98% (0.5009 => 0.5509)
JIT:
  MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: 7.14% (0.5600 => 0.6000)
  MultiSource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: 6.45% (0.6200 => 0.6600)
  MultiSource/Benchmarks/MiBench/consumer-lame/consumer-lame: 7.73% (2.3300 => 2.5100)
  MultiSource/Benchmarks/Prolangs-C/TimberWolfMC/timberwolfmc: 6.98% (0.4300 => 0.4600)
  MultiSource/Benchmarks/Prolangs-C/cdecl/cdecl: 5.26% (0.3800 => 0.4000)
  MultiSource/Benchmarks/mediabench/gsm/toast/toast: 6.90% (0.2900 => 0.3100)
  MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: 5.80% (0.6900 => 0.7300)
LLC:
  MultiSource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: 7.50% (0.4000 => 0.4300)
  MultiSource/Benchmarks/MiBench/consumer-lame/consumer-lame: 6.90% (0.2900 => 0.3100)
LLC-BETA compile:
  MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: 6.16% (3.6998 => 3.9278)
  MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: 6.89% (0.5112 => 0.5464)
  MultiSource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: 6.04% (1.0973 => 1.1636)
  MultiSource/Benchmarks/MiBench/consumer-lame/consumer-lame: 6.79% (2.2751 => 2.4295)
  MultiSource/Benchmarks/Prolangs-C/agrep/agrep: 5.75% (0.6209 => 0.6566)
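For anyone re-deriving these figures by hand, the percentages above are consistent with a plain relative change between the baseline run (6865) and this run (6867). A minimal sketch, assuming the tester reports (new - old) / old * 100 and flags changes above roughly 5% as significant (neither the formula nor the threshold is stated in the report itself):

    def percent_change(old, new):
        # Relative change of the new measurement versus the old one, in percent.
        return (new - old) / old * 100.0

    # Examples taken from the detail list above:
    print("%.2f%%" % percent_change(0.4768, 0.5167))  # JIT codegen, automotive-susan -> 8.37%
    print("%.2f%%" % percent_change(3.3700, 3.6600))  # CBE, smg2000 -> 8.61%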