[llvm-testresults] smoosh-01 nightly tester results

daniel_dunbar at apple.com
Wed Mar 31 02:12:02 PDT 2010


http://smooshlab.apple.com/perf/db_nt_internal/nightlytest/6871/
Nickname: smoosh-01:1
Name:  smoosh-01

Run: 6871, Start Time: 2010-03-31 00:43:23, End Time: 2010-03-31 02:11:42
Comparing To: 6867, Start Time: 2010-03-30 23:00:39, End Time: 2010-03-31 00:30:48

--- Changes Summary ---
New Test Passes: 0
New Test Failures: 0
Added Tests: 0
Removed Tests: 0
Significant Changes: 29

--- Tests Summary ---
Total Tests: 3582
Total Test Failures: 624

Total Test Failures By Type:
  Bitcode: 66
  CBE: 84
  GCCAS: 66
  JIT: 72
  JIT codegen: 72
  LLC: 66
  LLC compile: 66
  LLC-BETA: 66
  LLC-BETA compile: 66
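
The per-type counts above sum to the overall failure total reported in the
Tests Summary; a minimal Python check, with the counts copied from this
report:

  failures_by_type = {
      "Bitcode": 66, "CBE": 84, "GCCAS": 66, "JIT": 72, "JIT codegen": 72,
      "LLC": 66, "LLC compile": 66, "LLC-BETA": 66, "LLC-BETA compile": 66,
  }
  assert sum(failures_by_type.values()) == 624  # Total Test Failures above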

--- Changes Detail ---
New Test Passes:

New Test Failures:

Added Tests:

Removed Tests:

Significant Changes in Test Results:
JIT codegen:
 MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: -7.47% (0.5167 => 0.4781)
 MultiSource/Benchmarks/MiBench/consumer-lame/consumer-lame: -7.11% (2.0825 => 1.9345)
 MultiSource/Benchmarks/mediabench/gsm/toast/toast: -5.08% (0.2578 => 0.2447)
 MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: -5.28% (0.6438 => 0.6098)
CBE:
 MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -7.65% (3.6600 => 3.3800)
 MultiSource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -6.98% (0.4300 => 0.4000)
LLC-BETA:
 MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -8.67% (3.6900 => 3.3700)
 MultiSource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -6.98% (0.4300 => 0.4000)
LLC compile:
 MultiSource/Benchmarks/MiBench/consumer-lame/consumer-lame: -5.73% (2.4128 => 2.2745)
 MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: -7.58% (1.0267 => 0.9489)
GCCAS:
 MultiSource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: -6.50% (1.6991 => 1.5886)
 MultiSource/Benchmarks/mediabench/gsm/toast/toast: -5.14% (0.5704 => 0.5411)
 MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: -5.78% (1.7049 => 1.6063)
 MultiSource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: -9.08% (0.5509 => 0.5009)
JIT:
 MultiSource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -6.38% (0.4700 => 0.4400)
 MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: -6.67% (0.6000 => 0.5600)
 MultiSource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: -6.06% (0.6600 => 0.6200)
 MultiSource/Benchmarks/MiBench/consumer-lame/consumer-lame: -7.17% (2.5100 => 2.3300)
 MultiSource/Benchmarks/Prolangs-C/TimberWolfMC/timberwolfmc: -6.52% (0.4600 => 0.4300)
 MultiSource/Benchmarks/Prolangs-C/cdecl/cdecl: -5.00% (0.4000 => 0.3800)
 MultiSource/Benchmarks/mediabench/gsm/toast/toast: -6.45% (0.3100 => 0.2900)
 MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: -5.48% (0.7300 => 0.6900)
LLC:
 MultiSource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -6.98% (0.4300 => 0.4000)
 MultiSource/Benchmarks/MiBench/consumer-lame/consumer-lame: -6.45% (0.3100 => 0.2900)
LLC-BETA compile:
 MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -5.61% (3.9278 => 3.7074)
 MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: -5.97% (0.5464 => 0.5138)
 MultiSource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: -5.98% (1.1636 => 1.0940)
 MultiSource/Benchmarks/MiBench/consumer-lame/consumer-lame: -6.29% (2.4295 => 2.2768)
 MultiSource/Benchmarks/Prolangs-C/agrep/agrep: -5.47% (0.6566 => 0.6207)
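
The percentage deltas above appear to be relative changes of the current run
against the comparison run (negative numbers are improvements in the measured
times). A minimal sketch of that convention in Python, using the
automotive-susan JIT codegen numbers from above; the 5% significance
threshold is an assumption based on the smallest delta reported here:

  def percent_change(previous: float, current: float) -> float:
      """Relative change of the current run vs. the comparison run."""
      return (current - previous) / previous * 100.0

  def is_significant(previous: float, current: float, threshold: float = 5.0) -> bool:
      """Assumed rule: a change is significant when it exceeds the threshold."""
      return abs(percent_change(previous, current)) >= threshold

  print(f"{percent_change(0.5167, 0.4781):+.2f}%")  # -7.47%
  print(is_significant(0.5167, 0.4781))             # True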


