[llvm-testresults] smoosh-01.apple.com nightly tester results

daniel_dunbar at apple.com
Sun Feb 28 02:06:23 PST 2010


http://smooshlab.apple.com/perf/db_nt_internal/nightlytest/5822/
Name:  smoosh-01
Nickname: smoosh-01.apple.com:3

Run: 5822, Start Time: 2010-02-28 00:41:58, End Time: 2010-02-28 02:06:12
Comparing To: 5819, Start Time: 2010-02-27 23:04:37, End Time: 2010-02-28 00:31:22

--- Changes Summary ---
New Test Passes: 0
New Test Failures: 0
Added Tests: 0
Removed Tests: 0
Significant Changes: 28

--- Tests Summary ---
Total Tests: 3582
Total Test Failures: 623

Total Test Failures By Type:
  Bitcode: 66
  CBE: 83
  GCCAS: 66
  JIT: 72
  JIT codegen: 72
  LLC: 66
  LLC compile: 66
  LLC-BETA: 66
  LLC-BETA compile: 66

--- Changes Detail ---
New Test Passes:

New Test Failures:

Added Tests:

Removed Tests:

Significant Changes in Test Results:
JIT codegen:
 MultiSource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: -5.56% (0.5087 => 0.4804)
 MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: -5.00% (0.5816 => 0.5525)
 MultiSource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: -5.73% (0.5251 => 0.4950)
CBE:
 MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -6.93% (3.6100 => 3.3600)
 MultiSource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -5.00% (0.4000 => 0.3800)
 MultiSource/Benchmarks/OptimizerEval/optimizer-eval: -21.90% (217.8100 => 170.1000)
LLC-BETA:
 MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -8.85% (3.7300 => 3.4000)
 MultiSource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -9.76% (0.4100 => 0.3700)
LLC compile:
 MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -7.75% (4.2086 => 3.8826)
 MultiSource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: -5.61% (1.1487 => 1.0843)
 MultiSource/Benchmarks/mediabench/gsm/toast/toast: -8.81% (0.3744 => 0.3414)
 MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: -5.71% (1.0059 => 0.9485)
GCCAS:
 MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -6.45% (3.8854 => 3.6349)
 MultiSource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: -6.61% (0.5444 => 0.5084)
JIT:
 MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -5.92% (6.7600 => 6.3600)
 MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: -5.17% (0.5800 => 0.5500)
 MultiSource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: -6.78% (0.5900 => 0.5500)
 MultiSource/Benchmarks/Prolangs-C/TimberWolfMC/timberwolfmc: -6.52% (0.4600 => 0.4300)
 MultiSource/Benchmarks/mediabench/gsm/toast/toast: -6.67% (0.3000 => 0.2800)
 MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: -6.06% (0.6600 => 0.6200)
 MultiSource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: -5.08% (0.5900 => 0.5600)
LLC:
 MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -6.39% (3.6000 => 3.3700)
 MultiSource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -7.50% (0.4000 => 0.3700)
LLC-BETA compile:
 MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -6.32% (4.1365 => 3.8751)
 MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: -7.35% (0.5959 => 0.5521)
 MultiSource/Benchmarks/Prolangs-C/TimberWolfMC/timberwolfmc: -5.90% (3.9523 => 3.7191)
 MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: -5.93% (1.0100 => 0.9501)
 MultiSource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: -7.83% (0.7258 => 0.6690)
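
The percentage figures above appear to be relative changes against the
comparison run (5819), i.e. (new - old) / old, with negative values
indicating lower times. The short Python sketch below is not part of the
nightly tester scripts; it is only an illustration (names are made up
here) that reproduces one of the reported deltas.

    def percent_change(old: float, new: float) -> float:
        """Relative change of `new` versus `old`, in percent."""
        return (new - old) / old * 100.0

    # Example from the JIT codegen section above:
    # consumer-jpeg went from 0.5087 to 0.4804.
    delta = percent_change(0.5087, 0.4804)
    print(f"{delta:+.2f}%")  # prints -5.56%, matching the report line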
