[llvm-testresults] smoosh-01 nightly tester results

daniel_dunbar at apple.com
Mon Apr 19 13:57:44 PDT 2010


http://smooshlab.apple.com/perf/db_nt_internal/nightlytest/7450/
Nickname: smoosh-01:1
Name:  smoosh-01

Run: 7450, Start Time: 2010-04-19 12:36:49, End Time: 2010-04-19 13:57:34
Comparing To: 7422, Start Time: 2010-04-17 19:06:40, End Time: 2010-04-17 20:29:01

--- Changes Summary ---
New Test Passes: 0
New Test Failures: 0
Added Tests: 0
Removed Tests: 0
Significant Changes: 67

--- Tests Summary ---
Total Tests: 3582
Total Test Failures: 638

Total Test Failures By Type:
  Bitcode: 66
  CBE: 98
  GCCAS: 66
  JIT: 72
  JIT codegen: 72
  LLC: 66
  LLC compile: 66
  LLC-BETA: 66
  LLC-BETA compile: 66
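
For reference, the nine per-type counts above fully account for the 638 total failures reported in the summary. A minimal sketch in Python (the dictionary and variable names are illustrative, not part of the tester output) that checks the arithmetic:

failures_by_type = {
    "Bitcode": 66,
    "CBE": 98,
    "GCCAS": 66,
    "JIT": 72,
    "JIT codegen": 72,
    "LLC": 66,
    "LLC compile": 66,
    "LLC-BETA": 66,
    "LLC-BETA compile": 66,
}

total_reported = 638  # "Total Test Failures" from the summary above

# Every failure in this run falls into exactly one of the listed categories,
# so the per-type counts should sum to the reported total.
assert sum(failures_by_type.values()) == total_reported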

--- Changes Detail ---
New Test Passes:

New Test Failures:

Added Tests:

Removed Tests:

Significant Changes in Test Results:
LLC:
 MultiSource/Applications/lemon/lemon: -17.14% (1.7500 => 1.4500)
 MultiSource/Applications/spiff/spiff: -20.48% (3.7600 => 2.9900)
 MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -6.23% (3.3700 => 3.1600)
 MultiSource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -10.00% (0.4000 => 0.3600)
 MultiSource/Benchmarks/Olden/health/health: -8.89% (0.4500 => 0.4100)
 MultiSource/Benchmarks/Olden/perimeter/perimeter: -11.11% (0.2700 => 0.2400)
 MultiSource/Benchmarks/Olden/voronoi/voronoi: -16.67% (0.3600 => 0.3000)
 MultiSource/Benchmarks/Trimaran/netbench-crc/netbench-crc: -13.00% (1.0000 => 0.8700)
 SingleSource/Benchmarks/BenchmarkGame/fasta: -10.34% (1.1600 => 1.0400)
 SingleSource/Benchmarks/Shootout/strcat: -14.29% (0.2800 => 0.2400)
JIT:
 MultiSource/Applications/ClamAV/clamscan: -9.29% (4.5200 => 4.1000)
 MultiSource/Applications/JM/ldecod/ldecod: -8.22% (3.5300 => 3.2400)
 MultiSource/Applications/d/make_dparser: -9.84% (1.2200 => 1.1000)
 MultiSource/Applications/hbd/hbd: -6.82% (0.4400 => 0.4100)
 MultiSource/Applications/lemon/lemon: -9.39% (166.4000 => 150.7700)
 MultiSource/Applications/oggenc/oggenc: -6.86% (1.7500 => 1.6300)
 MultiSource/Applications/spiff/spiff: -19.91% (4.4200 => 3.5400)
 MultiSource/Applications/treecc/treecc: -8.51% (0.4700 => 0.4300)
 MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -5.73% (6.2800 => 5.9200)
 MultiSource/Benchmarks/FreeBench/pifft/pifft: -5.08% (0.5900 => 0.5600)
 MultiSource/Benchmarks/MallocBench/espresso/espresso: -5.88% (2.5500 => 2.4000)
 MultiSource/Benchmarks/McCat/09-vor/vor: -8.00% (0.2500 => 0.2300)
 MultiSource/Benchmarks/McCat/18-imp/imp: -8.00% (0.2500 => 0.2300)
 MultiSource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -9.09% (0.4400 => 0.4000)
 MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: -5.45% (0.5500 => 0.5200)
 MultiSource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: -10.00% (0.6000 => 0.5400)
 MultiSource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: -6.10% (7.8700 => 7.3900)
 MultiSource/Benchmarks/MiBench/office-ispell/office-ispell: -11.11% (0.2700 => 0.2400)
 MultiSource/Benchmarks/Olden/health/health: -10.00% (0.5000 => 0.4500)
 MultiSource/Benchmarks/Olden/perimeter/perimeter: -14.29% (0.3500 => 0.3000)
 MultiSource/Benchmarks/Olden/voronoi/voronoi: -17.78% (0.4500 => 0.3700)
 MultiSource/Benchmarks/OptimizerEval/optimizer-eval: -13.57% (1.4000 => 1.2100)
 MultiSource/Benchmarks/Prolangs-C/TimberWolfMC/timberwolfmc: -9.09% (0.4400 => 0.4000)
 MultiSource/Benchmarks/Prolangs-C/agrep/agrep: -11.54% (0.2600 => 0.2300)
 MultiSource/Benchmarks/Prolangs-C/assembler/assembler: -9.68% (0.3100 => 0.2800)
 MultiSource/Benchmarks/Prolangs-C/bison/mybison: -8.16% (0.9800 => 0.9000)
 MultiSource/Benchmarks/Prolangs-C/cdecl/cdecl: -5.26% (0.3800 => 0.3600)
 MultiSource/Benchmarks/Prolangs-C/gnugo/gnugo: -8.11% (0.3700 => 0.3400)
 MultiSource/Benchmarks/Prolangs-C/unix-smail/unix-smail: -11.36% (0.4400 => 0.3900)
 MultiSource/Benchmarks/Trimaran/netbench-crc/netbench-crc: -13.59% (1.0300 => 0.8900)
 MultiSource/Benchmarks/mafft/pairlocalalign: -6.93% (33.5000 => 31.1800)
 MultiSource/Benchmarks/mediabench/gsm/toast/toast: -7.14% (0.2800 => 0.2600)
 MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg: -10.61% (0.6600 => 0.5900)
 MultiSource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode: -7.69% (0.5200 => 0.4800)
 SingleSource/Benchmarks/BenchmarkGame/fasta: -11.02% (1.1800 => 1.0500)
 SingleSource/Benchmarks/Misc/richards_benchmark: -7.19% (1.3900 => 1.2900)
 SingleSource/Benchmarks/Shootout/strcat: -17.24% (0.2900 => 0.2400)
LLC-BETA:
 MultiSource/Applications/lemon/lemon: -16.67% (1.7400 => 1.4500)
 MultiSource/Applications/spiff/spiff: -20.74% (3.7600 => 2.9800)
 MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -5.67% (3.3500 => 3.1600)
 MultiSource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -10.00% (0.4000 => 0.3600)
 MultiSource/Benchmarks/Olden/health/health: -9.09% (0.4400 => 0.4000)
 MultiSource/Benchmarks/Olden/perimeter/perimeter: -11.11% (0.2700 => 0.2400)
 MultiSource/Benchmarks/Olden/voronoi/voronoi: -21.62% (0.3700 => 0.2900)
 MultiSource/Benchmarks/Trimaran/netbench-crc/netbench-crc: -12.12% (0.9900 => 0.8700)
 SingleSource/Benchmarks/BenchmarkGame/fasta: -10.34% (1.1600 => 1.0400)
 SingleSource/Benchmarks/Shootout/strcat: -17.24% (0.2900 => 0.2400)
CBE:
 MultiSource/Applications/lemon/lemon: -15.51% (1.8700 => 1.5800)
 MultiSource/Applications/spiff/spiff: -20.32% (3.7900 => 3.0200)
 MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -5.07% (3.3500 => 3.1800)
 MultiSource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -10.00% (0.4000 => 0.3600)
 MultiSource/Benchmarks/Olden/perimeter/perimeter: -14.81% (0.2700 => 0.2300)
 MultiSource/Benchmarks/Olden/treeadd/treeadd: -6.75% (9.7800 => 9.1200)
 MultiSource/Benchmarks/Olden/voronoi/voronoi: -17.95% (0.3900 => 0.3200)
 MultiSource/Benchmarks/OptimizerEval/optimizer-eval: -94.16% (130.3900 => 7.6100)
 MultiSource/Benchmarks/Trimaran/netbench-crc/netbench-crc: -13.00% (1.0000 => 0.8700)
 SingleSource/Benchmarks/BenchmarkGame/fasta: -8.70% (1.3800 => 1.2600)
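
Each entry above is the relative change in the reported value (presumably execution time in seconds) between the comparison run 7422 and this run 7450; a negative percentage means the value dropped. A minimal sketch in Python (function and sample names are illustrative) that reproduces the printed percentages from the raw value pairs:

def percent_change(old, new):
    """Relative change from the comparison run to this run, in percent."""
    return (new - old) / old * 100.0

# A few (old, new) pairs taken from the LLC section above.
samples = [
    ("MultiSource/Applications/lemon/lemon", 1.75, 1.45),    # -17.14%
    ("MultiSource/Applications/spiff/spiff", 3.76, 2.99),     # -20.48%
    ("SingleSource/Benchmarks/Shootout/strcat", 0.28, 0.24),  # -14.29%
]

for name, old, new in samples:
    print("%s: %.2f%% (%.4f => %.4f)" % (name, percent_change(old, new), old, new))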


