[llvm-testresults] Grue-x86-64-O0-pic nightly tester results

daniel_dunbar at apple.com
Mon Apr 19 05:38:40 PDT 2010


http://smooshlab.apple.com/perf/db_nt_internal/nightlytest/7445/
Nickname: Grue-x86-64-O0-pic:21
Name:  grue.apple.com

Run: 7445, Start Time: 2010-04-19 01:18:23, End Time: 2010-04-19 05:38:32
Comparing To: 7430, Start Time: 2010-04-18 01:18:22, End Time: 2010-04-18 05:38:19

--- Changes Summary ---
New Test Passes: 0
New Test Failures: 0
Added Tests: 0
Removed Tests: 0
Significant Changes: 24

--- Tests Summary ---
Total Tests: 3582
Total Test Failures: 1701

Total Test Failures By Type:
  CBE: 109
  JIT: 398
  JIT codegen: 398
  LLC-BETA: 398
  LLC-BETA compile: 398

--- Changes Detail ---
New Test Passes:

New Test Failures:

Added Tests:

Removed Tests:

Significant Changes in Test Results:
LLC:
 MultiSource/Applications/ClamAV/clamscan: -20.00% (0.3500 => 0.2800)
 MultiSource/Applications/JM/ldecod/ldecod: -18.92% (0.3700 => 0.3000)
 MultiSource/Applications/lemon/lemon: -12.38% (3.0700 => 2.6900)
 MultiSource/Applications/spiff/spiff: -21.15% (5.9100 => 4.6600)
 MultiSource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -14.58% (0.4800 => 0.4100)
 MultiSource/Benchmarks/MiBench/telecomm-CRC32/telecomm-CRC32: -7.69% (0.2600 => 0.2400)
 MultiSource/Benchmarks/Olden/health/health: -10.96% (0.7300 => 0.6500)
 MultiSource/Benchmarks/Olden/mst/mst: -6.45% (0.3100 => 0.2900)
 MultiSource/Benchmarks/Olden/perimeter/perimeter: -7.78% (0.9000 => 0.8300)
 MultiSource/Benchmarks/Olden/voronoi/voronoi: -16.25% (0.8000 => 0.6700)
 MultiSource/Benchmarks/Prolangs-C/unix-smail/unix-smail: -21.62% (0.3700 => 0.2900)
 MultiSource/Benchmarks/Trimaran/netbench-crc/netbench-crc: -8.85% (2.6000 => 2.3700)
 SingleSource/Benchmarks/BenchmarkGame/fasta: -6.56% (2.4400 => 2.2800)
 SingleSource/Benchmarks/Misc-C++/bigfib: -9.91% (3.2300 => 2.9100)
 SingleSource/Benchmarks/Shootout-C++/ary: -7.26% (1.2400 => 1.1500)
 SingleSource/Benchmarks/Shootout-C++/ary2: -6.61% (1.2100 => 1.1300)
 SingleSource/Benchmarks/Shootout-C++/lists1: -7.87% (1.2700 => 1.1700)
 SingleSource/Benchmarks/Shootout-C++/moments: -11.54% (1.0400 => 0.9200)
 SingleSource/Benchmarks/Shootout/strcat: -20.59% (0.3400 => 0.2700)
CBE:
 MultiSource/Benchmarks/MiBench/telecomm-CRC32/telecomm-CRC32: -6.82% (0.4400 => 0.4100)
 MultiSource/Benchmarks/Olden/health/health: -7.96% (1.1300 => 1.0400)
 SingleSource/Benchmarks/Shootout-C++/moments: -5.22% (2.4900 => 2.3600)
 SingleSource/Benchmarks/Shootout/strcat: -18.60% (0.4300 => 0.3500)
 SingleSource/Benchmarks/Stanford/Treesort: -7.69% (0.2600 => 0.2400)
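
For reference, the percentages above appear to be the relative change in reported
execution time between the two runs, (new - old) / old. A minimal sketch of that
calculation (the helper name is hypothetical, not part of the nightly tester):

    # Hypothetical helper illustrating how the listed percentages relate to the raw times.
    def percent_change(old: float, new: float) -> float:
        """Relative change in execution time, expressed as a percentage."""
        return (new - old) / old * 100.0

    # Example: clamscan went from 0.3500s to 0.2800s, i.e. roughly a 20% improvement.
    print(f"{percent_change(0.35, 0.28):.2f}%")  # prints -20.00%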
