[llvm-testresults] Grawp nightly tester results

daniel_dunbar at apple.com
Mon Dec 21 03:28:38 PST 2009


http://smooshlab.apple.com/perf.cgi/db_nt_internal/nightlytest/3929/
Name:  grawp.apple.com
Nickname: Grawp:4

Run: 3929, Start Time: 2009-12-21 00:18:48, End Time: 2009-12-21 03:28:07
Comparing To: 3909, Start Time: 2009-12-20 00:19:02, End Time: 2009-12-20 03:28:10

--- Changes Summary ---
New Test Passes: 0
New Test Failures: 0
Added Tests: 0
Removed Tests: 0
Significant Changes: 17

--- Tests Summary ---
Total Tests: 3888
Total Test Failures: 144

Total Test Failures By Type:
  Bitcode: 1
  CBE: 53
  JIT: 27
  JIT codegen: 59
  LLC: 1
  LLC compile: 1
  LLC-BETA: 1
  LLC-BETA compile: 1

--- Changes Detail ---
New Test Passes:

New Test Failures:

Added Tests:

Removed Tests:

Significant Changes in Test Results:
LLC:
 Externals/SPEC/CINT2000/197_parser/197_parser: -8.97% (2.9000 => 2.6400)
 Externals/SPEC/CINT2006/445_gobmk/445_gobmk: -8.00% (0.2500 => 0.2300)
Bitcode:
 SingleSource/UnitTests/2005-05-11-Popcount-ffs-fls: -6.34% (2272.0000 => 2128.0000)
JIT:
 Externals/SPEC/CINT2000/197_parser/197_parser: -5.67% (4.0600 => 3.8300)
 Externals/SPEC/CINT2000/252_eon/252_eon: -6.99% (2.2900 => 2.1300)
 MultiSource/Applications/kimwitu++/kc: -17.68% (6.5600 => 5.4000)
 MultiSource/Benchmarks/Prolangs-C/gnugo/gnugo: -5.71% (0.3500 => 0.3300)
 MultiSource/Benchmarks/Trimaran/enc-pc1/enc-pc1: -9.64% (0.8300 => 0.7500)
 MultiSource/Benchmarks/Trimaran/enc-rc4/enc-rc4: 17.39% (1.1500 => 1.3500)
 MultiSource/Benchmarks/mafft/pairlocalalign: -12.20% (33.9400 => 29.8000)
 SingleSource/Benchmarks/McGill/queens: 5.98% (2.3400 => 2.4800)
 SingleSource/Benchmarks/Misc-C++/oopack_v1p8: 8.00% (0.2500 => 0.2700)
 SingleSource/Benchmarks/Misc/fbench: 6.44% (2.3300 => 2.4800)
 SingleSource/Benchmarks/Shootout-C++/methcall: -8.47% (7.5600 => 6.9200)
 SingleSource/Benchmarks/Shootout/lists: -10.07% (8.5400 => 7.6800)
JIT codegen:
 MultiSource/Applications/kimwitu++/kc: -20.55% (5.6155 => 4.4615)
LLC-BETA compile:
 MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: -5.44% (0.3585 => 0.3390)
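The percentage figures above appear to be the relative change from the baseline run (3909) to the current run (3929), computed from the two values in parentheses; negative values mean the measured time went down. A minimal sketch of that computation (the function name is illustrative, not part of the tester):

```python
def percent_change(old: float, new: float) -> float:
    """Relative change from baseline to current value, as a percentage.
    Negative means the metric decreased between the two runs."""
    return (new - old) / old * 100.0

# Spot-checking two entries from the table above:
print(round(percent_change(2.90, 2.64), 2))  # 197_parser LLC entry: -8.97
print(round(percent_change(1.15, 1.35), 2))  # enc-rc4 JIT entry: 17.39
```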
