[llvm-testresults] lordcrumb-clang-x86 nightly tester results

daniel_dunbar at apple.com
Tue Sep 29 04:16:05 PDT 2009


http://smooshlab.apple.com/perf.cgi/db_nt_internal/nightlytest//2068
Name:  lordcrumb.apple.com
Nickname: lordcrumb-clang-x86:8
Buildstatus: OK

Run: 2068, Start Time: 2009-09-29 02:42:13
Comparing To: 2049, Start Time: 2009-09-28 02:51:00

--- Changes Summary ---
New Test Passes: 0
New Test Failures: 0
Added Tests: 0
Removed Tests: 0
Significant Changes: 44
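
Note on "Significant Changes": every delta listed under "Changes Detail"
below exceeds 5% in magnitude, which suggests the tester flags changes
past a cutoff near +/-5%. A minimal Python sketch of such a filter,
assuming that cutoff (the actual threshold is not stated in this report):

  def is_significant(old: float, new: float, threshold: float = 0.05) -> bool:
      # The 5% default is an assumption inferred from the entries below,
      # not a documented tester setting.
      return abs(new - old) / old > threshold

  print(is_significant(3.5700, 3.7500))  # ary3 under CBE: +5.04% => True
  print(is_significant(0.4300, 0.4200))  # -2.33%: below cutoff => False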

--- Tests Summary ---
Total Tests: 3447
Total Test Failures: 84

Total Test Failures By Type:
  CBE: 34
  JIT: 21
  JIT codegen: 21
  LLC: 2
  LLC compile: 2
  LLC-BETA: 2
  LLC-BETA compile: 2
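
As a quick consistency check (not part of the tester output), the
per-type counts above should sum to the Total Test Failures figure;
a minimal Python sketch:

  failures_by_type = {
      "CBE": 34,
      "JIT": 21,
      "JIT codegen": 21,
      "LLC": 2,
      "LLC compile": 2,
      "LLC-BETA": 2,
      "LLC-BETA compile": 2,
  }
  # 34 + 21 + 21 + 2 + 2 + 2 + 2 = 84, matching "Total Test Failures: 84".
  assert sum(failures_by_type.values()) == 84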

--- Changes Detail ---
New Test Passes:

New Test Failures:

Added Tests:

Removed Tests:

Significant Changes in Test Results:
JIT codegen:
 MultiSource/Applications/SPASS/SPASS: -5.57% (3.9693 => 3.7483)
 MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: 8.28% (0.2789 => 0.3020)
CBE:
 MultiSource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -9.30% (0.4300 => 0.3900)
 SingleSource/Benchmarks/BenchmarkGame/fasta: -5.33% (1.5000 => 1.4200)
 SingleSource/Benchmarks/Shootout/ary3: 5.04% (3.5700 => 3.7500)
LLC-BETA:
 MultiSource/Applications/lemon/lemon: -5.11% (1.7600 => 1.6700)
 MultiSource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -6.98% (0.4300 => 0.4000)
 MultiSource/Benchmarks/Olden/voronoi/voronoi: -5.13% (0.3900 => 0.3700)
 MultiSource/Benchmarks/Ptrdist/yacr2/yacr2: 18.29% (0.8200 => 0.9700)
 SingleSource/Benchmarks/BenchmarkGame/fasta: -5.79% (1.2100 => 1.1400)
LLC compile:
 MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: 6.07% (0.2865 => 0.3039)
 MultiSource/Benchmarks/mafft/pairlocalalign: -6.69% (1.9240 => 1.7953)
GCCAS:
 MultiSource/Applications/JM/ldecod/ldecod: -7.83% (3.1998 => 2.9494)
 MultiSource/Applications/JM/lencod/lencod: -6.08% (6.5684 => 6.1693)
 MultiSource/Applications/SIBsim4/SIBsim4: -17.68% (0.6074 => 0.5000)
 MultiSource/Applications/SPASS/SPASS: -9.45% (7.9045 => 7.1572)
 MultiSource/Applications/d/make_dparser: -10.75% (1.4109 => 1.2592)
 MultiSource/Applications/hbd/hbd: -7.91% (0.2984 => 0.2748)
 MultiSource/Applications/sqlite3/sqlite3: -11.95% (4.3582 => 3.8376)
 MultiSource/Applications/treecc/treecc: -6.28% (0.9221 => 0.8642)
 MultiSource/Benchmarks/ASCI_Purple/SMG2000/smg2000: -14.47% (3.9486 => 3.3774)
 MultiSource/Benchmarks/MiBench/consumer-jpeg/consumer-jpeg: -5.10% (1.4187 => 1.3463)
 MultiSource/Benchmarks/MiBench/consumer-lame/consumer-lame: -8.09% (1.4926 => 1.3719)
 MultiSource/Benchmarks/MiBench/consumer-typeset/consumer-typeset: -13.57% (7.4938 => 6.4767)
 MultiSource/Benchmarks/Prolangs-C/football/football: -7.06% (0.4218 => 0.3920)
 MultiSource/Benchmarks/Ptrdist/bc/bc: -12.61% (0.4869 => 0.4255)
 MultiSource/Benchmarks/mafft/pairlocalalign: -25.26% (6.2591 => 4.6782)
 SingleSource/Benchmarks/Adobe-C++/stepanov_abstraction: -20.27% (0.3749 => 0.2989)
 SingleSource/Benchmarks/Adobe-C++/stepanov_vector: -21.53% (0.4147 => 0.3254)
JIT:
 MultiSource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -6.52% (0.4600 => 0.4300)
 MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: 5.71% (0.3500 => 0.3700)
 MultiSource/Benchmarks/MiBench/telecomm-CRC32/telecomm-CRC32: -8.00% (0.2500 => 0.2300)
 MultiSource/Benchmarks/Prolangs-C/gnugo/gnugo: 8.00% (0.2500 => 0.2700)
 MultiSource/Benchmarks/Trimaran/enc-pc1/enc-pc1: -6.02% (0.8300 => 0.7800)
 SingleSource/Benchmarks/BenchmarkGame/fasta: -6.45% (1.2400 => 1.1600)
 SingleSource/Benchmarks/CoyoteBench/huffbench: -9.87% (17.0200 => 15.3400)
 SingleSource/Benchmarks/Shootout/ary3: -6.65% (3.7600 => 3.5100)
LLC:
 MultiSource/Benchmarks/MiBench/automotive-basicmath/automotive-basicmath: -7.14% (0.4200 => 0.3900)
 MultiSource/Benchmarks/Olden/voronoi/voronoi: -5.13% (0.3900 => 0.3700)
 MultiSource/Benchmarks/Ptrdist/yacr2/yacr2: 18.29% (0.8200 => 0.9700)
 MultiSource/Benchmarks/Trimaran/enc-pc1/enc-pc1: 10.61% (0.6600 => 0.7300)
 SingleSource/Benchmarks/BenchmarkGame/fasta: -5.79% (1.2100 => 1.1400)
LLC-BETA compile:
 MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: 5.17% (0.2900 => 0.3050)
 MultiSource/Benchmarks/mafft/pairlocalalign: -6.70% (1.9594 => 1.8281)
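
Each entry above reads: test path, percent delta, (old => new). A minimal
Python sketch of the delta computation, assuming the usual relative-change
formula (new - old) / old; since the metrics here are times (presumably
seconds), a negative delta means the new run was faster:

  def percent_change(old: float, new: float) -> float:
      # Positive means slower, negative means faster, given that the
      # metrics are execution or compile times.
      return (new - old) / old * 100.0

  # Reproduces the first JIT codegen entry: SPASS, 3.9693 => 3.7483.
  print(f"{percent_change(3.9693, 3.7483):.2f}%")  # -5.57%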


