[llvm-testresults] Geryon-X86-64-PIC nightly tester results

daniel_dunbar at apple.com
Mon Apr 12 04:57:23 PDT 2010


http://smooshlab.apple.com/perf/db_nt_internal/nightlytest/7241/
Nickname: Geryon-X86-64-PIC:5
Name:  clattner2.apple.com

Run: 7241, Start Time: 2010-04-12 01:59:52, End Time: 2010-04-12 04:54:16
Comparing To: 7182, Start Time: 2010-04-10 01:57:56, End Time: 2010-04-10 04:52:13

--- Changes Summary ---
New Test Passes: 0
New Test Failures: 19
Added Tests: 0
Removed Tests: 0
Significant Changes: 32

--- Tests Summary ---
Total Tests: 3879
Total Test Failures: 153

Total Test Failures By Type:
  CBE: 71
  JIT: 25
  JIT codegen: 57

--- Changes Detail ---
New Test Passes:

New Test Failures:
Externals/SPEC/CFP2000/179_art/179_art [CBE]
Externals/SPEC/CFP2000/188_ammp/188_ammp [CBE]
Externals/SPEC/CINT2000/254_gap/254_gap [CBE]
Externals/SPEC/CINT2000/256_bzip2/256_bzip2 [CBE]
Externals/SPEC/CINT2000/300_twolf/300_twolf [CBE]
MultiSource/Applications/sgefa/sgefa [CBE]
MultiSource/Benchmarks/MallocBench/espresso/espresso [CBE]
MultiSource/Benchmarks/MiBench/office-ispell/office-ispell [CBE]
MultiSource/Benchmarks/MiBench/telecomm-FFT/telecomm-fft [CBE]
MultiSource/Benchmarks/MiBench/telecomm-adpcm/telecomm-adpcm [CBE]
MultiSource/Benchmarks/Prolangs-C/agrep/agrep [CBE]
MultiSource/Benchmarks/Prolangs-C/unix-smail/unix-smail [CBE]
MultiSource/Benchmarks/Trimaran/enc-3des/enc-3des [CBE]
MultiSource/Benchmarks/Trimaran/enc-pc1/enc-pc1 [CBE]
MultiSource/Benchmarks/Trimaran/enc-rc4/enc-rc4 [CBE]
MultiSource/Benchmarks/mediabench/adpcm/rawcaudio/rawcaudio [CBE]
MultiSource/Benchmarks/mediabench/adpcm/rawdaudio/rawdaudio [CBE]
MultiSource/Benchmarks/mediabench/g721/g721encode/encode [CBE]
MultiSource/Benchmarks/mediabench/mpeg2/mpeg2dec/mpeg2decode [CBE]

Added Tests:

Removed Tests:

Significant Changes in Test Results:
Bitcode:
 MultiSource/Benchmarks/McCat/05-eks/eks: -7.96% (8640.0000 => 7952.0000)
 SingleSource/Benchmarks/CoyoteBench/lpbench: -10.45% (4592.0000 => 4112.0000)
JIT codegen:
 MultiSource/Benchmarks/MiBench/telecomm-gsm/telecomm-gsm: -7.65% (0.2548 => 0.2353)
CBE:
 MultiSource/Applications/hexxagon/hexxagon: -16.55% (8.6400 => 7.2100)
 SingleSource/Benchmarks/Shootout/sieve: 166.41% (6.4900 => 17.2900)
LLC-BETA:
 MultiSource/Applications/hexxagon/hexxagon: -7.25% (8.8300 => 8.1900)
 MultiSource/Benchmarks/ASC_Sequoia/CrystalMk/CrystalMk: -5.36% (8.5900 => 8.1300)
 MultiSource/Benchmarks/Olden/voronoi/voronoi: 5.26% (0.3800 => 0.4000)
 MultiSource/Benchmarks/Ptrdist/ft/ft: -5.77% (1.0400 => 0.9800)
 SingleSource/Benchmarks/Shootout/sieve: 172.30% (6.2100 => 16.9100)
GCCAS:
 Externals/SPEC/CFP2000/177_mesa/177_mesa: 5.02% (3.8120 => 4.0035)
 Externals/SPEC/CFP2006/433_milc/433_milc: 8.99% (1.0647 => 1.1604)
 Externals/SPEC/CINT2006/464_h264ref/464_h264ref: 6.00% (6.2398 => 6.6141)
 MultiSource/Applications/JM/lencod/lencod: 6.00% (6.2377 => 6.6118)
 MultiSource/Benchmarks/FreeBench/pifft/pifft: 6.86% (0.2959 => 0.3162)
 MultiSource/Benchmarks/MiBench/automotive-susan/automotive-susan: 5.87% (0.3648 => 0.3862)
 MultiSource/Benchmarks/MiBench/consumer-lame/consumer-lame: 5.74% (1.5374 => 1.6257)
 MultiSource/Benchmarks/Prolangs-C/football/football: 9.50% (0.3851 => 0.4217)
 MultiSource/Benchmarks/Ptrdist/yacr2/yacr2: 11.62% (0.2315 => 0.2584)
 MultiSource/Benchmarks/mafft/pairlocalalign: 8.27% (4.4672 => 4.8367)
JIT:
 MultiSource/Applications/hexxagon/hexxagon: -9.70% (10.2100 => 9.2200)
 MultiSource/Benchmarks/ASC_Sequoia/CrystalMk/CrystalMk: -5.98% (8.6900 => 8.1700)
 MultiSource/Benchmarks/MiBench/telecomm-gsm/telecomm-gsm: -6.12% (0.4900 => 0.4600)
 MultiSource/Benchmarks/mafft/pairlocalalign: 19.12% (27.2000 => 32.4000)
 SingleSource/Benchmarks/Misc/richards_benchmark: -8.94% (1.2300 => 1.1200)
 SingleSource/Benchmarks/Shootout/sieve: 154.20% (6.6600 => 16.9300)
LLC:
 Externals/SPEC/CINT2006/445_gobmk/445_gobmk: -10.00% (0.3000 => 0.2700)
 MultiSource/Applications/hexxagon/hexxagon: -7.25% (8.8300 => 8.1900)
 MultiSource/Benchmarks/ASC_Sequoia/CrystalMk/CrystalMk: -5.58% (8.6000 => 8.1200)
 MultiSource/Benchmarks/Olden/voronoi/voronoi: -5.00% (0.4000 => 0.3800)
 SingleSource/Benchmarks/Shootout/sieve: 172.46% (6.2100 => 16.9200)
LLC-BETA compile:
 Externals/SPEC/CINT2006/483_xalancbmk/483_xalancbmk: 8.49% (0.2803 => 0.3041)
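The percentages listed above are plain relative deltas between the baseline run (7182) and this run (7241), computed from each `(old => new)` value pair. A minimal sketch of that arithmetic (the helper name is illustrative, not part of the nightly tester's actual code):

```python
def percent_change(old, new):
    """Relative change between two measurements, in percent.

    For time-based metrics in this report, a positive value is a
    slowdown (regression) and a negative value is a speedup.
    """
    return (new - old) / old * 100.0

# Example from the CBE section: Shootout/sieve, 6.4900 => 17.2900
print(f"{percent_change(6.4900, 17.2900):.2f}%")  # 166.41%

# Example from the Bitcode section: McCat/05-eks, 8640.0000 => 7952.0000
print(f"{percent_change(8640.0, 7952.0):.2f}%")  # -7.96%
```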
