<div dir="ltr">Hi Chris,<div><br></div><div>LGTM, thanks!</div><div><br></div><div> - Daniel</div></div><div class="gmail_extra"><br><br><div class="gmail_quote">On Thu, Aug 8, 2013 at 1:55 PM, Chris Matthews <span dir="ltr"><<a href="mailto:chris.matthews@apple.com" target="_blank">chris.matthews@apple.com</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div style="word-wrap:break-word">This is the first of several patches to refactor the LNT nt test to make it more amenable to single-test reruns. <div>
<br></div><div>This patch extracts common configuration options into a separate stateless object so that they can be accessed again for rerunning. Common functionality from the run is also extracted so that it can be run more than once.</div>
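The stateless-object pattern described above can be sketched roughly as follows. This is a simplified, hypothetical illustration -- `Options` and the attribute names are stand-ins, not the actual LNT code:

```python
import os


class Options(object):
    """Stand-in for the parsed command-line options (hypothetical)."""
    def __init__(self, sandbox_path, timestamp_build):
        self.sandbox_path = sandbox_path
        self.timestamp_build = timestamp_build


class TestConfiguration(object):
    """Compute paths purely from the options, caching expensive results.

    Because every value is a function of the options alone, the same
    configuration can be consulted again when a test is rerun.
    """
    def __init__(self, opts, start_time):
        self.opts = opts
        self.start_time = start_time
        self._report_dir = None  # cache for the computed path

    def __getattr__(self, attr):
        # Fall through to the raw options for anything not wrapped here.
        return getattr(self.opts, attr)

    @property
    def report_dir(self):
        if self._report_dir is None:
            if self.timestamp_build:
                ts = self.start_time.replace(' ', '_').replace(':', '-')
                name = "test-%s" % ts
            else:
                name = "build"
            self._report_dir = os.path.join(self.opts.sandbox_path, name)
        return self._report_dir


config = TestConfiguration(Options("/tmp/sandbox", True), "2013-08-08 13:55:00")
print(config.report_dir)
```

Caching inside a property is safe here only because nothing mutates the object after construction; that is also what makes reruns reproducible.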
<div><br></div><div>This was tested on a regular nt run, as well as with --multisample and --only-test.</div><div><br></div><div><div>Index: lnt/tests/nt.py</div><div>===================================================================</div>
<div>--- lnt/tests/nt.py<span style="white-space:pre-wrap"> </span>(revision 187526)</div><div>+++ lnt/tests/nt.py<span style="white-space:pre-wrap"> </span>(working copy)</div><div>@@ -47,7 +47,346 @@</div><div> if self._log is None:</div>
<div> raise ValueError("log() unavailable outside test execution")</div><div> return self._log</div><div>+</div><div>+</div><div>+class TestConfiguration(object):</div><div>+ """Store and calculate important paths and options for this test based</div>
<div>+ on the command line arguments. The object is stateless: everything is</div><div>+ derived from the command line arguments alone. Options which take a long</div><div>+ time to calculate are cached; because the object is stateless, this is safe.</div>
<div>+</div><div>+ """</div><div>+</div><div>+ def __init__(self, opts, start_time):</div><div>+ """Prepare the configuration:</div><div>+ opts -- the command line options object</div>
<div>+ start_time -- the time the program was invoked as a string</div><div>+ """</div><div>+ self.opts = opts</div><div>+ self.start_time = start_time</div><div> </div><div>
+ # Report directory cache.</div><div>+ self._report_dir = None</div><div>+ # Compiler interrogation is a lot of work, this will cache it.</div><div>+ self._cc_info = None</div><div>+ # Getting compiler version spawns subprocesses, cache it.</div>
<div>+ self._get_source_version = None</div><div>+</div><div>+ def __getattr__(self, attr):</div><div>+ """Provide direct access to the options when we don't provide a</div><div>+ configuration directly."""</div>
<div>+ return getattr(self.opts, attr)</div><div>+</div><div>+ @property</div><div>+ def report_dir(self):</div><div>+ """Get the (possibly cached) path to the directory where the test suite</div>
<div>+ will be placed. The report dir is a directory within the sandbox which</div><div>+ is either "build" or a timestamped directory based on the start time."""</div><div>+ if self._report_dir is not None:</div>
<div>+ return self._report_dir</div><div>+ </div><div>+ if self.timestamp_build:</div><div>+ ts = self.start_time.replace(' ','_').replace(':','-')</div><div>+ build_dir_name = "test-%s" % ts</div>
<div>+ else:</div><div>+ build_dir_name = "build"</div><div>+ basedir = os.path.join(self.opts.sandbox_path, build_dir_name)</div><div>+ # Canonicalize paths, in case we are using e.g. an NFS remote mount.</div>
<div>+ #</div><div>+ # FIXME: This should be eliminated, along with the realpath call below.</div><div>+ basedir = os.path.realpath(basedir)</div><div>+ self._report_dir = basedir</div><div>+ return basedir</div>
<div>+</div><div>+ def report_path(self, iteration):</div><div>+ """Path to a single run's JSON results file."""</div><div>+ return os.path.join(self.build_dir(iteration), 'report.json')</div>
<div>+</div><div>+ def build_dir(self, iteration):</div><div>+ """Path of the build dir within the report dir. iteration -- the</div><div>+ iteration number if multisample otherwise None.</div>
<div>+ When multisample is off report_dir == build_dir.</div><div>+ """</div><div>+ # Do nothing in single-sample build, because report_dir and the</div><div>+ # build_dir is the same directory.</div>
<div>+ if iteration is None:</div><div>+ return self.report_dir</div><div>+</div><div>+ # Create the directory for individual iteration.</div><div>+ return os.path.join(self.report_dir, "sample-%d" % iteration)</div>
<div>+</div><div>+ @property</div><div>+ def target_flags(self):</div><div>+ """Computed target flags list."""</div><div>+ # Compute TARGET_FLAGS.</div><div>+ target_flags = []</div>
<div>+</div><div>+ # FIXME: Eliminate this blanket option.</div><div>+ target_flags.extend(self.opts.cflags)</div><div>+</div><div>+ # Pass flags to backend.</div><div>+ for f in self.opts.mllvm:</div>
<div>+ target_flags.extend(['-mllvm', f])</div><div>+</div><div>+ if self.opts.arch is not None:</div><div>+ target_flags.append('-arch')</div><div>+ target_flags.append(self.opts.arch)</div>
<div>+ if self.opts.isysroot is not None:</div><div>+ target_flags.append('-isysroot')</div><div>+ target_flags.append(self.opts.isysroot)</div><div>+ return target_flags</div><div>
+</div><div>+ @property</div><div>+ def cc_info(self):</div><div>+ """Discovered compiler information from the cc under test. Cached</div><div>+ because discovery is slow.</div><div>+</div>
<div>+ """</div><div>+ if self._cc_info is None:</div><div>+ self._cc_info = lnt.testing.util.compilers.get_cc_info( \</div><div>+ self.opts.cc_under_test,</div>
<div>+ self.target_flags)</div><div>+ return self._cc_info</div><div>+</div><div>+ @property</div><div>+ def target(self):</div><div>+ """Discovered compiler's target information."""</div>
<div>+ # Get compiler info.</div><div>+ cc_target = self.cc_info.get('cc_target')</div><div>+ return cc_target</div><div>+</div><div>+ @property</div>
<div>+ def llvm_source_version(self):</div><div>+ """The version of llvm from llvm_src_root."""</div><div>+ if self.opts.llvm_src_root:</div><div>+ if self._get_source_version is None:</div>
<div>+ self._get_source_version = get_source_version(</div><div>+ self.opts.llvm_src_root)</div><div>+ return self._get_source_version</div><div>+ else:</div><div>+ return None</div>
<div>+</div><div>+ def build_report_path(self, iteration):</div><div>+ """The path of the results.csv file which each run of the test suite</div><div>+ will produce.</div><div>+ iteration -- the multisample iteration number otherwise None."""</div>
<div>+ report_path = os.path.join(self.build_dir(iteration))</div><div>+ if self.opts.only_test is not None:</div><div>+ report_path = os.path.join(report_path, self.opts.only_test)</div><div>+ report_path = os.path.join(report_path, 'report.%s.csv' % \</div>
<div>+ self.opts.test_style)</div><div>+ return report_path</div><div>+</div><div>+ def test_log_path(self, iteration):</div><div>+ """The path of the log file for the build.</div>
<div>+ iteration -- the multisample iteration number otherwise None."""</div><div>+ return os.path.join(self.build_dir(iteration), 'test.log')</div><div>+</div><div>+ def compute_run_make_variables(self):</div>
<div>+ """Compute make variables from command line arguments and compiler.</div><div>+ Returns a dict of make_variables as well as a public version</div><div>+ with the remote options removed.</div>
<div>+</div><div>+ """</div><div>+ cc_info = self.cc_info</div><div>+ # Set the make variables to use.</div><div>+ make_variables = {</div>
<div>+ 'TARGET_CC' : self.opts.cc_reference,</div><div>+ 'TARGET_CXX' : self.opts.cxx_reference,</div><div>+ 'TARGET_LLVMGCC' : self.opts.cc_under_test,</div>
<div>+ 'TARGET_LLVMGXX' : self.opts.cxx_under_test,</div><div>+ 'TARGET_FLAGS' : ' '.join(self.target_flags),</div><div>+ }</div><div>+</div><div>+ # Compute TARGET_LLCFLAGS, for TEST=nightly runs.</div>
<div>+ if self.opts.test_style == "nightly":</div><div>+ # Compute TARGET_LLCFLAGS.</div><div>+ target_llcflags = []</div><div>+ if self.opts.mcpu is not None:</div><div>+ target_llcflags.append('-mcpu')</div>
<div>+ target_llcflags.append(self.opts.mcpu)</div><div>+ if self.opts.relocation_model is not None:</div><div>+ target_llcflags.append('-relocation-model')</div><div>+ target_llcflags.append(self.opts.relocation_model)</div>
<div>+ if self.opts.disable_fp_elim:</div><div>+ target_llcflags.append('-disable-fp-elim')</div><div>+ make_variables['TARGET_LLCFLAGS'] = ' '.join(target_llcflags)</div>
<div>+</div><div>+ # Set up environment overrides if requested, to effectively</div><div>+ # run under the specified Darwin iOS simulator.</div><div>+ #</div><div>+ # See /D/P/../Developer/Tools/RunPlatformUnitTests.</div>
<div>+ if self.opts.ios_simulator_sdk is not None:</div><div>+ make_variables['EXECUTION_ENVIRONMENT_OVERRIDES'] = ' '.join(</div><div>+ ['DYLD_FRAMEWORK_PATH="%s"' % self.opts.ios_simulator_sdk,</div>
<div>+ 'DYLD_LIBRARY_PATH=""',</div><div>+ 'DYLD_ROOT_PATH="%s"' % self.opts.ios_simulator_sdk,</div><div>+ 'DYLD_NEW_LOCAL_SHARED_REGIONS=YES',</div>
<div>+ 'DYLD_NO_FIX_PREBINDING=YES',</div><div>+ 'IPHONE_SIMULATOR_ROOT="%s"' % self.opts.ios_simulator_sdk,</div><div>+ 'CFFIXED_USER_HOME="%s"' % os.path.expanduser(</div>
<div>+ "~/Library/Application Support/iPhone Simulator/User")])</div><div>+</div><div>+ # Pick apart the build mode.</div><div>+ build_mode = self.opts.build_mode</div><div>+ if build_mode.startswith("Debug"):</div>
<div>+ build_mode = build_mode[len("Debug"):]</div><div>+ make_variables['ENABLE_OPTIMIZED'] = '0'</div><div>+ elif build_mode.startswith("Unoptimized"):</div>
<div>+ build_mode = build_mode[len("Unoptimized"):]</div><div>+ make_variables['ENABLE_OPTIMIZED'] = '0'</div><div>+ elif build_mode.startswith("Release"):</div>
<div>+ build_mode = build_mode[len("Release"):]</div><div>+ make_variables['ENABLE_OPTIMIZED'] = '1'</div><div>+ else:</div><div>+ fatal('invalid build mode: %r' % self.opts.build_mode)</div>
<div>+</div><div>+ while build_mode:</div><div>+ for (name, key) in (('+Asserts', 'ENABLE_ASSERTIONS'),</div><div>+ ('+Checks', 'ENABLE_EXPENSIVE_CHECKS'),</div>
<div>+ ('+Coverage', 'ENABLE_COVERAGE'),</div><div>+ ('+Debug', 'DEBUG_SYMBOLS'),</div><div>+ ('+Profile', 'ENABLE_PROFILING')):</div>
<div>+ if build_mode.startswith(name):</div><div>+ build_mode = build_mode[len(name):]</div><div>+ make_variables[key] = '1'</div><div>+ break</div>
<div>+ else:</div><div>+ fatal('invalid build mode: %r' % self.opts.build_mode)</div><div>+</div><div>+ # Assertions are disabled by default.</div><div>+ if 'ENABLE_ASSERTIONS' in make_variables:</div>
<div>+ del make_variables['ENABLE_ASSERTIONS']</div><div>+ else:</div><div>+ make_variables['DISABLE_ASSERTIONS'] = '1'</div><div>+</div><div>+ # Set the optimization level options.</div>
<div>+ make_variables['OPTFLAGS'] = self.opts.optimize_option</div><div>+ if self.opts.optimize_option == '-Os':</div><div>+ make_variables['LLI_OPTFLAGS'] = '-O2'</div>
<div>+ make_variables['LLC_OPTFLAGS'] = '-O2'</div><div>+ else:</div><div>+ make_variables['LLI_OPTFLAGS'] = self.opts.optimize_option</div><div>+ make_variables['LLC_OPTFLAGS'] = self.opts.optimize_option</div>
<div>+</div><div>+ # Set test selection variables.</div><div>+ if not self.opts.test_cxx:</div><div>+ make_variables['DISABLE_CXX'] = '1'</div><div>+ if not self.opts.test_jit:</div>
<div>+ make_variables['DISABLE_JIT'] = '1'</div><div>+ if not self.opts.test_llc:</div><div>+ make_variables['DISABLE_LLC'] = '1'</div><div>+ if not self.opts.test_lto:</div>
<div>+ make_variables['DISABLE_LTO'] = '1'</div><div>+ if self.opts.test_llcbeta:</div><div>+ make_variables['ENABLE_LLCBETA'] = '1'</div><div>+ if self.opts.test_small:</div>
<div>+ make_variables['SMALL_PROBLEM_SIZE'] = '1'</div><div>+ if self.opts.test_large:</div><div>+ if self.opts.test_small:</div><div>+ fatal('the --small and --large options are mutually exclusive')</div>
<div>+ make_variables['LARGE_PROBLEM_SIZE'] = '1'</div><div>+ if self.opts.test_integrated_as:</div><div>+ make_variables['TEST_INTEGRATED_AS'] = '1'</div><div>
+ if self.opts.liblto_path:</div><div>+ make_variables['LD_ENV_OVERRIDES'] = (</div><div>+ 'env DYLD_LIBRARY_PATH=%s' % os.path.dirname(</div><div>+ self.opts.liblto_path))</div>
<div>+</div><div>+ if self.opts.threads > 1 or self.opts.build_threads > 1:</div><div>+ make_variables['ENABLE_PARALLEL_REPORT'] = '1'</div><div>+</div><div>+ # Select the test style to use.</div>
<div>+ if self.opts.test_style == "simple":</div><div>+ # We always use reference outputs with TEST=simple.</div><div>+ make_variables['ENABLE_HASHED_PROGRAM_OUTPUT'] = '1'</div>
<div>+ make_variables['USE_REFERENCE_OUTPUT'] = '1'</div><div>+ make_variables['TEST'] = self.opts.test_style</div><div>+</div><div>+ # Set CC_UNDER_TEST_IS_CLANG when appropriate.</div>
<div>+ if cc_info.get('cc_name') in ('apple_clang', 'clang'):</div><div>+ make_variables['CC_UNDER_TEST_IS_CLANG'] = '1'</div><div>+ elif cc_info.get('cc_name') in ('llvm-gcc',):</div>
<div>+ make_variables['CC_UNDER_TEST_IS_LLVM_GCC'] = '1'</div><div>+ elif cc_info.get('cc_name') in ('gcc',):</div><div>+ make_variables['CC_UNDER_TEST_IS_GCC'] = '1'</div>
<div>+</div><div>+ # Convert the target arch into a make variable, to allow more</div><div>+ # target based specialization (e.g.,</div><div>+ # CC_UNDER_TEST_TARGET_IS_ARMV7).</div><div>+ if '-' in cc_info.get('cc_target', ''):</div>
<div>+ arch_name = cc_info.get('cc_target').split('-', 1)[0]</div><div>+ make_variables['CC_UNDER_TEST_TARGET_IS_' + arch_name.upper()] = '1'</div><div>+</div><div>+ # Set LLVM_RELEASE_IS_PLUS_ASSERTS when appropriate, to allow</div>
<div>+ # testing older LLVM source trees.</div><div>+ llvm_source_version = self.llvm_source_version</div><div>+ if (llvm_source_version and llvm_source_version.isdigit() and</div><div>+ int(llvm_source_version) < 107758):</div>
<div>+ make_variables['LLVM_RELEASE_IS_PLUS_ASSERTS'] = 1</div><div>+</div><div>+ # Set ARCH appropriately, based on the inferred target.</div><div>+ #</div><div>+ # FIXME: We should probably be more strict about this.</div>
<div>+ cc_target = cc_info.get('cc_target')</div><div>+ llvm_arch = self.opts.llvm_arch</div><div>+ if cc_target and llvm_arch is None:</div><div>+ # cc_target is expected to be a (GCC style) target</div>
<div>+ # triple. Pick out the arch component, and then try to</div><div>+ # convert it to an LLVM nightly test style architecture</div><div>+ # name, which is of course totally different from all of</div>
<div>+ # GCC names, triple names, LLVM target names, and LLVM</div><div>+ # triple names. Stupid world.</div><div>+ #</div><div>+ # FIXME: Clean this up once everyone is on 'lnt runtest</div>
<div>+ # nt' style nightly testing.</div><div>+ arch = cc_target.split('-', 1)[0].lower()</div><div>+ if (len(arch) == 4 and arch[0] == 'i' and arch.endswith('86') and</div>
<div>+ arch[1] in '3456789'): # i[3-9]86</div><div>+ llvm_arch = 'x86'</div><div>+ elif arch in ('x86_64', 'amd64'):</div><div>+ llvm_arch = 'x86_64'</div>
<div>+ elif arch in ('powerpc', 'powerpc64', 'ppu'):</div><div>+ llvm_arch = 'PowerPC'</div><div>+ elif (arch == 'arm' or arch.startswith('armv') or</div>
<div>+ arch == 'thumb' or arch.startswith('thumbv') or</div><div>+ arch == 'xscale'):</div><div>+ llvm_arch = 'ARM'</div><div>+ elif arch.startswith('alpha'):</div>
<div>+ llvm_arch = 'Alpha'</div><div>+ elif arch.startswith('sparc'):</div><div>+ llvm_arch = 'Sparc'</div><div>+ elif arch in ('mips', 'mipsel'):</div>
<div>+ llvm_arch = 'Mips'</div><div>+</div><div>+ if llvm_arch is not None:</div><div>+ make_variables['ARCH'] = llvm_arch</div><div>+ else:</div><div>+ warning("unable to infer ARCH, some tests may not run correctly!")</div>
<div>+</div><div>+ # Add in any additional make flags passed in via --make-param.</div><div>+ for entry in self.opts.make_parameters:</div><div>+ if '=' not in entry:</div><div>+ name, value = entry,''</div>
<div>+ else:</div><div>+ name, value = entry.split('=', 1)</div><div>+ print "make", name, value</div><div>+ make_variables[name] = value</div><div>
+</div><div>+ </div><div>+ # Set remote execution variables, if used.</div><div>+ if self.opts.remote:</div><div>+ # make a copy of args for report, without remote options.</div><div>
+ public_vars = make_variables.copy()</div><div>+ make_variables['REMOTE_HOST'] = self.opts.remote_host</div><div>+ make_variables['REMOTE_USER'] = self.opts.remote_user</div>
<div>+ make_variables['REMOTE_PORT'] = str(self.opts.remote_port)</div><div>+ make_variables['REMOTE_CLIENT'] = self.opts.remote_client</div><div>+ else:</div><div>+ public_vars = make_variables</div>
<div>+</div><div>+ return make_variables, public_vars</div><div>+ </div><div> ###</div><div> </div><div> def resolve_command_path(name):</div><div>@@ -64,12 +403,12 @@</div><div> # If that failed just return the original name.</div>
<div> return name</div><div> </div><div>-def scan_for_test_modules(opts):</div><div>- base_modules_path = os.path.join(opts.test_suite_root, 'LNTBased')</div><div>- if opts.only_test is None:</div><div>+def scan_for_test_modules(config):</div>
<div>+ base_modules_path = os.path.join(config.test_suite_root, 'LNTBased')</div><div>+ if config.only_test is None:</div><div> test_modules_path = base_modules_path</div><div>- elif opts.only_test.startswith('LNTBased'):</div>
<div>- test_modules_path = os.path.join(opts.test_suite_root, opts.only_test)</div><div>+ elif config.only_test.startswith('LNTBased'):</div><div>+ test_modules_path = os.path.join(config.test_suite_root, config.only_test)</div>
<div> else:</div><div> return</div><div> </div><div>@@ -80,7 +419,7 @@</div><div> for dirpath,dirnames,filenames in os.walk(test_modules_path,</div><div> followlinks = True):</div>
<div> # Ignore the example tests, unless requested.</div><div>- if not opts.include_test_examples and 'Examples' in dirnames:</div><div>+ if not config.include_test_examples and 'Examples' in dirnames:</div>
<div> dirnames.remove('Examples')</div><div> </div><div> # Check if this directory defines a test module.</div><div>@@ -121,7 +460,7 @@</div><div> </div><div> # FIXME: Support duplicate logfiles to global directory.</div>
<div> def execute_test_modules(test_log, test_modules, test_module_variables,</div><div>- basedir, opts):</div><div>+ basedir, config):</div><div> # For now, we don't execute these in parallel, but we do forward the</div>
<div> # parallel build options to the test.</div><div> test_modules.sort()</div><div>@@ -131,7 +470,7 @@</div><div> for name in test_modules:</div><div> # First, load the test module file.</div><div> locals = globals = {}</div>
<div>- test_path = os.path.join(opts.test_suite_root, 'LNTBased', name)</div><div>+ test_path = os.path.join(config.test_suite_root, 'LNTBased', name)</div><div> test_obj_path = os.path.join(basedir, 'LNTBased', name)</div>
<div> module_path = os.path.join(test_path, 'TestModule')</div><div> module_file = open(module_path)</div><div>@@ -189,7 +528,7 @@</div><div> </div><div> return results</div><div> </div><div>-def compute_test_module_variables(make_variables, opts):</div>
<div>+def compute_test_module_variables(make_variables, config):</div><div> # Set the test module options, which we try and restrict to a tighter subset</div><div> # than what we pass to the LNT makefiles.</div><div>
test_module_variables = {</div><div>@@ -201,7 +540,7 @@</div><div> make_variables['OPTFLAGS']) }</div><div> </div><div> # Add the remote execution variables.</div><div>- if opts.remote:</div>
<div>+ if config.remote:</div><div> test_module_variables['REMOTE_HOST'] = make_variables['REMOTE_HOST']</div><div> test_module_variables['REMOTE_USER'] = make_variables['REMOTE_USER']</div>
<div> test_module_variables['REMOTE_PORT'] = make_variables['REMOTE_PORT']</div><div>@@ -225,19 +564,20 @@</div><div> </div><div> # We pass the test execution values as variables too, this might be better</div>
<div> # passed as actual arguments.</div><div>- test_module_variables['THREADS'] = opts.threads</div><div>- test_module_variables['BUILD_THREADS'] = opts.build_threads or opts.threads</div><div>-</div>
<div>+ test_module_variables['THREADS'] = config.threads</div><div>+ test_module_variables['BUILD_THREADS'] = config.build_threads or \</div><div>+ config.threads</div>
<div> return test_module_variables</div><div> </div><div>-def execute_nt_tests(test_log, make_variables, basedir, opts, report_dir):</div><div>+def execute_nt_tests(test_log, make_variables, basedir, config):</div><div>
+ report_dir = config.report_dir</div><div> common_args = ['make', '-k']</div><div> common_args.extend('%s=%s' % (k,v) for k,v in make_variables.items())</div><div>- if opts.only_test is not None:</div>
<div>- common_args.extend(['-C',opts.only_test])</div><div>+ if config.only_test is not None:</div><div>+ common_args.extend(['-C',config.only_test])</div><div> </div><div> # If we are using isolation, run under sandbox-exec.</div>
<div>- if opts.use_isolation:</div><div>+ if config.use_isolation:</div><div> # Write out the sandbox profile.</div><div> sandbox_profile_path = os.path.join(basedir, "isolation.sb")</div>
<div> print >>sys.stderr, "%s: creating sandbox profile %r" % (</div><div>@@ -267,20 +607,20 @@</div><div> common_args = ['sandbox-exec', '-f', sandbox_profile_path] + common_args</div>
<div> </div><div> # Run a separate 'make build' step if --build-threads was given.</div><div>- if opts.build_threads > 0:</div><div>- args = common_args + ['-j', str(opts.build_threads), 'build']</div>
<div>+ if config.build_threads > 0:</div><div>+ args = common_args + ['-j', str(config.build_threads), 'build']</div><div> print >>test_log, '%s: running: %s' % (timestamp(),</div>
<div> ' '.join('"%s"' % a</div><div> for a in args))</div><div> test_log.flush()</div><div>
</div><div> print >>sys.stderr, '%s: building "nightly tests" with -j%u...' % (</div><div>- timestamp(), opts.build_threads)</div><div>+ timestamp(), config.build_threads)</div>
<div> res = execute_command(test_log, basedir, args, report_dir)</div><div> </div><div> # Then 'make report'.</div><div>- args = common_args + ['-j', str(opts.threads),</div><div>- 'report', 'report.%s.csv' % opts.test_style]</div>
<div>+ args = common_args + ['-j', str(config.threads),</div><div>+ 'report', 'report.%s.csv' % config.test_style]</div><div> print >>test_log, '%s: running: %s' % (timestamp(),</div>
<div> ' '.join('"%s"' % a</div><div> for a in args))</div><div>@@ -290,7 +630,7 @@</div><div> # somehow MACOSX_DEPLOYMENT_TARGET gets injected into the environment on OS</div>
<div> # X (which changes the driver behavior and causes generally weirdness).</div><div> print >>sys.stderr, '%s: executing "nightly tests" with -j%u...' % (</div><div>- timestamp(), opts.threads)</div>
<div>+ timestamp(), config.threads)</div><div> </div><div> res = execute_command(test_log, basedir, args, report_dir)</div><div> </div><div>@@ -402,311 +742,133 @@</div><div> </div><div> return test_samples</div>
<div> </div><div>-def compute_run_make_variables(opts, llvm_source_version, target_flags,</div><div>- cc_info):</div><div>- # Set the make variables to use.</div><div>- make_variables = {</div>
<div>- 'TARGET_CC' : opts.cc_reference,</div><div>- 'TARGET_CXX' : opts.cxx_reference,</div><div>- 'TARGET_LLVMGCC' : opts.cc_under_test,</div>
<div>- 'TARGET_LLVMGXX' : opts.cxx_under_test,</div><div>- 'TARGET_FLAGS' : ' '.join(target_flags),</div><div>- }</div><div>-</div><div>- # Compute TARGET_LLCFLAGS, for TEST=nightly runs.</div>
<div>- if opts.test_style == "nightly":</div><div>- # Compute TARGET_LLCFLAGS.</div><div>- target_llcflags = []</div><div>- if opts.mcpu is not None:</div><div>- target_llcflags.append('-mcpu')</div>
<div>- target_llcflags.append(opts.mcpu)</div><div>- if opts.relocation_model is not None:</div><div>- target_llcflags.append('-relocation-model')</div><div>- target_llcflags.append(opts.relocation_model)</div>
<div>- if opts.disable_fp_elim:</div><div>- target_llcflags.append('-disable-fp-elim')</div><div>- make_variables['TARGET_LLCFLAGS'] = ' '.join(target_llcflags)</div><div>-</div>
<div>- # Set up environment overrides if requested, to effectively run under the</div><div>- # specified Darwin iOS simulator.</div><div>- #</div><div>- # See /D/P/../Developer/Tools/RunPlatformUnitTests.</div>
<div>- if opts.ios_simulator_sdk is not None:</div><div>- make_variables['EXECUTION_ENVIRONMENT_OVERRIDES'] = ' '.join(</div><div>- ['DYLD_FRAMEWORK_PATH="%s"' % opts.ios_simulator_sdk,</div>
<div>- 'DYLD_LIBRARY_PATH=""',</div><div>- 'DYLD_ROOT_PATH="%s"' % opts.ios_simulator_sdk,</div><div>- 'DYLD_NEW_LOCAL_SHARED_REGIONS=YES',</div>
<div>- 'DYLD_NO_FIX_PREBINDING=YES',</div><div>- 'IPHONE_SIMULATOR_ROOT="%s"' % opts.ios_simulator_sdk,</div><div>- 'CFFIXED_USER_HOME="%s"' % os.path.expanduser(</div>
<div>- "~/Library/Application Support/iPhone Simulator/User")])</div><div>-</div><div>- # Pick apart the build mode.</div><div>- build_mode = opts.build_mode</div><div>- if build_mode.startswith("Debug"):</div>
<div>- build_mode = build_mode[len("Debug"):]</div><div>- make_variables['ENABLE_OPTIMIZED'] = '0'</div><div>- elif build_mode.startswith("Unoptimized"):</div><div>- build_mode = build_mode[len("Unoptimized"):]</div>
<div>- make_variables['ENABLE_OPTIMIZED'] = '0'</div><div>- elif build_mode.startswith("Release"):</div><div>- build_mode = build_mode[len("Release"):]</div><div>- make_variables['ENABLE_OPTIMIZED'] = '1'</div>
<div>- else:</div><div>- fatal('invalid build mode: %r' % opts.build_mode)</div><div>-</div><div>- while build_mode:</div><div>- for (name,key) in (('+Asserts', 'ENABLE_ASSERTIONS'),</div>
<div>- ('+Checks', 'ENABLE_EXPENSIVE_CHECKS'),</div><div>- ('+Coverage', 'ENABLE_COVERAGE'),</div><div>- ('+Debug', 'DEBUG_SYMBOLS'),</div>
<div>- ('+Profile', 'ENABLE_PROFILING')):</div><div>- if build_mode.startswith(name):</div><div>- build_mode = build_mode[len(name):]</div><div>- make_variables[key] = '1'</div>
<div>- break</div><div>- else:</div><div>- fatal('invalid build mode: %r' % opts.build_mode)</div><div>-</div><div>- # Assertions are disabled by default.</div><div>- if 'ENABLE_ASSERTIONS' in make_variables:</div>
<div>- del make_variables['ENABLE_ASSERTIONS']</div><div>- else:</div><div>- make_variables['DISABLE_ASSERTIONS'] = '1'</div><div>-</div><div>- # Set the optimization level options.</div>
<div>- make_variables['OPTFLAGS'] = opts.optimize_option</div><div>- if opts.optimize_option == '-Os':</div><div>- make_variables['LLI_OPTFLAGS'] = '-O2'</div><div>- make_variables['LLC_OPTFLAGS'] = '-O2'</div>
<div>- else:</div><div>- make_variables['LLI_OPTFLAGS'] = opts.optimize_option</div><div>- make_variables['LLC_OPTFLAGS'] = opts.optimize_option</div><div>-</div><div>- # Set test selection variables.</div>
<div>- if not opts.test_cxx:</div><div>- make_variables['DISABLE_CXX'] = '1'</div><div>- if not opts.test_jit:</div><div>- make_variables['DISABLE_JIT'] = '1'</div><div>
- if not opts.test_llc:</div><div>- make_variables['DISABLE_LLC'] = '1'</div><div>- if not opts.test_lto:</div><div>- make_variables['DISABLE_LTO'] = '1'</div><div>- if opts.test_llcbeta:</div>
<div>- make_variables['ENABLE_LLCBETA'] = '1'</div><div>- if opts.test_small:</div><div>- make_variables['SMALL_PROBLEM_SIZE'] = '1'</div><div>- if opts.test_large:</div>
<div>- if opts.test_small:</div><div>- fatal('the --small and --large options are mutually exclusive')</div><div>- make_variables['LARGE_PROBLEM_SIZE'] = '1'</div><div>- if opts.test_integrated_as:</div>
<div>- make_variables['TEST_INTEGRATED_AS'] = '1'</div><div>- if opts.liblto_path:</div><div>- make_variables['LD_ENV_OVERRIDES'] = (</div><div>- 'env DYLD_LIBRARY_PATH=%s' % os.path.dirname(</div>
<div>- opts.liblto_path))</div><div>-</div><div>- if opts.threads > 1 or opts.build_threads > 1:</div><div>- make_variables['ENABLE_PARALLEL_REPORT'] = '1'</div><div>-</div><div>
- # Select the test style to use.</div><div>- if opts.test_style == "simple":</div><div>- # We always use reference outputs with TEST=simple.</div><div>- make_variables['ENABLE_HASHED_PROGRAM_OUTPUT'] = '1'</div>
<div>- make_variables['USE_REFERENCE_OUTPUT'] = '1'</div><div>- make_variables['TEST'] = opts.test_style</div><div>-</div><div>- # Set CC_UNDER_TEST_IS_CLANG when appropriate.</div><div>
- if cc_info.get('cc_name') in ('apple_clang', 'clang'):</div><div>- make_variables['CC_UNDER_TEST_IS_CLANG'] = '1'</div><div>- elif cc_info.get('cc_name') in ('llvm-gcc',):</div>
<div>- make_variables['CC_UNDER_TEST_IS_LLVM_GCC'] = '1'</div><div>- elif cc_info.get('cc_name') in ('gcc',):</div><div>- make_variables['CC_UNDER_TEST_IS_GCC'] = '1'</div>
<div>-</div><div>- # Convert the target arch into a make variable, to allow more target based</div><div>- # specialization (e.g., CC_UNDER_TEST_TARGET_IS_ARMV7).</div><div>- if '-' in cc_info.get('cc_target', ''):</div>
<div>- arch_name = cc_info.get('cc_target').split('-',1)[0]</div><div>- make_variables['CC_UNDER_TEST_TARGET_IS_' + arch_name.upper()] = '1'</div><div>-</div><div>- # Set LLVM_RELEASE_IS_PLUS_ASSERTS when appropriate, to allow testing older</div>
<div>- # LLVM source trees.</div><div>- if (llvm_source_version and llvm_source_version.isdigit() and</div><div>- int(llvm_source_version) < 107758):</div><div>- make_variables['LLVM_RELEASE_IS_PLUS_ASSERTS'] = 1</div>
<div>-</div><div>- # Set ARCH appropriately, based on the inferred target.</div><div>- #</div><div>- # FIXME: We should probably be more strict about this.</div><div>- cc_target = cc_info.get('cc_target')</div>
<div>- llvm_arch = opts.llvm_arch</div><div>- if cc_target and llvm_arch is None:</div><div>- # cc_target is expected to be a (GCC style) target triple. Pick out the</div><div>- # arch component, and then try to convert it to an LLVM nightly test</div>
<div>- # style architecture name, which is of course totally different from all</div><div>- # of GCC names, triple names, LLVM target names, and LLVM triple</div><div>- # names. Stupid world.</div><div>
- #</div><div>- # FIXME: Clean this up once everyone is on 'lnt runtest nt' style</div><div>- # nightly testing.</div><div>- arch = cc_target.split('-',1)[0].lower()</div><div>- if (len(arch) == 4 and arch[0] == 'i' and arch.endswith('86') and</div>
<div>- arch[1] in '3456789'): # i[3-9]86</div><div>- llvm_arch = 'x86'</div><div>- elif arch in ('x86_64', 'amd64'):</div><div>- llvm_arch = 'x86_64'</div>
<div>- elif arch in ('powerpc', 'powerpc64', 'ppu'):</div><div>- llvm_arch = 'PowerPC'</div><div>- elif (arch == 'arm' or arch.startswith('armv') or</div>
<div>- arch == 'thumb' or arch.startswith('thumbv') or</div><div>- arch == 'xscale'):</div><div>- llvm_arch = 'ARM'</div><div>- elif arch.startswith('alpha'):</div>
<div>- llvm_arch = 'Alpha'</div><div>- elif arch.startswith('sparc'):</div><div>- llvm_arch = 'Sparc'</div><div>- elif arch in ('mips', 'mipsel'):</div>
<div>- llvm_arch = 'Mips'</div><div>-</div><div>- if llvm_arch is not None:</div><div>- make_variables['ARCH'] = llvm_arch</div><div>- else:</div><div>- warning("unable to infer ARCH, some tests may not run correctly!")</div>
<div>-</div><div>- # Add in any additional make flags passed in via --make-param.</div><div>- for entry in opts.make_parameters:</div><div>- if '=' not in entry:</div><div>- name,value = entry,''</div>
<div>- else:</div><div>- name,value = entry.split('=', 1)</div><div>- print "make",name,value</div><div>- make_variables[name] = value</div><div>- </div><div>- return make_variables</div>
<div>-</div><div>-def prepare_report_dir(opts, start_time):</div><div>+def prepare_report_dir(config):</div><div> # Set up the sandbox.</div><div>- if not os.path.exists(opts.sandbox_path):</div><div>+ sandbox_path = config.sandbox_path</div>
<div>+ if not os.path.exists(sandbox_path):</div><div> print >>sys.stderr, "%s: creating sandbox: %r" % (</div><div>- timestamp(), opts.sandbox_path)</div><div>- os.mkdir(opts.sandbox_path)</div>
<div>+ timestamp(), sandbox_path)</div><div>+ os.mkdir(sandbox_path)</div><div> </div><div> # Create the per-test directory.</div><div>- if opts.timestamp_build:</div><div>- ts = start_time.replace(' ','_').replace(':','-')</div>
<div>- build_dir_name = "test-%s" % ts</div><div>- else:</div><div>- build_dir_name = "build"</div><div>- basedir = os.path.join(opts.sandbox_path, build_dir_name)</div><div>-</div>
<div>- # Canonicalize paths, in case we are using e.g. an NFS remote mount.</div><div>- #</div><div>- # FIXME: This should be eliminated, along with the realpath call below.</div><div>- basedir = os.path.realpath(basedir)</div>
<div>-</div><div>- if os.path.exists(basedir):</div><div>+ report_dir = config.report_dir</div><div>+ if os.path.exists(report_dir):</div><div> needs_clean = True</div><div> else:</div><div> needs_clean = False</div>
<div>- os.mkdir(basedir)</div><div>+ os.mkdir(report_dir)</div><div> </div><div>- # Unless not using timestamps, we require the basedir not to exist.</div><div>- if needs_clean and opts.timestamp_build:</div>
<div>- fatal('refusing to reuse pre-existing build dir %r' % basedir)</div><div>+ # Unless not using timestamps, we require the report dir not to exist.</div><div>+ if needs_clean and config.timestamp_build:</div>
<div>+ fatal('refusing to reuse pre-existing build dir %r' % report_dir)</div><div> </div><div>- return basedir</div><div>-</div><div>-def prepare_build_dir(opts, report_dir, iteration) :</div><div>- # Do nothing in single-sample build, because report_dir and the</div>
<div>- # build_dir is the same directory.</div><div>+def prepare_build_dir(config, iteration) :</div><div>+ # report_dir is supposed to be canonicalized, so we do not need to</div><div>+ # call os.path.realpath before mkdir.</div>
<div>+ build_dir = config.build_dir(iteration)</div><div> if iteration is None:</div><div>- return report_dir</div><div>+ return build_dir</div><div> </div><div>- # Create the directory for individual iteration.</div>
<div>- build_dir = report_dir</div><div>-</div><div>- build_dir = os.path.join(build_dir, "sample-%d" % iteration)</div><div>- # report_dir is supposed to be canonicalized, so we do not need to</div><div>
- # call os.path.realpath before mkdir.</div><div> if os.path.exists(build_dir):</div><div> needs_clean = True</div><div> else:</div><div> needs_clean = False</div><div> os.mkdir(build_dir)</div>
<div>-</div><div>+ </div><div> # Unless not using timestamps, we require the basedir not to exist.</div><div>- if needs_clean and opts.timestamp_build:</div><div>+ if needs_clean and config.timestamp_build:</div>
<div> fatal('refusing to reuse pre-existing build dir %r' % build_dir)</div><div>-</div><div> return build_dir</div><div> </div><div>-def run_test(nick_prefix, opts, iteration, report_dir):</div><div>- print >>sys.stderr, "%s: checking source versions" % (</div>
<div>- timestamp(),)</div><div>- if opts.llvm_src_root:</div><div>- llvm_source_version = get_source_version(opts.llvm_src_root)</div><div>- else:</div><div>- llvm_source_version = None</div><div>
- test_suite_source_version = get_source_version(opts.test_suite_root)</div><div>+def update_tools(make_variables, config, iteration):</div><div>+ """Update the test suite tools. """</div>
<div> </div><div>- # Compute TARGET_FLAGS.</div><div>- target_flags = []</div><div>+ print >>sys.stderr, '%s: building test-suite tools' % (timestamp(),)</div><div>+ args = ['make', 'tools']</div>
<div>+ args.extend('%s=%s' % (k,v) for k,v in make_variables.items())</div><div>+ build_tools_log_path = os.path.join(config.build_dir(iteration), </div><div>+ 'build-tools.log')</div>
<div>+ build_tools_log = open(build_tools_log_path, 'w')</div><div>+ print >>build_tools_log, '%s: running: %s' % (timestamp(),</div><div>+ ' '.join('"%s"' % a</div>
<div>+ for a in args))</div><div>+ build_tools_log.flush()</div><div>+ res = execute_command(build_tools_log, config.build_dir(iteration), </div><div>+ args, config.report_dir)</div>
<div>+ build_tools_log.close()</div><div>+ if res != 0:</div><div>+ fatal('unable to build tools, aborting!')</div><div> </div><div>- # FIXME: Eliminate this blanket option.</div><div>- target_flags.extend(opts.cflags)</div>
<div>+def configure_test_suite(config, iteration):</div><div>+ """Run configure on the test suite."""</div><div> </div><div>- # Pass flags to backend.</div><div>- for f in opts.mllvm:</div>
<div>- target_flags.extend(['-mllvm', f])</div><div>+ basedir = config.build_dir(iteration)</div><div>+ configure_log_path = os.path.join(basedir, 'configure.log')</div><div>+ configure_log = open(configure_log_path, 'w')</div>
<div> </div><div>- if opts.arch is not None:</div><div>- target_flags.append('-arch')</div><div>- target_flags.append(opts.arch)</div><div>- if opts.isysroot is not None:</div><div>- target_flags.append('-isysroot')</div>
<div>- target_flags.append(opts.isysroot)</div><div>+ args = [os.path.realpath(os.path.join(config.test_suite_root,</div><div>+ 'configure'))]</div><div>+ if config.without_llvm:</div>
<div>+ args.extend(['--without-llvmsrc', '--without-llvmobj'])</div><div>+ else:</div><div>+ args.extend(['--with-llvmsrc=%s' % config.llvm_src_root,</div><div>+ '--with-llvmobj=%s' % config.llvm_obj_root])</div>
<div>+ args.append('--with-externals=%s' % os.path.realpath(</div><div>+ config.test_suite_externals))</div><div>+ print >>configure_log, '%s: running: %s' % (timestamp(),</div><div>
+ ' '.join('"%s"' % a</div><div>+ for a in args))</div><div>+ configure_log.flush()</div>
<div> </div><div>- # Get compiler info.</div><div>- cc_info = lnt.testing.util.compilers.get_cc_info(opts.cc_under_test,</div><div>- target_flags)</div>
<div>- cc_target = cc_info.get('cc_target')</div><div>+ print >>sys.stderr, '%s: configuring...' % timestamp()</div><div>+ res = execute_command(configure_log, basedir, args, config.report_dir)</div>
<div>+ configure_log.close()</div><div>+ if res != 0:</div><div>+ fatal('configure failed, log is here: %r' % configure_log_path)</div><div> </div><div>- # Compute the make variables.</div><div>- make_variables = compute_run_make_variables(opts, llvm_source_version,</div>
<div>- target_flags, cc_info)</div><div>+def copy_missing_makefiles(config, basedir):</div><div>+ """When running with only_test something, makefiles will be missing,</div>
<div>+ so copy them into place. """</div><div>+ suffix = ''</div><div>+ for component in config.only_test.split('/'):</div><div>+ suffix = os.path.join(suffix, component)</div>
<div>+ obj_path = os.path.join(basedir, suffix)</div><div>+ src_path = os.path.join(config.test_suite_root, suffix)</div><div>+ if not os.path.exists(obj_path):</div><div>+ print '%s: initializing test dir %s' % (timestamp(), suffix)</div>
<div>+ os.mkdir(obj_path)</div><div>+ shutil.copyfile(os.path.join(src_path, 'Makefile'),</div><div>+ os.path.join(obj_path, 'Makefile'))</div><div> </div><div>
- # Stash the variables we want to report.</div><div>- public_make_variables = make_variables.copy()</div><div>+def run_test(nick_prefix, iteration, config):</div><div>+ print >>sys.stderr, "%s: checking source versions" % (</div>
<div>+ timestamp(),)</div><div>+ </div><div>+ test_suite_source_version = get_source_version(config.test_suite_root)</div><div> </div><div>- # Set remote execution variables, if used.</div><div>- if opts.remote:</div>
<div>- make_variables['REMOTE_HOST'] = opts.remote_host</div><div>- make_variables['REMOTE_USER'] = opts.remote_user</div><div>- make_variables['REMOTE_PORT'] = str(opts.remote_port)</div>
<div>- make_variables['REMOTE_CLIENT'] = opts.remote_client</div><div>+ # Compute the make variables.</div><div>+ make_variables, public_make_variables = config.compute_run_make_variables()</div><div>
</div><div> # Compute the test module variables, which are a restricted subset of the</div><div> # make variables.</div><div>- test_module_variables = compute_test_module_variables(make_variables, opts)</div><div>
+ test_module_variables = compute_test_module_variables(make_variables, config)</div><div> </div><div> # Scan for LNT-based test modules.</div><div> print >>sys.stderr, "%s: scanning for LNT-based test modules" % (</div>
<div> timestamp(),)</div><div>- test_modules = list(scan_for_test_modules(opts))</div><div>+ test_modules = list(scan_for_test_modules(config))</div><div> print >>sys.stderr, "%s: found %d LNT-based test modules" % (</div>
<div> timestamp(), len(test_modules))</div><div> </div><div> nick = nick_prefix</div><div>- if opts.auto_name:</div><div>+ if config.auto_name:</div><div> # Construct the nickname from a few key parameters.</div>
<div>+ cc_info = config.cc_info</div><div> cc_nick = '%s_%s' % (cc_info.get('cc_name'), cc_info.get('cc_build'))</div><div> nick += "__%s__%s" % (cc_nick, cc_info.get('cc_target').split('-')[0])</div>
<div> print >>sys.stderr, "%s: using nickname: %r" % (timestamp(), nick)</div><div> </div><div>- basedir = prepare_build_dir(opts, report_dir, iteration)</div><div>+ basedir = prepare_build_dir(config, iteration)</div>
<div> </div><div> # FIXME: Auto-remove old test directories in the source directory (which</div><div> # cause make horrible fits).</div><div>@@ -714,102 +876,56 @@</div><div> start_time = timestamp()</div><div>
print >>sys.stderr, '%s: starting test in %r' % (start_time, basedir)</div><div> </div><div>+</div><div> # Configure the test suite.</div><div>- if opts.run_configure or not os.path.exists(os.path.join(</div>
<div>+ if config.run_configure or not os.path.exists(os.path.join(</div><div> basedir, 'Makefile.config')):</div><div>- configure_log_path = os.path.join(basedir, 'configure.log')</div>
<div>- configure_log = open(configure_log_path, 'w')</div><div>+ configure_test_suite(config, iteration)</div><div> </div><div>- args = [os.path.realpath(os.path.join(opts.test_suite_root,</div>
<div>- 'configure'))]</div><div>- if opts.without_llvm:</div><div>- args.extend(['--without-llvmsrc', '--without-llvmobj'])</div><div>
- else:</div><div>- args.extend(['--with-llvmsrc=%s' % opts.llvm_src_root,</div><div>- '--with-llvmobj=%s' % opts.llvm_obj_root])</div><div>- args.append('--with-externals=%s' % os.path.realpath(</div>
<div>- opts.test_suite_externals))</div><div>- print >>configure_log, '%s: running: %s' % (timestamp(),</div><div>- ' '.join('"%s"' % a</div>
<div>- for a in args))</div><div>- configure_log.flush()</div><div>-</div><div>- print >>sys.stderr, '%s: configuring...' % timestamp()</div>
<div>- res = execute_command(configure_log, basedir, args, report_dir)</div><div>- configure_log.close()</div><div>- if res != 0:</div><div>- fatal('configure failed, log is here: %r' % configure_log_path)</div>
<div>-</div><div> # If running with --only-test, creating any dirs which might be missing and</div><div> # copy Makefiles.</div><div>- if opts.only_test is not None and not opts.only_test.startswith("LNTBased"):</div>
<div>- suffix = ''</div><div>- for component in opts.only_test.split('/'):</div><div>- suffix = os.path.join(suffix, component)</div><div>- obj_path = os.path.join(basedir, suffix)</div>
<div>- src_path = os.path.join(opts.test_suite_root, suffix)</div><div>- if not os.path.exists(obj_path):</div><div>- print '%s: initializing test dir %s' % (timestamp(), suffix)</div>
<div>- os.mkdir(obj_path)</div><div>- shutil.copyfile(os.path.join(src_path, 'Makefile'),</div><div>- os.path.join(obj_path, 'Makefile'))</div><div>
+ if config.only_test is not None and not config.only_test.startswith("LNTBased"):</div><div>+ copy_missing_makefiles(config, basedir)</div><div> </div><div>- # If running without LLVM, make sure tools are up to date.</div>
<div>- if opts.without_llvm:</div><div>- print >>sys.stderr, '%s: building test-suite tools' % (timestamp(),)</div><div>- args = ['make', 'tools']</div><div>- args.extend('%s=%s' % (k,v) for k,v in make_variables.items())</div>
<div>- build_tools_log_path = os.path.join(basedir, 'build-tools.log')</div><div>- build_tools_log = open(build_tools_log_path, 'w')</div><div>- print >>build_tools_log, '%s: running: %s' % (timestamp(),</div>
<div>- ' '.join('"%s"' % a</div><div>- for a in args))</div><div>- build_tools_log.flush()</div>
<div>- res = execute_command(build_tools_log, basedir, args, report_dir)</div><div>- build_tools_log.close()</div><div>- if res != 0:</div><div>- fatal('unable to build tools, aborting!')</div>
<div>- </div><div>- # Always blow away any existing report.</div><div>- report_path = os.path.join(basedir)</div><div>- if opts.only_test is not None:</div><div>- report_path = os.path.join(report_path, opts.only_test)</div>
<div>- report_path = os.path.join(report_path, 'report.%s.csv' % opts.test_style)</div><div>- if os.path.exists(report_path):</div><div>- os.remove(report_path)</div><div>+ # If running without LLVM, make sure tools are up to date. </div>
<div>+ if config.without_llvm:</div><div>+ update_tools(make_variables, config, iteration)</div><div>+ </div><div>+ # Always blow away any existing report.</div><div>+ build_report_path = config.build_report_path(iteration)</div>
<div>+ if os.path.exists(build_report_path):</div><div>+ os.remove(build_report_path)</div><div> </div><div> # Execute the tests.</div><div>- test_log_path = os.path.join(basedir, 'test.log')</div>
<div>- test_log = open(test_log_path, 'w')</div><div>+ test_log = open(config.test_log_path(iteration), 'w')</div><div> </div><div> # Run the make driven tests if needed.</div><div>- run_nightly_test = (opts.only_test is None or</div>
<div>- not opts.only_test.startswith("LNTBased"))</div><div>+ run_nightly_test = (config.only_test is None or</div><div>+ not config.only_test.startswith("LNTBased"))</div>
<div> if run_nightly_test:</div><div>- execute_nt_tests(test_log, make_variables, basedir, opts,</div><div>- report_dir)</div><div>+ execute_nt_tests(test_log, make_variables, basedir, config)</div>
<div> </div><div> # Run the extension test modules, if needed.</div><div> test_module_results = execute_test_modules(test_log, test_modules,</div><div> test_module_variables, basedir,</div>
<div>- opts)</div><div>-</div><div>+ config)</div><div> test_log.close()</div><div> </div><div> end_time = timestamp()</div>
<div> </div><div> # Load the nightly test samples.</div><div>- if opts.test_style == "simple":</div><div>+ if config.test_style == "simple":</div><div> test_namespace = 'nts'</div>
<div> else:</div><div> test_namespace = 'nightlytest'</div><div> if run_nightly_test:</div><div> print >>sys.stderr, '%s: loading nightly test data...' % timestamp()</div><div>
# If nightly test went screwy, it won't have produced a report.</div><div>- if not os.path.exists(report_path):</div><div>+ if not os.path.exists(build_report_path):</div>
<div> fatal('nightly test failed, no report generated')</div><div> </div><div>- test_samples = load_nt_report_file(report_path, opts)</div><div>+ test_samples = load_nt_report_file(build_report_path, config)</div>
<div> else:</div><div> test_samples = []</div><div> </div><div>@@ -831,9 +947,9 @@</div><div> machine_info['hardware'] = capture(["uname","-m"],</div><div> include_stderr=True).strip()</div>
<div> machine_info['os'] = capture(["uname","-sr"], include_stderr=True).strip()</div><div>- if opts.cc_reference is not None:</div><div>+ if config.cc_reference is not None:</div>
<div> machine_info['gcc_version'] = capture(</div><div>- [opts.cc_reference, '--version'],</div><div>+ [config.cc_reference, '--version'],</div>
<div> include_stderr=True).split('\n')[0]</div><div> </div><div> # FIXME: We aren't getting the LLCBETA options.</div><div>@@ -846,11 +962,11 @@</div><div> run_info['sw_vers'] = capture(['sw_vers'], include_stderr=True).strip()</div>
<div> </div><div> # Query remote properties if in use.</div><div>- if opts.remote:</div><div>- remote_args = [opts.remote_client,</div><div>- "-l", opts.remote_user,</div><div>
- "-p", str(opts.remote_port),</div><div>- opts.remote_host]</div><div>+ if config.remote:</div><div>+ remote_args = [config.remote_client,</div><div>+ "-l", config.remote_user,</div>
<div>+ "-p", str(config.remote_port),</div><div>+ config.remote_host]</div><div> run_info['remote_uname'] = capture(remote_args + ["uname", "-a"],</div>
<div> include_stderr=True).strip()</div><div> </div><div>@@ -860,32 +976,32 @@</div><div> include_stderr=True).strip()</div><div>
</div><div> # Add machine dependent info.</div><div>- if opts.use_machdep_info:</div><div>+ if config.use_machdep_info:</div><div> machdep_info = machine_info</div><div> else:</div><div> machdep_info = run_info</div>
<div>-</div><div>+ </div><div> machdep_info['uname'] = capture(["uname","-a"], include_stderr=True).strip()</div><div> machdep_info['name'] = capture(["uname","-n"], include_stderr=True).strip()</div>
<div> </div><div> # FIXME: Hack, use better method of getting versions. Ideally, from binaries</div><div> # so we are more likely to be accurate.</div><div>- if llvm_source_version is not None:</div><div>- run_info['llvm_revision'] = llvm_source_version</div>
<div>+ if config.llvm_source_version is not None:</div><div>+ run_info['llvm_revision'] = config.llvm_source_version</div><div> run_info['test_suite_revision'] = test_suite_source_version</div>
<div> run_info.update(public_make_variables)</div><div> </div><div> # Set the run order from the user, if given.</div><div>- if opts.run_order is not None:</div><div>- run_info['run_order'] = opts.run_order</div>
<div>+ if config.run_order is not None:</div><div>+ run_info['run_order'] = config.run_order</div><div> </div><div> else:</div><div> # Otherwise, use the inferred run order from the compiler.</div>
<div> run_info['run_order'] = cc_info['inferred_run_order']</div><div> </div><div> # Add any user specified parameters.</div><div>- for target,params in ((machine_info, opts.machine_parameters),</div>
<div>- (run_info, opts.run_parameters)):</div><div>+ for target,params in ((machine_info, config.machine_parameters),</div><div>+ (run_info, config.run_parameters)):</div>
<div> for entry in params:</div><div> if '=' not in entry:</div><div> name,value = entry,''</div><div>@@ -898,7 +1014,7 @@</div><div> target[name] = value</div>
<div> </div><div> # Generate the test report.</div><div>- lnt_report_path = os.path.join(basedir, 'report.json')</div><div>+ lnt_report_path = config.report_path(iteration)</div><div> print >>sys.stderr, '%s: generating report: %r' % (timestamp(),</div>
<div> lnt_report_path)</div><div> machine = lnt.testing.Machine(nick, machine_info)</div><div>@@ -1317,9 +1433,9 @@</div><div> </div><div> # FIXME: We need to validate that there is no configured output in the</div>
<div> # test-suite directory, that borks things. <rdar://problem/7876418></div><div>+ config = TestConfiguration(opts, timestamp())</div><div>+ prepare_report_dir(config)</div><div> </div>
<div>- report_dir = prepare_report_dir(opts, timestamp())</div><div>-</div><div> # Multisample, if requested.</div><div> if opts.multisample is not None:</div><div> # Collect the sample reports.</div>
<div>@@ -1328,7 +1444,7 @@</div><div> for i in range(opts.multisample):</div><div> print >>sys.stderr, "%s: (multisample) running iteration %d" % (</div><div> timestamp(), i)</div>
<div>- report = run_test(nick, opts, i, report_dir)</div><div>+ report = run_test(nick, i, config)</div><div> reports.append(report)</div><div> </div><div> # Create the merged report.</div>
<div>@@ -1343,7 +1459,7 @@</div><div> for r in reports], [])</div><div> </div><div> # Write out the merged report.</div><div>- lnt_report_path = os.path.join(report_dir, 'report.json')</div>
<div>+ lnt_report_path = os.path.join(config.report_dir, 'report.json')</div><div> report = lnt.testing.Report(machine, run, test_samples)</div><div> lnt_report_file = open(lnt_report_path, 'w')</div>
<div> print >>lnt_report_file,report.render()</div><div>@@ -1351,7 +1467,7 @@</div><div> </div><div> return report</div><div> </div><div>- report = run_test(nick, opts, None, report_dir)</div>
<div>+ report = run_test(nick, None, config)</div><div> return report</div><div> </div><div> def create_instance():</div><div><br></div><div><br></div><div><br></div><div></div></div></div><br><div style="word-wrap:break-word">
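<div><br></div><div>For reference, the pattern the patch introduces (a configuration object that is a pure function of the command-line options, with expensive derived paths cached) can be sketched roughly like this. This is a simplified, hypothetical reduction, not code from the patch; the option names and dict-based `opts` are illustrative only:</div><div><br></div>

```python
import os


class TestConfiguration(object):
    """Simplified sketch of the patch's TestConfiguration idea: every
    derived value depends only on the constructor arguments, so caching
    is safe even though the object is otherwise stateless."""

    def __init__(self, opts, start_time):
        self.opts = opts
        self.start_time = start_time
        self._cache = {}

    @property
    def report_dir(self):
        # Derived purely from opts/start_time; computed once, then cached.
        if 'report_dir' not in self._cache:
            if self.opts['timestamp_build']:
                ts = self.start_time.replace(' ', '_').replace(':', '-')
                name = 'test-%s' % ts
            else:
                name = 'build'
            self._cache['report_dir'] = os.path.realpath(
                os.path.join(self.opts['sandbox_path'], name))
        return self._cache['report_dir']

    def build_dir(self, iteration):
        # Single-sample runs reuse report_dir; multisample runs get a
        # per-iteration subdirectory, mirroring prepare_build_dir above.
        if iteration is None:
            return self.report_dir
        return os.path.join(self.report_dir, 'sample-%d' % iteration)
```

<div><br></div><div>Because nothing mutates after construction, the same object can be handed to a rerun of any single test without recomputing or re-parsing options, which is what makes the single-test-rerun refactor possible.</div>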
<div>Chris Matthews<br><a href="mailto:chris.matthews@apple.com" target="_blank">chris.matthews@apple.com</a>
<br></div></div><br>_______________________________________________<br>
llvm-commits mailing list<br>
<a href="mailto:llvm-commits@cs.uiuc.edu">llvm-commits@cs.uiuc.edu</a><br>
<a href="http://lists.cs.uiuc.edu/mailman/listinfo/llvm-commits" target="_blank">http://lists.cs.uiuc.edu/mailman/listinfo/llvm-commits</a><br>
<br></blockquote></div><br></div>