[lldb-dev] Host::GetArchitecture and the default target triple

Zachary Turner zturner at google.com
Tue Aug 19 13:52:43 PDT 2014


Looking over the Apple code path for Host::GetArchitecture, I'm a little
confused about why all this custom logic is needed.  What are the
situations in which llvm::sys::getDefaultTargetTriple() will return
something other than what we want?  Specifically, a concrete example might
help illustrate the problem.

I understand from the comment that it has something to do with being able
to run 32 and 64-bit executables in the same operating system.  Isn't this
the case everywhere?  I can run 32-bit executables on Windows x64 as well.
llvm::Triple has a function called get32BitArchVariant(), which as I
understand it returns the 32-bit triple for exactly this case.  Does this
not work for some Apple configuration?
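
For example, on a 64-bit host I'd expect something like the following to
print both variants (an untested sketch, using the headers as they are in
the current tree):

#include "llvm/ADT/Triple.h"
#include "llvm/Support/Host.h"
#include "llvm/Support/raw_ostream.h"

int main() {
  // The default triple LLVM detected / was configured with, e.g.
  // "x86_64-apple-darwin13.3.0".
  llvm::Triple host(llvm::sys::getDefaultTargetTriple());

  // get32BitArchVariant() swaps the architecture for its 32-bit
  // counterpart (x86_64 -> x86, ppc64 -> ppc, ...) and leaves the
  // vendor/OS fields alone; it yields UnknownArch if no such variant
  // exists.
  llvm::Triple host32 = host.get32BitArchVariant();

  llvm::outs() << "64-bit: " << host.str() << "\n";
  llvm::outs() << "32-bit: " << host32.str() << "\n";
}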

It seems like this logic could be sunk into llvm::Triple somehow.
Conceptually speaking, there should be only two cases:

64-bit host:
host_arch_64 = llvm::Triple(llvm::sys::getDefaultTargetTriple())
host_arch_32 = host_arch_64.get32BitArchVariant()

32-bit host:
host_arch_64 = <empty>
host_arch_32 = llvm::Triple(llvm::sys::getDefaultTargetTriple())

Why doesn't this work?
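
Concretely, I'd imagine the whole thing reduces to a small helper along
these lines (untested sketch; ComputeHostArchitectures is a name I made
up, not an existing LLDB or LLVM API):

#include "llvm/ADT/Triple.h"
#include "llvm/Support/Host.h"

static void ComputeHostArchitectures(llvm::Triple &host_arch_64,
                                     llvm::Triple &host_arch_32) {
  llvm::Triple triple(llvm::sys::getDefaultTargetTriple());
  if (triple.isArch64Bit()) {
    // 64-bit host: the default triple is the 64-bit arch, and the
    // 32-bit arch is derived from it.
    host_arch_64 = triple;
    host_arch_32 = triple.get32BitArchVariant();
  } else {
    // 32-bit host: there is no 64-bit variant to report.
    host_arch_64 = llvm::Triple();
    host_arch_32 = triple;
  }
}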