[lldb-dev] Host::GetArchitecture and the default target triple

jingham at apple.com
Tue Aug 19 13:58:31 PDT 2014


Mac OS X is more complex because lldb also supports apps running in the iOS simulator on OS X.  Those apps are x86 or x86_64 processes, but their OS is "ios", not "darwin".  The platforms and a bunch of other fiddly bits rely on getting the OS right as well.

Jim

> On Aug 19, 2014, at 1:52 PM, Zachary Turner <zturner at google.com> wrote:
> 
> Looking over the apple code path for Host::GetArchitecture, I'm a little confused about why all this custom logic is needed.  What are the situations in which llvm::sys::getDefaultTargetTriple() will return something other than what we want?  Specifically, a concrete example might help illustrate the problem.
> 
> I understand from the comment that it has something to do with being able to run 32 and 64-bit executables on the same operating system.  Isn't this the case everywhere?  I can run 32-bit executables on Windows x64 as well.  llvm::Triple has a function called get32BitArchVariant(), which I thought returns a 32-bit triple for exactly this case.  Does this not work for some Apple configuration?
> 
> It seems like this logic could be sunk into llvm::Triple somehow.  Conceptually speaking, there should be two cases:
> 
> 64-bit:
> host_arch_64 = llvm::sys::getDefaultTargetTriple()
> host_arch_32 = llvm::Triple(llvm::sys::getDefaultTargetTriple()).get32BitArchVariant()
> 
> 32-bit:
> host_arch_64 = <empty>
> host_arch_32 = llvm::sys::getDefaultTargetTriple()
> 
> Why doesn't this work?
> _______________________________________________
> lldb-dev mailing list
> lldb-dev at cs.uiuc.edu
> http://lists.cs.uiuc.edu/mailman/listinfo/lldb-dev
