[lldb-dev] ELF parsing/disassembly almost, but not quite working?

Greg Clayton gclayton at apple.com
Wed Jun 23 10:44:08 PDT 2010


>>> 
>> 
>> This has to do with the default architecture that is currently being set. We will need to set the following macros correctly for Linux:
>> 
>> LLDB_ARCH_DEFAULT
>> LLDB_ARCH_DEFAULT_32BIT
>> LLDB_ARCH_DEFAULT_64BIT
>> 
>> These don't make as much sense on Linux as they do on Mac OS X. On Mac OS X we can run either 32-bit or 64-bit versions of apps on the same OS install if it is 64-bit capable (85% of our Intel machines are 64-bit capable).
>> 
>> So we probably want to set the LLDB_ARCH_DEFAULT defines correctly for the current Linux host OS with #ifdefs. This will then mean that you won't have to set the architecture unless you are doing cross-debugging.
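
Something along these lines is what I had in mind. The values below are only a guess at what these macros should expand to on an x86 Linux host, not final definitions:

#if defined(__linux__) && defined(__x86_64__)
  // 64-bit Linux host: default to x86_64, but know both flavors.
  #define LLDB_ARCH_DEFAULT        "x86_64"
  #define LLDB_ARCH_DEFAULT_32BIT  "i386"
  #define LLDB_ARCH_DEFAULT_64BIT  "x86_64"
#elif defined(__linux__) && defined(__i386__)
  // 32-bit Linux host: default to i386.
  #define LLDB_ARCH_DEFAULT        "i386"
  #define LLDB_ARCH_DEFAULT_32BIT  "i386"
  #define LLDB_ARCH_DEFAULT_64BIT  "x86_64"
#endif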
> 
> This doesn't seem like the kind of thing to handle with #ifdefs;
> shouldn't lldb be able to infer the architecture from the binary it is
> debugging?
> 
> - Daniel

On Linux, yes. On Mac OS X, sometimes. If we have a universal binary, we sometimes want to run it as i386 and sometimes as x86_64, so we need to be told which one. Other times the binary has only one slice and the decision is easy.
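
For the ELF case the class and machine type really do come straight out of the file header. A rough standalone sketch (using only the constants from <elf.h>; byte-order handling and error reporting are left out, and this is not the actual LLDB ELF reader):

#include <elf.h>
#include <cstdio>
#include <cstring>

// Return a printable architecture name for an ELF file, or nullptr if the
// file can't be read as ELF. Assumes the file's byte order matches the host.
static const char *ElfArch(const char *path) {
    unsigned char ident[EI_NIDENT];
    Elf64_Half type_machine[2]; // e_type, e_machine: same offsets in ELF32/ELF64
    FILE *f = std::fopen(path, "rb");
    if (!f)
        return nullptr;
    bool ok = std::fread(ident, 1, EI_NIDENT, f) == EI_NIDENT &&
              std::memcmp(ident, ELFMAG, SELFMAG) == 0 &&
              std::fread(type_machine, sizeof(Elf64_Half), 2, f) == 2;
    std::fclose(f);
    if (!ok)
        return nullptr;
    switch (type_machine[1]) {
    case EM_386:    return "i386";
    case EM_X86_64: return "x86_64";
    default:        return ident[EI_CLASS] == ELFCLASS64 ? "unknown 64-bit"
                                                         : "unknown 32-bit";
    }
}

int main(int argc, char **argv) {
    if (argc < 2)
        return 1;
    const char *arch = ElfArch(argv[1]);
    std::printf("%s: %s\n", argv[1], arch ? arch : "not an ELF file");
    return 0;
}

A Mach-O universal binary, on the other hand, carries several slices in one file, so the header alone can't tell us which one the user wants to run; that's where the default architecture still matters.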
