[lldb-dev] ELF parsing/disassembly almost, but not quite working?

Daniel Dunbar daniel at zuster.org
Wed Jun 23 10:48:34 PDT 2010


On Wed, Jun 23, 2010 at 10:44 AM, Greg Clayton <gclayton at apple.com> wrote:
>>> This has to do with the default architecture that is currently being set. We will need to set the following macros correctly for Linux:
>>>
>>> LLDB_ARCH_DEFAULT
>>> LLDB_ARCH_DEFAULT_32BIT
>>> LLDB_ARCH_DEFAULT_64BIT
>>>
>>> These don't make as much sense on Linux as they do on Mac OS X. On Mac OS X we can run either 32- or 64-bit versions of apps on the same OS install if it is 64-bit capable (85% of our Intel machines are 64-bit capable).
>>>
>>> So we probably want to set the LLDB_ARCH_DEFAULT defines correctly for the current Linux host OS with #ifdefs. That would mean you won't have to set the architecture unless you are doing cross-debugging.
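(For illustration, a minimal sketch of what such #ifdef-selected defaults
might look like -- the triple strings below are guesses for the sketch,
not the values lldb actually defines:)

    // Hypothetical host configuration excerpt; the triples are examples only.
    #if defined(__APPLE__)
    #  define LLDB_ARCH_DEFAULT        "x86_64-apple-darwin"
    #  define LLDB_ARCH_DEFAULT_32BIT  "i386-apple-darwin"
    #  define LLDB_ARCH_DEFAULT_64BIT  "x86_64-apple-darwin"
    #elif defined(__linux__) && defined(__x86_64__)
    #  define LLDB_ARCH_DEFAULT        "x86_64-pc-linux"
    #  define LLDB_ARCH_DEFAULT_32BIT  "i386-pc-linux"
    #  define LLDB_ARCH_DEFAULT_64BIT  "x86_64-pc-linux"
    #elif defined(__linux__) && defined(__i386__)
    #  define LLDB_ARCH_DEFAULT        "i386-pc-linux"
    #  define LLDB_ARCH_DEFAULT_32BIT  "i386-pc-linux"
    #endif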
>>
>> This doesn't seem like the kind of thing to handle with #ifdefs;
>> shouldn't lldb be able to infer the architecture from the binary it is
>> debugging?
>>
>> - Daniel
>
> On Linux, yes. On Mac OS X, sometimes. If we have a universal binary we sometimes want to run i386 or x86_64, so we need to be told which one. Other times we have only one slice and the decision is easy.

That makes sense. For universal binaries on OS X, the default could be
picked based on the currently running architecture. This is similar to
how the compilers work, and it avoids any need for #ifdefs.
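
For the ELF case, inferring it from the binary seems straightforward.
Here is a rough sketch using plain <elf.h> constants (little-endian
files assumed for brevity -- this is not lldb's actual plumbing):

    #include <elf.h>
    #include <cstddef>
    #include <cstdint>
    #include <fstream>
    #include <string>

    // Peek at an ELF file's header and return a best-guess architecture
    // name, or an empty string if it is not a recognized ELF file.
    static std::string GuessArchFromELF(const std::string &path)
    {
        std::ifstream f(path, std::ios::binary);
        unsigned char ident[EI_NIDENT] = {};
        if (!f.read(reinterpret_cast<char *>(ident), sizeof ident))
            return "";
        if (ident[EI_MAG0] != ELFMAG0 || ident[EI_MAG1] != ELFMAG1 ||
            ident[EI_MAG2] != ELFMAG2 || ident[EI_MAG3] != ELFMAG3)
            return "";
        // e_machine sits at the same offset in Elf32_Ehdr and Elf64_Ehdr
        // (right after e_ident and e_type); byte-order handling omitted.
        uint16_t machine = 0;
        f.seekg(offsetof(Elf64_Ehdr, e_machine));
        if (!f.read(reinterpret_cast<char *>(&machine), sizeof machine))
            return "";
        switch (machine) {
        case EM_386:    return "i386";
        case EM_X86_64: return "x86_64";
        default:        return "";
        }
    }

Something along those lines for ELF targets, plus a host-based default
for the universal-binary case, would seem to cover both platforms.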

 - Daniel


