[lldb-dev] ELF parsing/disassembly almost, but not quite working?
jingham at apple.com
Wed Jun 23 11:37:18 PDT 2010
Yes, one of the parts of lldb that hasn't been fully fleshed out is the agent that will dynamically figure this sort of thing out. It can also change as the debugging session proceeds: for instance, you might be given a 64/32-bit file on a 64-bit capable machine and then be told to attach to a 32-bit process. So this isn't entirely trivial.
Until we get that system going, we are using some defines like this.
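A minimal sketch of what such host-conditional defines might look like; the 32/64-bit variant macro names and the exact triples here are assumptions for illustration, not lldb's actual header:

/* Sketch only: LLDB_ARCH_DEFAULT comes from the thread below; the
   _32BIT/_64BIT variants and the triples are assumed for illustration. */
#if defined (__APPLE__)
#define LLDB_ARCH_DEFAULT        "x86_64-apple-darwin"
#define LLDB_ARCH_DEFAULT_32BIT  "i386-apple-darwin"
#define LLDB_ARCH_DEFAULT_64BIT  "x86_64-apple-darwin"
#elif defined (__linux__) && defined (__x86_64__)
#define LLDB_ARCH_DEFAULT        "x86_64-unknown-linux"
#define LLDB_ARCH_DEFAULT_32BIT  "i386-unknown-linux"
#define LLDB_ARCH_DEFAULT_64BIT  "x86_64-unknown-linux"
#elif defined (__linux__)
#define LLDB_ARCH_DEFAULT        "i386-unknown-linux"
#define LLDB_ARCH_DEFAULT_32BIT  "i386-unknown-linux"
#endif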
On Jun 23, 2010, at 10:48 AM, Daniel Dunbar wrote:
> On Wed, Jun 23, 2010 at 10:44 AM, Greg Clayton <gclayton at apple.com> wrote:
>>>> This has to do with the default architecture that is currently being set. We will need to set the LLDB_ARCH_DEFAULT macros correctly for Linux.
>>>> These don't make as much sense on Linux as they do on Mac OS X. On Mac OS X we can run either 32- or 64-bit versions of apps on the same OS install if it is 64-bit capable (85% of our Intel machines are 64-bit capable).
>>>> So we probably want to set the LLDB_ARCH_DEFAULT defines correctly for the current Linux host OS with #ifdefs. This will then mean that you won't have to set the architecture unless you are doing cross debugging.
>>> This doesn't seem like the kind of thing to handle with #ifdefs;
>>> shouldn't lldb be able to infer the architecture from the binary it is given?
>>> - Daniel
>> On Linux, yes. On Mac OS X, sometimes. If we have a universal binary we sometimes want to run i386 or x86_64, so we need to be told which one. Other times we have only one slice and the decision is easy.
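For the Linux case, "inferring from the binary" amounts to reading the ELF identification bytes. A rough, self-contained sketch of that kind of check (not lldb's actual ObjectFileELF code; it assumes a little-endian ELF file):

#include <elf.h>
#include <stdio.h>
#include <string.h>

/* Illustration only: guess the architecture from an ELF header. */
const char *
InferArchFromELF (const char *path)
{
    unsigned char buf[EI_NIDENT + 4] = { 0 };
    FILE *f = fopen (path, "rb");
    if (f == NULL)
        return NULL;
    size_t n = fread (buf, 1, sizeof (buf), f);
    fclose (f);
    if (n < sizeof (buf) || memcmp (buf, ELFMAG, SELFMAG) != 0)
        return NULL;                          /* not an ELF file */
    /* e_machine is the 16-bit field at offset 18, right after e_type
       (little-endian assumed for brevity). */
    unsigned machine = buf[EI_NIDENT + 2] | (buf[EI_NIDENT + 3] << 8);
    if (machine == EM_X86_64)
        return "x86_64";
    if (machine == EM_386)
        return "i386";
    /* EI_CLASS still distinguishes 32- from 64-bit for other machines. */
    return buf[EI_CLASS] == ELFCLASS64 ? "unknown 64-bit" : "unknown 32-bit";
}

A Mach-O universal file, by contrast, carries several such slices in one file, which is why a choice still has to be made on Mac OS X.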
> That makes sense. On OS X with universal binaries, it might make sense
> to pick the default based on the currently running architecture. This
> is similar to how the compilers work, and avoids any need for #ifdefs.
> - Daniel