[Lldb-commits] [PATCH] Fix the remainder of warnings for the Windows build.

Todd Fiala tfiala at google.com
Fri May 30 13:00:19 PDT 2014


I think the challenge is that once we start veering away from the common data types that are guaranteed to hold the maximum size of data for a target that may not be the host, we can inadvertently end up changing data types in code that runs on multiple CPUs between the target and the debugger client.  So I think the general philosophy has been to use the generic lldb:: and lldb_private:: types all the way down to the final function call, and only do the static_cast<> to the appropriate type at the system/OS call layer.
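As a rough sketch of that pattern (nothing here is from the patch itself - the type aliases and the function are made up for illustration), the narrowing happens exactly once, at the OS boundary:

  #include <cstdint>
  #include <cstdio>

  // Stand-ins for the generic LLDB-style types (in LLDB these come from
  // lldb-types.h and are fixed at 64 bits regardless of the host).
  typedef uint64_t addr_t;
  typedef uint64_t offset_t;

  // Generic layers carry the 64-bit types end to end...
  offset_t ReadTargetMemory(addr_t target_addr, offset_t byte_count, void *buf)
  {
      // ...and only the host/OS call narrows, explicitly and in one place.
      size_t host_count = static_cast<size_t>(byte_count);
      (void)buf;  // a real implementation would hand buf/host_count to the host read here
      printf("reading %zu bytes at 0x%llx\n", host_count,
             (unsigned long long)target_addr);
      return byte_count;
  }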

In this particular case it might not be a big deal, but it puts pressure on whatever calls this interface to decide that maybe it can use a size_t as well.  To resolve that, another piece of code that sits in multiple contexts, some of which convey data across a 64/32/64 boundary, could then lose the upper 32 bits.  So it can have a domino effect: more types in more generic code end up resolving a warning about a data-type size mismatch by using what appears right - the host OS native type - which could be wrong for the system as a whole.
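For illustration only (again, not code from the patch): if size_t is 32 bits anywhere along that chain, the narrowing is silent:

  #include <cstdint>
  #include <cstdio>

  int main()
  {
      // A length that originated on a 64-bit target: just over 4 GiB.
      uint64_t target_len = 0x100000004ULL;

      // If an intermediate layer switches to size_t and the host (or a relay
      // in between) is 32-bit, the upper 32 bits are dropped with no error.
      uint32_t narrowed = static_cast<uint32_t>(target_len);  // stands in for a 32-bit size_t

      printf("original: 0x%llx  narrowed: 0x%x\n",
             (unsigned long long)target_len, narrowed);  // prints 0x4
      return 0;
  }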

So it's more a philosophy of safety that I'm appealing to in not changing that to a size_t.

http://reviews.llvm.org/D3944