[cfe-commits] [Review] Rolling out ASTContext::getTypeSizeInChars()

Chris Lattner clattner at apple.com
Mon Jan 11 13:59:47 PST 2010


On Jan 11, 2010, at 1:49 PM, Ted Kremenek wrote:
> On Jan 11, 2010, at 1:24 PM, Ken Dyck wrote:
>>> I'm also concerned about the dimensionality here.  Why did we
>>> choose 'Chars' instead of 'Bytes'?
>>
>> The short answer is that it reflects how getTypeSizeInChars()
>> calculates its value. It divides the bit size of the type by the bit
>> size of the char type, so calling them CharUnits seemed more accurate
>> than ByteUnits. The aim is to eventually support character widths
>> other than 8.
>>
>> What specifically are you concerned about?
>
> Hi Ken,
>
> I'm concerned that the uses of getTypeSize() / 8 always want the  
> size in bytes, not chars (if the size of chars differs from the size  
> of bytes).  Code that expects getTypeSizeInChars() to return the  
> size in bytes (which is all the cases in libAnalysis) will get the  
> wrong results.
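
A minimal standalone sketch of the division Ken describes above, using
illustrative names rather than Clang's actual API (in the tree this is
ASTContext::getTypeSizeInChars(), returning a CharUnits value):

  #include <stdint.h>

  // Hypothetical target description, for illustration only.  The char
  // width is 8 bits on most targets, but can be e.g. 16 on some DSPs.
  struct TargetInfo {
    unsigned CharWidthInBits;
  };

  // A char-unit size is the bit size of the type divided by the bit
  // width of 'char'.  Dividing by the target's char width rather than
  // a hard-coded 8 is what makes the result a count of chars, not a
  // count of octets.
  uint64_t getTypeSizeInChars(uint64_t TypeSizeInBits,
                              const TargetInfo &TI) {
    return TypeSizeInBits / TI.CharWidthInBits;
  }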

I'm pretty sure that Ken's approach is right. On a target where a char
is wider than 8 bits, sizeof(char) still returns 1, and sizeof(foo)
always returns the size in those char-sized bytes, for example.
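
A standalone illustration of that point, assuming a hosted
implementation so the sizes can be printed:

  #include <climits>
  #include <cstdio>

  struct foo { char a; char b; };

  int main() {
    // sizeof(char) is 1 by definition, however many bits CHAR_BIT says
    // a char (i.e. a byte) occupies on this target.
    std::printf("CHAR_BIT     = %d\n", CHAR_BIT);
    std::printf("sizeof(char) = %zu\n", sizeof(char));
    // sizeof counts in char-sized bytes, so on a CHAR_BIT == 16 target
    // this still prints 2, not 4.
    std::printf("sizeof(foo)  = %zu\n", sizeof(foo));
  }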

-Chris
