[llvm-commits] [llvm] r149980 - /llvm/trunk/include/llvm/Bitcode/BitstreamReader.h

NAKAMURA Takumi geek4civic at gmail.com
Tue Feb 7 05:54:46 PST 2012


Duncan, thank you for reviewing my change.

2012/2/7 Duncan Sands <baldrick at free.fr>:
> Hi Takumi,
>
>> --- llvm/trunk/include/llvm/Bitcode/BitstreamReader.h (original)
>> +++ llvm/trunk/include/llvm/Bitcode/BitstreamReader.h Tue Feb  7 04:53:19 2012
>> @@ -17,6 +17,7 @@
>>
>>   #include "llvm/ADT/OwningPtr.h"
>>   #include "llvm/Bitcode/BitCodes.h"
>> +#include "llvm/Support/Endian.h"
>>   #include "llvm/Support/StreamableMemoryObject.h"
>>   #include<climits>
>>   #include<string>
>> @@ -242,12 +243,13 @@
>>     }
>>
>>     uint32_t getWord(size_t pos) {
>> -    uint32_t word = -1;
>> +    uint8_t buf[sizeof(uint32_t)];
>> +    memset(buf, 0xFF, sizeof(buf));
>>       BitStream->getBitcodeBytes().readBytes(pos,
>> -                                           sizeof(word),
>> -                                           reinterpret_cast<uint8_t *>(&word),
>> +                                           sizeof(buf),
>> +                                           buf,
>>                                              NULL);
>
> why did you need to introduce buf?  I don't see how it differs from
> using word.
>
>> -    return word;
>> +    return *reinterpret_cast<support::ulittle32_t *>(buf);
>
> Presumably this is where the real content of the change is.  Does it work
> if you use &word rather than buf here?

In my experiments, I could not satisfy strict aliasing (on g++) with
either uint32_t or ulittle32_t.
That is why I introduced the uint8_t[4] buffer.

I suspect Support/Endian.h may not be strict-aliasing-safe in some
cases.

In this case, I could also implement it without ulittle32_t by packing
the four uint8_ts manually.

Anyway, feel free to rework this if anyone knows a smarter solution. Thank you.

...Takumi




More information about the llvm-commits mailing list