[LLVMdev] trouble building gcc-frontend from source

Dale Johannesen dalej at apple.com
Mon Mar 16 09:44:57 PDT 2009


On Mar 16, 2009, at 4:23 AM PDT, Gautam Sewani wrote:

> On Mon, Mar 16, 2009 at 4:17 AM, Dale Johannesen <dalej at apple.com>  
> wrote:
>>
>> On Mar 15, 2009, at 12:40 PM, Duncan Sands wrote:
>>>
>>>> Glad that works for you, but it means that if the 32-bit cost
>>>> computation overflows, we won't be told about it.  I think the right
>>>> thing is to make sure the computation saturates at 30 bits instead
>>>> of overflowing.  Am I going to talk myself into overloading
>>>> operator+?
>>>
>>> is this problem really real?  Or has LLVM been miscompiled?
>>> The two people who have reported this were running very similar
>>> systems with broken compiler versions...
>>
>> Ah.  Probably not a real problem then; I was quite surprised it
>> overflowed 30 bits, although it's certainly possible in theory.  I
>> haven't seen any costs that come anywhere close to overflowing.  A
>> broken host compiler would explain it nicely; I missed that.
>>
> This is weird, because I tried it using the gcc-4.2 compiler too, and
> it gave the same result. I'll probably try with Intel now. BTW,
> changing the CC and CXX environment variables is the correct way to
> change the compiler used for building the llvm-gcc frontend, right?
> (The docs specified this method for changing the compiler while
> building the LLVM suite.)
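
Going back to the saturation idea quoted above: a minimal sketch of what
a cost wrapper with a clamping operator+ could look like is below.  The
Cost type and MaxValue constant are made-up names for illustration, not
the actual llvm-gcc code, and only the upper bound is clamped here:

  // Sketch only: hypothetical Cost type whose operator+ saturates at
  // 30 bits instead of silently wrapping around on overflow.
  struct Cost {
    enum { MaxValue = (1 << 30) - 1 };   // 30-bit saturation ceiling
    int Value;
    explicit Cost(int V = 0)
      : Value(V > (int)MaxValue ? (int)MaxValue : V) {}
  };

  inline Cost operator+(Cost A, Cost B) {
    // Add in a wider type first, then clamp to the ceiling, so the
    // signed addition itself can never overflow.
    long long Sum = (long long)A.Value + (long long)B.Value;
    return Cost(Sum > (long long)Cost::MaxValue ? (int)Cost::MaxValue
                                                : (int)Sum);
  }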

It occurs to me that the problem may be negative costs, which are
possible.  The signedness of "int" bitfields is implementation-defined,
so this one really ought to be declared "signed int".  Could you try
that and see if it helps?
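
For illustration, here is a tiny standalone example of the difference
(the field names are made up, not taken from the llvm-gcc sources);
whether the plain-int field can hold negative values at all depends on
the compiler:

  #include <cstdio>

  struct Demo {
    int A : 4;         // plain "int" bit-field: signedness is
                       // implementation-defined (C99 6.7.2.1, C++03 9.6)
    signed int B : 4;  // explicitly signed: negative values guaranteed
  };

  int main() {
    Demo D;
    D.A = -1;          // may read back as 15 if plain bit-fields are unsigned
    D.B = -1;          // always reads back as -1
    std::printf("A = %d, B = %d\n", (int)D.A, (int)D.B);
    return 0;
  }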
