[LLVMbugs] [Bug 1762] llvm-gcc sees a 128 bit integer, but there is none
bugzilla-daemon at cs.uiuc.edu
Sat Nov 3 16:49:19 PDT 2007
http://llvm.org/bugs/show_bug.cgi?id=1762
Duncan Sands <baldrick at free.fr> changed:
           What           |Removed     |Added
----------------------------------------------------------------------------
         Status           |NEW         |RESOLVED
     Resolution           |            |INVALID
--- Comment #3 from Duncan Sands <baldrick at free.fr> 2007-11-03 18:49:18 ---
PS: On x86-32 I see that 64 bit arithmetic is being used. Consulting
the gcc tree dumps, it seems this is because gcc is doing arithmetic
using "bitsizetype", a gcc internal type which has twice as many bits
as the system word size (so 128 bits on your machine). I checked that
mainline gcc-4.2 also uses bitsizetype for your testcase. So while
I understand your surprise at this testcase doing 128 bit arithmetic,
this is really a gcc issue: mainline generates 128 bit arithmetic for it
too, only you never noticed :) I'm closing this as invalid. However, it
could also be marked as a duplicate of the bug asking for 128 bit support
(PR1462) if you like.