[cfe-commits] r58685 - /cfe/trunk/lib/Headers/stddef.h

Sebastian Redl sebastian.redl at getdesigned.at
Tue Nov 4 06:03:34 PST 2008

Chris Lattner wrote:
> This is fine in the short term, but I don't think this will work in  
> general.
It's the way every C++ compiler out there does it.
>   Consider if you have:
> somevarargsfunction(1, 2, NULL);
> This will pass as an int, instead of as a pointer.  This matters on 64- 
> bit targets.
It matters everywhere for overloading. C++ programmers expect it. Oh, 
and we avoid varargs functions if we can.
> GCC has a strange __null extension that it uses for C++ mode, should  
> we add support for it?
It would be nice to have, but the extension doesn't do what you think. 
__null is actually an integer constant expression with value 0, which 
emits a warning if it is converted to int. typeid(__null) gives you the 
type ID of long.
__null isn't designed to make the varargs code safe, but to notify the 
programmer when he uses NULL in an actual integer context.

Perhaps we should define NULL as 0L, though. GCC's stddef.h defines NULL 
as plain 0 when the language is C++ but __GNUG__ is not defined; I 
believe that's an inconsistency they simply haven't noticed, because 
__GNUG__ is always defined when GCC compiles C++.
Note that VC++ 7.1 (Visual Studio.Net 2003) defines NULL to be 0. They 
may have changed this to 0LL in those versions that actually support 
64-bit targets, though. (VC++ has 32-bit longs under 64-bit 
architectures, so 0L wouldn't be sufficient.)

More interesting than __null would be to implement nullptr, the real 
null pointer constant from C++0x.

