[cfe-dev] [libc++] std::less and cxa_demangle.cpp

Matthew Dempsky matthew at dempsky.org
Tue Jun 18 11:02:52 PDT 2013


On Mon, Jun 17, 2013 at 2:45 PM, Matthew Dempsky <matthew at dempsky.org> wrote:
> So I think this actually reveals a bug in libc++'s implementation of
> std::less, etc.  They need partial specializations for pointer types
> to properly guarantee a total ordering, otherwise the above code
> triggers undefined behavior with libc++'s current definitions due to
> the pointer comparisons between &y and &x (and between &y and &x + 1).

Sorry, apparently this is my C background showing!  It looks like
whereas C11 says that applying the relational operators to pointers to
elements of different arrays is undefined behavior, C++11 is more
forgiving and says the result is merely unspecified.

Still, it seems the standard imposes no total ordering constraint on
the built-in pointer comparison operators.  E.g., my understanding is
that this program is allowed to output 1:

    #include <iostream>
    int main() {
        int a, b;
        std::cout << ((&a >= &b) && (&b >= &a)) << std::endl;
    }

whereas this program must output 0:

    #include <iostream>
    #include <functional>
    int main() {
        std::greater_equal<int *> gteq;
        int a, b;
        std::cout << (gteq(&a, &b) && gteq(&b, &a)) << std::endl;
    }


I don't think this is purely theoretical either.  For example, given
"T *p;" and "T q;", I think it would be reasonable for a compiler to
optimize "p >= &q" to simply true: whenever that comparison has a
specified result at all (i.e., when p points at q or one past it),
that result is "true", and in every other case the result is
unspecified, so the compiler is free to choose "true" there too.  An
optimization like this could easily result in the first program above
outputting "1" on ordinary implementations.

I think this would be very similar to the compiler optimizations
discussed in http://lwn.net/Articles/278137/.
