[PATCH] D24289: Add warning when assigning enums to bitfields without an explicit unsigned underlying type

Richard Smith via cfe-commits cfe-commits at lists.llvm.org
Thu Nov 17 09:52:23 PST 2016


On 17 Nov 2016 8:56 am, "Reid Kleckner" <rnk at google.com> wrote:

rnk added a comment.

In https://reviews.llvm.org/D24289#598169, @rsmith wrote:

> This is causing warnings to fire for headers shared between C and C++,
> where the "give the enum an unsigned underlying type" advice doesn't work,
> and where the code in question will never be built for the MS ABI. It seems
> really hard to justify this being on by default.
>
> I'm going to turn it off by default for now, but we should probably
> consider turning it back on by default when targeting the MS ABI (as a
> "your code is wrong" warning rather than a "your code is not portable"
> warning).
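
For illustration, a minimal sketch of such a shared header (the names are
made up): the suggested fix uses an enum-base, which is C++-only syntax and
is rejected by a C compiler.

  /* colors.h: included from both C and C++ translation units */
  enum Color { Red, Green, Blue };          /* fine in both languages */
  /* enum Color : unsigned { Red, Green, Blue };
     The enum-base only exists in C++, so this form of the suggested fix
     does not compile as C. */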


Seems reasonable. I asked to make it off by default, but got argued down to
putting it under -Wall.

> Yeah, suggesting adding an underlying type to the enum to solve this
> problem seems like a bad idea, since that fundamentally changes the nature
> of the enum -- typically allowing it to store a lot more values, and making
> putting it in a bitfield a bad idea.

Any time you use a bitfield it stores fewer values than the original
integer type. I don't see how enums are special here. Even if I can store
values in the enum not representable as a bitwise combination of
enumerators, people usually don't, and adding an underlying type doesn't
change the conventions of normal C++ code.


The range of representable values for an enum with no fixed underlying
type is actually smaller than that of its underlying type. See
http://eel.is/c++draft/dcl.enum#8

So a bitfield of width equal to the number of bits needed to store any
enumerator does not have fewer values than the original type.
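
Concretely, a small worked example of [dcl.enum]p8:

  enum E { E0, E1, E2, E3 };   // no fixed underlying type
  // The smallest bit-field that can hold all the enumerators is 2 bits
  // wide, so the values of E are exactly 0..3, even though the
  // implementation-chosen underlying type (typically int) holds far more.
  // Hence "E b : 2" can represent every value of E.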

Do you have any better suggestions for people that want this code to do the
right thing when built with MSVC?

  enum E /* : unsigned */ { E0, E1, E2, E3 };
  struct A { E b : 2; };
  int main() {
    A a;
    a.b = E3;
    return a.b; // comes out as -1 without the underlying type
  }

Widening the bitfield wastes a bit. We could instead make the bitfield a
plain integer and cast in and out of the enum type (sketched below), but
that obscures the true type of the bitfield. So, I still support this
suggestion.
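
For reference, the plain-integer workaround would look roughly like this
(illustrative only, reusing the enum from the example above):

  enum E { E0, E1, E2, E3 };
  struct A { unsigned b : 2; };  // E is no longer visible in the declaration
  int main() {
    A a;
    a.b = E3;                    // unscoped enum converts to unsigned implicitly
    E e = static_cast<E>(a.b);   // but every read needs a cast back to E
    return e;                    // comes out as 3, with or without an enum-base
  }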


How about we consider MSVC's behaviour to just be a bug, and zero-extend
loads from enum bitfields with no negative enumerators? Since loading such
a negative value results in UB, this wouldn't change the meaning of any
program with defined behaviour, and still respects the MS ABI.
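
Roughly, that would mean lowering the read of an enum bitfield as a mask
rather than a sign extension. A sketch (not clang's actual lowering; the
helper names are made up, and this assumes the field sits in the low two
bits of a 32-bit unit on a two's-complement target):

  enum E { E0, E1, E2, E3 };
  // MSVC-style load: the 2-bit field is treated as signed, so the bit
  // pattern 0b11 (E3) sign-extends to -1.
  int load_sign_extended(unsigned storage) {
    return (int)(storage << 30) >> 30;
  }
  // Proposed load: E has no negative enumerators, so just mask; E3 stays 3.
  int load_zero_extended(unsigned storage) {
    return (int)(storage & 0x3u);
  }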

Repository:
  rL LLVM

https://reviews.llvm.org/D24289