1.) Implicit conversion to an integer.
Plain enums are not type-safe. They do prevent you from directly assigning a value of one enum type to another, but nothing stops an enum value from implicitly converting to an integer, which means values of completely unrelated enums can be compared.
enum PrimaryColor { Red = 0, Blue, Yellow };
enum FavoriteColor { Green = 0, Orange, Purple };

PrimaryColor pC = Red;
FavoriteColor fC = Orange;

pC = fC;    // Error: you cannot assign one enum type's value directly to another.
pC = Green; // Same error.

bool primaryColorIsGreater = (pC >= Green); // Bad! This is allowed, but probably isn't intended.
Notice on the final line we are able to make a comparison between a primary color and a favorite color. Green isn't even a primary color, so this is probably not intended.
If you want something type safe, the enum class comes to the rescue!
enum class PrimaryColor { Red = 0, Green, Blue };
enum class FavoriteColor { Green = 0, Orange, Purple };

PrimaryColor pC = PrimaryColor::Red;      // Note: enumerators must now be qualified.
FavoriteColor fC = FavoriteColor::Orange;

pC = fC;                   // Error: you cannot assign one enum type's value directly to another.
pC = FavoriteColor::Green; // Same error.

bool primaryColorIsGreater = (pC >= FavoriteColor::Green); // Error: you cannot directly compare enumerators from different enum classes.
2.) Scope
Plain enums are not strongly scoped: an enum's enumerators live in the same scope as the enum itself. For example, what if, in the example above, we wanted FavoriteColor to share some of the same color names as PrimaryColor?
enum PrimaryColor { Red = 0, Blue, Yellow };
enum FavoriteColor
{
    Green = 0,
    Red,  // Error: 'Red' is already defined in the PrimaryColor enum.
    Blue  // Error: 'Blue' is already defined in the PrimaryColor enum.
};
We're not able to do this with standard enums because their enumerators aren't scoped; the new enum class, however, lets us.
enum class PrimaryColor { Red = 0, Green, Blue };
enum class FavoriteColor
{
    Green = 0,
    Red,  // Ok
    Blue  // Ok
};
3.) Inability to specify underlying type
The underlying type of an enum is not portable, because different compilers will use different underlying types for an enum. For example, if you're using an enum directly in a packet of information, the sender and receiver may have a different perception of what size that enum value takes.
enum Version { Version1 = 1, Version2 = 2, Version3 = 3 };

struct Packet
{
    Version version; // Bad! The size of this field can vary by implementation.
    // More data here
};
You can work around this, but it's not ideal (hence 'workaround'):
struct Packet
{
    unsigned char version; // This works, but requires casting.
    // More data here
};
The workaround is ugly: we shouldn't have to store a version number as a char, require the user on the other end to know what kind of data that char really holds, and force them to cast it back to an integer (or enum). The enum class solves this for us by letting us specify the underlying type of the enum, so we can guarantee what size it will be.
enum class Version : unsigned { Version1 = 1, Version2 = 2, Version3 = 3 };

struct Packet
{
    Version version; // Safe: we know 'version' is an unsigned int.
    // More data here
};
Also, because the underlying type of a standard enum differs by implementation, using values that assume a signed or unsigned representation can be unsafe. Take, for example, the enum below:
enum MyEnum
{
    Value1 = 1,
    Value2 = 2,
    ValueBig = 0xFFFFFFF0U
};
Note that the last value has been explicitly set to an unsigned int. Because different compilers implement enums differently, the resulting value of ValueBig also differs depending on what you're compiling with. This means your code is no longer portable, and will only work as intended on some compilers. Compilers that treat the enum as unsigned give ValueBig the value 4294967280; those that treat it as signed give it -16. Even worse, there are compilers that treat ValueBig as 4294967280, but when comparing it against -1 will tell you that ValueBig is less than -1. The enum class solves this problem by allowing the programmer to specify the type. If we want 'ValueBig' to be 0xFFFFFFF0U, then we just make sure that our enum class is specified to be an unsigned int:
enum class MyEnum : unsigned
{
    Value1 = 1,
    Value2 = 2,
    ValueBig = 0xFFFFFFF0U // ValueBig is now guaranteed to be 4294967280.
};
In Visual Studio 2011 the enum class is signed by default. This is in fact what the C++11 standard requires: unless you specify otherwise, the underlying type of a scoped enum is int, precisely to prevent the portability issues that plague the basic enum.
enum class MyEnum // We do not specify the underlying type
{
    Value1 = 1,
    Value2 = 2,
    ValueBig = 0xFFFFFFF0U // ValueBig in VS2011 is -16.
};
Some IDEs are helpful and will show you what your values are going to be so there are no surprises.
Sources:
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2347.pdf