I'm updating a physics simulation written in C to use enums in place of long lists of #defines, but have run into an odd bug. A structure contains an enum as:
```c
enum spec_mod_type_enum
{
    SPEC_MOD_PL = 1,
    SPEC_MOD_EXP = 2,
} spec_mod_type[NXBANDS];
```
(NXBANDS is just a #defined value)
Due to an oversight, no enumerator was added for -1, whilst in another file the member is assigned as:

```c
xplasma->spec_mod_type[n] = -1;
```
However, when compiled with both clang and gcc this results in a silent failure: the stored value is not -1 but something else entirely, with unpleasant consequences. This is odd because:
I was under the impression enums could be set to values outside their range.
We get no warnings about this with -Wall (or -Wextra), even though it seems like exactly the sort of thing enums are supposed to warn about.
Could anyone enlighten me as to why this might be happening? And/or which compiler flags would warn us about this, or at least change the default behaviour for enums to allow this assignment?
The behaviour of your program could vary from platform to platform:
The C standard allows the compiler to choose any underlying integral type for the enumeration that is capable of representing all the explicit values given: 1 and 2 in your case.
So a compiler might pick an unsigned type for your enumeration. Assigning a negative value in that case would cause wraparound modulo 2^n where n is the number of bits used to represent the unsigned type.
On the other hand, it might pick a signed type, in which case -1 would be representable.
One remedy would be to introduce a negative dummy value into your enumeration.