Dear RIOTers,
some time ago I had a discussion with Martine on GitHub about the usage of
enums for flags [1]. Martine convinced me that it is wise to prefer
macros over enums here, to avoid alignment issues. However, it feels somehow
wrong not to use enums for this purpose (it's easier for the developer *and*
the compiler if a valid data type is chosen). Does anyone know a trick around
the issues that Martine mentioned:
Flags have a width in memory that is in most cases smaller than
sizeof(enum) (most bit fields I know of are 16 bits max, while on most of our
newer platforms sizeof(enum) is 32 bits). This results in every
assignment needing to be cast to either uint8_t or uint16_t. With macros you
don't need to cast since they are typeless.
Making the enum packed makes its width unpredictable in terms of alignment
issues when used in a struct (which is not the case here, I know).
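To make the width mismatch concrete, here is a small sketch (all names are
made up, and I'm assuming -Wconversion or similar is part of the build flags):

    #include <stdint.h>

    /* hypothetical names, just to illustrate the width mismatch */
    typedef enum {
        IFACE_FLAG_UP   = 0x01,
        IFACE_FLAG_LINK = 0x02,
    } iface_flags_t;          /* sizeof(iface_flags_t) is 4 on our newer platforms */

    typedef struct {
        uint16_t flags;       /* the field itself is only 16 bits wide */
    } iface_t;

    static inline void iface_set_link(iface_t *iface)
    {
        /* without the cast, -Wconversion complains that the int-promoted
         * result may not fit into the 16-bit field */
        iface->flags = (uint16_t)(iface->flags | IFACE_FLAG_LINK);
    }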
Cheers,
Oleg
[1] https://github.com/RIOT-OS/RIOT/pull/2614#discussion_r28941692
A silly question/suggestion:
Wouldn't it make sense to use sized integer types in the struct, but a separate enum to define the values? IIRC, enums in C are simply named integer constants. C is also very permissive about storing values into smaller integer types. For C++, of course, the situation may not be that easy.
#include <stdint.h>

typedef struct {
    uint16_t flags;
} foo_t;

typedef enum {
    FOO_FLAG_ONE   = 0x01,
    FOO_FLAG_TWO   = 0x02,
    FOO_FLAG_THREE = 0x04,
} foo_flags_t;
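A minimal usage sketch of that split (the functions are made up); note that
strict flags like -Wconversion may still complain about read-modify-write
assignments on the narrow field:

    /* foo_t and foo_flags_t as defined above */

    void foo_init(foo_t *foo)
    {
        /* the enumerators are plain int constants, so this stores into
         * the 16-bit field without a cast */
        foo->flags = FOO_FLAG_ONE | FOO_FLAG_TWO;
    }

    int foo_has_three(const foo_t *foo)
    {
        return (foo->flags & FOO_FLAG_THREE) != 0;
    }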
--Pekka
Hi!
Another possibility would be to use packed enums but enforce/check their size
at build time with something like what Rene proposed here:
https://github.com/RIOT-OS/RIOT/pull/1286
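I haven't re-checked the details of that PR, but the general idea would be
something along these lines (a sketch, not the actual code from the PR;
_Static_assert is C11, older compilers need the negative-array-size trick):

    #include <stdint.h>

    typedef enum __attribute__((packed)) {
        FOO_FLAG_ONE   = 0x01,
        FOO_FLAG_TWO   = 0x02,
        FOO_FLAG_THREE = 0x04,
    } foo_flags_t;

    /* fail the build if the packed enum does not have the expected width */
    _Static_assert(sizeof(foo_flags_t) == 1, "foo_flags_t should be 1 byte");

    typedef struct {
        foo_flags_t flags;   /* the enum type itself can now be used in the struct */
        uint8_t other;
    } foo_t;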
Cheers,
Oleg
gcc's -fshort-enums might do what you describe:
https://gcc.gnu.org/onlinedocs/gcc/Code-Gen-Options.html:
-fshort-enums Allocate to an enum type only as many bytes as it needs
for the declared range of possible values. Specifically, the enum type
is equivalent to the smallest integer type that has enough room.
Warning: the -fshort-enums switch causes GCC to generate code that is
not binary compatible with code generated without that switch. Use it
to conform to a non-default application binary interface.
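A quick way to check what that does on a given target (enum name made up):

    typedef enum {
        BAR_FLAG_ONE = 0x01,
        BAR_FLAG_TWO = 0x02,
    } bar_flags_t;

    /* compiled with -fshort-enums the declared range fits into one byte;
     * without the switch, most of our 32-bit ABIs give sizeof == 4 */
    _Static_assert(sizeof(bar_flags_t) == 1,
                   "expected -fshort-enums to shrink this enum");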
/Joakim