C++ does not permit treating an enum type as a mask of bits: or'ing two
enumerators yields an int, which cannot be stored back into a variable of
the enum type without a cast. For types that are a mask of flags, switch
the typedef to an unsigned int and use preprocessor macros for the flag
constants.
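
A minimal sketch of the shape of the change, with hypothetical flag names
(the real identifiers live in ctl_ioctl.h):

#if 0
/* Before: flag bits as enumerators.  C accepts this, but a C++
 * consumer of the header cannot compile "flags |= CTL_FLAG_BAR" on a
 * variable of the enum type, since the int result of the | would need
 * a cast back to the enum type. */
typedef enum {
	CTL_FLAG_FOO = 0x01,
	CTL_FLAG_BAR = 0x02
} ctl_flags_t;
#else
/* After: a plain unsigned int plus preprocessor macros for the bits. */
typedef unsigned int ctl_flags_t;
#define	CTL_FLAG_FOO	0x01
#define	CTL_FLAG_BAR	0x02
#endif

/* Usage is the same either way: */
/*	ctl_flags_t flags = CTL_FLAG_FOO | CTL_FLAG_BAR; */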
Details
- Reviewers: asomers, imp
- Group Reviewers: cam
- Commits: rG480009522604: ctl_ioctl.h: Do not use enums for bit fields of flags

Diff Detail
- Repository: rG FreeBSD src repository
- Lint: Not Applicable
- Unit Tests: Not Applicable

Event Timeline
This is a common pattern in CAM to make things more debuggable.
Why #define rather than some variation on const int foo = 0x1000;?
Oh, I just did #define since that is the most common way of doing constants in the tree. I would not be opposed to using const int.
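
For concreteness, the two styles under discussion, with made-up names. One
practical difference worth noting: in C (unlike C++), a const-qualified int
is not a constant expression, so only the macro form works in places like
case labels.

#define	FLAG_MACRO	0x1000		/* preprocessor constant */
static const int FLAG_CONST = 0x2000;	/* const object alternative */

int
classify(int v)
{
	switch (v) {
	case FLAG_MACRO:	/* fine: macro expands to a constant expression */
		return (1);
#if 0
	case FLAG_CONST:	/* error in C: not a constant expression */
		return (2);
#endif
	default:
		return (0);
	}
}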
The only reason we do the funky ENUM bit dance is so gdb prints symbolic
values better. If const int gives us that, that's a reason to use it.
Otherwise, I'm neutral: I'm not dogmatically opposed to cpp, and const int
doesn't solve a real problem in this case once cpp dogma is factored out.
Maybe 'C++ custom' would be reason enough if we had a larger C++ developer
community that adhered to that custom.
tl;dr: I'm good either way unless const int gives better debugging, which I'm skeptical of now that I'm typing this out.
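
For reference, the "ENUM bit dance" looks roughly like this (a sketch with
hypothetical names): the flag bits are declared as enumerators so the
debugger can decode a flags word symbolically.

/* Declaring the bits as enumerators lets gdb print a value of this
 * type symbolically, e.g. "p flags" shows (DEV_OPEN | DEV_BUSY)
 * rather than the raw 3. */
typedef enum {
	DEV_OPEN = 0x01,
	DEV_BUSY = 0x02,
	DEV_GONE = 0x04
} dev_flags;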
Hmmm, I don't think const int helps:

enum.c:
/* What is more debuggable? */
#include <stdio.h>

enum color { RED, BLUE, GREEN };

const int YELLOW = GREEN + 1;
const int ORANGE = YELLOW + 1;

int
main(int ac, char **av)
{
	enum color c = RED;

	printf("RED = %d\n", c);
	(void)getchar();
	c = YELLOW;
	printf("YELLOW = %d\n", c);
	(void)getchar();
	return (0);
}
Transcript in gdb:
(gdb) start
Temporary breakpoint 1 at 0x1887: file enum.c, line 14.
Starting program: /usr/home/john/work/johnsvn/test/enum/enum

Temporary breakpoint 1, main (ac=1, av=0x7fffffffe1b8) at enum.c:14
14		printf("RED = %d\n", c);
(gdb) p c
$1 = RED
(gdb) n
RED = 0
15		(void)getchar();
(gdb) n
17		printf("YELLOW = %d\n", c);
(gdb) p c
$2 = (BLUE | GREEN)
(gdb) n
YELLOW = 3
18		(void)getchar();
(gdb) p YELLOW
$3 = 3
I suspect what you find useful is the "p c" case, and there's no way for the debugger to know which named constants a plain int value might correspond to, so it doesn't map a random constant of 3 back to YELLOW.