ctl_ioctl.h: Do not abuse enums for bit fields of flags
Closed, Public

Authored by jhb on Feb 26 2025, 3:54 PM.
Details

Summary

C++ does not permit treating enum values as individual bits used with
the bitwise operators. For types that are a mask of flags, switch the
typedef to an unsigned int and use preprocessor macros for flag
constants.

Diff Detail

Repository
rG FreeBSD src repository
Lint: Not Applicable
Unit Tests: Not Applicable

Event Timeline

jhb requested review of this revision. Feb 26 2025, 3:54 PM
This revision is now accepted and ready to land. Mar 15 2025, 7:34 PM

This is a common pattern in CAM to make things more debuggable.

Why #define rather than some variation on const int foo = 0x1000;?

In D49138#1126074, @imp wrote:

This is a common pattern in CAM to make things more debuggable.

Why #define rather than some variation on const int foo = 0x1000;?

Oh, I just did #define since that is the most common way of doing constants in the tree. I would not be opposed to using const int.

In D49138#1126519, @jhb wrote:
In D49138#1126074, @imp wrote:

This is a common pattern in CAM to make things more debuggable.

Why #define rather than some variation on const int foo = 0x1000;?

Oh, I just did #define since that is the most common way of doing constants in the tree. I would not be opposed to using const int.

The only reason we do the funky ENUM bit dance is so gdb prints symbolic values better.
If const int gives us that, that's a reason to use it. Otherwise, I'm neutral: I'm not dogmatically
opposed to cpp, and const int doesn't solve a real problem in this case when cpp dogma is
factored out. Maybe 'C++ custom' would be reason enough to do it if we had a larger C++
developer community that adhered to that custom.

tl;dr: I'm good either way unless const int gives better debugging which I'm skeptical of now that I'm typing this out.

Hmmm,

In D49138#1126520, @imp wrote:
In D49138#1126519, @jhb wrote:
In D49138#1126074, @imp wrote:

This is a common pattern in CAM to make things more debuggable.

Why #define rather than some variation on const int foo = 0x1000;?

Oh, I just did #define since that is the most common way of doing constants in the tree. I would not be opposed to using const int.

The only reason we do the funky ENUM bit dance is so gdb prints symbolic values better.
If const int gives us that, that's a reason to use it. Otherwise, I'm neutral: I'm not dogmatically
opposed to cpp, and const int doesn't solve a real problem in this case when cpp dogma is
factored out. Maybe 'C++ custom' would be reason enough to do it if we had a larger C++
developer community that adhered to that custom.

tl;dr: I'm good either way unless const int gives better debugging which I'm skeptical of now that I'm typing this out.

I don't think const int helps:

enum.c

/* What is more debuggable? */

#include <stdio.h>

enum color { RED, BLUE, GREEN };

const int YELLOW = GREEN + 1;
const int ORANGE = YELLOW + 1;

int
main(int ac, char **av)
{
	enum color c = RED;
	printf("RED = %d\n", c);
	(void)getchar();
	c = YELLOW;
	printf("YELLOW = %d\n", c);
	(void)getchar();
	return (0);
}

Transcript in gdb:

(gdb) start
Temporary breakpoint 1 at 0x1887: file enum.c, line 14.
Starting program: /usr/home/john/work/johnsvn/test/enum/enum 

Temporary breakpoint 1, main (ac=1, av=0x7fffffffe1b8) at enum.c:14
14              printf("RED = %d\n", c);
(gdb) p c
$1 = RED
(gdb) n
RED = 0
15              (void)getchar();
(gdb) n

17              printf("YELLOW = %d\n", c);
(gdb) p c
$2 = (BLUE | GREEN)
(gdb) n
YELLOW = 3
18              (void)getchar();
(gdb) p YELLOW
$3 = 3

I suspect what you find useful is the p c case, and there is no way for the debugger to know which constants a plain integer value might refer to, so it doesn't map a random value of 3 to YELLOW.