BITSET uses long as its basic underlying type, whose width depends on
the compilation target: on 32-bit builds the basic type is 32 bits,
but on 64-bit builds it is 64 bits. On little-endian architectures
this doesn't matter, because the least significant byte is stored at
the lowest address, so the 32-bit words are effectively concatenated
when moving between 32-bit and 64-bit builds. On big-endian
architectures, however, it throws a wrench in: setting bit 0 in
32-bit mode is equivalent to setting bit 32 in 64-bit mode. To
demonstrate:
32-bit mode:
BIT_SET(foo, 0): 0x00000001
64-bit sees: 0x0000000100000000
Solve this by treating the bitset as a bit string, where bits are
assigned linearly starting from the most significant bit of the first
word. With that layout, we see it as follows:
BIT_SET(foo, 0): 0x80000000
64-bit sees: 0x8000000000000000