Enum Flags are hurting my brain.

1    09 Jul 2015 12:43 by u/HelloJarvis

Hi folks, I am reading some attributes of an object that has an Int32 flags attribute similar to this:

    public enum attributes
    {
        A = 1,
        B = 2,
        C = 4,
        D = 8,
        E = 10,
        F = 20,
        G = 40,
        H = 80,
        I = 100,
        J = 200
    }

When I query the object, I get values like these back:

11

49

33

17

For love nor money, I cannot figure out how to decode the UInt32 value I get back from the object to see which flags are set on it.

I've had a look at a few Stack Overflow discussions and I am just as confused as I was an hour ago. Any help would be greatly appreciated.

Thank you in advance.

3 comments


I might be misreading something here, but shouldn't your integer values be 2^n?

Normally bitfields hold a bunch of yes/no flags: you can tell which flags are set by checking whether the bit assigned to your property is 1.

What I see here are some values that could cause collisions. If you see 10 coming out, how would you know whether that was the combination B + D (2 + 8) or just E on its own?
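A quick sketch of that collision, using the values from your original enum (the variable names here are just for illustration):

```csharp
using System;

// With the original enum's values: B = 2, D = 8, E = 10
int bPlusD = 2 + 8; // object stores B and D together
int eAlone = 10;    // object stores just E

// Both encodings produce the same stored value,
// so a reader of the stored value cannot tell them apart.
Console.WriteLine(bPlusD == eAlone); // prints "True"
```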

I'd try changing it to this:

    public enum attributes
    {
        A = 1,    // 0b0000000001
        B = 2,    // 0b0000000010
        C = 4,    // 0b0000000100
        D = 8,    // 0b0000001000
        E = 16,   // 0b0000010000
        F = 32,   // 0b0000100000
        G = 64,   // 0b0001000000
        H = 128,  // 0b0010000000
        I = 256,  // 0b0100000000
        J = 512   // 0b1000000000
    }

Now you can use what /v/trayfly mentioned and rig up a system that takes each value from that enum and does a bitwise AND of just that value against your object's attribute value: if that bit is set in both, the flag is set.
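For example, a minimal sketch of that check (the `[Flags]` attribute, the `Attributes` name, and the `value` variable are my additions; `[Flags]` only changes how the enum prints, but it documents the intent):

```csharp
using System;

[Flags]
public enum Attributes
{
    A = 1,   B = 2,   C = 4,    D = 8,    E = 16,
    F = 32,  G = 64,  H = 128,  I = 256,  J = 512
}

class Program
{
    static void Main()
    {
        uint value = 11; // one of the values the object returned

        foreach (Attributes flag in Enum.GetValues(typeof(Attributes)))
        {
            // Bitwise AND keeps only the bits set in both operands,
            // so a non-zero result means this flag's bit is set in value.
            if ((value & (uint)flag) != 0)
                Console.WriteLine(flag); // 11 = 1 + 2 + 8 -> prints A, B, D
        }
    }
}
```

With `[Flags]` applied, `((Attributes)11).ToString()` also gives you "A, B, D" directly, which is handy for logging.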

edit: added some clarity to the bit comments in the code.