Enum Flags are hurting my brain.
1 09 Jul 2015 12:43 by u/HelloJarvis
Hi Folks, I am reading some attributes of an object that has an int32 flag attribute similar to this:
public enum attributes
{
    A = 1,
    B = 2,
    C = 4,
    D = 8,
    E = 10,
    F = 20,
    G = 40,
    H = 80,
    I = 100,
    J = 200
}
When I query the object, I get values like this back:
11
49
33
17
For love nor money, I can't figure out how to decode the UInt32 value I get back from the object to see which flags are set on it.
I've had a look at a few Stack Overflow discussions and I am just as confused as I was an hour ago. Any help would be greatly appreciated.
Thank you in advance.
3 comments
2 u/Monqui 09 Jul 2015 15:07
I might be misreading something here, but shouldn't your integer values be 2^n?
Normally bitfields are a bunch of yes/no flags: you can tell which flags are set to true by checking whether the bit assigned to your property is set to 1.
What I see here are some values that could cause collisions. If you see "10" coming out, how would you know whether that was the combo B+D (2+8), or just E on its own?
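To make that collision concrete, here is a small sketch reusing the enum exactly as posted in the question:

```csharp
using System;

// The enum exactly as posted in the question (values are NOT powers of two).
public enum attributes
{
    A = 1, B = 2, C = 4, D = 8, E = 10,
    F = 20, G = 40, H = 80, I = 100, J = 200
}

public static class CollisionDemo
{
    public static void Main()
    {
        // B | D is 2 | 8 = 10, which is exactly the value assigned to E,
        // so a stored value of 10 is ambiguous.
        int combined = (int)(attributes.B | attributes.D);
        Console.WriteLine(combined);                      // 10
        Console.WriteLine(combined == (int)attributes.E); // True
    }
}
```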
I'd try changing it to this:
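(The code block from the original comment didn't survive, but from context it was presumably a power-of-two version of the enum, something like the following reconstruction:)

```csharp
using System;

// Reconstructed power-of-two enum; the original comment's code block is
// missing. [Flags] lets ToString() print combinations like "A, B, D".
[Flags]
public enum attributes
{
    A = 1,    // 1 << 0
    B = 2,    // 1 << 1
    C = 4,    // 1 << 2
    D = 8,    // 1 << 3
    E = 16,   // 1 << 4
    F = 32,   // 1 << 5
    G = 64,   // 1 << 6
    H = 128,  // 1 << 7
    I = 256,  // 1 << 8
    J = 512   // 1 << 9
}
```

With every member on its own bit, any sum of flags decodes uniquely.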
Now, you can use what /v/trayfly mentioned and rig up a system that takes a value from that enum and does a bitwise AND with JUST that value against your object's flags: if that bit is set in both, the flag is set.
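A minimal sketch of that check, assuming the power-of-two enum (the helper name `IsSet` is illustrative, not from the thread):

```csharp
using System;

[Flags]
public enum attributes
{
    A = 1, B = 2, C = 4, D = 8, E = 16,
    F = 32, G = 64, H = 128, I = 256, J = 512
}

public static class FlagCheck
{
    // A flag is set when the bitwise AND of the stored value
    // and that single flag's bit is non-zero.
    public static bool IsSet(uint stored, attributes flag)
        => (stored & (uint)flag) != 0;

    public static void Main()
    {
        uint stored = 11; // 1 + 2 + 8, i.e. A | B | D
        Console.WriteLine(IsSet(stored, attributes.A)); // True
        Console.WriteLine(IsSet(stored, attributes.C)); // False
        // Or use the built-in helper:
        Console.WriteLine(((attributes)stored).HasFlag(attributes.D)); // True
    }
}
```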
Edit: added some clarity to the bit part in the code.