35 comments

0

A real thing of beauty.

0

C++ : Where friends have access to your private members.

0

#define private public ftw

0

Versus my dumbass of a C++ professor who, without fail, would leave the "L" out of "public" every single time she wrote out a class structure on the board. She always got embarrassed every time she noticed her mistake, too.

0

Freudian slip

0

Valid only if you don't include any headers, otherwise it's non-conforming.

17.4.3.1.1 Macro names [lib.macro.names]

1 Each name defined as a macro in a header is reserved to the implementation for any use if the translation unit includes the header.

2 A translation unit that includes a header shall not contain any macros that define names declared or defined in that header. Nor shall such a translation unit define macros for names lexically identical to keywords.

0

I think it also depends on which compiler you use. For example, I'm pretty sure GCC/G++ allows this unconditionally, whereas I think Clang/LLVM doesn't (depending on which revision of C/C++ you're using).

0

Oh of course, it's only relevant if you're writing code that you want to reliably work on any compiler (portable code).

0

This is beyond my understanding, quick rundown?

0

In C, it can be useful to have a way to test whether a certain value is known at compile-time or not. That is what this crazy-ass macro does through arcane abuse of C evaluation rules.

Consider you have some common thing you need to do in your program which does a lot of arithmetic, like this:

int myfunc(int value)
{
    return ((value * 1032443897) / 35468 + 934) / 34 - 34673264;
}

That's a lot of math operations. I mean, a modern processor still eats it in a couple of nanoseconds, probably, but if you need to do this a lot in time-critical code it might add up. The thing is, maybe in some cases you want to call it on a constant value, like this:

int myvalue = myfunc(3948);

In this case, the 3948 is compile-time constant, so the compiler could do all the math at compile time and just write the fully calculated end result right into the program binary. There would be no more cost at runtime. However, if you just call the function like that it won't do this, because C compilers usually only translate one source file at a time (ignoring a fancy new thing called LTO for a second), and so if myfunc() is defined in a different source file than the code that calls it, the compiler doesn't have the knowledge of how to do the calculation in its head at the same time as it sees the constant. It has to generate a normal function call that calculates the result at runtime.
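
To make the "different source file" point concrete, here's a rough sketch (the file names are just illustrative) of the situation where the compiler can't fold the constant:

/* math_ops.c -- hypothetical separate source file containing the function */
int myfunc(int value)
{
    return ((value * 1032443897) / 35468 + 934) / 34 - 34673264;
}

/* main.c -- compiled on its own it only sees the declaration, so without
   LTO it has to emit a real call and do the math at runtime */
int myfunc(int value);

int main(void)
{
    int myvalue = myfunc(3948);
    return myvalue;
}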

In order to fix this, you can write a macro like this:

#define mymacro(x) ((((x) * 1032443897) / 35468 + 934) / 34 - 34673264)

Macros go into header files (meaning they're essentially a part of every source file they're needed in) and are always fully evaluated by the compiler at the time they're invoked. So now when you write

int value = mymacro(3948);

you can be sure that the compiler will do all the math it can with that value and calculate the final result at compile-time.

But, alas, what happens if the input parameter is not a constant? Let's say you have a function like this:

void somefunction(int input)
{
    int value = mymacro(input);
    ...do something cool with the value...
}

Now the compiler cannot do all the math at compile time, and instead has to generate code into somefunction() to do it at runtime. If you have 20 different functions that all do this, that code will be duplicated 20 times in your program, making it bloated and slow (larger programs have worse instruction cache efficiency). For that case, you would rather just call myfunc() so that all these other functions only include a small function call instruction to myfunc(), and then myfunc() contains the code to actually do the math only once.

Of course, you can just pay attention when programming and make sure you always use mymacro() or myfunc() depending on whether the input parameter is constant or not. But that is annoying and error prone, and sometimes it can actually be kinda difficult to tell whether the compiler considers a value constant or not. It would be much nicer if the compiler could just make that decision for you. With Mr. Uecker's insane new creation, you can:

#define mycalc(x) (ICE_P(x) ? mymacro(x) : myfunc(x))

Now you just use mycalc() throughout your code whenever you want to do the calculation, and the compiler will either do all the math at compile-time if it can, or output a function call to the single copy of the calculation code that's in your program binary.
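
For reference, the constant-detection macro being discussed (quoting it roughly as it appears in Linus's email, so treat the exact spelling as approximate) looks something like this:

/* Evaluates to 1 if x is an integer constant expression, 0 otherwise.
   Nothing is ever executed: everything happens inside sizeof().
   The sizeof(void) == 1 part is a GCC extension, hence the drama. */
#define ICE_P(x) (sizeof(int) == sizeof(*(1 ? ((void *)((x) * 0l)) : (int *)1)))

(GCC also has __builtin_constant_p() for exactly this job; this macro is basically a way of getting the same answer without the builtin.)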

0

So he's a genius who made it faster, but no one can understand it.

0

I bet his editor of choice is TECO.

0

Very good explanation. Can you explain the shit going on with all the void pointers and sizeof too?

0

Did you read it? Linus explains it pretty well.

It depends on highly obscure behavior. If one of the pointer operands of the ternary operator is the integer null pointer constant (0), then the result of the ternary operator is coerced to the type of the non-null operand (which is int *). And of course sizeof(*(int *)1) == sizeof(int). It relies on the fact that if x isn't an ICE, x * 0 will not be folded to a null pointer constant (remember, sizeof cannot execute code), so the ternary result falls back to being of type void *. And sizeof(*(void *)x) != sizeof(int).

Like Linus points out, this is bad bad. There is no standardization for sizeof(*(void *)), and it may very well equal sizeof(int) in a different compiler or even a future version of GCC.

0

Linus' email already does it pretty well, honestly, I'm not sure how much more you can break that apart. Apparently there's some rule in the C standard that the type of a ternary operator expression (condition ? result_when_true : result_when_false) is equal to the type of result_when_true in the way it's used here except when that is the NULL constant ((void *)0), in which case it is equal to the type of result_when_false. The macro abuses this by making sure the result_when_true side of the operator will always be NULL (by multiplying with 0 and casting to (void *)) so that the only part that still influences it is whether the argument (x) is constant.

If it was constant, you're left with the result_when_false part which is sizeof(*(int *)). This is obviously equal to sizeof(int), since dereferencing an int pointer makes an int. If it wasn't constant, the compiler instead looks at the type of the result_when_true part, which gives you sizeof(*(void *)). The size of the void type (from dereferencing a void pointer) is undefined in the C standard, but GCC has an extension to say that it is always 1 (because people often use (void *) as "generic" pointers to buffers where you don't really care what their contents are, and when you do pointer arithmetic on those it makes most sense to just count them as individual bytes). So the equation ends up as (sizeof(int) == 1), which is false (int can be anywhere between 2 and 8 bytes depending on the platform and compiler configuration, but it is never 1).
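
If you want to see the two cases side by side, here's a tiny test program that I'd expect to print 1 and then 0 when built with GCC (it leans on the sizeof(void) == 1 extension, so other compilers may complain or differ):

#include <stdio.h>

/* The macro as quoted in the thread; relies on GCC's sizeof(void) == 1. */
#define ICE_P(x) (sizeof(int) == sizeof(*(1 ? ((void *)((x) * 0l)) : (int *)1)))

int main(void)
{
    int runtime_value = 42;

    /* Constant argument: (x) * 0l folds to a null pointer constant, the
       ternary gets type int *, and sizeof(int) == sizeof(int) -> 1. */
    printf("%d\n", (int)ICE_P(1234));

    /* Non-constant argument: the ternary stays void *, sizeof(void) is 1
       under GCC, and 1 is never equal to sizeof(int) -> 0. */
    printf("%d\n", (int)ICE_P(runtime_value));

    return 0;
}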

0

Thanks for that. A real beauty of an explanation.

0

a few things

  1. I would put money on this behavior being enabled by some obscure compiler flag in the near future.

  2. I never want to read code from the guy who first figured this out. I imagine it's just a giant ball of spaghetti, or examples of hello world in Brainfuck.

0

I want to learn coding and programming etc... After reading this thread, it feels like trying to figure out whether black hole theory is fact or not.

0

You don't have to do this sort of thing normally.

0

Can confirm @7e62ce85 - however it's possible to do some really wacked shit with macros. Could be cool to experiment with, but avoid using them too much in a real project.

0

It's like being able to override the comma operator in C++. Yeah you COULD do that. But you should probably be shot in the head if you do it in a production setting.

0

OS programming doesn't represent the majority of programming. You'll see a lot of awful shit in Kernel code that would otherwise be unacceptable.

0

Things of beauty such as this are what get lost when taking compile-time optimisation for granted - not that I've looked much into how that works, though I presume 'literal arguments + stateless function = literal result' is a common rule.

0

Linus is the one thing standing between pure code and SJWs infecting and destroying the kernel. His philosophy of show me your code has managed to scare away all subversive elements that can't wait to "help out" by adding a Code of Conduct to a project and then finding a new host while the project sinks due to feelings being prioritized above code.

I don't know what's worse: people coding because they see dollar signs, or SJWs contributing destructive behavior (and no code).

// someone who has been coding for 20+ years just for the fun of it.

0

Sadly you're probably right, and I fear whoever succeeds him might not have the brass and bold attitude to keep the SJWs away. They'll kill the kernel faster than you can say "Code of conduct".

0

I like the cut of your jib, sir.

0

SJWs are worse. Because they can't be easily dismissed without an incident.

0

Working for free is commendable, but there's nothing wrong with people wanting to be compensated for their work. If it wasn't for human greed, you'd be living in a mud hut right now and computers wouldn't even exist. I don't know anything about the Linux world and its politics, but it wouldn't surprise me if, once this Linus bloke is gone, it's the need for the Linux project to be competitive and profitable that keeps the SJWs at bay.

0

GCC people?

0

GNU Compiler Collection. He's talking about the people who code/maintain the bit of software that takes C code and turns it into executable machine code (.bin or .exe files if you want to think about it that way).

Point being it was an example of someone being a crybaby because he dared to say a group of people was "crazy" regardless of context.

0

STEM fields and meritocracy go hand in hand, which is probably why SJWs have trouble cracking them.

Liberal Professor: Video Games Promote 'Toxic Meritocracy'; Wouldn't It Be Great If They Had More Pure Luck?

0

I like programming. It’s so counter intuitive to me it’s like putting my mind in a wrestling match. It’s gay and uncomfortable, but I’m getting stronger.

0

Why can't it be both?

0

His mouth runneth over. At some point we'll have to accept he's an old man yelling at the clouds.