
Note that clang just got a change to experiment with this: https://reviews.llvm.org/rL349442


Doesn't seem to me like a statement that variables should be implicitly initialized.

It seems more like a workaround for the insanity around UB (where compilers fail to notify the programmer that they have statically detected a logic bug, and instead carry on compiling, making aggressive optimizations based on the wrong assumption that the logic bug isn't there).


> Doesn't seem to me like a statement that variables should be implicitly initialized.

Not at all. I also believe that we shouldn't hide logic bugs, but the problem with this class of bugs is how hard they are to catch: forcing the initialization can at least make the behavior more consistent across executions of the same program (avoiding the rare sequence of conditions that clobbers the value in just the right way before you use it). For some specific applications this may be an OK tradeoff for release builds.

But to clarify why I posted this: I was just trying to add one piece of data to this part of the discussion thread:

>> " With modern static analysis, there's probably no performance benefit to not initializing stack variables that are read before they're written?" >"Implicit default initialization will never be 100% as efficient as simply not requiring initialization"

Actually we don't know the exact impact; it is likely codebase-dependent, but this patch in clang will allow experimenting with various tradeoffs.
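For anyone who wants to try it, here is a rough sketch of the kind of experiment the patch enables. The flag is -ftrivial-auto-var-init per the linked review; treat the exact value names as an assumption, since they may differ between clang versions, and the file/function names are made up:

    /* uninit.c -- illustrative only: reads an uninitialized local */
    #include <stdio.h>

    int main(void) {
        int x;               /* never written before the read below    */
        printf("%d\n", x);   /* undefined behavior: x is indeterminate */
        return 0;
    }

    /* Baseline: prints whatever happens to sit in the stack slot.     */
    /*   clang -O2 uninit.c                                            */
    /* With the patch: x is force-initialized to a byte pattern, so    */
    /* behavior is at least repeatable, and the performance cost of    */
    /* the extra stores can be measured on a real codebase.            */
    /*   clang -O2 -ftrivial-auto-var-init=pattern uninit.c            */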

> where compilers fail to notify the programmer that they statically detected a logic bug

Compilers (at least clang) won't detect a logic bug without notifying the programmer. You have warnings for this.
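For instance, with a made-up function like the one below, clang's -Wuninitialized / -Wsometimes-uninitialized (enabled by -Wall, if memory serves) point right at the risky read:

    int use(int);                /* external consumer, just for the example */

    int f(int cond) {
        int x;                   /* no initializer                          */
        if (cond)
            x = 1;
        return use(x);           /* warning: 'x' may be used uninitialized  */
    }                            /* (-Wsometimes-uninitialized in clang)    */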

The optimizer just "assumes" that there is no logic bug, but it can't reason about the logic:

    { int a = 1; foo(&a); }
    int b;
    bar(&b);

Can I optimize that to:

    int a = 1;
    foo(&a);
    bar(&a);

If we assume that there is no logic bug, then it seems like a valid transformation to me. But if bar reads its parameter before writing to it, then this optimization makes foo's behavior affect bar.
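To spell that out with concrete (made-up) foo and bar, assuming the compiler reuses a's stack slot for b once a's lifetime ends:

    void foo(int *p) { *p = 42; }        /* writes through its argument         */
    int  bar(int *p) { return *p + 1; }  /* logic bug: reads before writing     */

    int caller(void) {
        { int a = 1; foo(&a); }          /* a's lifetime ends here              */
        int b;                           /* may land in the slot a just vacated */
        return bar(&b);                  /* UB: under the transformation above, */
                                         /* bar observes whatever foo left in a */
    }

Forcing an initialization of b doesn't remove the logic bug, but it does stop foo's leftover value from leaking into bar's result.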


Thanks, that's insightful!


I'll tell you, I'd rather live in the current world where the compiler emits a warning 'foo may be uninitialized' than the world we are heading into, where that enables a half-assed optimization that speeds up a microbenchmark on an architecture no one uses anymore.



