> One may wonder how the compiler discovers the variable type. The type in this case is *inferred* by the initialization.
That the author feels the need to emphasize this means either that they haven't paid attention to modern languages for a very long time, or this article is for people who haven't paid attention to modern languages for a very long time.
Type inference left academia and proliferated into mainstream languages so many years ago that I almost forgot it's a feature worth mentioning.
> One is Zig’s robustness. In the case of the shift operation no wrong behavior is allowed and the situation is caught at execution time, as has been shown.
Panicking at runtime is better than just silently overflowing, but I don't know if it's the best example to show the 'robustness' of a language...
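For anyone who hasn't seen the two behaviors side by side, here is a small sketch in Rust (not Zig, but it exposes both choices explicitly as methods): the checked variant surfaces the shift overflow that Zig traps at runtime, and the wrapping variant is the "silently overflowing" alternative.

```rust
fn main() {
    let x: u8 = 1;
    // Shifting by an amount >= the bit width (9 >= 8) is the error
    // Zig catches at execution time; Rust's checked variant reports
    // it as a value instead of panicking:
    assert_eq!(x.checked_shl(9), None);
    // The wrapping variant is the "silent" behavior: the shift amount
    // is masked to the bit width (9 & 7 == 1), so the result is 1 << 1.
    assert_eq!(x.wrapping_shl(9), 2);
    println!("ok");
}
```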
> Type inference left academia and proliferated into mainstream languages so many years ago that I almost forgot it's a feature worth mentioning.
I'm not even sure I'd call this type inference (other people definitely do call it that), given that it only works in one direction. Even Java (var) and C23 (auto), the two languages the author calls out, have that. It's much less convenient than something like Hindley-Milner.
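To illustrate the direction point: Rust is a rough sketch of inference that flows both ways. The element type of the vector below is not known at the declaration and is deduced from a later use, which var/auto-style deduction (initializer to variable only) cannot do.

```rust
fn main() {
    // Backward-flowing inference: `v` has no element type here...
    let mut v = Vec::new();
    v.push(1u32); // ...it is deduced from this later push: Vec<u32>.
    assert_eq!(v, vec![1u32]);

    // Forward-only, var/auto-style deduction: the type of `n` is
    // fixed entirely by the right-hand side of the declaration.
    let n = 42;
    assert_eq!(n, 42);
    println!("ok");
}
```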
And it's not caught in ReleaseFast builds ... which is not at all unique to Zig (although Zig does do many innovative things to catch errors in debug builds).
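The same debug/release split exists in Rust: plain integer arithmetic panics on overflow in debug builds and wraps in release builds (unless overflow checks are enabled). A sketch of the explicit methods that make either choice deterministic regardless of build mode:

```rust
fn main() {
    // Overflow detected and reported as a value:
    assert_eq!(u8::MAX.checked_add(1), None);
    // Silent two's-complement wrap, the release-build default:
    assert_eq!(u8::MAX.wrapping_add(1), 0);
    // Wrapped value plus an overflow flag:
    assert_eq!(u8::MAX.overflowing_add(1), (0, true));
    println!("ok");
}
```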
> Type inference left academia and proliferated into mainstream languages so many years ago that I almost forgot it's a feature worth mentioning.
It’s not common in lower level languages without garbage collectors or languages focused on compilation speed.
I meant for "focused on compilation speed" to apply only to lower-level languages. And when I say lower-level, I don't really include D, because it has a garbage collector (I know it's optional, but I believe much of the standard library uses it).
That a language has a garbage collector is completely orthogonal to whether it has type inference ... what the heck does it matter what "much of the standard library uses" to this issue? It's pure sophism. Even C now has type inference. The plain fact is that the claim is wrong.
The x axis is orthogonal to the y axis, so I can’t be interested in the area where x < 1 and y = 5?
> what the heck does it matter what "much of the standard library uses" to this issue?
It matters in that most people looking for a low-level, manually memory-managed language won't likely choose D, so for the purposes of "is this relatively novel among lower-level, manually memory-managed languages" D doesn't fit my criteria.
> Even C now has type inference. The plain fact is that the claim is wrong.
The only popular systems language without it that I can think of is C (prior to C23). If you want to include Fortran and Ada, that would be three, but those are all very old languages. All modern systems languages have type deduction for variable declarations.