Hacker News

Is it only me, or is type theory overrated and oversold?


I can't say it's only you, but I certainly don't agree.


Donald Knuth might say so too.)

The problem, in my opinion, is the same as with category theory. While the math describing the behavior of categories as abstractions is probably correct, the categories themselves are mere abstractions.

Consider clouds (in the sky). One could try to categorize them, but it is useless, because each cloud is unique. The same goes for companies, people, etc. Categories are useful for crude approximations, but they are mere products of the mind.

Types are more "concrete" than clouds, of course, but they are not essential. Yes, you could define some "iterable", "traversable", and "foldable" categories and describe a list with them, but the list itself is a fundamental abstraction (a basic sequence structure) and it is good enough. Moreover, after all the type information has been stripped off, it will be a "natural" chain of pointers.
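To make the point concrete, here is a minimal TypeScript sketch (my illustration, not from the original comment): the same array can be summed through a generic "foldable"-style interface or as a bare sequence, and both reduce to the same underlying iteration.

```typescript
// A generic fold over any iterable: the "foldable" abstraction.
function fold<A, B>(xs: Iterable<A>, init: B, f: (acc: B, x: A) => B): B {
  let acc = init;
  for (const x of xs) acc = f(acc, x);
  return acc;
}

const viaAbstraction = fold([1, 2, 3], 0, (a, x) => a + x);

// The same computation against the raw sequence, no abstraction:
let byHand = 0;
for (const x of [1, 2, 3]) byHand += x;

console.log(viaAbstraction === byHand); // prints "true"
```

Either way the list itself does the work; the typeclass-style interface only names what the sequence already supports.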

In other words, decent programming could be done (and has been done) without most of type theory. The notion of an ADT is enough.


"Decent" programming is a wiggle word. I'm of course obliged to agree with you when you define it. In a similar way, decent programming can be done with Turing Machines, punch cards, raw circuitry, brainfuck, what-have-you. Each would require a notion of "decent" in line with the style of the tool, but there is no hard place to push against when defining "decent".

My belief about type theory is that it serves, for the purposes of PL design, as a massively more difficult challenge than merely making a programming language, one which also happens to spawn programming languages by accident.

This is really important because, as we've seen, PL design without some deeper driving purpose or mechanism for making tradeoffs can wind up wherever you want it to be. LOLCODE is a real thing, and without type theory the only way we can talk about our distaste for it is to say that it's "indecent" for deeply personal reasons.

Type Theory wouldn't accept this because its goal is to be a good foundational system for all logic and mathematics.

It's also important to note that Type Theory doesn't serve this purpose by being aligned to some arbitrary goal. It works because the goal of "being a good foundational system for all logic and mathematics" is not so dissimilar from the goal of a PL which might be "represent all interesting thoughts of a programmer in a computable fashion". If Type Theory happened to stumble across non-computable mechanisms then it would be a poor fit, but because it turns out to be the case that the best foundational system must respect computability... things match up.

And ultimately that's the joy of learning type theory and programming with it: things match up. I say this flippantly because I lack the words for the sheer scale of the "matching up" that occurs.

Before I knew Type Theory I felt that PLs were cute and arbitrary. It was interesting to learn them anthropologically, but frustrating to use them because I was building on foundations of sand. After learning Type Theory I understand where the hard places are, where the foundation exists, and I know how to build things which I feel confident can last hundreds of years.


... You know that we do have a categorization of clouds, right? I think meteorologists would disagree with you about the utility of that categorization.

More to your point: whether you can do something without a tool has no bearing on whether or not it is a useful tool. You can write programs in machine code if you'd like to, but you're highly unlikely to be as productive as someone using a high-level programming language. Similarly, you can program without a type system if you'd like to, but you're highly likely to end up with runtime errors that you wouldn't have had you used a type system.
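The runtime-error point can be sketched in a few lines of TypeScript (a hypothetical example of mine, not the commenter's): a typo'd property access in an untyped setting only fails when that code path runs, while a type makes the same mistake a compile-time rejection.

```typescript
// With a declared shape, misspelling a field is caught before the
// program ever starts.
interface User { name: string; age: number; }

function greet(u: User): string {
  return `Hello, ${u.name}`;
  // return `Hello, ${u.nmae}`;  // compile error: 'nmae' does not exist on User
}

console.log(greet({ name: "Ada", age: 36 })); // prints "Hello, Ada"
```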


Except when the type system restricts you to homogeneous data structures in an inherently heterogeneous world. Herbert Spencer, the philosopher, could tell us a lot about it.

As for crude and misleading classifications - there is no shortage of them, especially in the realms of psychology (hypochondriac, sanguine - all that nonsense) or economics and finance. The results of applying flawed economic/financial models based on flawed categories can be seen everywhere. As for meteorology, please ask anyone who lives in a monsoon area - forecasts there have worse-than-coin-flip accuracy when it comes to hill regions.)



It is part of the marketing for several "new hotness" languages (however long ago their development started). Back when Java was new, the same marketing talk was directed at OOP, which isn't cool any more.


Especially when we consider how the Java crowd misunderstood OOP and emphasized stupid class hierarchies and static typing (and design patterns as a result) instead of message passing, message-based polymorphism, and protocols as the essence of the original paradigm developed by the Smalltalk authors.


It is certainly oversold. There is a typical discussion that goes:

A: I'm a simple Python programmer. Please tell me why I need types.

B: Types let you catch, at compile time, many bugs that would otherwise cause runtime errors or require runtime checks or testing.

A: But I've seen studies that suggest that static typing does not prevent bugs.

B: When I said static typing, I meant dependently typed languages, which can express almost any constraints at compile time, not just the basic constraints expressed by static typing.

A: Great, you've convinced me. What dependently typed languages can I use in my production system?

B: Errrr

As a mathematical theory, a way to think about programming, and a research project for future languages, type theory is very interesting. But it's not otherwise that useful for the practicing programmer.
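For readers unfamiliar with what B is promising: a flavor of the "constraints at compile time" idea can be approximated even in TypeScript, using tuple types to make emptiness part of a value's type (my sketch; real dependently typed languages like Agda or Idris go much further).

```typescript
// head is only callable on arguments the compiler can see are
// non-empty: the parameter type demands at least one element.
function head<T>(xs: readonly [T, ...T[]]): T {
  return xs[0];
}

console.log(head([1, 2, 3])); // prints 1
// head([]);  // rejected at compile time: [] does not match [T, ...T[]]
```

The selling point of dependent types is that constraints like "non-empty", "sorted", or "same length as that other list" all become checkable this way; the thread's dispute is over whether any production-ready language delivers it.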


I've spent a lot of time hanging out with pretty hardcore type theory people, and have never seen any of them recommend a dependently-typed language to a new programmer. In other words, you're pretty blatantly fighting a straw man.


It's not about how experienced the programmer is. The problem is that people do a bate and switch when it comes to type theory, promising the benefits of dependent types even though there are no production languages that deliver these benefits.


I've similarly never seen anyone promise the benefits of dependent types when talking about simpler type systems. Why would they need to? Much simpler type systems already have more than enough benefits to be worthwhile.

I'm not actually looking to debate you, your mind is made up and there's very little chance I'll change it, not that it would matter if I did. I mainly just want to point out for others reading this that your arguments are blatant straw men.

Also just fyi, it's "bait and switch".


>Much simpler type systems already have more than enough benefits to be worthwhile.

If that's your opinion, why did you ignore the fact that I addressed this in my original post? You accuse me of having a straw man argument, and yet you can't even engage with what I've written?

Anyway, you're right that neither of us is likely to convince the other, but I have really seen arguments that go like the one I described, and you're really doing me a disservice by ignoring the details of what I posted.


You "addressed" it by referring to "studies" that you didn't cite. There are studies that go both directions on the question, none of which have good enough methodology to be worth more than a tiny update in either direction. Most of the studies are about extremely weak type systems (C, Java, etc.), and therefore don't provide any evidence at all about type systems that are stronger than Java but weaker than dependent types (e.g. Swift, Rust, Haskell, etc).

Meanwhile we have a lot of people who have used both weak type systems and strong type systems who claim that strong type systems make their jobs dramatically easier, and we have a lot of people who have only used weak type systems who claim the opposite. Take a survey of people who have actually done non-trivial work using a language with a good type system, and you'll get very clear results about their view of the benefits. Clearly an RCT would be better, but given that training takes a long time, that just isn't feasible.


> which can express almost any constraints at compile time

... except heterogeneous lists, hash-tables and conditionals.)
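For context on the heterogeneous-list objection, here is one common workaround in a typed language, sketched in TypeScript (my example, not the commenter's): a union type over the element shapes, narrowed at each use site. Whether this counts as "expressing" heterogeneity or merely encoding it is exactly the kind of thing this thread disputes.

```typescript
// A heterogeneous row encoded as a union type; each use site must
// narrow the union before doing anything type-specific.
type Cell = number | string | boolean;

const row: Cell[] = [1, "two", true];

const rendered = row.map((c) =>
  typeof c === "number" ? c.toFixed(0) : String(c)
);
console.log(rendered.join(",")); // prints "1,two,true"
```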



