Yudkowsky is "not even wrong". He just makes shit up based on extrapolation and speculation. Those are not arguments to be taken seriously by intelligent people.
Maybe we should build a giant laser to protect ourselves from the aliens. Just in case. I mean, an invasion is at least plausible.
If for whatever reason you suspect there might be value in thinking about what might happen if AI systems get smarter than humans, before it actually happens, then extrapolation and speculation are all you've got. There's no way to do that thinking without them.
What do you think he should have done differently? Methodologically, I mean. (No doubt you disagree with his conclusions too, but any "object-level" reasons you have for doing so are necessarily "extrapolation and speculation" just as much as his are.)
If astronomical observations strongly suggested a fleet of aliens heading our way, building a giant laser might not be such a bad idea, though it wouldn't be my choice of response.
OK, cool, you don't like Yudkowsky and want to be sure we all recognize that. But I hoped it was obvious that I wasn't just talking about Yudkowsky personally.
Suppose someone is interested in what the consequences of AI systems much smarter than humans might be. Your argument here seems to be: it's Bad to think about that question at all, because you have to speculate and extrapolate.
But that seems like an obviously unsatisfactory position to me. "Don't waste any time thinking about this until it happens" is not generally a good strategy for any consequential thing that might happen.
So: do you really think that thinking about the possible consequences of smarter-than-human AI before we have it is an illegitimate activity? If not, then your real objection to Yudkowsky's thinking and writing about AI surely has to be something about how he went about it, not the mere fact that he engages in speculation and extrapolation. There's no alternative to that.