
As I said, Nick Bostrom's Superintelligence talks all about this.

The main point is that the constraints of the physical world sit far beyond the limits of human capability. Yes, there's a theoretical limit to how much computation you can do in a given amount of space with a given amount of energy, but no biology or technology currently in existence comes anywhere close to that limit.
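
As a rough illustration (my numbers, not Bostrom's): one such floor is Landauer's limit on the energy required to erase a single bit,

    E_{\min} = k_B T \ln 2 \approx 1.38 \times 10^{-23}\,\mathrm{J/K} \times 300\,\mathrm{K} \times \ln 2 \approx 2.9 \times 10^{-21}\,\mathrm{J},

while today's CMOS logic dissipates something on the order of 10^{-15} J or more per switching event, several orders of magnitude above that floor.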

With no fundamental limit anywhere near human level, machine intelligence can in principle far surpass us, and there's just no reason to expect AGI not to present a threat.

But there is lots of other evidence too.



I've been planning to read Superintelligence for a while anyway, so I think I'll move it up the list a bit in response to all this discussion. I'm on vacation from when I check out of here today until the second week of Jan, so I'll probably read it over the holiday.



