There’s a very real chance that strategy would significantly increase the likelihood of human extinction in the medium to distant future. We need minds to solve problems.

Of course, that’s not to say that it’s impossible to have too many people, or even that decreasing the likelihood of human extinction is necessarily one of the most important goals.



Sure - could it equally be argued that minds created the problems we need minds to solve? Seems circular.


No, it could not equally be argued. We didn't put into motion the asteroids that could destroy the earth's ability to sustain human life. You could fill an endless list with things we have no control over that could drive humans extinct.


Agreed. Those are non-human events. I was referring specifically to all the human activity that could end in extinction, though.


Forced stagnation might be able to effectively eliminate human-caused extinction, but you still have to worry about the external extinction events (like changes in solar output, or asteroids, or supernovae).

But without stagnation, yes, you have the human-caused problems to worry about, but you also have people working to solve those problems, as well as people working to solve the external problems.


Since this is HN, surely you have heard of 'The Mythical Man-Month'?


Yes, I have read it. The book is about how difficult it is to improve matters by adding more software engineers to an already-late software project. It does not claim that, for instance, 100 astronomers plus 100 construction workers cannot solve more problems than 100 construction workers alone.



