There’s a very real chance that strategy would significantly increase the likelihood of human extinction in the medium to distant future. We need minds to solve problems.
Of course, that’s not to say that it’s impossible to have too many people, or even that decreasing the likelihood of human extinction is necessarily one of the most important goals.
No, the two cases are not equally argued. We didn't put into motion the asteroids that could destroy Earth's ability to sustain human life. I'm sure you could endlessly fill a list with things we have no control over that could make humans extinct.
Forced stagnation might effectively eliminate human-caused extinction, but you still have to worry about external extinction events (changes in solar output, asteroid impacts, supernovae).
Without stagnation, yes, you have human-caused problems to worry about, but you also have people working to solve those problems, as well as people working on the external ones.
Yes, I have read it. The book is about how adding more software engineers to an overdue software project rarely makes it finish sooner. It does not claim that, for instance, 100 astronomers and 100 construction workers cannot solve more problems than 100 construction workers alone.