
Methodologies. Languages. Practices. Tools.

I did say "specifically" :)



That's really just being lazy on your part though. I'm not going to sit here and list everything that's come onto the scene since 1990. Every new language. Every new methodology. Every new practice, pattern, and tool. And not just new things, but also improvements to the way things were already being done.

Unless you are going to sit here and suggest that ARC or GC don't help decrease memory errors, or that languages like Python or Java haven't helped move things along. Heck, even C has been steadily improved over the years. Compilers are getting smarter. Tools like valgrind. Agile methodologies and formalized code testing. Static analysis and even improved code reviews. Even simple things like IDEs and editors.
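
To make the memory-error point concrete, here's a minimal C sketch (my own illustrative example, not from any of the studies) of a use-after-free: valgrind reports an invalid read on the last line, and a GC- or ARC-managed runtime rules this class of bug out by keeping the allocation alive as long as it's still referenced.

    #include <stdlib.h>
    #include <string.h>

    int main(void) {
        char *name = malloc(16);
        strcpy(name, "hello");
        free(name);
        /* Use-after-free: valgrind flags this as an invalid read of
           freed memory; in a GC'd or ARC-managed language the object
           would still be live here, so the bug can't arise. */
        return name[0];
    }
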

So much has changed, so much has evolved. Does that mean everything is wrong? No. But relying on studies that can't be replicated and that don't account for today's common programming practices and environments is dangerous.


You claimed that the studies kazagistar listed rely on outdated assumptions. So, specifically which studies are invalidated by specifically which assumptions? It ought to be easy to name one, since there are quite a few studies and you say there are myriads of outdated assumptions to choose from.

Your answer is that "so much has changed", you're "not going to sit here and list everything", and my question is "really just being lazy"? That's a little hard to take seriously.




