
Sadly, HN was not able to help you understand what premature optimization is.

What the SQLite team did was systematically measure the performance of their system, identify potential bottlenecks and inefficiencies, create and measure improvements to those bottlenecks, and choose the better option.

This is what we call optimization. Premature optimization is when you have no idea how your system really performs, but spend time trying to make it "better" without having a way to determine whether your change made any difference.
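To make that concrete, here is a minimal sketch of that loop in Python (not from the thread; the functions and workload are made up for illustration): measure two candidate implementations of the same operation under identical conditions, then keep the faster one only if the numbers actually show a difference.

    import timeit

    # Two candidate implementations of the same operation: building a
    # comma-separated string from a list of values. (Hypothetical example,
    # not taken from the SQLite codebase.)

    def join_with_concat(values):
        # Repeated string concatenation in a loop.
        out = ""
        for v in values:
            out += str(v) + ","
        return out

    def join_with_join(values):
        # A single join over a generator; often faster for large inputs.
        return ",".join(str(v) for v in values) + ","

    values = list(range(10_000))

    # Measure both under the same conditions before deciding which to keep.
    for fn in (join_with_concat, join_with_join):
        t = timeit.timeit(lambda: fn(values), number=200)
        print(f"{fn.__name__}: {t:.3f}s for 200 runs")

    # Sanity check: the "optimized" version must still produce the same result.
    assert join_with_concat(values) == join_with_join(values)

The point isn't the string trick itself; it's that the measurement, not a hunch, is what justifies keeping the change.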



I thought premature optimization was when you decide to optimize a bottleneck but later end up rearchitecting and tossing it all anyway.

I'd refer to what you're describing as messing around.


I'm not going to pretend to be some sort of authority on the subject, but personally I think both fit the bill. It is premature to optimize when you don't know how your system performs, and it can be premature to optimize if you might throw the solution away. However, that last part isn't always the case: maybe the reason you throw it away is that another solution turned out to be more performant, and you had to optimize both to find that out.

To me, the core of the issue is knowing what you're doing. If you know what you're doing, then it's generally not premature. You have a reason for what you're doing, you're sure you're working on the right part of the code, and you're properly measuring the difference in a meaningful way.



