Because companies don't care how 'clean' your code is, what design patterns you use, or what cool new technology you're using. They just care that you release quickly so the company can start making money.
So even if you don't know the best way to do things or the best pattern to use, as long as you can hack something together that works reasonably well for a while, you're good enough. Sure, it might be flimsy, break easily, be unreadable, etc., but if the company is making money then the business is happy.
Think about it... code becomes legacy so quickly. Code that's 2-3 years old is already considered legacy, especially if it isn't used or modified much. You could have spent 15% more time making it more maintainable, and it would still be considered bad code by the next person who has to work on it, because they have no idea how the code works or what it's meant to do. Chances are it would be rewritten anyway, even if you'd spent that extra 15%. In the meantime, the company might have missed its initial release date and some potential revenue.
Not saying I agree with it, but that's how it is, unfortunately.
Depends on the business model. If you sit on the bleeding edge all the time, maybe (though even if you don't, you should strive for the newest stack your skills and budget allow, if only for retention and hiring).
But if the whole point of the company is to build generic software that's configurable, then the internals and the code absolutely matter, because they're a hard requirement. You could build specifically for one type of configuration, but that wouldn't solve the others. So genericity, abstraction, and reuse become part of the business case, not just a nice-to-have for clean code.
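To make that concrete, here's a toy Python sketch (the function names and config schema are hypothetical, not from any real product): the hardcoded version solves exactly one customer's case, while the config-driven version serves any customer whose rules fit the schema.

    # Hypothetical example -- names and schema are made up for illustration.
    # Hardcoded to one customer's rules: works, but solves exactly one case.
    def approve_discount_hardcoded(order_total):
        return order_total > 500  # customer A's threshold, baked in

    # Config-driven: the same code serves any customer whose rules fit the schema.
    def approve_discount(order_total, config):
        return order_total > config["discount_threshold"]

    customer_a = {"discount_threshold": 500}
    customer_b = {"discount_threshold": 1000}

    print(approve_discount(700, customer_a))  # True
    print(approve_discount(700, customer_b))  # False

The example is trivial on purpose; the point is that the abstraction is the product, so its internal quality is a business requirement rather than a matter of taste.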
So it very much depends. It's also possible to take it too far. As for internal release dates, I've considered them bullshit ever since I heard executives talk about how they pad the time. The more layers of management, the more padding, and if everyone works separately in different teams, more padding still. Developers are better off ignoring internal release dates and building the best software they can rather than compromising on quality. They can cut scope, but at the end of the day the software had better work. The only thing you shouldn't ignore is warning people if it will go over time; since about 50% of software projects run over, that's expected anyway.
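To show how that padding compounds (the numbers here are assumed purely for illustration; actual padding per layer varies):

    # Assumed numbers: each management layer pads the estimate it passes
    # upward, so padding compounds multiplicatively.
    estimate_weeks = 10
    pad_per_layer = 1.2   # each layer adds 20%
    layers = 3
    print(estimate_weeks * pad_per_layer ** layers)  # ~17.3 weeks

Three layers at 20% each turns a 10-week estimate into roughly 17 weeks before anyone outside engineering sees it.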
Either it's good or it isn't; either it works or it doesn't. The timescales devs are used to (sprints, days, stories, etc.) don't have any bearing on that reality. If it takes a month more because it has to, that's how long it takes. The concept of an "MVP" or a feature set is totally separate from a good piece of software; once you get there you can release, iterate, and improve. And it's called "minimal" for a reason: if you stop there, your software is sunset. The features and improvements have to keep rolling.