As an example, we keep our data normalized and add extra precomputed tables for the operations we need. Writes are more expensive since we have to precompute those values, but lookups are much quicker.
Like everything, it depends on what you want to optimize for.
Yep, in my case we wanted to make user-facing queries fast (e.g. a reports page). The extra complexity (not much; we just have to remember to update certain values in other tables when they change) is definitely worth it: load times dropped from sometimes 30 seconds to almost instant.
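A minimal sketch of that pattern, using SQLite and a made-up orders schema (the table and column names here are illustrative, not from the posts above): keep the normalized rows, and maintain a precomputed totals table in the same transaction so the reports query becomes a primary-key lookup.

```python
import sqlite3

# Hypothetical schema: normalized orders, plus a precomputed per-user
# totals table that a reports page can read almost instantly.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, amount REAL);
    CREATE TABLE report_totals (user_id INTEGER PRIMARY KEY, total REAL NOT NULL);
""")

def add_order(user_id, amount):
    # Write the normalized row and update the denormalized total in one
    # transaction, so the summary can't drift out of sync with the source.
    with conn:
        conn.execute(
            "INSERT INTO orders (user_id, amount) VALUES (?, ?)",
            (user_id, amount),
        )
        conn.execute(
            """
            INSERT INTO report_totals (user_id, total) VALUES (?, ?)
            ON CONFLICT(user_id) DO UPDATE SET total = total + excluded.total
            """,
            (user_id, amount),
        )

add_order(1, 10.0)
add_order(1, 5.0)

# The reports query is a single indexed lookup instead of an aggregation scan.
total = conn.execute(
    "SELECT total FROM report_totals WHERE user_id = ?", (1,)
).fetchone()[0]
```

The "remember to update certain values" cost mentioned above lives in `add_order`: every write path has to touch the summary table, which is the tradeoff for cheap reads.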
Denormalizing everything is definitely a pain, though; keeping data in sync in MongoDB (which lacks traditional joins) on a previous project was not fun. We're using PostgreSQL now.