Computer science. We used to talk about algorithms in terms of big-O notation. Now we just talk about how fast things run on the newest Nvidia cards. Also, most discussions here are about how you can glue existing stuff together and turn a profit. I hardly see any real CS on online forums these days, let alone progress. The closest thing I can remember is an article that discussed whether CSS stylesheets are Turing complete.
> We used to talk about algorithms in terms of big-O notation.
This is something CS majors often have to unlearn as they enter the world of practical programming. Big-O just isn't that useful for reasoning about real-world performance, because in practice n is always bounded and the constant factors it hides end up dominating.
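A toy illustration (a purely hypothetical micro-benchmark; the numbers depend entirely on your CPU and compiler): on a small, bounded array, an O(n) linear scan can hold its own against, or even beat, an O(log n) binary search, because once n is capped, cache locality and branch prediction are what actually matter.

```cpp
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <vector>

int main() {
    // Small, sorted array -- the kind of bounded n practical code actually sees.
    std::vector<int> v(64);
    for (int i = 0; i < 64; ++i) v[i] = i * 2;

    constexpr int iters = 5'000'000;
    long hits = 0;  // printed at the end so the loops aren't optimized away

    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < iters; ++i)                                      // O(n) scan
        hits += std::find(v.begin(), v.end(), (i * 13) % 128) != v.end();
    auto t1 = std::chrono::steady_clock::now();
    for (int i = 0; i < iters; ++i)                                      // O(log n) search
        hits += std::binary_search(v.begin(), v.end(), (i * 13) % 128);
    auto t2 = std::chrono::steady_clock::now();

    auto ms = [](auto a, auto b) {
        return std::chrono::duration_cast<std::chrono::milliseconds>(b - a).count();
    };
    std::printf("linear: %lld ms, binary: %lld ms (hits: %ld)\n",
                (long long)ms(t0, t1), (long long)ms(t1, t2), hits);
}
```

Compile with something like `g++ -O2 -std=c++17` and try different array sizes; the ranking flips around depending on n and hit rate, which is exactly the point.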
Yes, and realistically you're just as likely to benefit from investigating lock and I/O contention. I wish my work were mostly optimizing single-threaded CPU bottlenecks; that would make things a lot easier.
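A minimal sketch of what lock contention looks like (a hypothetical micro-benchmark, not anything from real code): the same arithmetic done while fighting over one mutex versus accumulated per thread and merged once. Both versions are O(n); only the contention differs.

```cpp
#include <chrono>
#include <cstdio>
#include <mutex>
#include <thread>
#include <vector>

int main() {
    constexpr int kThreads = 4;
    constexpr long long kIters = 1'000'000;
    // Cheap per-item "work" so the compiler can't fold the loops into a constant.
    auto work = [](long long i) { return (i * 2654435761LL) >> 16; };

    // Contended: every update takes the shared lock.
    long long shared_total = 0;
    std::mutex m;
    auto t0 = std::chrono::steady_clock::now();
    {
        std::vector<std::thread> ts;
        for (int t = 0; t < kThreads; ++t)
            ts.emplace_back([&] {
                for (long long i = 0; i < kIters; ++i) {
                    std::lock_guard<std::mutex> g(m);
                    shared_total += work(i);
                }
            });
        for (auto& th : ts) th.join();
    }
    auto t1 = std::chrono::steady_clock::now();

    // Uncontended: same arithmetic, accumulated privately, merged once at the end.
    std::vector<long long> partial(kThreads, 0);
    {
        std::vector<std::thread> ts;
        for (int t = 0; t < kThreads; ++t)
            ts.emplace_back([&, t] {
                long long acc = 0;
                for (long long i = 0; i < kIters; ++i) acc += work(i);
                partial[t] = acc;
            });
        for (auto& th : ts) th.join();
    }
    long long merged = 0;
    for (long long p : partial) merged += p;
    auto t2 = std::chrono::steady_clock::now();

    auto ms = [](auto a, auto b) {
        return std::chrono::duration_cast<std::chrono::milliseconds>(b - a).count();
    };
    std::printf("under one mutex: %lld ms (%lld), thread-local: %lld ms (%lld)\n",
                (long long)ms(t0, t1), shared_total, (long long)ms(t1, t2), merged);
}
```

Build with `g++ -O2 -pthread`; on most machines the locked version is slower by well over an order of magnitude, and no amount of Big-O analysis would tell you that.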
If you're dealing with disk access, sequential access is often so much faster than random I/O, even on SSDs (especially for writes), that Big-O can be very misleading to look at.
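A rough sketch of the kind of measurement that shows this (hypothetical file names and sizes; for honest device numbers you'd want fsync or O_DIRECT, since the OS page cache absorbs a lot of it): write 128 MiB as one sequential stream, then write the same 4 KiB blocks at shuffled offsets.

```cpp
#include <algorithm>
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <fstream>
#include <random>
#include <vector>

int main() {
    constexpr std::size_t kBlock = 4096;
    constexpr std::size_t kBlocks = 32 * 1024;        // 128 MiB total
    std::vector<char> buf(kBlock, 'x');

    // Sequential: append block after block.
    auto t0 = std::chrono::steady_clock::now();
    {
        std::ofstream f("seq.bin", std::ios::binary | std::ios::trunc);
        for (std::size_t i = 0; i < kBlocks; ++i) f.write(buf.data(), kBlock);
        f.flush();  // hands data to the OS only; fsync/O_DIRECT needed for device timing
    }
    auto t1 = std::chrono::steady_clock::now();

    // Random: the same blocks, written at shuffled offsets.
    std::vector<std::size_t> order(kBlocks);
    for (std::size_t i = 0; i < kBlocks; ++i) order[i] = i;
    std::shuffle(order.begin(), order.end(), std::mt19937{42});
    auto t2 = std::chrono::steady_clock::now();
    {
        std::ofstream f("rand.bin", std::ios::binary | std::ios::trunc);
        for (std::size_t i : order) {
            f.seekp(static_cast<std::streamoff>(i * kBlock));
            f.write(buf.data(), kBlock);
        }
        f.flush();
    }
    auto t3 = std::chrono::steady_clock::now();

    auto ms = [](auto a, auto b) {
        return std::chrono::duration_cast<std::chrono::milliseconds>(b - a).count();
    };
    std::printf("sequential: %lld ms, random: %lld ms\n",
                (long long)ms(t0, t1), (long long)ms(t2, t3));
}
```

Both loops issue exactly the same number of writes of the same total size; only the access pattern differs, and that's the part the complexity analysis never sees.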
As far as I know, computer science is still taught in CS degrees, though. I think the signal-to-noise ratio online is much lower because programming is far more accessible to people without a theoretical background, but I'd guess that in terms of raw numbers there are more people graduating with CS degrees and learning the theory than ever before.