Hacker News

It might just be timing. 2004 was when everyone was switching from Java and C++ to Perl, PHP, Python, and Ruby. Hard to make a case for a compiler optimization that might save 10x on a couple percent of workloads when everybody (a) is focused on the workloads that it's not useful for and (b) is using languages that give up 30x performance from the beginning.

Things may be different now that Moore's Law no longer holds and many programs are cache-bound, but any current research also needs to take into account GPUs, distributed computing, and deep learning. There might be room for new research, but the research would likely be as much about auto-vectorization and minimizing trips between memory spaces as about local optimizations like AoS -> SoA conversion.



