
IMHO programming language design is (or at least should be) guided by the underlying hardware. If the hardware dramatically changes, the way this new hardware is programmed will also need to change radically. But as long as the hardware doesn't radically change (which it hasn't for the last 70 years or so), programming it won't (and shouldn't) radically change either. It's really quite simple (or naive, your pick) :)


CPUs are progressing by adding more cores. Today a program has to utilize 100+ cores to fully saturate a modern server CPU (or 32 cores for a consumer CPU).
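
A minimal sketch of what that looks like (my example, not from the parent; the per-core task is a placeholder): fan work out across however many hardware threads the machine reports, in plain C++ host code.

    // Minimal sketch, assuming a placeholder per-core task: spread work
    // across every hardware thread the machine reports.
    #include <cstdio>
    #include <thread>
    #include <vector>

    void work(unsigned id) {
        std::printf("task on logical core %u\n", id);  // stand-in workload
    }

    int main() {
        unsigned n = std::thread::hardware_concurrency();  // e.g. 32 or 128
        if (n == 0) n = 1;  // the call may report 0 if the count is unknown
        std::vector<std::thread> pool;
        for (unsigned i = 0; i < n; ++i)
            pool.emplace_back(work, i);
        for (auto& t : pool)
            t.join();
    }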

GPU computation has been a thing for many years, and GPU hardware differs drastically from conventional CPUs.

There are neural accelerators in the latest computers. I have no idea what they do, but maybe they warrant new programming approaches as well.


It is guided by hardware changes, and it has changed a lot over the last 70 years.

GPUs spawned shader languages, and network cards spawned HTML and JavaScript.


Yes indeed, and after I had hit the reply button I was thinking about GPUs. But if you look at how a single GPU core is programmed, it's still served pretty well by the traditional programming model, just slightly extended for the different memory architecture.
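
A minimal CUDA sketch of that point (names and sizes are mine, purely illustrative): the kernel body reads like ordinary scalar C; the GPU-specific parts are the thread indexing and the explicit on-chip (__shared__) memory.

    // Hypothetical example: plain scalar C code per thread, plus the
    // explicit memory hierarchy. The shared-memory staging is included
    // only to illustrate that extension; it assumes blockDim.x <= 256.
    __global__ void scale(const float* in, float* out, float k, int n) {
        __shared__ float tile[256];            // per-block on-chip memory
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) tile[threadIdx.x] = in[i];  // global -> shared
        __syncthreads();                       // every thread reaches this
        if (i < n) out[i] = k * tile[threadIdx.x];
    }
    // launched e.g. as: scale<<<(n + 255) / 256, 256>>>(in, out, k, n);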

With "radical changes" I mean totally moving away from the von Neumann architecture, e.g. "weird stuff" like quantum-, biochemical- or analog-computers.



