IMHO programming language design is (or at least should be) guided by the underlying hardware. If the hardware dramatically changes, the way this new hardware is programmed will also need to change radically. But as long as the hardware doesn't radically change (and it hasn't for the last 70 years or so), programming this hardware won't (and shouldn't) radically change either. It's really quite simple (or naive, your pick) :)
Yes indeed, and after I had hit the reply button I was thinking about GPUs. But if you look at how a single GPU core is programmed, it is still served pretty well by the traditional programming model, just slightly enhanced for the different memory architecture.
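Roughly what I mean, as a minimal CUDA sketch (the kernel, names, and sizes are just made up for illustration): the per-thread code reads like ordinary C, and the only GPU-specific bits are the thread indexing and the explicit staging through on-chip shared memory.

    #include <cstdio>

    // Per-thread code is plain imperative C; __shared__ and the
    // thread/block indices are the "enhancements" for the memory architecture.
    __global__ void scale(const float* in, float* out, float factor, int n) {
        __shared__ float tile[256];                 // fast on-chip staging buffer
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) tile[threadIdx.x] = in[i];       // global -> shared memory
        __syncthreads();                            // whole block synchronizes
        if (i < n) out[i] = tile[threadIdx.x] * factor;  // plain arithmetic, as on a CPU
    }

    int main() {
        const int n = 1024;
        float *in, *out;
        cudaMallocManaged(&in,  n * sizeof(float)); // unified memory keeps the host side familiar
        cudaMallocManaged(&out, n * sizeof(float));
        for (int i = 0; i < n; ++i) in[i] = float(i);
        scale<<<(n + 255) / 256, 256>>>(in, out, 2.0f, n);
        cudaDeviceSynchronize();
        printf("%f\n", out[10]);                    // expect 20.0
        cudaFree(in);
        cudaFree(out);
        return 0;
    }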
With "radical changes" I mean totally moving away from the von Neumann architecture, e.g. "weird stuff" like quantum-, biochemical- or analog-computers.