One theory is that cloud hosting providers use memory as a proxy for a customer's consumption of resources that aren't billed directly (bandwidth, CPU, support, etc.). So even though you may be overpaying relative to current RAM prices, their data may show that customers who buy more memory are also much more likely to consume those unbilled resources, and they set the price accordingly.
Well, you probably know it's not really just the RAM you're paying for; they just use the RAM amount as a label. If you take a large chunk of RAM, that significantly reduces the number of other users who can share that machine. Really, it's the whole service that's expensive when you want large amounts of RAM.
Basically I think we're paying for all of their support people and infrastructure, plus a healthy profit margin.
RAM is a resource that they can't oversubscribe, so it costs more.
If I have an instance with 8 CPU cores, they can timeshare CPU between multiple customers on that physical server, since it's unlikely that all instances will use 100% of their CPU 100% of the time.
However, if I have a server running memcached with a 30 GB cache, they can't also sell that RAM to another customer. This limits how much they can oversell the server's resources, so they charge more for it.
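To get a back-of-the-envelope feel for this, here's a toy Python sketch. All the numbers (host specs, oversubscription ratio, monthly host cost) are made-up assumptions, not any real provider's figures; the point is just that once RAM can't be oversold, a big-memory instance caps how many customers fit on a host, so its break-even price jumps even when the CPU could have been shared further.

```python
# Toy model: CPU can be oversubscribed, RAM can't, so RAM caps density.
# All figures below are hypothetical assumptions for illustration.

HOST_CORES = 64
HOST_RAM_GB = 512
HOST_COST_PER_MONTH = 2000.0  # assumed all-in cost: hardware, power, support

CPU_OVERSUB = 4.0  # assume 4 vCPUs sold per physical core (timesharing)
RAM_OVERSUB = 1.0  # RAM is dedicated: 1 GB sold per physical GB

def instances_per_host(vcpus: int, ram_gb: int) -> int:
    """How many identical instances fit on one host, given both limits."""
    by_cpu = int(HOST_CORES * CPU_OVERSUB // vcpus)
    by_ram = int(HOST_RAM_GB * RAM_OVERSUB // ram_gb)
    # The scarcer resource determines how many customers share the host.
    return min(by_cpu, by_ram)

for vcpus, ram_gb in [(8, 16), (8, 30)]:
    n = instances_per_host(vcpus, ram_gb)
    print(f"{vcpus} vCPU / {ram_gb} GB: {n} per host, "
          f"break-even ${HOST_COST_PER_MONTH / n:.2f}/month each")
```

With these made-up numbers, the 8 vCPU / 16 GB shape packs 32 per host (~$62.50/month to break even), while the 8 vCPU / 30 GB shape is RAM-limited to 17 per host (~$117.65/month), even though the CPU demand is identical.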