Hacker News

No idea, but it should be significant. AFAIK cooling and energy are the biggest data center costs.


AMD servers are already below 3 watts per core. ARM doesn't actually confer any power advantage. Most ARM processors use less power because they're slower. Apple has a slight advantage because they use TSMC's latest process nodes, but it isn't very large and it isn't because of the ISA.


I was looking at EPYC chips from 2-3 years ago, and those do consume more like 8-10 W per core, but you're right. The latest EPYC 9005 parts are actually quite efficient.


EPYC 7702(P) is only slightly more than 3 W/core (200 W TDP across 64 cores = 3.125 W), and that's from 2019.

But the newer ones use even less and they're faster.
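The per-core figures in this thread fall straight out of TDP / core count. A quick sketch, using AMD's published TDP and core-count specs for two SKUs (TDP is a coarse proxy for real power draw, which varies with load):

```python
# W/core = TDP / cores, using published spec-sheet figures.
# TDP overstates idle draw and can understate peak draw, so
# treat these as rough comparisons, not measurements.
chips = {
    "EPYC 7702 (2019, 64c, 200 W)": (200, 64),
    "EPYC 9754 (2023, 128c, 360 W)": (360, 128),
}

for name, (tdp_watts, cores) in chips.items():
    print(f"{name}: {tdp_watts / cores:.4g} W/core")
```

The 7702 works out to 3.125 W/core and the 128-core 9754 to about 2.81 W/core, which matches the claim above that the newer parts draw less per core while also being faster.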



