
Funny how some of his projects got cancelled, like K12 at AMD or Royal Core at INTC, and people always act like that was a terrible decision, yet AMD is up like 100x on the stock market and INTC... time's gonna tell


Seems completely uncorrelated with what is discussed, especially considering Intel didn’t enter the ARM market either.

Would make much more sense to compare with Qualcomm's trajectory here, as they dominate the high-end ARM SoC market.

Basically AMD missed the opportunity to be first mover on a market which is now huge with a project Apple proved to be viable three years after the planned AMD release. Any way you look at it, it seems like a major miss.

The fact that other good decisions in other segments were made at the same time doesn’t change that.


> Basically AMD missed the opportunity to be first mover on a market which is now huge with a project Apple proved to be viable three years after the planned AMD release. Any way you look at it, it seems like a major miss.

I don't think this is a fair position. It could just as well be that focusing on K12 would have delayed Zen, maybe delaying it enough that it would have become irrelevant by the time it got to market.

Remember that while Zen was a good CPU, the only reason it made as much impact as it did was because it was also released at a good time (when Intel was stumbling with 10nm and releasing Skylake refresh after Skylake refresh).


AMD was a pretty stripped down company at that point. They'd bet it all on Zen so when it got a foothold it made sense to double down on it until they could recover.

The thing about being broke is you may know about good opportunities but not have the resources to actually make use of them.


>> It could just as well be that focusing on K12 would have delayed Zen, maybe delaying it enough that it would have become irrelevant by the time it got to market.

Agree. AMD stock was under $2 prior to Zen. Buying was a bet that Zen would be competitive with Intel, in which case the stock would come back; otherwise they were doomed. The first Zen chips were in fact competitive, beating Intel in some benchmarks and losing in others. That would have brought back competition, but who knew Intel would flounder for many more years while Zen got a nice uplift with each generation! Delaying Zen would have been bad for AMD, but in hindsight that wouldn't have mattered so long as they could stay afloat until it launched.


>Basically AMD missed the opportunity to be first mover on a market which is now huge with a project Apple proved to be viable three years after the planned AMD release. Any way you look at it, it seems like a major miss.

No man, Apple basically had the power to frog-march its app devs to a new CPU arch. That absolutely would not have happened in the Windows ecosystem given the amount of legacy apps and (arguably more importantly) games. For proof of this you need look no further than Itanium and Windows on ARM.


Even more so their hardware buyers.

If most Intel hardware makers had gone full ARM, they would simply have lost market share. Apple customers are going to buy Apple hardware—whatever it has inside.

But of course Apple controls not just the hardware but the OS. So ya, if only Apple hardware will run your application, you are going to port to that hardware.

Apple has a massive advantage in these transitions for sure.


> Apple basically had the power to frog-march its app devs to a new CPU arch

Microsoft's ARM transition execution has been poor.

Apple's Rosetta worked on day one.

Microsoft's Prism still has some issues, but at release its compatibility with legacy x86 software was abysmal.

Apple's first party apps and developer IDE had ARM versions ready to go on day one.

Not so for Microsoft.

Apple released early Dev Kit hardware before the retail hardware was ready to go (at very low cost).

Microsoft did not.


Microsoft already had an example of how to do this in a reasonable fashion. Not only that, but the original developer was an ARM licensee. And then finally, during that era Windows was still being developed for multiple architectures.

https://en.wikipedia.org/wiki/FX!32


That's a good example of a non-Apple processor transition where DEC's Alpha CPU performance was industry leading, and the compatibility layer for legacy software was solid.


Apple has way stronger leverage than AMD when it comes to forcing "new standards", let's say.

AMD cannot go and tell its customers "hey, we are changing ISA, go adjust." Their customers would run to Intel.

Apple could do that and forced its laptops to use it. Developers couldn't afford to lose those users, so they adjusted.


It’s a chicken and egg problem.

Nobody supports the new ISA because there is no SoC and nobody makes the new SoC because there is no support. But in this case, that’s not really true because Linux support was ready.

More than forcing volumes, Apple proved it was worth it because the efficiency gains were huge. If AMD had released an SoC with numbers close to the M1 before Apple, targeting the server market, they would have had a very good shot at it being a success and at leveraging that into the laptop market, where Microsoft would have loved to have a partner ready to fight Apple and instead had to wait ages for Qualcomm.

Anyway, I maintain that looking at how the stock moved tells us nothing about whether the cancellation was a good or a bad decision.


>More than forcing volumes, Apple proved it was worth it because the efficiency gains were huge. If AMD had released an SoC with numbers close to the M1 before Apple, targeting the server market, they would have had a very good shot at it being a success and at leveraging that into the laptop market, where Microsoft would have loved to have a partner ready to fight Apple and instead had to wait ages for Qualcomm.

Apple proved that creating your own high-end consumer SoC was a doable and viable idea thanks to TSMC, and that it could result in better chips by designing them around your needs.

And which ISA could they use? x86? Hard to say, probably not. So that left RISC-V and ARM.

Also about Windows...

If Panther Lake on 18A actually performs as well as expected, then why would anyone move to ARM on Windows when viable energy-efficient CPUs like Lunar Lake and Panther Lake are available?


> If Panther Lake on 18A actually performs as well as expected, then why would anyone move to ARM on Windows when viable energy-efficient CPUs like Lunar Lake and Panther Lake are available?

Well yes, exactly, that's the issue with arriving 10 years later instead of being first mover. The rest of the world doesn't stand still.


RISC-V wasn’t a thing when Apple started designing their own chips. It was always going to be ARM. Look who founded ARM after all…


> Apple proved it was worth it because the efficiency gains were huge

Thing is, those efficiency gains are both in hardware and software.

Will a Linux laptop running the new AMD SoC use 5 W while browsing HN like this M3 pro does?


5 W while browsing is already less efficient than my old laptop with a Zen 2 CPU (and most of the power is consumed by the display). Newer CPUs or SoCs should do quite a bit better than that.


During "light" browsing pretty much any laptop's power use is massively dominated by things that aren't the CPU, assuming there's been any attempt at enabling that use case (which doesn't always seem to be the case for many SKUs, certainly on the cheaper end).

A huge amount of Apple's competitive edge is in the "other 90%", but they don't seem to get the headlines.


It's good that x86 is coming close.

Does Windows have working sleep now? I hear it's dangerous to throw a wintelmd laptop in a backpack without shutting it down.


Did your laptop display have the same brightness and pixel density?


Steam Deck does about 8w.


Isn't the Deck x86 though?


> Their customers would run to Intel.

Data centers and hosting companies are probably the biggest customers buying AMD CPUs, no?

If those companies could lower their energy and cooling costs that could be a strong incentive to offer ARM servers.


What kind of difference are we talking about?

1%? 3%? 6%? 10%? 30%?


No idea but it should be significant. AFAIK cooling and energy are the biggest data center costs.


AMD servers are already below 3 watts per core. ARM doesn't actually confer any power advantage. Most ARM processors use less power because they're slower. Apple has a slight advantage because they use TSMC's latest process nodes, but it isn't very large and it isn't because of the ISA.


I was looking at EPYC chips from 2-3 years ago and those do consume more like 8-10W per core but you're right. The latest EPYC 9005 are actually quite efficient.


EPYC 7702(P) is only slightly more than 3W/core (it's 3.125) and that's from 2019.

But the newer ones use even less and they're faster.
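For concreteness, a quick back-of-the-envelope watts-per-core check. This is a rough sketch only: TDP is a rated ceiling, not measured draw under load, and the figures used are the commonly published spec-sheet numbers (200 W / 64 cores for the 7702P, 360 W / 128 cores for the newer Zen 4c EPYC 9754).

```python
# Rough watts-per-core from rated TDP and core count.
# TDP is a spec-sheet ceiling, not measured power draw under load.
epyc_specs = {
    "EPYC 7702P (2019, Zen 2)": (200, 64),    # (TDP in watts, cores)
    "EPYC 9754 (2023, Zen 4c)": (360, 128),
}

for name, (tdp_w, cores) in epyc_specs.items():
    print(f"{name}: {tdp_w / cores} W/core")
```

Which lines up with the 3.125 W/core figure above, with the newer dense-core parts coming in a bit lower still.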


Except they literally did exactly that with x86-64 so I’m confused by your comment.


Isn't x86-64 backward compatible, so that's fine?


Apple and Qualcomm _have_ to use Arm ISA because they don't have x86 license. Apple would have likely stayed on x86 if they could use it in their in-house designs. Intel wouldn't issue x86 license to Qualcomm or Apple, of course.


Very unlikely. Moving to Arm allowed Apple to have a single architecture across all their hardware and leverage iPhone designs as the starting point for Mac SoCs.


If Intel had licensed x86 for use in Apple's own finished computers, Intel would be in a way better position. It was foolish not just to lose that customer but also to legitimize ARM as a desktop and high-end option.

I think Apple would have switched anyway though. They designed Apple Silicon for their mobile devices first (iPhone, iPad) which I doubt they would have made x86. The laptops and desktops are the same ISA as the iPhone (strategically).


There was a rumour on here that aarch64 was actually designed by Apple and given to ARM to standardize


It’s a false rumour. Arm has decades of ISA design experience and their chief architect has talked about designing Aarch64.

Sure, Apple and Arm worked together, but it wasn't developed by Apple and handed to Arm.


It was more compelling when M1 came out, but these days, even accepting every possible x86-related patent as valid, everything pre-AVX should expire by 2026. So no license needed.


Intel specifically exited the general-purpose ARM market back about 20 years ago when it sold its XScale division to Marvell. I believe it kept making the ARM chips for use in network controllers and other specific purpose chips.


Intel failed to anticipate the smartphone revolution despite RIM being a customer of XScale. To be fair, they only entered because they got StrongARM from a lawsuit settlement with DEC in 1997, and they sold it off to refocus on more strategic segments, which actually turned out to be a lot less interesting. I don’t think Intel can really be seen as a model of good strategic thinking.

But all of this is a decade before what we are discussing here. I didn’t even remember XScale existed at Intel while writing my first comment.


Allen Baum has some inside baseball on this:

  When the Microelectronics Group was transferred to Intel,
  that included the StrongARM Group. A month later, everybody
  in the StrongARM Group had pretty much quit.
https://youtu.be/wN02z1KbFmY?si=Gnt4DHalyKLevV2p

From 2:03:30 he points out that the only purpose of the DEC lawsuit was to facilitate the sale to Compaq without the microelectronics group.


Intel has made and killed ARM SoCs since then, like Keem Bay.


>> Seems completely uncorrelated with what is discussed especially considering Intel didn’t enter the ARM market either.

I don't think AMD should be following Intel in markets outside x86. I want to see them go RISC-V with a wide vector unit. I'd like to see Intel try that too, but they're kind of busy fixing fabs right now.


If AMD released a desktop class ARM processor at that time, what software would it have run?

Apple had already switched cpus in Macs twice, it's not surprising that they could do it again, but would they have switched from Intel x86 to AMD ARM when they never used any AMD x86? Seems unlikely.

Focusing on a product that would sell on day one rather than one that would need years to build sales makes sense for a company that was struggling for relevance and continued operations.


Windows and Microsoft apps run on ARM on multiple Surface models.


In 2017, Windows on ARM was pretty bad: it was Windows RT, not the Windows 10 that had been released for x86 in 2015. I know RT started with support only for Store apps, but I don't know when it allowed native apps. x86 emulation wasn't present in RT either.

Today? Sure, they could probably sell some arm cpus; in 2017, doesn't seem likely.


Does the typical Windows user run only Microsoft apps?

I think you can get 95% compatibility, but the 5% of apps not running, even though they might be used once every full moon and have alternatives, might be seen as a major blocker for a potential customer who can still buy another computer with 100% compatibility.


>market which is now huge

The SoC market is McDonald's: it's huge in the same way the soybean industry is huge. A zero-margin commodity.


Yeah, sure, remind me what Qualcomm's results were last year. 10 billion?

But don't get me wrong, I wouldn't spit on McDonald's 6 billion either, and the soybean market is one of the fastest growing in the agrifood business, with huge volumes traded, probably one of the most profitable commodities at the moment.


> Yeah, sure, remind me what Qualcomm's results were last year. 10 billion?

How much of Qualcomm's profit comes from providing yet another ARM chip vs. all the value-added parts they provide in their ARM SoCs, like all the radio modem stuff necessary for mobile phones?

Now that's kind of a rhetorical question, not sure a clear answer exists, at least not outside Qualcomm internal finance figures. Food for thought, though.

(That's sort of the logic behind RISC-V as well. The basic ISA and the chip that implements it is a commodity, the value comes from all the application specific extra stuff tacked on to the SoC.)


> How much of Qualcomm's profit comes from providing yet another ARM chip vs. all the value-added parts they provide in their ARM SoCs, like all the radio modem stuff necessary for mobile phones?

The SoC is the SoC.

You can’t magically say "Qualcomm doesn’t make money from SoCs, which are commodities" and then argue "but actually they make money with the non-commodity part" because you want to somehow magically split in two something that isn’t splittable.

There is no real food for thought here. It is just a profitable market.


> Seems completely uncorrelated with what is discussed especially considering Intel didn't enter the ARM market either.

Maybe the folks at Intel just didn't want to StrongARM their competitors?


Stock valuation is a horrible measure of how well a company has planned for the future. Time has demonstrated this again and again and again.


But in this context, that future is now, and AMD is way better off than it was around 2014-2017.


Is the stock up because of them or despite them?


In the case of AMD it's definitely because of them and the great leadership from Lisa Su.


It is hard to evaluate it reliably.


AMD is doing well because they moved on chiplets before Intel did. The decision of ARM vs x86 is pretty much unrelated to the move that saved them, and sticking with the architecture with which they had decades of experience was probably a good idea.

I mean Keller is talking about a decision to not pursue an ARM chip that he’d apparently been working on after(?) Zen 2 (or maybe in parallel). So AMD was already back on a good path at that point.


Cult of personality... or maybe people just want cool stuff for fun.


Keller himself credits the many people responsible for the contributing parts [0]. I think the general 'enthusiast' tech press and reporting likes hero figures and the simplicity they bring, even better when you can cast a good guy against a bad guy, and the backdrop in this case would be AMD vs Intel.

[0] https://web.archive.org/web/20210622032535/https://www.anand...


Humans inherently like having a narrative. When we discuss historical events, we typically want to have a clearly defined leader and/or visionary upon whom to pin events. Without this, our imaginations aren't as engaged, and therefore emotions aren't stirred. For example, the stories of early game companies are great because the teams were very small, a narrative can be written, and the product was fun. With modern games, budgets are massive, teams are massive, and things are often designed and approved by committee. The result can be beautiful and fun, but the story getting there isn't as entertaining.


This is the same reason we have trouble erecting defenses against "the banality of evil" etc. When the villain wears a monocle and smokes a cigar, everybody hates them. When the villain is a misaligned structural incentive, a million people can die and politicians and the media will start rounding up scapegoats instead of actually changing the incentive.



