Apple using a 512-bit memory bus on its Max processors is indeed the future of APUs if you ask me.
AMD is coming out with Strix Halo soon with a 256-bit memory bus, and on the high end we've also seen the niche Ampere platform running Arm CPUs with 576-bit buses. The PS5 uses a 256-bit bus and the Series X a 320-bit bus, but they use GDDR instead of DDR, which increases cost and latency in order to optimise for bandwidth; there's no reason you couldn't design a laptop or Steam Deck that did the same thing. AMD has the MI300X, which uses 192GB of HBM3 over an 8192-bit bus.
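To put rough numbers on those widths (just a sketch; the per-platform data rates here are my assumptions, not confirmed specs), peak theoretical bandwidth is simply bus width in bytes times transfer rate:

    # Peak theoretical bandwidth = bus width in bytes * transfer rate.
    # Data rates below are assumed/typical for each memory type, not official specs.
    configs = {
        "M3 Max, 512-bit LPDDR5-6400":      (512, 6400),
        "Strix Halo, 256-bit LPDDR5X-8000": (256, 8000),
        "PS5, 256-bit GDDR6 @ 14 Gbps":     (256, 14000),
        "MI300X, 8192-bit HBM3 @ 5.2 Gbps": (8192, 5200),
    }
    for name, (bits, mts) in configs.items():
        gb_per_s = bits / 8 * mts / 1000  # bytes per transfer * MT/s -> GB/s
        print(f"{name}: ~{gb_per_s:,.0f} GB/s")

By the same math, a conventional 128-bit DDR5-6400 bus tops out around ~102 GB/s, which is why all of these designs look so different from the status quo.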
I don't think it's just Apple going this way, and I do think more and more of the market is going to use this unified approach instead of having a processor and coprocessor separated over a comparatively narrow PCIe bus with separate DDR/GDDR memory pools. With portable devices especially, every year I struggle more to see how that architecture is justified when I see the battery life benefits of ridding yourself of all that extra hardware. LLM inference also creates a nice incentive to go APU, because the dual-processor architecture tends to give you excessive bandwidth and anemic capacity.
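To make that last point concrete: decode-phase LLM inference is roughly bandwidth-bound, because generating each token streams approximately all of the model weights once. A sketch with illustrative numbers (the model size and bandwidth figures are assumptions, not benchmarks):

    # tokens/sec is roughly bandwidth / model size when decoding is memory-bound.
    # All numbers below are illustrative.
    def rough_tokens_per_sec(model_gb: float, bandwidth_gb_s: float) -> float:
        return bandwidth_gb_s / model_gb

    model_gb = 40  # e.g. a ~70B-parameter model quantized to ~4.5 bits/weight
    print(rough_tokens_per_sec(model_gb, 400))   # ~10 tok/s on a ~400 GB/s big-memory APU
    print(rough_tokens_per_sec(model_gb, 1000))  # ~25 tok/s on a ~1 TB/s discrete card...
    # ...except a 24GB card can't hold 40GB of weights in the first place:
    # excessive bandwidth, anemic capacity. The APU actually runs the model.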
Nvidia may be the giant, yet if you look at what most gamers actually run on, it's Apple APUs, Snapdragon APUs, and AMD APUs. PCs with a discrete CPU and a separate Nvidia GPU have become a relatively smaller slice of the market, despite that segment having grown and Nvidia having a stranglehold on it. The average game is not developed to run on a separate CPU-and-GPU monster; it's designed to run on a phone APU, and something like a Max processor is much more powerful than required to run the average game coming out.
> Apple using a 512-bit memory bus on its Max processors is indeed the future of APUs if you ask me.
It's very expensive to have a bus that wide, which is why it's so rarely done. Desktop GPUs have done it in the past ( https://www.techpowerup.com/gpu-specs/?buswidth=512%20bit&so... ), but they all keep pulling back from it because it's too expensive.
Apple can do it because they can just pay for it and know they can charge for it; they aren't really competing with anyone. But the M3 Max is also a stonking huge chip - at 92bn transistors it's significantly bigger than the RTX 4090 (76bn transistors). Was a 512-bit bus really a good use of those transistors? Probably not. Will others do it? Probably not either; they need to be more efficient with their silicon. Especially as node shrinks provide less and less benefit yet cost ever more.
512-bit is probably a bit extreme, but I can see 192-bit and 256-bit becoming more popular. At the end of the day, if you have a high-end APU, a 128-bit bus is probably THE bottleneck to performance. It's not clear to me that it makes sense, or costs any less, to have two 128-bit buses on two different chips - which you see on a lot of gaming laptops - instead of a single 256-bit bus on one chip for the midrange market.
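A rough illustration of that tradeoff (data rates assumed, as before):

    # Two separate 128-bit pools vs one shared 256-bit pool (assumed rates).
    cpu_pool = 128 / 8 * 6400 / 1000   # ~102 GB/s of DDR5 for the CPU
    gpu_pool = 128 / 8 * 16000 / 1000  # ~256 GB/s of GDDR6 for the GPU
    unified  = 256 / 8 * 8000 / 1000   # ~256 GB/s of LPDDR5X shared by both
    # The dual-pool laptop pays for two memory controllers, two PHYs and two
    # sets of DRAM chips, yet neither processor can borrow the other pool's
    # headroom; the single 256-bit pool can hand either side the full
    # ~256 GB/s whenever one of them is the bottleneck.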
The M3 Pro only used a mere 37 billion transistors with a 192-bit bus, so you can go wider than 128-bit while being economical about it. I'd love for there to be a 512-bit Strix Halo, but it probably won't happen; it likely doesn't make business sense.
I don't know if the comparison to GPUs necessarily tracks here because the price floor of having 8 chips of GDDR is a lot higher than having 8 chips of DDR.
Sure, which is why you see other midrange APUs use >128-bit buses; e.g. the PS4 & PS5 both use 256-bit buses w/ GDDR.
Similarly, ultra high end APUs like the Xeon Max are doing on-package HBM for truly monstrous memory bandwidth numbers.
The thing is that nobody else is really trying to make anything like an M3 Pro, and I don't know if they even will. It's a really weird product. Big dies are expensive, hence everyone pushing towards chiplets. A really great spot to split up dies is between compute units that are largely independent - which the CPU & GPU actually are. There's a few workloads where unified memory helps, but most don't. So splitting those apart makes a ton of sense still. Then also if you push them hard to squeeze out all the performance you can, they both get very hot - so you want them physically far apart for cooling reasons. At which point you might as well just give them their own memory which can also be further specialized for their respective needs as one is latency biased and the other bandwidth biased. And now you're just back to traditional desktop architecture - it still makes just way too much sense for the high end.
It makes sense for Apple since they focus almost exclusively on laptops and you get exactly the single mix of CPU & GPU they decide, just like it does for consoles where again they have a midrange power budget & a single CPU/GPU configuration over millions of units. But as soon as different product specializations show up and different workload demands are catered to, coupling the CPU & GPU like that kinda doesn't make sense?
>The thing is that nobody else is really trying to make anything like an M3 Pro
Well, AMD is indeed making their own M3 Pro: the Strix Halo, which is going to achieve 4050/4060-class performance in laptops that will again be cheaper and more power efficient for the lack of an Nvidia GPU. This is also a net addition to their lineup.
AMD is a minority player in the laptop market, so it doesn't necessarily make sense for them to compete hard against Intel at every price segment, and they have clear competitive advantage and counter-positioning here against Intel/Nvidia/Apple.
Is it incorrect though? Mobile alone is $85B of the $165B market [1], not to mention that Nintendo's Switch is basically an Android tablet with a mobile chipset.
>Yes, we aren't using mobile gaming as an indicator of GPU growth/performance.
Who's "we" because big tech has absolutely been touting GPU gains in their products for a long time now [1], driven by gaming. Top of the line iPhones can do raytracing now, and are getting AAA ports like Resident Evil.
In what world is being over half of a $185B industry a technicality? A lot of these advancements on mobile end up trickling up to their laptop/desktop counterparts (see Apple's M-series), which matters to non-mobile gamers as well. Advancements that wouldn't have happened if the money wasn't there.
> Nvidia may be the giant, yet if you look at what most gamers actually run on, it's Apple APUs, Snapdragon APUs, and AMD APUs.
They really don't? You're trying quite hard to conflate casual gaming with console and PC markets, but they obviously have very little overlap. Games that release for Nvidia and AMD systems almost never turn around and port themselves to Apple or Snapdragon platforms. I'd imagine the people calling themselves gamers aren't referring to their Candy Crush streak on iPhone.
> something like a Max processor is much more powerful than required to run the average game coming out.
I mean, I see tons of shooters (PUBG, Fortnite, CoD, etc.) and RPGs (Hoyoverse) getting both mobile and PC releases, and they're running on the same engine under the hood. I'm even seeing a few indie games go cross-platform. Of course some games simply don't work across platforms due to input or screen-size limitations, but Unity/Unreal are more than three quarters of the market and can enable a release on every platform, so why not do a cross-platform release if it's viable?
I just see the distinction you're drawing as arbitrary and old-fashioned; it misses the huge rise of midcore gaming, which is seeing tons of mobile/console/PC releases. I understand that a TRUE gamer would not be caught dead playing such games, but as more and more people buy APU-based laptops to play their Hoyoverse games, that's going to warp the market and push the oppressed minority of TRUE gamers toward the same products through economies of scale.
I don't even think it's a "true gamer" thing either. Besides Tomb Raider and Resident Evil 8, there are pretty much no examples of modern AAA titles getting ported to Mac and iPhone.
The console/PC release cycle is just different. Some stuff is cross-platform (particularly when Apple goes out of their way to negotiate with the publisher), but most stuff is not. It's not even a Steam Deck situation where Apple is working to support games regardless; they simply don't care. Additionally, the majority of these cross-platform releases aren't quality experiences but gacha games, gambling apps and subscription services. You're not wrong to perceive mobile gaming as a high-value market, but it's on a completely different level from the other platforms regardless. If you watch a console/PC gaming showcase nowadays, you'd be lucky to find even a single game that is supported on iOS and Android.
> so why not do a cross-platform release if it's viable?
Some companies do; Valve famously went through a lot of work porting their games to macOS, before Apple deprecated the graphics API they used and cut off 32-bit library support. By the looks of it, Valve and many others just shrug and ignore Apple's update treadmill altogether. There's no shortage of iOS games I played on my first-gen iPod that are flat-out broken on today's hardware. Meanwhile the games I bought on Steam in 2011 still run just fine today.
I just don't get this obsession with the idea that only recently-released "AAA" games are real games (or that the only TRUE gamers are those who play them), and it seems like the market and the general population don't quite buy it either. These FAKE gamers buy laptops too, and they probably won't see the value in a discrete GPU.
Besides, it's ultimately irrelevant, because when Strix Halo comes out it's going to have the memory bandwidth and compute performance to play any "AAA" game released for consoles until the consoles refresh around 2028 - four solid years before new releases really make it struggle. These APUs won't be competing with the 4080 but with the 4060, which is the more popular product anyway. Discrete GPUs are in an awkward spot where they're not going to be significantly more future-proof than an APU you can buy, but will draw more power and likely have a higher BOM.
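By the same back-of-the-envelope math as earlier in the thread (rates assumed, not official specs), the 4060 comparison is largely a bandwidth story:

    # Assumed rates; same caveat as the earlier bandwidth sketch.
    strix_halo   = 256 / 8 * 8000 / 1000   # ~256 GB/s: 256-bit LPDDR5X-8000
    rtx_4060_lap = 128 / 8 * 16000 / 1000  # ~256 GB/s: 128-bit GDDR6 @ 16 Gbps, laptop
    # Same bandwidth class as the midrange discrete part, while also feeding
    # the CPU from the same pool - no second memory system on the BOM.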
If you had asked TRUE gamers a few years ago whether gaming laptops with Nvidia GPUs were worth it, when those were already the majority of the market, they would have laughed in your face, pointed out that those laptops didn't play the latest AAA games well and thus TRUE gamers wouldn't buy them, and told you to buy a cheap laptop paired with a big desktop instead.
> I just don't get this obsession with the idea that only recently-released "AAA" games are real games
It's really the opposite; I think obsessing over casual markets is a mistake, since casual gaming customers are incidental. These are people playing the lowest-common-denominator, ad-ridden, microtransaction-laden apps that fill out the App Store, not Halo 2 or Space Cadet Pinball. It really doesn't matter when the games came out, because the market has always been separated by more than just third-party ambivalence. Apple loves this unconscious traffic, because those users will buy any garbage put in front of them. Let them be gorged on Honkai Star Rail, while Apple takes its 30% cut of the digital vice.
Again, I think it's less a distinction between "true" and "casual" gamers and more about what their OEM encourages them to play. When you owned a feature phone, it was shitty Java applets. Now that you own an iPhone... it's basically the same thing with a shinier UI and larger buttons to enter your credit card details.
I'll just say it; Apple's runtime has to play catch-up with stuff like the Steam Deck and even modern game consoles. The current piecemeal porting attempts are pathetic compared to businesses a fraction their size. Even Nvidia got more people to port to the Shield TV, and that was a failure from the start.
Is Minecraft, the best-selling game on every platform, casual or hardcore? What about Hearthstone, which lets you play as much as you like, even just once a year (and still win)? Casual, competitive, addictive: choose three.
It's as if the casual/TRUE dichotomy is a false one.
>The average game is not developed to run on a separate CPU-and-GPU monster; it's designed to run on a phone APU
I love how performant mobile games are on desktop/laptop hardware, assuming good ports - Star Rail and Princess Connect! Re:Dive, for example.
This will probably go away once mobile hardware gets so powerful there's no requirement for devs to be efficient with their resource usage, as has happened with desktop/laptop software, but god damnit I'll enjoy it while it lasts.