It’s not on Apple to make sure their cheapest consumer-targeted computer is good enough for enterprise use.
To me it’s not really relevant what the old computer models used to do. You have to evaluate what is available today and choose accordingly. Like it or not Intel chips had different strengths and weaknesses. It’s a different design entirely.
I’m split on whether this is a dirty price segmentation trick or a legitimate design limitation where adding more display support is expensive in terms of die size.
Doesn’t matter though, because companies doing serious work are supposed to know to buy the business versions of laptops. They don’t buy Dell Vostro consumer grade PCs, they buy Dell Precision/Latitude/XPS business systems. Apple tells you right in the name of their system: Pro. If you’re a professional you buy the Pro model. If it’s too expensive then buy something else.
Only the M2 & M3 Max chips support more than two external monitors[1]. Those start at $3200, and are overkill for the vast majority of use-cases.
There's no excuse for a $2000+ machine not to support more than two external monitors. DisplayLink on macOS is far from ideal, either: it works alright, but it has to use the OS's screen-recording functionality, which causes anything with protected content to freak out.
Sure, but most people don’t use more than two external monitors. Most people don’t use more than one.
The people who complain about specs per dollar were never Apple’s customers. “Why buy an Audi when a Dodge Neon SRT4 costs half as much and goes faster?” It has been this way for 40 years now. This just isn’t how they operate. When they design a product they don’t start from the specs, they start from how people use the product.
There are much cheaper ways to own a Max system if that specific spec is something you’re desperate for. For one thing, Apple itself is selling the current model for $2700 refurbished. $500 off and it’s the exact same system with a brand new battery and full warranty.
Also, you should never buy a Mac without the student discount at the very least. Anyone can get it.
Finally, a used M1 Max system will cost you under $2000 and is barely 3 years old.
Keep in mind that if you were buying a MacBook Air in 2010 you were paying over $1800 in today’s money.
If we’re talking about support for external displays this all seems entirely tangential.
> When they design a product they don’t start from the specs, they start from how people use the product.
So they impose arbitrary limitations that have basically nothing to do with the specs just so that people who are supposed to use more expensive machines wouldn’t buy the cheaper models? Sounds about right.
Apple is trying to maximize their revenue because they can. There is nothing wrong with a for-profit company doing that. Trying to find any other explanation is a bit silly, though.
>Sure, but most people don’t use more than two external monitors. Most people don’t use more than one.
Most people don't buy Macs. So why even sell them then?
They literally took away a feature that their cheapest Intel Macs could do, and restricted it to their most expensive Apple Silicon Macs. They should be lambasted for this.
>Finally, a used M1 Max system will cost you under $2000 and is barely 3 years old.
A Raspberry Pi can do this for under $100. Come on.
Well, what PC people do is they hyper-focus on one specific spec like number of displays supported or price per GB of RAM but can’t see the forest for the trees beyond that.
If I just do the same thing with Macs I can win arguments just as easily. Find me a laptop with the kind of performance-per-watt specs the M3 systems have. Find another laptop of the same size/weight/power draw that can match the M3 Max’s performance at anything close to the same battery life. Find me a completely fanless Intel/AMD PC that performs as well as the MacBook Air and gets the same or better battery life. Find me a PC laptop where you can feed an RTX 40X0 mobile GPU with over 100GB of RAM. Find me another laptop that uses TSMC’s most advanced chip lithography.
PC spec monkeys will basically say it’s not a real laptop because it can’t support 800 external monitors and there’s no print screen key and it doesn’t have a parallel port etc etc. These are all specs that don’t matter to 99% of users.
Hell, if you’re the kind of person who has a triple or quad external monitor setup, that means you’ve spent around $1000 on just displays. That probably means you can afford $3,000 for a MacBook Pro with a Max chip or maybe pay $2,000 for a used one. And if you didn’t spend $1000+ on those displays, that means those four displays are probably so bad that you’re better off looking at one 4K display or two decent quality ultrawide displays.
> Well, what PC people do is they hyper-focus on one specific spec like number of displays supported or price per GB of RAM but can’t see the forest for the trees beyond that.
Not at all; there are many examples of various types of specs in this thread, where apple fanboys suddenly go mute :)
> If I just do the same thing I can win arguments just as easily. Find me a laptop with the kind of performance per watt specs as the M3 systems. Find another laptop of the same size/weight/power draw that can match the M3 Max’s performance at anything close to the same battery life.
So the only example you can come up with is performance per watt? (Your second question is basically the same as your first.) The M3 is very good in that category, I don't disagree; it's apple's latest/best processor, and it does slightly outperform AMD Ryzens in that category[0]. Of course, when you take price into account, apple M processors are not even close to the best :).
> Find me a PC laptop where you can feed an RTX 4080 mobile with over 100GB of RAM
Hilarious that you bring this up when macs don't even support CUDA and are basically useless when it comes to the most important aspects of having a GPU today... gaming and deep learning...
> Those laptops don’t exist, unless it’s a Mac.
Yeah, nothing but apple exists in an apple fanboy's mind.
So what you’re saying is you can’t find better performance per watt; AMD “comes close.”
You are doing the spec monkey thing again. You changed the spec. I chose performance per watt and now you’ve changed it to performance per dollar.
Under performance-per-dollar logic, AMD makes the best PC graphics card on the market, which they obviously don’t in terms of total performance. Nvidia charges a huge price/performance premium on the RTX 4090 because you can’t buy that performance elsewhere. Sound familiar?
> So the only example you can come up with is performance per watt
I’ve got another one: media encoding. Apple’s systems obliterate the rest.
If your argument is that CUDA is important I hate to say it but you’re actually reverting to that whole “product ecosystem and experience” angle that you were deriding in the same breath. Nvidia users have to buy Nvidia because it’s the only way to use Nvidia software. Kind of like how iOS developers and Final Cut Pro users must buy a Mac? “Yeah, nothing but apple exists in an apple fanboy's mind.” You could replace that statement with “Nvidia” under your own preferences.
Under the spec monkey argument someone buying a graphics card should ignore Nvidia’s CUDA ecosystem and buy an AMD graphics card that offers better performance per dollar. But you’re saying that the lack of CUDA on a Mac is a major downside. Which is it? Performance per dollar or user experience and ecosystem?
This is why doing the spec monkey thing turns us around and around in circles. I’m not being an Apple fanboy I’m just pointing out how it’s completely reasonable for an expensive computer to not prioritize supporting a zillion monitors.
I never claimed that any particular single spec makes macbooks bad, that was entirely your own strawman :). There are maaaaany reasons why I think they're bad.
> I’m not being an Apple fanboy I’m just pointing out how it’s completely reasonable for an expensive computer to not prioritize supporting a zillion monitors.
My 9-year-old Asus laptop has better external monitor support than my M2 MacBook Pro... these problems were basically solved 10 years ago... how hard can it be? How much do you have to 'prioritize' this? How hard is it to solve the many years-old annoying, well-known macOS bugs? I don't see innovation or engineering quality coming out of apple (the only exception being the very recent M line of CPUs)... everything else is meh - buggy, fragile, locked-in, overpriced, non-standard, lack of support for important stuff like CUDA, etc.
Also note that 'support multiple external monitors' here actually means 'kinda support some monitors sometimes'. Just google and read the hundreds of threads about external monitor issues on M2 pros.
The only issue I have with external monitors on my M2 Pro – and it’s admittedly annoying – is that unless I turn everything on in a specific sequence, the primary monitor’s energy saving kicks in and turns off the screen before the Mac has synced video. It essentially bootloops.
This only happens on my Acer Predator, and only if I’m using DP to USB-C. The secondary LG doesn’t care, nor does the Acer if it’s over HDMI.
The fix I’ve found is to wake up the Mac first with the external keyboard, then turn the Acer on and wait for sync, login, then turn the LG on.
While I’d obviously rather not have to deal with this, I feel like it’s at least partially on the incredibly aggressive power saving of the Acer, which I can’t find any way to disable or extend the timeout of.
The excuse is that this is Apple, and the solution to problems with them is to buy more things. In this case, get a $1,500 ultra wide curved monitor which is better than dual head.
For $1500 it's better to get one of the 43" 4K displays. I've used one for over half a decade now, and the ability to comfortably tile a browser plus four terminals side-by-side is unmatched. Or if you will, display 10 A4 pages of a document simultaneously.
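The math on that claim roughly checks out, give or take zoom level. A quick back-of-the-envelope sketch (the 43-inch diagonal and 16:9 aspect are assumptions about the specific panel):

```python
import math

# Pixel density of a 43" 16:9 4K panel.
w_px, h_px, diag_in = 3840, 2160, 43
ppi = math.hypot(w_px, h_px) / diag_in  # ~102 PPI, close to classic desktop DPI

# A4 paper is 210 x 297 mm, i.e. about 8.27 x 11.69 inches.
a4_w_px = 8.27 * ppi    # ~848 px wide at 100% zoom
a4_h_px = 11.69 * ppi   # ~1198 px tall at 100% zoom

# A 5x2 grid of ten A4 pages fits at roughly 90% zoom:
print(round(3840 / (5 * a4_w_px), 2))  # ~0.91
print(round(2160 / (2 * a4_h_px), 2))  # ~0.9
```

So ten pages needs about a 10% zoom-out; at 100% you get a comfortable 4x1 with room to spare.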
there are AMD chips being sold right now that don't even support HDMI Forum VRR, let alone AV1 decode.
and those skylake laptops are stuck on HDMI 1.4b, so they top out at effectively 1080p60, but sure, you get three of them. And the DP/thunderbolt tops out at 4K60 non-HDR with crappy decode support, and you get at most like 2 ports per laptop.
the grass isn't always greener, there's lots of pain points with x86 hardware too. heck, those celerons you're so fond of are down to literally a single memory channel by this point. is a single stick going to be enough raw bandwidth for a developer that wants to be compiling code etc?
HDMI 1.4b does 1440p75 or 4k30 and HDMI 2.0 was brand new at the time.
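For the spec-curious, the arithmetic is easy to check. A rough sketch (the 20% blanking overhead is an assumption; exact CVT timings vary per mode):

```python
# HDMI 1.4b: 340 MHz max TMDS clock, 3 data lanes, 8b/10b encoding,
# so the effective payload is 340e6 * 3 * 8 = ~8.16 Gbit/s.
def fits_hdmi14(h, v, refresh, blanking=1.20, bpp=24):
    """Rough check: does a video mode fit HDMI 1.4b's payload bandwidth?
    blanking=1.20 assumes ~20% overhead for blanking intervals."""
    effective_bps = 340e6 * 3 * 8
    needed_bps = h * v * refresh * blanking * bpp
    return needed_bps <= effective_bps

print(fits_hdmi14(3840, 2160, 30))  # True  - 4K30 fits
print(fits_hdmi14(2560, 1440, 75))  # True  - 1440p75 just fits
print(fits_hdmi14(3840, 2160, 60))  # False - 4K60 needs HDMI 2.0
```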
> the grass isn't always greener, there's lots of pain points with x86 hardware too. heck, those celerons you're so fond of are down to literally a single memory channel by this point. is a single stick going to be enough raw bandwidth for a developer that wants to be compiling code etc?
What a weird argument; no shit a bargain bin CPU from 10 years ago is worse than a brand new mid-range chip. That's the exact point I'm making. That Celeron was bad 10 years ago. 10 years of progress, billions of dollars of investment and you get the same maximum RAM capacity, fewer external monitors at a much higher price.
And you’re ignoring all the things that the Apple chip will do that your chip won’t, or the things it’s massively better at.
It has 2x the bandwidth of a Radeon 780M and runs at 35 W; it has as much bandwidth as a PS5. There are pluses and minuses to doing it both ways, but detractors only want to look at the handful of areas where traditional chips have an edge.
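Peak memory bandwidth is just bus width times data rate, so the gap is easy to reason about. A sketch with illustrative configurations (the bus widths and transfer rates below are from public reporting, not official spec sheets):

```python
def peak_bw_gbs(bus_bits: int, gt_per_s: float) -> float:
    """Theoretical peak bandwidth in GB/s: (bus width / 8) * transfer rate."""
    return bus_bits / 8 * gt_per_s

# 128-bit LPDDR5X-7500: the ceiling a 780M-class iGPU shares with its CPU.
print(peak_bw_gbs(128, 7.5))   # 120.0 GB/s
# 256-bit LPDDR5-6400, roughly an M2 Pro-class unified memory setup.
print(peak_bw_gbs(256, 6.4))   # 204.8 GB/s
# 512-bit LPDDR5-6400, roughly M2 Max-class: in PS5 territory.
print(peak_bw_gbs(512, 6.4))   # 409.6 GB/s
# PS5 for comparison: 256-bit GDDR6 at 14 GT/s.
print(peak_bw_gbs(256, 14))    # 448.0 GB/s
```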
"Cheapest" but not cheap. A $400 Steam Deck can drive 3 external monitors with an inexpensive MST hub, and it only has a single USB-C port.
The MBA is an extremely close competitor to the Dell XPS line too. And "Pro" doesn't even guarantee you more monitors. The $1600 M3 MBP is just as limited as the "consumer" Air.
Only if you ignore the shitty trackpad tracking on the Dell, Windows (shit UX) or Linux (shit battery life and shit sleep/wake), and in general the battery duration in real-life use cases.
We really don’t know. Personally I’m not surprised that a chip that came from a smartphone has difficulty with multiple monitors. I’m guessing that the Pro and Max chips need a much larger die area dedicated to that functionality.