Modular PC Design: A Sustainable Approach for Enhanced Repairability and Reduced E-Waste (intel.com)
23 points by pabs3 on Jan 24, 2025 | 53 comments


PCs are already extremely modular, and they have been modular for multiple decades now.

All Intel needs to do is not refresh the socket with every CPU generation, slap a green sticker on the box, and amp up their marketing to include "green" and "less e-waste" and all that.

They'll be able to virtue signal to the environmentally conscious with minimal capital outlay, instead of whatever this engineering-masturbation exercise of an article is.


> PCs are already extremely modular, and they have been modular for multiple decades now.

Laptops were fairly modular as well. You could upgrade memory and disks, and in theory the GPU as well (MXM connectors).

On some older laptops the CPUs were socketed, so you could upgrade those as well.


It could be argued that it was the Intel Ultrabook initiative that killed laptop modularity. Although that was a reaction to the MacBook Air, and PC makers would likely have gone down that route, Ultrabook or no.


I feel a lot of the repairability issues and poor longevity we have for electronic devices can be laid at the feet of Apple. So many "innovations" from them directly resulted in poorer device serviceability, and other manufacturers were forced to follow suit, because that's what the consumer "wanted".

And the consumer only "wanted" these things, because marketing and paid reviews told them they wanted it.


Not sure if it was the Ultrabook, because some ThinkPads (T440/T450/T460) were Ultrabooks yet you could still upgrade disks and RAM.


How about supporting a socket for more than one generation?

My desktop PC from 2018 is still going strong, on account of using the AM4 socket (AMD). That allowed me to upgrade to the 5800X3D a while back, a whole two generations or four years newer than the original CPU.


This is the reason I chose to go with AMD's 7000 series for my 2022 build.

I wasn't aware of Intel's limitation when I built my first computer in 2016, so when I wanted to upgrade a few years later I wasn't expecting to need a new motherboard, since the old one still had everything I needed - it felt so wasteful!

Instead I just waited for AM5, based on the longevity of AM4. I'm really hoping AMD supports AM5 for a few more generations so I can do the same as you in 2026/7.


>How about supporting a socket for more than one generation?

Intel typically seems to support sockets on the desktop for 2-3 generations of CPU:

https://en.wikipedia.org/wiki/CPU_socket#80x86


This is unfortunately not true and a result of Intel's obfuscatory tactics. The socket has been the same in subsequent generations, but you needed to buy a new motherboard because the chipset supposedly did not support the newer CPUs [0]. An example of this is the socket 1151 generation. This socket was named 1151-1 and 1151-2 in later years due to this tactic. When Intel was on their high horse (2010-2018) they generally supported 1-2 generations per socket.

[0] https://www.pcgamer.com/modders-get-intels-coffee-lake-cpus-....


To add to this: some motherboards could support newer CPUs via a patch to the BIOS or embedded controller firmware... meaning that the incompatibilities were/are completely artificial.


I did the same with Intel 12th gen vs 14th gen.


Market forces are driving companies to make more and more highly integrated, glued-together MacBook clones because consumers demand them. Or at least manufacturers think they do.

Maybe it's time for a class of repairable laptops/desktops, similar to what Intel did with their "Ultrabook" branding a decade ago. Call it an ecobook, and require upgradable storage, RAM, and access to replacement parts for 10 years. Something that lets companies say "this is thicker than a macbook, but not because it's worse".


Let me introduce you to https://frame.work/


And then manufacturers will have to explain not only why it's thicker, but also why it's heavier, hotter, louder, slower and has worse battery life…


Meanwhile Microsoft is telling everyone to e-waste their perfectly good PCs and buy new ones?

Preventing e-waste like that is more important than designing products like this, in my opinion.


It doesn't have to be an either-or decision. We can do both.


Talking about e-waste: why don't modern TVs have an external device for the SoC? Why do I have to bypass the slow TV using a Fire Stick while the TV becomes obsolete anyway?

Lacking security updates, it's also potentially a danger for my home network and for the internet in general, in addition to losing some essential functions that only the SoC integrated in the TV system can provide. Wouldn't it have been enough to move the computational part containing the CPU, the memory and the software onto external hardware, so as to be able to swap it (proprietary or not, but at least upgradable!)?


What's the difference between a TV with built-in shit smarts and a plugged-in Fire Stick, and a TV with no built-in smarts and a plugged-in Fire Stick?

In the former you get subsidized hardware so the TV is cheaper, in the latter you don't so the hardware is much more expensive.

My TV got a Chromecast on day one out of the box. I have never used its smart settings, it's never been connected to any network, and it never will be. Its remote is in a box or drawer somewhere, never used. I use the Chromecast's cute little remote, as it has all the functions I need.

As long as a TV has an HDMI connection, and I can connect my external media box, I will never ever ever use the onboard smarts, but I will definitely continue using the subsidized hardware.


> and it never will be connected to any network

That you know of. It might still connect through other TVs in the vicinity using hidden access points, and reach the internet through those for automatic updates and telemetry and, of course, malware.


Do you have a source for that?


The best I could find was Samsung TVs connecting to a nearby unsecured hotspot (some routers used to have that by default).

OTOH, if during setup you agree to them sending telemetry, they might just use a cheap cellular modem with arrangements with local telcos for low-traffic data connections with a lower QoS than telephony, the same way some cars do OTA updates even though you don't give their on-board computers access to your phone or home.


I’m confused by this, just turn the network off on your TV and use the external device.

There aren’t any essential functions of a TV that require a network connection.

My smart TV is only connected to the network to upgrade the firmware, and the moment firmware upgrades stop coming out I’m not going to connect it anymore.


Even disconnected from the network, I still have to buy an external device to bypass some functions (which is a non-optimal workaround).

The bugs are still not fixed, the TV remains very slow, and I don't even have the courage to do a reset of the TV software, because I don't know if I could reinstall a recent version (or whether it would be stuck with the first factory beta, since the updates are deprecated anyway). It's a 7-year-old 65" Sony Bravia, for example (LED, 4K). I'm happy with the video/audio and I don't want to change it, but I'm almost forced to. The menus are very slow, some functions for recording and playing back broadcast content crash (since the latest updates), and I often have to restart it... all because of a non-replaceable integrated SoC worth $50.


Maybe if there were a function to disable Android running in the background on the TV (which is the default), the menus and other primary functions would run better. Maybe it's possible, but do I really have to hack the TV? How many people will do that?

And as I've said, I'm not even confident about performing a firmware reset at this point, given that I would probably only get the version released with the TV, without any subsequent updates (not a solution for everyone anyway).


Ah that’s your problem, you bought an old Android TV.

Unfortunately you bought a bad product in the first place.

If you get an LG with WebOS or whatever Samsung puts on their TVs, it’s just fast from the factory and it doesn’t get slower over time.

I have a very old LG OLED, so old that it’s curved (2015 probably), and it has no slowdown or bugs like this.

But every Android TV, including new ones like my friend’s cheap HiSense purchased last year, seems to have slow volume buttons, slow bootup, slow slow slow.

I think Android wasn’t designed for the type of SoC that is in a reasonable budget for a TV manufacturer, and that’s why most TV producers like LG and Samsung don’t use it. Plus, Android TV was very much a half-baked product until recently.

In other words, your problem isn’t the non-upgradability of the SoC, your problem is that you have a bad product in the first place.

My advice is don’t worry about the waste. Every moment you worry about the waste, Taylor Swift takes a 20-minute driving commute via private jet. You have finished the useful lifespan of your device; it is probably not long before it gets picture uniformity problems anyway. Get yourself a new LG OLED or Samsung QLED or something like that. You won’t regret it. It’ll be a huge upgrade over a 7-year-old Bravia LED in picture quality and capabilities.

And still don’t use the smart TV built-in streaming. Use a high-end streaming box with a good processor like an Apple TV or Nvidia Shield. Don’t use a cheap Roku, that’s how you get a slow experience.


[I previously worked at Google on Android TV, all opinions my own]

There's a constant push for more and more features because consumers and cable operators shop based on that. There's also a constant push for ridiculous BOM reduction to the tune of individual cents being saved. Android TV is non-trivial to monetize so it's hard to dedicate a whole team for performance improvements. There's an imaginary bar for "good enough" (read not too shitty) and you get this result. I hate the result but I can't see what market dynamics might change it.

Unsubsidized "HDMI output" type TVs would cost so much that they would sell poorly, resulting in even higher prices, ad nauseam. I think we're stuck with this.


It's not even that, it's that Android TV/Google TV devices, even when not using smart features, have slow and laggy basic features like volume control.

It just seems like it's software that was shoehorned into a place where it doesn't fit well IMO.


Roku never suffered from that.


Even my TCL TVs with Roku were better than TVs with Android built in.


Can't you just tell it to automatically switch to some HDMI input on power on?

I have one of those "smart" TCL TVs, with Google TV. No idea what the difference is with Android. I've tried using the smarts, and it was a shitshow, even brand new. Laggy as hell.

Fortunately, it has an option to turn on to the last used input. It's also possible to turn on and off by receiving some signal from an external device over HDMI. I basically never touch its remote anymore and the only reminder that it runs "Google TV" is the logo when it turns on. It's unplugged from any network, and the only consequence is a notification which disappears on its own that it's not connected to any network.


I had six TCL Roku TVs before we downsized. I plugged an AppleTV 4th Gen into the two I used the most often and set them to automatically switch to the port the AppleTV was connected to.

The Apple Remote could control the volume and turn the TV on and off.

WiFi was disabled on the two that were attached to AppleTVs.


You want to replace the shit smarts in a TV with a slow underpowered Firestick?


I do miss user-upgradable RAM and storage, which seems to be absent on many laptops these days. However, I don’t share the same sentiments about the other minor features that have been integrated onto the motherboard. Anyone else remember squeezing in a modem, network card, SoundBlaster, and a video card into their case? Or the frustration of discovering that the power supply couldn’t reliably power all those components. Or wanting to upgrade your CPU only for them to change the socket again.

What I really want is something like the Samsung Dex, where you can dock a portable device into something and make it more powerful.


In terms of the sound card, NIC, etc., all of these components have essentially reached parity, or the point where nobody is really upgrading them. Back in the day your sound card could be regularly upgraded, along with the NIC and whatnot. Nobody is doing that nowadays outside of niche circumstances.

Ram has held on because it's been increasing in capacity like crazy, and software has taken advantage of that.

Samsung Dex is great, tbh. It easily outperforms most Chromebooks and is really simple to use.


I’ve recently wanted to upgrade my network card to support the latest WiFi standards, but the upgrade cycle is definitely longer.


Some of the newest laptops now have LPCAMM2 memory which is user upgradable.

Also, I would so much rather change the motherboard/CPU on my laptop than get the latest unit my manufacturer has released.

I spent a good amount of money on my current laptop in 2016, and oftentimes I’m reluctant to upgrade because I would have to pay a similar amount or downgrade.


What is going on in that header image? Data destruction? You would want to do any data recovery inside a hood with very good filtering...


I would like to see some OCP designs leaking out of hyperscaler datacenters and pop up as desktops or deskside designs.

Right now there isn’t any decent-looking piece of kit to turn a 5U rack server into a deskside. HP had it with their Itanium workstations. We need more creative designers.


You want a 5U with H200 on your desk, not minding the rest of the necessities that go along with it? How about an 88x SAS drive array crammed in that same space?


Unless the 88 drives are SSDs, there is a noise issue to deal with (and, with 88 of anything, thermals and also more noise), but I wouldn't mind a machine designed with ease of maintenance in mind.


The issue is not with the creativity of the designers. It is with corporate incentives and consumer demand, mostly.


It's cheaper to get a refurbished rack server than a tower version of the exact same machine (because there are a lot more decommissioned rack machines).

I guess I'll have to try my hand at 3D printing bigger stuff.


> I guess I'll have to try my hand at 3D printing bigger stuff.

An attractive 5U deskside enclosure sounds like a woodworking project to me. I might be blinkered though!


We hit the opposite issue with our Samsung washing machine (we plan to never buy another Samsung product because of multiple issues with this thing and their support).

It has 5 different “modular” control boards, and when one fails, it tells you which one. That seems great until you realize the boards are > $300, and the error codes point you to the wrong boards (edit: and they don’t accept returns of spare parts)

This created hundreds of pounds of e-waste. The washing machine is 4 years old.

There is something to be said for miniaturization and consolidation of components. If the damn thing had one board that cost $400, then we would have to throw out the whole machine. It’d also have simpler wiring harnesses, etc.


That doesn't sound like a problem with modular parts, that sounds like poor quality and what I have to assume is either artificially high prices or really poor design (basically, I don't believe that a washing machine can contain 5 control boards that really cost >$300).


Intel, responsible for widespread use of non-ECC memory, as well as enforcing a new socket every one or two CPU generations.

Infamous for two recent consecutive generations of self-destroying CPUs (13th and 14th gen).

What a load of bull.


ECC memory is 100% irrelevant.


Kind of, but not 100%. At least not in the case where you get a machine with non-ECC RAM and later need something more workstation-class with ECC RAM for whatever reason (scientific, research and whatnot).

But in general it speaks to the anti-consumer behavior of Intel.


Bold claim.

I will just highlight that Linus Torvalds disagrees[0], and that this sort of thing[1] happens whenever there's a geomagnetic storm.

0. https://arstechnica.com/gadgets/2021/01/linus-torvalds-blame...

1. https://bugzilla.mozilla.org/show_bug.cgi?id=1762568


Not really a bold claim. The performance impact and additional expense of ECC RAM is not really worth it for most consumer applications, like gaming.

It's especially not worth it if you're storing all your critical data on cloud services that already run on server-grade hardware.

Why do I care about ECC memory if the source of truth of my data is in cloud services like S3 that have impossibly high durability standards?

We can even back it up a bit and say, how do you know that data loss isn't acceptable to me?

Is ECC memory really more protective compared to a 3-2-1 backup system? ECC memory can't stop my house from burning down or having water intrusion destroy my system. I think a lot of those "natural" causes of data loss are more statistically probable.

Finally, it's my understanding that filesystems with checksums can essentially mitigate this problem or warn you very early of issues.
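
To make that concrete, here's a rough sketch of the idea (my own illustration with made-up file paths, not how ZFS/Btrfs actually implement it internally): keep a checksum for each file and compare copies before trusting them.

    import hashlib
    from pathlib import Path

    def sha256(path: Path) -> str:
        # Hash the file in chunks so large files don't need to fit in memory.
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Illustrative paths: compare a working copy against a backup copy.
    if sha256(Path("photo.jpg")) != sha256(Path("backup/photo.jpg")):
        print("checksum mismatch: one of the copies is corrupted")

Checksumming filesystems do roughly this per block and verify on read, which is why they can flag corruption early instead of silently serving bad data.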


>Not really a bold claim.

100% irrelevant, sure.


Whether or not RAM is modular is completely unrelated to whether it’s ECC or not.


A bit late; seems like they’re doing this as a stunt to try and stay above ARM, but I do appreciate it.



