Was looking into building a new computer today, after 5 years with my current one... Saw that even the RAM sticks come with LEDs. I don't want a single light anywhere, but that's not going to happen ¯\_(ツ)_/¯
By far the worst PC component I have come across containing LEDs is Kingston's HyperX Fury RGB SSD [1]. It's a 2.5" SSD with 75 LEDs! These LEDs generate so much heat (pushing the drive past 70°C) that the drive thermal throttles and either prevents the operating system from booting or causes significant performance issues [2].
So my comment is 100% nitpick but I do want to make a point here.
X year failure rate divided by X is not the same as AFR. AFR will always be higher, because it has to compensate for each year's pool of potential failures shrinking.
The correct equation is already there, 1 - e^(-n/114), and the correct number is .87%. And while that's only a percent off, it's lucky to only be a percent off. If MTBF was 113 years it would be .86 vs. .88, which is the difference between two digits of accuracy and one and a half digits of accuracy. Losing a quarter of your accuracy is not great!
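Concretely (MTBF = 114 years from the article, and assuming the constant-failure-rate exponential model that the 1 - e^(-n/114) formula implies):

    import math

    mtbf = 114                             # years

    afr = 1 - math.exp(-1 / mtbf)          # true annual rate: ~0.873%

    five_year = 1 - math.exp(-5 / mtbf)    # ~4.29% fail within 5 years
    divided = five_year / 5                # ~0.858% -- lower than AFR, because
                                           # the pool of survivors shrinks each year
    print(f"AFR: {afr:.3%}  5-year/5: {divided:.3%}")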
Those rates are surprisingly low considering that's spinning iron. I'd love to see a similar data set for SSDs.
Of course, it also depends on what you count as a failure. Is a drive that can no longer write but can still read a "failure"? Probably - but that's a considerably softer way to fail than a hard drive with a bad bearing or head.
A lot of SSDs that fail early, fail because of their firmware. That MTBF probably only covers the reliability of the flash chips but not the controller.
May well not be unreasonable for a solid state component.
Also, keep in mind that MTBF is not expected lifetime or anything at all like that.
What a 1 million hour MTBF actually means is that if you had 1 million of these drives running at the same time, you'd expect one failure per hour. (Or similarly, if you had 1000 drives, you'd expect a failure every 1000 hours).
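It's just a rate over a fleet, nothing about any individual drive's lifespan. Same numbers as above:

    mtbf_hours = 1_000_000

    for fleet in (1_000_000, 1_000):
        print(f"{fleet:>9} drives -> {fleet / mtbf_hours:g} expected failures/hour "
              f"(one every {mtbf_hours / fleet:g} hours)")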
The mean rate of failures over time may not be constant, but it may have a mode within an expected lifetime window. This modal mean rate of failure is what's being described, I reckon.
It would be nice to know the time window, but I'd expect it to be somewhere in the 5 to 10 year range.
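If you wanted to model that, a Weibull distribution is the usual pick. The parameters here are entirely made up, just to show a wear-out mode landing in that window:

    import math

    def weibull_pdf(t, shape, scale):
        """Failure density f(t) = (k/s)(t/s)^(k-1) * exp(-(t/s)^k)."""
        return ((shape / scale) * (t / scale) ** (shape - 1)
                * math.exp(-((t / scale) ** shape)))

    shape, scale = 3.0, 10.0                              # hypothetical wear-out parameters
    mode = scale * ((shape - 1) / shape) ** (1 / shape)
    print(f"most likely failure age: {mode:.1f} years")   # ~8.7, inside that 5-10 range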
LEDs are the modern blinkenlights. They can be useful if the OS supports them properly (Linux is slowly but surely getting there), so that failure states or heavy load can be conveniently signaled in a way that doesn't impact actual, on-screen work.
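(The kernel piece that exists today is the LED class in sysfs: triggers like disk activity or a heartbeat can be wired to any LED a driver exposes. Sketch below; the LED name is made up, and most RGB gear isn't hooked into this yet, which is the "slowly but surely" part.)

    from pathlib import Path

    # Hypothetical name; ls /sys/class/leds/ to see what your hardware exposes.
    led = Path("/sys/class/leds/example:status")

    print((led / "trigger").read_text())           # lists triggers, active one in brackets
    (led / "trigger").write_text("disk-activity")  # blink on disk I/O (needs root)
    # Or turn it off entirely:
    # (led / "trigger").write_text("none")
    # (led / "brightness").write_text("0")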
I wouldn't mind components with a couple of tiny status LEDs, toggleable with a small DIP switch or whatever. The issue is that manufacturers add these simply to get "rad points."
"game cache" branding is stupid but it is L3 not just "more memory" and the amount that Ryzen 3rd gen has is actually significant. So much so that it hugely impacts some workloads like GCC compilation times, which are way, way faster on Ryzen 3rd gen than anything else thanks entirely to that huge amount of game cache, I mean L3 cache.
I mean, it's not wasting that much of any of those. It's probably less wasteful to put LEDs on all those motherboards than to create separate product lines with and without the flair. If some decent majority of the people who build their own PCs (gamers) want them, it's easier to skip the tooling and design costs of a separate "pro" line of motherboards for the people who don't want LEDs, since those people can just use an opaque case.
It wastes a lot! Unintuitively, for the things that do all the heavy lifting, chips are incredibly power-efficient. The stuff on the side, anything that glows or moves (LEDs and fans) takes up a disproportionate ton of power.
> It wastes a lot! Unintuitively, for the things that do all the heavy lifting, chips are incredibly power-efficient. The stuff on the side, anything that glows or moves (LEDs and fans) takes up a disproportionate ton of power.
Not at all true. It's around 0.1-0.2 watts per LED. A RAM stick is going to have maybe a dozen at most, so that's an extra 1-2 watts. Similar for a motherboard. Maybe a fully blinged out build is burning 10-20 watts on RGB, give or take.
That's comparable to the power usage of just the VRMs or the X570 chipset (which is 10-15W), to say nothing of the CPU or GPU itself (both of which commonly blow past 100W under load).
LEDs are very, very efficient. They do not take a "disproportionate ton of power."
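Back-of-the-envelope, with the 0.1-0.2W figure from above (the LED counts are my guesses):

    w_lo, w_hi = 0.1, 0.2                    # watts per LED

    ram_leds = 12                            # ~a dozen on a stick
    print(ram_leds * w_lo, ram_leds * w_hi)  # 1.2-2.4 W per stick

    blinged_leds = 100                       # guess: RAM + mobo + fans + strips
    print(blinged_leds * w_lo,               # 10-20 W total, vs 100 W+ for
          blinged_leds * w_hi)               # a loaded CPU or GPU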
2W is the power budget for an entire single-board computer I can run a desktop on. So it's quite a lot in absolute terms for an LED inside the case that I'll never see.
10-20W is perhaps 10-40% of idle power consumption for a workstation. So it wastes a lot even in relative terms.
You missed the "fully blinged out" part. If you just get RGB on the motherboard and RAM, the harder ones to avoid, it's going to be well under 5W - more like 2W. This is a power cost that basically doesn't exist in the context of the entire system.
I'm talking about the overhead involved in making the separate lines. That's an interesting point, though - counterintuitive, since I don't need to build in coolers for the heat generated by LEDs. Do you have numbers on power usage?
There’s no reason not to have a simple on/off mechanism (hw or sw). My 6-7 year old motherboard has some LED traces on the PCB for some reason. Luckily they are dim, static, and ASUS included an option to turn them off in the BIOS. Some other components on the market now don’t have these “luxuries”. They are unreasonably bright, blinking, and can only be disabled with a soldering iron.
If you were the kind of person who gets annoyed by low levels of light (e.g., when trying to sleep) then you would've noticed by now that putting a light inside something with as many holes as a computer case does not block all the light.
(Also computer hardware vendors seem particularly fond of blue LEDs, which is the most annoying color to some people.)
Because it limits what I can get. Some light might be leaking out of whatever case I end up with. Why is it on everything? It's a waste. I didn't want it when I was 16, I don't want it when I'm 30. I can't be alone.
You don't have to turn them on? I get the feeling people don't realize they have control over not just the color and action, but whether they're on at all.
And yes, I mean RAM, GPU, motherboard, fans, etc. advertised as "RGB". If it's the same price and the LEDs stay off, then for all intents and purposes it doesn't have LEDs.
> I get the feeling people don't realize they have control over not just the color and action, but whether they're on at all.
Not a single vendor supports Linux. Luckily some things have been reverse engineered, but it's very far from "just turn it off". Even on windows it sucks if you're mixing brands. MSI Mobo, Asus GPU, Corsair RAM, gonna need 3 separate programs running at startup to control those LEDs.
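(The reverse-engineered route is OpenRGB, for what it's worth - one tool across brands, Linux included. Something like the following with its openrgb-python bindings should black everything out, assuming the OpenRGB SDK server is running; treat it as a sketch, since per-device support varies:)

    from openrgb import OpenRGBClient        # pip install openrgb-python
    from openrgb.utils import RGBColor

    client = OpenRGBClient()                 # connects to the local SDK server
    for device in client.devices:            # mobo, GPU, RAM, fans, ...
        device.set_color(RGBColor(0, 0, 0))  # black == off on most hardware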
The control software sets the state and doesn't need to stay installed beyond that. I downloaded the trial Win10 image for a VFIO VM, disabled all the lighting, and never thought about it again. I may choose not to use Windows, but I'm not going to overlook a tool because I'm "Linux-only".
I get that some object to non-open standards on principle, but RGB is not a serious impediment to availability or use. Looking around, there are non-RGB versions of most current-gen hardware available from popular resellers, and there are plenty of OEM options that are RGB-free. What I don't have racked somewhere else is in solid 'silent' cases with as few moving parts as possible, but I bought parts with minimal/no lighting anyway.
I certainly hope that everyone can find something amenable to their situation, but I also feel that objecting doesn't help when an hour's work or research beforehand can alleviate the cause.
People have been mentioning very practical reasons but I just wanted to add that for me, even if none of these practical reasons were to apply (and I also don’t think the practical reasons are very strong), it’s simply about aesthetics.
It’s extremely hard to build a PC that would fit my personal aesthetics, and even if I can turn those off, the mere thought of having to configure that and the mere thought of those LEDs existing at all offends me on a deep level.
You have limitless choices when it comes to building a PC but the aesthetics of the things you can get are awful and tiring and just … ugly.
That was pretty much my attitude until I wanted a high-airflow case and switched to a Cooler Master H500 (2x200mm front fans, 3x120mm Noctuas - two in the top, one on the back) with a tinted glass side.
The RGB stuff sorta grew on me and it's kind of interesting on Linux to be able to see the CPU fan stop spinning completely (never seems to happen on W10 though...).
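(That's the standard hwmon sysfs interface, if anyone wants to watch it themselves - the fan*_input files report RPM directly:)

    from pathlib import Path

    for fan in sorted(Path("/sys/class/hwmon").glob("hwmon*/fan*_input")):
        rpm = int(fan.read_text())
        print(fan, "stopped" if rpm == 0 else f"{rpm} rpm")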