The whole article reads like the excuses of someone with a vested interest in discrediting evidence of their favorite brand's poor performance. I don't think the takeaway from the data provided by Backblaze is "I can expect a failure rate of exactly 1.231971% if I buy brand X's hard drives." The end-user-useful conclusions are things like "HGST's drives are the best" and "6TB drives are less reliable than 4TB drives right now."
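To make the false-precision point concrete, here is a minimal sketch of how an annualized failure rate comes out of fleet data, assuming Backblaze's published definition (failures per drive-year, expressed as a percentage); the failure and drive-day counts below are invented for illustration:

    import math

    def annualized_failure_rate(failures, drive_days):
        """Annualized failure rate as a percentage: failures per drive-year."""
        drive_years = drive_days / 365.0
        return 100.0 * failures / drive_years

    def afr_confidence_interval(failures, drive_days, z=1.96):
        """Rough ~95% interval treating the failure count as Poisson-distributed."""
        drive_years = drive_days / 365.0
        rate = failures / drive_years
        stderr = math.sqrt(failures) / drive_years  # Poisson std dev ~= sqrt(count)
        return 100.0 * (rate - z * stderr), 100.0 * (rate + z * stderr)

    # Made-up numbers for one hypothetical drive model:
    # 45 failures over ~1.33 million drive-days of service.
    failures, drive_days = 45, 1_330_000
    print(f"AFR: {annualized_failure_rate(failures, drive_days):.2f}%")
    low, high = afr_confidence_interval(failures, drive_days)
    print(f"~95% interval: {low:.2f}% to {high:.2f}%")

With those made-up numbers the point estimate is about 1.23%, but the interval spans roughly 0.9% to 1.6%, so only the broad comparisons between models are meaningful, not the sixth decimal place.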
Sure, all of the factors listed in the criticism may play a role in the failure rates (except the external-enclosure bit, since (a) the majority of the "shucked" drives were 3TB, and (b) they've outgrown that practice). But they have only the weakest of justifications for believing that those factors vary systematically across the manufacturers. And indeed, even if those factors did vary systematically, we'd still get the right answer by drawing the more general conclusions. For example, if the vibrations in the Seagate-only enclosures are greater than the vibrations in the HGST-only enclosures, that can only be because the HGST drives are better and vibrate less. Or alternatively, maybe the pods all vibrate the same, but HGST is better because it is more resistant to vibration.
True, and what those criticisms actually show is that Backblaze's data is highly relevant for the average consumer.
I regularly buy external HDDs, rip them out and put them into desktops and laptops, put them back in different enclosures, and so on. As a result, my HDDs experience a lot of movement and extreme temperatures (e.g. being left in the trunk of a car on a hot summer day). It's good to know which models are the most likely to survive such abuse in the long term.
Thanks for this. Although it makes some important points, it reads as if the author is annoyed that Backblaze's data from tens of thousands of drives is getting so much press compared to the rather useless single-drive reviews published by sites like TweakTown.
It's also a bit disingenuous to criticize Backblaze's methodology when you know that a "comprehensive study" under more controlled conditions will NEVER actually happen with the sample size necessary to draw conclusions.
Stress testing is a valid methodology for determining reliability - e.g., car makers crash their cars into walls at high speed to make sure they are safe, or use a robot to push the brake pedal a million times to see when it fails - so they hardly deserve criticism for pushing the drives hard. More information for the consumer is a good thing.
I read the same thing a year ago and came away actually upset at the TweakTown article. Off the top of my head, I remember some of the complaints being that the drives were subjected to abnormal amounts of heat and that they were consumer-level drives.
I remember a study Google did on hard drive reliability, and it seemed to show that heat had little to no effect on failure rates. I also don't regard consumer-level as a bad thing. As a consumer, I kind of want to know which drives are built to withstand abuse better. All drives fail; which drives fail more, and at what cost?
The TweakTown article did talk about temperature, and I think you were right to feel they were being silly with that. Temperature MAY correlate with failure, but Backblaze found it did not do so within the ranges they actually see in their environment - something they appear to have more than enough data to compute.
Google's study some time ago found that temperature either didn't correlate with failures or, in the ranges they ran their machines, had an inverse correlation with failure. It would appear to be one problem disk manufacturers have largely surmounted.
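For what it's worth, that kind of check is straightforward once you have per-drive temperature and failure data. A minimal sketch of the idea, with invented column names and numbers (a real analysis would also control for drive age and model):

    import pandas as pd

    # Hypothetical per-drive summary: average reported SMART temperature (deg C)
    # and whether the drive eventually failed. Column names and values are made up.
    drives = pd.DataFrame({
        "avg_temp_c": [24, 27, 31, 35, 38, 41, 26, 33, 29, 37],
        "failed":     [0,  0,  1,  0,  1,  0,  0,  0,  1,  0],
    })

    # Bucket drives into temperature bands and compare the failure fraction per band.
    drives["temp_band"] = pd.cut(drives["avg_temp_c"], bins=[20, 30, 35, 45])
    print(drives.groupby("temp_band", observed=True)["failed"].agg(["count", "mean"]))

If the failure fraction is flat across the bands drives actually run in, that matches what Backblaze and Google reported.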
The TweakTown article is a straight hit piece. I'm not really sure what the motivation would be. The writing is so sloppy and negative that it's hardly compelling.
I wasn't convinced by their argument that Backblaze's early drive failures (in the first week or what have you) can be explained by their purchasing methods. My understanding is that they still see these well after they stopped buying from Costco, etc.
I've noticed this to an extreme degree on HN lately to the point that I upvote posts I disagree with because they present a valid point. The only reason I can see for it is that they have a different opinion from most HN readers.
Like this, for example: http://www.tweaktown.com/articles/6028/dispelling-backblaze-...