My memory of floppy disks was having to write multiple copies of the same files across different disks, because odds were one of them would fail. It was particularly painful if you had to zip any files up, since odds were you'd end up with a failed CRC.

I was a relatively early adopter of CD writers because I was sick of floppy disks failing.

I’ve also got a stack of 3” floppies (like what was used for the Amstrad CPC and Nintendo Famicom) that don’t work either.



I remember how my blood would run cold when I heard the drive making that distinctive noise, re-reading the same sector over and over because it couldn't get a clean read.


A few years back I was doing some cleanup of old media, checking what was on everything and what I could toss. Of the hundreds of floppies, I had a file failure rate of about 5%. However, my CD/DVD failure rate was around 22%, including factory-stamped discs.

Now, true, a lot of the dud floppies had been filtered out back in the day. But I was shocked at how well they had fared over the long term vs optical media.


Archivers of that time had an interesting dilemma. If you're compressing a bunch of files, it's more effective to concatenate them all first and then compress the resulting bundle (like .tar.gz does), since the compression algorithm can then deduplicate across file boundaries. But if you do that, damage to any part of the archive can corrupt everything from that point on, which is especially likely when splitting it across multiple floppies. So DOS archivers generally defaulted to compressing each file first and then bundling the results. RAR was an interesting case in that it let the user decide on a case-by-case basis between regular and "solid" (pre-bundled) archives.
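
To make the trade-off concrete, here's a minimal sketch comparing per-file compression against "solid" compression using Python's zlib. The file names and contents are made up for illustration; the point is just that two files sharing content compress better when bundled first:

    import zlib

    # Hypothetical files with a lot of shared content, as on a typical disk.
    files = {
        "readme.txt": b"The quick brown fox jumps over the lazy dog. " * 40,
        "notes.txt":  b"The quick brown fox jumps over the lazy dog. " * 40,
        "data.txt":   b"Completely different content, repeated here. " * 40,
    }

    # Per-file (non-solid): compress each file on its own, then bundle.
    # Damage to one member leaves the others recoverable.
    per_file = sum(len(zlib.compress(data, 9)) for data in files.values())

    # Solid: concatenate first, compress once. The redundancy shared between
    # readme.txt and notes.txt gets deduplicated, but one bad sector can
    # take out everything after it in the stream.
    solid = len(zlib.compress(b"".join(files.values()), 9))

    print(f"per-file: {per_file} bytes, solid: {solid} bytes")

On input like this the solid archive comes out noticeably smaller, because the second copy of the repeated text compresses to almost nothing: exactly the win, and the fragility, described above.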


I remember buying boxes of floppies hoping that I'd get one that would be error-free.



