
His Nov. 2009 blog entry on ZFS data deduplication is a pretty good primer:

http://blogs.sun.com/bonwick/en_US/entry/zfs_dedup

P.S. There are also the 2008 pics with Linus. I suppose there is no further story to go with those ...



IMHO, from my limited work in compression, all that dedup stuff was a bit hyped. And BTW it is older than ZFS; Plan 9's network storage had the same feature. But alignment kills the search for matching chunks of data, and "generally best left to the application" sounds like a cop-out.
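
To make the alignment point concrete, here is a minimal Python sketch (the block size and the generated data are just illustrative assumptions, not anything from ZFS itself): checksumming fixed-size blocks only finds duplicates that land on the same block boundaries, so prepending a single byte shifts every block and kills every match.

    import hashlib, random

    BLOCK = 4096  # illustrative fixed block size, as in block-level dedup
    rng = random.Random(0)

    def block_hashes(data: bytes) -> set:
        """Checksum every aligned fixed-size block of the input."""
        return {hashlib.sha256(data[i:i + BLOCK]).hexdigest()
                for i in range(0, len(data), BLOCK)}

    original = bytes(rng.randrange(256) for _ in range(16 * BLOCK))  # 64 KiB sample
    shifted = b"\x00" + original   # same content with one byte prepended

    common = block_hashes(original) & block_hashes(shifted)
    print(len(common))  # 0: the one-byte shift misaligns every block, so nothing matches

Content-defined chunking avoids this by cutting chunk boundaries based on the data itself rather than fixed offsets, which is one reason it is often left to backup or application software.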


It's the ease of use, speed, and very large compression window that make dedup better than traditional compression.
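
A rough sketch of the window argument, again with made-up block sizes and data: a traditional compressor like zlib can only match within its 32 KiB window, while a table of block checksums effectively spans the whole data set, so it finds an identical block no matter how far apart the copies are.

    import hashlib, random, zlib

    BLOCK = 4096  # illustrative block size
    rng = random.Random(1)

    chunk = bytes(rng.randrange(256) for _ in range(BLOCK))        # one 4 KiB block
    filler = bytes(rng.randrange(256) for _ in range(64 * 1024))   # 64 KiB between copies
    data = chunk + filler + chunk   # the duplicate block is more than 32 KiB away

    # zlib's match window is 32 KiB, so it never sees the second copy of `chunk`
    print("zlib:", len(data), "->", len(zlib.compress(data, 9)), "bytes")

    # a table of block checksums spans the whole data set, so the far duplicate is found
    seen, duplicates = set(), 0
    for i in range(0, len(data), BLOCK):
        digest = hashlib.sha256(data[i:i + BLOCK]).digest()
        if digest in seen:
            duplicates += 1
        else:
            seen.add(digest)
    print("duplicate 4 KiB blocks found:", duplicates)  # 1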



