
Is there any advantage at all to BCD over the int number of cents? Either approach gives exact precision but requires a marker to say how many digits to the right of the decimal. Fixed-point decimal is vastly faster for calculations. BCD requires weird carry calculations, is less memory efficient per digit stored, and requires a choice between truly awful memory density (one digit per byte) or using bitwise operations to do nibble-level addressing.
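To make the cost concrete, here is a minimal sketch of what packed-BCD addition involves compared with integer cents. The encoding choices (two digits per byte, most-significant byte first, equal-length operands) are assumptions for illustration, not any particular system's format:

```python
def bcd_add(a: bytes, b: bytes) -> bytes:
    """Add two equal-length packed-BCD numbers (two decimal digits
    per byte, most-significant byte first), propagating the decimal
    carry nibble by nibble."""
    result = bytearray(len(a))
    carry = 0
    for i in range(len(a) - 1, -1, -1):   # least-significant byte first
        for shift in (0, 4):              # low nibble, then high nibble
            da = (a[i] >> shift) & 0xF
            db = (b[i] >> shift) & 0xF
            carry, digit = divmod(da + db + carry, 10)
            result[i] |= digit << shift
    if carry:
        raise OverflowError("BCD overflow")
    return bytes(result)

# $19.99 + $0.50 in packed BCD: two nested loops, masks, and shifts...
total_bcd = bcd_add(bytes([0x19, 0x99]), bytes([0x00, 0x50]))

# ...versus the same sum as integer cents: one machine add.
total_cents = 1999 + 50
```

The BCD version needs per-digit carry logic and bitwise nibble extraction for what the integer representation does in a single hardware instruction, which is the performance and complexity gap described above.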

Actually, the more I think about it, the more awful it gets. I'll go ahead and assert that the only reason ever to consider BCD is compatibility with legacy systems that use it, and even then you'd only want to use it at the edges of the codebase where the system interfaces live.



Not really, other than compatibility. When the previous poster said "financial applications," they probably meant "mainframe applications." BCD is still popular on z/OS because it's a first-class format there, and even editors know how to handle files that contain BCD values.



