It's difficult to fathom why the fashion for these passed, because such systems provide so many advantages. A good example is the Spanish system of division into 34, which permits straightforward further division into not only 2 but also 17 pieces - a case very difficult to handle with the modern restrictive, awkward and inconvenient decimal systems.
It saddens me in some ways that the UK currency decimalized before mass computerization of record-keeping took off. I think I would have liked to live in a world where database currency columns needed to support pounds, shillings and pence (keeping in mind that pence could be divided into quarters, of course).
Dealing with Olde English currency units (and other non-decimal currency units) may have been "a fun challenge" for programmers... but it does NOT sadden me that programmers have seldom had to deal with them! It would have led to a whole world of hellish pain. Have we not suffered enough pain as it is, what with character encoding, timezone handling, Y2K-incompatible dates, spatial coordinate / projection systems, etc.?
I'm not even sure there'd be much to do: just store everything in sixteenths of a farthing and format on output, much as people do with tenths of a penny today. If anything the numbers are actually slightly more convenient for machines with no divide instruction.
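A minimal sketch of that scheme (the constant names and the `to_lsd` helper are my own invention; the ratios - 4 farthings to the penny, 12 pence to the shilling, 20 shillings to the pound - are the historical ones):

```python
# Store amounts as a single integer: sixteenths of a farthing.
SIXTEENTHS_PER_FARTHING = 16
FARTHINGS_PER_PENNY = 4
PENCE_PER_SHILLING = 12
SHILLINGS_PER_POUND = 20

UNITS_PER_PENNY = SIXTEENTHS_PER_FARTHING * FARTHINGS_PER_PENNY  # 64
UNITS_PER_SHILLING = UNITS_PER_PENNY * PENCE_PER_SHILLING        # 768
UNITS_PER_POUND = UNITS_PER_SHILLING * SHILLINGS_PER_POUND       # 15360

def to_lsd(units: int):
    """Split an integer amount (sixteenths of a farthing) into
    (pounds, shillings, pence, farthings) for display."""
    pounds, rem = divmod(units, UNITS_PER_POUND)
    shillings, rem = divmod(rem, UNITS_PER_SHILLING)
    pence, rem = divmod(rem, UNITS_PER_PENNY)
    farthings = rem // SIXTEENTHS_PER_FARTHING
    return pounds, shillings, pence, farthings

# e.g. £2 3s 7d plus one farthing, as a single integer:
amount = (2 * UNITS_PER_POUND + 3 * UNITS_PER_SHILLING
          + 7 * UNITS_PER_PENNY + 1 * SIXTEENTHS_PER_FARTHING)
print(to_lsd(amount))  # (2, 3, 7, 1)
```

All the arithmetic is integer addition and `divmod` on the way out, which is the point: no fractional currency type needed, decimal or otherwise.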
It's only recently (2001) that US stock prices moved to decimal from the previous eighths of a dollar. The origin of this was the Spanish "pieces of eight", but I'm not sure why it persisted so long. Maybe because it kept the spread at a minimum of $0.125?