For pretty much all 8-bit home computers, cycle-correct emulation is essential: without it most modern scene demos simply don't work, and neither do a lot of old games (although those are usually more forgiving).
In later computer architectures, hardware components became more and more decoupled from each other, running on separate clocks and buses with caches and buffers in between, all of which makes timing less predictable but also less important to emulate. That gives the emulator much more slack when it comes to "synchronicity" (which, ironically, makes modern computer systems "easier" to emulate than older ones - at least where correct timing is concerned).
But 8-bit home computers (and also the early 16-bit systems like the Amiga and Atari ST) were essentially a single 'mega-chip': everything ran off the same master clock and all chip timings were deterministic, which was and is exploited by software.
Yes, but they're so slow that it's trivial to emulate them.
EDIT: That's why I pointed out the PS2. It was the first system that was both fast enough and dependent enough on precise timing that its emulation ran into huge issues.
Writing an Apple2e emulator, I wouldn't say it's trivial :-/ Proper video decoding is not exactly easy and neither is speaker emulation and neither is floppy disk emulation.
It's easy if you're looking at 99% accuracy. But if you aim at 99.9% it's a different story.
Plain and simple: there's no fully accurate Apple2e emulator so far, and it's not like nobody has given it a try in the last 20 years...