A common complaint about modern Atari retro systems is that they are laggy, and I think it's ultimately because they are all emulating the original hardware.
It seems crazy to me that modern hardware emulating 50-year-old hardware can't match the performance characteristics. What could the problem be?
In modern systems everything is buffered, often far too much at each individual stage, let alone in aggregate.
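To make the "in aggregate" point concrete, here's a rough back-of-the-envelope sketch. The stage names and millisecond figures are illustrative assumptions, not measurements; the point is just that several individually tolerable buffers stack into multiple frames of delay:

```python
# Rough latency budget for an emulated console on a modern stack.
# All figures below are illustrative assumptions, not measurements.
FRAME_MS = 1000 / 60  # ~16.7 ms per frame at 60 Hz

stages = {
    "USB controller polling":       8,             # ~125 Hz polling, worst case
    "OS input event queue":         FRAME_MS / 2,  # input sampled once per frame, half a frame on average
    "emulator frame buffering":     FRAME_MS,      # emulator presents one frame behind input
    "GPU swap chain (double buf)":  FRAME_MS,      # one queued frame before scan-out
    "TV/display processing":        20,            # HDMI scaling/processing, even in "game mode"
}

total = sum(stages.values())
for name, ms in stages.items():
    print(f"{name:30s} {ms:5.1f} ms")
print(f"{'total':30s} {total:5.1f} ms  (~{total / FRAME_MS:.1f} frames)")
```

Swap in your own numbers; the shape of the result is the same: each buffer looks harmless on its own, but the sum is easily two to four frames behind the button press.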
If it were not for the games industry it would be much worse; they are the only force incentivizing keeping latency within somewhat acceptable bounds. Things like FreeSync can help a lot.
Not directly related to Atari, but there's always going to be additional lag on modern "mini consoles" since most of them force you to go through HDMI.
If you grew up playing Punch-Out on an old NES connected to the TV through the RF module, you'll know how brutal even 20-30 ms of added latency can be when you play the same game on the NES Classic Edition. At 60 Hz that's one to two whole frames, which matters a lot in a game built around frame-tight dodge windows.
It's not a 100% digital thing. Timing has to be correct for the display to work properly, at the very least; full hardware emulation includes emulating the analog stuff too.
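For a sense of what "timing has to be correct" means in practice, here's a minimal sketch of the lockstep loop a 2600-style emulator needs. The NTSC constants are real (the 2600's CPU famously "races the beam"); `cpu_step` and `tia_step` are hypothetical placeholders for a real emulator core:

```python
# Minimal sketch: on the Atari 2600 the game code changes video registers
# mid-scanline, so the emulator must keep the CPU and the TIA video chip
# in exact lockstep rather than rendering whole frames at its leisure.
COLOR_CLOCKS_PER_SCANLINE = 228   # TIA color clocks per NTSC scanline
SCANLINES_PER_FRAME = 262         # scanlines per NTSC frame
TIA_CLOCKS_PER_CPU_CYCLE = 3      # the TIA clock runs at 3x the CPU clock

def run_frame(cpu_step, tia_step):
    """Run one video frame, interleaving CPU and video chip work.

    cpu_step() executes one CPU cycle; tia_step() advances the beam by
    one color clock. Both are hypothetical stubs, not a real API.
    """
    for _ in range(SCANLINES_PER_FRAME):
        for clock in range(COLOR_CLOCKS_PER_SCANLINE):
            if clock % TIA_CLOCKS_PER_CPU_CYCLE == 0:
                cpu_step()   # game code may rewrite TIA registers here...
            tia_step()       # ...so the beam must advance in exact lockstep
```

Get that interleaving even slightly wrong and sprites land on the wrong scanline; that's the sense in which the emulator is modeling analog beam timing, not just digital logic.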