
I'm actually surprised by this; I was under the impression that the CPU only takes a ~5-10% hit and that you can pass the GPU through directly to a VM. I've never actually done it, but I expected VMs to be fine for gaming these days. Where does it fall apart?
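(For context, GPU passthrough on Linux is usually done with VFIO plus a libvirt `<hostdev>` entry in the guest's domain XML. A minimal sketch — the PCI address 0000:01:00.0 is a placeholder; you'd substitute your card's actual address from `lspci`:)

```xml
<!-- libvirt domain XML fragment: hand the host GPU at PCI address
     0000:01:00.0 (placeholder) to the guest via VFIO -->
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
  </source>
</hostdev>
```

With `managed='yes'`, libvirt detaches the device from its host driver and binds it to vfio-pci when the guest starts.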


VMs are fine for basic games like Minecraft, which aren't resource-intensive. It falls apart when you play anything that is. Even if you have, say, 32GB of RAM and a good CPU and GPU, the fact that you have to emulate anything at all makes it noticeably laggier. You can laud the fact that the GPU gets passed to the VM, but since we are emulating, you will notice it.


VMs aren't emulating anything. Services like Stadia and GeForce Now explicitly rely on virtualized gaming machines to be able to scale. Nvidia even has a technology (vGPU) for slicing up large graphics cards among multiple client VMs.


The Xbox One and Series X/S actually run games inside a VM as well:

https://wccftech.com/xbox-one-architecture-explained-runs-wi...


A service that is basically a one-way video-conference call with a remote system isn't exactly a benchmark for high-performance, low-latency gaming.



