> I am willing to bet that Nvidia's and AMD's GPU units have floodgates of vulnerabilities along the lines of Spectre,
100%. I'm not anybody important, but I've been saying this for years: as soon as I saw Spectre/Meltdown announced, my reaction was "lol, yeah, I bet GPUs are even worse; everyone is racing for performance, not timing-correctness, so they probably have zero hardening against that." I hadn't thought of the connection between Meltdown and that person's comment, but yeah, it's entirely possible they saw the potential for that or other security shenanigans.
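To make "timing-correctness" concrete: the primitive underneath this whole class of attack is embarrassingly small. You time one memory access and see whether it hit cache. A toy CUDA sketch of just that primitive (all names and structure are mine, not from any real exploit):

    // Toy sketch of the timing primitive behind cache side channels:
    // time a single load with clock64() and see whether it hit cache.
    // Whether the first access is actually "cold" depends on the device
    // and what cudaMemset left in L2, so treat the output as illustrative.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void probe(volatile unsigned int *buf, long long *cycles)
    {
        long long start = clock64();
        unsigned int v = buf[0];          // the access being timed
        long long stop = clock64();
        // Fold v into the result so the compiler can't drop the load.
        *cycles = (stop - start) + (v & 1);
    }

    int main()
    {
        unsigned int *d_buf;
        long long *d_cycles, h_cycles;
        cudaMalloc(&d_buf, sizeof(*d_buf));
        cudaMalloc(&d_cycles, sizeof(*d_cycles));
        cudaMemset(d_buf, 0, sizeof(*d_buf));

        for (int i = 0; i < 2; i++) {     // second run is likely cached
            probe<<<1, 1>>>(d_buf, d_cycles);
            cudaMemcpy(&h_cycles, d_cycles, sizeof(h_cycles),
                       cudaMemcpyDeviceToHost);
            printf("access %d: %lld cycles\n", i, h_cycles);
        }
        return 0;
    }

Every Spectre-family attack is a pile of cleverness built on top of exactly that measurement; if the hardware lets you take it, you're most of the way there.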
Multi-tenant or multi-privilege-level GPU use is likely a shitshow, so in a way it's actually a blessing that heavy GPGPU compute never really took off. I bet you can totally do things like leak the desktop or your browser windows to malicious WebGL, or leak one tenant's data to another on a shared vGPU. We live in a world where clients mostly run one GPGPU application at a time, plus the desktop, and that means there's not much there to leak.
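And the "leak the desktop" bet has a well-known mechanism: uninitialized VRAM. A sketch of that probe, assuming a driver that doesn't scrub allocations (plenty historically didn't; a modern driver may zero everything, in which case this prints nothing interesting):

    // Sketch of the classic uninitialized-VRAM probe: allocate device
    // memory, deliberately skip initialization, and read it back. On a
    // driver that doesn't scrub allocations, nonzero bytes are residue
    // from whatever last owned that memory (a framebuffer, say).
    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    int main()
    {
        const size_t n = 64u << 20;        // grab 64 MiB
        unsigned char *h = (unsigned char *)malloc(n);
        unsigned char *d;
        cudaMalloc(&d, n);                 // note: no cudaMemset
        cudaMemcpy(h, d, n, cudaMemcpyDeviceToHost);

        size_t nonzero = 0;
        for (size_t i = 0; i < n; i++)
            nonzero += (h[i] != 0);
        printf("%zu of %zu bytes are leftover data\n", nonzero, n);

        cudaFree(d);
        free(h);
        return 0;
    }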
(Although of course, it's always a little useless to speculate about massively counterfactual scenarios and assume everything would have gone the same... if multi-tenant/multi-app GPGPU compute had taken off, more attention probably would have been paid to multi-user security/hardening.)
Of course, the real game-over is if you can get the GPU to leak CPU memory (or get the GPU to make the driver stack leak kernel memory etc. via the CPU). That's bad even without multi-tenancy.
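For anyone who doubts the hardware path is even there: the GPU dereferencing plain host RAM over the bus is a sanctioned, everyday API. A minimal sketch using CUDA's zero-copy mapped memory; the nightmare scenario above is this exact physical path being reachable without the driver's blessing:

    // Minimal sketch of the GPU touching host RAM directly: zero-copy
    // mapped memory. The kernel dereferences an ordinary host allocation
    // over the bus. This is the blessed API; the worry is the same path
    // being reachable without the driver's consent.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void read_host(const int *host_ptr, int *out) { *out = *host_ptr; }

    int main()
    {
        // Required on older setups; a no-op on modern UVA systems.
        cudaSetDeviceFlags(cudaDeviceMapHost);

        int *h_val, *d_view, *d_out, result;
        cudaHostAlloc(&h_val, sizeof(int), cudaHostAllocMapped);
        *h_val = 0x1337;                    // lives in plain host RAM
        cudaHostGetDevicePointer(&d_view, h_val, 0);
        cudaMalloc(&d_out, sizeof(int));

        read_host<<<1, 1>>>(d_view, d_out); // GPU reads CPU memory
        cudaMemcpy(&result, d_out, sizeof(int), cudaMemcpyDeviceToHost);
        printf("GPU read 0x%x straight out of host RAM\n", result);

        cudaFreeHost(h_val);
        cudaFree(d_out);
        return 0;
    }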
Intel in particular may be more vulnerable to that sort of thing, since they have uniquely tight ties between the iGPU and the CPU. They come from a world where their dGPUs didn't exist and the iGPU was only ever a ring bus away from memory and the CPU... that legacy has apparently been a huge problem for the Xe/Arc drivers (there have been patches that produced 100x speedups just by fixing operations that allocated memory in the wrong place) now that the GPU is suddenly no longer super close. It wouldn't surprise me at all if AMD were more secure here, simply because they aren't working with an iGPU that's tied to the CPU that tightly.

https://www.pcworld.com/article/819397/intels-graphics-drive...
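You can feel a mild version of the "memory in the wrong place" cost from plain userspace, no driver internals required: time the same host-to-device copy from pageable vs pinned memory. The gap here is nowhere near the article's 100x (that was a driver pathology), but it's the same mechanism:

    // Sketch: time the same 256 MiB host-to-device copy from pageable
    // vs pinned (page-locked) host memory. Where an allocation lives
    // changes how the hardware can move it; a driver putting GPU-hot
    // data in the wrong pool entirely is this mistake at a larger scale.
    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    static float time_copy(void *dst, const void *src, size_t n)
    {
        cudaEvent_t t0, t1;
        float ms;
        cudaEventCreate(&t0);
        cudaEventCreate(&t1);
        cudaEventRecord(t0);
        cudaMemcpy(dst, src, n, cudaMemcpyHostToDevice);
        cudaEventRecord(t1);
        cudaEventSynchronize(t1);
        cudaEventElapsedTime(&ms, t0, t1);
        cudaEventDestroy(t0);
        cudaEventDestroy(t1);
        return ms;
    }

    int main()
    {
        const size_t n = 256u << 20;
        void *d, *pageable = malloc(n), *pinned;
        cudaMalloc(&d, n);
        cudaHostAlloc(&pinned, n, cudaHostAllocDefault);

        printf("pageable: %.2f ms\n", time_copy(d, pageable, n));
        printf("pinned:   %.2f ms\n", time_copy(d, pinned, n));

        cudaFreeHost(pinned);
        cudaFree(d);
        free(pageable);
        return 0;
    }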
It's super funny you bring that up; thanks for tickling that particular neuron. Great comment. Again, I'm just a rando who tech-watches for fun, but I agree 100%.