"Another neat trick: the Lumia 1020 actually creates two images every time you press the camera button in Nokia Pro Camera—a super high-res version for editing and archiving, and an oversampled 5-megapixel copy for easy sharing via email or social networks such as Twitter and Facebook."
> Doesn't WebGL essentially use root access to your GPU?
Only in badly-designed, non-conformant user agents. :)
The WebGL security issues have been FUD for at least a year now. The spec has essentially been fixed, drivers have become far more robust, and browsers have become much stricter in their validation code. WebGL is neither OpenGL nor OpenGL ES.
Examples: while the OpenGL ES spec is silent on buffer overflows, WebGL requires user agents to signal an error; while OpenGL ES doesn't specify the contents of freshly allocated textures, WebGL requires them to be zeroed out; and WebGL textures can never be created from non-origin-clean canvases.
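If it helps to see what that means in practice, here's a rough sketch exercising the first two guarantees (browser-only TypeScript; the shader, buffer sizes, and variable names are illustrative assumptions, not from any spec text):

```ts
// A minimal sketch, assuming a browser with WebGL 1.0 available.
const canvas = document.createElement("canvas");
const gl = canvas.getContext("webgl") as WebGLRenderingContext;

// Guarantee 1: a freshly allocated texture must read back as zeros,
// never as stale GPU memory left behind by another process.
const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 4, 4, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);
const fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);
const px = new Uint8Array(4 * 4 * 4);
gl.readPixels(0, 0, 4, 4, gl.RGBA, gl.UNSIGNED_BYTE, px);
console.assert(px.every((b) => b === 0)); // spec-mandated zero fill

// Guarantee 2: indices pointing past the end of the bound vertex buffers
// must raise INVALID_OPERATION instead of reading out of bounds.
function compile(type: number, src: string): WebGLShader {
  const s = gl.createShader(type)!;
  gl.shaderSource(s, src);
  gl.compileShader(s);
  return s;
}
const prog = gl.createProgram()!;
gl.attachShader(prog, compile(gl.VERTEX_SHADER,
  "attribute vec2 p; void main() { gl_Position = vec4(p, 0.0, 1.0); }"));
gl.attachShader(prog, compile(gl.FRAGMENT_SHADER,
  "void main() { gl_FragColor = vec4(1.0); }"));
gl.linkProgram(prog);
gl.useProgram(prog);

const vbo = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vbo);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(6), gl.STATIC_DRAW); // 3 vertices
const loc = gl.getAttribLocation(prog, "p");
gl.enableVertexAttribArray(loc);
gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);

const ibo = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, ibo);
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint16Array([0, 1, 9999]), gl.STATIC_DRAW);
gl.drawElements(gl.TRIANGLES, 3, gl.UNSIGNED_SHORT, 0);
console.assert(gl.getError() === gl.INVALID_OPERATION); // out-of-range index rejected
```

A raw OpenGL ES driver would be free to do anything with that index of 9999; a conformant WebGL implementation has to validate it against the bound buffers before the draw ever reaches the GPU.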
Except on Linux. As an anecdote: as recently as January, visiting any WebGL site that used shaders would reliably panic my kernel on NVidia hardware with the NVidia binary blob. I've since switched to Intel hardware, which is much more stable, but that also shows how much variance there is here.
For me to be comfortable with WebGL, we'll have to really train graphics card manufacturers to take security issues as seriously as web developers do. That isn't going to happen any time soon.
Huh, if you don't mind me asking, which video card? I've used Linux+NVidia pretty exclusively for work and never seen anything nearly as bad on Linux (I do run higher-end cards than the average PC, though).