Hacker News

Maybe SVG is not the best vector format after all. Maybe we need something simpler, where every item maps directly to GPU graphics calls; a binary encoding would also help.


GPUs don't have graphics calls anymore. At best they have vertex and fragment shaders. Converting a vector graphic into a good shader is not trivial compared to using the built-in texturing functions.

It gets even worse once you realize how limited older mobile GPUs are, or how incompatible the subsets of features they support at decent performance can be.


If you want something simpler that works with GPUs it needs to be exclusively built out of triangles and/or code that can operate on a single pixel independently of its neighbors.
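To make the "operate on a single pixel independently" idea concrete, here is a minimal sketch in Python of a shader-friendly primitive: a circle evaluated via a signed-distance function, where each pixel's value depends only on its own coordinates. The function names and the 16x16 grid are illustrative, not from any real graphics API.

```python
# A "GPU-friendly" primitive: each pixel is shaded independently,
# exactly the way a fragment shader would run. No neighbor access needed.

def circle_sdf(px, py, cx, cy, r):
    # Signed distance from point (px, py) to a circle of radius r at (cx, cy).
    # Negative inside the circle, positive outside.
    return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5 - r

def shade_pixel(px, py):
    # Per-pixel "shader": samples at the pixel center (+0.5, +0.5).
    d = circle_sdf(px + 0.5, py + 0.5, 8.0, 8.0, 6.0)
    return 1.0 if d <= 0.0 else 0.0  # 1.0 = inside, 0.0 = outside

# Every pixel is computed independently, so this loop is trivially parallel.
image = [[shade_pixel(x, y) for x in range(16)] for y in range(16)]
```

Because each pixel is independent, this maps directly onto GPU parallelism; anything that needs its neighbors (or global curve topology) does not.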

Resolution-independent formats are inherently anti-triangle, and by the time you've hit triangles you already have a target resolution in mind.

Or put another way, GPUs really don't like vector graphics in general. That's not what they're built for or good at.


GPUs like vector graphics just fine; rasterizing vector graphics is literally what GPUs are designed to do. It's infinitely scalable vector graphics that they tend to dislike.

Once you've subdivided your curves and such into triangles/vertices/etc., the GPU ends up being a lot happier about its existence.
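A rough sketch of that subdivision step, assuming a quadratic Bézier and uniform flattening (function names are illustrative): the CPU evaluates the curve at a fixed number of parameter values, and the resulting polyline vertices are what the GPU actually rasterizes.

```python
# Flattening a quadratic Bezier into line segments that a GPU can consume
# (as a triangle strip for a stroke, or a fan for a fill).

def quad_bezier(p0, p1, p2, t):
    # Direct evaluation of a quadratic Bezier at parameter t in [0, 1].
    u = 1.0 - t
    x = u * u * p0[0] + 2 * u * t * p1[0] + t * t * p2[0]
    y = u * u * p0[1] + 2 * u * t * p1[1] + t * t * p2[1]
    return (x, y)

def flatten(p0, p1, p2, n):
    # n segments -> n + 1 vertices; these become the GPU's triangle input.
    return [quad_bezier(p0, p1, p2, i / n) for i in range(n + 1)]

verts = flatten((0, 0), (50, 100), (100, 0), 8)
```

The key point for this thread: `n` is chosen up front on the CPU, before the GPU ever sees the geometry.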


GPUs are not designed for vector graphics, they are designed for triangles and triangles only. Triangles are a subset of vector graphics, but are almost never the subset that people mean when they say "vector graphics."

Specifically, vector graphics typically include curves, which GPUs just don't handle at all.

Once you've tessellated a curve into triangles you've already baked in a desired resolution. You can't have resolution-independent vector graphics in a GPU-friendly way.
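A small sketch of why the resolution gets baked in, using the standard flattening bound for a quadratic Bézier (segment count n >= sqrt(|p0 - 2*p1 + p2| / (8*tol)) to stay within `tol` of the true curve); the names here are illustrative. Zooming the same curve up 8x means it needs more segments for the same on-screen error, so the tessellation computed at one scale is wrong at another.

```python
import math

def segments_needed(p0, p1, p2, tol):
    # Segments required so a uniformly flattened polyline stays within
    # `tol` units of the true quadratic Bezier.
    ddx = p0[0] - 2 * p1[0] + p2[0]
    ddy = p0[1] - 2 * p1[1] + p2[1]
    dd = math.hypot(ddx, ddy)  # magnitude of the constant second derivative
    return max(1, math.ceil(math.sqrt(dd / (8.0 * tol))))

curve = ((0, 0), (50, 100), (100, 0))
lo_res = segments_needed(*curve, tol=0.5)                               # 1x scale
hi_res = segments_needed(*((x * 8, y * 8) for x, y in curve), tol=0.5)  # 8x scale
```

The 8x version needs more segments for the same half-pixel tolerance, which is exactly the "target resolution in mind" problem: the triangle mesh only matches one zoom level.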



