> This is a lot more concerning.

I'm not so sure that's problematic. Browsers probably just aren't a great platform for doing heavy XML processing at this point.

Preserving the half-implemented, frozen state of the early 2000s really doesn't serve anyone except those maintaining legacy applications from that era. I can see why they're pulling out the complex C++ code related to all of this.

It's the natural conclusion of XHTML being sidelined in favor of HTML5 some 15-20 years ago. The whole web services bubble, with its bloated namespace processing and all the other complexity that came with it, left behind a lot of gnarly libraries. The world has largely moved on since then.

From a security point of view it's probably a good idea to reduce the attack surface by moving to a Rust-based implementation. What use cases remain for XML parsing in a browser if XSLT support is removed? I guess some parsing from JavaScript, in which case you could argue that the usual JS-world solution of polyfills and e.g. WASM libraries provides a good-enough alternative or migration path.
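
A minimal sketch of what that migration path could look like: feature-detect the native engine and fall back to a polyfill. The native branch uses only standard browser APIs (DOMParser, XSLTProcessor, XMLSerializer); the polyfill module path and its xsltProcess export are placeholders, not a real package.

    // Feature-detect native XSLT support; fall back to a polyfill otherwise.
    async function transform(xmlText, xsltText) {
      const parser = new DOMParser();

      if (typeof XSLTProcessor !== "undefined") {
        // Native path: these are all standard browser APIs.
        const xml = parser.parseFromString(xmlText, "application/xml");
        const xslt = parser.parseFromString(xsltText, "application/xml");
        const processor = new XSLTProcessor();
        processor.importStylesheet(xslt);
        const fragment = processor.transformToFragment(xml, document);
        return new XMLSerializer().serializeToString(fragment);
      }

      // Fallback path: lazily load a WASM/JS XSLT implementation.
      // "./xslt-polyfill.js" and xsltProcess are hypothetical names.
      const { xsltProcess } = await import("./xslt-polyfill.js");
      return xsltProcess(xmlText, xsltText);
    }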



They don't reduce complexity. They translate C++ (static complexity) to JS (dynamic complexity).

Also, it isn't added complexity if XSLT lives in a third-party library with a well-defined interface.

The real problem is control. They gain control in two ways: they get more involved in the XML code base, and bad actors end up running in the JS sandbox.

That is why we have standards, though: to relinquish control through interoperability.



SVG is XML-based.
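
For instance, any script that builds or inspects SVG still exercises the browser's XML machinery; this snippet uses only standard DOM APIs:

    // SVG is parsed as XML, namespaces and all.
    const svgDoc = new DOMParser().parseFromString(
      '<svg xmlns="http://www.w3.org/2000/svg"><circle r="5"/></svg>',
      "image/svg+xml"
    );
    console.log(svgDoc.documentElement.namespaceURI);
    // -> "http://www.w3.org/2000/svg"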



