WebGPU hits 40% availability 2 weeks after Chrome releases support (web3dsurvey.com)
128 points by bhouston on May 17, 2023 | 70 comments


WebGPU hits 40% availability in the subset of websites that intentionally embed a WebGPU-testing iframe running bleeding-edge JS. The collector.js script itself uses JS features that aren't available in all browsers, so it will simply collect no information about those browsers, or about browsers that don't run its JS at all.


Do you have better stats on its adoption? I am basing this on collecting capabilities from roughly 30K browsers per day, which is at least decent. You can argue it is biased, but that is what the breakdowns are for.

Also, what features in collector.js do not have wide browser support? If you let me know, I can address it. Is the number of browsers that cannot run this script statistically significant, say, more than 1% of devices?

I do have error reporting on the script, so if the script runs, I catch all the errors. I guess what I am missing is the case where the script fails to parse in the first place.

BTW, I just pushed a new version of collector.js that uses ES6 rather than ES2020 as the TypeScript compile target.
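
For reference, that amounts to a one-line tsconfig.json change ("ES2015" is the canonical name for ES6; the snippet is illustrative):

    {
      "compilerOptions": {
        "target": "ES2015"  // was "ES2020"
      }
    }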


You need error reporting as a separate script injected early into the HTML page, before your script is ever loaded. This error-reporting script should be as bare-bones and backwards-compatible as possible.

You can grab mine from our site, if you like.


Sure. Which site is that? I can do this easily on the iframe embed.


Here you go, this is taken from production, copied from the browser's "view source" feature so there are a few dynamic values hard-coded (they're obvious, though): https://gist.github.com/mqudsi/2f570cf58d7d293ba27217a308659...

It's from a checkout page (the most important page where fancy JS isn't worth a single lost sale) so it goes to great pains to catch all errors and not be itself a source of JS errors. It intercepts errors and POSTs them to the server with a good-faith attempt at getting the browser, faulting script, line number, column number, error message, and stack trace.

It was actually in a <script> tag and not in an external js, but that shouldn't matter too much. You can see the comments and code to support actually ancient browsers (predating IE6 and Firefox 24).
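
In outline, the approach is a top-level window.onerror handler that POSTs what it can. A simplified sketch, not the linked gist itself; the /js-error endpoint and field names here are illustrative:

    window.onerror = function (message, source, line, column, error) {
      try {
        var xhr = new XMLHttpRequest();
        xhr.open('POST', '/js-error', true);
        xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
        xhr.send('message=' + encodeURIComponent(message || '') +
            '&source=' + encodeURIComponent(source || '') +
            '&line=' + line + '&column=' + column +
            '&stack=' + encodeURIComponent((error && error.stack) || '') +
            '&ua=' + encodeURIComponent(navigator.userAgent));
      } catch (e) {
        // Never let the error reporter itself become a source of errors
      }
    };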


>Do you have better stats on its adoption?

Chrome (including the original Chromium and forks like Edge and Brave) holds 80-90% of the browser market share, and the vast majority of those installations will either auto-update or be manually updated on a regular basis.

The logical conclusion, then, is that whatever features Chrome introduces will be widely available for common use within a short period of time.

This is the positive side to a monopolized browser market: Whatever Chrome supports is what the commons can support.


> Chrome (including the original Chromium and forks like Edge and Brave) holds 80-90% of the browser market share

https://gs.statcounter.com/browser-market-share: worldwide, Chromium-family browsers are at around 76%.

Naturally it depends on where you are: by Statcounter’s figures, Australia and USA are both under 62%, and India is over 95%.

(Note that Statcounter is far from reliable: its data comes from trackers that are blocked by most ad/content blockers, which means it is likely to significantly undercount Firefox especially.)

And two other factors to consider here:

• On a feature like this, browser support is only one of the gating factors: you also need graphics card/driver support (see the sketch after this list). For a long time, WebGL support was way lower than the browser support charts suggested, for this reason. (Subpoint: WebGPU is not a cohesive whole; devices may support some features and not others, which will always make life more difficult, and that's about the graphics card hardware and driver, not the browser.)

• At this time, Chromium hasn’t shipped WebGPU on all platforms—only desktop platforms. Mobile platforms will lag, and I imagine that low-end devices will just not support WebGPU for many years to come.
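
To illustrate the first point, merely having navigator.gpu doesn't mean requestAdapter() will succeed. A minimal probe might look like this (a sketch; the status strings are made up):

    async function webgpuStatus() {
      // Browser doesn't expose the API at all
      if (!('gpu' in navigator)) return 'no-api';
      // API present, but the GPU/driver may be blocklisted or unsupported
      const adapter = await navigator.gpu.requestAdapter();
      if (!adapter) return 'no-adapter';
      // Even now, individual features and limits must be checked per adapter
      return 'available';
    }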


It is not a positive but a negative side of a monopolized browser market. Unrestrained power to define communication standards and protocols will eventually be used for malicious purposes, and there will be no players in the market to oppose it.

Use Firefox, the last bastion of the Free Web.


Firefox is working on implementing WebGPU support too.


It's not about WebGPU. It's about power to unilaterally define web standards.


Why bring that up with respect to a technology that was collaboratively designed by all the major browser vendors?



[flagged]


I too would be more than happy for Mozilla to be replaced, but thinking that Google is any less cancerous is simply absurd. In fact, one of the biggest problems with Mozilla (but not the only one) is that they are effectively a Google vassal that can't afford to challenge its master. As for all the other non-Google-reliance problems, you can bet that as another SV tech corp, Google has them all too and then some.


Yeah right. IE6 'just worked' too and you didn't have to worry about compatibility if you used it.


As someone else pointed out, you're overestimating Chrome/ium's market share.

Regardless, after the web.dev/baseline announcement, I looked at Browserslist and the analytics for one of our sites, and it is shocking how many people are not using the last two versions of evergreen browsers. There is a long tail of browser versions in those stats.

https://browsersl.ist
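
For context, the query many projects ship with is tiny compared to that long tail. An illustrative .browserslistrc:

    # "last 2 versions" alone excludes the long tail that
    # shows up in real analytics
    last 2 versions
    > 0.5%
    not dead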


That idea usually works, but it doesn't apply to WebGPU, which has specific hardware and software requirements.


Hence anyone doing Web development should update their CV to "ChromeOS developer".


If you only use JS to collect statistics, you are only collecting statistics about JS users. And that is not everyone. I'm not just talking about weird nerds like myself who intentionally disable it, but also the underserved billions of the world without the latest smartphone.

It's a fatal flaw in pretty much all modern web stats collection, one that makes it seem like JS, and bleeding-edge JS support, are a far bigger share of the pie than they really are. And any time it's brought up, those same people say it's not worth collecting information about all hits, say via the webserver logs, because the difference is small. But they never connect the dots...


What do you mean by "the underserved billions of the world without the latest year's smartphone"?

Sure, there is a small set of people who choose not to run JS, or use something like lynx that doesn't support it, but almost every browser version released in the last 20 years, certainly every graphical one, has by default good enough JS support to gather basic statistics and send them back to the server.
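
As an illustration, the maximally-compatible version of "send stats back" is just an image request, which works in essentially every graphical browser ever shipped (the /collect endpoint is made up):

    var beacon = new Image();
    beacon.src = '/collect?screen=' + screen.width + 'x' + screen.height +
        '&ua=' + encodeURIComponent(navigator.userAgent);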


Look at collector.js: within the first dozen characters it's already using ECMAScript 6 features like promises. At the very least you need to change that "last 20 years" to the last 5 years.


> Look at collector.js: within the first dozen characters it's already using ECMAScript 6 features like promises.

According to caniuse, promises are supported by Chrome since v33, by Edge since v12, by Firefox since v29, and by Safari since 7.1.

https://caniuse.com/promises

Both Chrome 33 and Firefox 29 were released in 2014.


You are quietly omitting IE, which doesn't have this at all. It doesn't matter when the others were released; at that time IE still had a non-negligible market share.
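
A defensive guard along these lines would let a collector count, rather than silently drop, browsers without promises (a sketch; runCollector and /collect are illustrative names):

    if (typeof Promise === 'undefined') {
      // No ES6 promises (e.g. any IE): still record the visit as unsupported
      new Image().src = '/collect?supported=0';
    } else {
      runCollector();
    }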


I get what you're trying to say, but a lot of these ES6 features have been supported for longer than you'd think, and are near-universal.

Near-universal: not universal. But people still running very old devices or browsers are not going to have a fun time on many sites regardless, and WebGPU is not likely to be used casually on random sites: it will be used for the same stuff that WebGL and canvas are today, mostly special-purpose things like games, some visualisations, etc. – the sort of stuff that very old and slow devices will have trouble with in the first place.


> it will be used for the same stuff that WebGL and canvas are today

So for fingerprinting, got it.


Promises have been supported since 2014 in both Chrome for Android and Safari on iOS. Even pretty old phones generally run web browsers more recent than that — and that's not "the last five years," that's nearly twice as long.


That's if you go by the dates when Google's and Apple's browsers first supported it. But if you tried to use ECMAScript 6 features back in 2014? Hah! The support would not have been there. 5 years is a realistic number.


You argued that using these features today blocked "billions" of people using old smartphones. Promises have been supported by smartphone browsers for over nine years. I do not think that using promises today prevents anywhere near billions of people from running your code, nor do I think that "5 years is a realistic number" for a feature that has been supported for 9. We're discussing whether using promises today impacts a significant number of web users.


> you are only collecting statistics about JS users. And that is not everyone.

Okay. I can treat anyone who has JS disabled (and I know how to detect that in the iframe) as also not having WebGPU available, because WebGPU is exclusively a JS API: if JS is disabled, that effectively disables WebGPU too. I'll add that.
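
One common way to count those visitors is a noscript image beacon inside the iframe, e.g. (the endpoint and parameters are illustrative):

    <noscript><img src="/collect?js=0&webgpu=0" alt="" width="1" height="1"></noscript>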


Users with an older smartphone on a slow connection could fail to run the script even if it's not disabled, if it's a separate network request.


> Users with an older smartphone on a slow connection could fail to run the script even if it's not disabled, if it's a separate network request.

Hmm... technically correct, but I wonder if this is statistically relevant? Remember that I am currently collecting 30K browser samples per day. I'd have to miss 300 samples in a day this way for it to cause an error of 1%.


From what kind of sites? The kind that know and care about WebGPU support enough to include your iframe? That's some heavy selection bias, if my assumption is true. Not a lot of low-end smartphones visit the kind of sites that kind of person runs.


I list the "supporters" of https://web3dsurvey.com prominently on the homepage.


That's what I figured. So that supports the idea that it's only measuring a very skewed set of visitors from WebGL, JS, and gfx-designer type sites. I'd put the iframe on my very non-JS, non-WebGL, non-gfx sites to skew the other way, but I don't put JS scripts on my sites.


But then it wouldn't report anything, and assuming those cases are roughly representative of the cases where it does work, it doesn't really skew the results.


It would be great if browser makers published their own statistics on this kind of thing. They presumably have the real data. But they don't share it, so the next best thing is for a community of interested developers to try to collect it themselves. Of course that's not as good as the real data that the browser makers have, but what better option is there if you need data like this?


WebGPU availability factoid is actually just statistical error. Noscript Georg, who lives in a cave and browses over 10 sites a day using Emacs, is not an outlier and should have been counted.


40% seems disingenuous when it's essentially 0 on mobile...


I always feel there's probably a lot of selection bias involved in these sorts of figures.

If you build a mobile first website, most traffic you're going to get is mobile since mobile first websites suck on desktop.

If you build a website that doesn't perform well on mobile, mobile users are going to bounce and your traffic is going to make it look like desktop is the primary paradigm.

FWIW, something like 60-70% of the traffic to my sites is from desktop users. They work on both desktop and mobile, but they just cater to the sort of people who use desktop.

It's really hard to get objective measures.


> I always feel there's probably a lot of selection bias involved in these sorts of figures.

I am aware of this and if the collector gets more widely adopted I can get better numbers -- I only created this site in March 2023.

Luckily, I do show the platform breakdowns as well.


> Chrome is a monopoly

Oh wow, not news.


I want the ANE+Android equivalent, along with haptics, to also come to all browser runtimes... it just feels like kneecapping websites.


Haptics were part of the web spec, but scammers abused them (making the phone vibrate to simulate a virus alert or whatever), so the feature got disabled in most browsers, I believe.

It's officially supported on all platforms except Safari (big surprise there), but for browsers supposedly supporting the spec on MDN (https://developer.mozilla.org/en-US/docs/Web/API/VIBRATION_A...), I'm not getting any vibrations out of the demo pages I can find online.

Edit: actually, it only seems broken in Firefox and Bromite now, Chrome still has vibration support. Firefox seems to have lost vibration somewhere during development of the rewrite that also took away most addons, so I'm sure it'll be a few years before this bug gets fixed.

Still, with Chrome supporting it, that's the majority of mobile devices out there fully supporting vibration patterns!
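
For reference, the API surface is a single method; the pattern alternates vibrate/pause durations in milliseconds:

    if ('vibrate' in navigator) {
      navigator.vibrate([200, 100, 200]);  // buzz, pause, buzz
    }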


It works just fine on Android browsers.

https://whatwebcando.today/


You're right, it'll work for most users.

It's still bugged on Firefox: https://bugzilla.mozilla.org/show_bug.cgi?id=1653318 and also on Bromite on my phone, which is why I thought Chrome also disabled the feature.


Didn't work on Chrome 113 or Brave on Android 13.


It surely does on my phone, using Android 13 with Chrome on a Samsung device.


Adoption will be dead slow. People just copy each other or something they see and like. The CSS 3D options and other neat things in the ecosystem are also dead in the water. Maybe because it's too complex for the average person, or because no proper examples were provided.


What will people actually use WebGPU for?


I'm interested in the potential uses of WebGPU for gaming. WebGL is a pretty limited API compared to what you have with OpenGL, DirectX, etc. WebGPU should help fill that gap.


Surveillance. Even basic HTML is used for surveillance with things like "tracking pixels".


What are you basing this on? I can't think of any reason it would be used for surveillance, but curious if I'm just out of the loop on this one.

Also, I'm not sure you understand what a tracking pixel is – tracking pixels don't track pixels on your screen.


A tracking pixel is an external image loaded from a server for analytics purposes. All web technologies facilitate this: pictures, CSS, fonts, video, audio, JavaScript, and all the new Web* tech. Requests link you together across different servers. Scripts actively spy on you by logging keystrokes and more, then funnel that to a surveillance company that launders it for the government. Rendering tech like canvas and WebGPU helps uniquely identify the machine by carefully timing and positioning things.
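
As a concrete sketch of that last part, classic canvas fingerprinting works roughly like this (simplified; the hashing is assumed to happen elsewhere):

    function canvasFingerprint() {
      // Font rasterization and GPU/driver differences make the readback
      // bytes vary subtly from machine to machine
      var c = document.createElement('canvas');
      var ctx = c.getContext('2d');
      ctx.textBaseline = 'top';
      ctx.font = '14px Arial';
      ctx.fillText('fingerprint test', 2, 2);
      return c.toDataURL();  // hash this string to get an identifier
    }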


Can't the server already kind of uniquely identify a machine using the IP, User-Agent, timing of requests, times of visit, navigation patterns, etc., without even relying on any client-side data?


If I connect to example.com, how does Google spy on me when I'm not using Chrome? The answer is that example.com instructs your browser to load Google's spyware.

(Note: the real example.com might not do that, it is merely an example)


Yes, this is called fingerprinting, and it only takes ~33 bits of entropy (iirc) to uniquely identify you among every other device on the internet. Javascript alone hands over dozens of bits.

JS is truly a cancer that really only benefits advertisers and intel agencies.


Unfortunately, some major ad tracking companies are using WebGL capabilities to aid in their fingerprinting of browsers.

As WebGPU is rolled out, they'll probably add WebGPU capabilities to their collection list as well.

These companies do not single out WebGL or WebGPU for fingerprinting; they query any optional browser feature they can in order to develop as complete and robust a fingerprint as possible.
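
For example, one well-known WebGL probe reads the GPU vendor/renderer strings via the debug-renderer-info extension (a sketch):

    var gl = document.createElement('canvas').getContext('webgl');
    var ext = gl && gl.getExtension('WEBGL_debug_renderer_info');
    // e.g. "ANGLE (NVIDIA GeForce GTX 1080 ...)" -- highly identifying
    var renderer = ext ? gl.getParameter(ext.UNMASKED_RENDERER_WEBGL) : 'unknown';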


Clustered training of AI models.


"WebGPU availability"

... have you asked your users if you can use their GPU? Have you asked your users if you can query their capabilities? Availability isn't consent.


I can’t think of a single app — web, native or otherwise — that has ever asked before using my GPU.


Native apps ask for consent, not specifically for your GPU, but for compute in general: it's called having to explicitly install them before they can run.


I mean, you explicitly visit the website too, right? It's not like it just spontaneously takes over your browser.


What about their CPU? And if they don't agree, how would their machines respond?

(This question is half joke, half rhetorical. The point is to argue that there is no material difference between the different parts of a modern computer or smartphone. The CPU and GPU are often the same chip, and both are, and always were, part of showing you a web page. The fact that WebGPU uses a different API doesn't change anything.)


Requiring consent for computation that is not strictly needed would actually be a good thing.


That's probably a hard line to draw. Just blending some colors or doing a complex layout isn't strictly needed. How complex can something be before it's unnecessary?


> How complex can something be before it's unnecessary?

When it's the remote server that's instantly delivering non-static data. Yes, javascript falls under that, and yes, I have javascript disabled.


You are asking too much. Chrome must execute remote code. WebGPU is the CIA's wet dream.


Javascript already brought all the three-letter agencies to climax a long time ago.

Javascript-based fingerprinting is capable of uniquely identifying every single one of us, and the reason it's not everywhere is that it's too slow for the commercial web. I haven't looked into WebGPU yet, but it seems like it could greatly speed up the fingerprinting process.


The alternative to webapps in JavaScript is desktop apps.

Do you really think the situation would have been better with desktop apps? sha256(/etc/passwd), or whatever the equivalent of that is on Windows, and you're done. Even in a sandbox where you don't have access to those files, there are still plenty of opportunities, such as reading the environment, solib versions, testing support for hardware features, etc.


Yes, when they decided to install ChromeOS on their computer.


It's hardly consent when Chrome, or another browser that implements all this shit, ends up being required to access your bank, do your taxes, interact with your friends, etc.



