I have a straightforward answer: build incredible PWAs that iOS users will miss out on, because iOS has partial support at best. You'd still serve 80% of the market.
I'll admit it's a theoretical answer, as most commercial organizations would not risk losing out on iOS users. They're not the biggest group of users, yet they're among the most valuable, commercially speaking.
Just doubling down on my statement. I've seen hundreds of discussions and flame wars about browser performance, RAM usage, and the ability to keep 200 tabs open.
I mean it when I say that I cannot reproduce any of these differences. Perhaps I could if I actively measured things, but in terms of simply experiencing the speed of all major browsers, I really do not see a meaningful difference.
It could be device related, but I'm on a six-year-old PC (a pretty sizable one, though), and I'm not having any issues on my mediocre work laptop either.
That doesn't mean it doesn't matter in absolute terms; I'm just saying I don't see it. The more important point, though, is that non-technologists do not tend to pick browsers in such rational ways.
Hi, I'm the author. I agree with your point when you read it like that, so I'll clarify what I meant.
"Compatible with the web" in the context of this article means compatible with the web that is primarily built to work on Chromium/WebKit. Because at this point in time, the web is Chromium.
You're right that Edge is compatible with web standards, albeit lagging a bit behind in implementing newer ones. By that logic, IE11 is also compatible with web standards.
Thanks for commenting in the open from Mozilla's perspective; even if it's just one voice, it's much appreciated.
I agree that Firefox will remain attractive to the group of people who care about openness, standards, and their privacy. My deepest concern is how small that group is going to be. Will it settle at the current ~10%, or will it be as low as 5%, which I consider very dangerous ground to be on?
As for Google's abuse of market power, I strongly believe it has already happened, in plain view. There's no doubt in my mind that dominance in one market has been used to capture another market. We don't have to wait for that "mistake", it has already happened.
These abuses are already under investigation, but I'm afraid it won't change the outcome. They'll pay some fine in a few years, but none of the browsers will be uninstalled.
It won't matter much, but I've added an update to the top of the article suggesting people install Firefox.
We have to differentiate between prefetching from search and prefetching from the actual page being opened.
Google CAN preload arbitrary websites; they fully load your website, JS included, when they index it. As for the security problem when on google.com: they could still preload the HTML, CSS, and web fonts, and do all the DNS/HTTPS overhead, all of which would be safe to do. That would save those first few seconds and create a level playing field.
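To make that concrete, here's a minimal sketch of the standard resource hints any results page could emit for any outbound link, AMP or not. The publisher URL and asset paths are placeholders, not real endpoints:

```html
<!-- Hypothetical hints a search results page could emit for a result link. -->
<!-- Safe early work: DNS lookup plus TCP/TLS handshake. -->
<link rel="dns-prefetch" href="https://example-publisher.com">
<link rel="preconnect" href="https://example-publisher.com" crossorigin>
<!-- Prefetch the document and render-critical assets into the HTTP cache.
     No JavaScript executes during a prefetch, which avoids the security
     concerns of fully preloading a third-party page. -->
<link rel="prefetch" href="https://example-publisher.com/article.html">
<link rel="prefetch" href="https://example-publisher.com/styles.css">
<link rel="prefetch" href="https://example-publisher.com/fonts/body.woff2">
```

These hints are part of the web platform already and work in any publisher's favor, which is the level playing field being argued for here.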
Hi, I'm the author of the article. I have no idea how to stop it. Bad press doesn't seem to matter, as this has been going on for years.
One long shot I can think of is reporting it to the EU as anti-competitive behavior. In the case of exclusive AMP preloading in a dominant market (search), it's not that far-fetched.
As much as I'm really glad the EU is taking a stand on this, they're also taking too long. The shopping case took a number of years to conclude, and the Android case, still ongoing, concerns an infringement Google has operated for nearly a decade now. Suffice it to say, if we wait for the EU to stop AMP, the Internet will be almost entirely AMP-based before the EU forces them to stop.
Similarly, in the US, by the time Microsoft lost its antitrust case, Microsoft's dominance was in many ways already cemented, and in those ways still is. (While the mobile shift has obviously cut them out, almost every desktop PC in every business on the planet still runs Windows.)
You're missing the entire point. It is fine to have a framework like AMP that puts up technical constraints leading to a performance-friendly web page.
The problem is the cache, and specifically the preloading of it. This gives AMP an unfair advantage of multiple seconds over everything else.
That's why Redfin (https://redfin.engineering/how-to-fix-googles-amp-without-sl...) pointed out that the Web Packaging spec could fix this. But before you have a general-purpose spec that fixes something, you need a specific embodiment that does: asm.js came before WASM, SPDY came before HTTP/2, Flash came before HTML5. I didn't like Flash, but would you have suggested that Adobe work with browser vendors for years to bring the web up to the needed capabilities without ever shipping Flash?
Still, even without the AMP cache, mobile sites were loading way too much JS, even after Google penalized them. The effect of AMP showing that sites could load as fast as native Apple News/Facebook Instant has finally gotten publishers to strip down their sites. You might not like the way it played out, but the end result is that end users not only get AMP-cached fast loading, they also end up downloading far less data, because the sites themselves have been pared down.
It won't fix this. The only thing it will do is let browsers show the original link instead of the AMP link, fixing the UI. The problems described in the article will not go away.
> But before you have a general purpose spec that fixes something, you need a specific embodiment that does
AMP isn't that spec, though. It does nothing special, and the only reason it's fast is that Google aggressively preloads it.
> but the end result is that not only do end users get AMP-cached fast loading, but they also end up download far less data,
Are they, though? When for every search Google preloads tens of AMP sites to make them "fast"?
> It won't fix this. The only thing it will do, it will let browsers show the original link, not the AMP link, and fix the UI. The problems described in the article will not go away.
"If other browsers accepted the Web Packaging standard, the web might look rather different in the future, since basically any site that links to a lot of external sites (Reddit? Twitter? Facebook?) could start linking to prerendered Web Packages, rather than the original site. Those sites would appear to just load faster. Web-Packaged pages could one day eliminate the Reddit “hug of death,” where Reddit’s overenthusiastic visitors overwhelm sites hosting original content.
Despite cries that Google is trying to subvert the open web, the result could be a more open web, a web open to copying, sharing, and archiving web sites."
>Are they though? When for every search google preloads tens of AMP sites to make them "fast"?
TheVerge.com non-AMP loads 3 MB of data across 289 HTTP requests and executes 1.5 MB of JS. Going to Google.com and searching for Verge stories produces a carousel of 10 Verge stories, and according to Chrome DevTools, only 377 KB was loaded. That number seems oddly low, but even so, I doubt prefetching AMP stories will exceed the shitty bloat of non-AMP pages.
WashingtonPost non-AMP homepage is 6MB+
NYT non-AMP is 4MB+
WSJ non-AMP is 5.7MB
And by non-AMP, I mean the mobile web version. The desktop versions are even larger.
It's prerendered (via a static site generator). In total, it loads 692 KB (I didn't do anything to optimize it; the images are quite large, etc.). It loads from a small server, and images are loaded from Twitter, meme.com, etc.
It loads a whopping 2.9 MB [1] and keeps loading as you scroll down. If you open it from Google's search, it opens instantly, because parts of it were already preloaded on the search page. And the page itself (including almost all images) is served by a ridiculously powerful, geographically distributed CDN.
So, questions/hints:
1. How is that fair to people who actually build their pages and host them on their servers?
2. What is open about this web?
3. How will Web Packaging solve this issue if I can't afford to build a geographically-distributed CDN on par with Google's for my own cache?
---
[1] It actually changes on every reload. The lowest number I've seen is 1.6 MB, but then, within a second or two, it starts loading additional stuff, going up to at least 2.2 MB.
So much for "small AMP pages". Actually, as I'm clicking around, rarely is a page below 1 MB, even for pages that are not that different from mine: only images and text.
For some reason you think that the solution to that is "let's do a standards-incompatible aggressively preloaded slimmed down page that will live on our ultra-fast CDN/cache servers".
Can you see the problem?
Also, can you see why web packages don't solve the problem (hint to start you thinking: not everyone can run their pre-rendered pages off of Google's CDN. Even Google's own AMP isn't fast if it's not preloaded from Google's cache)?
> "let's do a standards-incompatible aggressively preloaded slimmed down page that will live on our ultra-fast CDN/cache servers".
How can it be standards incompatible if it works in existing standards compatible browsers?
> Also, can you see why web packages don't solve the problem (hint to start you thinking: not everyone can run their pre-rendered pages off of Google's CDN. Even Google's own AMP isn't fast if it's not preloaded from Google's cache)?
Did you read the Redfin article? The point isn't for you to run the CDN or do the prefetching. The point is: how do people find your site and articles? Either through Google/Bing/Baidu/etc., social networks (Twitter/Facebook), or aggregation sites (Reddit, Hacker News, etc.). The idea is for large aggregation sites with a lot of traffic to roll out preloading on CDNs. Cloudflare, for example, already supports the AMP cache, and Reddit could roll out prefetching if desired.
And you completely missed the point that getting publishers to adopt AMP gets them to slim down their sites even if you don't use the AMP cache or preloading, something everyone has been trying to get them to do for years, including Google, which has been penalizing slow sites for years (https://www.linkedin.com/pulse/20140827025406-126344576-goog...).
So hurray for you making a slimmed-down page, but you're not the target audience. The target is the huge number of other sites that have bloated the web for years and haven't responded to previous attempts to force them onto a diet.
> How can it be standards incompatible if it works in existing standards compatible browsers?
You really have no idea how the web works, do you? Browsers make a best effort to display any page. Even if the HTML is totally, absolutely invalid, the browser will go out of its way to display at least something.
The mere fact that something is displayed by a browser doesn't make it standards-compliant.
- Whatever extensions to HTML5 they introduce are not part of any HTML standard, past or present, and it doesn't look like Google is interested in making them part of any future standard.
> So hurray for you making a slimmed down page, but you're not the target audience, the huge number of other sites that have for years
That's not the point, is it? Google will still penalise my page even if it's way slimmer than a typical AMP page. And since I cannot afford to run a Google-scale CDN, it will perform worse than an AMP page.
So here's what we have in the end:
- Google (and Google alone) decides what AMP will look like. There are no discussions with the web community at large or the standards committees.
- Google (and Google alone) decides that only AMP pages end up in its own proprietary AMP cache. (Other "big aggregators" may/will also decide that only AMP pages can be in their proprietary caches)
- Even if a web developer follows all of Google's performance tips (https://developers.google.com/speed/docs/insights/rules), the page will still be penalised because it's not an AMP page (i.e., not a page developed using whatever a big corp has decided and running from a big corp's CDN/cache)
- Even Google's own page speed tools tell you that AMP is not fast, and yet everyone (even 100% optimised, slimmed-down pages) is penalised for not running their page from an overpowered private cache
A lot of mental gymnastics and total ignorance of how the web works go into calling this an open, extensible web that will benefit everyone.
"This blog post was tough to read! Seems to have capped out at a 6th grade level."
I'm not a native speaker, and this is the best I can do. I was expecting less than 6th grade, so thanks!
"Re: the topic, the author shows clear bias toward Firefox"
Yes, the article literally says I am biased towards Firefox's success. You didn't discover a secret plot or anything.
"It might be better to just acknowledge this: the best browser wins on each platform."
No, the one shipped to users' home screens wins. Even crappy ones.
"20/20 means PERFECT vision. Above that is ‘abnormally good’."
If it meant perfect, there would not be a level above it. I do agree the statement isn't as clever as I intended it to be.