Using page speed in mobile search ranking (googleblog.com)
80 points by igrigorik on Jan 17, 2018 | hide | past | favorite | 37 comments


I expect this will hit bad WordPress developers hardest, since the throw-another-plugin-on-the-fire philosophy has led to pages that load 15 JavaScript libraries for features not even present on that page.


A couple of things I found useful while optimizing a particular WordPress site:

1) https://www.webpagetest.org for analyzing loading times

2) a3 lazy load for lazy-loading images: https://wordpress.org/plugins/a3-lazy-load/

3) autoptimize for combining multiple CSS files: https://wordpress.org/plugins/autoptimize/

More description of what we did at my write-up here: https://hackernoon.com/dont-brake-for-fonts-3-web-performanc...
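
For anyone rolling their own lazy loading rather than using a plugin like a3, here is a minimal browser-side sketch of the same idea (the data-src attribute name is just a convention I'm assuming here; modern browsers can also do this natively with loading="lazy" on the img tag, no script needed):

  // Swap data-src into src once an image nears the viewport, so offscreen
  // images don't block the initial load.
  const lazyObserver = new IntersectionObserver((entries, obs) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const img = entry.target as HTMLImageElement;
      if (img.dataset.src) img.src = img.dataset.src; // kicks off the real download
      obs.unobserve(img);                             // each image only needs this once
    }
  }, { rootMargin: "200px" });                        // start a bit before it scrolls in

  document.querySelectorAll<HTMLImageElement>("img[data-src]")
    .forEach((img) => lazyObserver.observe(img));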


Don't forget caching. A plugin such as W3 Total Cache or WP Fastest Cache (my current favourite) will greatly improve your load times.
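
To make the "caching" part concrete: those plugins mostly do two things, cache rendered pages so PHP isn't hit on every request, and set long browser-cache lifetimes on static assets. A rough sketch of both ideas outside WordPress, assuming a Node/Express front end (not how WP is normally served, purely illustrative):

  import express from "express";

  const app = express();
  const pageCache = new Map<string, string>(); // naive in-memory page cache

  // Long-lived browser caching for static assets; the WP plugins set
  // equivalent Cache-Control headers for you.
  app.use("/assets", express.static("public/assets", { maxAge: "30d", immutable: true }));

  app.get("/", (req, res) => {
    const hit = pageCache.get(req.path);
    if (hit) { res.type("html").send(hit); return; } // cached HTML, no re-render
    const html = "<html><body>expensively rendered page</body></html>";
    pageCache.set(req.path, html);
    res.type("html").send(html);
  });

  app.listen(3000);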

In my experience the biggest issue with slow load times is people uploading large, unoptimized images. Things like multiple CSS/JS files, unnecessary plugins, and frameworks are nothing compared to large uncompressed images. Use a plugin, ImageOptim, or Cloudinary.
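
If you would rather batch-compress images before uploading instead of relying on a plugin or Cloudinary, a rough sketch using the sharp library (the file names and size targets here are made up):

  import sharp from "sharp";

  // Cap the width and recompress; a multi-MB camera JPEG typically shrinks
  // to a few hundred KB with no visible difference at blog sizes.
  async function optimizeImage(input: string, output: string): Promise<void> {
    await sharp(input)
      .resize({ width: 1600, withoutEnlargement: true }) // never upscale smaller images
      .jpeg({ quality: 80, mozjpeg: true })
      .toFile(output);
  }

  optimizeImage("hero-original.jpg", "hero-web.jpg").catch(console.error);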


WordPress is well optimized, in my experience.

It loads the page and the text first so you can start reading. The images and scripts are delayed and cached. Comments are last. Pictures and thumbnails have various resolutions to improve load time.


Wordpress itself is good at getting the important content to you ASAP. Many Wordpress plugins are not as good.


But what about themes?

Most I've seen load 5+ CSS files and at least 10+ JS files.


Out-of-the-box stock WP is OK, but many/most plugins are not at all optimized, especially when you install a bunch of them. Hence the "bad" WP developers who pile on a dozen and ruin perf.


Many (most?) of them are not even good on desktop. I agree with the other comments that basic WordPress is well optimized, but who uses only basic WordPress?


Is there any fast alternative?


In general, it’s not an alternative so much as actually measuring performance and making sure you don’t exceed your targets for each type of visitor. For almost everything you can find fast and slow examples based on how people configured it.


Yes, sure, I get that. But I am looking for an alternative that you can't "mess around" with, one that can't be configured to death. I guess Medium is an obvious choice if you are only looking for a blog, but I have no idea how fast their site really is.


If it can't be configured, people won't use it. The kind of configurability most sites suffer from starts as simply as wanting it not to look like everyone else: once you allow that, the odds approach certainty that someone's design will load 8 MB of ads, fonts, etc.


Well, that depends on the use case, right? People do use Medium for blogging; you don't need WP to set up a blog these days.


https://www.hardypress.com/ makes a static version of the site which will be faster


The issue the GP is talking about is with lots of client-side plugins and CSS files, not with WP taking too long to render the HTML.

So how does this product help in that case?


Static files means that your server might serve the page faster (especially under load), but page bloat is a different matter.


Looks like another victory for http://ampletter.org/


You vastly overestimate our agility, but we will take the compliment :)


Shouldn't Google have done this in the first place instead of AMP?


They did, back in 2010 [0]. And surprisingly, nobody really changed anything, which is why they eventually came up with the idea for and created AMP.

This change is more focused on mobile.

[0] https://webmasters.googleblog.com/2010/04/using-site-speed-i...


It would have made a difference if it had been communicated more clearly and carried a stricter penalty. Really, if those sites had been warned by Google and Facebook for having a few MB of tracking JS or unnecessary assets, they would have changed. The problem is that the warning was only sent out as blog posts, not as a drop in SERPs.


AMP was for mobile. That prior change was for desktop. This new change is for mobile. Ergo, why didn't they start with a SERP boost for fast mobile pages, period, rather than the AMP system?


AMP makes it so Google can make speed happen itself, whereas this change puts the responsibility for improving speed on third parties.


> Although speed has been used in ranking for some time, that signal was focused on desktop searches. Today we’re announcing that starting in July 2018, page speed will be a ranking factor for mobile searches.

> The “Speed Update,” as we’re calling it, will only affect pages that deliver the slowest experience to users and will only affect a small percentage of queries.

The desktop ranking signal had almost no effect; slow pages reliably turned up in search results. Maybe some pages that took 15 s to paint were slightly demoted, if that.

If the mobile ranking signal is as weak as the desktop one (and they seem to be indicating it will have a very small effect), I would expect this change to have no measurable effect at all on the median performance of pages loaded from Google search results.


Without NN, won't ranking pages by "page speed" simply enhance the benefits of purchasing "fast lanes" near endpoints (on previously dumb pipes ;-) for those willing to do so?


It sounds like you're describing a Content Delivery Network (CDN). Many sites use CDNs in order to serve content from as close to the user as possible, often from within the ISP's network. This is unrelated to Net Neutrality, and has been common practice for years.

Using a CDN is one way to speed up your site, but there are lots of other important steps like reducing dependencies and compressing your resources.
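
On the "compressing your resources" point, the usual first step is just enabling gzip/brotli for text responses at the server. nginx and Apache have built-in directives for this; in a Node/Express setup (purely an illustrative sketch) it is one middleware:

  import express from "express";
  import compression from "compression"; // gzip/deflate for text responses

  const app = express();
  app.use(compression());                // HTML/CSS/JS/JSON shrink a lot; images don't
  app.use(express.static("public"));
  app.listen(3000);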

(Source: I used to work on mod_pagespeed at Google, now I work on other things there.)


Or even doing the basics like using Photoshop to optimise your images first!


Probably not, since Google is presumably measuring speed via their own connection (possibly with simulated mobile network speeds), not via random user endpoints.
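
If you want to approximate that kind of measurement yourself, you can throttle the network via the Chrome DevTools Protocol and time the load. A rough sketch with Puppeteer; the throughput/latency numbers are just "Fast 3G"-style placeholders, not whatever Google actually uses:

  import puppeteer from "puppeteer";

  async function timeLoad(url: string): Promise<number> {
    const browser = await puppeteer.launch();
    const page = await browser.newPage();
    const cdp = await page.createCDPSession();

    // Roughly "Fast 3G": 1.6 Mbps down, 750 Kbps up, 150 ms RTT.
    await cdp.send("Network.emulateNetworkConditions", {
      offline: false,
      latency: 150,
      downloadThroughput: (1.6 * 1024 * 1024) / 8,
      uploadThroughput: (750 * 1024) / 8,
    });

    await page.goto(url, { waitUntil: "load" });
    // Time from navigation start to the load event.
    const ms = await page.evaluate(
      () => performance.timing.loadEventStart - performance.timing.navigationStart
    );
    await browser.close();
    return ms;
  }

  timeLoad("https://example.com/").then((ms) => console.log(`load: ${ms} ms`));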


What's NN?


Net Neutrality. (Or Netscape Navigator, but I think in this case it's the former.)


Yeah, I thought Navigator, too.

Now get off my lawn while I load winsock.


While this is great news, is there a way to opt out of it? I don't really mind heavy sites (sometimes), since I'm on wifi most of the time and care more about search relevance than speed.

Kind of like how Amazon lets you sort search results by "Relevance", "Price", "Popularity", etc. Maybe Google could do that too and let us sort by "Relevance", "Speed", "Geography", etc.


You've never been able to opt out of any of the Google search algorithm features. I don't see why we would be able to opt out of this one.


The fact that I get different results based on what device I'm using really bothers me. Why does Google have to try to be so omniscient? Why not just ask me, "Prioritize fast-loading pages? (y/n)".

Concerns over Google-the-company aside, Google-the-search-platform really sucks IMO because it assumes it knows what I want rather than just asking me. That mentality was Steve Jobs' worst legacy, and I hate how it's wound up everywhere.


I feel like it's easy to frame the question that way, but really look at how many questions they would need to ask...

* Prioritize fast-loading pages?

* Prioritize mobile pages?

* Prioritize by pagesize?

* Prioritize your language? Prioritize local results first? Prioritize news sites? Prioritize blogging sites? Prioritize comment platforms? Prioritize "safe-for-work" websites? etc...

There's an almost unlimited number of questions that you could be asked, and the reality is that nobody wants to answer any of them.

At some point, they need to make some decisions for you, otherwise you are tasked with coming up with your own search algorithm yourself! The question becomes which decisions should they make for you, and which should they let you make.

Currently the answer looks like they are making most of them for you, while allowing you to make decisions on whether you want "web, videos, images, books, news, etc.." and what date range you are searching for.

I think that's a good choice for them, because at the end of the day if you want more control or choice in your searches, you can always use another search engine.

This also isn't unique to Google or their products. I've struggled with this in my own programs and products, even open source ones. An endless list of configuration options means that nobody will ever set them all correctly, and it increases the amount of testing you need to do almost exponentially. Choosing good defaults, or in many cases making the decision outright for your users, is the best approach, because there just isn't any other way that scales to the entire piece of software in most cases.


Agreed. Knobs to turn are a good thing, but only if they are super relevant knobs. Too much configurability leads to bloat. Good defaults are good, but even better is (often) not having a knob to turn at all.


I have to disagree. I use Google search a lot and find it uncanny how often it gets me what I need in the first couple of links.

Now, I am old and have been using search engines for a long time (I remember Veronica and Jughead coming online), and I do think I have learned over the years how to word a search to get the results I want.



