Don't use video to present a list of points. Although it's only a 3:30 video, I would rather have been able to scan a list of the quotes (with attribution to the authors & their titles)... then I could have known it was only a list of opinions (not a reasoned analysis).
I don't see any actual content here. Google thinks the web should be faster. That's obvious. They just say this over and over.
We have more pressing problems to solve first, like browser compatibility. Maybe if Google could host a service that would take a page known to work on FF3, apply magic to it on the fly, and make it work on IE6/7/8 etc., that would be something useful...
"Many websites can become faster with little effort, and collective attention to performance can speed up the entire web..."
This is true - I have been working with many sites over the last 3 years to help them improve web site performance. Most sites will see a huge performance improvement just by concatenating (bundling) their JavaScript and CSS. Tools like YUICompressor and the www.rockstarapps.com Eclipse tools simplify the process.
One issue I have found is that developers always leave performance tuning of all projects to the end.
"One issue I have found is that developers always leave performance tuning of all projects to the end."
Yep! Here's how that works:
1) You hear "premature optimization is the root of all evil" parroted everywhere.
2) You decide to leave the optimization for the end.
3) You develop code that performs suboptimally, sometimes even abysmally.
4) You develop more code on top of it.
5) You repeat the previous step until the codebase is big enough that optimizing away root causes of performance problems is too painful.
6) By this time you have to deal with releases and bugfixes and feature requests and maintenance and whatnots, so you settle for the advice from Coding Horror, where Jeff told you that throwing hardware at performance problems is cheaper than investing programmers' time in solving them.
7) Profit! Er, no, wait, that's the other list of steps...
In web programming the things that will really give you perceptible speed boosts aren't even optimizations so much as "do it the right way the first time".
How hard is it to configure Apache to gzip your textual content? That takes, what, four lines? And it will deliver huge, perceptible boosts to probably six nines of all websites without perceptible downsides? You should require a note signed by your doctor to NOT have that option turned on.
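For the record, it really is about that short with mod_deflate. A minimal sketch, assuming a stock Apache 2.x setup (the module path and the exact MIME types will vary by distro and by what your site actually serves):

    # load mod_deflate (path varies by distro; it's often already enabled)
    LoadModule deflate_module modules/mod_deflate.so
    # compress the usual textual content types
    AddOutputFilterByType DEFLATE text/html text/plain text/css
    AddOutputFilterByType DEFLATE application/javascript application/json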
Putting all your JS/CSS in one file takes almost no thought in Rails (:cache => true). If you need multiple sets of included JavaScript, you have to make the terribly difficult optimization decision of giving them different names (:cache => "purchasing_scripts"). If you're not using Rails, then you might have to actually write code in your build/deploy script, once. (Then you copy/paste it into every project you ever do, because there is never going to be a time when including 6 CSS files individually is a good idea.) Looking at our repository, an ANT script can do it in a dozen lines, so it can't POSSIBLY be that hard in your language of choice.
Long, long ago, gzip would cause some browsers to barf. Some browsers would lie about what they could handle, and you had to check the UA as well as the Accept-Encoding header.
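If memory serves, the fix most of us ended up cargo-culting was the one from the old mod_deflate docs, roughly this sketch:

    # Netscape 4.x claims gzip support but only copes with compressed text/html
    BrowserMatch ^Mozilla/4         gzip-only-text/html
    # Netscape 4.06-4.08 choke on compression entirely
    BrowserMatch ^Mozilla/4\.0[678] no-gzip
    # MSIE also identifies as Mozilla/4 but handles gzip fine, so undo the above
    BrowserMatch \bMSIE             !no-gzip !gzip-only-text/html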
JS/CSS concatenation, at least the last time I had to roll my own, always drove me nuts because I wanted to "recompile" on the fly, and putting it in my build script meant rebuilding with every little change.
(I recognize these are excuses. But every action comes with an opportunity cost, blah blah blah)
I often copy the uncompressed source directly over the compiled file, make my edits on the live file, and then copy it back when I'm ready to check in. Granted, the system I work with compiles but doesn't concatenate, which is a slightly simpler problem. But you could easily have your build script leave comment markers in the file (or just look for the top-level '(function(){...})()' closures that nearly every sane system uses) and "unbuild" the concatenated file from that.
I think you might be overstating this a bit: browsers definitely had gzip support back in the stone-age (I actually implemented this by hand in 2000 or so) but for slightly shorter values of "long, long ago" it's also been the case that the necessary UA checks were well-known even if your default Apache config didn't already include them.
JS/CSS, however, I agree completely on - it's a good deployment step but minification is quite the nuisance for debugging and it's less critical if you've configured your server to set proper Expires/Cache-Control headers.
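For anyone who hasn't set those headers up, the mod_expires side is about as short as the gzip config. A sketch, with the caveat that far-future lifetimes assume your asset URLs are versioned (a fingerprint or revision in the filename) so a deploy can bust the cache:

    # requires mod_expires to be loaded
    ExpiresActive On
    # long lifetimes are only safe if the URL changes whenever the content does
    ExpiresByType text/css "access plus 1 year"
    ExpiresByType application/javascript "access plus 1 year"
    ExpiresByType image/png "access plus 1 year"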
The right place to insert performance tuning is probably between #3 and #4. It makes perfect sense to leave optimization until you know what you actually want to build. It also makes sense that a piece of code will be no faster than its slowest component. So write the slow version first, but don't build anything on top of it until it's no longer slow. Don't pile crap on top of crap.
While HTML is not a network protocol, it is still a data format that is standardized and to which many implementations need to conform -- a "protocol" in layman's terms.
Is XMPP a protocol? Yeah, it specifies a machine-machine communication system but it also specifies the structure and formatting of the messages it sends.
Where do you draw the line? If it were a technical discussion, you'd be right that HTML isn't a "protocol". From TFA, I got the sense that they didn't mean a network protocol but something more like a "standard" (without the nasty connotations of that term).
Not just in layman's terms. That is what a protocol is: a specification for communication. With reference to the OSI model, HTTP is an application layer protocol. TCP is a transport layer protocol.
Edit: oh, screw me, the article actually says 'HTML'. I didn't register that and saw 'HTTP'. I'm guessing the author meant that: didn't you ever write HTML when you meant HTTP and didn't catch it until the third reading?
Edit2: interestingly enough, the above said 'ISO' instead of 'OSI'...
Being more strict than the standards already are would be horrible. The Web succeeded thanks to the loose coupling between browsers and documents, allowing us to make browsers which behave the way we want instead of merely porting the original browser from TimBL's NeXT.
Writing documents which degrade gracefully isn't hard unless you do it backwards (scripted frills before meaningful content).
The web won't become faster. It'll just do more than before and take as much time as before or even slightly longer, just like software and operating systems have evolved so far.
The key to fast operation is lowering the work-per-horsepower ratio.
This hasn't happened for obvious reasons: imagine the mid-90's world wide web running on modern hardware. Pages would be a few kilobytes. Nearly everything would be instantaneous. Text would be weighted over images. People would be generally unhappy with the simple design. Then they would begin to make sites that are more amazing and flashy and less useful... oops, welcome to the 2010's! :)
It's obvious what the hidden message is here: "We can't compete with desktop applications and operating systems, so we're going to turn the tables on everyone by putting everything on the web where we can kick ass. Err... 'Let's make the web faster! Isn't that a neat idea?'"
Not at all, I'm just pointing it out. It's amusing to me how the PR departments can spin self-interest into purely mutual-interest. I.e. it would be like Mother Teresa helping people because each time she does God cuts her a $10,000 check, and then prattling on about how she's just looking out for you.
This is starting to be kind of obnoxious behavior from Google. They only want the web to be faster so people can click back faster and search again, so they can get higher throughput on their searches and thus generate more revenue from their advertisers.
Google added yet another browser to the list of compatibility issues web developers have to endure so they could shave a few milliseconds off JavaScript. Now they are nagging us day after day to rewrite our websites to scrape a couple more milliseconds off our load times so they can generate even more revenue.
How about Google subsidize all web developers with a $10,000 grant so they can dedicate themselves to implementing the acceleration features they are asking for?