I feel like this is becoming standard in lots of places, or at least vogue. I worked in a chem lab at the University of Washington that had indoor windows expressly for notes and presentations, and the science library at my school has rooms with glass walls and markers for group studying.
My only exposure to the Old Spice campaign so far has been through HN, despite generally considering myself a relatively internet-aware individual. Interesting.
Almost anything paragraph size can be squeezed down to 140 characters.
While I agree with the article at large, I disagree with this. There is definitely value in writing things between 140 characters and essay length, if only to jot down complete ideas without grooming them into a polished, essay-quality piece.
True, but in my experience it's a worthy investment of valuable time. I spent three years using Linux distros, and though I use OSX now, I'm much more comfortable with general system administration than I would have been otherwise.
Having recently had to do some crazy commandline-fu (PowerShell and cmd.exe) to fix my Windows install, I have to say that there really is no comparison - Linux wins hands down. Sure, part of it was familiarity, but it felt so clunky and awkward that I don't think that's all it was. The flexibility of bash, awk, sed, grep, etc. cannot be overstated, IMHO.
On another front, I've found Linux to be time-saving compared to Windows. For example, on a new Windows install I now have to hunt for the software I need from lots of different locations, then download and install each package manually. This is impossible or awkward to automate, so you have to sit through a load of install GUIs - or switch CDs if it's something proprietary that only comes on CD, though luckily I can't even remember the last time I had to do that; even my games are digitally distributed nowadays. On Linux, all the base software is "probably" (depending on your distro) already installed. The rest can be batch-installed with a single command (pacman, apt-get, yum, etc.), and you can then go do something else while you wait for it to install.
Similarly, Microsoft's advertisements claim that Linux is difficult and time-consuming to keep up to date - unlike Windows, what with Windows Update and all. But we all know that on Linux this is just a single command (pacman -Su, apt-get upgrade, etc.), which can of course easily be automated to run at certain times. And Windows Update only updates Microsoft software - the Linux command updates all software installed through the package management system.
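To make the comparison concrete, here's a rough sketch of what this looks like on a Debian-style distro (the package list is just an example, and the exact commands vary by distro):

```shell
# Refresh the package index, then batch-install everything in one go
# (example package list; substitute whatever you actually need):
sudo apt-get update
sudo apt-get install -y vim git curl build-essential

# Keeping the whole system current is also one command - and it covers
# ALL packages installed through the package manager, not just OS bits:
sudo apt-get upgrade -y

# Rough equivalents elsewhere:
#   pacman -Syu       # Arch: sync package databases + full upgrade
#   yum update        # Red Hat / Fedora
```

You can kick this off and walk away - no install wizards to click through.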
Of course, this doesn't invalidate the quote - you still spend time making Linux do what you need. But you do on other operating systems too, so relative to the non-free operating systems (which take the same amount of your time or more), Linux is, indeed, free.
The irritating thing about that argument is that it comes with the built-in implication that you already know something else (Windows/OS X/whatever). It is basically complaining about the learning curve, right?
The other irritating thing is that it usually 'means' general computer usage, but as hackers we need to choose OSes for things like server deployments, development environments, etc. In my experience, once you mix in enough complex/worthwhile applications of the OS, time consumption becomes a wash between *nix and Windows, since you spend the bulk of your time learning application-level things.
So if you're doing something 'worthwhile', you might as well go with the option that only costs one of the three assets (time, money, and freedom) rather than all three.
You make an excellent point about application-level learning. I actually noticed myself that I was spending a vast amount of time just configuring my OS and apps - time which I could have spent actually producing things. This was my primary justification for buying my Mac, though I still think I learned a lot using Linux that I wouldn't have otherwise... patience and perseverance, among other things.
I want to design and launch an improbably large space mirror. If it were properly focused on Earth (curvature adapting to its distance from the planet), then given light-speed delay, we could point a telescope at it and see the state of Earth 6.52x years ago, where x is the mirror's distance from Earth in parsecs.
Roadblock: probably not a scientifically viable proposition, given space debris and lack of ability to build something that large/durable.
Solution: write a sci-fi short story with this as a plot element, and call it good.
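(For what it's worth, the 6.52x figure is just round-trip light travel time: the light has to go from Earth to the mirror and back, and one parsec is about 3.26 light-years:)

```latex
t = \frac{2d}{c} = 2 \times (3.26\ \text{yr/pc}) \times x\ \text{pc} \approx 6.52x\ \text{yr}
```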
He makes an interesting point though about intuitive, human expectations for data retention. In interactions with humans, we can expect passing comments to eventually pass, but a computer could retain even the most trivial information indefinitely.
To me, part of the appeal of Woot! is its nonchalance and small, independent aesthetic. I hope it doesn't lose this in the process of becoming an Amazon company.
This is an interesting point, though what we have with Android is something slightly different in that there's a certain degree of individual vendor responsibility to provide updates for your device as needed. Thus vendors will compete on a few different fronts:
* Hardware. I.e. design, reliability, technical specifications
* Commitment. E.g. providing firmware updates
* Marketing.
* Icing. By this I mean the vendor's "above and beyond" software offerings, such as a more polished UI, add-on software packages, etc.
I have to admit that I find Android exciting as an economic experiment. I am not, however, an Android device user. For me, Apple's relentless attention to detail has kept me captive (for now).