Hacker News

Sort of, but also not really. Two decades ago, say around 1995, a computer was still pretty damn expensive. A decent system (not a high-end machine like we use for gaming today, just very basic office stuff) would easily set you back $1k to $2k, which is roughly $1.5k to $3k in today's dollars. That wasn't accessible to everyone, especially because these machines weren't good enough to replace your office, your phone, your TV, your gaming console, your newspaper, your camera, etc. They were an additional cost on top of all of that.

And while personal computing was already widespread two decades ago, it wasn't ubiquitous in the sense of finding $70 Star Trek-like tablets in remote villages in central Africa, as happens today. Ubiquitous in the sense that outside the 10-20% upper-middle class of the US and Europe, who have indeed had personal computing for decades, we're now seeing computing arrive en masse to an additional 1-2 billion people, and probably billions more not long after, now that you can get full desktop software, with complete hardware including input, a screen, and a battery, for $70 retail and under $50 second-hand. That's insane.

You guys remember the $20 smartphone media talk last year? It should have arrived by now. In any case, this decade is something special. Chips use so little energy nowadays, and low-res screens do too. Things are sturdy and cheap. The electricity cost per year is a tiny fraction of the device's cost (which itself is on its way down to 5 to 10 cents a day, amortized over the device's life). Computing is actually going to become accessible to 4-5 billion people by the end of the decade; that's something really new. And we're seeing a lot of initiatives around free, global-coverage internet connectivity, too, for low-data applications like messaging, banking, Wikipedia, etc., e.g. FB's and Google's initiatives.
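The cost claims above can be sanity-checked with a quick back-of-envelope sketch. The wattage, daily usage, electricity price, and device lifetime below are my own illustrative assumptions, not figures from the comment; only the $70 device price comes from it:

```python
# Rough sanity check of the cost claims above.
# All inputs except the device price are illustrative assumptions.

DEVICE_PRICE_USD = 70        # the $70 tablet mentioned above
LIFETIME_YEARS = 3           # assumed useful life
POWER_WATTS = 5              # assumed draw for a low-power tablet
HOURS_PER_DAY = 8            # assumed daily use
PRICE_PER_KWH_USD = 0.15     # assumed electricity price

# Device cost amortized per day, in cents
device_cents_per_day = DEVICE_PRICE_USD / (LIFETIME_YEARS * 365) * 100

# Electricity cost per year, in dollars
kwh_per_year = POWER_WATTS * HOURS_PER_DAY * 365 / 1000
electricity_usd_per_year = kwh_per_year * PRICE_PER_KWH_USD

print(f"Device: {device_cents_per_day:.1f} cents/day")
print(f"Electricity: ${electricity_usd_per_year:.2f}/year")
```

Under these assumptions the device works out to about 6.4 cents a day, inside the 5-10 cent range above, and electricity to about $2 a year, a few percent of the device's price.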
