I'm not sure I buy into the Larry Page-as-Lex-Luthor analogy, even if it's meant to be more evocative than literally descriptive. (Page has achieved, circumstantially, what Luthor single-mindedly strove to do.) People seem to want to attribute sinister motives to Page, and while I don't know the man, all available evidence suggests he's more ambitious business tycoon than Big Brother-in-training.
I do, however, think there's something very interesting in this piece. Namely, what does it mean for someone to hold such potential power over the lives of so many? In the grand sweep of history, the amount of theoretical control over the world afforded to someone in Larry Page's position has accumulated in the blink of an eye, and it's staggering.
Technology is changing more quickly than our ability to understand its ethical, socioeconomic, and even existential implications. That doesn't mean we have to adopt the Luddite program, to be frightened of technology, or to slow its progress. We should welcome rapid progress and encourage technological development. At the same time, perhaps we need to get a lot more serious about developing a coherent philosophy of technology and a forward-looking technology policy. For instance: it's still a bit crackpottish to suggest we should be working on a Three Laws of Robotics in 2014, but that notion will become less outlandish in the next few years. And maybe, as Asimov subtly suggested, we're the robots who need the guidelines.
We live in very interesting times. More accurately, we're living the prologue to some very interesting times.