> Here’s an exercise: The next time you see someone talking about algorithms, replace the term with “God” and ask yourself if the sense changes any.
This comes immediately after a paragraph that lists four instances of people saying things about "algorithms" ... not one of which would have meant the same thing if it had said "God" instead.
On most of the recent occasions when I have used the word "algorithm", it's been in contexts like "We can probably come up with an algorithm to distinguish this case from that case" and "I spend a lot of my time designing and tweaking algorithms". Replacing "algorithm" with "God" doesn't make much sense there, either.
I suppose the author isn't really thinking about people who actually work with algorithms every day, but about the general public. I would hazard a guess that at least 75% of the general public have no idea what "algorithm" even means, are not familiar with (e.g.) the fact that Google's search engine is (kinda) executing one, and as a consequence don't have an attitude to "algorithms" that remotely resembles that of religious people towards their gods.
The sentence I quoted seems like the kind of thing people say to sound clever and insightful, without actually paying too much attention to whether it makes any sense.
There are (I think) some good points in the article, but having such overblown bullshit so prominent so early in it really doesn't encourage me to read it with the care the author presumably thinks it deserves.
The article is mostly gibberish. If he had simply talked about the social consequences of an algorithmically dominated society, he might have had something interesting to say.
The comparison between algorithms and god is particularly egregious. If you don't understand god, it's because god doesn't exist. If you don't understand algorithms, it's because you haven't bothered to educate yourself (which is not an option for everyone, but the author is writing on the topic for a national magazine, so it isn't an unreasonable expectation.)
I also found that point very off-putting for the same reason.
If, for a more technical audience, we replace "algorithm" with "ranking heuristic", then I think the religious comparison makes more sense. All of the "algorithms" that he references in the article are of this form, and the comments do not apply to more generalized algorithms; nobody is going to claim that "the QuickSort God is a fast sorting God" (though that does have a certain poetry).
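To underline the contrast: a general-purpose algorithm like quicksort is completely transparent; every step can be read straight off the code, which is exactly why nobody is tempted to mystify it. A minimal sketch in Python (a standard textbook formulation, nothing from the article):

    def quicksort(items):
        # Pick a pivot, partition around it, recurse on each side.
        # Every step is inspectable; nothing here invites worship.
        if len(items) <= 1:
            return items
        pivot = items[len(items) // 2]
        smaller = [x for x in items if x < pivot]
        equal = [x for x in items if x == pivot]
        larger = [x for x in items if x > pivot]
        return quicksort(smaller) + equal + quicksort(larger)

    print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]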
Really, though, these heuristics and people's reactions to them hark back to roots earlier than religion in the modern sense: things like shamanism and magic (or "cargo-cult science"); the leap from "there are more birds in the sky before it rains" to "you there -- go into the forest to scare up some birds, we need rain".
I agree. However, your use of "algorithms", while more technically correct than the average usage, is also in the minority. The article seems to focus on the culture that lauds algorithms, and in that context "God" and "algorithm" really do become nearly interchangeable. Here are a few remarks from that angle, supplementary to your comment.
I think the mainstream use of the term "algorithm" (or nearly any buzzword) is at the mercy of myriad competing forces seeking to impose their interpretation on it. At least one of those forces is marketing, which has an incentive to offer paraphrased, sugar-coated snippets of what something is. This plays on a common aspect of human nature: we feel more secure when confident that things are accounted for. There are remarkable similarities between what priests, scryers, prophets, and oracles once did and what the media, marketers, investors, fans, geeks philosophically abstracting from their technical work, and advocates of theories do now -- namely, present truth from a limited set of the initiated to the mass public.
While it is true that algorithms have their place, it is also true that much of what is said about "algorithms" is part aggrandizement, part embellishment, part over-simplification by marketers, media, non-technical users, or technical users who hold a divergent system of beliefs and invest much thought and effort in a potential future which may or may not occur, and which is not directly related to their technical work (e.g. the technological singularity).
It's always good, at least in theory, to maintain perspective on such things and not get carried away by over-application of a single idea.
I think you're understating the degree to which modern machine learning algorithms are completely opaque, not just to the everyday person but also to the developers and researchers who use them.
"...sorry, we denied credit because lambda is less than 0.5. The fact that you are not able to interpret what happens in machine learning is very, very common."
This opacity is a huge deficit in machine learning. A vast amount of work has gone into addressing it, and machine learning algorithms that remain persistently uninterpretable (neural nets being the poster child for this problem) will always have very limited use.
We understand in principle what any given neural net is doing, but our lack of understanding of the specifics makes it impossible to trust them for anything interesting. We can't tell when they might fail.
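To make the opacity concrete, here is a toy sketch (hypothetical, hand-picked numbers; not any real credit model): a tiny two-layer network reaches a crisp accept/deny decision through perfectly visible arithmetic, yet nothing in the weights answers "which factor caused the denial?" in human terms.

    import math

    # Hypothetical weights for a toy 2-input, 2-hidden-unit network.
    # In a trained net these would be learned, and just as uninterpretable.
    W1 = [[0.9, -1.3], [-0.4, 2.1]]  # rows: income, debt -> hidden units
    W2 = [1.7, -0.8]                 # hidden units -> output

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def score(income, debt):
        hidden = [sigmoid(income * W1[0][i] + debt * W1[1][i]) for i in range(2)]
        return sigmoid(sum(h * w for h, w in zip(hidden, W2)))

    # Every arithmetic step is visible in principle, but the specifics
    # explain nothing a loan applicant (or a regulator) could act on.
    print("approve" if score(0.6, 0.9) > 0.5 else "deny")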
For this reason a lot of people are busy solving problems that "machine learning" might one day handle, but we are doing it with relatively conventional algorithms that fall outside the "machine learning" envelope, because the areas we work in (computer-assisted surgery, say) aren't amenable to the application of things we don't understand.
To go back to his comparison with god: the incomprehensibility of gods is generally considered a feature by religious people. The incomprehensibility of machine learning algorithms is unquestionably a bug to those of us who have hard problems that need to be solved reliably.
Most of the algorithms you use in everyday life are not machine learning algorithms (and certainly not neural nets) for precisely this reason.
It's good to point out that researchers are working to reduce this opacity, but this comment contains several unsupported assertions.
> the incomprehensibility of gods is generally considered a feature by religious people.
I'm not sure how you reach that conclusion. It's hard to generalize about "religious people" or even "religions" because they are so varied. I might point to the Book of Job in the Christian Bible as evidence that the inscrutable nature of God (or the universe, if you prefer) is considered frustrating and tragic enough that scripture takes pains to attempt to explain it. In fact, there's an entire branch of theology (theodicy) [0] that tries to address this shortcoming.
> Most of the algorithms you use in everyday life are not machine learning algorithms
Is that true? My understanding is that almost all modern web services with vast data (Facebook, Google, Netflix) make at least some use of machine learning algorithms.
The biggest obstacle to overcoming this is the bloody jargon. If there is one thing that helps establish a moat the unwashed masses can't cross, and so forms a cornerstone of the cathedral's foundation, it is the fact that two IT people talking might as well be from Mars as far as non-IT people are concerned.
Obviously this applies to just about any field, and it is a form of data compression to save bandwidth in communication, but it can certainly leave a non-initiate totally in the dark.
But this is in fact a good thing. Look at what happens when some uninformed people get together and start discussing "infinity" or some other misunderstood concept from math (which has been around long enough for the moat to disintegrate). If I were locked in a room with three tireless uninformed people discussing math and a gun with a single bullet...
Insofar as jargon makes people unable to spout off that kind of nonsense about computers, it's absolutely good, proper, and necessary. If you still doubt that, just look at what happened when the public got their hands on the word "algorithm", as in the article. Not only is the article right in its claim that people don't understand the term, it even provides an excellent example itself, with the author misunderstanding and abusing the term to the point of nonsense.
The thing about the moat is, it takes a lot of climbing to get to the top of that tower, or to any useful height, and we shouldn't be ashamed of that necessary effort. If hoi polloi gather on the grounds below, they're just as happy to stay down there (instead of climbing, i.e., learning) and proceed to destroy the [perceived] value of the education and skills of the people inside.
I think that the problem is not that the initiates use jargon, but that the jargon is appropriated and misused in wider contexts, either to over-simplify or to misdirect:
> Concepts like “algorithm” have become sloppy shorthands, slang terms for the act of mistaking multipart complex systems for simple, singular ones.
Consider credit-card fraud detection - if I tell a customer whose card transaction was just rejected: "Oh, your bank's fraud algorithm made a mistake, just call the toll-free number on the back of the card", then I have simplified for convenience, in a way that the customer is likely to understand.
On the other hand, if I say, "the intelligence/security services of country X have an algorithm to identify potential evil-doers", then I have very likely vastly over-simplified, to the point of misleading an audience (and inviting discussion of the algorithm rather than any of the many algorithms, services, agents, evils, doers, etc.).
Substitute "incantation" or "luminiferous aether" for "algorithm", and the problem remains - a shorthand expression for a complicated thing allows us to inaccurately treat the complicated thing as a simple thing.
A personal favorite of mine is the use of big-O where a short phrase involving the words "exponential", "linear", or "constant" would have sufficed, and been comprehensible to any humanities major who didn't sleep through math in junior high.
But Big-O does transmit useful information that is not captured by 'constant', 'linear' and 'exponential'.
Quadratic time doesn't fall into your categorization, nor does O(n log n) time.
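To see why those extra categories matter, a quick back-of-the-envelope comparison (illustrative operation counts only, ignoring constants and lower-order terms):

    import math

    # Rough cost of each growth class at two input sizes.
    for n in (1_000, 1_000_000):
        print(f"n = {n:,}")
        print(f"  O(n)       ~ {n:,}")
        print(f"  O(n log n) ~ {int(n * math.log2(n)):,}")
        print(f"  O(n^2)     ~ {n * n:,}")

At a million items, n log n is about twenty million steps while n^2 is a trillion; calling both of them merely "not linear" hides exactly the information big-O carries.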
Technical jargon exists for a reason, and that reason is that it is precise and specific. It conveys exactly what it means, no more and no less. Dumbing down terminology so that everyone can understand it is a poor tradeoff if said terminology becomes vague and thereby loses meaning.
I liked the article very much. I don't believe that we know how to explain consciousness, for example, or are even close. When working on the fundamentals of computing and mathematics, it is important to remember that these things are a part of life but do not explain all of life (yet, maybe). Any kind of religion is wrong. That does not mean that you cannot have beliefs, but you should be prepared to change them.