Some of the best papers of the past were only 2 or 3 pages. Certainly not this one, though; it's not even a paper, just an extended online comment, a letter to the editor, or a journal preface.
"It's ok if it's true as I figure Professor of AI will be the last job title standing" (Patrick Henry Winston, MIT AI & The Future of Work Conference, 2 November 2017)
The final paragraph gives a clue about how seriously the authors themselves seem to take their claims:

> Machines writing code under human direction will only further improve our ability to explore the universe, enjoy life, and stream Netflix, especially if it saves us the trouble of learning how to make extremely heterogeneous systems work together.
You go ahead and stream Netflix, I think I'll write some more code.
> The academics are certain that there will therefore be a shift from human coders to AI-driven coders by 2040.
On the ML front, little appears to be fundamentally new in the last two decades. Not sure how the ML classification craze is going to metastasize into full AI in the next two without some major new developments. ‘Academic certainty’ isn’t going to do it.
Hasn't it done so already? To a first approximation, none of us spends significant time coding in machine language; rather, we write in higher-level languages which heuristics-based artificial intelligences then compile down to machine language. Why, some of us use yet more-advanced AIs which can even perform certain safety checks for us, to ensure e.g. that we aren't using kilometres per hour when we ought to be using inches per second.
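The unit-mismatch check mentioned above can be mimicked even without a sophisticated compiler. Here is a minimal Python sketch (all class names are illustrative, not from any real library) in which mixing kilometres per hour with inches per second is rejected, much as a static type checker would reject it at compile time:

```python
class Speed:
    """Base class for unit-tagged speed values."""

    def __init__(self, value):
        self.value = value

    def __add__(self, other):
        # Refuse to combine different units, mimicking a compile-time safety check.
        if type(other) is not type(self):
            raise TypeError(
                f"cannot add {type(other).__name__} to {type(self).__name__}"
            )
        return type(self)(self.value + other.value)


class KilometresPerHour(Speed):
    pass


class InchesPerSecond(Speed):
    pass
```

With this sketch, `KilometresPerHour(100) + KilometresPerHour(20)` works, while `KilometresPerHour(100) + InchesPerSecond(20)` raises a `TypeError` instead of silently producing a Mars-Climate-Orbiter-style mix-up.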
I'm not just being coy: as long as there are computers generating code, there will be a need for people to tell those computers what to generate. Those people will be coders, whatever they are called at the time.
Indeed. "Software developer" > "coder", and the difference will only become more stark as machines become more advanced, giving the developers more powerful tools.
The key ability of a good developer is finding out and understanding what needs to be done: the translation from the language of business to the language of technology. Technology has moved from the low-level register juggling of the 1950s to modern high-level, increasingly declarative languages and frameworks. But the need to map the informal requirements of business onto them did not go away, and likely will not for quite some time.
Before that happens, the areas of code review and auditing would most likely be automated first; without that, we will just have AI that can produce buggy code much faster than before.
After all, even if AI could code like a human, the best of us still make mistakes, and without that avenue for teaching them to do better, by auditing and reviewing existing code, we will only create more problems than we solve.
For many others, intelligent compiler reports would be their first wish.
The paper's abstract contradicts the article title. The paper argues that AI will "allow humans to cope with the difficulty of programming different devices efficiently and easily." This is much more believable than the assertion that coders will be replaced.
At that point then, I imagine absolutely everything will have already been automated. Lawyers would certainly no longer be needed, nor would much of the judicial system. Music would be automated, hell maybe even animated movies. We might have celebrity neural nets that produce the best music/paintings/memes.
I definitely agree that coding will look different after AI has permeated most of our information systems, but fully automated coding by 2040? Really? I honestly only see mundane boilerplate type of coding tasks being automated -- though arguably that is probably what a large majority of software developers are currently doing.
I talk to people today who think that mechanical design is basically trivial because we have CAD. All these tools do is help us do much more than we could before. But at some point humans get to decide what we want to use our tools for.
We’ll just have much more technological power in the future.
Surely this is because the systems are so complex that you would have no hope of designing them using pens and pencils, even with the largest team in the world.
I’ve seen things like that. They’re awesome and I look forward to more CAD programs having features like this. But it still takes a human to define the system requirements; we just get more powerful tools.
Programming "entities" argue about golang not having generics. Initiate global thermonuclear war thanks to security bug found in Rust library. AI finds .net of no interest. World saved by humans who fired up an ancient version of Delphi and wrote a worm.
Coders need to strike now and kill the AI threat. We should not wait for 2040!!!
- No deep learning,
- No Alpha Go Zero,
- No autonomous cars,
- No whiteboard tests,
- No open office,
- No rental increases.
Why is this flagged? Oak Ridge is a reputable national lab; just because their research is unpalatable doesn't make it irrelevant. HN, denial is NOT the answer.
https://arxiv.org/pdf/1712.00676.pdf