Let’s take this to its logical limit: imagine that AI gets so good that it can replace a lot of jobs en masse. In roughly half of US states the most common job is truck driver, and in most of the states where teachers top the list, truck drivers are number two. Let’s say we get not only self-driving trucks but self-driving trucks loaded by robotic warehouse workers. Let’s postulate that this becomes the norm, and that along with robotic truck drivers and warehouse workers, we also get robotic cashiers, road work crews, and so on. I am not giving a timeline here, but let’s explore what happens when we get to 30% unemployment. Then 40%, 50%, 80%.
Sure, we will have the robot-wrangler engineers, scientists, teachers, nurses, etc. But typically we see social unrest past roughly 8% unemployment. What happens when a double-digit share of people have no jobs and all the time in the world on their hands? Well, “eat the rich” might become very literal, and no amount of protection against that can really be bought. Ultimately, the only option is either a Dune-style elimination of all AI (very unlikely) or we will have to decouple “living wage income” from “job”. If you think about it, the idea that you must have a job in order to make money is more of an implementation detail. If robots produce so much value that it is actively not even viable for humans to work, the only real logical solution is to distribute the profits from the labor of the robots in a way that isn’t hourly rate times hours worked. One possible way to do this is to tax the value produced by AI and then funnel that into a universal basic income program. Everyone by default is an artist. If you want to be a nurse or teacher or scientist or engineer, you can. Otherwise, just produce art at your leisure while the robots work the fields and cook your meals.
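A back-of-envelope sketch of the tax-and-redistribute idea. Every number here is a hypothetical placeholder I picked just to show the shape of the arithmetic, not a forecast:

```python
# Back-of-envelope: fund a UBI from a tax on automated output.
# All figures below are hypothetical illustrations, not real data.

gdp = 25e12              # assumed annual economic output, in dollars
automated_share = 0.5    # assumed fraction of that output produced by robots/AI
tax_rate = 0.4           # assumed tax levied on the automated share
adults = 260e6           # assumed number of adult recipients

revenue = gdp * automated_share * tax_rate   # 5 trillion dollars
ubi_per_person = revenue / adults

print(f"Annual UBI per person: ${ubi_per_person:,.0f}")  # ~ $19,231
```

The point is just that the viability of the scheme is dominated by two knobs, the automated share and the tax rate; everything else is bookkeeping.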
Unfortunately there is a very affordable alternative solution:
1. Massive population reduction (war is a very efficient way to achieve this)
2. Birth control, to slow down population growth to a stable rate near 0
3. Eugenics, to ensure only people with needed capabilities are born (Brave New World)
In this scenario, 500,000 people (or fewer?) in charge of millions of robots and a minority of semi-enslaved humans would freely enjoy control over the world. The perfect mix of Asimov and Huxley.
All the agitation about "building a 1984-style world" is, at best, just a step toward this Asimov/Huxley model, and most likely, a deliberate decoy.
> If robots produce so much value that it is actively not even viable for humans to work, the only real logical solution is to distribute the profits
You don't understand. Almost nobody actually thinks about this in the right way, but it's actually basic economics.
Salt.
We used to fight wars for salt, but now it's literally given away for free (in restaurants).
If "robots produce so much value" then all that value will have approximately zero marginal cost. You won't need to distribute profits, you can simply distribute food and stuff and housing because they're cheap enough to be essentially free.
But obviously, not everything will be free. Just look at the famous "cost of education vs TV chart" [1]. Things that are mostly expensive: regulated (education, medicine, law) and positional (housing / land - everyone wants to live in good places!) goods. Things that are mostly cheap: things that are mass produced in factories (economies of scale, automation). Robots might move food & clothing into the "cheap" category but otherwise won't really move the needle, unless we radically rethink regulation.
Regulation might be one part of the equation, but I would like to understand a bit more deeply how much of a cost driver it is.
Healthcare is heavily regulated in some countries and less in others. It should be possible to get some comparisons.
I somehow feel that keeping many humans healthy is fundamentally a really hard problem that gets harder every year, because the population ages and expectations rise. It feels like too easy a talking point to pin it all on regulation.
This is very well thought-out. But if regulation has such power, shouldn't we find better ways to use it?
Yeah, I know, it's very hard to craft good legislation. In fact, there's this problem of agency: the will to have things be a certain way does not always reside in humans, or does not always emanate from the direct needs of the people. Many of the problems of modern capitalism arise because there's emergent agency in non-human things, i.e. corporations. In the case of the US, agency emerging from the corporate world has purportedly sequestered democracy. But there's also agency emerging from frenzied political parties that define themselves as opposition to each other, with a salted no-man's-land in the middle. This emergent agency is not new; it existed before in other institutions, e.g. organized religion. In any case, the more things there are vying for power, the less power people have to govern themselves in a way that is fair.
With AI, there's a big chance we will at least super-charge non-human agency, and that's if we can avoid the AIs themselves developing agency of their own.
I think it's inevitable that most jobs will be eventually automated away (even without AI), and that's going to come with major class struggle and restructuring of society.
You underestimate how creative people are at finding things to do. Society didn't collapse when we automated work that was once done almost exclusively by humans, such as agriculture.
The difference I see is that when we automated car factories, etc. there were still loads of jobs that (a) were better done by a human AND (b) most humans could perform them. The issue is that if you eliminate all jobs that don’t require a PhD or equivalent, what happens to the people who are just not cut out to be nuclear physicists or biochemists, etc?
> The issue is that if you eliminate all jobs that don’t require a PhD or equivalent, what happens to the people who are just not cut out to be nuclear physicists or biochemists, etc?
The divine right of the new kings will be born from social Darwinism
Past performance is not indicative of future results.
Everyone keeps saying 'no need to worry, no need for society to plan, because new jobs appeared in the past'. Like we should just put all our hope in these magic future jobs appearing. Plenty of countries exist where there aren't enough jobs. We aren't exempt from that, as if some magic job fairy is looking out for us.
They are never going to just lift everyone up. We could have done that for world hunger; we didn't. They gutted USAID over 38 billion, but sent 40 billion to Argentina because 'business'. They don't care if our living standards become the same as the third world's. Just like we didn't care all that much that the third world had really rough lives. How do you currently think about the third world? That is about as much thought/concern as we will get. We are cooked if we leave it to 'THEM', be they business or government.
I think the difference is scale. When for every person with a 9-5 job and a living wage you have 5-10 able-bodied adults who live next door, are starving, and have nothing to lose, not much can physically protect you. It becomes economically cheaper to share than to protect your assets.
I stopped reading past the first sentence, because that is not the logical limit. The logical limit is an unparalleled superintelligence that's akin to a god. What you state as the extreme limit is incorrect. Therefore, with a faulty premise, all consequential propositions are inherently flawed.
Nonsense. If we give ourselves over to a superintelligence that we ourselves cannot fathom, there is no point in trying to argue about what that will look like. An ant cannot understand how a Saturn V rocket works, no matter how hard it tries or how much time it has. But you do you :)