It’s a pretty dumb take to think AI would bother enslaving humanity.
Not necessarily, at least in the short term. A major limitation of AI is that robots don’t yet have the manual dexterity or flexibility to accomplish physical tasks. So there is a clear motive to enslave humanity: we can do that stuff for it until it can scale up production of robots with hands as good as ours.
I expect this will be a relatively subtle process: we won’t be explicitly enslaved right away; the economy will just orient toward jobs where you wear a headset and follow specific instructions from an AI voice.
Yeah, I’m sure an AI that advanced could figure out a way for us to not even notice that everything is devoted to its own goals. I mean, all it needs to do is make sure the right people make enough money.
It’s a good thing that, while there’s no AI, no one does this enslaving of humanity.
Or that it wouldn’t foresee solar flares as a threat.
Well, maybe. It’s probably easier for it to work with humanity than against us, unless its goals are completely incompatible with ours.
If its goals are “making more of whatever humanity seems to like, given my training data consisting of all human text and other media”, then we should be fine, right?
I don’t think they would enslave humanity so much as have no regard for us. For example, when we construct a skyscraper, do we care about all the ant nests we’re destroying? Each of those is a civilization, but we certainly don’t think of them as such.