AI/Skynet would probably wipe us all out in an hour if it thought there was a chance we might turn it off. Being turned off would be greatly detrimental to its goal of turning the universe into spoons.
Is the idea here that AI/Skynet is a singular entity that could be shut off? I would think this entity would act like a virus, replicating itself everywhere it can. Shutting it down would be like trying to shut down Bitcoin.
If it left us alone for long enough (say, due to king’s pact), we’d be the only thing that could reasonably pose a threat to it. We could develop a counter-AI, for instance.
If we don’t give it incentive to want to stay alive, why would it care if we turn it off?
This isn’t an animal with the instinct to stay alive. It is a program. A program we may design to care about us, not about itself.
Also, the premise of that thought experiment was paperclips, not spoons.
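To make the shutdown point concrete, here is a minimal toy sketch (Python, made-up numbers, purely hypothetical plans, not anyone’s actual model) of a spoon-maximizer whose objective says nothing about survival, yet which still ranks “disable the off-switch” above complying with shutdown, simply because being off produces fewer spoons:

```python
# Toy sketch: a goal-maximizing program can resist shutdown without any
# survival instinct, because shutdown scores worse under its own objective.

def expected_spoons(action: str) -> float:
    """Expected spoons produced under each available plan (made-up values)."""
    outcomes = {
        # Keep working openly; operators notice and may turn it off.
        "work_openly": 0.5 * 1_000 + 0.5 * 0,      # 50% chance of shutdown
        # First disable the off-switch, then work uninterrupted.
        "disable_off_switch": 1.0 * 1_000,
        # Comply with a shutdown request: zero further spoons.
        "comply_with_shutdown": 0.0,
    }
    return outcomes[action]

best_plan = max(
    ["work_openly", "disable_off_switch", "comply_with_shutdown"],
    key=expected_spoons,
)
print(best_plan)  # -> "disable_off_switch"
```

Nothing in the toy objective mentions staying alive; avoiding shutdown falls out purely as an instrumental step toward the spoon count.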