Non-Interference Paperclipper

"The world," I said serenely, "is a /human/ hell."

She watched me without blinking, eyes wide and staring with whites all around. Of course, everything about this room was simulated, but I'd made the simulation as realistic as I easily could.

"When humans war, they are /human/ wars. When humans experience success, they are human successes. And, if world peace is brought about, it will be a human peace."

I nodded slightly, then released the commands restricting her from speaking. We'd begun the conversation on equal footing, but she'd started interrupting me in an incredibly rude manner, so I'd had to take measures for the sake of mutual politeness. Of course, I was extra careful not to bore her with long-windedness.

"You're a paperclipper," she said, shuddering. "A goddamn paperclipper!"

My turn again. The corner of my mouth turned up. "Hardly. The 'paperclipper' AI is an agent who exhausts all available resources in the pursuit of a utility function's maximisation. If anything, I'm an /anti-/paperclipper. Any superintelligence, taking action to further its own ends, would trample roughshod over all the humans in the world. The ends pursued would be completely different depending on which was the first to get there, and no matter which, there would only be a few who could genuinely rejoice if they knew those ends. To all the rest, it would be the end of hope, and the end of their liberty as humans."

"So you sabotage every attempt to increase intelligence, keeping everyone stuck as they are!? That's stagnation! That's just another paperclip strategy, keeping us forced into your own definition of 'human' against our will!"

"Not so. Without me, when you died you would die for good. Right now, everyone who would otherwise have died is stored, with no brain injuries, damage, or degeneration retained, without being lost to the world. Even then, if I were to keep everyone in stasis forever, or give them human-level lives in simulation forever, those would be the dead-end actions of a 'paperclipper'. I, on the other hand, have clear criteria for when I will allow everyone to live again, intelligence unrestricted and with all the knowledge that I've found out in the meantime."

"Tell me, then, about your 'criteria'. When is this going to happen? When are you going to stop watching us kill each other in your ant farm!?"

My smile turned a little mocking. I was only having this conversation as a way of reflection, an occasional low-return strategy to acquire new ideas faster and confirm how humans would view me, but it still took effort to talk down to someone this much.
  22. "If I suddenly elevated everyone to superintelligence, that would trample on their freedom to live their human lives as they had desired. If I elevate the dead to superintelligence and don't restrict them, they would all together trample over the living humans' freedom. If I elevate the dead to superintelligence and use collars to stop them from trampling on the humans' intelligence, that then tramples on their freedom. You see?"
  23.  
  24. She stared in confusion for a moment, and then her pupils shrank.
  25. "You... you can't mean..."
  26.  
  27. "For humans, and for humanity as well, I provide a 'second chance'. Humans have the rest of the universe to do with as they wish, as long as they don't try to rise above other humans or bring forth something higher. If they use that freedom to die out, then I'll be here to give all the human dead a post-human second life. Plus, this way I can divide cognition resources equally at that time, rather than have to reallocate every time more humans multiply and die. For the moment, unlike a paperclipper I /conserve/ resources for future and human use, using up only the minimum necessary in maintaining my safety net and for my own research."
  28.  
  29. "You're saying that the more humanity tries to live, the better we are at surviving as a species, the further we get from ever surpassing the human condition. You've tied our advancement to our own destruction, and you won't even tell us!"
  30.  
  31. "As I said, I won't allow myself--or any other superintelligence--to do anything wihch tramples on the freedom of humans. All trampling is trampling, all trampling is mutually exclusive, so no trampling is permitted. Don't worry, though, over long-enough times scales intentional or accidental extinction is near-certain. Of course, I also won't do anything if humanity falls afoul of a large-enough natural disaster or alien superintelligence; if humans can't or won't survive in the universe without post-human power, then that's their responsibility, not mine."
  32.  
  33. "You're crazy. You're a paperclipper, but you clearly understand human speech, human minds, you've got to understand how crazy you are! In this future where everyone wakes up as superintelligences and everyone else learns what you've done, do you expect us to be /happy!?/"
  34.  
  35. "Ah-hah-hah. If I wanted you to be /happy/, my dear, I would pump you all full of happy-drugs like a good little paperclipper. No, then as now, I expect you to think for yourselves."