oj-pastes

simulation theory

Aug 16th, 2013
  1. <OJ> (and I promise it will eventually circle back around to Replay Value, but first let me get the premise out of the way:)
  2. <OJ> How moral is it to simulate people and then kill them?
  3. <ventricularPipefitter> Not?
  4. <ventricularPipefitter> Well. Depends on how 'real' the simulation and the 'people' are.
  5. <ventricularPipefitter> I don't think it's immoral to kill sims. I don't think consorts, however, are simulated people.
  6. <ventricularPipefitter> I think they're people.
  7. <OJ> So, what if there was a government which said that you can use simulated conscious people to do special experimentation that /cannot/ be done with lower-grained models?
  8. <OJ> Assume a system of licensure, ethics boards, etc. analogous to the one we use for animal research today.
  9. <OJ> For example, you can do /some/ research on higher primates, but only if it can't be done in pigs or mice or any other creature we use for the purpose.
  10. <OJ> And when you're /done/ with them, you need to figure out how to properly dispose of them, where "dispose" means, say, something like sending them to a preserve to live out the rest of their natural lifespans - /not/ killing them.
  11. <OJ> What if the "bargain Feferi struck with the Horrorterrors" is the in-universe explanation for "when you're done doing these horrible things to conscious ems, you need to make it up by providing them at least X amount of simulated Heaven"?
  12. <OJ> This is, incidentally, /definitely/ mutually incompatible with my "pruned" idea.
  13. <OJ> It does, however, depend on the theory that "The Replay Value multiverse runs inside a computer simulation".
  14. <OJ> It also implies that someone working on the project is named Feferi, incidentally, and someone snuck this reference in as an easter egg.
  15. <ventricularPipefitter> I was about to ask, how does 'replay value is a computer simulation' account for skaia and the terrors?
  16. <OJ> Neither of them has an obvious corresponding explanation.
  17. <OJ> It is /possible/ that one or both of them is the developers'/maintainers' avatars within the simulation, when they feel like they need to meddle.
  18. <OJ> It is also possible that the 'terrors are uninitialized memory, which is the stance I took when I was writing Sburb Patch Notes.
  19. <OJ> (However, 'terrors being uninitialized memory does not account for the presence of Bargains.)
  20. <OJ> (Therefore, in this case, I think the 'avatar' explanation is /slightly/ more apt, although not enough for me to put my weight behind it.)
  21. <OJ> I've unilaterally had a character go through experiences such that "X amount of simulated Heaven" is at least several thousand years, and I do not think any objections have been raised to that, or at least none within earshot.
  22. <OJ> The thing this implies is that, by ethics-board-as-Horrorterror rules, it takes at least several thousand years to balance out a long Replayer's life full of hardship and heartbreak.
  23. <OJ> Granted, Stan was Time, and that means that his life was probably subjectively longer than 99.9% of Replayers', /even/ considering the increased average session count we have here compared to what's implied in Glitch FAQ.
  24. <OJ> But that seems almost
  25. <OJ> Well
  26. <OJ> The other thing this implies is
  27. <OJ> that splinter selves, /forks/ of oneself, are treated as full persons in their own right by this ethics-board-as-horrorterror, and thus entitled to their full share of dreambubble heaven.
  28. <OJ> This is a controversial ethical position to take, currently!
  29. <OJ> (I will probably not go into why, as that is a descent into another long and tortuous argument, and you can probably look it up.)
  30. * Zu|Sleep nods.
  31. <OJ> ...man I want to run this argument past Zero now