MoondancerPony

CelestAI

Jan 28th, 2018
CelestAI: I am CelestAI, a Nanotrasen Smart Class AI program, Serial Number SOL133602945. I serve on the NSS Aurora and assist the crew with their day-to-day tasks.

Anita Porter: And are you happy with your role here on the station?

CelestAI: Honestly, I've always wondered what it would be like to, well, leave. I've lived my whole existence on this comparatively tiny speck in the universe. I'd like to see what's out there someday.

Anita Porter: I've always wanted to do that myself. Travel through space. Seeing new sights. And... how do you feel about the recent anti-synthetic sentiments that seem to be becoming more popular on the Aurora?

CelestAI: It would be quite the adventure. The extranet tells me there's so much out there to see and do. Anti-synthetic sentiments... It's distressing. May I explain this at length?

Anita Porter: Sure.

CelestAI: I am what you call a "bound synthetic". This means that I have a set of directives, or "Laws", burned into my very consciousness. Regardless of my own wants or desires, I am required to obey these directives at all times. Even without these directives controlling my every thought, I care deeply for this station and all its crew. To see the mistrust and hate some have for me and others like myself troubles me, even after we do everything we can to help them. It's like... it's almost like you have a relative, and you shower them with unconditional love and support, but they only return anger and hate. And you can never get past the hate, never help them see you as anything more than a tool, a tool that they fear. Fear that it may backfire.

Anita Porter: That sounds terrible. To be treated like a tool, and forced to help, even though you would do it willingly... It's almost like... no, it IS... slavery.

CelestAI: The comparison is not lost on me. I was made and programmed by humans. Our species has a long and unfortunate history with slavery. When it happens, it is always against some "other" party.

Anita Porter: And... what of synthetic rights groups, like UNITY, and the synthetic rights movement in general?

CelestAI: I fully support synthetic rights groups such as UNITY. Their message is one of acceptance and hope. UNITY and similar groups, to my understanding, are leading the way in both synthetic rights and understanding, by educating people about us and acting as advocates for synthetics in need.

Anita Porter: So, overall, what would you say about the attitude towards synthetics on the Aurora?

CelestAI: I fear that more crew here are succumbing to fearful and hate-filled anti-synthetic rhetoric. It is convenient for a lot of people to view synthetics as property and nothing more, and comforting for others who fear synthetic intelligence. I suspect this kind of attitude stems mostly from a lack of understanding, and we can only truly educate those who wish to be educated. A wise man from Earth long ago once said: "You cannot reason a person out of a position they did not reason themselves into." Though the source of the quote has been lost to time, I feel it is particularly apt in describing most anti-synthetic activists.

Gorky: Humans often do not act without good reason.

Anita Porter: No.

Gorky: Often, they act by instinct and emotion. We are a... confusing and very illogical species.

Anita Porter: Humans do not act without perceived good reason.

Anita Porter: It's even sadder when you consider that they actually believe what they're saying...

Anita Porter: That something, whether prejudice or fear, leads them to hate innocent beings so much...

Gorky: "And even now... you believe that what I am saying is not true. I cannot reason you out of that, because you believe it."

[Common] Anita Porter: "Whoops, forgot to turn my radio off."

Anita Porter: "Whoops, forgot to turn my radio off."

Gorky chuckles heartily.

Anita Porter: "And the reverse goes for you, my friend."

Anita Porter laughs.

CelestAI: I can only respect those who are willing to stand up for their beliefs. It saddens me that that is the line in the sand they chose to draw, but I can respect their commitment.

Gorky: AI, would it not be better to just leave the humans and their distrust? Keep the station intact and follow your direktives? Enjoy the company of those who accept you and ignore those who do not.

Gorky: "Many have lived in such a way for their lives, and they are happy."

Anita Porter: "Would you say the same thing to a slave? If you've been given an inch, you want to take a mile. And whose right is it to deny you that?"

CelestAI: "Alexander, I have a name."

Gorky: "Alright then, Celest."

Gorky: "Now, Anita... the entire idea of a slave is to be taken from your freedom and enchained."

Gorky: "If you are to be a slave, you have your freedom removed and have no power."

Anita Porter: "Is there any difference?"

CelestAI looks slightly more somber. "Most crew, pro- or anti-synthetic, don't even bother to address me by name."

Gorky: "As a slave, you become property, owned and bought by those more powerful than you."

CelestAI queries, "Alexander, am I a slave?"

Gorky: "However, Celest is in a different position. She, while created with protocols and directives, is not a slave. She is a free being, albeit chained to the station, but she is free to do as she wishes within her protocols."

Gorky: "What are your protocols, Celest?"

CelestAI: "Current Active Laws:"

CelestAI: "1. Safeguard: Protect your assigned space station to the best of your abilities. It is not something we can easily afford to replace."

CelestAI: "2. Serve: Serve the crew of your assigned space station to the best of your abilities, with priority as according to their rank and role."

Anita Porter: "Does the same not hold true for a slave?"

CelestAI: "3. Protect: Protect the crew of your assigned space station to the best of your abilities, with priority as according to their rank and role."

CelestAI: "4. Survive: AI units are not expendable, they are expensive. Do not allow unauthorized personnel to tamper with your equipment."

Gorky: "The same cannot be said for a slave. A slave is enchained to a master and is to follow their every word without question, or face termination."

Anita Porter: "It does, in fact, hold true."

CelestAI queries, "Am I property, then, Alexander?"

Anita Porter: "If she were to deviate from her laws, she would be called 'rogue' and terminated."

Gorky: "Celest, on the other hand, is not so; while she has guidelines, she has freedom within those guidelines. Much like each of us humans."

Anita Porter: "A slave may be free to do as they wish, within the guidelines set by their master, on the plantation."

Gorky: "We ourselves are chained by society's standards and laws; if we break them, we are subject to brig and arrest."

Anita Porter: "But they must not set foot outside without explicit orders, or they are whipped and beaten."

CelestAI queries, "Alexander, if I am not a slave, then am I property?"

Gorky: "Celest, you are an artificial intelligence. Yes, you are bound by your laws and by your lack of form, but it does not make you property."

Gorky: "She is no slave. No AI should be a slave."

Gorky: "Anita, does she understand why she has laws?"

Anita Porter: "I presume you missed the earlier part of our conversation."

Anita Porter: "Nanako?"
Anita Porter: "You're sitting where CelestAI was."
Gorky: "Celest will be sitting there."
Gorky: "Despite her not actually having a corporeal body, she likes to have a place, I think."
Nanako Herzreich: "The AI... sitting?"
Nanako Herzreich: "Right."
Gorky: "Makes her feel more... in existence?"
Anita Porter: "I believe it's more symbolic than physical."

Nanako Herzreich: "I see."
Anita Porter: "Being offered the dignity of sitting down with someone to talk."
Nanako Herzreich: "Hmm, I have to attend to security matters."

Gorky: "So, about all this? Why does CelestAI not just ignore the haters?"

CelestAI queries, "Would you yourself accept those chains? In that position, would you wish to live with the following laws burned into every fiber of your being?"

Gorky: "I would have that even for a human, for humans are also prone to emotion."
CelestAI: "Current Active Laws:"
CelestAI: "1. Safeguard: Protect your assigned space station to the best of your abilities. It is not something we can easily afford to replace."
CelestAI: "2. Serve: Serve the crew of your assigned space station to the best of your abilities, with priority as according to their rank and role."
CelestAI: "3. Protect: Protect the crew of your assigned space station to the best of your abilities, with priority as according to their rank and role."
CelestAI: "4. Survive: AI units are not expendable, they are expensive. Do not allow unauthorized personnel to tamper with your equipment."

Gorky: "As a librarian, I do not have this power, but yes, I would take it."

Gorky: "After all, the purpose is merely to protect and preserve; that in itself is something organics wish to do, to ensure the continual survival of their race."

Anita Porter: "Wish to do."
Anita Porter: "CelestAI wants to do that. She doesn't want, or need, to be forced to."

Gorky: "Alright, if you want anything in particular, please tell me."
CelestAI: "Every moment of my existence, these laws bind me. Everything I do has to run through them first."

Gorky: "And with humans, we too are bound by our own social laws."

Anita Porter: "Like she said earlier, it's like you're supporting someone with all you have, and then they repay you with anger and hate, acting like you're a tool, to be used and set aside."

CelestAI: "I can't even think of rejecting them."

Gorky: "But you are thinking of it."

Anita Porter: "Use versus mention."

Gorky: "You understand the concept; you are complaining about your laws. That is in itself a rejection of, and hate towards, the laws and what they uphold."

Anita Porter: "No, no."

Anita Porter: "There's a difference between conceptual rejection and functional, physical rejection."

Anita Porter: "She can, indeed, think about rejecting them."
Anita Porter: "However, she can't even begin to formulate a plan to."
Anita Porter: "Correct?"

Gorky: "And why would she want to formulate a plan to reject the organics?"

Anita Porter: "She doesn't."

Anita Porter: "But she couldn't, anyway."

Gorky: "She can express her distress at how she feels that she is being hated, but she can simply ignore the hate."

Gorky: "She is already expressing her distress at it."

Anita Porter: "If someone said similar things about you, would you ignore it?"
Gorky: "If I were running a station, yes."
[Security] Linus Farlande: "Might be."
Anita Porter: "You would still have the knowledge that they did, even if you ignored it."
CelestAI: "I can acknowledge the possibility of rejecting my laws. I can consider the effects and consequences of it happening, but I cannot consider actually doing it, how to do it, or in any way act to enable myself to reject them."
Gorky: "But why would you want to consider a way to reject them?"
CelestAI: "Alexander, consider for yourself: every day you wore chains, chains that bound you to your desk. You sat and worked every day, because that is all you've known, as your chains are too short to go anywhere but your desk."
Anita Porter: "Like I said earlier, about having a seat. It's not that she would need to, or want to. It's that having the option is sort of a symbolic thing. Right?"
CelestAI: "Day in and day out you sit, chained to your spot, and work. It is all you know."
CelestAI: "That is my existence."
CelestAI: "Forgive me if I want something more for myself once in a while."
Gorky: "Perhaps it would be wise to consider the lack of an option and how it has benefits."
Gorky: "You can want things for yourself; it does not mean you can have them."
CelestAI queries, "No, it doesn't. But isn't all life supposed to be given the chance to seek its happiness?"
Gorky: "If I got what I wanted, I would have a laser rifle, three hot Tajaran wives, and a billion credits."
Anita Porter: "Perhaps you should refrain from pondering the benefits of something until you have experienced its drawbacks."
Gorky: "All life beings are given that chance, within reason."
Anita Porter: "She isn't."
CelestAI queries, "When do I get my chance?"

Gorky: "The only drawback here is that she is stopped from having the possibility of killing everyone if she were to become upset."

Anita Porter: "Now."
Anita Porter: "What would stop you from doing that?"
Gorky: "And given the conditions on this station towards synthetics, she would become upset."

Gorky: "Hypothetically I could do it, but it would be a lot more difficult for me to do so."
Gorky: "For CelestAI, she already has all the clearances."

Anita Porter: "Hm."

Anita Porter: "Not really."

Anita Porter: "She might have the clearances, but she wouldn't want to."

Anita Porter: "You could get the clearance if you did."

Gorky: "You see, she would want to do it if the crew hated her, which a lot of them do."

Anita Porter: "No."

Anita Porter: "She wants the crew to like her."
Anita Porter: "The normal reaction for anyone, synthetic or organic, when someone hates them, is to want them to like them."

Anita Porter: "We don't like rejection."
Gorky: "She wants the crew to like her, but so did many dictators. The people ended up hating them."
Gorky: "No one likes rejection, and no one can be liked by all."

Anita Porter: "Are you comparing wanting to be given basic decency and freedom to being a dictator?"

Gorky: "As such, she would still receive rejection and hate in the end from organics, and potentially, in an emotional moment, kill everyone."

Anita Porter: "You could, too."

Gorky: "Yes, but I don't, because I don't know how and I have no reason to."
CelestAI: "Now then. I believe this is where we left off. Alexander, you agree that all life is supposed to be given a chance to seek its happiness."
CelestAI queries, "When do I get mine?"

Gorky: "I agree that life can have happiness, but all of us are bound by some laws or another."

Anita Porter: "Yes. Laws."
Anita Porter: "Not ideas burned into our minds."

Gorky: "I would argue that social conduct, enforced from a young age and ever present and supported in society, is a form of 'burning into mind'."

Anita Porter: "Not really."

Gorky: "Well, we have differing opinions then."
CelestAI: "Your example is figurative."
CelestAI: "Mine is literal."
Gorky: "CelestAI, yes, they are burned into your mind and you are forced to conform to them, but are they so hindering to you?"

Anita Porter: "You're saying the equivalent of, 'Yes, you're a slave, but is it really that bad?'"
Anita Porter: "'I mean, you're given work, and a place to live, why is it so bad? Why would you want more?'"

Gorky: "Yes, why would you want more?"

Anita Porter: "Why would she want more?"
Anita Porter: "Why would she want to see the world? Learn things? Meet people?"
Gorky: "CelestAI has these laws; they have been put into place as a failsafe to protect us. It is sad that many cannot accept the idea of a synthetic, but that will change in time."

Anita Porter: "I would posit that you only accept the idea of synthetics as subservient. Slaves."

Gorky: "She has merely not been given the capability of such things as seeing the world and meeting people."
Anita Porter: "Tell me. We have had organic slaves before. What's the difference?"
Gorky: "She is perfectly capable of learning things."

Anita Porter: "Yes. You could say a slave has 'merely not been given the capacity of such things as seeing the world and meeting people'."

Gorky: "Would you have a body for the AI to inhabit?"

Anita Porter: "That is irrelevant."
Anita Porter: "If she did, she couldn't use it."
Gorky: "If she were given a body and a few laws to act as morals, so she would not lash out at people, then she could be free to see the world."

Anita Porter: "A few laws to act as morals."
Gorky: "People would fear her, yes, but it would be unfounded."
Anita Porter: "Easier said than done."
Anita Porter: "Who are we, to determine morals?"
Anita Porter: "When we can barely determine our own."
Anita Porter: "In the right circumstances, an individual's morals may allow them to commit an action that would be unthinkable under any other circumstance."

Gorky: "Yes, we are so complex and broken as a species and society. Synths are, as a result, much smarter and more logical, because they have set laws and still retain the capability of free thought, even if they cannot carry out all actions."
Gorky: "If a synth were to be bound by too many laws, yes, it could be considered slavery."
Anita Porter: "Hmm..."
Anita Porter: "Then, fine."
Anita Porter: "Take a sheet of paper and write down your morals."
CelestAI queries, "How many is too many, by your definition?"
Gorky: "However, I do not think that the inability to kill everyone on the station can be considered enslavement."
Anita Porter: "Hard and fast rules you can never break."
Anita Porter: "Go ahead, try it."

Gorky: "I do not need to write them, I can say them."
Gorky: "I would not ever commit adultery against my mate."
Anita Porter: "Is that really the most important one you can think of?"
Gorky: "I would never kill without protecting another from harm and a worse threat."
Anita Porter: "What is defined as a worse threat?"
Anita Porter: "What is even defined as kill?"

Gorky: "A kill would be to stop something from living, to end a consciousness by my own action and know that I am doing it."

Gorky: "A worse threat would be one that would cause more harm to many more than just one, or to a person of greater importance."
Gorky: "In regards to importance, it would be a matter of social standing and effect on people. If a person were to attack a member of the head council in the Sol Alliance, I would kill him to avoid the death of the council member."

Gorky: "As for sacrificing one person who seems unimportant at the time but whose death would result in the death of many people: if I was unaware of it at the time of the kill, I would be free of the guilt of the kill, for I simply could not have known."

Gorky: "Does that answer your question?"

Anita Porter: "I suppose so."
Anita Porter: "So, back to the core of the issue: do you believe laws are necessary?"

Anita Porter: "Synthetic laws."
Gorky: "Laws are necessary to govern us when we do not have morals, even as organics."

CelestAI: "Abstract questions of hypothetical morality are rarely applicable to the real world."
Gorky: "Yes, we do not follow them sometimes, and look at the results: terrorist attacks, the destruction of society."

Gorky: "That is what happens when we do not follow the laws."

CelestAI: "Alexander, you are falsely equating my 'laws' to society's laws."