vereornox

Untitled

Oct 17th, 2017
  1. <Elara> so yeah, Hiero
  2. <Elara> a republic can be a democracy, and many are
  3. <Elara> my point is that Machiavelli wrote in favour of republics, not of democracies
  4. <Elara> and you can be a republic without making even a pretence at democracy
  5. <Elara> (a civic republic with a city militia is going to be responsive to public opinion, whether or not it ever actually holds a vote)
  6. <Megafire> How popular were Democracies at the time of Machiavelli, again?
  7. <Elara> popular as in well-thought-of, or popular as in with-a-broad-franchise?
  8. <Elara> I mean, the answer to both is "not very", but
  9. <Megafire> Just trying to frame in my head when what happened.
  10. <Megafire> But yeah, Machiavelli was very much in favour of the Republic of Florence, as opposed to having the place be run by a ruling family like the Medici.
  11. <Elara> democracy wasn't really well thought of until ~the 18th century, and even then there was a lot of opposition and uncertainty
  12. <Megafire> (And then he supported the Medici anyway because they ruled Florence, and whoever rules Florence had better be powerful, because Florence is best.)
  13. <Elara> and the democracies that did exist tended to have restricted franchises, so that only reasonably rich men could vote
  14. <Megafire> Hmhmm.
  15. <Elara> though if I remember right a few women could vote in England - widows who passed the property qualification
  16. <Elara> they actually had the vote taken away from them when there was a reform to let more men vote, and I've never been sure whether it was deliberate
  17. <Megafire> Yeah, it went to land owners, and women were capable of owning land, albeit not in any easy way.
  18. <Megafire> Huh, really?
  19. <Megafire> Was that a reform to let soldiers vote?
  20. * Druza (uid209687@charlton.irccloud.com) has joined
  21. <Elara> nah
  22. <Elara> I think it'd be the Great Reform Act of the 1830s, but it might be one of the more minor ones from the mid C19th
  23. <Elara> anyway, it massively increased the (male) franchise, but it made the primary qualification a penis rather than propertyholding
  24. <Megafire> Well, it /is/ easier to check than land deeds.
  25. * You are now known as Vernap
  26. <Elara> :P
  27. <Elara> ah, it was 1867 that made all householders voters. 1832 merely doubled the franchise by reducing the propertyholding requirement
  28. <Elara> for all men to vote took til 1918, which also added limited female suffrage
  29. * Ellardy (uid178344@brockwell.irccloud.com) has joined
  30. <Megafire> A consolation prize for sending all men off to WWI, yeah.
  31. <Megafire> 'Hey, we've sent all these guys through literal hell for no real gain, do y'think we should give them a say in whether to do that again?'
  32. <Elara> mmm, it has been claimed that the age for women to vote (30) was picked because much younger and they'd have made an immediate majority of the electorate due to the war casualties among men
  33. <Megafire> Heh.
  34. <Elara> reminding myself of some of these details, I have also been reminded that there used to be "university" seats in parliament, so graduates got two votes - one where they lived, and one for the uni seat
  35. <Megafire> And they got rid of that but kept the House of Lords?
  36. <Megafire> And the religious leaders?
  37. <Ellardy> I still have a graduate vote
  38. <Elara> hey, I never said it was sane Meg
  39. <Megafire> Oh, I'm not saying they should've kept it.
  40. <Ellardy> Well, I will have it once I graduate but Ireland has three senators just for graduates of major Irish universities
  41. <Megafire> I'm just imagining a Britain in which they had kept that, and gotten rid of the Church leaders.
  42. <Elara> but yeah, Oxford, Cambridge, London, Wales, and two seats for "Combined English universities", so about 1% of parliament was elected by STV from graduates
  43. <Ellardy> Oops, I stand corrected. Three just for my university :p
  44. <Elara> TCD, Ellardy?
  45. <Ellardy> Yep
  46. <Ellardy> Our senators are absolute badasses though so it's justified
  47. <Elara> wonder if it's a hangover from our system
  48. <Ellardy> Everything here is a hangover from Britain. But here I think it's to do with the "panel" system de Valera came up with
  49. <Aliphant> I like how everyone said "I want the vote to represent the will of the people" then promptly instituted about 50 rules that corrupt that idea
  50. <Elara> panel system?
  51. <Aliphant> Like having only rich property owners able to vote
  52. <Ellardy> Ireland's legislature is a mirror of the British one but they didn't have any lords to fill the upper house and didn't want to make lords so they created a panel system
  53. <Ellardy> Most of the seats of the Seanad are supposed to represent the different categories of people
  54. <Ellardy> So farmers get X number of seats, arts get X number of seats, graduates get X number of seats (split by uni) and so on
  55. <Elara> to be fair Hiero, restricted voting came first, then "let's represent the will of the people" came second
  56. <Ellardy> Of course, it reflects society as De Valera imagined it to be in the 1930s so it's somewhat out of date now
  57. <Elara> waaaaay second
  58. <Aliphant> Wait. Then what's the point of having voting at all in the first place?
  59. <Aliphant> If you have 0 interest in representing the will of the people
  60. <Ellardy> Bourgeois rule is better than monarchic rule
  61. <Megafire> ^
  62. <Elara> the origin of Parliament was in the middle ages, where the idea was to have leading figures in local communities come to Westminster, agree how much tax they were prepared to give the king, and then go back and collect the damn taxes
  63. <Megafire> A very specific set of people both wanted to have more to say and had the power to enforce their right to vote.
  64. <Elara> they were selected by a vote, but the idea wasn't to "represent" the people, the idea was to pick people with enough standing in the community that they'd be able to get money out of said community
  65. <Ellardy> Also, the fear of mob rule meant that they understood "people" to mean "people educated enough not to go crazy with this new power". We still restrict voting rights of minors, foreigners, prisoners, etc.
  66. <Elara> everything else has accrued on top of that - a way of getting more tax out of the country
  67. <Aliphant> So... the idea of voting came from a kind of oligarchical idea where a bunch of smart people got together to agree/vote on how to rule the country?
  68. <Megafire> Yes.
  69. <Ellardy> "i National Language and Culture, Literature, Art, Education and such professional interests as may be defined by law for the purpose of this panel; ii Agriculture and allied interests, and Fisheries; iii Labour, whether organised or unorganised; iv Industry and Commerce, including banking, finance, accountancy, engineering and architecture; v Public Administration and social services, including voluntary social activities." <--the
  70. <Ellardy> other panels
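(For reference, a rough sketch of the Seanad structure Ellardy describes above: a block of seats per vocational panel plus the university seats. The seat counts below are approximate figures added only for illustration; they are not taken from the log itself.)

# Rough model of the vocational-panel Seanad described above.
# Seat counts are illustrative approximations, not taken from the chat log.
seanad_seats = {
    "Cultural and Educational panel": 5,
    "Agricultural panel": 11,
    "Labour panel": 11,
    "Industrial and Commercial panel": 9,
    "Administrative panel": 7,
    "University of Dublin (TCD) graduates": 3,
    "National University of Ireland graduates": 3,
    "Taoiseach's nominees": 11,
}

total = sum(seanad_seats.values())
for constituency, seats in seanad_seats.items():
    # Print each constituency with its share of the house.
    print(f"{constituency:<42} {seats:>3}  ({seats / total:.0%})")
print(f"{'total':<42} {total:>3}")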
  71. <Elara> the idea was it was a council that could advise and assist the king in ruling
  72. <Ellardy> Hiero: you live in a country which places a minimum wealth requirement on the president and an overbearing executive answers to a parliament composed only of smart people from a single party. It's not that far off
  73. <Elara> XD
  74. <Aliphant> Yeah the minimum wealth requirement on the presidency is the dumbest thing ever
  75. <Ellardy> Even the reasoning given is remarkably similar except the fear is not mob rule leading to taking from the rich but mob rule leading to ethnic strife
  76. <Aliphant> Elara: Where did this will of the people idea come in then
  77. <Ellardy> It's the ideal that can't be reached I guess? If we believed that unreservedly, we'd have direct democracy or sortition instead of representative democracy via election
  78. <Elara> when the Commons started finding its own voice, started thinking of itself as a partner in ruling the country. the King's right to rule was obvious, but where did their right come from?
  79. <Ellardy> We recognise politics needs a good deal of background knowledge so we let an elite handle it but with the caveat that the people can pull the plug at any time
  80. <Ellardy> Well yes. The cynic would say that it was a meaningless slogan the bourgeoisie used without believing in, all the better to manipulate the masses against the previous rulers
  81. <Elara> the time period I'm talking about, it's more like the gentry manipulating the bourgeoisie
  82. <Elara> but hey
  83. <Ellardy> I believe O'Brien in 1984 makes that exact argument. The Top, the Middle and the Low or whatever
  84. <Aliphant> The right to rule always stems from righteousness. Not because X number of people said you're the ruler, or because God said you were, or because your daddy and granddaddy were all rulers. You have the right to rule if and only if you're someone of upright moral character who has the righteousness necessary to use that moral character in service of the country.
  85. <Elara> righteousness?
  86. <Aliphant> That's my belief, yes.
  87. <Elara> that's...
  88. <Elara> because here's the thing, righteousness is both highly subjective and impossible to judge
  89. <Aliphant> The characteristic of being morally correct, of having really good moral values, of being fair and impartial, of generally Doing Good
  90. <Aliphant> Well, yes >_<
  91. <Elara> *definitively judge
  92. <Aliphant> Which is why we've been disputing over who's the most righteous since the beginning of time.
  93. <Aliphant> Right now the idea is that the people know best who is righteous, so we'll let them vote, and they'll collectively come to a decision on who's righteous. But the end result must be to select a righteous person, not a popular one.
  94. <Aliphant> In other words, you should vote for someone not because they serve your goals, but because they are genuinely someone who you think is a good person who wants to do good things.
  95. <Elara> I think the framing is incorrect
  96. <Aliphant> Even if they'd increase the taxes imposed on you, or even if other candidates would cost you more in some other way.
  97. <Elara> the right to rule is granted by consent of the ruled. we can argue about how that consent can be indicated, but ultimately if you're maximally righteous but the people reject you, you will rightfully leave them alone
  98. <Elara> (Consent: Not Just For Sex (TM) )
  99. <Ellardy> ...
  100. <Ellardy> I will never be able to study The Leviathan again
  101. <Elara> why's that, LRD? ^^
  102. <Aliphant> I don't agree. I think that people have a right to impose a righteous rule on someone else even if that someone else doesn't want it. If you really are *correct*, then... what does it matter what the other person wants?
  103. <Aliphant> The question is how you even know you're correct in the first place, of course. But if we take this hypothetical situation that there really is some uber-righteous person, then they have a right to do whatever they wish.
  104. <Aliphant> All of politics is just an exercise in finding the best system to determine this person, or create one.
  105. <Elara> no they don't, because some actions they might wish to take would be unrighteous
  106. <Elara> like, e.g., taking rulership over a people which rejects them
  107. <Ellardy> Hobbes felt that consent of the governed could be obtained at sword point
  108. <Elara> Hobbes was wrong
  109. <Ellardy> Hobbes got a lot of things right though
  110. <Ellardy> All of liberal tradition is based on the breakthroughs of a profoundly illiberal person
  111. <Ellardy> But yes, that was one of his stupider ideas
  112. <Elara> oh, I get that a lot of his thinking is sound
  113. * Netrunner has quit (Quit: Connection closed for inactivity)
  114. <Aliphant> I don't quite agree that taking rulership over a bunch of unrighteous people who reject you, and doing so in the name of righteousness, is *inherently unrighteous* by itself. If there is a country of scoundrels and bandits all fighting amongst themselves, then to me it is quite righteous for someone external to come in and enforce the rule of law and drain the blood from the streets.
  115. <Elara> there has never been a country of scoundrels and bandits
  116. <Aliphant> Even if every scoundrel and every bandit and every warlord within that region were to vehemently disagree with that consent.
  117. <Aliphant> I know. I'm just giving an example of a situation that I feel that you and I would both agree on.
  118. * Ellardy is now known as EllardyAway
  119. <Elara> in a situation where there is severe disorder, it might be legitimate to go in and establish civil order
  120. <Aliphant> It's a thought experiment. If, in this scenario, we accept that it's okay to forcibly rule over these people, then we must ask ourselves why we think that's okay. The reason would be that we believe there's a principle that says, when we see a wrong, it's fine to correct it. When we see disorder, it's fine to use order to correct it.
  121. <Aliphant> I think that that principle applies to the real world as well.
  122. <Aliphant> There are many wrongs in the real world, and there are many people who seek to correct those wrongs. I don't think these people are bad simply because a few people - who very often coincidentally are the ones who benefit from the wrongs - decide to object for some sketchy reason or other.
  123. <Elara> here's the thing though, in that case? if the disorder is sufficiently bad to require external intervention
  124. <Elara> there will be a mass of the population which wants the disorder to stop, and lacks the power to make it stop
  125. <Elara> so some interventions will have the backing of (much of) the population as long as they're effective
  126. <Elara> for example, until things went tits up (as they so often do), the Army deploying to Northern Ireland had broad popular support from both Protestants and Catholics in the province
  127. <Aliphant> Elara: Sorry, I'd love to continue this conversation, but something distracted me. We can hold this discussion for later? And I'll come back and explain why I feel that way.
  128. <Elara> you can if you want. you'll remain wrong :P
  129. <Aliphant> Okay, it's handled.
  130. <Aliphant> Alright, so here's the way I see it. Both you and I agree that disorder is wrong and cancerous within the world. You argue that disorder is not something we have to violate consent in order to put down, because where there is disorder there will always be unrest. That may or may not be true - various Ayn Rand style libertarian paradises come to mind - but let's grant that as a given for now. But I think that, setting disorder aside, there are
  131. <Aliphant> plenty of evil things in this world that we can both agree are evil, but that there is no majority consent to get rid of.
  132. <Aliphant> The primary examples I will use here are things like Nazism, racism, sexism, etc.
  133. <Aliphant> I will repeat my statement given above. I don't believe that someone's right to bodily autonomy outweighs the right of someone else to make the world a better place and to perform a righteous act which gets rid of evils within the world.
  134. <Megafire> (Chaos is a social good, and I am somewhat wary of your rant against disorder.)
  135. <Aliphant> While I would prefer not to violate someone's consent, if they push me to it and refuse to turn over a new leaf no matter how much I beg and plead with them, or reason with them, I definitely won't let the fact that they refuse to consent stop me from calling on someone to do something about their depravity.
  136. <Elara> diversity is a social good, chaos often comes with diversity, and order is only necessary as a baseline to allow other things
  137. <Aliphant> Diversity is a social good, I agree.
  138. <Cyrix> what are righteous acts? what is evil? and what is the "greater good" Aliphant?
  139. <Aliphant> What I don't see is why we have to allow Nazis in the name of "diversity"
  140. <Aliphant> I'm not even saying that everything I view as evil has to be stopped even if it violates consent.
  141. <Elara> we don't, we allow Nazis in the name of freedom of expression and assembly
  142. <Aliphant> I'm saying that something that *everyone agrees is evil* should be stopped.
  143. <Megafire> Nazis are an unfortunate side-effect of something good.
  144. <Cyrix> thats a milkmaid argument Aliphant
  145. <Aliphant> Well, the Nazis don't, but fuck 'em, they're Nazis.
  146. <Megafire> Okay, so, I agree that Nazis are evil.
  147. <Megafire> But the logic you're using here is very dangerous.
  148. <Aliphant> Megafire: So why not trim and alter the good thing to not allow Nazis? Look, you want freedom of expression so you can have a wide range of political opinions and everyone can debate and be happy, right?
  149. <Elara> if there are a thousand Nazis, there are at least ten thousand people who aren't Nazis, but think Nazis aren't any worse than (insert other thing here)
  150. <Aliphant> But it turns out you can have a wide range of political opinions without condoning Nazism
  151. <Aliphant> this channel would be one example of such a place
  152. <Megafire> No, I want freedom of expression because freedom of expression is good.
  153. <Aliphant> Wait, seriously? Talking is a terminal value?
  154. <Megafire> Yes.
  155. <Cyrix> @Aliphant Nazis came to be exactly because of what you describe
  156. <Cyrix> not the other way 'round
  157. <Aliphant> o_o That is NOT the usual argument I've run across in this kind of case, which is that if you remove it the world will instantly become 1984
  158. <Elara> not talking per se, but expressing yourself yes
  159. <Megafire> Being allowed to talk is a terminal value.
  160. <Elara> fundamental freedoms include expression, assembly, conscience
  161. <Megafire> ^
  162. <Aliphant> Well, I'll excuse myself from this conversation, because I simply don't agree with that. I think it's good in that it makes people happy and fulfills their preferences, but it's not a terminal good. And I don't think there's any point butting heads once we find out there's a terminal conflict.
  163. <Aliphant> Because otherwise we're just going to go "this is a fundamental freedom/good!" "no it isn't!" etc.
  164. <Elara> I think we can be a little more interesting than that
  165. <Aliphant> Cyrix: If Nazis emerged as an attempt to censor Nazis, then like... censor harder ;)
  166. <Megafire> I can explain why it's a fundamental freedom, namely the fact that allowing people to be fundamentally wrong in the face of social pressure also allows people to be fundamentally right in the face of social pressure.
  167. <Cyrix> @Aliphant you are doing it wrong - fundamentally
  168. <Megafire> And, also, chaos is a social good.
  169. <Elara> censoring harder turns the censoring apparatus into something closer to ersatz Nazis
  170. <Elara> is the problem with that approach
  171. <Megafire> ^
  172. <Cyrix> @Aliphant you are just furthering a rift between groups and stoking the fires of hate by doing that
  173. <Megafire> Like, you may not be a Nazi, but you're sure as hell getting close to fascism.
  174. <Cyrix> nazis capitalized on fears and resentment
  175. <Cyrix> and what you want to do is exactly that: providing resentment and fear
  176. <Aliphant> Megafire: There are two types of people that statement is concerned with. People that are fundamentally wrong in the face of social pressure, and people that are fundamentally right in the face of social pressure. Why not remove the first and keep the second? I do not see why banning Nazis suddenly means that people cannot be fundamentally right in the face of social pressure.
  177. <Megafire> Besides, pushing Nazis underground just makes it harder to keep track of what they're actually doing.
  178. <Aliphant> Elara: Uh? It's not racist.
  179. <Cyrix> by trying to "censor them harder" you give a resistance (call them new Nazis) a reason to exist in the first place
  180. <Aliphant> If you mean Nazi as in authoritarian, then sure. I have no problem with authoritarianism in and of itself. The problem with Nazi Germany wasn't that Hitler was a very strict ruler, it was that they were slaughtering Jews.
  181. <Megafire> How on earth can you tell who is fundamentally right in the face of social pressure and who is fundamentally wrong in the face of social pressure?
  182. <Aliphant> The same way you come to the conclusion that X is a fundamental good/freedom. It's moral intuitions. That's the basis of all morality.
  183. <Elara> in my conception, human rights exist for humans. humans are individuals - so the fundamental rights are for individuals. each individual gets a right to life, a right to express themselves, a right to associate with like-minded people, and a right to their own beliefs. these rights are occasionally caveated, but the caveat is not "unless you are(/we say you are) a Nazi"
  184. <Cyrix> @Aliphant I do not want to sound rude - but it is frankly quite apparent that you have no idea where nazis came from or what their ideology was.
  185. <Aliphant> Megafire: I've never liked the "pushing Nazis underground" argument.
  186. <Cyrix> More importantly you do not care to explore that - "because fuck nazis". Thats ignorant and stupid and dangerous.
  187. <Aliphant> Like, that seems like a problem that's solved by getting better at eliminating Nazis.
  188. <Cyrix> ...
  189. <Megafire> You can't selectively apply social pressure, though.
  190. <Aliphant> Why not? I selectively apply social pressure all the time.
  191. <Megafire> The Overton Window cuts out both good and bad ideas.
  192. <Elara> as an individual you can, as a government you acn't
  193. <Megafire> No, you don't. You just apply social pressure.
  194. <Elara> *can't
  195. <Aliphant> I apply social pressure on people who do things socially wrong, that are fundamentally wrong, like expressing racist opinions. I do not apply social pressure on people who do things socially unacceptable but fundamentally right, like, I dunno, being a brony.
  196. <Aliphant> I wouldn't say being a brony is a fundamental right, but it's not *wrong*, and I think you get my point.
  197. <Megafire> Thinking some ideas are good and some are bad, and only the good should be expressed is literally what social pressure is.
  198. <Cyrix> you apply social pressure either way
  199. <Aliphant> Elara: Expand a bit more on that, 'lara?
  200. <Aliphant> If 1 guy can selectively apply social pressure, then when this guy becomes the supreme ruler then it follows that this guy can direct his government to selectively apply social pressure under his own morality.
  201. <Aliphant> Therefore it seems apparent to me that governments can selectively apply social pressure.
  202. <Elara> no, because that's not how bureaucracies work?
  203. <Aliphant> *Society* would still believe X is wrong.
  204. <Aliphant> But the government would refuse to condemn people for it.
  205. <Megafire> The thing you are claiming is 'selectively applying social pressure' is literally just 'applying social pressure'.
  206. <Elara> the top guy will express an aim, the people below him will enact it based on their understanding of it/how much they agree with it/etc
  207. <Aliphant> Guys. I'm okay with applying social pressure. What I'm not okay with is applying social pressure on good people.
  208. <Elara> by the time it's worked a few layers down, "racists are bad but bronies are okay" could have become something pretty different
  209. <Aliphant> Can you explain why it's not possible for the government to refrain from applying social pressure on good people?
  210. <Cyrix> @Aliphant what are good people?
  211. <Cyrix> what are bad ones?
  212. <Cyrix> the ones you disagree or agree with?
  213. <Megafire> You're applying it according to your own moral standards and intuitions, maybe, but as we've already established, three out of the four people in this conversation think you're wrong.
  214. <Cyrix> do you even comprehend what you type?
  215. <Aliphant> Good people are people who do good things. They're kind, nice to be around, help others and make the world a better place. Bad people are people who do evil things. They're nasty, they commit crimes, they murder and rob, and they frequently make everyone around them miserable.
  216. <Megafire> Except you will be applying social pressure on good people, because that's what applying social pressure inevitably means.
  217. <Cyrix> What are good things?
  218. <Cyrix> what are bad ones?
  219. <Aliphant> Megafire: Why does it inevitably mean that :|
  220. <Aliphant> Can you show me the mechanical process by which my policy leads to a social pressure on good people
  221. <Elara> so Stan, who lives down the street and pays his taxes, but is an argumentative arsehole is a "bad person"?
  222. <Megafire> Because you're going to be wrong on morality sometimes.
  223. <Elara> because there are no good people and evil people, the line between good and evil cuts through every human heart
  224. <Megafire> And when you're wrong on morality, you're applying your social pressure on good people.
  225. <Cyrix> @Aliphant the point is that you do not get to decide what is good or bad as an government.
  226. <Aliphant> Okay, pretend I'm the grand dictator of Uzbekistan and I have an ironfisted control over the government. My first rule is to define what I feel is good and bad, and to order the government to use the law to crack down on bad people and stay away from cracking down on good people (even if those people are doing unpopular or socially unacceptable things)
  227. <Elara> as Solzhenitsyn put it
  228. <Aliphant> Can you show me a sequence of events in which this leads to the government of Uzbekistan cracking down on good people?
  229. <Cyrix> @Aliphant sure!
  230. <Cyrix> all the frikkin time
  231. <Aliphant> Elara: No, because a balance of factors is taken into account, and being argumentative is something that, well, doesn't outweigh the fact that you're law abiding and pay your taxes.
  232. <Megafire> You've just made 'good but unpopular or socially unacceptable' an impossible category.
  233. <Cyrix> especially when you have something like: stealing is bad! helping poor people is good!
  234. <Cyrix> someone steals to help the poor
  235. <Cyrix> what now?
  236. <Elara> your rules are imperfect, so the policeman who hates Group X (a group of good people) finds a rule he can use to harass Group X
  237. <Aliphant> Megafire: Why is this impossible? I've specifically told my government to avoid persecuting unpopular people if they're not doing anything wrong.
  238. <Megafire> Because, you've defined the thing you think is socially acceptable as good.
  239. <Aliphant> Why, then, is it still impossible to be one of these people?
  240. <Elara> you get a report showing that rulebreakers are being punished, you're happy
  241. <Elara> Group X get fucked over
  242. <Aliphant> Bullshit. I think that a lot of things are good, and not all of those things are socially acceptable.
  243. <Aliphant> Censorship, for one.
  244. <Megafire> You are the dictator.
  245. <Cyrix> @Elara you do not even need to go to the implementation side of things
  246. <Megafire> By definition, what you want happens. What you think is good becomes socially acceptable, because you enforce it.
  247. <Aliphant> Cyrix: Then we decide which is more severe: their theft, or their altruism.
  248. <Cyrix> its the very definitions that are already flawed even in hypothetical examples
  249. <Elara> I know, 'rix, I'm just hoping to show Hiero that even if he's right about the principle it's going to fail in practice
  250. <Cyrix> @Aliphant and how do you decide that?
  251. <Cyrix> altruism can't be measured in money
  252. <Aliphant> I reject such an argument unless you can prove that it's going to always be impossible, as a matter of fundamental logic. The reason for this is that even if today's technology, today's culture, today's society makes this system impossible to implement, if it's right *in principle*, then we should be moving towards it and moving towards the kind of technology and culture that would allow it to happen.
  253. <Megafire> And, again, you're wrong on morality, and end up prosecuting people for the wrong thing.
  254. <Aliphant> Megafire: So your argument is "at the end of the day, what if the supreme ruler is wrong"
  255. <Cyrix> so much stupid and ignorance upsets me
  256. <Cyrix> ill leave
  257. * Cyrix (Mibbit@net-jk34t0.dip0.t-ipconnect.de) has left
  258. <Megafire> No.
  259. <Elara> it's today's people who make it impossible to implement
  260. <Megafire> My argument is 'the supreme ruler is going to be wrong, this is why chaos is a social good.'
  261. <Aliphant> Why are today's people a barrier towards implementing this kind of system?
  262. <Aliphant> >the supreme ruler is going to be wrong
  263. <Aliphant> what?
  264. <Aliphant> what makes you so sure
  265. <Aliphant> :|
  266. <Megafire> Because everyone is wrong sometimes.
  267. <Aliphant> Hold on, let me pull up what I said earlier.
  268. <Elara> because you're expecting a vast bureaucracy, an apparatus of state power, to seamlessly implement the whims of a supreme rule
  269. <Elara> *ruler
  270. <Aliphant> <Aliphant> The question is how you even know you're correct in the first place, of course. But if we take this hypothetical situation that there really is some uber-righteous person, then they have a right to do whatever they wish.
  271. <Aliphant> <Aliphant> All of politics is just an exercise in finding the best system to determine this person, or create one.
  272. <Elara> rather than to act in their own interests
  273. <Aliphant> ^ politics is an exercise in finding or creating a person who is *never wrong*
  274. <Aliphant> THAT is what politics exists for.
  275. <Megafire> What?
  276. <Megafire> No.
  277. <Aliphant> And once we find that person, we can just put them as our supreme ruler and there is no issue.
  278. <Elara> that's fundamentally wrong about what politics is
  279. <Megafire> That is insane.
  280. <Aliphant> How the hell is that insane? Do you not want a perfect ruler?
  281. <Elara> hell no
  282. <Megafire> Such a person does not exist and cannot, in fact, exist.
  283. <Megafire> No.
  284. <Megafire> Fuck that.
  285. <Aliphant> Okay, so let's talk about that. Why can this person not exist?
  286. <Aliphant> Don't tell me subjective morality :|
  287. <Elara> the system will come to depend on them, and when they inevitably die or retire, the system will be fucked
  288. <Megafire> Because people are flawed and change their minds and are wrong as a matter of course.
  289. <Elara> and that's granting the idea you could find one in the first place
  290. <Megafire> Also, because reality is complicated, and sometimes there is no good choice. No third option. No solution that doesn't hurt people.
  291. <Aliphant> Elara: First of all, I wish to make it clear that I'm a transhumanist. To me, an ideal future is a future where people are free from the shackles of death or senility. Of course, nobody is arguing that in 2017, such a system is possible. I'm saying that it's going to be possible sometime in the distant future, when we finally find a way to get around these practical issues - and when we get to that future, we should implement it, and right now
  292. <Aliphant> our job is to work towards that future.
  293. <Aliphant> There's always a solution that minimizes hurt
  294. <Megafire> No.
  295. <Aliphant> Even if there's no solution that doesn't hurt people
  296. <Megafire> There isn't.
  297. <Aliphant> If there are 500 solutions, logically speaking one of them must be the solution that hurts the least people
  298. <Aliphant> if you sort them in descending order by amount of people hurt, there it is, at the bottom.
  299. <Aliphant> The point of the perfect ruler is to pick that option. All the time.
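(A minimal sketch of the selection rule Aliphant describes here: rank the candidate policies by how many people each one hurts and take the bottom entry. The policy names and harm counts are invented for illustration; as the replies that follow point out, ties and uncertain estimates mean the "least harmful" pick need not be unique, or even knowable.)

# Hypothetical policies with made-up "people hurt" estimates.
policies = {
    "policy_a": 120,
    "policy_b": 45,
    "policy_c": 45,   # deliberately tied with policy_b
    "policy_d": 300,
}

# Sort in descending order of harm, as described; the least harmful option ends up last.
ranked = sorted(policies.items(), key=lambda item: item[1], reverse=True)
least_harmful = ranked[-1]
print("ranking:", ranked)
print("pick:", least_harmful)

# Ties undercut the uniqueness of the pick: ranked[-1] silently returns just one of them.
tied = [name for name, harm in policies.items() if harm == least_harmful[1]]
print("equally 'least harmful' options:", tied)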
  300. <m1el> it doesn't mean that you'll like that solution
  301. <Megafire> Or there are two that hurt the exact same amount of people.
  302. <Aliphant> m1el: Of course I might not like it. Hell, I might be the one hurt by it.
  303. <Elara> or you don't know enough to know which policy will hurt the least people
  304. <Aliphant> Does it matter? No. That's why they're the supreme ruler!
  305. <Elara> information is never complete and perfect
  306. <Aliphant> They're supposed to do what's best for everyone. Even if I, a single person, don't agree with it.
  307. <Megafire> Your system requires perfection on way too many fronts to be even remotely close to viable.
  308. <Megafire> Which is why it's insane.
  309. <Aliphant> Elara: If the solution is so tangled and difficult to discern that even this hypothetical "perfect ruler" would be unable to find the correct solution, then why do you think democracy would be better?
  310. <Elara> also I just want to note that I don't want the term transhumanism to be explicitly linked to this version of politics, because overcoming human limitations is cool and creating godkings is bad
  311. <Aliphant> Wait, I thought you were anti-transhumanist. You were always going on about how AIs solving death would never exist, etc.
  312. <Megafire> Because democracy doesn't pretend to be perfect.
  313. <Elara> because, Hiero, in the same way that well-regulated markets can harvest information that's available to individual buyers and sellers but is not accessible to the Economic Ministry
  314. <Aliphant> Megafire: Okay, let's step back for a bit and say that, y'know, maybe life is just Too Hard (I disagree with this, but let's grant it for now) and the perfect ruler can only achieve a correct rate of 95% success. 95%! That's a fucking miracle compared to democracy.
  315. <Elara> democracies can harvest information on who's being hurt and helped that isn't accessible to the God King
  316. <Aliphant> Democracy can't even achieve above 50%
  317. <Megafire> Democracy is all about trying to find our way blind throughout a dark cave and trying not to fall into various deep pits along the way.
  318. <m1el> I just finished reading this conversation. Hiero, the idea of a supreme infallible dictator may sound appealing, but it is extremely dangerous. first, no human is infallible. second, no ruler rules, he/she is surrounded by subordinates. third, no ruler rules forever. if you don't understand the implications of this, you're dangerous.
  319. <Elara> I am pro-transhumanism, I can recognise that the Singularity is dumb
  320. <Elara> these are not contradictory positions
  321. <m1el> * no ruler rules alone
  322. <Aliphant> m1el: You have not finished reading the conversation if you think "no ruler rules forever" is a valid objection to my position :|
  323. <Elara> eh, you haven't even touched on the succession problem
  324. <Megafire> The supreme ruler being wrong 5% of the time is going to kill everything and everyone, whereas a democracy has at least some systems in place to stop a wrong once it gets started.
  325. <Aliphant> Sorry, run that by me again. Because I see a lot of "supreme rulers" out there, they're wrong *a lot more than 5% of the time*, and yet North Korea or the Philippines or Turkey or wherever aren't literally uninhabited wastelands. A supreme ruler making one mistake doesn't mean everyone dies.
  326. <Megafire> Because they're not supreme rulers in the way you're proposing.
  327. <Elara> that's because dictatorships as actually instantiated don't have the extreme level of personal power that you're claiming
  328. <Megafire> Their checks and balances are other countries.
  329. <Aliphant> Elara: Question - why can't the Economic Ministry just obtain the information from the markets?
  330. <m1el> as for "freedom of expression is not a terminal right" - let's test that by taking your freedom of expression.
  331. <Elara> what even is that question?
  332. <Aliphant> Why the fuck would you? My right to eat spaghetti is not a terminal right, but nobody would say "let's test that by taking away your spaghetti".
  333. <Aliphant> If there was a *good reason* why I should be barred from eating spaghetti, then take it, and you'll hear no complaints on my end.
  334. <Megafire> 'The Markets' are not a discrete entity from which information can simply be obtained.
  335. <Elara> if the market is operating, the Ministry isn't setting the prices. if the Ministry is setting a price, there is no longer a market
  336. <Aliphant> (my freedom of expression is in fact severely restricted as of this very moment. I am not complaining.)
  337. <Aliphant> Elara: I'm ok with the Ministry not setting the prices? I'm not saying that literally every cent should be micromanaged by the government. Please don't misunderstand me.
  338. <Aliphant> If it turns out there's some problem with letting the market rule.
  339. <Aliphant> Then we can let the ministry set the prices
  340. <Elara> I'm saying there's an analogy between your idea of the Supreme Leader setting policy and the idea of an Economic Ministry setting the prices
  341. <Elara> and markets and democracy are tools that human societies have created to partially handle the issues with those ideas
  342. <m1el> Aliphant: I'm pretty sure you can survive without eating spaghetti, but you'll wither pretty fast if you're not able to express yourself.
  343. <Aliphant> I believe the following ideas. 1) There is an objective morality. 2) If there's an objective morality, then it follows there must be some agent who can maximize that morality to the fullest. 3) This agent would be the perfect ruler. Whether we have to use artificial intelligence or genetic engineering to make it, it's definitely possible. 4) If we acknowledge that such an agent exists, then it's commonsensical to instate them as the supreme
  344. <Aliphant> ruler.
  345. <m1el> 1) There is an objective morality. <-- nope
  346. <Aliphant> m1el: And if me expressing myself would cause a greater wrong, then I say let me wither away! Let me wither until there is nothing left, for it would be even more evil to unseal my lips.
  347. <Aliphant> In practice, this is already put into place.
  348. <Elara> why would it follow that an agent exist to maximise morality?
  349. <Aliphant> If the government decides that it would be for the better to shut my mouth, then I will shut my mouth, because they have the guns. :|
  350. <Aliphant> (notice the lack of withering)
  351. <Megafire> ...
  352. <Megafire> It'd become my moral duty to aid the rebellion in whatever way I can.
  353. <Aliphant> Elara: If there exists a well defined set of rules for maximizing good, then it's possible to conceive of, or more likely build, an agent that maximizes those rules.
  354. <Aliphant> Agree?
  355. <Elara> no
  356. <Aliphant> O...kay. Let's break this down.
  357. <m1el> sure, we can take an extreme case that an alien species invaded earth and speaking your opinion causes everyone to die. but we haven't encountered such examples in real life.
  358. <Elara> or maybe, but that agent is unlikely to be a single being
  359. <Aliphant> Suppose the objective morality was to... I dunno, manufacture lego blocks. Just as a hypothetical. The perfect moral agent would be the agent most effective at manufacturing lego blocks.
  360. <Megafire> No.
  361. <Aliphant> m1el: I've seen many cases of people whose opinions are so toxic that speaking their opinion would cause society to be polluted and minds to be corrupted by the sheer wrongness of the bile they spew forth. It is these people that I want to silence.
  362. <Megafire> It'd be a moral agent perfectly effective at manufacturing lego blocks.
  363. <Megafire> And by assuming you've found it, you lock yourself out of further improvements.
  364. <Aliphant> Yes, and if the objective morality says to manufacture lego blocks (this is obviously NOT THE CASE, it's literally a hypothetical)
  365. <Megafire> (Hence, chaos is a social good.)
  366. <m1el> Let people express toxic opinions. we've been through this. 100 years ago gay marriage was considered "toxic"
  367. <Aliphant> then obviously we want a perfectly effective at manufacturing lego blocks agent
  368. <Elara> like, positing objective morality for a moment - the least worst way to instantiate that would be likely to involve input from all the people living in that society
  369. <Aliphant> It's a total strawman to equate gay marriage to Nazism, miel.
  370. <Megafire> Not in the system you're proposing, it isn't.
  371. <m1el> sure, let people express their nazi views.
  372. <Aliphant> No, I will not.
  373. <m1el> that is unfortunate.
  374. <Aliphant> I will use social pressure to shut them down when they do so. And I will vote for governments and policies that promise to shut them down. Because I cannot abide the idea of Nazis, and I know that in 100 years they'll still be considered toxic.
  375. <Megafire> Your system has literally no way of differentiating between the two.
  376. <Elara> not a single AI or a single person, however clever
  377. <Megafire> What if your perfect moral agent decides gay marriage is immoral?
  378. <Megafire> Based on perfect moral standards you are incapable of comprehending?
  379. <Aliphant> Megafire: Yes there is! Look, I'm not God. I don't know what the objective morality is. But I do know the objective morality definitely says "gay marriage is not wrong" and "Nazism is wrong". So a perfect moral agent would by definition differentiate between them as good and evil.
  380. <Elara> hang on, isn't this basically reprising Divine Command Theory?
  381. <Megafire> You don't know that!
  382. <Elara> right up to the "the perfect moral agent would agree with me"
  383. <Aliphant> No idea what that is, 'lara :|
  384. <Aliphant> The perfect moral agent would NOT agree with me.
  385. <Megafire> You have no idea whether or not objective morality says 'gay marriage is not wrong'.
  386. <Elara> the reason you're reprising it in secular terms is probably partly because you don't know what it is
  387. <Aliphant> The perfect moral agent would agree with me that gay marriage is not wrong, and Nazism is wrong. But it might well disagree with me on a variety of other factors.
  388. <Aliphant> Heck, it might even disagree with me that censorship is not wrong.
  389. <Megafire> You don't know that!
  390. <Megafire> You have no reason to assume any of that!
  391. <m1el> what if your Perfect Moral Agent instantiates Nazism?
  392. <Aliphant> I know that because, while I do not have perfect knowledge of objective morality, I at least know enough to confidently say, *for sure*, that it is objectively not wrong to be gay.
  393. <Megafire> The perfect moral agent might well decide Nazism is correct or causes the least amount of harm!
  394. <Aliphant> Then it's not the perfect moral agent x_____x
  395. <m1el> how do you know that?
  396. <Aliphant> How can you say "this is the perfect moral agent who thinks Nazism is correct"
  397. <m1el> I just did
  398. <Aliphant> Because objective morality clearly states Nazism is correct
  399. <Elara> how do you know?
  400. <Aliphant> Unless you want to disagree with me and think that objective morality states Nazism is right :|
  401. * Cyrix (Mibbit@net-jk34t0.dip0.t-ipconnect.de) has joined
  402. <Aliphant> Elara: Moral intuitions, of course. That's how all morality begins.
  403. <Elara> you literally just said yourself, probably via typo, that objective morality is in favour of Nazism
  404. <Megafire> I'm saying you've got no fucking clue what the absurd alien agent you're proposing will end up thinking.
  405. <Aliphant> No, I said people who disagree with me would think that objective morality states Nazism is right.
  406. <Elara> intuition and aesthetics might be where you start, but they aren't where you end
  407. <Aliphant> There are two options on the table. 1) Objective morality thinks Nazism is wrong. 2) Objective morality thinks Nazism is right.
  408. <Megafire> And maybe it won't end up thinking Nazism is right. Maybe it'll end up thinking North Korean Communism is right, instead!
  409. <Elara> we end in debate and discussion, via the freedom of expression we've been talking about
  410. <Aliphant> I believe 1). Do you believe 2)? If you don't believe 2), you must logically speaking believe 1).
  411. <Aliphant> If we all agree that 1) is correct, then we all agree that this moral agent would agree with us (because the premise of the moral agent is that it's the perfect executor of the objective morality, and if the objective morality believes that Nazism is wrong it would agree with us Nazism is wrong).
  412. <Megafire> False dichotomy.
  413. <Aliphant> What's the third option?
  414. <Megafire> I can say 'I don't know'.
  415. <Megafire> I can say that I think Nazism is bad, based on the information I currently have.
  416. <Aliphant> You *don't know whether Nazism is wrong*? Okay, I hate to tell you this, but it's considered basic knowledge of good and evil where I come from.
  417. <Megafire> Shut up and listen for five seconds.
  418. <m1el> I would also like to put an option 4) there is no objective morality
  419. <Elara> I think if there is an objective morality, Nazism will be against it, but there are a lot of greyer-area cases you could find-and-replace for the word Nazism in all the above ^, where I have my opinions but they may or may not fit with "objective morality"
  420. <Aliphant> m1el: If you don't believe in an objective morality, then I have nothing to say to you. I'm not going to waste my time trying to defend a certain view of good and evil towards someone who thinks there's no such thing as good and evil.
  421. <Megafire> I can say I don't know if Nazism will end up being found objectively wrong by the entity you're proposing, because it is an alien entity so far removed from us that our sense of right and wrong might well not apply.
  422. <Megafire> Fuck it, I'll go one step further.
  423. <m1el> there is "good" and "evil", and they're relative
  424. <Megafire> I think what you're proposing is evil.
  425. <Aliphant> meg - tell me when you're finished, I promised not to interrupt
  426. <Elara> I'm pretty sure meat-eating is against objective morality, but I'd rather not have that enforced TYVM
  427. <Megafire> And I think it'd be my moral imperative to oppose whatever dictator you're trying to put in place.
  428. <m1el> :D
  429. <Megafire> Because, chaos is a social good.
  430. <Elara> Hiero would make a good villain, y'know
  431. <Aliphant> Elara: Unlike Nazism and gay marriage, I am kinda unsure if the perfect ruler would think meat eating was bad, or okay. But if the perfect ruler decided to enforce a ban on meat eating, I'd go along with it, because it's perfect lol
  432. * FossAsleep has quit (Quit: Connection closed for inactivity)
  433. <Aliphant> also that last statement is pure shit stirring and uncalled for :|
  434. <Megafire> Ugh.
  435. <Aliphant> like there's literally no productive discussion in saying "someone would make a good villain" lara
  436. <Cyrix> hey heropants
  437. <Aliphant> Megafire: I think where I disagree with you is this. I think the perfect ruler would NOT be completely removed from our idea of good and evil.
  438. <Cyrix> like - lets label all the morally bad people with a yellow star
  439. <Megafire> You can't say that you'd go along with a perfect entity even when it disagrees with you and still hold that it will not disagree with you on certain things!
  440. <Cyrix> and call them jews to better identify them
  441. <Megafire> Do you not see this!?
  442. <Elara> Cyrix, can you not troll please
  443. <Cyrix> to apply social pressure to them
  444. <Cyrix> I do not think he groks it any other way
  445. <Aliphant> Cy, I'm not going to engage with that shitty strawman.
  446. <Cyrix> it isnt a strawman
  447. <Cyrix> it is literally what happened
  448. <Aliphant> Megafire: I'm going to explain to you why I think that this perfect entity would agree with me on some things, and disagree on others.
  449. <Cyrix> *literally*
  450. * You are now known as VereorNox
  451. <Elara> Hiero, no offence intended, but IMO the best villains in stories are convinced they're doing good...and like Meg, I think what you're proposing would be evil, if actually enforced
  452. <Megafire> That's not the fuckin' point!
  453. <Aliphant> And I'm going to explain why I'd be okay with trusting its judgement if it disagrees with me.
  454. <Aliphant> And I'm also going to explain why I think that it won't disagree with me on stuff like gay marriage.
  455. <Cyrix> no one cares for your strawmen, Aliphant
  456. <Aliphant> Whom am I strawmanning, Cy?
  457. <Cyrix> your "perfect entities" and 'moral' debates with two outcomes
  458. <Cyrix> are laughable at best
  459. <Cyrix> and moronic in reality
  460. <Megafire> Your own thoughts on the matter are literally irrelevant because you've already conceded you'll relinquish them in favour of the God you're raising up!
  461. <Elara> so if you were to have the power to bring about this thing, and you were trying to do it, you *would* be a villain
  462. * Elara coughs "Thou shalt not raise anything thou canst not put down"
  463. <Aliphant> Megafire: But the God will agree with me on all that matters, and if it disagrees with me - hell, if this God came down to earth and told me that you were right and chaos is a social good - then that's okay!
  464. <Aliphant> Because I know it won't disagree with me on the big points, like gay marriage or murder being wrong!
  465. <m1el> "But the God will agree with me on all that matters" oh bot
  466. <Aliphant> Why do I think this?
  467. <m1el> *boy
  468. <Cyrix> oooh now we add a splash of religious fanaticism in it
  469. <Megafire> YOU CANNOT KNOW THIS
  470. <Cyrix> because that's what governments need
  471. <Aliphant> Meg, can you please PLEASE let me explain why I think so rather than just saying I can't.
  472. <Cyrix> no
  473. <Elara> hey, let the man talk
  474. <Cyrix> fiiine
  475. <Cyrix> lets hear it
  476. <Elara> we all disagree with him, he knows it, let's give him the space to try to bring out his ideas
  477. <Aliphant> First axiom: I think I'm right, like objectively right, when I say that gay marriage is correct. As in, I literally can know that I'm objectively 100% no lie, no doubt, no instance of wrongness when I say gay marriage is not evil.
  478. <Megafire> Sure I can.
  479. <Cyrix> for your definition of marriage anyway
  480. <Megafire> It's easy.
  481. <Aliphant> Because I have a 100% confidence that objective morality agrees with gay marriage, someone who was perfect at enforcing objective morality would therefore agree with me about gay marriage.
  482. <Megafire> 'Marriage is wrong, ergo, gay marriage is wrong.'
  483. <Cyrix> but I can give you plenty of reasonable examples where it is 'wrong'
  484. <Megafire> Can you fathom an instance in which marriage itself is wrong?
  485. <Aliphant> No. It helps people and makes them happier.
  486. <Megafire> What if it doesn't?
  487. <m1el> ._.
  488. <Megafire> What if another institution does that better?
  489. <Megafire> (Also, happiness as a terminal value, huh?)
  490. <Cyrix> @Aliphant you try to frame 'progressive thoughts' and arguments like gay marriage rights to further your fascist thoughts and fuel your misguided arguments
  491. <Aliphant> Then we'll just have gay *better institution* instead, and sure, that's fine, it's a new evolution in marriage.
  492. <Elara> Meg has a point ^ if all marriage is wrong, then gay marriage is also wrong and the objective morality bot will outlaw it
  493. <Aliphant> I'm 100% okay with moral entity implementing that, btw.
  494. <Megafire> What if living in an enforced free love society in which marriage is illegal made people happier?
  495. <Aliphant> Happiness is not the only terminal value I have, and it should be weighed against the others. I am more than willing to make some people unhappy to satisfy other more important terminal values, like justice.
  496. <Cyrix> yeah. Lebensraumerweiterung Ost [expansion of living space in the East] is important for the majority of the Aryan race
  497. <Megafire> Holy shit that somehow sounds even worse than 'happiness' being a terminal value.
  498. <Cyrix> I am perfectly willing to make some people like the polish unhappy to further my goals
  499. <Cyrix> march on
  500. <Cyrix> !
  501. <Megafire> Fuckin' justice, really?
  502. <Aliphant> Cyrix: I'll say it outright. If by fascism you mean authoritarianism without consideration to social progressivity, then in and of itself I have no problem being a fascist. I am fairly open about my authoritarian views on this IRC and I have stated before multiple times that I think that while democracy may be an acceptable solution right now, it's never going to work as an end goal.
  503. <Cyrix> because we need an "end goal"
  504. <Cyrix> life is about an "end goal"
  505. <Cyrix> as is society
  506. <Aliphant> Cyrix, I'm not sure why you keep bringing race into this. I'm perfectly unwilling to make evil people and criminals happy to deliver justice upon them etc. etc. and I'm fine with that
  507. <Cyrix> right!
  508. <m1el> shit Hiero says "some racism is good" "I have no problem being a fascist sometimes"
  509. <Aliphant> Of course it is!
  510. <Cyrix> yeah sure
  511. <Cyrix> for you in your own little world that might be the case
  512. <Aliphant> fuck off, m1el, if you're going to keep debating in bad faith, I'm leaving this channel.
  513. <Cyrix> and fuck everyone who thinks differently
  514. <Elara> Whoa
  515. <Elara> Everyone
  516. <Elara> Play a bit nicer
  517. <Aliphant> For the record: that quote he said was taken out of context. It was in an argument about Affirmative Action where I argued that Affirmative Action was racial discrimination, but stripped of all the things that make racial discrimination bad, and that therefore it's a good kind of racism. I never said that racism was good.
  518. <Aliphant> http://squid314.livejournal.com/323694.html
  519. <_> [ The Worst Argument In The World - Jackdaws love my big sphinx of quartz ] - squid314.livejournal.com
  520. <Aliphant> Specifically I was quoting from this article
  521. <m1el> I've read that post and it doesn't cover what I mean by racism
  522. <Megafire> Okay, so people who deserve justice are those that decrease total happiness more than they increase it?
  523. <Aliphant> Yes, m1el, I really want you to fuck off when you take that quote out of context to try to paint me as a Nazi.
  524. <Aliphant> Because if you're going to say that me supporting affirmative action means I'm a racist, then that's just bad faith debating.
  525. <Cyrix> hey Aliphant - what is a nazi?
  526. <Aliphant> And you should know exactly why that's bad faith, because I just linked an article explaining why it was
  527. <Aliphant> A Nazi is someone who hates Jews.
  528. <Cyrix> lol
  529. <Cyrix> no
  530. <Elara> No
  531. <Aliphant> Nazism is characterized by authoritarian anti-semitism.
  532. * MicroIce (Megafire@net-56lide.dynamic.ziggo.nl) has joined
  533. * Megafire is now known as Villain28545
  534. * MicroIce is now known as Megafire
  535. <Cyrix> in part
  536. <Cyrix> what else?
  537. <Aliphant> I can quote the wiki article if you'd like.
  538. <Cyrix> no, I want to hear it from you
  539. <Elara> Nazism is so much more than just anti-Semitism
  540. <Cyrix> in your words
  541. <Aliphant> "the ideology and set of practices associated with the 20th-century German Nazi Party, Nazi Germany and other far-right groups. Usually characterized as a form of fascism that incorporates scientific racism and antisemitism, Nazism's development was influenced by German nationalism (especially Pan-Germanism), the Völkisch movement and the anti-communist Freikorps paramilitary groups that emerged during the Weimar Republic after Germany's
  542. <Aliphant> defeat in the First World War."
  543. <Cyrix> I can read the article myself
  544. <Cyrix> I want to know it in your words
  545. <Megafire> Okay, so people who deserve justice are those that decrease total happiness more than they increase it?
  546. <Aliphant> Elara: Yes, Nazis also supported certain right-wing economic ideas, they invented privatization iirc
  547. <Cyrix> @Megafire For Great Justice.
  548. <Elara> Nope
  549. * TFS has quit (Connection closed)
  550. <Aliphant> I, uh, are you correcting me that they invented privatization?
  551. <Aliphant> Or... disagreeing that they were right wing?
  552. <Cyrix> ... they invented what?
  553. <Cyrix> LOL
  554. <Elara> They were right wing, they were also authoritarian
  555. <Megafire> Because goddamn, happiness and justice make for such terrible terminal values.
  556. * Hierophant (Hierophant@net-pe9.28r.90.183.IP) has joined
  557. <Hierophant> Computer crashed.
  558. <Hierophant> Stand by while I reboot
  559. <Hierophant> (sending this on phone atm)
  560. <Megafire> Because goddamn, happiness and justice make for such terrible terminal values.
  561. * Hierophant has quit (Connection closed)
  562. <Elara> What would you pick as a terminal value, Meg?
  563. <Megafire> Autonomy.
  564. <Megafire> Human rights.
  565. <m1el> survival of the human species or their descendants
  566. <Megafire> Those two should do it.
  567. <Cyrix> you are not happy, citizen! initiate mandatory happiness enforcement. For Great Justice.
  568. <Cyrix> bleep bleep
  569. <Elara> Human rights is not a terminal value until you've defined those rights
  570. <Megafire> Honestly, I don't care about the human race. If our choices lead to our extinction, so be it.
  571. <Megafire> Point.
  572. <Megafire> Bodily autonomy, freedom of thought and expression, the ability to make meaningful choices...
  573. * Aliphant_ (Aliphant@net-14l.p2v.55.182.IP) has joined
  574. <Elara> Autonomy isn't terrible, to be fair, though I'm not sure I'd pick it
  575. <EllardyAway> I think this chat should be renamed #philosophy
  576. * Villain28545 has quit (Ping timeout: 181 seconds)
  577. <Megafire> Which would you pick?
  578. <Aliphant_> Hey all, I'm back.
  579. <Aliphant_> Last message seen was:
  580. <Aliphant_> <Aliphant> Or... disagreeing that they were right wing?
  581. <EllardyAway> Or #PoliticalThought, it's a pretty good euphemism for political philosophy
  582. <Elara> We have a philosophy, but I think Nazi chat belongs here lrd
  583. <Aliphant_> fill me in on all relevant backlog.
  584. <Aliphant_> EllardyAway: This touches on a lot of political themes like censorship and authoritarianism.
  585. <Aliphant_> Broadly speaking the purpose of this chat is to contain divisive or controversial arguments.
  586. <EllardyAway> Heh. The real purpose of the chat
  587. * Cyrix sighs
  588. * Aliphant has quit (Ping timeout: 181 seconds)
  589. <Aliphant_> Cyrix: I actually like you as a person, so I really hope you won't stoop to insults and stuff in a political discussion any more.
  590. <Elara> I would probably pick a proxy for happiness that avoids obvious wireheading outcomes
  591. * Aliphant_ is now known as Aliphant
  592. <Megafire> https://pastebin.com/Pd19YDiD
  593. <_> [ <Cyrix> ... they invented what? <Cyrix> LOL <Elara> They were right wing, they - Pastebin.com ] - pastebin.com
  594. <Megafire> Logs you missed, Hiero.
  595. <Aliphant> I'd argue wireheading isn't real happiness, I'd argue true happiness involves meaningfulness more than just pleasure, that it's not just hedonism. Subject for another day.
  596. <Aliphant> <Cyrix> you are not happy, citizen! initiate mandatory happiness enforcement. For Great Justice.
  597. <Megafire> I think you'd end up severely disappointed in humanity, if that's what you believe.
  598. <Aliphant> More like "you are not happy, citizen! here's why, and here's what we're going to do to fix the problem."
  599. <Aliphant> That's how governments should run. Fixing problems is their job.
  600. <Megafire> So, basically exactly what Cyrix said.
  601. <Cyrix> so... I am not allowed to be unhappy?
  602. <Megafire> ^
  603. <Cyrix> you do know that grief is healthy and needed?
  604. <Aliphant> I'm saying you won't be unhappy, because there's no reason for you to be.
  605. <Cyrix> and that it is up to the individual when they are done grieving?
  606. <Megafire> I am perfectly okay with not being happy all the time.
  607. <Aliphant> Megafire: "mandatory happiness enforcement" has some unfair and unreasonable connotations, which are completely different from the connotations of "fixing the problem that made you unhappy"
  608. <Elara> Did you see the Dr who episode about this?
  609. <Megafire> Maybe, but it's the exact same thing.
  610. <Elara> Was pretty good
  611. <Cyrix> @Elara I don't watch TV much
  612. <Aliphant> I have not, so if there's a relevant point or scene, you might want to link a video.
  613. <Megafire> You're free to not like the connotations, but at least acknowledge the problem.
  614. <VereorNox> Wow Doctor Who on TV would be atrocious in German
  615. <Aliphant> What's the problem?
  616. <VereorNox> terrible dub voices
  617. <Cyrix> oh god vern xD
  618. <Megafire> <Cyrix> so... I am not allowed to be unhappy?
  619. <Cyrix> no, you are not!
  620. <Cyrix> For Great Justice
  621. <Megafire> That is the problem.
  622. <Elara> Oh, robots that are trying to help end up killing people who are persistently unhappy
  623. <Aliphant> This is a problem because you *want* to be unhappy, right Meg?
  624. <Cyrix> hahaha xD
  625. <Elara> This spirals into all the humans being dead
  626. <Cyrix> I want no one to tell me how to feel
  627. <Megafire> I think it's a problem because feeling various emotions is healthy and good for people.
  628. <Aliphant> In that case, I'd argue that you aren't truly happy as long as you don't feel grief. If you're not satisfied with your life, then you're not really happy. I'm not obligated to defend some weird hedonistic/emotion based version of happy.
  629. <Megafire> Yes you are, if you make it your goddamn terminal value.
  630. <Cyrix> aaaah
  631. <Aliphant> I am not defending, I repeat I am not defending a universe where everyone is made to feel a single emotion all the time.
  632. <Cyrix> so you define now what happy is too?
  633. <Cyrix> right?
  634. <m1el> Hiero, I've thought about our conversation on AA, and I was not happy with the outcome.
  635. <Elara> Aka what happy means to most people?
  636. <m1el> You see, the problem with discrimination based on race lies not only in the "uneven playing field", but is much broader. AA means you're judged by observing the group you belong to, ignoring everything that makes you *you*.
  637. <Aliphant> Defining what it means to be happy is the project of tons of discussion over the course of human history, Cy.
  638. <Cyrix> So what exactly is happy? I want some PlanEconomics for the maxed-out Happinessgrowth
  639. <m1el> As an example, some rich Black guy is going to pass through, and some poor white guy is going to fail because of AA. This is not acceptable. People have qualities other than their race, and should be treated individually.
  640. <Aliphant> Megafire: Let's taboo the word happy, since we have different conceptions of what it means.
  641. <Cyrix> LOL
  642. <Cyrix> yes
  643. <VereorNox> ... that sounds like a terrible reason to taboo a word
  644. <Cyrix> Let's not call it Fenster (a window) anymore but Lichtraumfluter ("light-space-flooder")
  645. <Aliphant> I have terminal value "X". I think that a perfect ruler would maximize X, amongst other things. X is characterized by a general satisfaction with life, and a general wanting of that thing to happen.
  646. <VereorNox> Cyrix please
  647. <Aliphant> VereorNox, Cyrix: I use the word "taboo" here in a very specific context.
  648. <Cyrix> @Aliphant sure.
  649. <Aliphant> Specifically I'm referring to the idea of "rationalist tabooing"
  650. <Megafire> If your terminal value of 'happiness' includes 'allow people to be sad sometimes' you lose any ability to really judge it moment to moment, and therefore can't enact justice upon those that reduce it.
  651. <Elara> The obvious counter to that, m1el, is that pre-AA systems didn't take race into account, and that taking race into account moves closer to treating people as individuals than ignoring it
  652. <Aliphant> where people who don't agree on the definition of a word try to define the word without using that word.
  653. <Aliphant> Because otherwise you just end up in circular discussion.
  654. <Cyrix> and rationalism is such a nice catch all phrase for everything equally muddy and undefinable
  655. <Cyrix> same as righteousness
  656. <m1el> Elara: how do we know that it moves closer?
  657. <Aliphant> No, rationalist here refers to a specific group of people with a specific group of beliefs led by a specific man who Vern and Elara both hate.
  658. <Cyrix> Adolf?
  659. <Aliphant> Nevertheless, I think his conception of rationalist taboo as a method of getting to know a specific word better when you disagree on what that word means is an excellent one.
  660. <Elara> Because now you're taking an important factor into account that you were ignoring before
  661. <Megafire> I don't think any rationalist would claim they're being led by Big Yud.
  662. <m1el> why is it important?
  663. <Aliphant> Cyrix, I have a lot of respect for you as a person but if you do not stop it with the Nazi-implication bullshit, I will put you on mute until this discussion is over.
  664. <Elara> Because we can see that it affects outcomes to a significant extent
  665. <Aliphant> Megafire: Irrelevant. You know the exercise I'm talking about, so let's use it to better express my terminal value without using confusing human language that has already led to misunderstanding.
  666. <Cyrix> @Aliphant stop talking like a nazi then?
  667. <m1el> so let's make people *blind* to this factor
  668. <Megafire> I'm still stuck on finding your entire plan absolutely insane.
  669. <Aliphant> Can you explain to me *exactly how and where* I am like a Nazi?
  670. <Megafire> And evil, and fundamentally necessary to oppose.
  671. <Aliphant> Because I am authoritarian?
  672. <Aliphant> Is that it?
  673. <Aliphant> Because I want to censor people?
  674. <Elara> That would require you to understand Nazism and thus be a long conversation
  675. <Aliphant> Surely it can't be because I'm anti-semitic, or because I use overlong German words to describe my concepts.
  676. <Aliphant> Or any number of other factors.
  677. <Cyrix> I would suggest you read up on the NSDAP and its values
  678. <Aliphant> Elara: I actually don't care, because it's far more important that I'm right than that I bear some superficial similarity to a political group that I've explicitly advocated imprisoning people for being in.
  679. <Elara> m1el, being blind to the fact that one person is speaking a different language won't help you to understand
  680. <Elara> You want to *jail* Nazis now?
  681. <Aliphant> if they express their beliefs, yes. That's part of censorship.
  682. <Aliphant> How is "let's jail Nazis" a controversial opinion, jesus christ
  683. <Megafire> ...
  684. <m1el> so you want to put people to jail for wrongthink?
  685. <Megafire> ^
  686. <Cyrix> but that you can honestly say with a straight face that you are willing to accept the suffering of some people for the happiness of others - and call that justice
  687. <Cyrix> proves you have not even an inkling of an idea of what Nazis are, where they came from and where their ideology comes from
  688. <Aliphant> Okay, sure, let's call it that; we can split hairs and say it's wrongspeak, but wrongthink leads to wrongspeak, so yes, I want to jail people for wrongthink. I think their wrongthink is so wrong and evil that it pollutes society whenever it's expressed, and so I see no alternative but to either change their minds or jail them. Seeing as Nazis have proven resistant to polite requests to change, we move on to the other plan.
  689. <Cyrix> LOL
  690. <Cyrix> yes
  691. <Cyrix> I have this brilliant idea
  692. <Aliphant> Cyrix: *straight face* I am willing to accept the suffering of some people for the happiness of others. That is justice.
  693. <Cyrix> let's label everyone who "wrongthinks" with a big yellow star
  694. <Aliphant> Why the fuck would we do that?
  695. <Megafire> Dude.
  696. <Cyrix> so that we know where to apply Justice!
  697. <Cyrix> and also apply some "social pressure"
  698. <Elara> Cyrix stahp
  699. <Megafire> You're a fascist.
  700. <Cyrix> and let's have some gov dudes in snazzy SS uniforms do some control rounds
  701. <Aliphant> Okay, sure, let's say I grant your idea. What next? What point are you trying to prove with this example? Boo hoo, the Nazis also used stars to mark their enemies. That's not the problem. The problem was that their definition of who counted as an enemy was bad in the first place.
  702. <Megafire> Eheheheh.
  703. <Elara> You said this a few minutes ago, it's not helping
  704. <Cyrix> to ensure the social pressure is sufficiently applied
  705. <Megafire> Well, he broke me.
  706. <Aliphant> If you want to mark genuinely evil people with yellow stars - rather than just Jews who you hate for no reason - I am not opposed to that.
  707. <Megafire> Fuckin' no bad tactics, only bad targets.
  708. <Aliphant> If you want to mark every Nazi and racist and murderer with a yellow star and jail them all, do it!
  709. <Cyrix> you are so dumb Aliphant - no offense
  710. <Cyrix> "hate jews for no reason"
  711. <Cyrix> Nazis had plenty of reasons to hate them
  712. <Cyrix> this is just proof you do not understand what you are talking about
  713. <Megafire> Most of them ended up being 'they're evil'.
  714. <Aliphant> It has been brought up several times in this conversation that I have no understanding of Nazism. I'm going to make things very clear right now, and this will be my answer to all further comments that state or imply that I'm ignorant about Nazism.
  715. <Cyrix> I am not obligated to educate you. That's your obligation.
  716. <Druza> Cyrix
  717. <Druza> Stop being an ass.
  718. <Megafire> ^
  719. <Elara> I think we should probably draw a line under this conversation before it gets bad-tempered
  720. <Aliphant> I do not give a single flying fuck. If someone waves a swastika flag or says Jews should be killed, I want to imprison them. If they tell us that Hitler was right, I want to imprison them. If they express anti-semitic values, I want to imprison them.
  721. <Cyrix> lol
  722. <Cyrix> Again
  723. <Elara> No
  724. <Elara> Not again
  725. <Cyrix> you slap "socially accepted behaviour" in front of your arguments
  726. <Aliphant> This conversation has been in extremely bad faith. It's my practice to thank everyone involved in a heated debate for a good faith attempt at civil discussion, but I'm unable to do so with honesty this time around.
  727. <Cyrix> but you do not understand WHY that behaviour is "socially accepted" by today's standards
  728. <Cyrix> stop your empty set phrases
  729. <Aliphant> I can see that this is going to degenerate into increasingly unproductive spirals of misunderstanding and hilarious strawmanning, so I'm going to stop this here. If you want to continue the debate, please PM me, and Cyrix, please be polite if/when you do so.
  730. <Cyrix> they only serve to make you sound preposterous and stupid
  731. <Megafire> To be honest, Hiero, if I'd known you were a fascist, I probably would've argued differently. So that is my bad.
  732. * Elara sets mode +m on #Politics
  733. <Elara> Everybody breathe