Untitled
a guest, Nov 13th, 2016
Internet content algorithms are a fairly obvious way that the profit-motive injects itself into Internet culture. Rather than improving the quality of content, the algorithm selects for clickbait that is favorable to the user and makes it easier to access, and thus easier to click. But we know too that the circlejerk is not just a relation enforced top-down by greedy corporations-- users engage in circlejerk even absent a clear oppressor, by virtue of living in a society in which the commodity-form dominates all. In other words, content algorithms are built to cater to the wants of users, but the wants of users are shaped by the same society that motivates content algorithms in the first place. Within the profit-motive, the only direction in which content algorithms can be improved is that which best facilitates the circlejerk of users.

The most recent US presidential election caught one segment of the population by complete surprise. Some say that the echo chamber created by content algorithms is to blame. This may be true to an extent, but it also seems unlikely that modifying the Facebook algorithm against the profit-motive will work.

Content algorithms seem to create a two-fold effect:
1) Content that users disagree with is effectively censored, as it is pushed to the bottom of the newsfeed. There is no end to content nowadays, after all. As long as one subscribes to a fairly mainstream position, one can spend hours reading flatly affirmative articles without having to venture beyond them. Non-affirmative content is not explicitly hidden, but merely deprioritized.

2) Propaganda (in a broad sense, referring to any opinion or rhetorical piece) is delivered to those who already agree with it, and rarely to anyone else. Its usefulness is, to an extent, subverted, as propagandists will find that their very tool of dissemination works against them.

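The deprioritization described in (1) can be made concrete with a toy sketch. This is not any real platform's algorithm; the stance scores, posts, and the `rank_feed` function are all invented for illustration:

```python
# Toy model of feed ranking by predicted agreement (hypothetical, for
# illustration only). Each post carries a stance score in [-1, 1]; the
# user's stance is inferred from past engagement. Posts are ordered by
# how closely they match the user's stance -- nothing is deleted,
# disagreeable content is merely pushed toward the bottom.

def rank_feed(posts, user_stance):
    """Sort posts so the most stance-affirming content comes first.

    posts: list of (title, stance) pairs, stance a float in [-1, 1].
    user_stance: float in [-1, 1].
    """
    # Smaller stance gap = more "affirmative" = ranked higher.
    return sorted(posts, key=lambda p: abs(p[1] - user_stance))

feed = [
    ("Affirming op-ed", 0.9),
    ("Neutral report", 0.0),
    ("Opposing op-ed", -0.9),
]

ranked = rank_feed(feed, user_stance=0.8)
# The opposing piece is still in the feed, just last:
# deprioritized rather than explicitly hidden.
```

The point of the sketch is that censorship here is an emergent property of an ordering, not an act of removal: every post survives the function, but attention never reaches the bottom of the list.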
On the Internet, therefore, propaganda and censorship are no longer distinguishable from one another, nor are their audiences willfully selected by propagandists and censors. Who gets propagandized with what is predetermined, and therefore who gets what censored is predetermined too.

Trump is correct that the election was "rigged," insofar as the Democratic party propaganda machine-- including ostensibly unaffiliated liberal media-- is very capable of guiding political discourse in its favor. There is one segment of the population that, having been affected by this propaganda, fully believes that Trump is a proper fascist. But another segment of the population sees him as neither a fascist nor a racist. The difference between them can be seen in their attempts at engaging with each other: the post-electoral Facebook statuses made by the former group seem overwhelmingly to condemn the latter group for being “okay with racism,” while in truth the latter group proceeds from a completely different conception of Trump.

The “Facebook bubble” is often accused of having obscured the numerical strength of Trump supporters, but it does much more than that. Liberal propaganda denouncing Trump as a fascist found its way to liberals who were already prepared to agree, while propaganda against this position was indirectly censored. The actual target audience of liberal propaganda, who needed to be persuaded that Trump is a fascist, could not have received the same intensity or kind of propaganda as those already inclined to agree. The liberal propaganda machine is powerful, but it missed its mark. Post-election, the former group is caught up in fear and paranoia, whereas the latter seems resigned and depoliticized.

In truth, both sides are depoliticized. The direct catering of content to a group of people collapses the dialectic between the people themselves and their politics. Content algorithms seek an identity between the person and their political position-- and they are successful, insofar as such an identity exists in the depoliticized present. Political ads and thinkpieces are delivered on the basis of one’s personal information and activity, and this works quite well. Not all content that one sees is affirmative, but just enough of it is such that all non-affirmative content can be successfully pathologized. Any politics based on an identity with one’s person is capable only of engaging with other politics ad hominem, i.e., of attacking the personhood of one’s opponent. In a cataclysmic civil war, this may be useful; within liberal democracy, it is not.

Only by distancing one’s political position from one’s personal identity can politics be rescued. The only thing that can be “done” about content algorithms is to be critical of them.