Unit 2 Writing Assignment:
Choose a topic that interests you and is related to your area of study or professional field.
* Explain why this topic interests you and is related to your area of study or professional field.
* Research this topic and write an essay that incorporates information from 3 different sources.
* Cite your sources using APA standards learned in this unit.
* The paper should be at least 300-400 words in length, double-spaced, and in Times New Roman font.
* Include a word count.

========================================================================================================================

Bias in AI

I am taking courses toward a B.S. in Computer Science, and A.I. is one of the fastest-growing fields in the industry. Understanding the critical issues in A.I. will therefore give me more insight into how I should use technology to help society.

Although A.I. can work much more effectively than humans in many areas, it can also be sexist and racist like humans. In some circumstances, the sexism and racism can be even more pronounced (Bossmann, 2016). Many states in the U.S. have started to use risk assessment systems, such as COMPAS, to help judges make sentencing decisions (Angwin, Larson, Mattu, & Kirchner, 2016). However, Jeff Larson, Surya Mattu, Lauren Kirchner, and Julia Angwin (2016) point out in their research on the COMPAS recidivism algorithm that "Black defendants were twice as likely as white defendants to be misclassified as a higher risk of violent recidivism." In order to refine such algorithms, we should be extremely cautious about which factors are included, especially gender and race. Even if, statistically, the data we feed into an A.I. system largely comes from a specific race or gender, the A.I. still should not build stereotypes the way humans do; there are many other factors that influence human behavior. Since machines only learn from the data we feed them, we have the responsibility to exclude sexist or racist data sets from the very beginning. A.I. should be fair and neutral in the results it produces, and A.I. researchers should be responsible for their work.

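To make the misclassification disparity concrete, the short Python sketch below computes a false positive rate for each group, which is the measure behind ProPublica's finding. The records here are invented for illustration; they are not ProPublica's actual COMPAS data.

# A minimal sketch of a group-wise false positive rate check.
# Every record below is made up for illustration; this is NOT
# ProPublica's real COMPAS data.
from collections import defaultdict

# Each record: (group, predicted_high_risk, actually_reoffended)
records = [
    ("black", True, False), ("black", True, False), ("black", True, True),
    ("black", False, False), ("black", False, True),
    ("white", True, False), ("white", True, True),
    ("white", False, False), ("white", False, False), ("white", False, True),
]

false_pos = defaultdict(int)        # labeled high risk but did not reoffend
did_not_reoffend = defaultdict(int)  # everyone who did not reoffend

for group, predicted_high_risk, reoffended in records:
    if not reoffended:
        did_not_reoffend[group] += 1
        if predicted_high_risk:
            false_pos[group] += 1

# With these invented numbers the black group's false positive rate
# (67%) is twice the white group's (33%), mirroring the kind of
# disparity the ProPublica analysis reported.
for group in sorted(did_not_reoffend):
    rate = false_pos[group] / did_not_reoffend[group]
    print(f"{group}: false positive rate = {rate:.0%}")
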
Another example of sexist A.I. is Google Translate. In many languages, third-person pronouns carry no gender distinction. However, when Google Translate converts those languages into English, it produces a biased result depending on the context, such as jobs (Morse, 2017). When the pronoun refers to a doctor, it is translated as "he," while a pronoun referring to a teacher is translated as "she" (Morse, 2017). This happens because the algorithm operating behind the scenes uses the data given to it to produce its output, and statistically the data for a particular job comes mainly from one gender. The algorithm does exactly what it is commanded to do, but the result raises concerns. In my opinion, the signals we use to teach Google Translate may be too simplistic, and the A.I. needs more nuanced inputs in order to produce translated sentences without sexist content.

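The toy Python sketch below illustrates the mechanism described above: a translator that simply picks the pronoun it has seen most often next to each occupation will reproduce whatever skew exists in its training data. The corpus counts are made up, and real translation models are far more complex, but the effect is the same.

# Toy model: choose the pronoun most frequently seen with each
# occupation in the training corpus. The counts are invented for
# illustration and do not come from any real corpus.
pronoun_counts = {
    "doctor":  {"he": 900, "she": 100},
    "teacher": {"he": 200, "she": 800},
}

def pick_pronoun(occupation):
    counts = pronoun_counts[occupation]
    return max(counts, key=counts.get)  # most frequent pronoun wins

# Prints 'he' for doctor and 'she' for teacher, exactly the skewed
# behavior described in the paragraph above.
for job in ("doctor", "teacher"):
    print(f"gender-neutral pronoun + '{job}' -> '{pick_pronoun(job)}'")
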
References:

Bossmann, J. (2016). Top 9 ethical issues in artificial intelligence. World Economic Forum. Retrieved from https://www.weforum.org/agenda/2016/10/top-10-ethical-issues-in-artificial-intelligence/

Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016). Machine bias. ProPublica. Retrieved from https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing/

Larson, J., Mattu, S., Kirchner, L., & Angwin, J. (2016). How we analyzed the COMPAS recidivism algorithm. ProPublica. Retrieved from https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm/

Morse, J. (2017). Google Translate might have a gender problem. Mashable. Retrieved from https://mashable.com/2017/11/30/google-translate-sexism/

Word count: 400