For Amy, the day began like any other at the Sequential Label and Supply Company (SLS) help desk. Taking calls and helping office workers with computer problems was not glamorous, but she enjoyed the work; it was challenging and paid well enough. Some of her friends in the industry worked at bigger companies, some at cutting-edge tech companies, but they all agreed that jobs in information technology were a good way to pay the bills.

The phone rang, as it did about four times an hour. The first call of the day, from a worried user hoping Amy could help him out of a jam, seemed typical. The call display on her monitor showed some of the facts: the user’s name, his phone number and department, where his office was on the company campus, and a list of his past calls to the help desk.

“Hi, Bob,” she said. “Did you get that document formatting problem squared away?”

“Sure did, Amy. Hope we can figure out what’s going on this time.”

“We’ll try, Bob. Tell me about it.”

“Well, my PC is acting weird,” Bob said. “When I go to the screen that has my e-mail program running, it doesn’t respond to the mouse or the keyboard.”

“Did you try a reboot yet?”

“Sure did. But the window wouldn’t close, and I had to turn my PC off. After it restarted, I opened the e-mail program, and it’s just like it was before—no response at all. The other stuff is working OK, but really, really slowly. Even my Internet browser is sluggish.”

“OK, Bob. We’ve tried the usual stuff we can do over the phone. Let me open a case, and I’ll dispatch a tech over as soon as possible.”

Amy looked up at the LED tally board on the wall at the end of the room. She saw that only two technicians were dispatched to user support at the moment, and since it was the day shift, four technicians were available. “Shouldn’t be long at all, Bob.”

She hung up and typed her notes into ISIS, the company’s Information Status and Issues System. She assigned the newly generated case to the user dispatch queue, which would page the roving user support technician with the details in a few minutes.

A moment later, Amy looked up to see Charlie Moody, the senior manager of the server administration team, walking briskly down the hall. He was being trailed by three of his senior technicians as he made a beeline from his office to the room where the company servers were kept in a carefully controlled environment. They all looked worried.

Just then, Amy’s screen beeped to alert her of a new e-mail. She glanced down. The screen beeped again—and again. It started beeping constantly. She clicked the envelope icon and, after a short delay, the mail window opened. She had 47 new e-mails in her inbox. She opened one from Davey Martinez in the Accounting Department. The subject line said, “Wait till you see this.” The message body read, “Funniest joke you’ll see today.” Davey often sent her interesting and funny e-mails, and she clicked the file attachment icon to open the latest joke.

After that click, her PC showed the hourglass pointer icon for a second and then the normal pointer reappeared. Nothing happened. She clicked the next e-mail message in the queue. Nothing happened. Her phone rang again. She clicked the ISIS icon on her computer desktop to activate the call management software and activated her headset. “Hello, Help Desk, how can I help you?” She couldn’t greet the caller by name because ISIS had not responded.

“Hello, this is Erin Williams in Receiving.”

Amy glanced down at her screen. Still no ISIS. She glanced up to the tally board and was surprised to see the inbound-call counter tallying up waiting calls like digits on a stopwatch. Amy had never seen so many calls come in at one time.

“Hi, Erin,” Amy said. “What’s up?”

“Nothing,” Erin answered. “That’s the problem.” The rest of the call was a replay of Bob’s, except that Amy had to jot notes down on a legal pad. She couldn’t dispatch the user support team either. She looked at the tally board. It had gone dark. No numbers at all.

Then she saw Charlie running down the hall from the server room. His expression had changed from worried to frantic.

Amy picked up the phone again. She wanted to check with her supervisor about what to do now. There was no dial tone.
LEARNING OBJECTIVES:

Upon completion of this material, you should be able to:
• Define information security
• Recount the history of computer security, and explain how it evolved into information security
• Define key terms and critical concepts of information security
• List the phases of the security systems development life cycle
• Describe the information security roles of professionals within an organization
Introduction

James Anderson, executive consultant at Emagined Security, Inc., believes information security in an enterprise is a “well-informed sense of assurance that the information risks and controls are in balance.” He is not alone in his perspective. Many information security practitioners recognize that aligning information security needs with business objectives must be the top priority.

For more information on Emagined Security Consulting, visit www.emagined.com.

This chapter’s opening scenario illustrates that information risks and controls may not be in balance at SLS. Though Amy works in a technical support role to help users with their problems, she did not recall her training about malicious e-mail attachments, such as worms or viruses, and fell victim to this form of attack herself. Understanding how malware might be the cause of a company’s problems is an important skill for information technology (IT) support staff as well as users. SLS’s management also shows signs of confusion and seems to have no idea how to contain this kind of incident. If you were in Amy’s place and were faced with a similar situation, what would you do? How would you react? Would it occur to you that something far more insidious than a technical malfunction was happening at your company?

As you explore the chapters of this book and learn more about information security, you will become more capable of answering these questions. But before you can begin studying details about the discipline of information security, you must first know its history and evolution.
The History of Information Security

Key Term

computer security In the early days of computers, this term specified the need to secure the physical location of computer technology from outside threats. This term later came to represent all actions taken to preserve computer systems from losses. It has evolved into the current concept of information security as the scope of protecting information in an organization has expanded.

The history of information security begins with the concept of computer security. The need for computer security arose during World War II, when the first mainframe computers were developed and used to aid computations for communication code breaking, as shown in Figure 1-1. Multiple levels of security were implemented to protect these devices and the missions they served. This required new processes as well as tried-and-true methods to maintain data confidentiality. Access to sensitive military locations, for example, was controlled by means of badges, keys, and the facial recognition of authorized personnel by security guards. The growing need to maintain national security eventually led to more complex and technologically sophisticated computer security safeguards.

During these early years, information security was a straightforward process composed predominantly of physical security and simple document classification schemes. The primary threats to security were physical theft of equipment, espionage against products of the systems, and sabotage. One of the first documented security problems that fell outside these categories occurred in the early 1960s, when a systems administrator was working on a MOTD (message of the day) file and another administrator was editing the password file. A software glitch mixed the two files, and the entire password file was printed on every output file.3
‡ The 1960s

During the Cold War, many more mainframe computers were brought online to accomplish more complex and sophisticated tasks. These mainframes required a less cumbersome process of communication than mailing magnetic tapes between computer centers. In response to this need, the Department of Defense’s Advanced Research Projects Agency (ARPA) began examining the feasibility of a redundant, networked communications system to support the military’s exchange of information. In 1968, Dr. Larry Roberts developed the ARPANET project. Figure 1-2 is an excerpt from his Program Plan. ARPANET evolved into what we now know as the Internet, and Roberts became known as its founder.

Figure 1-1 The Enigma (Source: National Security Agency. Used with permission.)
Accompanying text: Earlier versions of the German code machine Enigma were first broken by the Poles in the 1930s. The British and Americans managed to break later, more complex versions during World War II. The increasingly complex versions of the Enigma, especially the submarine or Unterseeboot version of the Enigma, caused considerable anguish to Allied forces before finally being cracked. The information gained from decrypted transmissions was used to anticipate the actions of German armed forces. “Some ask why, if we were reading the Enigma, we did not win the war earlier. One might ask, instead, when, if ever, we would have won the war if we hadn’t read it.”

For more information on Dr. Roberts and the history of the Internet, visit his Web site at www.packet.cc.
‡ The 1970s and 80s

During the next decade, ARPANET became more popular and saw wider use, increasing the potential for its misuse. In 1973, Internet pioneer Robert M. Metcalfe (pictured in Figure 1-3) identified fundamental problems with ARPANET security. As one of the creators of Ethernet, a dominant local area networking protocol, he knew that individual remote sites did not have sufficient controls and safeguards to protect data from unauthorized remote users. Other problems abounded: vulnerability of password structure and formats; lack of safety procedures for dial-up connections; and nonexistent user identification and authorizations. Phone numbers were widely distributed and openly publicized on the walls of phone booths, giving hackers easy access to ARPANET. Because of the range and frequency of computer security violations and the explosion in the numbers of hosts and users on ARPANET, network security was commonly referred to as network insecurity.5 In 1978, Richard Bisbey and Dennis Hollingworth, two researchers in the Information Sciences Institute at the University of Southern California, published a study entitled “Protection Analysis: Final Report.” It focused on a project undertaken by ARPA to understand and detect vulnerabilities in operating system security. For a timeline that includes this and other seminal studies of computer security, see Table 1-1.

Figure 1-2 Development of the ARPANET (Source: Courtesy of Dr. Lawrence Roberts. Used with permission.)
Security that went beyond protecting the physical location of computing devices began with a single paper sponsored by the Department of Defense. Rand Report R-609 attempted to define the multiple controls and mechanisms necessary for the protection of a computerized data processing system. The document was classified for almost ten years and is now considered to be the paper that started the study of computer security.

The security—or lack thereof—of systems sharing resources inside the Department of Defense was brought to the attention of researchers in the spring and summer of 1967. At that time, systems were being acquired at a rapid rate, and securing them was a pressing concern for both the military and defense contractors.

In June 1967, ARPA formed a task force to study the process of securing classified information systems. The task force was assembled in October 1967 and met regularly to formulate recommendations, which ultimately became the contents of Rand Report R-609.6 The document was declassified in 1979 and released as Rand Report R-609-1. The content of the two documents is identical with the exception of two transmittal memorandums.

For more information on the Rand Report, visit www.rand.org/pubs/reports/R609-1.html and click the Read Online Version button.

Figure 1-3 Dr. Metcalfe receiving the National Medal of Technology (Source: U.S. Department of Commerce. Used with permission.)
Rand Report R-609 was the first widely recognized published document to identify the role of management and policy issues in computer security. It noted that the wide use of networking components in military information systems introduced security risks that could not be mitigated by the routine practices then used to secure these systems. Figure 1-4 shows an illustration of computer network vulnerabilities from the 1979 release of this document. This paper signaled a pivotal moment in computer security history—the scope of computer security expanded significantly from the safety of physical locations and hardware to include:

• Securing the data
• Limiting random and unauthorized access to that data
• Involving personnel from multiple levels of the organization in information security
Table 1-1 Key Dates in Information Security (© Cengage Learning 2015)

1968  Maurice Wilkes discusses password security in Time-Sharing Computer Systems.

1970  Willis H. Ware authors the report Security Controls for Computer Systems: Report of Defense Science Board Task Force on Computer Security (RAND Report R-609), which was not declassified until 1979. It became known as the seminal work identifying the need for computer security.

1973  Schell, Downey, and Popek examine the need for additional security in military systems in Preliminary Notes on the Design of Secure Military Computer Systems.

1975  The Federal Information Processing Standards (FIPS) examines DES (Data Encryption Standard) in the Federal Register.

1978  Bisbey and Hollingworth publish their study “Protection Analysis: Final Report,” which discussed the Protection Analysis project created by ARPA to better understand the vulnerabilities of operating system security and examine the possibility of automated vulnerability detection techniques in existing system software.7

1979  Morris and Thompson author “Password Security: A Case History,” published in the Communications of the Association for Computing Machinery (ACM). The paper examined the design history of a password security scheme on a remotely accessed, time-sharing system.

1979  Dennis Ritchie publishes “On the Security of UNIX” and “Protection of Data File Contents,” which discussed secure user IDs, secure group IDs, and the problems inherent in the systems.

1982  The U.S. Department of Defense Computer Security Evaluation Center publishes the first version of the Trusted Computer System Evaluation Criteria (TCSEC) documents, which came to be known as the Rainbow Series.

1984  Grampp and Morris write “The UNIX System: UNIX Operating System Security.” In this report, the authors examined four “important handles to computer security”: physical control of premises and computer facilities, management commitment to security objectives, education of employees, and administrative procedures aimed at increased security.8

1984  Reeds and Weinberger publish “File Security and the UNIX System Crypt Command.” Their premise was: “No technique can be secure against wiretapping or its equivalent on the computer. Therefore no technique can be secure against the system administrator or other privileged users... the naive user has no chance.”9

1992  Researchers for the Internet Engineering Task Force, working at the Naval Research Laboratory, develop the Simple Internet Protocol Plus (SIPP) Security protocols, creating what is now known as IPSEC security.

MULTICS  Much of the early research on computer security centered on a system called Multiplexed Information and Computing Service (MULTICS). Although it is now obsolete, MULTICS is noteworthy because it was the first operating system to integrate security into its core functions. It was a mainframe, time-sharing operating system developed in the mid-1960s by a consortium of General Electric (GE), Bell Labs, and the Massachusetts Institute of Technology (MIT).
For more information on the MULTICS project, visit web.mit.edu/multics-history.

In 1969, not long after the restructuring of the MULTICS project, several of its developers (Ken Thompson, Dennis Ritchie, Rudd Canaday, and Doug McIlroy) created a new operating system called UNIX. While the MULTICS system implemented multiple security levels and passwords, the UNIX system did not. Its primary function, text processing, did not require the same level of security as that of its predecessor. Not until the early 1970s did even the simplest component of security, the password function, become a component of UNIX.

In the late 1970s, the microprocessor brought the personal computer (PC) and a new age of computing. The PC became the workhorse of modern computing, moving it out of the data center. This decentralization of data processing systems in the 1980s gave rise to networking—the interconnecting of PCs and mainframe computers, which enabled the entire computing community to make all its resources work together.
Figure 1-4 Illustration of computer network vulnerabilities from Rand Report R-609 (Source: Rand Report R-609. Used with permission.)
The figure labels vulnerabilities throughout a computer network: radiation and crosstalk affecting the processor, switching center, and communication lines; taps on those lines; files exposed to theft, copying, and unauthorized access; hardware problems such as failure of protection circuits contributing to software failures, attachment of recorders, and bugs; remote consoles with weak identification and authentication and subtle software modifications; operators making improper connections or cross coupling; users disabling hardware devices or using stand-alone utility programs; maintenance personnel disabling protective features, providing “ins,” or revealing protective measures; systems programmers causing failure of protection features such as access control and bounds control; and software itself.
In the mid-1980s, the U.S. Government passed several key pieces of legislation that formalized the recognition of computer security as a critical issue for federal information systems. The Computer Fraud and Abuse Act of 1986 and the Computer Security Act of 1987 defined computer security and specified responsibilities and associated penalties. These laws and others are covered in Chapter 3, “Legal, Ethical, and Professional Issues in Information Security.”

In 1988, the Defense Advanced Research Projects Agency (DARPA) within the Department of Defense created the Computer Emergency Response Team (CERT) to address network security.
‡ The 1990s

At the close of the 20th century, networks of computers became more common, as did the need to connect them to each other. This gave rise to the Internet, the first global network of networks. The Internet was made available to the general public in the 1990s after decades of being the domain of government, academia, and dedicated industry professionals. The Internet brought connectivity to virtually all computers that could reach a phone line or an Internet-connected local area network (LAN). After the Internet was commercialized, the technology became pervasive, reaching almost every corner of the globe with an expanding array of uses.

Since its inception as a tool for sharing Defense Department information, the Internet has become an interconnection of millions of networks. At first, these connections were based on de facto standards because industry standards for interconnected networks did not exist. These de facto standards did little to ensure the security of information, though some degree of security was introduced as precursor technologies were widely adopted and became industry standards. However, early Internet deployment treated security as a low priority. In fact, many problems that plague e-mail on the Internet today result from this early lack of security. At that time, when all Internet and e-mail users were presumably trustworthy computer scientists, mail server authentication and e-mail encryption did not seem necessary. Early computing approaches relied on security that was built into the physical environment of the data center that housed the computers. As networked computers became the dominant style of computing, the ability to physically secure a networked computer was lost, and the stored information became more exposed to security threats.

In 1993, the first DEFCON conference was held in Las Vegas. Originally it was established as a gathering for people interested in information security, including authors, lawyers, government employees, and law enforcement officials. A compelling topic was the involvement of hackers in creating an interesting venue for the exchange of information between two adversarial groups—the “white hats” of law enforcement and security professionals and the “black hats” of hackers and computer criminals.

In the late 1990s and into the 2000s, many large corporations began publicly integrating security into their organizations. Antivirus products became extremely popular.
‡ 2000 to Present

Today, the Internet brings millions of unsecured computer networks into continuous communication with each other. The security of each computer’s stored information is contingent on the security level of every other computer to which it is connected. Recent years have seen a growing awareness of the need to improve information security, as well as a realization that information security is important to national defense. The growing threat of cyberattacks has made governments and companies more aware of the need to defend the computerized control systems of utilities and other critical infrastructure. Another growing concern is the threat of nation-states engaging in information warfare, and the possibility that business and personal information systems could become casualties if they are undefended. Since 2000, Sarbanes-Oxley and other laws related to privacy and corporate responsibility have affected computer security.

The attack on the World Trade Center on September 11, 2001, resulted in major changes to legislation related to computer security, specifically to facilitate law enforcement’s ability to collect information about terrorism. The USA PATRIOT Act of 2001 and its follow-up laws, the USA PATRIOT Improvement and Reauthorization Act of 2005 and the PATRIOT Sunsets Extension Act of 2011, are discussed in Chapter 3.

For more information on the history of computer security, visit the NIST Computer Security site at http://csrc.nist.gov/publications/history/. NIST is the National Institute of Standards and Technology.
What Is Security?

Key Terms

C.I.A. triangle The industry standard for computer security since the development of the mainframe. The standard is based on three characteristics that describe the utility of information: confidentiality, integrity, and availability.

communications security The protection of all communications media, technology, and content.

information security Protection of the confidentiality, integrity, and availability of information assets, whether in storage, processing, or transmission, via the application of policy, education, training and awareness, and technology.

network security A subset of communications security; the protection of voice and data networking components, connections, and content.

physical security The protection of physical items, objects, or areas from unauthorized access and misuse.

security A state of being secure and free from danger or harm. Also, the actions taken to make someone or something secure.

Security is protection. Protection from adversaries—those who would do harm, intentionally or otherwise—is the ultimate objective of security. National security, for example, is a multilayered system that protects the sovereignty of a state, its assets, its resources, and its people. Achieving the appropriate level of security for an organization also requires a multifaceted system. A successful organization should have multiple layers of security in place to protect its operations, physical infrastructure, people, functions, communications, and information.

The Committee on National Security Systems (CNSS) defines information security as the protection of information and its critical elements, including the systems and hardware that use, store, and transmit the information.11 Figure 1-5 shows that information security includes the broad areas of information security management, data security, and network security. The CNSS model of information security evolved from a concept developed by the computer security industry called the C.I.A. triangle. The C.I.A. triangle (see Figure 1-6) has been the standard for computer security in both industry and government since the development of the mainframe. This standard is based on the three characteristics of information that give it value to organizations: confidentiality, integrity, and availability. The security of these three characteristics is as important today as it has always been, but the C.I.A. triangle model is generally viewed as no longer adequate in addressing the constantly changing environment. The threats to the confidentiality, integrity, and availability of information have evolved into a vast collection of events, including accidental or intentional damage, destruction, theft, unintended or unauthorized modification, or other misuse from human or nonhuman threats. This vast array of constantly evolving threats has prompted the development of a more robust model that addresses the complexities of the current information security environment. The expanded model consists of a list of critical characteristics of information, which are described in the next section. C.I.A. triangle terminology is used in this chapter because of the breadth of material that is based on it.

Figure 1-5 Components of information security, showing computer security, data security, network security, management of information security, information security governance, policy, and the goals of confidentiality, integrity, and availability. (© Cengage Learning 2015)

Figure 1-6 The C.I.A. triangle: confidentiality, integrity, and availability surrounding data and services. (© Cengage Learning 2015)

For more information on CNSS, visit www.cnss.gov and click the history link.
‡ Key Information Security Concepts

This book uses many terms and concepts that are essential to any discussion of information security. Some of these terms are illustrated in Figure 1-7; all are covered in greater detail in subsequent chapters.

Access A subject or object’s ability to use, manipulate, modify, or affect another subject or object. Authorized users have legal access to a system, whereas hackers must gain illegal access to a system. Access controls regulate this ability.

Asset The organizational resource that is being protected. An asset can be logical, such as a Web site, software information, or data; or an asset can be physical, such as a person, computer system, hardware, or other tangible object. Assets, particularly information assets, are the focus of what security efforts are attempting to protect.
Attack An intentional or unintentional act that can damage or otherwise compromise information and the systems that support it. Attacks can be active or passive, intentional or unintentional, and direct or indirect. Someone who casually reads sensitive information not intended for his or her use is committing a passive attack. A hacker attempting to break into an information system is an intentional attack. A lightning strike that causes a building fire is an unintentional attack. A direct attack is perpetrated by a hacker using a PC to break into a system. An indirect attack is a hacker compromising a system and using it to attack other systems—for example, as part of a botnet (slang for robot network). This group of compromised computers, running software of the attacker’s choosing, can operate autonomously or under the attacker’s direct control to attack systems and steal user information or conduct distributed denial-of-service attacks. Direct attacks originate from the threat itself. Indirect attacks originate from a compromised system or resource that is malfunctioning or working under the control of a threat.

Control, safeguard, or countermeasure Security mechanisms, policies, or procedures that can successfully counter attacks, reduce risk, resolve vulnerabilities, and otherwise improve security within an organization. The various levels and types of controls are discussed more fully in the following chapters.
Figure 1-7 Key concepts in information security (Sources, top left to bottom right: © iStockphoto/tadija, Internet Explorer, © iStockphoto/darrenwise, Internet Explorer, Microsoft Excel.)
The figure illustrates the terms with an example. Attack: Ima Hacker downloads an exploit from the MadHackz Web site and then accesses buybay’s Web site. Ima then applies the script, which runs and compromises buybay’s security controls and steals customer data. These actions cause buybay to experience a loss. Threat: theft. Threat agent: Ima Hacker. Exploit: script from the MadHackz Web site. Vulnerability: buffer overflow in the online database Web interface. Asset: buybay’s customer database.
Exploit A technique used to compromise a system. This term can be a verb or a noun. Threat agents may attempt to exploit a system or other information asset by using it illegally for their personal gain. Or, an exploit can be a documented process to take advantage of a vulnerability or exposure, usually in software, that is either inherent in the software or created by the attacker. Exploits make use of existing software tools or custom-made software components.

Exposure A condition or state of being exposed; in information security, exposure exists when a vulnerability is known to an attacker.

Loss A single instance of an information asset suffering damage or destruction, unintended or unauthorized modification or disclosure, or denial of use. When an organization’s information is stolen, it has suffered a loss.

Protection profile or security posture The entire set of controls and safeguards, including policy, education, training and awareness, and technology, that the organization implements to protect the asset. The terms are sometimes used interchangeably with the term security program, although a security program often comprises managerial aspects of security, including planning, personnel, and subordinate programs.

Risk The probability of an unwanted occurrence, such as an adverse event or loss. Organizations must minimize risk to match their risk appetite—the quantity and nature of risk they are willing to accept.

Subjects and objects A computer can be either the subject of an attack—an agent entity used to conduct the attack—or the object of an attack: the target entity, as shown in Figure 1-8. A computer can also be both the subject and object of an attack. For example, it can be compromised by an attack (object) and then used to attack other systems (subject).

Figure 1-8 Computer as the subject and object of an attack: a hacker uses one computer as the subject of an attack, sending requests across the Internet to a remote system that is the object of the attack and receiving stolen information in return. (© Cengage Learning 2015)

Threat A category of objects, people, or other entities that represents a danger to an asset. Threats are always present and can be purposeful or undirected. For example, hackers purposefully threaten unprotected information systems, while severe storms incidentally threaten buildings and their contents.

Threat agent The specific instance or a component of a threat. For example, the threat of “trespass or espionage” is a category of potential danger to information assets, while “external professional hacker” (like Kevin Mitnick, who was convicted of hacking into phone systems) is a specific threat agent. A lightning strike, hailstorm, or tornado is a threat agent that is part of the threat known as “acts of God/acts of nature.”

Vulnerability A weakness or fault in a system or protection mechanism that opens it to attack or damage. Some examples of vulnerabilities are a flaw in a software package, an unprotected system port, and an unlocked door. Some well-known vulnerabilities have been examined, documented, and published; others remain latent (or undiscovered).
‡ Critical Characteristics of Information

Key Terms

accuracy An attribute of information that describes how data is free of errors and has the value that the user expects.

authenticity An attribute of information that describes how data is genuine or original rather than reproduced or fabricated.

availability An attribute of information that describes how data is accessible and correctly formatted for use without interference or obstruction.

confidentiality An attribute of information that describes how data is protected from disclosure or exposure to unauthorized individuals or systems.

integrity An attribute of information that describes how data is whole, complete, and uncorrupted.

possession An attribute of information that describes how the data’s ownership or control is legitimate or authorized.

utility An attribute of information that describes how data has value or usefulness for an end purpose.

The value of information comes from the characteristics it possesses. When a characteristic of information changes, the value of that information either increases or, more commonly, decreases. Some characteristics affect information’s value to users more than others, depending on circumstances. For example, timeliness of information can be a critical factor because information loses much or all of its value when it is delivered too late. Though information security professionals and end users share an understanding of the characteristics of information, tensions can arise when the need to secure information from threats conflicts with the end users’ need for unhindered access to it. For instance, end users may perceive a 0.1-second delay in the computation of data to be an unnecessary annoyance. Information security professionals, however, may perceive 0.1 seconds as a minor delay that enables an important task, like data encryption. Each critical characteristic of information—that is, each element of the expanded C.I.A. triangle—is defined in the following sections.
Availability Availability enables authorized users—people or computer systems—to access information without interference or obstruction and to receive it in the required format. Consider, for example, research libraries that require identification before entrance. Librarians protect the contents of the library so that they are available only to authorized patrons. The librarian must accept a patron’s identification before the patron has free access to the book stacks. Once authorized patrons have access to the stacks, they expect to find the information they need in a usable format and familiar language. In this case, the information is bound in a book that is written in English.

Accuracy Information has accuracy when it is free from mistakes or errors and has the value that the end user expects. If information has been intentionally or unintentionally modified, it is no longer accurate. Consider a checking account, for example. You assume that the information in your account is an accurate representation of your finances. Incorrect information in the account can result from external or internal errors. If a bank teller, for instance, mistakenly adds or subtracts too much money from your account, the value of the information is changed. Or, you may accidentally enter an incorrect amount into your account register. Either way, an inaccurate bank balance could cause you to make other mistakes, such as bouncing a check.
Authenticity Authenticity of information is the quality or state of being genuine or original, rather than a reproduction or fabrication. Information is authentic when it is in the same state in which it was created, placed, stored, or transferred. Consider for a moment some common assumptions about e-mail. When you receive e-mail, you assume that a specific individual or group created and transmitted the e-mail—you assume you know its origin. This is not always the case. E-mail spoofing, the act of sending an e-mail message with a modified field, is a problem for many people today because the modified field is often the address of the originator. Spoofing the sender’s address can fool e-mail recipients into thinking that the messages are legitimate traffic, thus inducing them to open e-mail they otherwise might not have.
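To see why the sender’s address by itself proves nothing about authenticity, consider the minimal Python sketch below. It builds an e-mail message locally and sets the From header to an arbitrary value; the addresses and subject are hypothetical, and this is only an illustration of the point above, not a technique prescribed by this chapter.

from email.message import EmailMessage

# The From field is ordinary text supplied by the sender's software;
# nothing in the message itself verifies that it is genuine.
msg = EmailMessage()
msg["From"] = "ceo@example.com"      # hypothetical address chosen by the sender
msg["To"] = "amy@example.com"        # hypothetical recipient
msg["Subject"] = "Quarterly numbers"
msg.set_content("Please open the attached file.")

# The header reflects whatever the sender chose to write, which is why
# recipients and mail servers need mechanisms beyond the header itself
# to establish authenticity.
print(msg["From"])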
Confidentiality Information has confidentiality when it is protected from disclosure or exposure to unauthorized individuals or systems. Confidentiality ensures that only users with the rights and privileges to access information are able to do so. When unauthorized individuals or systems can view information, confidentiality is breached. To protect the confidentiality of information, you can use several measures, including the following:

• Information classification
• Secure document storage
• Application of general security policies
• Education of information custodians and end users

Confidentiality, like most characteristics of information, is interdependent with other characteristics and is most closely related to the characteristic known as privacy. The relationship between these two characteristics is covered in more detail in Chapter 3, “Legal, Ethical, and Professional Issues in Information Security.”

The value of information confidentiality is especially high for personal information about employees, customers, or patients. People who transact with an organization expect that their personal information will remain confidential, whether the organization is a federal agency, such as the Internal Revenue Service, or a business. Problems arise when companies disclose confidential information. Sometimes this disclosure is intentional, but disclosure of confidential information also happens by mistake—for example, when confidential information is mistakenly e-mailed to someone outside the organization rather than to someone inside it.

Other examples of confidentiality breaches are an employee throwing away a document of critical information without shredding it, or a hacker who successfully breaks into an internal database of a Web-based organization and steals sensitive information about the clients, such as names, addresses, and credit card numbers.

As a consumer, you give up pieces of personal information in exchange for convenience or value almost daily. By using a “members” card at a grocery store, you disclose some of your spending habits. When you fill out an online survey, you exchange pieces of your personal history for access to online privileges. When you sign up for a free magazine, Web resource, or free software application, you provide personally identifiable information (PII). The bits and pieces of personal information you disclose are copied, sold, replicated, distributed, and eventually coalesced into profiles and even complete dossiers of yourself and your life.
OFFLINE
Unintentional Disclosures

The number of unintentional information releases due to malicious attacks is substantial. Millions of people lose information to hackers and malware-focused attacks annually. However, organizations occasionally lose, misplace, or inadvertently release information in an event not caused by hackers or other electronic attacks.

In January 2008, GE Money, a division of General Electric, revealed that a data backup tape with credit card data from approximately 650,000 customers and over 150,000 Social Security numbers went missing from a records management company’s storage facility. Approximately 230 retailers were affected when Iron Mountain, Inc., announced it couldn’t find a magnetic tape.12

In February 2005, the data aggregation and brokerage firm ChoicePoint revealed that it had been duped into releasing personal information about 145,000 people to identity thieves during 2004. The perpetrators used stolen identities to create ostensibly legitimate business entities, which then subscribed to ChoicePoint to acquire the data fraudulently. The company reported that the criminals opened many accounts and recorded personal information, including names, addresses, and identification numbers. They did so without using any network or computer-based attacks; it was simple fraud. The fraud was feared to have allowed the perpetrators to arrange hundreds of identity thefts.

The giant pharmaceutical organization Eli Lilly and Co. released the e-mail addresses of 600 patients to one another in 2001. The American Civil Liberties Union (ACLU) denounced this breach of privacy, and information technology industry analysts noted that it was likely to influence the public debate on privacy legislation. The company claimed the mishap was caused by a programming error that occurred when patients who used a specific drug produced by Lilly signed up for an e-mail service to access company support materials.

In another incident in 2005, the intellectual property of Jerome Stevens Pharmaceuticals, a small prescription drug manufacturer from New York, was compromised when the U.S. Food and Drug Administration (FDA) released documents the company had filed with the agency. It remains unclear whether the release was purposeful or a simple error, but the company secrets were posted to a public Web site for several months before being removed.

Integrity Information has integrity when it is whole, complete, and uncorrupted. The integrity of information is threatened when it is exposed to corruption, damage, destruction, or other disruption of its authentic state. Corruption can occur while information is being stored or transmitted. Many computer viruses and worms are designed with the explicit purpose of corrupting data. For this reason, a key method for detecting a virus or worm is to look for changes in file integrity, as shown by the file size. Another key method of assuring information integrity is file hashing, in which a file is read by a special algorithm that uses the bit values in the file to compute a single large number called a hash value. The hash value for any combination of bits is unique. If a computer system performs the same hashing algorithm on a file and obtains a different number than the file’s recorded hash value, the file has been compromised and the integrity of the information is lost. Information integrity is the cornerstone of information systems because information is of no value or use if users cannot verify its integrity. File hashing and hash values are examined in detail in Chapter 8, “Cryptography.”
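The short Python sketch below makes the file-hashing idea concrete. It is only an illustrative example using a throwaway sample file and the SHA-256 algorithm from the standard library; the chapter itself does not prescribe a particular algorithm or tool.

import hashlib

def hash_file(path: str) -> str:
    """Read a file in chunks and return its SHA-256 hash value as a hex string."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Create a small sample file and record its hash while it is known to be good.
with open("sample.dat", "wb") as f:
    f.write(b"account=12345;balance=1000.00\n")
recorded_hash = hash_file("sample.dat")

# Later, an unauthorized (or accidental) change alters the file's contents.
with open("sample.dat", "ab") as f:
    f.write(b"balance=999999.99\n")

# Recomputing the hash and comparing it to the recorded value detects the change.
if hash_file("sample.dat") != recorded_hash:
    print("Integrity check failed: contents no longer match the recorded hash value.")
else:
    print("Integrity check passed.")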
For more details on information losses caused by attacks, visit Wikipedia.org and search on the terms “Data breach” and “Timeline of Computer Security Hacker History.”

File corruption is not necessarily the result of external forces, such as hackers. Noise in the transmission media, for instance, can also cause data to lose its integrity. Transmitting data on a circuit with a low voltage level can alter and corrupt the data. Redundancy bits and check bits can compensate for internal and external threats to the integrity of information. During each transmission, algorithms, hash values, and error-correcting codes ensure the integrity of the information. Data whose integrity has been compromised is retransmitted.
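As a minimal sketch of the check-bit idea mentioned above, the Python example below computes a single even-parity bit for a message and shows how flipping one bit in transit is detected, prompting retransmission. The message contents are hypothetical, and real links use far stronger error-detecting and error-correcting codes; this only illustrates the principle.

def parity_bit(data: bytes) -> int:
    """Return the even-parity check bit: 1 if the number of 1 bits in data is odd."""
    ones = sum(bin(byte).count("1") for byte in data)
    return ones % 2

message = b"PAY $100"            # hypothetical message to transmit
sent_parity = parity_bit(message)

# Simulate noise on the transmission medium flipping one bit of the message.
corrupted = bytes([message[0] ^ 0b00000001]) + message[1:]

# The receiver recomputes the check bit; a mismatch means the data lost integrity
# and must be retransmitted.
if parity_bit(corrupted) != sent_parity:
    print("Parity mismatch: data corrupted in transit, requesting retransmission.")
else:
    print("Parity check passed.")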
Utility The utility of information is the quality or state of having value for some purpose or end. In other words, information has value when it can serve a purpose. If information is available but is not in a meaningful format to the end user, it is not useful. For example, U.S. Census data can quickly become overwhelming and difficult for a private citizen to interpret; however, for a politician, the same data reveals information about residents in a district, such as their race, gender, and age. This information can help form a politician’s next campaign strategy.

Possession The possession of information is the quality or state of ownership or control. Information is said to be in one’s possession if one obtains it, independent of format or other characteristics. While a breach of confidentiality always results in a breach of possession, a breach of possession does not always lead to a breach of confidentiality. For example, assume a company stores its critical customer data using an encrypted file system. An employee who has quit decides to take a copy of the tape backups and sell the customer records to the competition. The removal of the tapes from their secure environment is a breach of possession. But, because the data is encrypted, neither the former employee nor anyone else can read it without the proper decryption methods; therefore, there is no breach of confidentiality. Today, people who are caught selling company secrets face increasingly stiff fines and a strong likelihood of jail time. Also, companies are growing more reluctant to hire people who have demonstrated dishonesty in their past.
CNSS Security Model

The definition of information security in this text is based in part on the CNSS document called the National Training Standard for Information Systems Security Professionals, NSTISSI No. 4011. The hosting organization is the Committee on National Security Systems, which is responsible for coordinating the evaluation and publication of standards related to the protection of National Security Systems (NSS). CNSS was originally called the National Security Telecommunications and Information Systems Security Committee (NSTISSC) when it was established in 1990 by National Security Directive (NSD) 42, National Policy for the Security of National Security Telecommunications and Information Systems. NSTISSI 4011 presents a comprehensive information security model and has become a widely accepted evaluation standard for the security of information systems. The CNSS standards are expected to be replaced by the new NIST SP 800-16, “Information Technology Security Training Requirements: A Role-Based Model for Federal Information Technology/Cyber Security Training,” in the near future.

For more information on CNSS and its standards, see www.cnss.gov/CNSS/issuances/Instructions.cfm.

The model, which was created by John McCumber in 1991, provides a graphical representation of the architectural approach widely used in computer and information security; it is now known as the McCumber Cube.14 As shown in Figure 1-9, the McCumber Cube shows three dimensions. If extrapolated, the three dimensions of each axis become a 3×3×3 cube with 27 cells representing areas that must be addressed to secure today’s information systems. To ensure system security, each of the 27 areas must be properly addressed during the security process. For example, the intersection of technology, integrity, and storage requires a control or safeguard that addresses the need to use technology to protect the integrity of information while in storage. One such control might be a system for detecting host intrusion that protects the integrity of information by alerting security administrators to the potential modification of a critical file. A common omission from such a model is the need for guidelines and policies that provide direction for the practices and implementations of technologies. The need for policy is discussed in subsequent chapters of this book.
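A small Python sketch of the extrapolation described above follows. Because Figure 1-9 is not reproduced in this excerpt, the axis labels are an assumption based on the usual presentation of the McCumber Cube (goals of confidentiality, integrity, and availability; information states of storage, processing, and transmission; safeguards of policy, education, and technology); the excerpt itself only names the technology, integrity, and storage cell.

from itertools import product

# Assumed axis labels; only the technology/integrity/storage cell is named in the text.
goals = ["confidentiality", "integrity", "availability"]
states = ["storage", "processing", "transmission"]
safeguards = ["policy", "education", "technology"]

# Extrapolating the three 3-value axes yields the 3 x 3 x 3 = 27 cells, each an
# area that must be addressed when securing an information system.
cells = list(product(goals, states, safeguards))
print(len(cells))  # 27

# The cell discussed in the text: protecting the integrity of stored information
# with a technology control such as host intrusion detection.
print(("integrity", "storage", "technology") in cells)  # True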
Key Term

McCumber Cube A graphical representation of the architectural approach widely used in computer and information security; commonly shown as a cube composed of 3×3×3 cells, similar to a Rubik’s Cube.