elsevier_cybersecurity_webinar

Oct 26th, 2020 (edited)
WEBVTT

1
00:02:38.430 --> 00:02:39.090
Nick Fowler: You're on mute.

2
00:02:40.620 --> 00:02:41.130
Nick Fowler: Or unmute.

3
00:02:50.400 --> 00:02:50.700
All right.

4
00:02:53.040 --> 00:03:10.770
Okere, Kelechi N. (ELS-NYC): Hello, everyone. Thank you. Thank you for joining us. Welcome to the SNSI Security Summit. My name is Kelechi Okere. I am the global director of Elsevier's Seamless Access initiative, and I'll be the co-moderator of this event, along with my colleague, Daniel Ascher from Springer Nature.

5
00:03:12.270 --> 00:03:18.840
Okere, Kelechi N. (ELS-NYC): Before introducing the speaker for our opening remarks, I'd like to go over some program logistics.

6
00:03:19.830 --> 00:03:34.950
Okere, Kelechi N. (ELS-NYC): The theme of today's event is "Cybersecurity Landscape: Protecting the Scholarly Infrastructure." Speaker bios are available on the SNSI website; you can scan the QR code that you see on your screen to get to the website.

7
00:03:36.120 --> 00:03:40.890
Okere, Kelechi N. (ELS-NYC): You can also access the website by clicking on the link in the chat box.

8
00:03:43.080 --> 00:04:00.090
Okere, Kelechi N. (ELS-NYC): The program today will run from 11am Eastern Standard Time until 3:30pm Eastern Standard Time, and the hashtag for this event is SNSISecurity2020, so please do engage on social media with the hashtag.

9
00:04:04.500 --> 00:04:24.510
Okere, Kelechi N. (ELS-NYC): Again, the purpose of this virtual Security Summit by the Scholarly Networks Security Initiative, SNSI for short, is to discuss security threats to the research ecosystem, with the aim to engender closer collaboration between publishers and academics in dealing with the threats.

10
00:04:25.680 --> 00:04:37.140
Okere, Kelechi N. (ELS-NYC): So just to go over the program today. We've assembled a list of fantastic experts on this subject to speak to you, and we hope you'll enjoy their presentations.

11
00:04:37.920 --> 00:04:50.430
Okere, Kelechi N. (ELS-NYC): We'll start with opening remarks by Nick Fowler, who is the Chief Academic Officer at Elsevier. He will be followed by Corey Roach, who is the CISO at the University of Utah.

12
00:04:52.050 --> 00:05:03.210
Okere, Kelechi N. (ELS-NYC): Corey will then be followed by Crane Hassold, who's a former FBI agent and currently the Senior Director of Threat Research at Agari, and he'll talk to us about

13
00:05:03.810 --> 00:05:17.790
Okere, Kelechi N. (ELS-NYC): Sci-Hub and other state-sponsored or individual bad actors. Then we'll break for lunch. Well, lunch for those of us in the US, and a break for those of you elsewhere.

14
00:05:18.510 --> 00:05:28.710
Okere, Kelechi N. (ELS-NYC): When we come back, we'll hear from Linda Van Keuren, who's the Assistant Dean for Resources and Access Management at the Dahlgren Memorial Library, Georgetown University Medical Center.

15
00:05:29.700 --> 00:05:38.670
Okere, Kelechi N. (ELS-NYC): She'll talk to us about library patron security and why it's important. Then we'll hear from Joe DeMarco, who is a partner at DeVore & DeMarco LLP,

16
00:05:39.180 --> 00:05:47.850
Okere, Kelechi N. (ELS-NYC): about foreign interference in academia. We'll have a break; after the break, we'll hear from Tim Lloyd, CEO of LibLynx,

17
00:05:48.360 --> 00:06:02.100
Okere, Kelechi N. (ELS-NYC): who will talk to us about federated authentication and how that helps with security. We'll then round things out with a roundtable discussion, which will be moderated by Rick Anderson, the University Librarian at Brigham Young University.

18
00:06:03.300 --> 00:06:15.630
Okere, Kelechi N. (ELS-NYC): Then we'll have the closing remarks and other logistics. The closing remarks will be given to us by Steven Inchcoombe, who is the Chief Publishing and Solutions Officer at Springer Nature.

19
00:06:17.490 --> 00:06:33.960
Okere, Kelechi N. (ELS-NYC): I want to thank my colleagues at Elsevier, Springer Nature, Taylor & Francis, Brigham Young University, HP just associates, who contributed to putting this program together. I also want to thank our speakers for the generosity of their time and expertise.

20
00:06:35.490 --> 00:06:47.250
Okere, Kelechi N. (ELS-NYC): Just to get started: in the days of in-person conferences, you could always get a feel of the room by just looking around and seeing who's there and talking to people.

21
00:06:47.730 --> 00:06:58.290
Okere, Kelechi N. (ELS-NYC): So today, in terms of registrations, we have about 16 countries represented, about 56 universities, 16 publishers, and 13 other types of

22
00:06:58.740 --> 00:07:10.290
Okere, Kelechi N. (ELS-NYC): organizations. In total, with some last-minute registrations, the registrations are about 165, so hopefully all of those people showed up.

23
00:07:11.730 --> 00:07:21.630
Okere, Kelechi N. (ELS-NYC): Just some housekeeping tips. Again, we thank you for joining. Everyone is on mute; the webcast audio will be broadcast through your computer speakers.

24
00:07:22.500 --> 00:07:33.300
Okere, Kelechi N. (ELS-NYC): We ask you to check your volume and mute function if you cannot hear the webcast. The recording will be available on the SNSI website, and you'll receive a link to it after the event.

25
00:07:35.130 --> 00:07:47.160
Okere, Kelechi N. (ELS-NYC): Please use the Q&A box to post questions for panelists. Except for two presenters, questions will be addressed during the roundtable; panelists will also answer some questions as we go.

26
00:07:48.900 --> 00:07:58.260
Okere, Kelechi N. (ELS-NYC): And please use the chat box for comments and general conversation. Again, try not to pose questions there, because questions can tend to get lost in the chat box.

27
00:08:00.030 --> 00:08:10.110
Okere, Kelechi N. (ELS-NYC): Now for the opening remarks, I'd like to introduce you to Nick Fowler, who is the Chief Academic Officer at Elsevier and also the co-chair of SNSI. Thank you.

28
00:08:12.030 --> 00:08:23.250
Nick Fowler: Thank you. And thank you all for joining us today. It's a real pleasure to be here. I'll take just a couple of minutes to kick us off, so we can get into the substance of today's sessions.

29
00:08:24.240 --> 00:08:40.470
Nick Fowler: We're here today because each one of us is a stakeholder in the scholarly ecosystem, and IT security is important to all of us. During the pandemic, we've seen news articles on hackers targeting universities, especially in the US, Canada, and UK,

30
00:08:41.550 --> 00:08:49.320
Nick Fowler: trying to steal COVID-19 vaccine research and other assets. Sadly, this is no surprise. Education

31
00:08:50.370 --> 00:09:04.830
Nick Fowler: is the largest sector targeted by cyber attacks, putting our industry ahead of the retail sector. University systems, for example, routinely store a tremendous amount of personal data, making them dangerously attractive targets.

32
00:09:06.300 --> 00:09:14.370
Nick Fowler: The UK's National Cyber Security Centre last year published its first report on cyber threats to UK universities.

33
00:09:14.970 --> 00:09:33.270
Nick Fowler: The report noted that the effects of state-sponsored espionage include damage to the value of research, notably in STEM subjects, a fall in investment by the public or private sector in affected universities, and damage to the UK's knowledge advantage.

34
00:09:34.470 --> 00:09:54.180
Nick Fowler: Earlier this year, the White House's Office of Science and Technology Policy, or OSTP, gave a presentation on foreign interference. Among the key takeaways was that hidden diversions of intellectual property weaken the US innovation base and threaten our security and economic competitiveness.

35
00:09:55.770 --> 00:10:02.700
Nick Fowler: This is why the Scholarly Networks Security Initiative, or SNSI for short, was formed:

36
00:10:03.870 --> 00:10:15.960
Nick Fowler: to bring together librarians, academic technology and security experts, publishers large and small, learned societies, and anyone with an interest in the scholarly ecosystem.

37
00:10:16.860 --> 00:10:27.000
Nick Fowler: Together, we aim to solve the cyber challenges threatening the integrity of the scientific record, the scholarly systems, and the safety of personal data.

38
00:10:28.560 --> 00:10:43.470
Nick Fowler: So we thank you for your time today. We hope what we hear today will inspire and foster greater collaboration between all of us so we can make progress against these very serious challenges. I'd like to hand back to Kelechi to introduce our first speaker. Thank you.

39
00:10:45.960 --> 00:10:47.220
Okere, Kelechi N. (ELS-NYC): All right. Thank you, Nick.

40
00:10:49.590 --> 00:10:53.400
Okere, Kelechi N. (ELS-NYC): Before we introduce our first speaker, I'd like to just

41
00:10:55.080 --> 00:10:56.820
Okere, Kelechi N. (ELS-NYC): present you with a poll.

42
00:10:57.900 --> 00:11:00.840
Okere, Kelechi N. (ELS-NYC): You should see the poll on your screen.

43
00:11:02.100 --> 00:11:06.420
Okere, Kelechi N. (ELS-NYC): And I'll give it a couple of minutes. I encourage everyone to

44
00:11:08.130 --> 00:11:15.390
Okere, Kelechi N. (ELS-NYC): participate in the poll. We'd just like to get an idea of where people are on this topic.

45
00:11:27.510 --> 00:11:36.690
Nick Fowler: For some reason I'm getting a note saying hosts and panelists cannot vote. So I'm not sure if something needs to be activated, or we've been intercepted already.

46
00:11:38.190 --> 00:11:39.540
Okere, Kelechi N. (ELS-NYC): Yeah, no, I

47
00:11:41.460 --> 00:11:43.380
Okere, Kelechi N. (ELS-NYC): think that's intentional, that

48
00:11:44.430 --> 00:11:45.990
Okere, Kelechi N. (ELS-NYC): the panelists can't participate.

49
00:11:48.420 --> 00:11:48.840
Nick Fowler: Okay.

50
00:11:48.900 --> 00:11:49.530
Good, yeah.

51
00:11:50.790 --> 00:11:53.790
Nick Fowler: I'm glad you kept me honest here. Thank you.

52
00:11:54.240 --> 00:11:54.690
Yeah.

53
00:12:03.930 --> 00:12:07.920
Okere, Kelechi N. (ELS-NYC): Alright, 69% of everyone has voted.

54
00:12:09.060 --> 00:12:16.920
Okere, Kelechi N. (ELS-NYC): I just want to leave it open for a few more seconds, just encouraging everyone else to vote.

55
00:12:18.840 --> 00:12:19.980
Okere, Kelechi N. (ELS-NYC): See if we can get

56
00:12:22.080 --> 00:12:24.570
Okere, Kelechi N. (ELS-NYC): it above 70% participation.

57
00:12:34.470 --> 00:12:40.080
Okere, Kelechi N. (ELS-NYC): Awesome. So we're now up to 74%; a few more if you want, please.

58
00:12:45.120 --> 00:12:47.940
Okere, Kelechi N. (ELS-NYC): And then I'll show you the results as well.

59
00:12:49.620 --> 00:12:55.980
Okere, Kelechi N. (ELS-NYC): Alright, so just a few more seconds here, and then I'll close the poll and show you the results.

60
00:13:00.630 --> 00:13:01.110
Okere, Kelechi N. (ELS-NYC): Alright.

61
00:13:07.770 --> 00:13:08.670
Okere, Kelechi N. (ELS-NYC): Alright, so

62
00:13:11.100 --> 00:13:23.910
Okere, Kelechi N. (ELS-NYC): The question: how concerned are you that cybersecurity is a threat to the scholarly infrastructure? And by infrastructure we mean how peer-reviewed literature and licensed content is shared, funded, and trusted.

63
00:13:25.320 --> 00:13:32.430
Okere, Kelechi N. (ELS-NYC): 60% of you said "I think about this issue a lot." So this is really good to see that, you know,

64
00:13:34.230 --> 00:13:40.530
Okere, Kelechi N. (ELS-NYC): yeah, you know, the majority of people here are thinking about this a lot. So

65
00:13:42.510 --> 00:13:47.010
Okere, Kelechi N. (ELS-NYC): with this, we can then get started, and I'll

66
00:13:48.090 --> 00:13:52.650
Okere, Kelechi N. (ELS-NYC): call upon my colleague Daniel Ascher to introduce our speaker.

67
00:13:54.570 --> 00:14:05.970
Daniel Ascher: Thank you, Kelechi, and thank you, Nick. So for our keynote speaker, we will now be starting with Corey Roach, the chief information security officer from the University of Utah. Take it away.

68
00:14:10.710 --> 00:14:15.120
Corey Roach: Good morning everyone, or afternoon for our friends in Europe.

69
00:14:16.410 --> 00:14:18.420
Corey Roach: Let's see if we can get this up on the screen.

70
00:14:21.630 --> 00:14:22.320
Right.

71
00:14:27.240 --> 00:14:32.400
Corey Roach: So as Daniel mentioned, I am Corey Roach, the chief information security officer for the University of Utah.

72
00:14:32.850 --> 00:14:42.390
Corey Roach: I joined the University of Utah about 22 years ago. I kind of came up through the technical ranks, and I've focused on information security for much of that time.

73
00:14:43.290 --> 00:14:50.820
Corey Roach: When I started with the university, there were basically three of us in the information security office for the entire University of Utah.

74
00:14:51.300 --> 00:14:59.700
Corey Roach: Now we have 34 employees, plus a handful of student interns. I give you a little bit of that background just because I wanted you to understand that,

75
00:15:00.090 --> 00:15:06.900
Corey Roach: while I'm not a librarian, a researcher, or a publisher, and therefore I don't know everything that is going on in this field,

76
00:15:07.440 --> 00:15:17.040
Corey Roach: I do know a lot about the threats that you are up against. And I can tell you that, as the saying kind of goes, the chain is only as strong as the weakest link.

77
00:15:17.790 --> 00:15:27.420
Corey Roach: Unfortunately, in the chain of authors, publishers, and researchers, the information security around libraries providing access to that research is a pretty weak link.

78
00:15:28.860 --> 00:15:41.760
Corey Roach: So hopefully by the end of this presentation, someone will tell me that maybe they're way ahead of me. Or maybe they like what we've brought up here and it's something they want to work on along those lines. Or maybe it'll just get you thinking and you'll have an idea that's even better.

79
00:15:43.950 --> 00:15:53.820
Corey Roach: So when I was asked to talk about this topic, actually, I thought it probably wasn't a terribly interesting one, to be honest. But as I researched it, it kind of became more and more

80
00:15:54.360 --> 00:16:11.790
Corey Roach: intriguing as I got into the details of it. And just to restate part of the problem: as Nick mentioned, we're concerned partly about the theft of data and the reduction of its value, and disrupting that publishing model, which has a lot of knock-on and downstream effects.

81
00:16:12.810 --> 00:16:23.580
Corey Roach: It's an interesting problem, partly because there are unique privacy requirements here, with lots of industries requiring privacy for one aspect or another, but not very many where the

82
00:16:25.380 --> 00:16:31.020
Corey Roach: consumer is kind of anonymized from the provider, where they don't actually see all of their customers.

83
00:16:32.100 --> 00:16:39.960
Corey Roach: The assets involved can be fragile. The rate of devaluation of those assets is interesting; you know, having an asset lost once probably doesn't

84
00:16:40.530 --> 00:16:58.680
Corey Roach: always reduce it to no value, but the more it is lost, the more its value reduces. There are limited resources involved; you know, libraries are not known for being terribly well funded. Even if universities, their parent organizations, are well funded, oftentimes people like myself

85
00:16:59.700 --> 00:17:09.510
Corey Roach: will direct those resources toward areas where there is risk to the organization rather than our partners, and oftentimes that's not the library.

86
00:17:10.620 --> 00:17:20.340
Corey Roach: There's also limited legal support. In my experience, there's been practically no direct interest at a local level, and very little at a federal or international level.

87
00:17:21.240 --> 00:17:32.580
Corey Roach: And although that might increase as some of the awareness around state-sponsored threats comes up, it's not likely to rise soon to the level of prominence of things like child exploitation or extortion and those types of things. So

88
00:17:33.000 --> 00:17:37.920
Corey Roach: it's pretty unlikely that law enforcement is going to take a strong hand in this in the short run.

89
00:17:40.140 --> 00:17:46.110
Corey Roach: So as we talk about threats, I mean, one of the things that's important to start with is the threat vector. So how is this happening?

90
00:17:46.530 --> 00:17:55.860
Corey Roach: Most of this, it seems, is happening with bots or scripts that are using valid credentials in order to scrape information off of publishers.

91
00:17:56.520 --> 00:18:00.540
Corey Roach: And it's important to look at how those credentials are obtained.

92
00:18:01.020 --> 00:18:15.840
Corey Roach: So the first two on that list, phishing and social engineering, are actually very similar in that it is typically an attacker getting a user to unwittingly give up their credentials, oftentimes without the user even realizing that they have done it.

93
00:18:16.950 --> 00:18:23.670
Corey Roach: Credential reuse, there, is when a user uses the same username and password, often their email address,

94
00:18:24.360 --> 00:18:38.820
Corey Roach: for more than one location. And if a less secure site is compromised, then those credentials can be used at other locations. And if that name happens to end in a .edu, it's pretty obvious where to try those credentials.

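[Editor's note: the credential-reuse risk described above is commonly mitigated by screening passwords against known-breach data. A minimal sketch, assuming a local breach corpus; the hard-coded `BREACHED_SHA1` set is a stand-in for a real dataset or lookup service, not something from the talk:]

```python
import hashlib

# Stand-in for a corpus of SHA-1 hashes of passwords seen in public breaches.
# A real deployment would load a large external dataset, not a literal set.
BREACHED_SHA1 = {
    hashlib.sha1(pw.encode()).hexdigest()
    for pw in ["password123", "qwerty", "letmein"]
}

def is_reused_breached_password(password: str) -> bool:
    """Return True if the candidate password appears in the breach corpus."""
    digest = hashlib.sha1(password.encode()).hexdigest()
    return digest in BREACHED_SHA1

print(is_reused_breached_password("letmein"))  # True
```

Rejecting such passwords at set/reset time blunts credential-stuffing against the campus identity provider, since leaked third-party passwords no longer work on the .edu account.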
95
00:18:40.320 --> 00:18:48.390
Corey Roach: The last one on the list is activism, and there are a lot of aspects to that. But in this case, we're talking about people giving up their credentials

96
00:18:49.020 --> 00:18:58.020
Corey Roach: for something they believe in, whether that's patriotism for their country or some kind of ethical objection. They know they're not supposed to give away their credentials, but they do it anyway.

97
00:19:01.200 --> 00:19:11.760
Corey Roach: But any way we look at it, we can safely say that this is primarily a people problem, and technology alone is not going to solve that problem. Technology can help us take reasonable precautions,

98
00:19:12.330 --> 00:19:19.230
Corey Roach: but we kind of risk creating an arms race, which we don't want to do. And so long as the business model involves

99
00:19:19.950 --> 00:19:25.980
Corey Roach: allowing access to the data that we're providing and also trying to protect that same data, we're unlikely to stop theft entirely.

100
00:19:26.520 --> 00:19:38.160
Corey Roach: Fortunately, this industry is not the first to face this type of a challenge. The first couple that come to mind for me are the Motion Picture Association and the recording industry.

101
00:19:39.690 --> 00:19:48.060
Corey Roach: Today, most of us happily stream music from a subscription service, and music piracy has almost dropped off the map.

102
00:19:48.540 --> 00:19:56.370
Corey Roach: On the other hand, people are frustrated with having to subscribe to multiple services and have a cable bill to access most movies.

103
00:19:56.880 --> 00:20:09.780
Corey Roach: And they still don't get the blockbuster movies, often, until they come out on DVD. As a result, movie piracy is still pretty rampant. I know my organization fields DMCA complaints almost every day.

104
00:20:10.590 --> 00:20:19.950
Corey Roach: So I'm not saying either one of those models necessarily solves this problem, but I think there are certainly some takeaways and lessons that we can have from their

105
00:20:20.700 --> 00:20:28.050
Corey Roach: experience. One would be to be adaptable, because technology is always evolving, and if you don't innovate, you'll be left behind by those that do.

106
00:20:29.070 --> 00:20:38.160
Corey Roach: Another is to kind of show value. If I get more value from the locations that are doing piracy, then why would I go the legitimate route?

107
00:20:39.360 --> 00:20:47.370
Corey Roach: We don't want to put up any unnecessary barriers. You know, people, often like water, follow the path of least resistance, and we want that to lead to our product.

108
00:20:48.090 --> 00:20:54.480
Corey Roach: And then lastly, and this one seems kind of obvious to me, but we don't want to be attacking our customers, either.

109
00:20:54.930 --> 00:21:06.060
Corey Roach: We live in a social media world, and word does get around. It seems like the recording industry has learned this one a lot faster than the Motion Picture Association. But

110
00:21:06.990 --> 00:21:17.250
Corey Roach: keeping those in mind, and keeping in mind that technology is not necessarily a panacea, what I want to propose today is a better way for defending that material, I think.

111
00:21:19.440 --> 00:21:25.080
Corey Roach: Before we get started, I want to also kind of go over the technologies that I'm going to talk about, just so we have some common vocabulary.

112
00:21:26.700 --> 00:21:33.060
Corey Roach: First off is a web server, which obviously serves up content; we're all used to interacting with those.

113
00:21:33.660 --> 00:21:40.980
Corey Roach: There's an automation server; this allows various technologies to work with each other and create an automated response.

114
00:21:41.580 --> 00:21:49.050
Corey Roach: There are analysis engines that monitor things like logs and other contextual information to look for bad activity.

115
00:21:49.980 --> 00:22:04.740
Corey Roach: Multifactor devices, something most of us have interacted with now: it could be an app on your phone, it could be an SMS message, it could even be a token. You may have used one at your employer or bank, and even some of the online games use them now.

116
00:22:06.360 --> 00:22:13.080
Corey Roach: Then we have an identity store, which at minimum is going to be a username and password. Preferably, it has more context than that.

117
00:22:14.490 --> 00:22:24.990
Corey Roach: And then we have also a log storage, which does just that: it stores logs. Our customer, in this case, is likely to be a student, researcher, or medical professional, etc.

118
00:22:25.650 --> 00:22:33.690
Corey Roach: We have a web proxy, which downloads data on the user's behalf. It might be there to protect the user, or it might be, in this case, us, to anonymize their access.

119
00:22:34.200 --> 00:22:45.330
Corey Roach: And lastly, we have a web application firewall, which is similar to a normal network firewall, but it's intended specifically to protect a web application and often has additional features to do that.

120
00:22:47.010 --> 00:22:55.530
Corey Roach: So at a typical library, this is kind of the layout for how access to those resources happens. And I'll kind of walk you through it.

121
00:22:55.860 --> 00:23:07.890
Corey Roach: So, the arrows on this diagram: the green ones are the internet, the orange ones are the back-end network, blue is library logging, and purple is publisher logging. But you don't actually need to really remember that; just bear in mind that they're separate processes.

122
00:23:09.120 --> 00:23:13.860
Corey Roach: So in this design, primarily the focus is privacy, not security. So

123
00:23:14.400 --> 00:23:20.820
Corey Roach: a user will request a resource. The first thing that will happen is they'll be asked to authenticate, which usually is just a username and a password.

124
00:23:21.240 --> 00:23:36.090
Corey Roach: Once that's done, they can then send their request through and receive the materials back. Each time something happens on the library side, it gets logged back to the library server; anything that happens on the publisher side gets logged to theirs. So what does that leave us with?

125
00:23:37.440 --> 00:23:45.690
Corey Roach: Typically, it kind of looks like this. And in essence, the library has some limited info, and the publishers tend to have even less.

126
00:23:46.530 --> 00:23:59.880
Corey Roach: It does what it was intended to do. It's great for privacy, but not so great for security. So understanding that, let's kind of see what happens when this turns into this.

127
00:24:01.230 --> 00:24:08.070
Corey Roach: Now that we have a bot involved in there, we have bad activity. Typically, in my experience, the process has been that

128
00:24:08.430 --> 00:24:14.310
Corey Roach: the publisher is the first to notice the anomaly. Usually, that seems like it's manual; some of them may be automated.

129
00:24:14.700 --> 00:24:24.450
Corey Roach: But they send a manual notice over to the university, and the library staff then will comb through their logs and then usually manually contact IT and try and get an account turned off.

130
00:24:25.410 --> 00:24:32.130
Corey Roach: Oftentimes, this can take hours or weeks, which is way too slow when we're talking about an automated process like a bot.

131
00:24:34.500 --> 00:24:45.150
Corey Roach: In addition to that, the publishers often only really have one recourse, and it's the sledgehammer of turning off access for everyone that's using that proxy. So that's a pretty broad stroke.

132
00:24:47.640 --> 00:24:54.360
Corey Roach: So let's step back, though, and kind of look at how a typical web application works in comparison.

133
00:24:54.900 --> 00:25:03.630
Corey Roach: So in a web application, usually that user will request a resource, and again, they'll be asked to authenticate. But we're going to add two-factor authentication into this mix.

134
00:25:03.990 --> 00:25:18.450
Corey Roach: So we get some additional information from the device that they're authenticating with, and it makes it so that we're pretty sure that the person we're talking to is the account holder. It's not bulletproof, but it's a lot better than just a username and a password.

135
00:25:19.590 --> 00:25:24.900
Corey Roach: Once they're done with that, again, they can just kind of request whatever the resource is they're trying to access.

136
00:25:25.350 --> 00:25:37.470
Corey Roach: And again, all those back-end systems log back to a log server. But now we're introducing that analysis engine, and it is looking at those logs, and it's using context from other sources, like the identity store,

137
00:25:38.550 --> 00:25:55.320
Corey Roach: to look for suspicious behavior. If it finds something that it doesn't like, it can then use the automation server to send out messages to other parts of the network and say things like, stop talking to this person, or ask for additional authentication, or slow this process down. So

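[Editor's note: the analyze-then-automate loop described in the preceding cues can be sketched roughly as below. The log fields, thresholds, and action names are illustrative assumptions, not details from the talk:]

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class LogEvent:
    user: str   # authenticated username
    ip: str     # client IP address
    url: str    # resource requested

def analyze(events, rate_threshold=100, ip_threshold=3):
    """Toy analysis pass: map each user in the logs to an automated action.

    Mirrors the responses named in the talk: block ("stop talking to this
    person"), step-up authentication, and throttling. Thresholds are
    illustrative, not tuned values.
    """
    requests_per_user = Counter(e.user for e in events)
    ips_per_user = {}
    for e in events:
        ips_per_user.setdefault(e.user, set()).add(e.ip)

    actions = {}
    for user, count in requests_per_user.items():
        if count > rate_threshold:
            actions[user] = "block"          # stop talking to this client
        elif count > rate_threshold // 2:
            actions[user] = "throttle"       # slow this process down
        elif len(ips_per_user[user]) >= ip_threshold:
            actions[user] = "step_up_auth"   # ask for additional authentication
        else:
            actions[user] = "allow"
    return actions
```

In the architecture Corey describes, the returned actions would be handed to the automation server, which pushes them out to the proxy, identity provider, or firewall rather than waiting on a manual email exchange between publisher and library.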
  551. 138
  552. 00:25:57.900 --> 00:26:08.250
  553. Corey Roach: This is more what is resulting on the back end from that type of a setup and there's a lot more data. And I'll kind of talk about a little bit about what we can do with that data.
  554.  
  555. 139
  556. 00:26:09.000 --> 00:26:21.240
  557. Corey Roach: But the key feature here is that it provides that automated and rapid response. However, on the downside is that it can be terribly intrusive and provides basically zero privacy.
  558.  
  559. 140
  560. 00:26:22.560 --> 00:26:28.500
  561. Corey Roach: And I believe Linda is going to be talking on a similar subject later in the day. So I'll be interested to hear her take on it, but
  562.  
  563. 141
  564. 00:26:28.890 --> 00:26:39.420
  565. Corey Roach: In this scenario, that seems to be one of the most important parts. So let's kind of look at what we could do if we combine some of that modern web app design with the library layout.
  566.  
  567. 142
  568. 00:26:40.290 --> 00:26:49.410
  569. Corey Roach: So if we put everything back together and we put our proxy back in there. What happens again is the user requests the resource, they get authenticated, they get that second factor.
  570.  
  571. 143
  572. 00:26:50.520 --> 00:26:53.250
  573. Corey Roach: There then able to get their resources passed through again.
  574.  
  575. 144
  576. 00:26:54.000 --> 00:27:01.470
  577. Corey Roach: And then the logging still happens only this time we're going to log the library recess says back to that server and the publisher resources back to their server again.
  578.  
  579. 145
  580. 00:27:02.130 --> 00:27:06.600
581. Corey Roach: And then the monitoring and analysis can happen, and the automated response can happen.
  582.  
  583. 146
  584. 00:27:07.440 --> 00:27:17.130
  585. Corey Roach: The result is we get something that looks more like this, which is there's rich data on the library side and anonymized data still landing on the publisher side.
  586.  
  587. 147
  588. 00:27:17.940 --> 00:27:26.220
589. Corey Roach: So I'd like to go over what some of this data is and what we can do with it. So timestamps are pretty obvious. We get a lot more information from the browser in this scenario.
  590.  
  591. 148
  592. 00:27:27.240 --> 00:27:37.470
  593. Corey Roach: We get of course the username and account information, but hopefully here with an identity store. We have a lot more than just their username and password. It might be information about them as a student or an employee.
  594.  
  595. 149
  596. 00:27:38.430 --> 00:27:43.920
597. Corey Roach: We get the client IP address of where they're coming from, and the URLs for the material they've requested.
  598.  
  599. 150
  600. 00:27:44.400 --> 00:27:52.140
  601. Corey Roach: And then we get also information from that two factor device. So we can use those in combination to kind of compare what the two factor device says
  602.  
  603. 151
  604. 00:27:52.380 --> 00:28:00.480
605. Corey Roach: And what the browser and the IP address from the client side say, and make sure that those all match up, giving us things like geographic location.
  606.  
  607. 152
  608. 00:28:01.590 --> 00:28:11.580
609. Corey Roach: We also then get things that are considered user behavior. So that would be stuff like: what material are they downloading, how do they navigate the site, how quickly are they accessing material.
  610.  
  611. 153
  612. 00:28:12.210 --> 00:28:19.800
613. Corey Roach: We can also get biometric data, which can be things like how quickly do they type, how do they move their mouse, how random is it.
  614.  
  615. 154
  616. 00:28:21.120 --> 00:28:29.490
  617. Corey Roach: And then we can add to that some contextual information either from that identity store or from other places that give us attributes about the user
  618.  
  619. 155
  620. 00:28:30.030 --> 00:28:40.500
  621. Corey Roach: But it can also give us attributes about threats, so we can learn things like, what are the latest attributes of bots that are being used. What are the IP addresses that attackers are using lately.
  622.  
  623. 156
  624. 00:28:41.250 --> 00:28:49.680
625. Corey Roach: Have we seen this account be compromised recently? Adding all of that together, we can then start asking some interesting questions. We can say things like, you know,
  626.  
  627. 157
  628. 00:28:50.130 --> 00:28:55.080
  629. Corey Roach: We commonly see this user coming in from the US and today it's coming in from Botswana.
  630.  
  631. 158
  632. 00:28:55.860 --> 00:29:08.940
633. Corey Roach: You know, has there been enough time that they could have traveled from the US to Botswana and actually be there? Have they ever accessed resources from that country before? Is there a residence on record in that country?
  634.  
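The "could they actually be there" question above is often implemented as an "impossible travel" check: if the distance between two logins divided by the time between them exceeds any plausible travel speed, flag it. A minimal sketch, where the coordinates and the 900 km/h airliner-speed threshold are illustrative assumptions:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points on Earth, in km.
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def impossible_travel(prev, curr, max_speed_kmh=900.0):
    # prev/curr: (lat, lon, unix_seconds) for two logins by the same user.
    dist_km = haversine_km(prev[0], prev[1], curr[0], curr[1])
    hours = max((curr[2] - prev[2]) / 3600.0, 1e-9)
    return dist_km / hours > max_speed_kmh

# A US login followed two hours later by a Botswana login is flagged;
# a short hop within the same metro area an hour later is not.
```

Real systems would also consider VPN egress points and geolocation error before acting on a single flag.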
  635. 159
  636. 00:29:10.020 --> 00:29:17.280
  637. Corey Roach: You can also move over to behavioral stuff. So it could be, you know, why is a pharmacy major suddenly looking up a lot of material on astrophysics or
  638.  
  639. 160
  640. 00:29:18.300 --> 00:29:27.000
641. Corey Roach: Why is a medical professional at a hospital suddenly interested in internal combustion? Things that just don't line up, and we can identify fishy behavior.
  642.  
  643. 161
  644. 00:29:28.140 --> 00:29:35.280
  645. Corey Roach: We then have a much broader spectrum of how we can respond to that. Also, so we can do things like send another
  646.  
  647. 162
  648. 00:29:36.780 --> 00:29:43.740
649. Corey Roach: authentication request, or what's known as a CAPTCHA request, where you get those little pictures or some type of interaction that tries to tell if you're human.
  650.  
  651. 163
  652. 00:29:44.610 --> 00:29:55.980
653. Corey Roach: We can throttle the user, and with this type of material we can actually throttle the user way down if we wanted to. You know, allowing them one paper per minute is still useful, but it's much slower for something like a bot.
  654.  
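The "one paper per minute" idea is just per-user rate limiting. A minimal sketch (class and interval are illustrative, not from any real proxy):

```python
import time

class PerUserThrottle:
    """Allow at most one download per user per min_interval_s seconds."""

    def __init__(self, min_interval_s=60.0):
        self.min_interval_s = min_interval_s
        self.last_allowed = {}  # username -> timestamp of last allowed download

    def allow(self, username, now=None):
        now = time.monotonic() if now is None else now
        last = self.last_allowed.get(username)
        if last is not None and now - last < self.min_interval_s:
            return False  # too soon: throttle this request
        self.last_allowed[username] = now
        return True
```

A human reader barely notices this limit; a bulk scraper trying to pull thousands of articles is slowed to a crawl.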
  655. 164
  656. 00:29:56.940 --> 00:30:02.280
  657. Corey Roach: We can lock them out temporarily or we can even lock them out permanently. But again,
  658.  
  659. 165
  660. 00:30:03.000 --> 00:30:16.650
  661. Corey Roach: It breaks those logs into two locations and gives us all of this much more useful information. But then on the publisher side, they're still seeing anonymized information and it's not detracting from what they have today, and they still have their controls on that site as well so
  662.  
  663. 166
  664. 00:30:17.790 --> 00:30:28.350
665. Corey Roach: Looking at that, you know, that is all great, but what would be the obstacles in putting something like this into play? So the first one on that, obviously, is privacy.
  666.  
  667. 167
  668. 00:30:30.120 --> 00:30:39.780
669. Corey Roach: Fortunately, most of this information these institutions like universities already have. We already have that information about our students or about our faculty, and we are already stewards of that information.
  670.  
  671. 168
  672. 00:30:40.770 --> 00:30:46.530
  673. Corey Roach: The data that we create or new data that we synthesize at that point would be in control of the library and
  674.  
  675. 169
  676. 00:30:46.980 --> 00:31:00.060
677. Corey Roach: Within the limits of the law or their organizational policy, they can set parameters around what they want to do with that data. Do we share it with our other peers? How long do we keep it? Do we want to anonymize or tokenize any of that data?
  678.  
  679. 170
  680. 00:31:01.770 --> 00:31:10.950
681. Corey Roach: Fortunately, also, most of that data is analyzed by an algorithm, not by a person. So there's far less bias, although we do have to be careful about not building bias into the algorithms.
  682.  
  683. 171
  684. 00:31:11.400 --> 00:31:17.850
  685. Corey Roach: Particularly with things like machine learning or artificial intelligence, we have to be careful about not building bias into the system.
  686.  
  687. 172
  688. 00:31:19.770 --> 00:31:29.040
  689. Corey Roach: The second kind of obstacle, there is expertise and although we have a lot of talented library technical staff, most of the time they are not security experts.
  690.  
  691. 173
  692. 00:31:29.700 --> 00:31:45.930
693. Corey Roach: And security experts oftentimes make extremely high salaries, and again we go back to funding, where libraries are not always super well funded, so paying those salaries becomes pretty tough. On top of that, the security industry right now is going through a pretty severe
  694.  
  695. 174
  696. 00:31:47.850 --> 00:31:54.300
  697. Corey Roach: shortage of qualified professionals, there's more coming into the market, but it is kind of hard to hire security professionals right now.
  698.  
  699. 175
  700. 00:31:55.890 --> 00:32:03.780
701. Corey Roach: In some cases libraries may have access to expertise in their parent organization, as with my university, or sometimes they may not.
  702.  
  703. 176
  704. 00:32:06.090 --> 00:32:17.790
705. Corey Roach: And circling back around to that same issue is cost. So libraries are not generally known for being well funded, and they're certainly not known for having excessive technology budgets.
  706.  
  707. 177
  708. 00:32:18.660 --> 00:32:28.290
709. Corey Roach: Security programs like mine, as a CISO, as I say, I'm probably likely to put those resources somewhere where I feel the risk is more acute for my organization.
  710.  
  711. 178
  712. 00:32:29.430 --> 00:32:44.640
713. Corey Roach: Commercial tools to do these types of things can be very expensive, even with discounts given to things like government or education, and unfortunately free or open source tools that do these kinds of things are really not up to snuff. They're not sophisticated or specialized enough.
  714.  
  715. 179
  716. 00:32:45.810 --> 00:32:55.230
717. Corey Roach: So how can we overcome some of these obstacles? And that is where, you know, publishers and SNSI or groups like this can come in to help.
  718.  
  719. 180
  720. 00:32:57.120 --> 00:33:04.470
721. Corey Roach: One of the first things we can do is develop or subsidize a low-cost proxy or a plug-in to existing proxies. So
  722.  
  723. 181
  724. 00:33:05.190 --> 00:33:24.660
725. Corey Roach: To speak plainly, the most commonly used proxy right now is EZproxy, and I'd recommend either creating a proxy that is some type of drop-in, turnkey replacement or some type of application or plugin that can enhance EZproxy. So
  726.  
  727. 182
  728. 00:33:25.740 --> 00:33:37.320
729. Corey Roach: Since that can potentially threaten the business model of OCLC, the company that owns EZproxy, I would also propose that they might be a good organization to approach as being
  730.  
  731. 183
  732. 00:33:38.070 --> 00:33:45.420
733. Corey Roach: Potentially a member of SNSI and contributing to these types of efforts. So other things that we can do:
  734.  
  735. 184
  736. 00:33:46.230 --> 00:33:51.420
737. Corey Roach: As an organization, you can facilitate threat-information sharing between the members.
  738.  
  739. 185
  740. 00:33:52.260 --> 00:34:04.020
741. Corey Roach: This can be things like mailing lists or message boards or automated mechanisms, but a lot of that, again, is being done within the security industry. So we don't have to reinvent the wheel, but we could have it specialized to this industry.
  742.  
  743. 186
  744. 00:34:05.130 --> 00:34:19.170
745. Corey Roach: We could provide training to those library IT professionals and help upskill them in the area of security. It doesn't do us any good to have these tools if we don't have anybody that knows how to run them.
  746.  
  747. 187
  748. 00:34:20.280 --> 00:34:33.780
749. Corey Roach: We could also promote community around this effort, whether it be the proxy or just in general. It's possible that if this proxy or tools like this were to be open sourced, the community may even end up supporting them in the long run.
  750.  
  751. 188
  752. 00:34:34.830 --> 00:34:50.130
753. Corey Roach: And then last is that publishers could provide pricing incentives to share the risk. As I've mentioned, as a CISO, most of the risk in this endeavor is not mine, it's the publishers'. But if we were to enter into an agreement where things like
  754.  
  755. 189
  756. 00:34:51.810 --> 00:35:01.710
757. Corey Roach: Knocking some amount off the price if there were no security compromises within a period of time would incentivize the organization to share in those risks.
  758.  
  759. 190
  760. 00:35:02.160 --> 00:35:08.310
  761. Corey Roach: And honestly, it really wouldn't have to be that big of an incentive to get organizations like a university or a library to buy in.
  762.  
  763. 191
  764. 00:35:09.360 --> 00:35:12.390
  765. Corey Roach: But at least it would create kind of a shared interest.
  766.  
  767. 192
  768. 00:35:13.680 --> 00:35:28.050
  769. Corey Roach: There are other opportunities that this leads to that are worth mentioning. And many of these collaborations are things that could be useful even on their own, but with the more modern scenario where libraries have the capability to act, they become even more useful.
  770.  
  771. 193
  772. 00:35:29.280 --> 00:35:34.560
  773. Corey Roach: We could as an organization foster security advocates and help provide materials for them or training.
  774.  
  775. 194
  776. 00:35:35.820 --> 00:35:40.020
  777. Corey Roach: Just building allies within the consumer side of this equation.
  778.  
  779. 195
  780. 00:35:40.680 --> 00:35:49.260
  781. Corey Roach: We can educate leaders, whether that be in a parent organization or in a library or other consumers about the shared risks. Honestly, as I say,
  782.  
  783. 196
  784. 00:35:49.590 --> 00:35:58.320
785. Corey Roach: When I started out with this, I didn't see much shared risk for me as a CISO at all. But as I got more educated on it, I can see that there are things
  786.  
  787. 197
  788. 00:35:58.890 --> 00:36:06.810
789. Corey Roach: Like the exposure of those credentials, which might affect my organization. Probably not enough for me to completely realign my security program,
  790.  
  791. 198
  792. 00:36:07.110 --> 00:36:14.970
793. Corey Roach: But certainly enough to make me want to collaborate and take a closer look at it. We can educate our users on the personal risk they're exposing themselves to.
  794.  
  795. 199
  796. 00:36:15.450 --> 00:36:23.400
797. Corey Roach: I don't think you want to go so far as to take the Motion Picture Association route and do a, you know, "you wouldn't download a car" approach, but
  798.  
  799. 200
  800. 00:36:23.730 --> 00:36:28.800
801. Corey Roach: We could remind users that, you know, by giving away their credentials or not being cautious about phishing,
  802.  
  803. 201
  804. 00:36:29.100 --> 00:36:37.680
805. Corey Roach: They may be exposing more than just their access to these types of resources. Oftentimes those credentials are things that allow access to your student or your employee record.
  806.  
  807. 202
  808. 00:36:38.640 --> 00:36:48.270
  809. Corey Roach: And then last, one of the things that I thought was an interesting finding during my research is that I think there needs to be more effort around promoting the value of this process and these publishers
  810.  
  811. 203
  812. 00:36:48.660 --> 00:37:07.650
813. Corey Roach: To the consumers of that information. It was interesting to me how few of the people that I spoke with could actually explain the value of the publishing process, and so that's why some of their perceptions of consuming the pirated information seemed a little out of balance.
  814.  
  815. 204
  816. 00:37:10.020 --> 00:37:20.490
817. Corey Roach: So I hope the presentation may have sparked some ideas or interest in how we might collaborate in the future. I appreciate your time and attention.
  818.  
  819. 205
  820. 00:37:21.060 --> 00:37:33.360
  821. Corey Roach: I will be coming back for the roundtable this afternoon. If I don't happen to catch you there and you have a question, feel free to reach out and I hope all of you. Enjoy the rest of the webinar. Thank you.
  822.  
  823. 206
  824. 00:37:36.900 --> 00:37:37.770
  825. Thank you, Corey.
  826.  
  827. 207
  828. 00:37:44.490 --> 00:37:55.170
829. Daniel Ascher: And with that we will be moving to our next speaker, who is Crane Hassold, the Senior Director of Threat Research at Agari Data, Incorporated.
  830.  
  831. 208
  832. 00:37:57.480 --> 00:37:58.140
833. Daniel Ascher: Take it away, Crane.
  834.  
  835. 209
  836. 00:37:58.650 --> 00:37:59.850
  837. Crane Hassold : Thank you very much.
  838.  
  839. 210
  840. 00:38:01.110 --> 00:38:02.400
  841. Crane Hassold : share my screen here.
  842.  
  843. 211
  844. 00:38:13.470 --> 00:38:17.790
  845. Crane Hassold : Alright and just making sure. You're seeing the right view here. Not all of my fancy notes.
  846.  
  847. 212
  848. 00:38:19.680 --> 00:38:20.760
  849. Daniel Ascher: Yes, we can just see here
  850.  
  851. 213
  852. 00:38:20.820 --> 00:38:22.110
  853. Crane Hassold : All right, fantastic.
  854.  
  855. 214
  856. 00:38:22.800 --> 00:38:36.750
857. Crane Hassold : Alright, thanks for having me on. I'm really excited to actually talk about a topic that I haven't really talked about much in the past couple years. We're looking at sort of a group, and this segues very nicely into the previous presentation.
  858.  
  859. 215
  860. 00:38:37.920 --> 00:38:52.230
861. Crane Hassold : I'm looking at Silent Librarian, a threat group coming out of Iran; Sci-Hub, an issue that I know a lot of folks on this webinar are probably interested in; and, you know, the role of state-sponsored actors in
  862.  
  863. 216
  864. 00:38:52.830 --> 00:39:01.890
865. Crane Hassold : In threats targeting academic institutions. Before we get started, I just want to give everyone a little background about myself.
  866.  
  867. 217
  868. 00:39:02.310 --> 00:39:12.870
869. Crane Hassold : So I'm currently the Senior Director of Threat Research at a company called Agari, where we focus on identity-deception, email-based attacks.
  870.  
  871. 218
  872. 00:39:13.200 --> 00:39:22.830
  873. Crane Hassold : Business email compromise is a really big focus of ours right now, which is, you know, one of the predominant threats that is impacting all institutions all over the world right now.
  874.  
  875. 219
  876. 00:39:23.550 --> 00:39:35.670
877. Crane Hassold : I've been in the private sector for about five years now. Prior to coming to Agari about two years ago, I was with another company, and I've had a role in building out two
  878.  
  879. 220
  880. 00:39:36.120 --> 00:39:43.650
881. Crane Hassold : Phishing threat intelligence teams really from the ground up, which has been, you know, really fun and enjoyable.
  882.  
  883. 221
  884. 00:39:44.250 --> 00:39:53.400
885. Crane Hassold : Prior to that, I was with the FBI for 11 years, and most of my time in the FBI I was in the Behavioral Analysis Units based out of Quantico, Virginia,
  886.  
  887. 222
  888. 00:39:54.180 --> 00:40:04.740
  889. Crane Hassold : Where for six years. I did violent crime behavioral analysis. So looking at serial killers other types of violent criminals, you know, the traditional profiling.
  890.  
  891. 223
  892. 00:40:06.240 --> 00:40:11.370
893. Crane Hassold : Type of work. Did that for six years, and then myself and a few other folks
  894.  
  895. 224
  896. 00:40:12.690 --> 00:40:27.540
897. Crane Hassold : Built the FBI Cyber Behavioral Analysis Center, which has taken those concepts that have been used for decades in the violent crime profiling world and applied those to cyber threat actors as a new way to better understand how these actors
  898.  
  899. 225
  900. 00:40:28.290 --> 00:40:38.550
  901. Crane Hassold : Are working how they're motivated and how we can use their behavioral characteristics to better understand, you know, the threats, they pose and just try to help mitigate some of those threats as well.
  902.  
  903. 226
  904. 00:40:39.930 --> 00:40:53.730
905. Crane Hassold : So that's just a little bit about my background, where I'm coming from. I'll start off here with Silent Librarian. So Silent Librarian is a group that I started tracking back in late 2017.
  906.  
  907. 227
  908. 00:40:54.780 --> 00:41:04.620
909. Crane Hassold : And I was the one who actually named the group. I got to give them a nice little fancy name. I know, if there's anyone on the call who knows anything about APT groups,
  910.  
  911. 228
  912. 00:41:05.430 --> 00:41:17.700
913. Crane Hassold : That a lot of those names don't make a lot of sense. I always try to make our names mean something, so obviously from the Silent Librarian name you can tell that there is some library connotation to this.
  914.  
  915. 229
  916. 00:41:18.420 --> 00:41:27.900
917. Crane Hassold : So when we were looking at Silent Librarian, one of the really interesting aspects of their attacks is, you know, we see
  918.  
  919. 230
  920. 00:41:28.410 --> 00:41:38.280
921. Crane Hassold : We see a lot of cyber criminals and other types of cyber threat actors targeting universities and colleges all over the world. But what was really unique about
  922.  
  923. 231
  924. 00:41:38.610 --> 00:41:45.120
925. Crane Hassold : Silent Librarian is that the phishing pages they were setting up were specifically targeting
  926.  
  927. 232
  928. 00:41:45.810 --> 00:41:53.400
  929. Crane Hassold : Libraries and library credentials, which is something we had never seen before. And when you looked at some of the patterns and how they were setting these up.
  930.  
  931. 233
  932. 00:41:54.210 --> 00:42:03.990
933. Crane Hassold : It was very unique and clearly all centered around the same group of actors. So based on our analysis, we were able to track them and find attacks
  934.  
  935. 234
  936. 00:42:04.500 --> 00:42:14.070
937. Crane Hassold : Linked to Silent Librarian all the way back to 2013. So this is a group that's been around for, at this point now, going on seven, eight years.
  938.  
  939. 235
  940. 00:42:14.910 --> 00:42:22.140
941. Crane Hassold : We were able to link them to Iran pretty early on in our investigation based on some analysis of
  942.  
  943. 236
  944. 00:42:22.710 --> 00:42:29.310
945. Crane Hassold : The phishing kits they were using, as well as open-source intelligence that we were able to link to some of the actors. And at the time
  946.  
  947. 237
  948. 00:42:30.300 --> 00:42:38.100
  949. Crane Hassold : Of our initial report, we were able to see that they were targeting more than 300 schools in 22 different countries all around the world.
  950.  
  951. 238
  952. 00:42:38.490 --> 00:42:50.190
  953. Crane Hassold : And when you looked at a lot of those schools. One of the things that you can see is that, you know, there was clearly some sort of targeting that was happening there. They were targeting some schools.
  954.  
  955. 239
  956. 00:42:50.910 --> 00:42:59.550
957. Crane Hassold : Multiple times, over and over and over again. There's one university out of Australia that, at this point, they've targeted I think more than two dozen times.
  958.  
  959. 240
  960. 00:43:00.060 --> 00:43:16.050
961. Crane Hassold : And when you look at the schools that they were going after, a lot of them were research universities that would have access to information that would be of interest to, especially, something like a state-sponsored actor.
  962.  
  963. 241
  964. 00:43:17.850 --> 00:43:27.480
965. Crane Hassold : As I mentioned, you know, the phishing pages mimicked library login pages. You can see on the screen here, this is actually a phishing page set up by Silent Librarian this morning.
  966.  
  967. 242
  968. 00:43:27.960 --> 00:43:42.630
969. Crane Hassold : So that's how fresh this is. It's for Durham University out of the UK, and you can see it looks identical to the actual login page that a normal student or faculty member would see if they're trying to log in
  970.  
  971. 243
  972. 00:43:43.320 --> 00:43:54.990
973. Crane Hassold : To this library login page. And essentially what they're doing is they're simply scraping the HTML code from the legitimate website
  974.  
  975. 244
  976. 00:43:55.710 --> 00:44:01.350
977. Crane Hassold : And hosting it at another location in the phishing kits they use,
  978.  
  979. 245
  980. 00:44:01.950 --> 00:44:13.530
981. Crane Hassold : Which is a very similar tactic that a lot of actors use out there, whether it's, you know, a university login page or an Apple login page or a Wells Fargo login page. Very similar tactic that a lot of these actors use.
  982.  
  983. 246
  984. 00:44:14.190 --> 00:44:24.960
985. Crane Hassold : And at the end of the day, the purpose here is to compromise student and faculty credentials. For the most part, this is going to be to get access to library resources,
  986.  
  987. 247
  988. 00:44:25.620 --> 00:44:34.260
989. Crane Hassold : Journal articles. We know, based on some work that we've done with some other partners, that
  990.  
  991. 248
  992. 00:44:34.740 --> 00:44:44.880
993. Crane Hassold : The big motivation is to access and scrape journal articles that they wouldn't otherwise have access to.
  994.  
  995. 249
  996. 00:44:45.510 --> 00:44:52.290
997. Crane Hassold : Now what's really interesting is this goes back to sort of the profile of the universities they're going after. When you look at those,
  998.  
  999. 250
  1000. 00:44:52.590 --> 00:44:58.650
1001. Crane Hassold : While there hasn't been any specific evidence of this that I've seen, at least,
  1002.  
  1003. 251
  1004. 00:44:59.220 --> 00:45:06.060
1005. Crane Hassold : I think that there's certainly another motivation behind this, and this sort of goes to the state-sponsored side of things, which is that there's
  1006.  
  1007. 252
  1008. 00:45:06.510 --> 00:45:17.250
  1009. Crane Hassold : Always the potential of theft of other sensitive research that faculty may be working on at one of these universities that may be of interest to a state sponsored actor
  1010.  
  1011. 253
  1012. 00:45:18.600 --> 00:45:40.290
1013. Crane Hassold : So that's a brief overview of Silent Librarian. Looking at their attacks: one of the really interesting aspects of this group is that since the beginning, since 2013 to present day (so we're talking about seven, eight years), their tactics have barely changed.
  1014.  
  1015. 254
  1016. 00:45:41.370 --> 00:45:59.100
1017. Crane Hassold : Their lure emails are always coming from the quote-unquote library. The messaging that they're using in their emails in some cases has only been updated to correct very basic spelling errors. Other than that, they're exactly the same.
  1018.  
  1019. 255
  1020. 00:46:00.750 --> 00:46:13.710
1021. Crane Hassold : One of the things that they do: you know, based on part of what we do at Agari, which is looking at DMARC, so, being able to protect an organization's domains against direct spoofing,
  1022.  
  1023. 256
  1024. 00:46:14.190 --> 00:46:23.040
1025. Crane Hassold : We know that for universities and academic institutions all over the world, DMARC adoption is not something that has been
  1026.  
  1027. 257
  1028. 00:46:23.730 --> 00:46:30.900
1029. Crane Hassold : Taken up to a significant percentage. And so one of the things that Silent Librarian does is that they will just directly spoof
  1030.  
  1031. 258
  1032. 00:46:31.410 --> 00:46:42.930
1033. Crane Hassold : University email addresses that look like they're coming from the library. So if a recipient receives one of these emails, it's going to look like it's coming from, you know, the actual library
  1034.  
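The DMARC point above is mechanical: direct spoofing of a domain only works when that domain publishes no enforcing DMARC policy. A small sketch that parses a DMARC TXT record and reports whether spoofed mail would be quarantined or rejected. A real check would fetch the `_dmarc.<domain>` TXT record via DNS; here the record string is supplied directly, and the example record is hypothetical.

```python
def dmarc_policy(txt_record: str) -> str:
    """Return the p= policy tag from a DMARC TXT record, or 'none'."""
    # Example record: "v=DMARC1; p=reject; rua=mailto:dmarc@example.edu"
    tags = {}
    for part in txt_record.split(";"):
        if "=" in part:
            key, _, value = part.strip().partition("=")
            tags[key.lower()] = value.strip()
    return tags.get("p", "none")

def blocks_direct_spoofing(txt_record: str) -> bool:
    # Only quarantine/reject policies cause receivers to act on spoofed mail;
    # p=none (or no record at all) merely reports, which is why spoofed
    # "library" emails still land in inboxes at many universities.
    return dmarc_policy(txt_record) in ("quarantine", "reject")
```

This is why an attacker can put `library@university.edu` in the From header of a phishing email and have it render exactly like the real thing at institutions without an enforcing policy.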
  1035. 259
  1036. 00:46:43.950 --> 00:46:47.880
1037. Crane Hassold : That they're pretending to send it from.
  1038.  
  1039. 260
  1040. 00:46:49.050 --> 00:46:59.280
1041. Crane Hassold : One of the other things that they do here: in some cases the phishing URLs are so similar that they'll actually embed the actual phishing URL in the email.
  1042.  
  1043. 261
  1044. 00:47:00.060 --> 00:47:06.510
1045. Crane Hassold : But that's a, you know, a small percentage of the time. Usually what they're doing is something like you see here,
  1046.  
  1047. 262
  1048. 00:47:06.810 --> 00:47:18.990
1049. Crane Hassold : Where they'll have a link that looks like it's going to, in this case, the Carleton University Library. But when you actually look at where that link is going, it's either going to a shortened URL
  1050.  
  1051. 263
  1052. 00:47:19.440 --> 00:47:31.980
1053. Crane Hassold : That could be a freely available service, or it could be a university-sponsored URL shortener. We know that one of the things that this group is doing
  1054.  
  1055. 264
  1056. 00:47:32.550 --> 00:47:43.320
1057. Crane Hassold : Is also compromising accounts to set up those shortened URLs as well. Or it could be just another lookalike URL that they embed in there that's going directly to the phishing site.
  1058.  
  1059. 265
  1060. 00:47:44.250 --> 00:47:53.160
1061. Crane Hassold : And you can see at the bottom of the screen here that the phishing URLs they're using are extremely similar and look almost exactly the same as the legitimate URLs.
  1062.  
  1063. 266
  1064. 00:47:53.580 --> 00:48:13.980
1065. Crane Hassold : In this case, there are three sets here. One is for McGill University up in Canada, and you can see it's shibboleth.mcgill.ca, whereas the actual phishing URL is shibboleth.mcgill.ca.ifta.tk. Same thing with this University of North Texas
  1066.  
  1067. 267
  1068. 00:48:15.030 --> 00:48:25.230
1069. Crane Hassold : URL, just appending itlib.me to the end of it. And then the same thing with the Victoria University in Australia URL down at the bottom.
  1070.  
  1071. 268
  1072. 00:48:25.890 --> 00:48:34.680
1073. Crane Hassold : And you'll notice that one of the other things that they're doing is, you know, for the most part, while they switch around some of the top-level domains,
  1074.  
  1075. 269
  1076. 00:48:35.250 --> 00:48:39.900
1077. Crane Hassold : Like the .me's, the .info's, sometimes they'll even host
  1078.  
  1079. 270
  1080. 00:48:40.710 --> 00:49:01.200
1081. Crane Hassold : Their phishing sites on .ir domains, but for the most part they're using freely available Freenom domains, so .tk, .cf, .ml, which can be obtained for no price. A lot of what they're doing is hosting their phishing pages on those free domains.
  1082.  
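The URL trick described above works because the browser shows the familiar hostname as a prefix while the attacker controls the registered domain at the end (e.g. a hostname like shibboleth.mcgill.ca.<attacker>.tk, where the trailing domain is the attacker's; the specific domains below are illustrative). A rough detection sketch follows; a production version would use the Public Suffix List rather than the naive last-two-labels heuristic here.

```python
from urllib.parse import urlparse

# Illustrative allowlist of legitimate registered domains (assumption).
LEGIT_DOMAINS = {"mcgill.ca", "unt.edu"}

def registered_domain(host: str) -> str:
    # Naive: take the last two labels. Real code needs the Public Suffix
    # List to handle suffixes like .edu.au correctly.
    return ".".join(host.split(".")[-2:])

def is_lookalike(url: str) -> bool:
    host = urlparse(url).hostname or ""
    if registered_domain(host) in LEGIT_DOMAINS:
        return False  # actually served from the legitimate domain
    # Hostname embeds a legitimate domain but resolves under some other
    # registered domain (often a free .tk/.cf/.ml): suspicious.
    return any(d in host for d in LEGIT_DOMAINS)
```

The key insight is that only the rightmost registered domain determines who controls the site; everything to its left is decoration the attacker can choose freely.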
  1083. 271
  1084. 00:49:03.270 --> 00:49:11.040
1085. Crane Hassold : Now, that was a look at the attacks themselves. So one of the really interesting things that happened, and this is really where
  1086.  
  1087. 272
  1088. 00:49:11.310 --> 00:49:23.700
1089. Crane Hassold : Everything sort of came out in the open about what this group is doing, is in March of 2018 the US Department of Justice indicted nine Iranian individuals connected to a group called the Mabna Institute.
  1090.  
  1091. 273
  1092. 00:49:24.090 --> 00:49:29.550
1093. Crane Hassold : And you can think of the Mabna Institute as very similar to a contractor that we might have here in the States,
  1094.  
  1095. 274
  1096. 00:49:29.820 --> 00:49:39.600
1097. Crane Hassold : Where, you know, they aren't directly affiliated with or working directly for a state government, but they're contracted by a government, which in this case is the Iranian government.
  1098.  
  1099. 275
  1100. 00:49:40.530 --> 00:49:49.200
1101. Crane Hassold : One of the really interesting aspects of this is the week before this indictment, I was actually giving a talk on Silent Librarian at a conference.
  1102.  
  1103. 276
  1104. 00:49:49.590 --> 00:49:57.270
1105. Crane Hassold : And the day this indictment came out, I got a message from one of the people who was at my talk, and
  1106.  
  1107. 277
  1108. 00:49:57.600 --> 00:50:03.780
1109. Crane Hassold : Said, the guys that you just gave a presentation on last week are getting indicted right now. And I was like, what?
  1110.  
  1111. 278
  1112. 00:50:04.110 --> 00:50:16.080
1113. Crane Hassold : Because we'd been passing information to the FBI about what we had found, but we didn't actually know this was coming. So this was really a shock to us, and then we were able to actually talk about it publicly.
  1114.  
  1115. 279
  1116. 00:50:17.580 --> 00:50:30.810
  1117. Crane Hassold : But some of the interesting aspects of that indictment, to show you how much of an impact this group has had: you know, $3.4 billion of intellectual property has been lost, based on the assessment from the indictment.
  1118.  
  1119. 280
  1120. 00:50:31.530 --> 00:50:42.150
  1121. Crane Hassold : More than 31 terabytes of academic data was stolen by this group. They compromised almost 8,000 university accounts, almost 4,000 in the States alone.
  1122.  
  1123. 281
  1124. 00:50:42.570 --> 00:50:48.210
  1125. Crane Hassold : And one of the other interesting aspects of this, and this is, I think, not surprising when you look at, you know,
  1126.  
  1127. 282
  1128. 00:50:48.600 --> 00:50:56.880
  1129. Crane Hassold : Who the group is working for, is they also targeted other government agencies, private companies, and international non-governmental organizations
  1130.  
  1131. 283
  1132. 00:50:57.330 --> 00:51:00.030
  1133. Crane Hassold : With some of their attacks also credential phishing attacks.
  1134.  
  1135. 284
  1136. 00:51:00.750 --> 00:51:07.530
  1137. Crane Hassold : And as this came out, we were able to directly link Silent Librarian to the Mabna Institute
  1138.  
  1139. 285
  1140. 00:51:07.800 --> 00:51:25.500
  1141. Crane Hassold : Based on one of the actors. You can see on this sort of wanted poster here, one of the actors on the far right, Mostafa Sadeghi; we were able to link him to one of the websites that Silent Librarian had set up to distribute some of these credentials for financial gain.
  1142.  
  1143. 286
  1144. 00:51:29.430 --> 00:51:38.070
  1145. Crane Hassold : So after the indictment came out, one of the things that we continued to do is work with REN-ISAC, so the ISAC, you know,
  1146.  
  1147. 287
  1148. 00:51:38.700 --> 00:51:45.990
  1149. Crane Hassold : That partners with academic institutions. We worked with them to mitigate phishing sites. We had set up
  1150.  
  1151. 288
  1152. 00:51:46.620 --> 00:51:59.460
  1153. Crane Hassold : An automated tracker for when this group created new phishing sites. One of the ways that we did this was by setting up SSL certificate
  1154.  
  1155. 289
  1156. 00:51:59.940 --> 00:52:22.050
  1157. Crane Hassold : Monitoring, which is, you know, publicly available to anyone. And because their URLs were, and still are today, so unique and constructed with the same similar patterns, we were able to create a tracker that notified us every single time a new host was set up by Silent Librarian.
  1158.  
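A rough sketch of that kind of certificate transparency tracker, built on the open-source certstream library. The keyword and TLD heuristics below are hypothetical stand-ins, since the talk doesn't spell out the actual URL patterns used to track the group:

```python
# Sketch: watch public certificate transparency logs for newly issued
# certificates and alert on hostnames matching a suspicious pattern.
# (The heuristics here are illustrative assumptions, not the real rules.)

KEYWORDS = ("library", "lib", "ezproxy", "portal")   # assumed lure themes
FREE_TLDS = (".tk", ".cf", ".ml")                    # free Freenom TLDs

def looks_suspicious(domain: str) -> bool:
    """Flag hostnames pairing a library-themed keyword with a free TLD."""
    d = domain.lower()
    return d.endswith(FREE_TLDS) and any(k in d for k in KEYWORDS)

def on_cert(message, context):
    # certstream delivers one event per certificate logged to CT.
    if message["message_type"] != "certificate_update":
        return
    for domain in message["data"]["leaf_cert"]["all_domains"]:
        if looks_suspicious(domain):
            print(f"[ALERT] new suspicious host: {domain}")

if __name__ == "__main__":
    import certstream  # pip install certstream
    certstream.listen_for_events(on_cert, url="wss://certstream.calidog.io/")
```

Run as a script, this streams every newly logged certificate worldwide and prints a line whenever a hostname matches, which is essentially the "notified us every time a new host was set up" workflow described above.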
  1159. 290
  1160. 00:52:23.430 --> 00:52:33.660
  1161. Crane Hassold : In April 2018, I testified at a House committee that was looking at foreign threats to US research and academic institutions. This was a very interesting experience.
  1162.  
  1163. 291
  1164. 00:52:34.410 --> 00:52:45.180
  1165. Crane Hassold : Primarily, one of the big takeaways I had from this panel, and there were four witnesses that were called to testify, I was one of them,
  1166.  
  1167. 292
  1168. 00:52:45.600 --> 00:52:56.580
  1169. Crane Hassold : And all of the three other witnesses looked at, you know, Chinese threats and Russian threats, and what was really interesting is, all of them focused on physical threats
  1170.  
  1171. 293
  1172. 00:52:56.940 --> 00:53:14.910
  1173. Crane Hassold : To universities; none of them looked at the cyber threats to universities. And most of the questioning that was coming from the House Committee members was actually still asking about physical threats. And so one of the big takeaways I had from that,
  1174.  
  1175. 294
  1176. 00:53:16.500 --> 00:53:29.670
  1177. Crane Hassold : From testifying at that committee, was that there's still this big focus on physical threats, whereas the cyber threats, which far and away are the biggest threats to most institutions today,
  1178.  
  1179. 295
  1180. 00:53:30.690 --> 00:53:37.740
  1181. Crane Hassold : Are still not getting that much attention. And I think one of the biggest things, and I think we've seen this a lot with
  1182.  
  1183. 296
  1184. 00:53:38.520 --> 00:53:53.790
  1185. Crane Hassold : A lot of the state-sponsored indictments, regardless of whether it's Iran or North Korea or China or Russia, is that the indictments, at the end of the day, had absolutely no impact on deterring future attacks.
  1186.  
  1187. 297
  1188. 00:53:54.750 --> 00:54:10.860
  1189. Crane Hassold : Silent Librarian is still as active today as they were two and a half years ago when the indictments came out, and really, as we'll get into here in a couple of slides, that really comes down to motivation; we know what the purpose of these attacks is.
  1190.  
  1191. 298
  1192. 00:54:13.080 --> 00:54:19.860
  1193. Crane Hassold : So that's a little bit of an overview on Silent Librarian; I'll come back to them in just a few slides. But, you know, I briefly also want to talk about Sci-Hub.
  1194.  
  1195. 299
  1196. 00:54:20.100 --> 00:54:30.810
  1197. Crane Hassold : So as I mentioned, I'm sure most of the folks on this webinar are aware of what Sci-Hub is. You know, it was launched in 2011 by a student out of Kazakhstan,
  1198.  
  1199. 300
  1200. 00:54:31.560 --> 00:54:42.330
  1201. Crane Hassold : Who had, you know, in her mind, realized that there was a barrier to entry to the distribution of, you know, academic knowledge,
  1202.  
  1203. 301
  1204. 00:54:42.840 --> 00:54:51.630
  1205. Crane Hassold : Based on the paywalls that have been set up by academic journals. And when you look at, this is data from, I believe, April of this year:
  1206.  
  1207. 302
  1208. 00:54:52.530 --> 00:55:08.430
  1209. Crane Hassold : Sci-Hub currently contains more than 81 million journal articles, and I think I even saw an analysis that was done recently, it was something like 95% of Elsevier journal articles are available on Sci-Hub.
  1210.  
  1211. 303
  1212. 00:55:09.060 --> 00:55:20.550
  1213. Crane Hassold : When you look at who's using Sci-Hub, the top countries using it, based on some research that was done, are India, China, the US, Brazil, and Iran.
  1214.  
  1215. 304
  1216. 00:55:21.570 --> 00:55:25.800
  1217. Crane Hassold : And so that Iranian aspect really comes back to something we'll talk about here in just a second.
  1218.  
  1219. 305
  1220. 00:55:26.490 --> 00:55:31.950
  1221. Crane Hassold : Which, you know, goes into the motivation. We'll talk about motivation here on the next few slides.
  1222.  
  1223. 306
  1224. 00:55:32.340 --> 00:55:41.250
  1225. Crane Hassold : But how do they get these articles? You know, they'll say that the articles are donated, that they get donated credentials from students or other supporters.
  1226.  
  1227. 307
  1228. 00:55:41.850 --> 00:55:51.630
  1229. Crane Hassold : There's been a lot of talk that some of the credentials that are used to collect all of these
  1230.  
  1231. 308
  1232. 00:55:52.080 --> 00:56:04.380
  1233. Crane Hassold : Journal articles are collected through phishing attacks, which I think is certainly possible. And then one of the really interesting aspects is, because Iran has such a connection to Sci-Hub,
  1234.  
  1235. 309
  1236. 00:56:05.010 --> 00:56:11.490
  1237. Crane Hassold : There's a potential link here between Sci-Hub and Silent Librarian, although I don't think that that link has been
  1238.  
  1239. 310
  1240. 00:56:12.120 --> 00:56:21.990
  1241. Crane Hassold : Established as a hard link by any means. Now, there's some work that I did when I was researching Silent Librarian a little bit more closely than I am now,
  1242.  
  1243. 311
  1244. 00:56:22.710 --> 00:56:29.010
  1245. Crane Hassold : That sort of provided some good insight into better understanding that link. So,
  1246.  
  1247. 312
  1248. 00:56:29.190 --> 00:56:44.190
  1249. Crane Hassold : When you look at some Silent Librarian campaigns, it was shortly after that some of the credentials that were compromised in those campaigns were then used to pull down information. So there is a potential link there, even though I don't think it is a hard and fast link.
  1250.  
  1251. 313
  1252. 00:56:45.990 --> 00:56:54.570
  1253. Crane Hassold : So that's an overview of Sci-Hub. Now I want to pivot a little bit into understanding motivations for cyber attacks. I think this
  1254.  
  1255. 314
  1256. 00:56:54.840 --> 00:57:06.120
  1257. Crane Hassold : Will really give more clarification into, you know, why Sci-Hub exists, why Silent Librarian does what they do. And really, when you look at a lot of other
  1258.  
  1259. 315
  1260. 00:57:06.870 --> 00:57:17.700
  1261. Crane Hassold : state-sponsored actors and other cyber criminals, you know, why they do what they do, there are three main buckets of motivations that you can link most cyber attacks to.
  1262.  
  1263. 316
  1264. 00:57:18.180 --> 00:57:29.310
  1265. Crane Hassold : One is economic. That is by far the number one motivation for cyber attacks, and that is going to be most of the cyber criminals out there, regardless of whether it's business email compromise,
  1266.  
  1267. 317
  1268. 00:57:29.520 --> 00:57:41.460
  1269. Crane Hassold : ransomware, extortion, or other types of malware campaigns; almost all of those are going to be done for financial gain. And that is the number one incentive and motivation for cyber attacks.
  1270.  
  1271. 318
  1272. 00:57:42.180 --> 00:57:48.000
  1273. Crane Hassold : The second motivation is political, and this is where a lot of the state-sponsored actors are going to be sitting.
  1274.  
  1275. 319
  1276. 00:57:48.660 --> 00:57:53.490
  1277. Crane Hassold : In the middle of economic and political is where you have those state-affiliated contractors.
  1278.  
  1279. 320
  1280. 00:57:53.760 --> 00:58:03.060
  1281. Crane Hassold : And so that is someone like the Mabna Institute, who isn't directly working for a government institution, but they are being contracted on their behalf.
  1282.  
  1283. 321
  1284. 00:58:03.480 --> 00:58:09.240
  1285. Crane Hassold : In the US, this could be like a Booz Allen or Lockheed Martin, those big contractors that I'm sure most people know.
  1286.  
  1287. 322
  1288. 00:58:09.870 --> 00:58:19.740
  1289. Crane Hassold : It's the same deal there, where they're getting paid by a government to do work on its behalf, but they don't work directly for the government.
  1290.  
  1291. 323
  1292. 00:58:20.640 --> 00:58:33.150
  1293. Crane Hassold : And then the last bucket of motivations here is social, and this is social justice. A lot of the activism that we've seen being done is done for social motivations.
  1294.  
  1295. 324
  1296. 00:58:34.530 --> 00:58:44.010
  1297. Crane Hassold : So let's look at, you know, where Sci-Hub and Silent Librarian fit in these motivations, because it's really interesting. For most attacks,
  1298.  
  1299. 325
  1300. 00:58:44.370 --> 00:58:52.740
  1301. Crane Hassold : You can really just bucket attacks into one of these three classifications. But what's interesting about Silent Librarian and Sci-Hub is that
  1302.  
  1303. 326
  1304. 00:58:53.070 --> 00:59:01.680
  1305. Crane Hassold : They actually touch each one of these; there's a motivation linked to each one of these buckets. So let's look at each one of them, starting on the economic side.
  1306.  
  1307. 327
  1308. 00:59:02.220 --> 00:59:12.450
  1309. Crane Hassold : You know, as I mentioned, the reason that was given for setting up Sci-Hub back in 2011 was in response to high paywalls
  1310.  
  1311. 328
  1312. 00:59:12.990 --> 00:59:25.470
  1313. Crane Hassold : By academic journals. You know, if you are a student at a university or other academic institution, most likely you're going to have access to journal articles through the school.
  1314.  
  1315. 329
  1316. 00:59:25.770 --> 00:59:38.370
  1317. Crane Hassold : But what if you're not? What if you aren't a student at a school? How do you get access then? And that's where this economic burden comes in; there is, you know, depending on where you're coming from,
  1318.  
  1319. 330
  1320. 00:59:38.880 --> 00:59:47.670
  1321. Crane Hassold : The amount of money that needs to be paid for access to a single article has been high in the past. And I very much equate this:
  1322.  
  1323. 331
  1324. 00:59:48.120 --> 00:59:56.730
  1325. Crane Hassold : Sci-Hub is essentially the Napster for journal articles. Right? So Napster came about in the late 90s, around 2000,
  1326.  
  1327. 332
  1328. 00:59:57.180 --> 01:00:06.120
  1329. Crane Hassold : As a way, because paying, you know, $15 for a CD, even though you only want to listen to one or two songs, just
  1330.  
  1331. 333
  1332. 01:00:06.600 --> 01:00:16.890
  1333. Crane Hassold : Didn't make any economic sense. So Napster came out as a peer-to-peer application that allowed anyone to download music from anywhere, anytime they want, for free.
  1334.  
  1335. 334
  1336. 01:00:17.430 --> 01:00:21.810
  1337. Crane Hassold : Now what happened after Napster came out, which I thought was very interesting.
  1338.  
  1339. 335
  1340. 01:00:22.530 --> 01:00:29.460
  1341. Crane Hassold : Was that, of course, there were lawsuits. Napster was based in the US, so Napster was eventually taken down.
  1342.  
  1343. 336
  1344. 01:00:29.850 --> 01:00:45.780
  1345. Crane Hassold : But the economic model of music distribution completely changed. It moved from physical CDs to iTunes, right, where you could then buy songs on demand, and now you have something like Spotify or Pandora,
  1346.  
  1347. 337
  1348. 01:00:46.230 --> 01:01:04.860
  1349. Crane Hassold : Or Apple Music, where you can now stream any music you want, anytime. And today, something like Napster wouldn't be economically viable; there is no demand for something like Napster because so much has shifted in the music landscape.
  1350.  
  1351. 338
  1352. 01:01:05.940 --> 01:01:16.740
  1353. Crane Hassold : It's something very similar to what I think we would see in the academic journal landscape if the subscription model were adopted more widely.
  1354.  
  1355. 339
  1356. 01:01:17.370 --> 01:01:34.350
  1357. Crane Hassold : Then you would probably see the same thing, where the need for Sci-Hub really isn't there anymore from an economic perspective. On the other side, for Silent Librarian, it's very similar to the reason why Sci-Hub exists: because international sanctions
  1358.  
  1359. 340
  1360. 01:01:35.430 --> 01:01:51.540
  1361. Crane Hassold : Don't allow academic journals to be distributed within Iran, so there's a need to get those articles from different places. And so that's why you see this demand for something like
  1362.  
  1363. 341
  1364. 01:01:52.080 --> 01:01:53.940
  1365. Crane Hassold : A Sci-Hub or other
  1366.  
  1367. 342
  1368. 01:01:54.870 --> 01:02:05.610
  1369. Crane Hassold : Or other avenues of distribution. One of the things that Silent Librarian did, and while some of this, as we'll see here on the next slide, was done at the direction of the Iranian government,
  1370.  
  1371. 343
  1372. 01:02:05.970 --> 01:02:25.170
  1373. Crane Hassold : Is these actors also sold access, sold credentials, on a variety of different Farsi-language websites, to specific universities and specific journals. So they were making sort of direct financial gain with Silent Librarian.
  1374.  
  1375. 344
  1376. 01:02:26.700 --> 01:02:32.790
  1377. Crane Hassold : There's certainly a political aspect to both of these. Silent Librarian is a little bit more direct;
  1378.  
  1379. 345
  1380. 01:02:33.420 --> 01:02:43.830
  1381. Crane Hassold : In the indictment that came out from the Department of Justice, it was directly stated that what the Mabna Institute was doing was acting at the behest of the Iranian government.
  1382.  
  1383. 346
  1384. 01:02:44.190 --> 01:02:51.720
  1385. Crane Hassold : And as I mentioned before, while I don't think there's been any public evidence for this yet, I think there's certainly potential
  1386.  
  1387. 347
  1388. 01:02:52.350 --> 01:03:01.170
  1389. Crane Hassold : That some of this activity was done for intelligence purposes, to gather intelligence about academic research being done by some of these universities.
  1390.  
  1391. 348
  1392. 01:03:02.250 --> 01:03:07.380
  1393. Crane Hassold : And then when you look at Sci-Hub, again, I think this is something that is a little bit more indirect,
  1394.  
  1395. 349
  1396. 01:03:07.830 --> 01:03:21.390
  1397. Crane Hassold : But it's certainly been hinted at by a number of publications that I've seen, that there's potential backing for Sci-Hub by Russian government entities. So that would be a political
  1398.  
  1399. 350
  1400. 01:03:22.560 --> 01:03:37.290
  1401. Crane Hassold : Aspect there. And then the last one here, social: as I mentioned, there's a community need in Iran for access to academic research due to sanctions that have been put into place. And then on the Sci-Hub side of things,
  1402.  
  1403. 351
  1404. 01:03:37.950 --> 01:03:48.240
  1405. Crane Hassold : This is much more about social justice. You know, when you look at the defenders of Sci-Hub, a lot of them embrace this open access and the global right to knowledge
  1406.  
  1407. 352
  1408. 01:03:48.780 --> 01:04:01.920
  1409. Crane Hassold : Terminology, that everyone has access to knowledge regardless of where it's published and how it's published, and that is sort of their justification, their rationalization, for why Sci-Hub exists. Right?
  1410.  
  1411. 353
  1412. 01:04:04.830 --> 01:04:14.940
  1413. Crane Hassold : So that's a look at the motivation. I'll close here with a very brief look at nation-state actors, state-sponsored actors, and some of the myths and some of the realities.
  1414.  
  1415. 354
  1416. 01:04:15.840 --> 01:04:24.270
  1417. Crane Hassold : There's one big difference between cyber criminals and nation-state actors.
  1418.  
  1419. 355
  1420. 01:04:24.660 --> 01:04:32.970
  1421. Crane Hassold : On one side of it, cyber criminals are driven by profit. As I mentioned, economic incentives are what, you know, make them go.
  1422.  
  1423. 356
  1424. 01:04:33.180 --> 01:04:41.490
  1425. Crane Hassold : They're in it for financial gain. So if you can minimize their profits, if you can impact their profits, you can make it a little bit more difficult for them
  1426.  
  1427. 357
  1428. 01:04:42.150 --> 01:04:52.350
  1429. Crane Hassold : To attack a specific target, and they will move on to something else. Because, as I've said for years, you know, criminals are inherently lazy.
  1430.  
  1431. 358
  1432. 01:04:52.590 --> 01:04:57.180
  1433. Crane Hassold : They're going to want to do the least amount of work to make the most amount of money.
  1434.  
  1435. 359
  1436. 01:04:57.660 --> 01:05:04.410
  1437. Crane Hassold : And if you can do that, then they will go somewhere else. They will adapt and they will evolve their tactics.
  1438.  
  1439. 360
  1440. 01:05:04.980 --> 01:05:21.840
  1441. Crane Hassold : Nation-state actors, on the other hand, are driven by mission. You could put up as many roadblocks in their way as you could, to increase the financial cost for them to attack a specific target, but that doesn't matter to
  1442.  
  1443. 361
  1444. 01:05:21.840 --> 01:05:22.140
  1445. Robert Boissy: Them.
  1446.  
  1447. 362
  1448. 01:05:22.380 --> 01:05:25.260
  1449. Crane Hassold : For the most part they're going to be going after.
  1450.  
  1451. 363
  1452. 01:05:25.680 --> 01:05:33.870
  1453. Crane Hassold : specific targets for a specific purpose and they will not stop until that mission has been successful. So that is one of the biggest
  1454.  
  1455. 364
  1456. 01:05:34.590 --> 01:05:45.000
  1457. Crane Hassold : Threats with nation-state actors: it's much more difficult to stop them from fulfilling their mission, because they're just going to get it done.
  1458.  
  1459. 365
  1460. 01:05:46.380 --> 01:05:59.400
  1461. Crane Hassold : That being said, I think one of the biggest myths with state-sponsored actors is that all nation-state attacks are technically sophisticated; not all of them are. You know, one of the more recent indictments that came out
  1462.  
  1463. 366
  1464. 01:06:00.150 --> 01:06:09.420
  1465. Crane Hassold : Was for what's called Sandworm, or I forgot what the Panda name is for them, but you know, it's the Russian GRU.
  1466.  
  1467. 367
  1468. 01:06:09.960 --> 01:06:23.970
  1469. Crane Hassold : And these were indictments for things like NotPetya, which was, you know, the massive ransomware attack; Olympic Destroyer, which was for the Pyeongchang Olympics; and BlackEnergy, which is
  1470.  
  1471. 368
  1472. 01:06:24.330 --> 01:06:34.590
  1473. Crane Hassold : Malware that's going after industrial control systems and the electric grid in Ukraine. Those are what I think most people think of
  1474.  
  1475. 369
  1476. 01:06:35.460 --> 01:06:43.740
  1477. Crane Hassold : When they think of nation-state attacks. But in reality, most nation-state attacks are not very technically sophisticated.
  1478.  
  1479. 370
  1480. 01:06:44.160 --> 01:06:53.580
  1481. Crane Hassold : Take the DNC compromise back in 2016, for example, which you can see on the right-hand side of the screen. That was a basic Google account phishing campaign
  1482.  
  1483. 371
  1484. 01:06:53.970 --> 01:07:03.540
  1485. Crane Hassold : That was sending out a phishing email saying that someone has your password, you need to change it; you go to a scrape of a Google account page,
  1486.  
  1487. 372
  1488. 01:07:03.900 --> 01:07:10.650
  1489. Crane Hassold : And their account was compromised at that time. Silent Librarian is a great example of that; there's nothing
  1490.  
  1491. 373
  1492. 01:07:11.640 --> 01:07:14.280
  1493. Crane Hassold : Overly technically sophisticated with those attacks.
  1494.  
  1495. 374
  1496. 01:07:14.910 --> 01:07:27.030
  1497. Crane Hassold : And one of the best examples just came out, I think this morning: what's been attributed to Iran as well are these Proud Boys email campaigns trying to influence
  1498.  
  1499. 375
  1500. 01:07:27.510 --> 01:07:37.050
  1501. Crane Hassold : The election this year, and those, again, are essentially psychological manipulations. There are no links, no malware; it's just trying to
  1502.  
  1503. 376
  1504. 01:07:37.500 --> 01:07:46.680
  1505. Crane Hassold : Manipulate behavior and be more psychological. And so when we look at nation-state attacks, they're not always going to be technically sophisticated,
  1506.  
  1507. 377
  1508. 01:07:47.430 --> 01:07:56.490
  1509. Crane Hassold : Like most attacks today, where a lot of actors are going: it's just pure, basic social engineering. And the reason that's happening is, one,
  1510.  
  1511. 378
  1512. 01:07:56.940 --> 01:08:03.900
  1513. Crane Hassold : Historically, a lot of the technical defenses that are put in place have actually gotten very good at detecting
  1514.  
  1515. 379
  1516. 01:08:04.290 --> 01:08:13.080
  1517. Crane Hassold : Technically sophisticated attacks, and at the end of the day, social engineering is relatively easy. As long as human beings have been on this earth,
  1518.  
  1519. 380
  1520. 01:08:13.410 --> 01:08:24.570
  1521. Crane Hassold : Interacting with one another, we've been social engineering each other. The only difference now is we're using computers to social engineer each other, rather than doing it face to face or through the mail or something like that.
  1522.  
  1523. 381
  1524. 01:08:26.910 --> 01:08:36.390
  1525. Crane Hassold : And with that, thanks, everyone, for sticking around for my presentation. If there are any questions, I'd be happy to take them, or I will also be around for the panel later this afternoon.
  1526.  
  1527. 382
  1528. 01:08:39.510 --> 01:08:49.140
  1529. Daniel Ascher: Thank you, Crane, that was very informative and interesting. So we are a couple of minutes ahead of schedule here; we're going to go to the lunch break a little bit early.
  1530.  
  1531. 383
  1532. 01:08:51.000 --> 01:08:58.950
  1533. Daniel Ascher: Just as a reminder, we're going to be putting up a timer. If you'd like to continue the conversation in the chat box while we're on the break, please do so.
  1534.  
  1535. 384
  1536. 01:39:11.190 --> 01:39:13.320
  1537. Okere, Kelechi N. (ELS-NYC): Awesome. Welcome back everyone.
  1538.  
  1539. 385
  1540. 01:39:14.910 --> 01:39:23.160
  1541. Okere, Kelechi N. (ELS-NYC): I hope you all had a good break. For those of us in the US, hope you had a good lunch, and for those in Europe and other places, hope
  1542.  
  1543. 386
  1544. 01:39:24.270 --> 01:39:27.060
  1545. Okere, Kelechi N. (ELS-NYC): It was a nice time to just grab some dinner.
  1546.  
  1547. 387
  1548. 01:39:28.680 --> 01:39:31.500
  1549. Okere, Kelechi N. (ELS-NYC): So thanks for joining us again.
  1550.  
  1551. 388
  1552. 01:39:32.970 --> 01:39:37.470
  1553. Okere, Kelechi N. (ELS-NYC): To get us going, I'd like to run now our second poll.
  1554.  
  1555. 389
  1556. 01:39:38.970 --> 01:39:40.920
  1557. Okere, Kelechi N. (ELS-NYC): So I'll just launch it in a second.
  1558.  
  1559. 390
  1560. 01:39:47.250 --> 01:40:02.250
  1561. Okere, Kelechi N. (ELS-NYC): And again, the poll question asks, how much do you know about the different kinds of security threats to the scholarly infrastructure? By infrastructure we mean how peer-reviewed literature and open access content is shared, funded, and trusted.
  1562.  
  1563. 391
  1564. 01:40:43.770 --> 01:40:47.850
  1565. Okere, Kelechi N. (ELS-NYC): Awesome. We have about 57 people who have voted.
  1566.  
  1567. 392
  1568. 01:40:50.070 --> 01:40:53.010
  1569. Okere, Kelechi N. (ELS-NYC): Maybe just a little bit more, and then I'll end the poll.
  1570.  
  1571. 393
  1572. 01:40:55.350 --> 01:40:57.300
  1573. Okere, Kelechi N. (ELS-NYC): Want to give it a few more seconds.
  1574.  
  1575. 394
  1576. 01:41:05.280 --> 01:41:13.140
  1577. Okere, Kelechi N. (ELS-NYC): Let's maybe wait a bit more; it seems a number of participants haven't voted yet.
  1578.  
  1579. 395
  1580. 01:41:31.110 --> 01:41:35.310
  1581. Okere, Kelechi N. (ELS-NYC): All right, 60% there we go 62%
  1582.  
  1583. 396
  1584. 01:41:40.440 --> 01:41:43.560
  1585. Okere, Kelechi N. (ELS-NYC): All right, one last chance to vote before we close.
  1586.  
  1587. 397
  1588. 01:41:46.230 --> 01:41:46.650
  1589. Okere, Kelechi N. (ELS-NYC): Alright.
  1590.  
  1591. 398
  1592. 01:41:47.670 --> 01:41:48.450
  1593. Okere, Kelechi N. (ELS-NYC): So,
  1594.  
  1595. 399
  1596. 01:41:50.700 --> 01:41:59.490
  1597. Okere, Kelechi N. (ELS-NYC): On the question of how much do you know about the different kinds of cyber security threats, 29% say they have a thorough understanding of cyber security threats.
  1598.  
  1599. 400
  1600. 01:42:01.110 --> 01:42:05.640
  1601. Okere, Kelechi N. (ELS-NYC): 53%, the majority, says, I have some information, but I might be
  1602.  
  1603. 401
  1604. 01:42:08.880 --> 01:42:10.230
  1605. Okere, Kelechi N. (ELS-NYC): Let's see here.
  1606.  
  1607. 402
  1608. 01:42:11.580 --> 01:42:26.940
  1609. Okere, Kelechi N. (ELS-NYC): Yeah, but I might need more information. And then, we have a campus library task force or committee focused on these threats, that's 3%, and 16% say, I'll let our institution's administration worry about it.
  1610.  
  1611. 403
  1612. 01:42:28.080 --> 01:42:30.870
  1613. Okere, Kelechi N. (ELS-NYC): All right, so those are the results of our second poll.
  1614.  
  1615. 404
  1616. 01:42:35.070 --> 01:42:37.470
  1617. Okere, Kelechi N. (ELS-NYC): And I'll let Dan introduce our next speaker.
  1618.  
  1619. 405
  1620. 01:42:39.630 --> 01:42:49.830
  1621. Daniel Ascher: Thank you, Kelechi. So our next two speakers will not be able to attend the roundtable later today, so we will be doing Q&As during their sessions.
  1622.  
  1623. 406
  1624. 01:42:50.220 --> 01:43:08.040
  1625. Daniel Ascher: Our co-host Kelechi will be moderating the Q&A at the end of Linda's and the next session. So now we have Linda Van Keuren, the assistant dean for resources and access management at the Dahlgren Memorial Library
  1626.  
  1627. 407
  1628. 01:43:09.090 --> 01:43:17.670
  1629. Daniel Ascher: At Georgetown University Medical Center. So put those questions in the Q&A box as soon as you have them, and I will pass it over to Linda.
  1630.  
  1631. 408
  1632. 01:43:18.330 --> 01:43:22.320
  1633. Linda Van Keuren : Thank you very much. I'm going to share my screen here.
  1634.  
  1635. 409
  1636. 01:43:26.460 --> 01:43:33.600
  1637. Linda Van Keuren : So thank you very much. My name is Linda Van Keuren, and I'll be taking a few moments today to talk about library patron security,
  1638.  
  1639. 410
  1640. 01:43:34.050 --> 01:43:39.600
  1641. Linda Van Keuren : And why it's so important. So I'm the assistant dean for resources and Access Management
  1642.  
  1643. 411
  1644. 01:43:40.350 --> 01:43:44.130
  1645. Linda Van Keuren : As Daniel said, at Dahlgren Memorial Library at Georgetown University Medical Center.
  1646.  
  1647. 412
  1648. 01:43:44.490 --> 01:43:54.870
  1649. Linda Van Keuren : I've been at Georgetown about nine years in this position, and prior to that I was a systems librarian. So my general frame of reference is academic and health sciences libraries, but
  1650.  
  1651. 413
  1652. 01:43:55.140 --> 01:44:02.670
  1653. Linda Van Keuren : Of course, the topic of library patron security is important to all libraries: public libraries, academic libraries, and so on.
  1654.  
  1655. 414
  1656. 01:44:03.570 --> 01:44:11.250
  1657. Linda Van Keuren : As a prelude, in the context of this presentation I'm taking kind of a broad look at patron security, and at times
  1658.  
  1659. 415
  1660. 01:44:11.910 --> 01:44:17.940
  1661. Linda Van Keuren : Will touch upon privacy issues. I realize security is more about having the controls in place to reduce the risk of
  1662.  
  1663. 416
  1664. 01:44:18.300 --> 01:44:30.300
  1665. Linda Van Keuren : Personal or institutional information falling into the wrong hands, and privacy, you know, relates more to the rights to read and research and learn without excessive scrutiny.
  1666.  
  1667. 417
  1668. 01:44:31.260 --> 01:44:47.820
  1669. Linda Van Keuren : And since user-specific details can also be used to secure private data, there's definitely this relationship, and sometimes a tension, between privacy and security. For me, I consider most security measures to be a tool to help protect our patrons' privacy.
  1670.  
  1671. 418
  1672. 01:44:49.710 --> 01:44:59.040
  1673. Linda Van Keuren : And that's why I think this summit and groups such as SNSI provide a really wonderful opportunity to discuss these topics and those tensions that might arise.
  1674.  
  1675. 419
  1676. 01:45:00.660 --> 01:45:12.810
  1677. Linda Van Keuren : Patron information security isn't really a new concern to librarians. So, you know, for example, 20 years ago many academic libraries in the US would have patron Social Security numbers as part of the patron records.
  1678.  
  1679. 420
  1680. 01:45:13.230 --> 01:45:20.640
  1681. Linda Van Keuren : But as thinking about securing patron data evolved, most libraries determined there was really no need to use the Social Security number.
  1682.  
  1683. 421
  1684. 01:45:20.910 --> 01:45:32.040
  1685. Linda Van Keuren : And by storing it we were exposing patrons to potential fraudulent activity, so many libraries have stopped using that number within the patron record.
  1686.  
  1687. 422
  1688. 01:45:32.910 --> 01:45:38.700
  1689. Linda Van Keuren : Even more recently, in my library we had the patron's physical address within the patron record.
  1690.  
  1691. 423
  1692. 01:45:39.120 --> 01:45:50.100
  1693. Linda Van Keuren : And again, comparing the added value that the data brought to us as far as the service we provide to the students, it didn't really justify the security risk of us having that data in our system.
  1694.  
  1695. 424
  1696. 01:45:50.460 --> 01:45:57.090
  1697. Linda Van Keuren : And so we just stopped importing that data into the library system. So those are two very simple examples
  1698.  
  1699. 425
  1700. 01:45:57.780 --> 01:46:10.110
  1701. Linda Van Keuren : that reflect some of the concerns that libraries have about patron information security. And now, as many of us are in a virtual learning and research environment, those concerns are even more profound.
  1702.  
  1703. 426
  1704. 01:46:15.450 --> 01:46:28.650
  1705. Linda Van Keuren : So as I said, I'm taking kind of a broad view of patron security and why it's important. This slide just outlines some of the areas of concern in regards to security within a library setting. I'm sure you can all think of others.
  1706.  
  1707. 427
  1708. 01:46:30.030 --> 01:46:39.120
  1709. Linda Van Keuren : The first one is about keeping the credentials used to log into online library resources secure. So for many of us in academia,
  1710.  
  1711. 428
  1712. 01:46:39.780 --> 01:46:46.530
  1713. Linda Van Keuren : or in health systems or corporate libraries, library credentials are not just library credentials; they're credentials that access
  1714.  
  1715. 429
  1716. 01:46:46.830 --> 01:46:57.720
  1717. Linda Van Keuren : institutional resources, and the library is just one of those institutional resources. So the importance of keeping these credentials secure is to keep both library access
  1718.  
  1719. 430
  1720. 01:46:58.200 --> 01:47:14.970
  1721. Linda Van Keuren : secure and the institutional networks and services secure. You know, we also have issues with patrons sharing their username and password with a friend or colleague, because they might not realize the ramifications of doing so. Or they may do it intentionally, or of course they may accidentally
  1722.  
  1723. 431
  1724. 01:47:16.950 --> 01:47:28.950
  1725. Linda Van Keuren : reveal their credentials via a phishing scam or something like that. And so many, many institutions have moved to some form of two-factor authentication as one way to combat this issue.
  1726.  
  1727. 432
  1728. 01:47:29.370 --> 01:47:36.540
  1729. Linda Van Keuren : And many librarians encourage patrons to utilize best practices and keep their credentials secure.
  1730.  
  1731. 433
  1732. 01:47:37.650 --> 01:47:50.490
  1733. Linda Van Keuren : So not only can the misuse of credentials introduce a problem into the institutional network, but library public computers can too. Many libraries provide some computers that can be used either by non-affiliates
  1734.  
  1735. 434
  1736. 01:47:51.630 --> 01:48:00.960
  1737. Linda Van Keuren : or anyone that walks into the library, and those kiosk computers could potentially introduce ransomware or viruses into an institutional network.
  1738.  
  1739. 435
  1740. 01:48:01.410 --> 01:48:08.130
  1741. Linda Van Keuren : You know, there are libraries that have been hit with ransomware attacks where their entire infrastructure was hijacked and
  1742.  
  1743. 436
  1744. 01:48:08.760 --> 01:48:16.140
  1745. Linda Van Keuren : their patrons couldn't access any of their resources until a ransom was paid, and no librarian wants to navigate that
  1746.  
  1747. 437
  1748. 01:48:16.860 --> 01:48:36.600
  1749. Linda Van Keuren : through resolution with IT security experts or law enforcement. And so many libraries use things such as institutional kiosk images for public computers and other restrictive measures to try and minimize the security risk posed by publicly available computers.
  1750.  
  1751. 438
  1752. 01:48:38.160 --> 01:48:44.370
  1753. Linda Van Keuren : The next item is personally identifiable information and confidentiality and this aspect of security is
  1754.  
  1755. 439
  1756. 01:48:44.940 --> 01:48:55.890
  1757. Linda Van Keuren : very important to libraries. And so the responsibility to keep patrons' personally identifiable information, such as name and phone number, as well as their library use information,
  1758.  
  1759. 440
  1760. 01:48:56.430 --> 01:49:13.350
  1761. Linda Van Keuren : confidential deeply informs many library policies. So, you know, decisions about licensing and authentication and security almost always keep that in mind. So even if our patrons at times don't seem as concerned as the staff in regards to the protection of their data,
  1762.  
  1763. 441
  1764. 01:49:14.400 --> 01:49:17.400
  1765. Linda Van Keuren : I think it's because they trust us to take care of it.
  1766.  
  1767. 442
  1768. 01:49:18.840 --> 01:49:25.530
  1769. Linda Van Keuren : And a form of this kind of responsibility is even built into the codes of ethics of most library professional organizations.
  1770.  
  1771. 443
  1772. 01:49:26.010 --> 01:49:35.700
  1773. Linda Van Keuren : And I feel that it's the security measures that we put into place that help maintain this patron data privacy in regards to online resource use
  1774.  
  1775. 444
  1776. 01:49:36.570 --> 01:49:47.130
  1777. Linda Van Keuren : You know, security measures can reduce who can take a look at what a patron is researching in the online environment, and many times libraries make the decision
  1778.  
  1779. 445
  1780. 01:49:48.720 --> 01:49:52.920
  1781. Linda Van Keuren : to not save usage data, because if it's not saved it can't be exposed.
  1782.  
  1783. 446
  1784. 01:49:53.790 --> 01:49:59.340
  1785. Linda Van Keuren : And of course it's not only professional ethics that matter. US libraries and institutions that are
  1786.  
  1787. 447
  1788. 01:49:59.730 --> 01:50:15.930
  1789. Linda Van Keuren : working within the educational field have to worry about regulations such as FERPA for educational records, and medical libraries sometimes also need to be concerned about HIPAA regulations that govern medical records.
  1790.  
  1791. 448
  1792. 01:50:17.130 --> 01:50:24.600
  1793. Linda Van Keuren : The security of intellectual property is another important area for libraries. So institutional data
  1794.  
  1795. 449
  1796. 01:50:25.050 --> 01:50:41.820
  1797. Linda Van Keuren : that is used for patent development or drug discovery may have significant monetary value to an organization, and libraries often provide data management assistance to researchers and guide them on setting up a data organization plan that can incorporate
  1798.  
  1799. 450
  1800. 01:50:42.870 --> 01:50:54.420
  1801. Linda Van Keuren : good cybersecurity practices within that plan. And you know, it's not just the institution's intellectual property that librarians are concerned about, as everyone in this audience, I'm sure, is aware.
  1802.  
  1803. 451
  1804. 01:50:55.260 --> 01:51:08.010
  1805. Linda Van Keuren : When librarians sign licenses for electronic resources, part of those licenses often includes an agreement to take reasonable efforts to protect the intellectual property of the publishers. And, you know,
  1806.  
  1807. 452
  1808. 01:51:08.490 --> 01:51:16.650
  1809. Linda Van Keuren : I understand that publishers also are implementing their own strategies to secure their intellectual property, but libraries can collaborate in helping these efforts.
  1810.  
  1811. 453
  1812. 01:51:17.010 --> 01:51:26.070
  1813. Linda Van Keuren : You know, using secure login credentials and other security measures helps combat the widespread sharing of licensed materials on sites such as Sci-Hub.
  1814.  
  1815. 454
  1816. 01:51:32.850 --> 01:51:43.530
  1817. Linda Van Keuren : So thinking about all these areas of concern, I want to next discuss some things that libraries can do even if they're not directly responsible for their enterprise's cybersecurity efforts.
  1818.  
  1819. 455
  1820. 01:51:44.430 --> 01:51:56.280
  1821. Linda Van Keuren : This is true for many academic libraries: the libraries are partners in keeping the enterprise secure, but usually the responsibility lies with an IT department.
  1822.  
  1823. 456
  1824. 01:51:57.600 --> 01:52:04.590
  1825. Linda Van Keuren : So the first thing that I think libraries can do, and are doing now, is make it easy to access
  1826.  
  1827. 457
  1828. 01:52:05.340 --> 01:52:17.940
  1829. Linda Van Keuren : and use library resources in a secure way. So users that use sites such as Sci-Hub sometimes cite ease of access as the rationale for doing so.
  1830.  
  1831. 458
  1832. 01:52:18.420 --> 01:52:25.770
  1833. Linda Van Keuren : And so at my library and others, we try to focus on solutions that make it easier for users to securely access subscribed resources.
  1834.  
  1835. 459
  1836. 01:52:26.370 --> 01:52:33.210
  1837. Linda Van Keuren : And we handle this in many ways: curated content lists, installing federated authentication, which I'll talk about in a minute.
  1838.  
  1839. 460
  1840. 01:52:33.810 --> 01:52:41.610
  1841. Linda Van Keuren : We embrace the Seamless Access initiative. We also provide a browser plugin that streamlines resource access.
  1842.  
  1843. 461
  1844. 01:52:42.420 --> 01:52:50.130
  1845. Linda Van Keuren : We also curate and encourage high quality open access content in all our finding tools for the library resources.
  1846.  
  1847. 462
  1848. 01:52:50.400 --> 01:53:05.040
  1849. Linda Van Keuren : And we have librarians that are truly integrated into the curricular and, in my case, the clinical endeavors of the institution. And so we can see firsthand where the access pain points are, and then we look for solutions to reduce those pain points.
  1850.  
  1851. 463
  1852. 01:53:06.690 --> 01:53:17.400
  1853. Linda Van Keuren : The next item is collaborate. Nobody can do this alone, and many libraries nowadays are facing staffing and other budget restrictions.
  1854.  
  1855. 464
  1856. 01:53:17.790 --> 01:53:28.020
  1857. Linda Van Keuren : And that results in fewer people and fewer resources to monitor security matters. Even in non-COVID times, only the very largest of libraries will have staff dedicated to security.
  1858.  
  1859. 465
  1860. 01:53:28.380 --> 01:53:34.470
  1861. Linda Van Keuren : You know, at best most libraries might have a systems librarian, and that would be one of many, many job responsibilities.
  1862.  
  1863. 466
  1864. 01:53:34.710 --> 01:53:52.980
  1865. Linda Van Keuren : So it's even more important now for libraries to seek out and partner with their IT departments, or whoever's responsible for cybersecurity, on these matters. So for example, if the IT department wishes to enable a security method such as two-factor authentication,
  1866.  
  1867. 467
  1868. 01:53:54.240 --> 01:54:09.210
  1869. Linda Van Keuren : librarians can partner with them on that and ensure that access to library resources uses two-factor authentication. I don't think any library wants to be seen as the weak link in enterprise security, so partnering and understanding
  1870.  
  1871. 468
  1872. 01:54:10.260 --> 01:54:14.130
  1873. Linda Van Keuren : what is important to IT departments can really go far.
  1874.  
  1875. 469
  1876. 01:54:15.900 --> 01:54:24.840
  1877. Linda Van Keuren : And collaborating with IT departments also can help if and when those tensions between protecting patron use privacy,
  1878.  
  1879. 470
  1880. 01:54:26.190 --> 01:54:38.610
  1881. Linda Van Keuren : facilitating user experience, and security risk mitigation come up; you can have these conversations because you've already built a trusting collaboration with your IT department.
  1882.  
  1883. 471
  1884. 01:54:39.780 --> 01:54:43.020
  1885. Linda Van Keuren : Libraries shouldn't just collaborate with IT, though. You should collaborate with the
  1886.  
  1887. 472
  1888. 01:54:43.770 --> 01:54:52.740
  1889. Linda Van Keuren : publishers that you're working with, and have a good understanding of what's important to them in regards to the security of their content. And of course, if librarians are
  1890.  
  1891. 473
  1892. 01:54:53.640 --> 01:55:11.790
  1893. Linda Van Keuren : going to partner with a vendor that might have patrons' personally identifiable information, they should do a security audit to make sure that both the vendor and you have the same standards in regards to security measures, such as encryption or whatever is important to your institution.
  1894.  
  1895. 474
  1896. 01:55:13.230 --> 01:55:25.800
  1897. Linda Van Keuren : So, education. Education is very integral to what librarians do. We spend a lot of time teaching patrons all aspects of utilizing information resources, and certainly cybersecurity can be sprinkled throughout
  1898.  
  1899. 475
  1900. 01:55:26.400 --> 01:55:37.170
  1901. Linda Van Keuren : those education sessions. So we can help patrons be careful about clicking links in emails and opening attached files from people they don't know,
  1902.  
  1903. 476
  1904. 01:55:38.010 --> 01:55:48.390
  1905. Linda Van Keuren : and thereby reduce perhaps the possibility of some malware being introduced into the enterprise network. But we also can help them respect the intellectual property of others,
  1906.  
  1907. 477
  1908. 01:55:49.110 --> 01:56:04.560
  1909. Linda Van Keuren : helping them understand copyright, and that might also reduce the number of exposed institutional credentials or excessive download cases. It's not just users, of course; library staff also need to have a good understanding of some basic cybersecurity principles.
  1910.  
  1911. 478
  1912. 01:56:06.420 --> 01:56:12.720
  1913. Linda Van Keuren : And good policy. So have good policies that promote good cybersecurity; it shouldn't be an afterthought.
  1914.  
  1915. 479
  1916. 01:56:13.260 --> 01:56:28.950
  1917. Linda Van Keuren : These can be policies about restricting downloads on public computers or a policy about timely installation of software patches. You want to consider maybe also having a cybersecurity emergency policy alongside a natural disaster policy, so if
  1918.  
  1919. 480
  1920. 01:56:30.540 --> 01:56:34.770
  1921. Linda Van Keuren : that unfortunately occurs, you would be more ready to handle it.
  1922.  
  1923. 481
  1924. 01:56:35.910 --> 01:56:45.840
  1925. Linda Van Keuren : And also, I feel that moving to a federated authentication system for access can reduce the security risk
  1926.  
  1927. 482
  1928. 01:56:46.860 --> 01:56:49.410
  1929. Linda Van Keuren : that library access may pose.
  1930.  
  1931. 483
  1932. 01:56:53.100 --> 01:57:00.930
  1933. Linda Van Keuren : And to that last point, I'd like to share with you a case study about how my library moved to federated authentication for library resource access.
  1934.  
  1935. 484
  1936. 01:57:01.350 --> 01:57:09.330
  1937. Linda Van Keuren : We undertook this project about five years ago, and it helped us to become much more thoughtful and deliberate in regards to security and patron data.
  1938.  
  1939. 485
  1940. 01:57:09.720 --> 01:57:13.950
  1941. Linda Van Keuren : And so for me to explain the project, I do need to give you a few details about the library. So
  1942.  
  1943. 486
  1944. 01:57:14.340 --> 01:57:21.510
  1945. Linda Van Keuren : Dahlgren Memorial Library, or we call ourselves the DML, is the library for the medical center. So we serve a hospital, a cancer center,
  1946.  
  1947. 487
  1948. 01:57:21.870 --> 01:57:31.230
  1949. Linda Van Keuren : and schools of medicine, nursing, and biomedicine, and that is about 6,500 FTE of a larger university FTE of 20,000.
  1950.  
  1951. 488
  1952. 01:57:32.040 --> 01:57:39.960
  1953. Linda Van Keuren : We are almost entirely online. So we say our collections are 99.9% online, and we have been for many, many years.
  1954.  
  1955. 489
  1956. 01:57:40.740 --> 01:57:50.550
  1957. Linda Van Keuren : So the project we undertook, as I said, is we moved our resource access from IP-based authentication to a federated authentication system, and we used OpenAthens to do so.
  1958.  
  1959. 490
  1960. 01:57:51.090 --> 01:58:03.240
  1961. Linda Van Keuren : This project ended up having numerous benefits for us and our patrons, and one of them was the implementation of a much more sophisticated identity management system and security measures.
  1962.  
  1963. 491
  1964. 01:58:07.860 --> 01:58:23.730
  1965. Linda Van Keuren : So before this, we were using IP-based authentication for library resources. So we provided our affiliated IP ranges to vendors, and anyone within that range could access library-provided resources when they were on campus; users didn't have to log in.
  1966.  
  1967. 492
  1968. 01:58:25.710 --> 01:58:32.730
  1969. Linda Van Keuren : That's not to say resource access was completely open. Most of the campus computers did require a login just to use the computer,
  1970.  
  1971. 493
  1972. 01:58:33.120 --> 01:58:43.440
  1973. Linda Van Keuren : and the proxy server for off campus required a login. But it was a much more porous environment as it relates to access, and we wanted to tie access not to the location.
  1974.  
  1975. 494
  1976. 01:58:44.040 --> 01:58:53.910
  1977. Linda Van Keuren : We wanted to tie it to the identity of users. And in doing so, we envisioned we would make better security decisions, better access decisions, and better acquisitions decisions.
  1978.  
  1979. 495
  1980. 01:58:54.150 --> 01:58:58.290
  1981. Linda Van Keuren : So we partnered with our IT department; they're called University Information Services.
  1982.  
  1983. 496
  1984. 01:58:58.680 --> 01:59:04.770
  1985. Linda Van Keuren : On this project, they brought with them their vast experience in cybersecurity and identity management.
  1986.  
  1987. 497
  1988. 01:59:05.100 --> 01:59:15.900
  1989. Linda Van Keuren : They shared with us that they felt IP-based authentication was a higher security risk than the federated authentication model, so they were incredibly interested
  1990.  
  1991. 498
  1992. 01:59:16.260 --> 01:59:27.870
  1993. Linda Van Keuren : and pleased with our interest in this model for access. As I said, we also wanted to change our acquisitions model. So before the change, we were purchasing resources for the entire community,
  1994.  
  1995. 499
  1996. 01:59:28.620 --> 01:59:38.160
  1997. Linda Van Keuren : which is the FTE of 20,000, rather than just the medical center community, even though all of our funding comes from the medical center community.
  1998.  
  1999. 500
  2000. 01:59:38.430 --> 01:59:50.370
  2001. Linda Van Keuren : And so we needed to change that. So for clinical information resources, our medical center users had priority access, and we could purchase and limit just to those relevant users.
  2002.  
  2003. 501
  2004. 01:59:51.510 --> 02:00:09.840
  2005. Linda Van Keuren : We also only had minimal statistics using IP-based authentication, and we really wanted demographic-based usage statistics. We wanted the statistics because we thought we'd be able to provide better service and also have better conversations about library funding.
  2006.  
  2007. 502
  2008. 02:00:11.370 --> 02:00:21.420
  2009. Linda Van Keuren : And we wanted to provide more consistent access to our hospital patrons. So our hospital is part of a multi-hospital health system; we only serve
  2010.  
  2011. 503
  2012. 02:00:21.990 --> 02:00:29.760
  2013. Linda Van Keuren : one hospital out of many, and their IT infrastructure is handled by the health system's IT department.
  2014.  
  2015. 504
  2016. 02:00:30.240 --> 02:00:36.990
  2017. Linda Van Keuren : So changes would be made to their network infrastructure that would temporarily cut off IP access to that user population.
  2018.  
  2019. 505
  2020. 02:00:37.290 --> 02:00:47.640
  2021. Linda Van Keuren : And that's a user population that really needs information quickly; it is critical to patient care. And so we really needed something that was much more stable for our users.
  2022.  
  2023. 506
  2024. 02:00:48.240 --> 02:00:57.330
  2025. Linda Van Keuren : Around the same time, this hospital system unfortunately had a very public ransomware attack, and so cybersecurity was very much on everyone's mind
  2026.  
  2027. 507
  2028. 02:00:58.350 --> 02:01:01.170
  2029. Linda Van Keuren : as we were undergoing this project.
  2030.  
  2031. 508
  2032. 02:01:04.830 --> 02:01:11.910
  2033. Linda Van Keuren : So there are other speakers in this summit that will discuss more in depth about federated authentication. So I will leave the details to them.
  2034.  
  2035. 509
  2036. 02:01:12.300 --> 02:01:27.120
  2037. Linda Van Keuren : But this slide demonstrates the steps we had to take to undergo the switch to federated authentication. Our IT department assessed OpenAthens; they have a security risk
  2038.  
  2039. 510
  2040. 02:01:29.370 --> 02:01:39.840
  2041. Linda Van Keuren : checklist, and they also have an identity management checklist, and they were incredibly happy with the methods used to protect the security of our patron data and the amount of data that we could
  2042.  
  2043. 511
  2044. 02:01:40.470 --> 02:01:48.510
  2045. Linda Van Keuren : choose to release if we so desired. So once the decision was made, we worked with the identity management team and we
  2046.  
  2047. 512
  2048. 02:01:49.740 --> 02:02:00.930
  2049. Linda Van Keuren : developed the logic to identify our medical center users that would have access, and that was then tied to an attribute that lived within their institutional network credentials.
  2050.  
  2051. 513
  2052. 02:02:02.520 --> 02:02:15.180
  2053. Linda Van Keuren : Once we added the attribute, we then had to configure the connection between the university system and the OpenAthens system, so users could use their university credentials to log into our resources.
  2054.  
  2055. 514
  2056. 02:02:16.080 --> 02:02:23.070
  2057. Linda Van Keuren : And then we had to configure the resources on the OpenAthens administration site and then inform our
  2058.  
  2059. 515
  2060. 02:02:23.580 --> 02:02:42.450
  2061. Linda Van Keuren : publisher partners that we were changing our authentication type to federated authentication whenever possible. And then the last step was to update the resource URLs. So this took three years; this was not a simple, quick thing to do, but it was
  2062.  
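The entitlement step Linda describes (cue 512) — developing logic to identify medical center users and tying it to an attribute in their institutional credentials — can be sketched roughly as follows. This is a minimal illustration, not Georgetown's actual implementation; the unit names and entitlement URNs are hypothetical:

```python
# Hypothetical sketch of attribute-based entitlement logic for federated
# library access: map directory data to entitlement values that an identity
# provider could release to a service like OpenAthens. All names are made up.

MED_CENTER_UNITS = {"School of Medicine", "School of Nursing", "Hospital"}

def library_entitlements(directory_record: dict) -> set:
    """Derive library-access entitlement values from a user's directory record."""
    entitlements = set()
    if directory_record.get("unit") in MED_CENTER_UNITS:
        # General medical-center access to licensed library resources.
        entitlements.add("urn:example:entitlement:medcenter-library")
    if "licensed-clinician" in directory_record.get("roles", ()):
        # Clinical resources restricted to licensed professionals,
        # handled at the administrative level rather than by extra hoops.
        entitlements.add("urn:example:entitlement:clinical-resources")
    return entitlements

student = {"unit": "School of Medicine", "roles": ()}
clinician = {"unit": "Hospital", "roles": ("licensed-clinician",)}
assert library_entitlements(student) == {"urn:example:entitlement:medcenter-library"}
assert "urn:example:entitlement:clinical-resources" in library_entitlements(clinician)
```

Centralizing the decision in one attribute is what lets purchases be scoped to "the entire medical center or just a single department," as described later in the talk.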
  2063. 516
  2064. 02:02:43.620 --> 02:02:51.810
  2065. Linda Van Keuren : really worthwhile. You know, for our patrons, access remained uninterrupted, except now they were asked to log in at all times, regardless of location.
  2066.  
  2067. 517
  2068. 02:02:52.020 --> 02:02:57.270
  2069. Linda Van Keuren : So we definitely gave this a lot of thought and a lot of consideration, knowing that we were moving to a system.
  2070.  
  2071. 518
  2072. 02:02:57.600 --> 02:03:04.710
  2073. Linda Van Keuren : in which library access required the use of personal credentials, when in the past they could generally search anonymously.
  2074.  
  2075. 519
  2076. 02:03:05.160 --> 02:03:15.750
  2077. Linda Van Keuren : However, federated authentication can be set up to have a system-generated ID number that is provided to the service providers to preserve patron privacy.
  2078.  
  2079. 520
  2080. 02:03:16.650 --> 02:03:25.290
  2081. Linda Van Keuren : And administrators can also choose to release additional attributes to service providers on a provider-by-provider basis. So,
  2082.  
  2083. 521
  2084. 02:03:25.740 --> 02:03:46.920
  2085. Linda Van Keuren : for example, we really wanted demographic-based usage statistics. We wanted to do analysis on our use based on broad categories of individuals, and we chose not to release any of that information; we handle that completely within the university on an anonymized, aggregated basis.
  2086.  
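The system-generated ID described here behaves like the pairwise (targeted) identifiers used in federated protocols such as SAML: each service provider sees a stable but opaque value instead of the username, so providers cannot correlate one user's activity across sites. A minimal sketch of how such an identifier might be derived (the secret and entity IDs below are hypothetical, and real identity providers have their own schemes):

```python
import hashlib
import hmac

def pairwise_id(username: str, sp_entity_id: str, idp_secret: bytes) -> str:
    """Derive a stable, opaque identifier for one user at one service provider.

    The same user gets a different value at each provider, preserving patron
    privacy, yet each value is stable over time, which still allows a precise
    per-account response to misuse (e.g., suspending a single account).
    """
    message = f"{username}|{sp_entity_id}".encode("utf-8")
    return hmac.new(idp_secret, message, hashlib.sha256).hexdigest()

secret = b"hypothetical-idp-secret"
at_pub_a = pairwise_id("jdoe", "https://publisher-a.example", secret)
at_pub_b = pairwise_id("jdoe", "https://publisher-b.example", secret)

assert at_pub_a != at_pub_b  # different providers see different IDs
assert at_pub_a == pairwise_id("jdoe", "https://publisher-a.example", secret)  # stable
```

Only the identity provider can map the opaque value back to a real account, which is what makes the "precise response to misuse" benefit mentioned later compatible with keeping demographic analysis in-house.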
  2087. 522
  2088. 02:03:52.710 --> 02:04:02.040
  2089. Linda Van Keuren : So as I said, it took about three years to completely transition all our resources over using this method, but once completed we were able to reap the benefits of
  2090.  
  2091. 523
  2092. 02:04:02.430 --> 02:04:10.950
  2093. Linda Van Keuren : a better acquisitions model that gave us more flexibility, anonymized demographic-based usage statistics that are used to improve services and
  2094.  
  2095. 524
  2096. 02:04:11.340 --> 02:04:19.050
  2097. Linda Van Keuren : have better funding discussions. But I really want to talk about the security benefits that we feel we reaped from this project. So the first is:
  2098.  
  2099. 525
  2100. 02:04:19.470 --> 02:04:28.380
  2101. Linda Van Keuren : federated authentication for us utilizes both OpenAthens and the institutional security monitoring systems of the university,
  2102.  
  2103. 526
  2104. 02:04:28.890 --> 02:04:32.700
  2105. Linda Van Keuren : both of which are staffed by individuals with much more security
  2106.  
  2107. 527
  2108. 02:04:33.150 --> 02:04:45.390
  2109. Linda Van Keuren : expertise than lives in the library. So the library facilitates the security monitoring, but it's not directly responsible for it, and that frees us up to focus on other aspects of service.
  2110.  
  2111. 528
  2112. 02:04:46.260 --> 02:04:58.650
  2113. Linda Van Keuren : Another benefit that was quickly apparent is we can have a precise response to misuse. So if there is, say, an excessive downloading issue, rather than having the whole IP address
  2114.  
  2115. 529
  2116. 02:04:59.370 --> 02:05:15.600
  2117. Linda Van Keuren : turned off while the matter is investigated, we can quickly identify the specific account that might be involved in the misuse and temporarily suspend it while it's being investigated, and the whole campus doesn't feel the impact of that misuse.
  2118.  
  2119. 530
  2120. 02:05:17.190 --> 02:05:21.630
  2121. Linda Van Keuren : We also have much better identity management. So having the ability to provide
  2122.  
  2123. 531
  2124. 02:05:22.650 --> 02:05:27.990
  2125. Linda Van Keuren : Resource access to the entire medical center or just a single department or single major
  2126.  
  2127. 532
  2128. 02:05:28.440 --> 02:05:35.940
  2129. Linda Van Keuren : allows the library to much more finely tune our acquisitions purchases, and we just see that as being good stewards of our institutional resources.
  2130.  
  2131. 533
  2132. 02:05:36.570 --> 02:05:43.830
  2133. Linda Van Keuren : And it's easier for us to implement license terms. So, for example, because we're providing content to
  2134.  
  2135. 534
  2136. 02:05:44.550 --> 02:05:53.790
  2137. Linda Van Keuren : a medical center, some of our very clinical resources should really only be used by licensed professionals, and in the past we would deal with that by
  2138.  
  2139. 535
  2140. 02:05:54.750 --> 02:05:59.730
  2141. Linda Van Keuren : having the licensed professionals kind of jump through a few hoops before they could get to the content.
  2142.  
  2143. 536
  2144. 02:06:00.180 --> 02:06:08.370
  2145. Linda Van Keuren : But now we handle that at the administrative level, and the path for licensed professionals is much more streamlined.
  2146.  
  2147. 537
  2148. 02:06:09.120 --> 02:06:18.840
  2149. Linda Van Keuren : And then finally, we believe federated access brings library resources closer to the user workflow. They can log in from a publisher site, if the publisher has enabled that,
  2150.  
  2151. 538
  2152. 02:06:19.260 --> 02:06:28.620
  2153. Linda Van Keuren : and we hope it makes it easy for our patrons to make good decisions in regards to respecting the intellectual property of others and securing their credentials.
  2154.  
  2155. 539
  2156. 02:06:31.650 --> 02:06:39.510
  2157. Linda Van Keuren : So thank you very much. I think there's a few minutes left, and I'm happy to answer any questions that may have come up.
  2158.  
  2159. 540
  2160. 02:06:41.340 --> 02:06:55.980
  2161. Kathleen Neely : Linda, we have a question around when Georgetown Medical Center moved to OpenAthens federated access: were there any vendors who could not accommodate OpenAthens federated access?
  2162.  
  2163. 541
  2164. 02:06:56.400 --> 02:07:02.130
  2165. Linda Van Keuren : Yeah. So we did this five years ago, and federated authentication was still newer
  2166.  
  2167. 542
  2168. 02:07:02.640 --> 02:07:10.740
  2169. Linda Van Keuren : at that time, especially in the US; OpenAthens was much more well known in Europe. And so there were times where we would be
  2170.  
  2171. 543
  2172. 02:07:11.430 --> 02:07:23.370
  2173. Linda Van Keuren : connecting the publisher and OpenAthens, so the publisher, who wouldn't be familiar with it, could talk to OpenAthens directly, and then they would get set up in that way.
  2174.  
  2175. 544
  2176. 02:07:26.370 --> 02:07:35.280
  2177. Kathleen Neely : Okay. And I encourage the attendees, we still have some time left to answer questions, so if there's something else you'd like to know,
  2178.  
  2179. 545
  2180. 02:07:36.480 --> 02:07:37.710
  2181. Kathleen Neely : Please don't be bashful.
  2182.  
  2183. 546
  2184. 02:07:42.360 --> 02:07:46.560
  2185. Kathleen Neely : Linda, you mentioned some policies that you put in place,
  2186.  
  2187. 547
  2188. 02:07:47.700 --> 02:07:57.540
  2189. Kathleen Neely : And I just wondered how your patrons reacted to those policies where you're trying to protect them, especially when it comes to HIPAA compliance.
  2190.  
  2191. 548
  2192. 02:07:57.750 --> 02:08:09.240
  2193. Linda Van Keuren : Right, I, you know, in some ways, I think it's easier for us to implement slightly more stringent security policies because most of our users are working or will work in a clinical environment.
  2194.  
  2195. 549
  2196. 02:08:09.600 --> 02:08:19.380
  2197. Linda Van Keuren : And they have incredibly strict security measures there. So for things such as being required to log in the library resources when they didn't have to
  2198.  
  2199. 550
  2200. 02:08:19.800 --> 02:08:21.090
  2201. Linda Van Keuren : Honestly, we didn't get
  2202.  
  2203. 551
  2204. 02:08:21.570 --> 02:08:31.470
  2205. Linda Van Keuren : Many comments about it because our users were used to working in the hospital where it's it's used to login and record everything. So in silence. We're lucky.
  2206.  
  2207. 552
  2208. 02:08:32.970 --> 02:08:33.210
  2209. Kathleen Neely : Okay.
  2210.  
553
02:08:40.200 --> 02:08:57.930
Kathleen Neely : Okay, Linda. Can you speak to the issue of security for login credential access? Earlier speakers have both spoken to the issue of phishing and other sophisticated password mining. Are there specific strategies the library has used to address this concern?

554
02:08:59.220 --> 02:09:07.830
Linda Van Keuren : So, you know, as I said early on, library credentials are not just library credentials; they're institutional credentials.

555
02:09:08.190 --> 02:09:22.230
Linda Van Keuren : And so we are just one part of that big institutional access point. And so we look to and work with our IT department as far as all the sort of strategies for keeping those credentials safe. And so

556
02:09:22.860 --> 02:09:32.880
Linda Van Keuren : We do things like, as I said, we talk to patrons about respecting intellectual property; we talk to patrons about not sharing your credentials,

557
02:09:33.630 --> 02:09:43.080
Linda Van Keuren : Even with your friends. And so we do things sort of on a user-by-user level, but we really take guidance from the experts in our institution, which is our IT department.

558
02:09:44.760 --> 02:09:45.000
Okay.

559
02:09:47.610 --> 02:09:59.520
Kathleen Neely : All right. Um, I don't think we have any other questions. We'll give it another moment to see if anybody has a last question, and if not, I will pass it back to Dan.

560
02:10:04.770 --> 02:10:07.470
Kathleen Neely : Okay, Dan, I think we're back to you.

561
02:10:08.880 --> 02:10:09.330
Daniel Ascher: Thank

562
02:10:10.380 --> 02:10:17.520
Daniel Ascher: You very much for that great talk, Linda, and thank you, Kathy. So for our next presenter,

563
02:10:18.450 --> 02:10:36.690
Daniel Ascher: We will have Joseph DeMarco, who is a partner at DeVore & DeMarco LLP, and similar to Linda, Joseph will be taking questions at the end of his talk. So if you have any questions throughout, please feel free to enter them into the Q&A box. And I will pass it to you.

564
02:10:37.800 --> 02:10:43.230
Joseph DeMarco : Thanks very much, and I really appreciate it. Um, so I don't have any slides, and I'm going to speak

565
02:10:43.740 --> 02:10:52.230
Joseph DeMarco : For about 20 minutes, and then I obviously invite any questions that you might have. I'm also available to answer questions offline if people would prefer.

566
02:10:52.560 --> 02:11:00.210
Joseph DeMarco : Um, let me just tell you, kind of, what I'm going to cover. First, I'll give you a little bit about my background and how I came to this particular event today.

567
02:11:01.020 --> 02:11:12.840
Joseph DeMarco : Second, I'd like to talk about one aspect of the problem that we really haven't focused on yet, which is not just the theft of student credentials, but the theft of faculty credentials, because we've seen those as well.

568
02:11:14.010 --> 02:11:21.330
Joseph DeMarco : And then, applying that and building off of that, I'd like to talk about the impact that the

569
02:11:22.140 --> 02:11:35.490
Joseph DeMarco : Enemies of publishing have on the institutions that are subscribers to various publishers' publications: what their motivations are, and what we believe is going on, you know, kind of behind the scenes.

570
02:11:36.420 --> 02:11:41.970
Joseph DeMarco : You know, there have been press reports out there, which, you know, people I'm sure have found or could find, on the problem.

571
02:11:42.240 --> 02:11:48.570
Joseph DeMarco : But I think it really is worth, you know, kind of pulling everything together and trying to understand the problem holistically:

572
02:11:49.020 --> 02:12:02.010
Joseph DeMarco : How it was, and is, that foreign actors might be leveraging the issues we're talking about today to benefit themselves beyond just the collection of pirated academic journals.

573
02:12:03.120 --> 02:12:09.210
Joseph DeMarco : First, a little bit about myself. I'm a lawyer in private practice, and I've been working with and representing Elsevier

574
02:12:09.480 --> 02:12:16.920
Joseph DeMarco : For several years now on its civil litigation against Elbakyan, Sci-Hub, and the Library Genesis project, as many of you know.

575
02:12:17.160 --> 02:12:23.940
Joseph DeMarco : You may recall that litigation resulted in the entry of permanent and preliminary injunctions against Elbakyan and Sci-Hub

576
02:12:24.300 --> 02:12:31.290
Joseph DeMarco : Several years ago, and ultimately the entry of a default judgment for $15 million against her and the other defendants.

577
02:12:32.250 --> 02:12:45.960
Joseph DeMarco : The problem, obviously, is quite pernicious. What she's doing is quite sophisticated, and we believe is, you know, part of a larger infrastructure and network that is targeting universities and publishers across the world.

578
02:12:47.340 --> 02:12:54.450
Joseph DeMarco : Before starting my firm, for about a decade I was a federal prosecutor in New York, where I ran the cybercrime unit at the Department of Justice in New York.

579
02:12:54.750 --> 02:13:04.260
Joseph DeMarco : And part of my job there was to focus on intellectual property theft, whether it was committed by companies against companies, faithless employees against their employers,

580
02:13:04.500 --> 02:13:09.720
Joseph DeMarco : Or nation states directed against large producers of institutional intellectual property.

581
02:13:10.320 --> 02:13:19.650
Joseph DeMarco : Um, obviously, this is a problem that, you know, academic publishers and IP producers have been grappling with for decades. And the problem is not going to go away, even if Elbakyan were

582
02:13:20.100 --> 02:13:27.990
Joseph DeMarco : Suddenly to reform her ways and shut down Sci-Hub. Um, but I think the problem has become much more pernicious over the last few years.

583
02:13:28.620 --> 02:13:41.040
Joseph DeMarco : And one particular area which I'd like to spend a little bit of time on focuses not just on the problem that can arise when student credentials are compromised, but the problem that can arise when faculty credentials are compromised.

584
02:13:41.550 --> 02:13:50.220
Joseph DeMarco : As prior speakers have noted today, the credentials that students, and also the credentials that faculty, use to log on to their systems

585
02:13:50.460 --> 02:14:00.210
Joseph DeMarco : Provide access not only to the university libraries and systems, but also provide access to other parts of the computing environment of academic institutions.

586
02:14:01.020 --> 02:14:09.630
Joseph DeMarco : Um, in addition to, you know, my practice as a lawyer, I'm also an adjunct professor of law at Columbia University School of Law,

587
02:14:10.020 --> 02:14:13.620
Joseph DeMarco : Where I teach the internet crime seminar, and as a faculty member

588
02:14:14.220 --> 02:14:26.490
Joseph DeMarco : You know, I'm aware of the types of data that are available to me that may not necessarily be available to students. I also, as you might imagine, do my very best to secure my credentials to the Columbia environment.

589
02:14:27.180 --> 02:14:35.100
Joseph DeMarco : Um, in preparation for the talk today, I asked one of the analysts at our law firm, who was a student at an Ivy League school,

590
02:14:36.060 --> 02:14:46.680
Joseph DeMarco : To just tell me what information he could access as a student, and I coupled that with my access permissions as a faculty member. And here's a list of information that

591
02:14:47.490 --> 02:14:54.960
Joseph DeMarco : People who have access to a student's login credentials, which again are typically the same log-on credentials for the university library

592
02:14:55.800 --> 02:15:04.290
Joseph DeMarco : System as for the rest of the network, in a non-federated, non-authenticated, you know, modality. Here's some of the information that's available.

593
02:15:05.160 --> 02:15:12.360
Joseph DeMarco : If you have those credentials, well, obviously you have access, if you're a student, to student submissions and works.

594
02:15:13.200 --> 02:15:23.550
Joseph DeMarco : If your access to the system is that of a faculty member, you also, of course, have access to faculty teaching files, class discussions, syllabi,

595
02:15:23.880 --> 02:15:32.970
Joseph DeMarco : Class recordings. You know, in this world of COVID and remote teaching, a lot of lectures are being recorded. That includes the Q&A between the students and the teacher.

596
02:15:33.570 --> 02:15:47.220
Joseph DeMarco : You have the ability to extract, upload, and delete course files; you have access to grades, academic records, student evaluations, and faculty misconduct reports.

597
02:15:47.610 --> 02:16:01.830
Joseph DeMarco : You have access to contact information inside the university, past student lists, recommendations for students. You have the ability to access account and password settings within the university system.

598
02:16:03.330 --> 02:16:13.680
Joseph DeMarco : You also, again, if you are able to obtain student and/or faculty credentials to a university platform, have access to faculty contact information:

599
02:16:14.160 --> 02:16:31.050
Joseph DeMarco : Names, addresses, phone numbers, birthdays, email addresses, pictures, and emergency contacts for faculty and students. Incredibly valuable information if you are engaged, or want to be engaged, in some type of identity theft or social engineering crime.

600
02:16:32.040 --> 02:16:37.560
Joseph DeMarco : Obviously, you have access to terabytes of intellectual property. We've spoken about the theft of that already, with regard to the Iranian hacking group.

601
02:16:38.250 --> 02:16:45.900
Joseph DeMarco : You also have the ability at the technical level to register network cards, to connect devices to the university networks as authenticated devices.

602
02:16:46.620 --> 02:17:05.610
Joseph DeMarco : You obviously have, in many cases, access to email systems, of course, the school-provided student and faculty email systems, which can contain a treasure trove of highly sensitive, highly personal, and highly granular information, again all for the taking.

603
02:17:07.200 --> 02:17:14.100
Joseph DeMarco : Obviously, you have access to a great deal of financial information: paychecks, if you are a faculty member or staff of the university,

604
02:17:14.400 --> 02:17:21.030
Joseph DeMarco : Or, if you're a student who's working on campus, payroll records relating to your work on campus. That, of course, along with it

605
02:17:21.600 --> 02:17:35.100
Joseph DeMarco : Involves bank information, W-2 information, loan information, and, of course, a whole range of other HR information as well. Also included in that is health information related to the faculty and the student members.

606
02:17:36.120 --> 02:17:43.410
Joseph DeMarco : Campus services information is also available: security alerts, events, maps, housing information,

607
02:17:44.160 --> 02:18:03.600
Joseph DeMarco : And access to internal calendars and events information, all of which provide essentially a nearly complete picture of what's going on on a university campus, and/or the life of that particular student or group of students, or faculty member or group of faculty members. Um,

608
02:18:04.800 --> 02:18:06.630
Joseph DeMarco : In my experience as a prosecutor,

609
02:18:08.100 --> 02:18:15.690
Joseph DeMarco : I never once encountered a criminal who was engaged in the theft of one particular item

610
02:18:16.140 --> 02:18:27.720
Joseph DeMarco : Who knowingly and intentionally passed up the opportunity to steal a second item. You know, people who break into someone's house to steal jewelry,

611
02:18:28.200 --> 02:18:34.740
Joseph DeMarco : If they see other valuable information that's out there, will steal the other valuable information.

612
02:18:35.250 --> 02:18:42.870
Joseph DeMarco : Um, and I think what we've seen recently, in the last few years, in the area of intellectual property theft, hacking, and cybercrime

613
02:18:43.590 --> 02:18:58.860
Joseph DeMarco : Is that a confluence of events between criminal groups, organized crime, individual criminals, individual hackers, and foreign state sponsors of economic crime and, in some cases, terrorism has come together

614
02:18:59.250 --> 02:19:10.890
Joseph DeMarco : In loose, or sometimes not so loose, affiliations of organizations that are operating online to hack into organizations' databases and commit a range of crimes.

615
02:19:11.940 --> 02:19:22.080
Joseph DeMarco : You know, there was a very good example of this. About three weeks ago, the United States Government brought a criminal indictment against members of APT41,

616
02:19:22.470 --> 02:19:25.200
Joseph DeMarco : Which is a Chinese government-sponsored hacking group,

617
02:19:25.560 --> 02:19:35.490
Joseph DeMarco : Which had broken into a number of computer networks of online service providers and used those break-ins to then further effectuate

618
02:19:35.730 --> 02:19:46.650
Joseph DeMarco : Downstream crimes against entities that had subscriptions at those online service providers and platforms, essentially the end-user customers, whether businesses or individuals,

619
02:19:46.920 --> 02:19:55.260
Joseph DeMarco : Of those platforms, to gain access to the platforms. And in gaining access to the platforms, they gained access to the end-user customer accounts and information.

620
02:19:55.830 --> 02:20:08.760
Joseph DeMarco : And what did the bad guys do? Well, the bad guys didn't just steal intellectual property belonging to the subscribers of those platforms. They didn't just engage in

621
02:20:09.390 --> 02:20:23.460
Joseph DeMarco : Cryptocurrency mining, that is to say, using the computing power of those companies to solve the complex equations and algorithms that are necessary to be solved in order to mine new bitcoin. They didn't just do that.

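DeMarco's aside on mining can be made concrete: Bitcoin-style mining is a brute-force search for a nonce whose hash falls below a difficulty target, which is exactly why stolen computing power is valuable to attackers. A minimal illustrative sketch in Python (standard library only; the difficulty here is toy-sized, and the function and data are invented for illustration):

```python
import hashlib

def mine(block_data: str, difficulty_bits: int) -> int:
    """Brute-force search for a nonce whose SHA-256 digest of
    (data + nonce), read as a big-endian integer, falls below
    the difficulty target. This loop is what consumes compute."""
    target = 1 << (256 - difficulty_bits)  # digests below this value win
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# Toy difficulty of 16 leading zero bits; real Bitcoin difficulty is vastly higher.
nonce = mine("example block", difficulty_bits=16)
proof = hashlib.sha256(f"example block{nonce}".encode()).hexdigest()
assert proof.startswith("0000")  # 16 zero bits = 4 leading hex zeros
```

The point of the sketch is that mining is pure repeated hashing, so any hijacked CPU cycles translate directly into lottery tickets for the attacker.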
622
02:20:24.210 --> 02:20:34.350
Joseph DeMarco : They didn't just steal password information, log-on information, and information that could be used in identity theft schemes. They didn't just do that.

623
02:20:35.040 --> 02:20:42.450
Joseph DeMarco : They didn't just engage in fraud schemes, the theft of funds, and the theft of other valuable information.

624
02:20:42.960 --> 02:20:51.990
Joseph DeMarco : They engaged in all of those things, each and all of those things, at the same time, wherever they could, whenever they could,

625
02:20:52.380 --> 02:21:00.210
Joseph DeMarco : Opportunistically looking to make the most money in the shortest amount of time based on the access that they had.

626
02:21:00.870 --> 02:21:09.210
Joseph DeMarco : Now, what's interesting is if you go back and find that indictment (it's on the DOJ web pages; Google APT41 and, you know, DOJ hacking charges),

627
02:21:10.110 --> 02:21:16.470
Joseph DeMarco : What you'll find is that a number of the compromised organizations and entities happened to be universities.

628
02:21:17.160 --> 02:21:28.170
Joseph DeMarco : Now, in the indictment that the DOJ brought down, the access credentials to those universities weren't necessarily obtained in the same way that Elbakyan and her confederates have obtained the

629
02:21:28.620 --> 02:21:42.330
Joseph DeMarco : Log-on information that they use to perpetrate the Sci-Hub scheme; those credentials were obtained another way. But the example underscores, I think, the point that once the bad guys are in,

630
02:21:43.500 --> 02:21:55.110
Joseph DeMarco : They're in. And once they're in, the odds are quite high, in fact extraordinarily high, that they're going to use that access for a range of mayhem and misconduct.

631
02:21:55.680 --> 02:21:58.710
Joseph DeMarco : And when we know, from the prior indictments that have been brought

632
02:21:59.070 --> 02:22:07.110
Joseph DeMarco : By the Department of Justice in the United States and other foreign law enforcement agencies against some of the groups that have been involved in this,

633
02:22:07.410 --> 02:22:16.140
Joseph DeMarco : When we know that the information that they're taking not only relates to information that can be used, you know, for fraud purposes or piracy purposes,

634
02:22:16.710 --> 02:22:24.420
Joseph DeMarco : When we know that all of that is going on, but in addition to that, what's also going on is the theft of intellectual property,

635
02:22:24.690 --> 02:22:35.970
Joseph DeMarco : The theft, for example, of COVID treatment information, therapeutics, and vaccine information that we've seen recently, we have to ask ourselves what's really going on behind the scenes.

636
02:22:36.960 --> 02:22:49.680
Joseph DeMarco : Look, when you put together that fact with the fact that some of the more sophisticated organizations, including but not limited to Sci-Hub, require a fair amount of internet and technical backbone,

637
02:22:50.250 --> 02:23:00.840
Joseph DeMarco : I would submit that the evidence is very, very compelling that the people that are involved in piracy are not just stopping at piracy.

638
02:23:01.290 --> 02:23:09.630
Joseph DeMarco : They're engaged in other crimes as well. Now, will they ever admit to that? No. Will they go to great lengths to hide that fact? Of course they will.

639
02:23:10.080 --> 02:23:24.870
Joseph DeMarco : They can't do anything other than that, and it's important for them to hide that, because if they don't hide it, what would otherwise be a simple issue of copyright law then becomes something much more severe.

640
02:23:25.470 --> 02:23:46.320
Joseph DeMarco : But we know, for example, from, you know, Elbakyan's own interviews that she knows the credentials she's using are stolen. We have anecdotal evidence that when credentials are stolen as part of the Sci-Hub piracy scheme, they appear on the dark web for sale, barter, or just to be given away.

641
02:23:47.760 --> 02:23:51.720
Joseph DeMarco : We're dealing with a sophisticated audience here today. We're dealing with people that understand

642
02:23:52.380 --> 02:23:59.130
Joseph DeMarco : What's going on with regard to the attacks upon their systems, the vulnerabilities that they face if they're not properly secured.

643
02:23:59.370 --> 02:24:09.420
Joseph DeMarco : We have actual indictments that the Department of Justice has brought, with incredibly detailed descriptions of what the wrongdoers are doing and what they're after.

644
02:24:09.810 --> 02:24:23.190
Joseph DeMarco : And it hasn't stopped. And it hasn't stopped because intellectual property producers keep producing intellectual property, right? I mean, you know, earlier on today we've heard about how 95% of the contents of Elsevier's

645
02:24:24.030 --> 02:24:33.810
Joseph DeMarco : ScienceDirect platform have been stolen. Well, it'll never get to, you know, 100% all in one moment, because new publications are being added all the time.

646
02:24:34.260 --> 02:24:49.860
Joseph DeMarco : My guess is it will always hover between 95 and 99%, because as new publications are added, the theft will go on. And the theft will go on, I believe, until, you know, participants in this discussion increase security and raise awareness.

647
02:24:50.880 --> 02:24:56.130
Joseph DeMarco : You know, obviously things need to be done in the legal context as well to bring pressure to bear on the problem.

648
02:24:56.850 --> 02:25:04.920
Joseph DeMarco : But I think that to simply confine the problem to the narrow box of copyright infringement, or piracy, or open access,

649
02:25:05.190 --> 02:25:11.400
Joseph DeMarco : Or, you know, kind of the freedom of information, is, I think, to miss what really is the elephant in the room.

650
02:25:11.700 --> 02:25:19.050
Joseph DeMarco : You know, the elephant in the room is that there's a lot more going on behind the scenes and that there are powerful forces in play

651
02:25:19.320 --> 02:25:29.610
Joseph DeMarco : Designed to perpetuate this problem, to the detriment of academic institutions around the world and universities and schools and other institutions around the world.

652
02:25:29.970 --> 02:25:40.080
Joseph DeMarco : And I would just again point to, you know, what could result from faculty credentials being stolen (which we know they are being stolen), in addition to

653
02:25:40.590 --> 02:25:47.550
Joseph DeMarco : In addition to student credentials. So let me pause there and see if there are any questions. I see a couple of comments in the chat.

654
02:25:50.100 --> 02:25:50.910
Joseph DeMarco : And

655
02:25:52.140 --> 02:26:00.990
Joseph DeMarco : One comment says it seems that no matter the approach, usernames and passwords are the vulnerable point. You know, I would agree with that.

656
02:26:01.470 --> 02:26:14.040
Joseph DeMarco : I think that as long as we only have usernames and passwords that are the means of access, and absent, you know, the policing of those credentials, you know, complexity requirements,

657
02:26:14.520 --> 02:26:27.180
Joseph DeMarco : You know, prohibitions against concurrent logins, geo-filtering, and other kinds of behavioral analytics that we've spoken about today (which are partial solutions, not complete solutions),

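One of the partial mitigations DeMarco lists, detecting concurrent logins, can be sketched very simply: flag any account whose sessions overlap in time from different IP addresses. A minimal illustrative sketch in Python (the event format, names, and sample data are assumptions for illustration, not any vendor's API):

```python
from dataclasses import dataclass

@dataclass
class Login:
    user: str
    ip: str
    start: float  # session start, seconds since epoch
    end: float    # session end

def concurrent_login_users(logins: list[Login]) -> set[str]:
    """Return users with time-overlapping sessions from different IPs,
    a crude signal that credentials may be shared or stolen."""
    by_user: dict[str, list[Login]] = {}
    for event in logins:
        by_user.setdefault(event.user, []).append(event)
    flagged = set()
    for user, sessions in by_user.items():
        for i, a in enumerate(sessions):
            for b in sessions[i + 1:]:
                overlaps = a.start < b.end and b.start < a.end
                if overlaps and a.ip != b.ip:
                    flagged.add(user)
    return flagged

events = [
    Login("alice", "10.0.0.1", 0, 100),
    Login("alice", "203.0.113.9", 50, 150),  # overlapping, different IP
    Login("bob", "10.0.0.2", 0, 100),
    Login("bob", "10.0.0.2", 200, 300),      # sequential, same IP: fine
]
assert concurrent_login_users(events) == {"alice"}
```

As the speakers note, this is only a partial control: it catches simultaneous sharing but says nothing about a single stolen credential used quietly from one location.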
658
02:26:27.600 --> 02:26:37.530
Joseph DeMarco : As long as we have kind of the username and password as the modality, with nothing more, you know, with no other two-factor authentication, whatever format that authentication comes in,

659
02:26:38.400 --> 02:26:44.640
Joseph DeMarco : The greater the vulnerability is going to be. I do think there is one reason for optimism,

660
02:26:45.570 --> 02:27:01.530
Joseph DeMarco : And that is that I think, increasingly, just the user base of computers are beginning to use multi-factor authentication more seamlessly and with less resistance as time goes on.

661
02:27:02.040 --> 02:27:10.770
Joseph DeMarco : You know, today, having an authenticator app downloaded onto your iPhone is not a, you know, weird, freakishly paranoid, you know, super-secure

662
02:27:11.640 --> 02:27:20.910
Joseph DeMarco : Thing to do. Many people have authenticator apps which will provide them with one-time codes to log on to a platform, the same as with, you know, receiving a text message

663
02:27:21.360 --> 02:27:31.140
Joseph DeMarco : In connection with logging on. Those are not, you know, kind of unusual things, and I think as, you know, computer users generally become more attuned to that, they will be more

664
02:27:31.770 --> 02:27:41.070
Joseph DeMarco : Willing to do that, and there'll be less resistance to that, and hopefully, you know, that will be rolled out on a greater and greater basis.

665
02:27:42.000 --> 02:27:52.230
Joseph DeMarco : We have to increase security, we have to raise awareness, we have to increase training. The problem is not going to go away. I think there will always be kind of this, you know, cat and mouse game

666
02:27:52.470 --> 02:28:11.550
Joseph DeMarco : Between the content producers and the content thieves. But understand, the content thieves are not just content thieves: either they themselves are, or they're working with, people, groups, or states that are interested in far more than piracy of copyrighted academic journals.

667
02:28:14.640 --> 02:28:20.340
Kathleen Neely : Let me encourage the attendees: if there are any other questions for Joe, do share them.

668
02:28:22.830 --> 02:28:32.040
Kathleen Neely : And I think you're right, the authentication means are getting much stronger, and we're all getting used to those across the board in our day-to-day lives.

669
02:28:33.870 --> 02:28:47.760
Kathleen Neely : We do have another comment: is it not so much about resistance, and more to do with accessibility? Should a personal phone be a requirement to have access to work resources?

670
02:28:48.360 --> 02:28:58.650
Joseph DeMarco : Look, it's a great question, and, you know, I'm not on the technical end or the business end. Um, so I would encourage that as a point of discussion.

671
02:28:59.760 --> 02:29:09.000
Joseph DeMarco : But I totally take your point and get your point, and it is a valid point. I mean, I think we as an industry want to strive

672
02:29:09.300 --> 02:29:23.400
Joseph DeMarco : To make our content as available and as seamlessly and easily viewable as humanly possible. And I think that is a noble goal. You know, I just think there's room for improvement, I guess is what I'm saying.

673
02:29:24.510 --> 02:29:24.840
Kathleen Neely : Okay.

674
02:29:26.310 --> 02:29:35.610
Kathleen Neely : All right. And we don't seem to have any other questions. We'll give it another moment, in case there's a last burning question from somebody.

675
02:29:36.630 --> 02:29:45.510
Kathleen Neely : Joe, thank you so much for your presentation. It was fabulous; I've learned a lot myself. And Dan, I'll leave it to you to take us on.

676
02:29:46.470 --> 02:29:47.160
Joseph DeMarco : Thank you for having me.

677
02:29:48.180 --> 02:29:50.250
Daniel Ascher: Thank you very much, Joe. Thank you, Kathy.

678
02:29:51.540 --> 02:30:06.630
Daniel Ascher: And so, with the conclusion of Joe's presentation, we are now going into our 15-minute break here. And if you'd like to continue the conversation in the chat box while we're on break, please feel free to do so.

  2715. 679
  2716. 02:45:07.410 --> 02:45:24.450
  2717. Daniel Ascher: Okay. Welcome back everyone, hope you had a good break there and we enter the final portion of today's Security Summit. So for our final featured speaker before the round table, we will have Tim like the CEO of lip likes
  2718.  
  2719. 680
  2720. 02:45:25.500 --> 02:45:27.750
  2721. Daniel Ascher: And whenever you're ready to take it away.
  2722.  
  2723. 681
  2724. 02:45:38.280 --> 02:45:40.500
  2725. Tim Lloyd : Hi. Can you see me, Daniel.
  2726.  
  2727. 682
  2728. 02:45:42.810 --> 02:45:43.530
  2729. Daniel Ascher: Yes, perfect.
  2730.  
  2731. 683
  2732. 02:45:43.980 --> 02:45:44.790
  2733. Tim Lloyd : Great. Okay.
  2734.  
  2735. 684
  2736. 02:46:01.410 --> 02:46:02.010
  2737. Tim Lloyd : Okay, great.
  2738.  
  2739. 685
  2740. 02:46:04.110 --> 02:46:11.100
  2741. Tim Lloyd : So hi, everyone. Hi. Good evening. Good afternoon, or good morning, depending on where you are. My name is Tim Lloyd.
  2742.  
  2743. 686
  2744. 02:46:11.940 --> 02:46:19.260
  2745. Tim Lloyd : By way of a brief introduction. I spent the last six years, focusing on identity and access in relation to Stoli content.
  2746.  
  2747. 687
  2748. 02:46:19.920 --> 02:46:24.450
  2749. Tim Lloyd : I previously worked in publishing developing Scalia resources in collaboration libraries.
  2750.  
  2751. 688
  2752. 02:46:25.230 --> 02:46:31.350
2753. Tim Lloyd : And I'm a member of the governance committee of SeamlessAccess.org and a co-chair of the outreach committee; I'll talk a little bit about that later.
  2754.  
  2755. 689
  2756. 02:46:32.130 --> 02:46:39.090
2757. Tim Lloyd : And I've spent a lot of my life managing both sides of resource access, so this is a very comfortable seat for me, and I'm delighted to be here.
  2758.  
  2759. 690
  2760. 02:46:40.860 --> 02:46:48.720
2761. Tim Lloyd : I'm going to talk about four topics today in my talk. First, I'm going to review how federated authentication works, using a simple analogy.
  2762.  
  2763. 691
  2764. 02:46:49.350 --> 02:46:55.710
2765. Tim Lloyd : Apologies if you've already seen me present this before, but it's just really hard to talk about federated authentication without
2766.  
2767. 692
2768. 02:46:56.340 --> 02:47:01.020
2769. Tim Lloyd : some base level of knowledge about how it actually works, and this is just the quickest and easiest way to do that.
  2770.  
  2771. 693
  2772. 02:47:01.740 --> 02:47:09.120
2773. Tim Lloyd : Second, I'm going to talk about the basics of how identity is managed in federated authentication, which ties in well to some of the comments that Linda made earlier.
  2774.  
  2775. 694
  2776. 02:47:10.050 --> 02:47:15.600
2777. Tim Lloyd : Third, I'll briefly talk about the SeamlessAccess project and how it relates to the security of federated authentication.
  2778.  
  2779. 695
  2780. 02:47:16.590 --> 02:47:25.560
2781. Tim Lloyd : And then finally, I'm going to compare the security of federated authentication to the most commonly used method to authenticate access to scholarly resources, which is IP authentication.
  2782.  
  2783. 696
  2784. 02:47:28.230 --> 02:47:38.430
2785. Tim Lloyd : So, federated authentication. A popular misconception about it is that this is really just the same as the single sign-on you use on your phone, and it's really not the same.
  2786.  
  2787. 697
  2788. 02:47:38.910 --> 02:47:45.540
2789. Tim Lloyd : It's an extension of single sign-on, designed to allow users to use their organizational credentials to authenticate access
  2790.  
  2791. 698
  2792. 02:47:46.110 --> 02:47:53.490
2793. Tim Lloyd : to a wide variety of online resources that are provided by third parties outside their organization. Sounds pretty similar, so it's easy to see why.
  2794.  
  2795. 699
  2796. 02:47:54.240 --> 02:48:06.360
2797. Tim Lloyd : If you're unfamiliar with the term federated authentication, you may recognize the name Shibboleth instead. Shibboleth is open source software commonly used to implement federated authentication in research and education institutions.
  2798.  
  2799. 700
  2800. 02:48:07.740 --> 02:48:08.040
  2801. Tim Lloyd : So,
  2802.  
  2803. 701
  2804. 02:48:09.720 --> 02:48:27.930
2805. Tim Lloyd : A simple analogy is: Bob runs a conference booth. He provides books to anyone who studies at a subscribing institution. Amy comes up to the booth and says, hi, can I have a book? Bob asks her if she's at a subscribing institution. Amy says, yep, I'm a student at ABC College.
  2806.  
  2807. 702
  2808. 02:48:29.070 --> 02:48:41.670
2809. Tim Lloyd : However, Bob doesn't know Amy, so he needs to verify she's actually registered with ABC College. Luckily, he has a phone book where he can look up someone who can help him; in the case of ABC College, the person to talk to is Carol.
  2810.  
  2811. 703
  2812. 02:48:42.900 --> 02:48:49.740
2813. Tim Lloyd : Bob calls Carol and asks if she can confirm that the person at his booth is a student at ABC College.
  2814.  
  2815. 704
  2816. 02:48:51.270 --> 02:48:54.810
2817. Tim Lloyd : Carol asks Bob to pass the phone to the student so she can talk to her directly.
  2818.  
  2819. 705
  2820. 02:48:57.210 --> 02:49:01.650
2821. Tim Lloyd : Carol talks to Amy and is able to confirm that she's a valid student at ABC College.
  2822.  
  2823. 706
  2824. 02:49:02.790 --> 02:49:08.850
2825. Tim Lloyd : And then Amy passes the phone back to Bob, so Carol can tell him directly: yep, the student in front of you is at ABC College.
  2826.  
  2827. 707
  2828. 02:49:10.500 --> 02:49:16.410
2829. Tim Lloyd : Now, Bob would ideally like to know the student's name so he can learn more about her interests and recommend other books in future.
  2830.  
  2831. 708
  2832. 02:49:18.090 --> 02:49:25.200
2833. Tim Lloyd : But ABC College's policy is not to release student names, and so Carol can't provide Bob with any additional information on the student.
  2834.  
  2835. 709
  2836. 02:49:27.270 --> 02:49:31.080
2837. Tim Lloyd : So, okay, Bob has verified that the student in front of him is at ABC College.
  2838.  
  2839. 710
  2840. 02:49:32.280 --> 02:49:45.660
2841. Tim Lloyd : He gives her a book, and she also gets a very bright green badge to wear that says "I'm with ABC College". When Amy visits other booths and they see that badge, it'll save some time; she won't need to tell every booth which institution she studies at.
  2842.  
  2843. 711
  2844. 02:49:47.340 --> 02:49:51.660
2845. Tim Lloyd : So this simple scenario is pretty close to how federated authentication works at a high level.
  2846.  
  2847. 712
  2848. 02:49:52.980 --> 02:50:03.930
2849. Tim Lloyd : So we've got Bob as a service provider, or SP as it's sometimes referred to, who needs to check a visitor's institutional affiliation before providing access to services.
  2850.  
  2851. 713
  2852. 02:50:05.730 --> 02:50:21.720
2853. Tim Lloyd : The phone book he consulted is an identity federation: a trusted list that details how to talk to a set of vetted institutions and vendors. Examples of identity federations in higher education include InCommon in the United States and the UK Access Management Federation.
  2854.  
  2855. 714
  2856. 02:50:23.430 --> 02:50:32.370
2857. Tim Lloyd : Carol is the identity provider, or IdP. That's the institution's federated authentication service that confirms a user's identity.
  2858.  
  2859. 715
  2860. 02:50:33.570 --> 02:50:45.630
2861. Tim Lloyd : And while in this simple analogy everyone is speaking English, in reality Bob, Carol, and the federation are communicating using a language called Security Assertion Markup Language, or SAML for short.
  2862.  
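To make the SAML exchange Tim describes concrete, here is a minimal sketch of what an attribute statement from an identity provider might look like, and how a service provider could read it. The entity IDs and attribute values are invented for illustration; real assertions are signed and carry much more metadata.

```python
# Hypothetical fragment of a SAML attribute statement, as an IdP like
# "Carol" might return it after authenticating a user. Values are made up.
import xml.etree.ElementTree as ET

SAML_NS = "urn:oasis:names:tc:SAML:2.0:assertion"

assertion = """<saml:AttributeStatement xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">
  <saml:Attribute Name="urn:oid:1.3.6.1.4.1.5923.1.1.1.9"
                  FriendlyName="eduPersonScopedAffiliation">
    <saml:AttributeValue>member@abc-college.example.edu</saml:AttributeValue>
  </saml:Attribute>
</saml:AttributeStatement>"""

def scoped_affiliations(xml_text):
    """Return the affiliation values asserted by the identity provider."""
    root = ET.fromstring(xml_text)
    ns = {"saml": SAML_NS}
    return [v.text for v in root.findall(".//saml:AttributeValue", ns)]

print(scoped_affiliations(assertion))  # ['member@abc-college.example.edu']
```

Note that the service provider only learns "this person is a member of ABC College", matching the badge in the analogy.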
  2863. 716
  2864. 02:50:48.120 --> 02:51:02.640
2865. Tim Lloyd : And that green badge is an improvement to federated authentication that enables Amy to avoid having to tell every service provider she visits what institution she's from, and it's enabled by this new initiative called SeamlessAccess, which I'll talk about briefly later.
  2866.  
  2867. 717
  2868. 02:51:04.260 --> 02:51:13.590
2869. Tim Lloyd : So it's important to note that Carol, as the identity provider, was in control of Amy's identity; she opted not to share any information about Amy with Bob, such as her name.
  2870.  
  2871. 718
  2872. 02:51:14.280 --> 02:51:24.030
2873. Tim Lloyd : All Bob got was confirmation that Amy is definitely affiliated with ABC College, and because Bob trusts the phone book, he trusts that Carol is the right person to confirm that.
  2874.  
  2875. 719
  2876. 02:51:24.780 --> 02:51:37.200
2877. Tim Lloyd : So in federated authentication, identity providers control user identities by deciding whether or not to share extra information, known as attributes, with the service provider. In this example, no attributes were shared.
  2878.  
  2879. 720
  2880. 02:51:38.190 --> 02:51:47.100
2881. Tim Lloyd : So that's a very quick introduction to service providers, identity providers, federations, and attributes, and I'm going to talk a bit more about attributes now.
  2882.  
  2883. 721
  2884. 02:51:52.110 --> 02:51:56.580
2885. Tim Lloyd : Attributes: so that's the term used to describe the data about an authenticated user.
  2886.  
  2887. 722
  2888. 02:51:57.120 --> 02:52:09.360
2889. Tim Lloyd : And attribute release is the process by which that data is shared by an identity provider, such as a research and education institution, with the service provider, such as the publisher, as part of the authentication process.
  2890.  
  2891. 723
  2892. 02:52:10.260 --> 02:52:17.790
2893. Tim Lloyd : The form it actually takes depends on the underlying technology. So, for example, SAML is the technology that underpins Shibboleth and OpenAthens.
  2894.  
  2895. 724
  2896. 02:52:18.270 --> 02:52:26.610
2897. Tim Lloyd : But there are other technologies that support federated authentication. An example is OpenID Connect, which is used by consumer-facing services like Facebook and Google.
  2898.  
  2899. 725
  2900. 02:52:28.320 --> 02:52:35.400
2901. Tim Lloyd : So some key things to understand about attributes in federated authentication: firstly, attribute release is not required.
  2902.  
  2903. 726
  2904. 02:52:36.060 --> 02:52:44.880
2905. Tim Lloyd : So an identity provider can simply assert that a user is an authorized member of their organization and do nothing more, just as happened in the simple analogy I showed earlier.
  2906.  
  2907. 727
  2908. 02:52:45.690 --> 02:52:50.310
2909. Tim Lloyd : So in this case, the identity provider would just share an anonymous assertion identifier.
  2910.  
  2911. 728
  2912. 02:52:50.850 --> 02:53:05.490
2913. Tim Lloyd : That's the technical name; it'll be associated by the service provider with the authentication response, and you can see an example of one on that slide. It's uniquely generated for each authentication, contains no personally identifiable information, and ensures that user privacy is preserved.
  2914.  
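An anonymous assertion identifier of this kind is, in essence, just a fresh random opaque string minted for each authentication. A minimal sketch of that behaviour, using Python's standard library (the "_" prefix and length are illustrative conventions, not a SAML requirement):

```python
# Mint a one-time, anonymous identifier per authentication response.
# It carries no personally identifiable information and is never reused.
import secrets

def new_transient_id():
    """Return a fresh opaque identifier for a single authentication."""
    return "_" + secrets.token_hex(20)  # underscore + 40 hex characters

id_one = new_transient_id()
id_two = new_transient_id()
assert id_one != id_two  # a fresh value for every authentication
```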
  2915. 729
  2916. 02:53:07.680 --> 02:53:12.630
2917. Tim Lloyd : So here are some examples of the types of attributes that can be passed as a result of a successful user authentication.
  2918.  
  2919. 730
  2920. 02:53:13.320 --> 02:53:28.410
2921. Tim Lloyd : So first off, we've got affiliation attributes. This defines the organizational association between the user and their home institution. It could be through employment, membership, or enrollment in an educational program; for example, the user is a faculty member.
  2922.  
  2923. 731
  2924. 02:53:29.820 --> 02:53:41.220
2925. Tim Lloyd : The next one is an entitlement attribute that confirms the user's right to access a given resource, based on criteria previously agreed with the service provider. So that might be a URL for a licensing contract.
  2926.  
  2927. 732
  2928. 02:53:43.230 --> 02:53:49.350
2929. Tim Lloyd : Next, a pseudonymous identifier can be shared. So that's unique to each person and to each service provider.
  2930.  
  2931. 733
  2932. 02:53:50.010 --> 02:54:00.000
2933. Tim Lloyd : So it masks the user's true identity (no personally identifiable information, just a long alphanumeric string), but it does enable that user to be recognized by the same service provider the next time they visit.
  2934.  
  2935. 734
  2936. 02:54:01.140 --> 02:54:13.830
2937. Tim Lloyd : But it can't be used to track a pattern of usage across service providers. So this is a means to personalize a user's experience, and we'll come back to this later. And then there are obviously also personally identifiable attributes, such as name and email address.
  2938.  
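The pairwise property just described (stable at one service provider, unlinkable across providers) can be sketched with a keyed hash over the internal user ID and the service provider's entity ID. This mirrors the spirit of SAML pairwise identifiers such as eduPersonTargetedID; the salt, user ID, and entity IDs below are all hypothetical:

```python
# Pairwise pseudonymous identifier: stable for one user at one SP,
# different at every other SP, so usage can't be correlated across SPs.
import hashlib
import hmac

IDP_SECRET = b"idp-private-salt"  # hypothetical; never leaves the IdP

def pseudonymous_id(internal_user_id, sp_entity_id):
    """Derive an opaque, per-SP identifier; reveals no PII to the SP."""
    msg = f"{internal_user_id}|{sp_entity_id}".encode()
    return hmac.new(IDP_SECRET, msg, hashlib.sha256).hexdigest()

# Same user, same SP: stable across visits, enabling personalization.
assert pseudonymous_id("amy123", "https://sp.example.org") == \
       pseudonymous_id("amy123", "https://sp.example.org")
# Same user, different SP: unlinkable identifiers.
assert pseudonymous_id("amy123", "https://sp.example.org") != \
       pseudonymous_id("amy123", "https://other-sp.example.com")
```

Because the derivation secret stays at the identity provider, service providers cannot reverse the string or compare notes to build a cross-site profile.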
  2939. 735
  2940. 02:54:16.440 --> 02:54:23.400
2941. Tim Lloyd : So attributes are really important, the whole crux of this, because they give both sides of the authentication transaction greater control.
  2942.  
  2943. 736
  2944. 02:54:24.180 --> 02:54:34.260
2945. Tim Lloyd : That control can be valuable in a variety of different ways. For example, access control: an institution can choose to make a resource available only to users who are,
  2946.  
  2947. 737
  2948. 02:54:34.710 --> 02:54:42.000
2949. Tim Lloyd : say, full-time staff and students, and prevent, say, alumni or contractors from access; attributes could enable that.
  2950.  
  2951. 738
  2952. 02:54:42.510 --> 02:54:48.180
2953. Tim Lloyd : Or cost control: a library could limit resource access to users with a certain role or from a certain department.
  2954.  
  2955. 739
  2956. 02:54:49.140 --> 02:55:00.720
2957. Tim Lloyd : And then risk control: a pseudonymous ID allows users to benefit from personalization without exposing them to the risks and hassle of separately registering yet another username and password.
  2958.  
  2959. 740
  2960. 02:55:01.830 --> 02:55:13.410
2961. Tim Lloyd : If a service provider can recognize a returning pseudonymous ID, they can personalize that user's experience accordingly. They don't receive personally identifiable information, they don't need to store an email address, and they don't need to ask for a password.
  2962.  
  2963. 741
  2964. 02:55:16.950 --> 02:55:27.180
2965. Tim Lloyd : So how does it work? Attribute release only happens after a user is authenticated. A service provider or publisher can't pull the attributes; they only receive what the identity provider chooses to send.
  2966.  
  2967. 742
  2968. 02:55:28.410 --> 02:55:38.220
2969. Tim Lloyd : Attribute release is configured on an identity provider by the institution, typically by category of service provider, though they can do it for each individual service provider; that's a lot more effort.
  2970.  
  2971. 743
  2972. 02:55:38.970 --> 02:55:45.810
2973. Tim Lloyd : And library access is just one of a number of valuable use cases for federated authentication. Others would be research collaborations,
  2974.  
  2975. 744
  2976. 02:55:46.290 --> 02:55:53.340
2977. Tim Lloyd : where you've got researchers collaborating across different institutions, who might typically share more personal data, such as names and email addresses.
  2978.  
  2979. 745
  2980. 02:55:54.000 --> 02:56:09.930
2981. Tim Lloyd : Or institutional workflows, which might require users to confirm their institutional affiliation with a third party to access some sort of service; that might be, say, faculty authorizing the use of institutional funds to pay an open access article publishing charge.
  2982.  
  2983. 746
  2984. 02:56:11.460 --> 02:56:18.540
2985. Tim Lloyd : Because the identity provider is in control, any special needs for attributes must be agreed in advance so that the attribute release
  2986.  
  2987. 747
  2988. 02:56:18.930 --> 02:56:25.590
2989. Tim Lloyd : can be configured appropriately. A service provider or publisher can't just decide afterwards, you know what, I think we want email addresses; it's not going to happen.
  2990.  
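The "the SP only receives what the IdP is configured to send" rule can be pictured as a simple release table keyed by service provider. This is a toy model, not a real Shibboleth configuration; the entity IDs and attribute names are invented:

```python
# Toy attribute-release policy: the institution configures, per service
# provider, which attributes may be sent. The SP can't "pull" anything
# else; an unconfigured SP gets nothing beyond the authentication itself.
ATTRIBUTE_RELEASE = {
    "https://ebooks.example.com/sp": {"eduPersonScopedAffiliation"},
    # Email release for this SP would have been agreed in advance:
    "https://cme.example.org/sp": {"eduPersonScopedAffiliation", "mail"},
}

def respond(sp_entity_id, user):
    """Filter the user's directory record down to the released attributes."""
    allowed = ATTRIBUTE_RELEASE.get(sp_entity_id, set())
    return {k: v for k, v in user.items() if k in allowed}

user = {
    "mail": "amy@abc.example.edu",
    "displayName": "Amy",
    "eduPersonScopedAffiliation": "student@abc.example.edu",
}

# The ebooks platform never sees the name or email address:
assert respond("https://ebooks.example.com/sp", user) == {
    "eduPersonScopedAffiliation": "student@abc.example.edu"}
# An SP that was never configured receives no attributes at all:
assert respond("https://unknown.example.net/sp", user) == {}
```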
  2991. 748
  2992. 02:56:28.650 --> 02:56:40.320
2993. Tim Lloyd : So let's look at how the use of attributes translates into the real world. These are some example publishing use cases, just to ground those concepts in how they play out in reality.
  2994.  
  2995. 749
  2996. 02:56:40.950 --> 02:56:46.980
2997. Tim Lloyd : So in the first scenario, we've got users accessing full-text articles on a platform where there's no option for personalization.
  2998.  
  2999. 750
  3000. 02:56:47.580 --> 02:56:55.920
3001. Tim Lloyd : They just need to confirm they're members of the organization. So in this case, all the vendor needs is this anonymous token, the anonymous assertion identifier; that will be fine.
  3002.  
  3003. 751
  3004. 02:56:57.030 --> 02:57:02.460
3005. Tim Lloyd : In scenario two, users can get content recommendations in the user interface based on their prior search history.
  3006.  
  3007. 752
  3008. 02:57:02.880 --> 02:57:11.310
3009. Tim Lloyd : So, to enable that, the vendor will need to recognize them when they return; a pseudonymous identifier would do this. And again, there's no personally identifiable information being transmitted.
  3010.  
  3011. 753
  3012. 02:57:12.630 --> 02:57:21.750
3013. Tim Lloyd : In scenario three, we've got a special feature that's only available to certain users: in this case, the ability to tap into prepaid funds to buy, say, eBooks for a department.
  3014.  
  3015. 754
  3016. 02:57:22.560 --> 02:57:31.140
3017. Tim Lloyd : The library doesn't want everyone to be able to do this, especially students. So in this case, an attribute for a user's role could be used in addition to a pseudonymous ID.
  3018.  
  3019. 755
  3020. 02:57:32.580 --> 02:57:37.920
3021. Tim Lloyd : And finally, scenario four has clinicians doing online training, earning continuing education credits.
  3022.  
  3023. 756
  3024. 02:57:38.400 --> 02:57:49.860
3025. Tim Lloyd : They need to receive a certificate by email and have the accreditation officially associated with them. So in this case, user consent would be sought to pass an email address, in addition to the pseudonymous ID.
  3026.  
  3027. 757
  3028. 02:57:53.190 --> 02:58:02.340
3029. Tim Lloyd : So we've talked a little bit about the basics of federated authentication and how identity is managed there; let's briefly segue into SeamlessAccess.
  3030.  
  3031. 758
  3032. 02:58:03.240 --> 02:58:11.070
3033. Tim Lloyd : So SeamlessAccess grew out of a project some of you may have heard of, called Resource Access in the 21st Century, or RA21 for short.
  3034.  
  3035. 759
  3036. 02:58:11.580 --> 02:58:21.810
3037. Tim Lloyd : It was initiated in 2016, initially to explore the challenge of remote access for users, with stakeholders from the publishing, library, software, and identity communities.
  3038.  
  3039. 760
  3040. 02:58:22.440 --> 02:58:32.940
3041. Tim Lloyd : It took input from 60 organizations over three years, and it identified federated authentication as holding the most promise for a robust and scalable solution for remote access to scholarly content.
  3042.  
  3043. 761
  3044. 02:58:33.750 --> 02:58:45.300
3045. Tim Lloyd : It investigated barriers to take-up, developed some ideas of best practices, piloted typical approaches, and its conclusions were published as a NISO Recommended Practice in June 2019, last year.
  3046.  
  3047. 762
  3048. 02:58:46.140 --> 02:58:55.320
3049. Tim Lloyd : That then led to SeamlessAccess, which was created in July 2019 as a community-driven effort to enable seamless access to information resources,
  3050.  
  3051. 763
  3052. 02:58:55.770 --> 02:59:03.750
3053. Tim Lloyd : scholarly collaboration tools, and shared research infrastructure. So the goals are broader than just library access, but I'm really focusing on that for this talk.
  3054.  
  3055. 764
  3056. 02:59:04.590 --> 02:59:12.360
3057. Tim Lloyd : It's a coalition of four organizations. So we've got NISO, the National Information Standards Organization; STM, the International Association of STM Publishers;
  3058.  
  3059. 765
  3060. 02:59:12.990 --> 02:59:21.360
3061. Tim Lloyd : Internet2, which is the US research and education network; it also operates the US identity federation InCommon, amongst many other activities;
  3062.  
  3063. 766
  3064. 02:59:21.930 --> 02:59:33.570
3065. Tim Lloyd : and GÉANT, which is a European research and education network that operates a service called eduGAIN, which some of you might have heard of, that connects about 60 or so research and education identity federations around the world.
  3066.  
  3067. 767
  3068. 02:59:35.280 --> 02:59:43.980
3069. Tim Lloyd : So think of SeamlessAccess as the operational successor to the RA21 project, delivering an operational service plus best practices and standards.
  3070.  
  3071. 768
  3072. 02:59:45.540 --> 02:59:54.270
3073. Tim Lloyd : So why SeamlessAccess? Why do we need this? Simply because access friction deters usage. We've heard this from several speakers already; you know, we're all very aware
  3074.  
  3075. 769
  3076. 02:59:54.960 --> 03:00:03.120
3077. Tim Lloyd : that when you develop scholarly resources and you're trying to make them available to users, ease of access is critical; put barriers in people's way and they go to alternative places.
  3078.  
  3079. 770
  3080. 03:00:04.020 --> 03:00:11.190
3081. Tim Lloyd : And by access friction, I mean the extra effort required to navigate the access barriers that we put in front of paywalled scholarly resources.
  3082.  
  3083. 771
  3084. 03:00:12.660 --> 03:00:21.810
3085. Tim Lloyd : So, you know, here are some examples, not intended to name and shame anyone; I mean, pretty much all publishers have interfaces that are just very, very inconsistent.
  3086.  
  3087. 772
  3088. 03:00:22.530 --> 03:00:29.010
3089. Tim Lloyd : Federated authentication currently generates friction because of the need to identify a user's institutional affiliation.
  3090.  
  3091. 773
  3092. 03:00:29.610 --> 03:00:37.350
3093. Tim Lloyd : Traditionally that's been done by having them select their organization from a list, and the problem is that there's little to no consistency in the way publishers do this.
  3094.  
  3095. 774
  3096. 03:00:37.920 --> 03:00:44.430
3097. Tim Lloyd : So you have different visual signposts: am I clicking on login, or sign in, or access PDF, or access full text?
  3098.  
  3099. 775
  3100. 03:00:45.000 --> 03:00:50.940
3101. Tim Lloyd : There are different user journeys, different visual layouts for institutional access, and different terminology.
  3102.  
  3103. 776
  3104. 03:00:51.330 --> 03:01:02.430
3105. Tim Lloyd : You know, there's lots of jargon in this business: Shibboleth, federated authentication, OpenAthens, UK Access Management Federation, IdP, discovery service. Most of that means nothing to users.
  3106.  
  3107. 777
  3108. 03:01:03.330 --> 03:01:10.740
3109. Tim Lloyd : And then if you add back all the other authentication methods that publishers allow, you can really easily get a confusing array of choices for users.
  3110.  
  3111. 778
  3112. 03:01:11.970 --> 03:01:18.240
3113. Tim Lloyd : So SeamlessAccess addresses this in three ways. Firstly, it has a standard visual cue
  3114.  
  3115. 779
  3116. 03:01:18.690 --> 03:01:25.680
3117. Tim Lloyd : for how a user accesses resources requiring institutional affiliation, and these two screenshots both have the same
  3118.  
  3119. 780
  3120. 03:01:26.040 --> 03:01:40.680
3121. Tim Lloyd : SeamlessAccess button in them. So this button either displays the generic "Access through your institution" message, to prompt you to select your institution, or a custom message listing your most recent institutional choice, and users always have the option to change that.
  3122.  
  3123. 781
  3124. 03:01:41.880 --> 03:01:50.070
3125. Tim Lloyd : Secondly, SeamlessAccess offers a standard method for finding your institution. If you think of the screenshots I showed, everyone's just coming up with different interface
  3126.  
  3127. 782
  3128. 03:01:50.940 --> 03:01:58.380
3129. Tim Lloyd : flows and looks and feels for doing this. So it offers a standard way that features best-practice design: dynamic search results as you type,
  3130.  
  3131. 783
  3132. 03:01:58.710 --> 03:02:08.460
3133. Tim Lloyd : tolerance for spelling and acronyms, institutional logos to support selection. In technical terms, this is called an identity provider discovery service: how you discover
  3134.  
  3135. 784
  3136. 03:02:08.940 --> 03:02:19.350
3137. Tim Lloyd : which organization you're signing in through. And thirdly, and most powerfully, SeamlessAccess stores your institutional choice on your computer, in local browser storage.
  3138.  
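The type-ahead search a discovery service performs can be sketched as a simple filter that matches what the user has typed against both institution names and acronyms. The institution list below is invented; a real discovery service works against federation metadata:

```python
# Minimal sketch of discovery-service type-ahead: match the user's
# partial input against institution names and acronyms alike.
INSTITUTIONS = [
    {"name": "ABC College", "acronym": "ABC"},
    {"name": "University of Example", "acronym": "UoE"},
    {"name": "Example State University", "acronym": "ESU"},
]

def search(query):
    """Return institution names matching the query as the user types."""
    q = query.strip().lower()
    return [i["name"] for i in INSTITUTIONS
            if q and (q in i["name"].lower() or q in i["acronym"].lower())]

print(search("esu"))      # ['Example State University']
print(search("example"))  # ['University of Example', 'Example State University']
```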
  3139. 785
  3140. 03:02:20.310 --> 03:02:26.820
3141. Tim Lloyd : So this information can only be accessed by applications coming from the seamlessaccess.org domain; it's not stored remotely anywhere.
  3142.  
  3143. 786
  3144. 03:02:27.360 --> 03:02:39.270
3145. Tim Lloyd : You can opt out, which in practice just means you make your choice every single time. And there's nothing personally identifiable; it just says that the last time you logged in using federated authentication, you logged in via a particular institution.
  3146.  
  3147. 787
  3148. 03:02:42.930 --> 03:02:44.820
  3149. Tim Lloyd : There's also a couple of other important things
  3150.  
  3151. 788
  3152. 03:02:45.870 --> 03:02:52.710
3153. Tim Lloyd : that SeamlessAccess is doing. One is working on some important best practices to simplify access via federated authentication.
  3154.  
  3155. 789
  3156. 03:02:53.010 --> 03:03:00.870
3157. Tim Lloyd : So the first is the development of standardized entity categories and associated attribute release bundles. I know it sounds like jargon,
  3158.  
  3159. 790
  3160. 03:03:01.290 --> 03:03:07.920
3161. Tim Lloyd : but you may recall a few slides ago I talked about the fact that libraries can configure attribute release, what data to share about users,
  3162.  
  3163. 791
  3164. 03:03:08.430 --> 03:03:17.910
3165. Tim Lloyd : at the category level. It's sort of like managing your Outlook contacts: rather than going through each one, you can just say, all of these are personal, all of these are work, and treat them that way.
  3166.  
  3167. 792
  3168. 03:03:19.380 --> 03:03:21.750
3169. Tim Lloyd : But there's no standardization for what these categories are.
  3170.  
  3171. 793
  3172. 03:03:23.160 --> 03:03:30.720
3173. Tim Lloyd : So if you're a library, this makes it much more complex and more prone to error. To address this, SeamlessAccess built three standard
  3174.  
  3175. 794
  3176. 03:03:31.320 --> 03:03:39.090
3177. Tim Lloyd : entity categories, with the help of a cross-industry working group. So those categories are, firstly, authentication only.
  3178.  
  3179. 795
  3180. 03:03:39.540 --> 03:03:49.530
3181. Tim Lloyd : So like that first scenario I talked about a few slides ago: it would be used by a service provider that doesn't need any user attributes at all, just confirmation of the organizational affiliation.
  3182.  
  3183. 796
  3184. 03:03:50.940 --> 03:04:03.990
3185. Tim Lloyd : The next one's called anonymous authorization. This would be used when the service provider needs to filter access based on the user's affiliation or entitlements, so it would be an anonymous identifier plus something; it might be, you're a faculty member.
  3186.  
  3187. 797
  3188. 03:04:05.130 --> 03:04:18.150
3189. Tim Lloyd : And then the pseudonymous authorization category would be used when a service provider needs to personalize the service, and it will also allow for additional entitlement or affiliation data, so you can provide more control over access.
  3190.  
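The three proposed categories can be summarized as attribute bundles of increasing richness. The bundle contents below are a simplification for illustration, not the normative SeamlessAccess definitions:

```python
# Sketch of the three proposed entity categories as attribute bundles.
# "authentication-only" releases nothing beyond the assertion itself.
ENTITY_CATEGORIES = {
    "authentication-only": [],  # just "this user is one of ours"
    "anonymous-authorization": ["affiliation-or-entitlement"],
    "pseudonymous-authorization": ["pairwise-pseudonymous-id",
                                   "affiliation-or-entitlement"],
}

def bundle_for(category):
    """Return the illustrative attribute bundle for an entity category."""
    return ENTITY_CATEGORIES[category]

assert bundle_for("authentication-only") == []
assert "pairwise-pseudonymous-id" in bundle_for("pseudonymous-authorization")
```

Standardizing on a small set like this is what lets a library configure release once per category instead of once per vendor.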
  3191. 798
  3192. 03:04:19.620 --> 03:04:30.450
3193. Tim Lloyd : The second important development is contract language templates for library use, based on these proposed entity categories. This gives libraries a mechanism to ensure attribute release compliance.
  3194.  
  3195. 799
  3196. 03:04:30.930 --> 03:04:42.180
3197. Tim Lloyd : And just to note, there's nothing here which stops libraries sharing more data. The key point is that that should be a conversation between the library and the service provider, and it should be reflected in the contract conditions.
  3198.  
  3199. 800
  3200. 03:04:48.600 --> 03:04:58.860
3201. Tim Lloyd : So what about security and privacy? SeamlessAccess has adopted the GÉANT Data Protection Code of Conduct; you might recall GÉANT is the European research and education provider I mentioned earlier.
  3202.  
  3203. 801
  3204. 03:04:59.610 --> 03:05:06.840
3205. Tim Lloyd : So this code of conduct provides specific guidance to service providers on how they should handle personal data in the context of federated authentication.
  3206.  
  3207. 802
  3208. 03:05:07.470 --> 03:05:15.330
3209. Tim Lloyd : It covers four principles; I won't bother reading them out, but in a nutshell, in plain English, what this means is: you should only use attributes that are necessary for access,
  3210.  
  3211. 803
  3212. 03:05:15.840 --> 03:05:22.320
3213. Tim Lloyd : you should use as little data as possible wherever possible, you should not do anything but provide access with this data,
  3214.  
  3215. 804
  3216. 03:05:22.860 --> 03:05:33.750
3217. Tim Lloyd : and you should delete or anonymize it when it's no longer needed. It is a remarkably readable document; feel free to Google the GÉANT Data Protection Code of Conduct. It's only a few pages, and it's
  3218.  
  3219. 805
  3220. 03:05:34.500 --> 03:05:44.580
3221. Tim Lloyd : a joy to read when you're going through a lot of technical documents; it's a great one. It also aligns very closely with the American Library Association's library privacy guidelines that are found in its code of ethics.
  3222.  
  3223. 806
  3224. 03:05:46.020 --> 03:05:48.090
3225. Tim Lloyd : So I'm going to pivot now back to
  3226.  
  3227. 807
  3228. 03:05:49.260 --> 03:05:54.360
  3229. Tim Lloyd : The aim of my presentation, which was to talk about security in the context of federated authentication.
  3230.  
  3231. 808
  3232. 03:05:55.980 --> 03:06:06.450
3233. Tim Lloyd : So as I said up front, I'm going to compare it to IP authentication, and I'm going to look at two aspects of security that are of particular concern; we've touched on both of these today.
  3234.  
  3235. 809
  3236. 03:06:07.110 --> 03:06:14.160
3237. Tim Lloyd : So the first one is the security of access: how can we be sure that a person accessing a scholarly resource is properly authorized?
  3238.  
  3239. 810
  3240. 03:06:15.540 --> 03:06:21.450
3241. Tim Lloyd : The second is the security of identity: how can we be sure that the user's privacy is properly safeguarded?
  3242.  
  3243. 811
  3244. 03:06:22.470 --> 03:06:28.590
3245. Tim Lloyd : I'm not going to talk to the wealth of security issues that arise within applications after the user is authenticated.
  3246.  
  3247. 812
  3248. 03:06:30.750 --> 03:06:34.650
  3249. Tim Lloyd : So let's start with security of access and IP authentication.
  3250.  
  3251. 813
  3252. 03:06:36.270 --> 03:06:39.060
  3253. Tim Lloyd : So IP addresses are actually quite hard to fake
  3254.  
  3255. 814
  3256. 03:06:39.480 --> 03:06:50.460
3257. Tim Lloyd : They're built into the fabric of how the internet works, right, IP, or Internet Protocol. The complexity arises from the fact that users don't have IP addresses, obviously; it's the devices and networks that they use
  3258.  
  3259. 815
  3260. 03:06:50.850 --> 03:06:58.320
3261. Tim Lloyd : that provide the IP address that a publisher ultimately sees. A simple analogy is the layers of an onion: the user sits at the core,
  3262.  
  3263. 816
  3264. 03:06:59.040 --> 03:07:04.620
3265. Tim Lloyd : but interacts with a variety of devices that are each assigned IP addresses, starting with the device they're using,
  3266.  
  3267. 817
  3268. 03:07:05.220 --> 03:07:08.040
  3269. Tim Lloyd : Then devices that manage access on their local or home network.
  3270.  
  3271. 818
  3272. 03:07:08.610 --> 03:07:17.730
3273. Tim Lloyd : If they're accessing remotely, they'll likely be using some form of proxy service that presents yet another IP address, whether that's a VPN or a web proxy service like EZproxy.
  3274.  
  3275. 819
  3276. 03:07:18.660 --> 03:07:23.370
3277. Tim Lloyd : So the first security concern is how easily a user can access via a registered IP address.
  3278.  
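The IP-authentication model under discussion amounts to the publisher holding registered address ranges per institution and checking each request against them. A minimal sketch using Python's standard library; the institution and ranges are example/documentation addresses, not real customer data:

```python
# Sketch of publisher-side IP authentication: look up which institution,
# if any, a requesting address falls under. Note the model's weakness:
# a match identifies an institution, never an individual user.
import ipaddress

REGISTERED_RANGES = {
    "ABC College": ["192.0.2.0/24", "198.51.100.0/25"],
}

def institution_for(ip):
    """Return the institution whose registered ranges contain ip, or None."""
    addr = ipaddress.ip_address(ip)
    for inst, ranges in REGISTERED_RANGES.items():
        if any(addr in ipaddress.ip_network(r) for r in ranges):
            return inst
    return None  # anonymous: no way to tell which user this was

print(institution_for("192.0.2.42"))   # ABC College
print(institution_for("203.0.113.9"))  # None
```

This also illustrates why fraud response is so blunt here: the only lever is disabling a whole range, which shuts out legitimate users too.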
  3279. 820
  3280. 03:07:24.180 --> 03:07:37.680
3281. Tim Lloyd : Best practice would require all users to enter individual credentials before being given access via a registered on-campus IP address or a registered proxy address, and as Linda mentioned earlier, that's exactly what they're doing at her institution.
  3282.  
  3283. 821
  3284. 03:07:39.240 --> 03:07:49.800
  3285. Tim Lloyd : However, there are scenarios where users can access on-campus IP addresses simply through physical presence or with generic credentials, such as a walk-in login, and that creates loopholes that can be exploited.
  3286.  
  3287. 822
  3288. 03:07:51.480 --> 03:07:55.500
  3289. Tim Lloyd : There's also the problem of compromised credentials, which speakers have spoken about a lot.
  3290.  
  3291. 823
  3292. 03:07:56.850 --> 03:08:01.500
  3293. Tim Lloyd : Now, while this risk is equally shared across IP authentication and federated authentication,
  3294.  
  3295. 824
  3296. 03:08:02.010 --> 03:08:09.180
  3297. Tim Lloyd : as Linda mentioned, the challenge is that publishers have very limited options to deal with fraudulent access under IP authentication.
  3298.  
  3299. 825
  3300. 03:08:09.780 --> 03:08:18.990
  3301. Tim Lloyd : All users are anonymous, and so they either have to disable access to the registered IP addresses, shutting out valid users as well, or ask the institutional customer to investigate.
  3302.  
  3303. 826
  3304. 03:08:20.400 --> 03:08:35.220
  3305. Tim Lloyd : And as Corey said at the beginning of my presentation (I normally say days; he said from hours to weeks), it takes truly painstaking analysis to trace the access back from an IP address through a proxy server to a physical computer and back to a specific login.
  3306.  
  3307. 827
  3308. 03:08:36.660 --> 03:08:41.580
  3309. Tim Lloyd : The second security concern is the accuracy of the lists of registered IP addresses held by publishers.
  3310.  
  3311. 828
  3312. 03:08:42.330 --> 03:08:50.460
  3313. Tim Lloyd : And to my mind this whole system is ripe for human error. So PSI, a business that specializes in IP address audits, finds that on average 58%
  3314.  
  3315. 829
  3316. 03:08:51.180 --> 03:08:55.200
  3317. Tim Lloyd : of IP ranges held by publishers to authenticate libraries are inaccurate.
  3318.  
  3319. 830
  3320. 03:08:55.920 --> 03:09:07.830
  3321. Tim Lloyd : Having worked with a publisher, I can testify to the number of problems that arise when IP ranges are manually communicated, with myriad opportunities for error, and it's not surprising when you consider how many people touch this data.
  3322.  
  3323. 831
  3324. 03:09:08.910 --> 03:09:15.540
  3325. Tim Lloyd : So, for example, a change is made by the library, with old IP addresses no longer being used or new ones being added.
  3326.  
  3327. 832
  3328. 03:09:16.230 --> 03:09:26.370
  3329. Tim Lloyd : The library fails to communicate those changes to a publisher; in some cases there's some sort of distributing intermediary in between, such as a purchasing agent, purchasing consortium, or distributor.
  3330.  
  3331. 833
  3332. 03:09:27.000 --> 03:09:30.150
  3333. Tim Lloyd : Or the service provider fails to record those changes in its records.
  3334.  
  3335. 834
  3336. 03:09:30.690 --> 03:09:39.360
  3337. Tim Lloyd : And this process just means that stuff's passing through people's hands all the time, and at each step of the chain there's an opportunity to inaccurately transcribe IP addresses.
  3338.  
  3339. 835
  3340. 03:09:40.110 --> 03:09:46.950
  3341. Tim Lloyd : Throw in IPv6 as a completely new format for these addresses, and it's easy to see how complex this can get.
  3342.  
  3343. 836
  3344. 03:09:47.730 --> 03:09:57.450
  3345. Tim Lloyd : And what makes it particularly pernicious is that the impact can often be really hidden: users turned away because their IP address isn't recognized simply go elsewhere.
  3346.  
  3347. 837
  3348. 03:09:58.560 --> 03:10:04.200
  3349. Tim Lloyd : They don't notify anyone, maybe because they're unaware the library actually provides access, or because it's seen as too much effort.
  3350.  
  3351. 838
  3352. 03:10:04.740 --> 03:10:15.210
  3353. Tim Lloyd : Unauthorized users can get access when they shouldn't. Valid users get access, but their usage is attributed to another library because the data is incorrect, or due to overlapping IP ranges.
  3354.  
  3355. 839
  3356. 03:10:16.050 --> 03:10:23.850
  3357. Tim Lloyd : So there are solutions that make this better: online registries, such as The IP Registry, significantly reduce the level of inaccuracy.
  3358.  
  3359. 840
  3360. 03:10:24.990 --> 03:10:34.710
  3361. Tim Lloyd : But the issue arises from the inherent need to accurately communicate large volumes of dynamic information, so the system's ultimately only as good as the information put into it.
  3362.  
  3363. 841
  3364. 03:10:41.700 --> 03:10:44.520
  3365. Tim Lloyd : So let's look at the security of access under federated authentication.
  3366.  
  3367. 842
  3368. 03:10:45.060 --> 03:10:52.890
  3369. Tim Lloyd : So there's a very different authentication process going on, as we saw earlier. Rather than a publisher trusting a credential passed by the user's computer,
  3370.  
  3371. 843
  3372. 03:10:53.310 --> 03:11:02.340
  3373. Tim Lloyd : such as an IP address, the publisher instead relies on the institutional customer to authenticate users' individual credentials. So to recap the analogy from earlier:
  3374.  
  3375. 844
  3376. 03:11:02.790 --> 03:11:08.760
  3377. Tim Lloyd : The user requests access to paid-for content; the publisher asks the user to confirm their institutional affiliation.
  3378.  
  3379. 845
  3380. 03:11:09.450 --> 03:11:24.960
  3381. Tim Lloyd : The publisher looks up that institution via a trusted federation, which tells them where to send the user to log in; the user is logged in by the institutional identity service, and then that institution confirms back to the publisher whether the user has successfully authenticated or not.
  3382.  
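The four steps just recapped can be sketched as a toy walk-through. Everything here is invented for illustration (the entity ID, URL, and key are placeholders, and real deployments use SAML libraries with signed XML metadata); the point is only the shape of the trust: the federation's "phone book" tells the publisher where to send the user and which key to expect the response to be signed with.

```python
# Hypothetical federation metadata: the trusted "phone book".
federation_metadata = {
    "https://idp.example.edu": {          # institution's entity ID
        "sso_url": "https://idp.example.edu/sso",
        "signing_key": "idp-public-key",  # placeholder for a real certificate
    }
}

def start_login(institution_entity_id):
    """Steps 1-2: user names an institution; publisher looks it up in the
    federation to learn where to redirect the login request."""
    idp = federation_metadata[institution_entity_id]
    return {"redirect_to": idp["sso_url"], "expects_key": idp["signing_key"]}

def accept_response(pending, response):
    """Steps 3-4: the institution authenticates the user and signs a response;
    the publisher checks it against the federation-published key."""
    return (response["signed_with"] == pending["expects_key"]
            and response["authenticated"])

pending = start_login("https://idp.example.edu")
ok = accept_response(pending, {"signed_with": "idp-public-key",
                               "authenticated": True})
print(ok)  # True: publisher can trust the institution's answer
```

A forged response signed with any other key fails the check, which is the "trust fabric" described in the next cues.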
  3383. 846
  3384. 03:11:26.340 --> 03:11:35.010
  3385. Tim Lloyd : Each of the parties involved (the user, their institution, the publisher) knows that the counterparties they're dealing with are the right ones and can trust their responses.
  3386.  
  3387. 847
  3388. 03:11:35.460 --> 03:11:41.550
  3389. Tim Lloyd : So federated authentication has a concept called a trust fabric built into it, which is based around the role of the federation.
  3390.  
  3391. 848
  3392. 03:11:42.210 --> 03:11:49.860
  3393. Tim Lloyd : As you recall from my earlier slide, the federation acts as a trusted phone book, listing the names and contact details of the publishers and institutions involved.
  3394.  
  3395. 849
  3396. 03:11:50.700 --> 03:11:58.110
  3397. Tim Lloyd : So when a user shares their institutional affiliation, the federation confirms to the publisher: here's where you send them for the login.
  3398.  
  3399. 850
  3400. 03:11:58.710 --> 03:12:07.800
  3401. Tim Lloyd : When an institution receives a request to authenticate the user, the federation data enables them to validate the digital signature presented by the publisher as part of that request.
  3402.  
  3403. 851
  3404. 03:12:08.730 --> 03:12:20.760
  3405. Tim Lloyd : When a publisher receives an authentication response from an institution, again, the federation data helps them validate the source of the response and tie it back to the original request, so someone can't just make up a request or a response.
  3406.  
  3407. 852
  3408. 03:12:21.810 --> 03:12:28.020
  3409. Tim Lloyd : So unlike IP authentication, the identity of all the organizations involved is known and validated.
  3410.  
  3411. 853
  3412. 03:12:28.890 --> 03:12:36.810
  3413. Tim Lloyd : But what about the user? How do we know we can trust their credentials? Well, the beauty of this method is that the credentials only exist in one place,
  3414.  
  3415. 854
  3416. 03:12:37.290 --> 03:12:45.300
  3417. Tim Lloyd : under the control of the organization that supplies them. So unlike IP addresses, which can change unpredictably and need to be propagated throughout the scholarly system,
  3418.  
  3419. 855
  3420. 03:12:45.840 --> 03:12:56.070
  3421. Tim Lloyd : the user is always logged in by their own institution. The institution controls those credentials and can easily update them as a user's situation changes: onboarding and offboarding, changes in role.
  3422.  
  3423. 856
  3424. 03:12:57.030 --> 03:13:03.000
  3425. Tim Lloyd : And if the credentials are incorrectly stored, it's a pretty easy thing to correct by the institution concerned.
  3426.  
  3427. 857
  3428. 03:13:04.200 --> 03:13:12.510
  3429. Tim Lloyd : And what about stolen credentials? Well, again, the beauty of federated authentication is that the institution will always know the identity of the user logging in at their end
  3430.  
  3431. 858
  3432. 03:13:13.080 --> 03:13:22.740
  3433. Tim Lloyd : and can delete or reset compromised credentials without impacting other users. And because every SAML authentication has that anonymous assertion identifier that we saw in the slide earlier,
  3434.  
  3435. 859
  3436. 03:13:23.340 --> 03:13:29.010
  3437. Tim Lloyd : this is something that a publisher can quote back to the institution; they don't need to know the identity of the person, but they can say,
  3438.  
  3439. 860
  3440. 03:13:29.430 --> 03:13:37.800
  3441. Tim Lloyd : "We're concerned about this event; can you look into it?" And so it's much easier for the institution to trace that back to a specific login and take whatever action is necessary.
  3442.  
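One common way an institution can mint that kind of anonymous, quote-back-able identifier (similar in spirit to SAML pairwise or targeted IDs, though the talk doesn't specify an algorithm) is a keyed hash of the local username plus the publisher's entity ID. The secret and names below are invented for illustration.

```python
import hashlib
import hmac

# Hypothetical institution-held secret; never leaves the identity provider.
INSTITUTION_SECRET = b"keep-this-private"

def pairwise_id(local_username, publisher_entity_id):
    """Stable opaque ID: the same user always gets the same ID at the same
    publisher (so abuse events can be quoted back and traced), but the
    publisher cannot recover the username, and two publishers get different
    IDs for the same user, so they cannot correlate them."""
    msg = f"{local_username}|{publisher_entity_id}".encode()
    return hmac.new(INSTITUTION_SECRET, msg, hashlib.sha256).hexdigest()[:32]

a = pairwise_id("jsmith", "https://publisher-one.example")
b = pairwise_id("jsmith", "https://publisher-two.example")
print(a == pairwise_id("jsmith", "https://publisher-one.example"))  # True
print(a == b)  # False: per-publisher, so no cross-site correlation
```

The publisher stores and reports only the opaque string; only the institution, holding the secret and the user directory, can map it back to a person.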
  3443. 861
  3444. 03:13:39.540 --> 03:13:46.590
  3445. Tim Lloyd : Now let's step back and consider the security of the user's identity. So IP authentication is inherently anonymous; it's privacy-protecting.
  3446.  
  3447. 862
  3448. 03:13:47.070 --> 03:13:53.490
  3449. Tim Lloyd : Proxy servers make it more so, because they obscure a patron's underlying IP address, which could identify them in certain situations.
  3450.  
  3451. 863
  3452. 03:13:54.000 --> 03:13:58.290
  3453. Tim Lloyd : So if your policy is never to provide personal data under any circumstances, then this fits the bill.
  3454.  
  3455. 864
  3456. 03:13:59.130 --> 03:14:08.400
  3457. Tim Lloyd : But I chose the words "appropriately safeguarded" deliberately in the slide, because it can depend on your circumstances; it can vary by application, by user, by library.
  3458.  
  3459. 865
  3460. 03:14:09.330 --> 03:14:19.590
  3461. Tim Lloyd : Judging by the popularity of mobile devices, most users value some level of personalization, even if that's simply remembering the topics I'm interested in so I don't have to rediscover them every time I use your interface.
  3462.  
  3463. 866
  3464. 03:14:20.370 --> 03:14:26.280
  3465. Tim Lloyd : There are also valid reasons why personal data needs to be shared for some resources, such as the example I gave earlier of accreditation.
  3466.  
  3467. 867
  3468. 03:14:27.480 --> 03:14:39.960
  3469. Tim Lloyd : But by anonymizing access, IP authentication forces users wanting personalization to register directly with service providers, which may actually harm their privacy more than federated authentication.
  3470.  
  3471. 868
  3472. 03:14:40.920 --> 03:14:49.980
  3473. Tim Lloyd : Their options are to reuse a social login, further increasing the exposure of their lives to Facebook and Google, or to store yet more usernames and passwords with third parties.
  3474.  
  3475. 869
  3476. 03:14:50.400 --> 03:15:00.720
  3477. Tim Lloyd : We know from research that most users tend to reuse existing credentials, so this just exposes both their home and work passwords and increases the general security risks around it.
  3478.  
  3479. 870
  3480. 03:15:02.550 --> 03:15:10.710
  3481. Tim Lloyd : In contrast, under federated authentication you have flexibility; it's one of the appeals of the process and of what it offers libraries in managing privacy.
  3482.  
  3483. 871
  3484. 03:15:11.130 --> 03:15:19.860
  3485. Tim Lloyd : So institutions are always in control of the information shared under federated authentication, and in effect they have a sliding scale of privacy they can apply. At one end,
  3486.  
  3487. 872
  3488. 03:15:20.400 --> 03:15:32.190
  3489. Tim Lloyd : they can simply confirm a user's organizational affiliation, providing no other information (completely anonymous), or they can share that affiliation and entitlement information to allow more granular control over the experience.
  3490.  
  3491. 873
  3492. 03:15:33.300 --> 03:15:43.650
  3493. Tim Lloyd : If personalization is needed, you can move the slider further and share a pseudonymous identifier, and finally, in cases where it's really appropriate, you can choose to share personal data such as a name and email address.
  3494.  
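The sliding scale of attribute release just described can be sketched as a small policy table. The attribute names are modeled loosely on common eduPerson-style attributes, and the three tiers are illustrative, not a standard; the institution decides which tier applies to each publisher.

```python
# Hypothetical user record held by the institution's identity provider.
user = {
    "affiliation": "member@example.edu",
    "pairwise_id": "3f9a0c1d",           # anonymous per-publisher identifier
    "name": "J. Smith",
    "mail": "jsmith@example.edu",
}

# Illustrative release tiers, from fully anonymous to personalized.
RELEASE_TIERS = {
    "anonymous":    ["affiliation"],
    "pseudonymous": ["affiliation", "pairwise_id"],
    "personalized": ["affiliation", "pairwise_id", "name", "mail"],
}

def release(user, tier):
    """Return only the attributes the institution chooses to share
    with a given publisher at the chosen privacy tier."""
    return {k: user[k] for k in RELEASE_TIERS[tier]}

print(release(user, "anonymous"))  # only the affiliation leaves the campus
```

Moving the slider is just switching a publisher from one tier to another; nothing the publisher does can pull more attributes than the institution's policy releases.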
  3495. 874
  3496. 03:15:45.330 --> 03:15:46.740
  3497. Tim Lloyd : And that is it.
  3498.  
  3499. 875
  3500. 03:15:47.970 --> 03:15:48.720
  3501. Tim Lloyd : Back to you. Thanks.
  3502.  
  3503. 876
  3504. 03:15:53.820 --> 03:16:03.510
  3505. Daniel Ascher: Thank you very much, Tim; it was very informative. I liked your chart at the beginning; it was a nicely simplified way of explaining something that can get very complicated quickly.
  3506.  
  3507. 877
  3508. 03:16:07.320 --> 03:16:17.280
  3509. Daniel Ascher: So now we are going to move on to the roundtable discussion moderated by Rick Anderson university librarian at Brigham Young University.
  3510.  
  3511. 878
  3512. 03:16:26.310 --> 03:16:29.790
  3513. Rick Anderson : Everybody, I am not sure whether you can see me.
  3514.  
  3515. 879
  3516. 03:16:30.810 --> 03:16:31.620
  3517. Tim Lloyd : Yep, I can see
  3518.  
  3519. 880
  3520. 03:16:32.040 --> 03:16:33.960
  3521. Rick Anderson : I'm also not sure whether it matters.
  3522.  
  3523. 881
  3524. 03:16:35.370 --> 03:16:36.870
  3525. Rick Anderson : Not being able to see me is not the
  3526.  
  3527. 882
  3528. 03:16:36.870 --> 03:16:38.490
  3529. Rick Anderson : Worst problem in the world to have
  3530.  
  3531. 883
  3532. 03:16:39.120 --> 03:16:45.090
  3533. Rick Anderson : So thanks so much to all of our presenters. This has been an incredibly interesting and informative.
  3534.  
  3535. 884
  3536. 03:16:46.980 --> 03:16:48.240
  3537. Rick Anderson : Morning or
  3538.  
  3539. 885
  3540. 03:16:49.380 --> 03:16:51.630
  3541. Rick Anderson : Afternoon, depending on where you are evening.
  3542.  
  3543. 886
  3544. 03:16:53.040 --> 03:16:58.410
  3545. Rick Anderson : I have gathered some of the questions that people submitted
  3546.  
  3547. 887
  3548. 03:16:59.550 --> 03:17:07.740
  3549. Rick Anderson : in the Q&A box, and I've also added a couple of my own, just in case they're needed.
  3550.  
  3551. 888
  3552. 03:17:08.550 --> 03:17:23.460
  3553. Rick Anderson : I did want to mention, and I'm not sure whether Corey Roach has been able to make it back; I know that he had a meeting that was supposed to end at about 12:30, so he may join us a couple of minutes late, but he privately sent me a note with a
  3554.  
  3555. 889
  3556. 03:17:24.480 --> 03:17:30.630
  3557. Rick Anderson : comment that he wanted to make sure was communicated to everyone. So I'm going to go ahead and read it. He said:
  3558.  
  3559. 890
  3560. 03:17:31.230 --> 03:17:40.770
  3561. Rick Anderson : I worry that the emphasis on federation and MFA/2FA in the sessions may leave attendees with the wrong impression.
  3562.  
  3563. 891
  3564. 03:17:41.490 --> 03:17:48.990
  3565. Rick Anderson : Unfortunately, I have to drop off during the next session. In the roundtable, I'd suggest that a discussion about the limits of those technologies might be a good topic.
  3566.  
  3567. 892
  3568. 03:17:49.710 --> 03:17:59.730
  3569. Rick Anderson : In my view, federation and multifactor authentication should be, quote unquote, table stakes to enter the game. They don't eliminate risk; they reduce it.
  3570.  
  3571. 893
  3572. 03:18:00.300 --> 03:18:07.650
  3573. Rick Anderson : The security industry has used that tech on lots of other resources and those resources still battle unauthorized access
  3574.  
  3575. 894
  3576. 03:18:08.100 --> 03:18:20.670
  3577. Rick Anderson : Exhibit A would be the report on the recent Twitter hack. There's an entire industry around Identity and Access Management; the University of Utah's Identity and Access Management team is a quarter of the total ISO staff.
  3578.  
  3579. 895
  3580. 03:18:22.620 --> 03:18:33.150
  3581. Rick Anderson : So let's just throw that out there and see if our panelists have anything they'd like to add or say in response to that comment from Corey.
  3582.  
  3583. 896
  3584. 03:18:35.640 --> 03:18:48.060
  3585. Tim Lloyd : No, I totally agree with that comment; it's a base level. And I think one of the threads that you can see throughout the whole of today's summit is that, you know, the major weakness is credentials, all these systems rely on credentials, and
  3586.  
  3587. 897
  3588. 03:18:49.170 --> 03:18:59.070
  3589. Tim Lloyd : as I see people putting up barriers to that, like two-factor authentication, I see us as humans trying to get around those barriers: we don't always use it, we complain about it, we force publishers to drop it.
  3590.  
  3591. 898
  3592. 03:19:00.450 --> 03:19:07.740
  3593. Tim Lloyd : So yes, this isn't the be-all and end-all, but I think it is a scaling up of our abilities as an industry.
  3594.  
  3595. 899
  3596. 03:19:08.940 --> 03:19:10.590
  3597. Tim Lloyd : I like the use of the phrase "table stakes."
  3598.  
  3599. 900
  3600. 03:19:11.700 --> 03:19:16.260
  3601. Rick Anderson : Yeah, you've got to at least have MFA to get in the game.
  3602.  
  3603. 901
  3604. 03:19:18.420 --> 03:19:23.940
  3605. Rick Anderson : Anybody else have additional comments on Corey's observation there?
  3606.  
  3607. 902
  3608. 03:19:28.170 --> 03:19:32.220
  3609. Tim Lloyd : I've got a follow-up, just another comment he made in relation to that: he was talking about some of the obstacles.
  3610.  
  3611. 903
  3612. 03:19:32.760 --> 03:19:39.210
  3613. Tim Lloyd : Now, there are some real obstacles here, and one he mentioned is privacy, and I agree with him that in many cases institutions
  3614.  
  3615. 904
  3616. 03:19:39.510 --> 03:19:43.530
  3617. Tim Lloyd : already either have, or will have, the ability to log users in anonymously, if that's what they want to do.
  3618.  
  3619. 905
  3620. 03:19:44.400 --> 03:19:54.900
  3621. Tim Lloyd : But, you know, the customer experience isn't trivial. This is an upscaling of infrastructure, and I see this happening on both sides of the coin: publishers are also
  3622.  
  3623. 906
  3624. 03:19:55.590 --> 03:20:04.140
  3625. Tim Lloyd : struggling with upgrading systems that were built a decade ago, when people were really just relying on IP authentication and username/password.
  3626.  
  3627. 907
  3628. 03:20:04.770 --> 03:20:11.730
  3629. Tim Lloyd : And now there's a multitude of different ways that users can authenticate and especially if you're a publisher who's selling to different channels.
  3630.  
  3631. 908
  3632. 03:20:12.240 --> 03:20:19.440
  3633. Tim Lloyd : And it's not just academia you're dealing with: it might be government, it might be medical and healthcare, it might be public libraries, and they all have different technologies. And so
  3634.  
  3635. 909
  3636. 03:20:19.800 --> 03:20:29.010
  3637. Tim Lloyd : You know, everyone involved in this needs to recognize that there's going to be investment required to up everyone's game and it's not always obvious where that's going to come from.
  3638.  
  3639. 910
  3640. 03:20:31.020 --> 03:20:32.190
  3641. Excellent. Thanks, Tim.
  3642.  
  3643. 911
  3644. 03:20:35.730 --> 03:20:42.540
  3645. Rick Anderson : All right, let's let's move on to a question that was posed by one of the attendees.
  3646.  
  3647. 912
  3648. 03:20:45.720 --> 03:20:57.750
  3649. Rick Anderson : How do publishers train their staff not to be tricked by strategies to compromise their staff accounts? I know what universities do, but how can publishers do this?
  3650.  
  3651. 913
  3652. 03:21:02.730 --> 03:21:10.440
  3653. Kathleen Neely : This is Kathleen; I'm happy to jump in here and talk about that a little bit. So from a security standpoint,
  3654.  
  3655. 914
  3656. 03:21:11.670 --> 03:21:24.720
  3657. Kathleen Neely : I guess I should go on camera here so everybody sees me, apologies. From a security standpoint, I and the rest of the team go through, probably about every six months,
  3658.  
  3659. 915
  3660. 03:21:25.440 --> 03:21:37.740
  3661. Kathleen Neely : security training, and it is not optional; everybody has to do it, and it needs to be done. And that goes through all the different types of ways that
  3662.  
  3663. 916
  3664. 03:21:39.030 --> 03:21:50.760
  3665. Kathleen Neely : You could be potentially compromising your security passwords or just potentially compromising the business as a whole. So I hope that helps.
  3666.  
  3667. 917
  3668. 03:21:53.400 --> 03:21:55.440
  3669. Okere, Kelechi N. (ELS-NYC): Yeah. I'll also add to that.
  3670.  
  3671. 918
  3672. 03:21:57.000 --> 03:22:10.320
  3673. Okere, Kelechi N. (ELS-NYC): I know that at Elsevier we have these trainings that come up periodically, right? And these are very strict: you're sort of chased around, you're given a deadline, and if you don't
  3674.  
  3675. 919
  3676. 03:22:11.250 --> 03:22:23.520
  3677. Okere, Kelechi N. (ELS-NYC): take them you're constantly reminded, and if you keep ignoring them, then your boss's boss's boss's boss is alerted that you haven't taken them. And recently they started doing a very clever one on phishing.
  3678.  
  3679. 920
  3680. 03:22:24.570 --> 03:22:39.960
  3681. Okere, Kelechi N. (ELS-NYC): And I admit I fell for it twice, right, where I got, you know, an email, and it looked okay-ish, you know, and I just clicked on it.
  3682.  
  3683. 921
  3684. 03:22:40.680 --> 03:22:54.000
  3685. Okere, Kelechi N. (ELS-NYC): And it said this was an intentional email, a test on phishing, you know, and then, obviously, you failed it, and here's a link for training.
  3686.  
  3687. 922
  3688. 03:22:55.200 --> 03:23:03.990
  3689. Okere, Kelechi N. (ELS-NYC): And then a week later (I mean, it was so clever) it was a different form of it, and the same thing happened: I clicked on what I shouldn't have clicked on.
  3690.  
  3691. 923
  3692. 03:23:06.690 --> 03:23:14.550
  3693. Okere, Kelechi N. (ELS-NYC): And then the same thing: it says, you know, obviously you failed the test. And then I went through a training.
  3694.  
  3695. 924
  3696. 03:23:15.750 --> 03:23:20.790
  3697. Okere, Kelechi N. (ELS-NYC): And then I think it was earlier this week. I got a third one I said haha
  3698.  
  3699. 925
  3700. 03:23:21.720 --> 03:23:22.650
  3701. Okere, Kelechi N. (ELS-NYC): Now you know
  3702.  
  3703. 926
  3704. 03:23:23.160 --> 03:23:24.780
  3705. Okere, Kelechi N. (ELS-NYC): You're not going to catch me a third time.
  3706.  
  3707. 927
  3708. 03:23:25.980 --> 03:23:32.550
  3709. Okere, Kelechi N. (ELS-NYC): So, I mean, I think this is something that at Elsevier happens continuously, right, around
  3710.  
  3711. 928
  3712. 03:23:33.750 --> 03:23:48.120
  3713. Okere, Kelechi N. (ELS-NYC): these phishing attacks, and also what to do with data of users or customers that you come in contact with on a routine basis of doing business. And also, when GDPR was
  3714.  
  3715. 929
  3716. 03:23:48.690 --> 03:24:03.750
  3717. Okere, Kelechi N. (ELS-NYC): being implemented back in 2018, there was extensive training on GDPR and how you manage user data if you are a staff member who comes in contact with customer or user data.
  3718.  
  3719. 930
  3720. 03:24:05.550 --> 03:24:11.220
  3721. Rick Anderson : You know, Kelechi, I would suggest the fact that you failed that test twice is actually very encouraging.
  3722.  
  3723. 931
  3724. 03:24:11.730 --> 03:24:19.890
  3725. Rick Anderson : Because I think all of us, I mean. Well, certainly. I know I don't think I've ever encountered a corporate training module that was not a complete joke.
  3726.  
  3727. 932
  3728. 03:24:20.760 --> 03:24:28.650
  3729. Rick Anderson : where you had any reasonable likelihood of failing the quiz at the end if you had paid even 20% attention.
  3730.  
  3731. 933
  3732. 03:24:29.010 --> 03:24:44.340
  3733. Rick Anderson : And when we're talking about something as important and as impactful as network security, I'm glad to hear that your employer is actually creating challenging training experiences for the staff. So that's it.
  3734.  
  3735. 934
  3736. 03:24:44.340 --> 03:24:47.340
  3737. Okere, Kelechi N. (ELS-NYC): It was very clever, very, very clever.
  3738.  
  3739. 935
  3740. 03:24:48.720 --> 03:24:51.300
  3741. Kathleen Neely : Say, we've done the same thing to our staff.
  3742.  
  3743. 936
  3744. 03:24:51.660 --> 03:25:05.250
  3745. Kathleen Neely : And I almost did the same thing as you, Kelechi, and I just happened to glance back one more time at the email address and thought, I don't think this is really right. And I sent it to
  3746.  
  3747. 937
  3748. 03:25:05.670 --> 03:25:15.180
  3749. Kathleen Neely : One of our security guys. And he said, oh, this is a test to see if people fail. No. No. Okay. So yeah, we're using it as well.
  3750.  
  3751. 938
  3752. 03:25:17.010 --> 03:25:25.230
  3753. Tim Lloyd : I think there's a more serious point here, though, which is that the examples here are all big global publishers, and I think there's a risk that
  3754.  
  3755. 939
  3756. 03:25:25.770 --> 03:25:28.710
  3757. Tim Lloyd : There's a fracturing within the publishing industry as well as
  3758.  
  3759. 940
  3760. 03:25:29.370 --> 03:25:38.010
  3761. Tim Lloyd : our institutional partners, where you have organizations that have the ability to fund information security departments, and clearly Corey's organization has done a great job there.
  3762.  
  3763. 941
  3764. 03:25:38.640 --> 03:25:44.580
  3765. Tim Lloyd : Linda's sounds like it's doing a great job there. But there are a lot of organizations in our industry, both on the
  3766.  
  3767. 942
  3768. 03:25:45.240 --> 03:25:55.050
  3769. Tim Lloyd : vendor and publisher side as well as the institutional side, that simply don't have anywhere near the money. And if you start looking outside of, you know, STM publishing to other areas,
  3770.  
  3771. 943
  3772. 03:25:55.710 --> 03:26:03.690
  3773. Tim Lloyd : who's helping them? You know, they're a generation behind on some of the systems, and some of the loopholes will be much harder to close.
  3774.  
  3775. 944
  3776. 03:26:05.700 --> 03:26:06.300
  3777. Great points.
  3778.  
  3779. 945
  3780. 03:26:08.760 --> 03:26:12.030
  3781. Rick Anderson : Any other comments on this this particular issue before I move on.
  3782.  
  3783. 946
  3784. 03:26:14.850 --> 03:26:15.360
  3785. Rick Anderson : Alright.
  3786.  
  3787. 947
  3788. 03:26:17.580 --> 03:26:24.810
  3789. Rick Anderson : Here's another question or comment. I've worked at multiple academic institutions and for a major publisher,
  3790.  
  3791. 948
  3792. 03:26:25.350 --> 03:26:32.790
  3793. Rick Anderson : And my question to the panel relates to the responsibility of researchers and librarians regarding cyber intrusions
  3794.  
  3795. 949
  3796. 03:26:33.090 --> 03:26:44.490
  3797. Rick Anderson : Computer security at virtually all universities I worked at was frankly abysmal, with sticky notes with usernames and passwords in plain view and rampant credential swapping.
  3798.  
  3799. 950
  3800. 03:26:44.850 --> 03:26:49.170
  3801. Rick Anderson : For many years, there was either a naivety or indifference about these issues from many.
  3802.  
  3803. 951
  3804. 03:26:49.710 --> 03:27:04.680
  3805. Rick Anderson : But what ethical obligations do researchers and librarians have to keep research data, generated mostly by government funding, and/or copyrighted material secure, given recent developments and more knowledge and sophistication around these matters?
  3806.  
  3807. 952
  3808. 03:27:06.930 --> 03:27:08.670
  3809. Tim Lloyd : I think you should start by answering that one, Rick.
  3810.  
  3811. 953
  3812. 03:27:10.890 --> 03:27:23.190
  3813. Rick Anderson : Well, yeah, I would certainly start by saying that, as librarians, of course, you're asking very different questions when you talk about keeping research data
  3814.  
  3815. 954
  3816. 03:27:24.330 --> 03:27:27.090
  3817. Rick Anderson : Secure and keeping copyrighted material secure
  3818.  
  3819. 955
  3820. 03:27:28.110 --> 03:27:37.560
  3821. Rick Anderson : In libraries, access to online copyrighted materials is typically governed by license agreements, which are contracts to which the library is a signatory.
  3822.  
  3823. 956
  3824. 03:27:37.950 --> 03:27:53.970
  3825. Rick Anderson : And those create a legal obligation on us to manage access to the content, and to the degree that we fail to do that, we're breaching the terms of our licenses. So just at the strictest, most
  3826.  
  3827. 957
  3828. 03:27:56.070 --> 03:28:00.960
  3829. Rick Anderson : sort of rabbinical level, you know, we need to abide by the terms of the contracts we've signed.
  3830.  
  3831. 958
  3832. 03:28:02.040 --> 03:28:13.290
  3833. Rick Anderson : On a deeper ethical, moral level, we have to think about the degree to which we believe it's incumbent upon us to protect the legal rights of others.
  3834.  
  3835. 959
  3836. 03:28:13.830 --> 03:28:23.010
  3837. Rick Anderson : This is a question that is much more controversial in the library world right now, where the legal rights of copyright holders are
  3838.  
  3839. 960
  3840. 03:28:24.270 --> 03:28:27.570
  3841. Rick Anderson : Not always top priority for us.
  3842.  
  3843. 961
  3844. 03:28:28.620 --> 03:28:36.810
  3845. Rick Anderson : This is a departure from where we were, I'd say, 20 or 25 years ago, when we used to say, oh, librarians are great champions of copyright. Now, I find that
  3846.  
  3847. 962
  3848. 03:28:37.230 --> 03:28:46.650
  3849. Rick Anderson : my colleagues tend to be more great skeptics of copyright in the realm of scholarly information, but that's a topic for a whole other conversation.
  3850.  
  3851. 963
  3852. 03:28:47.430 --> 03:28:59.130
  3853. Rick Anderson : When we're talking about research data, boy, that's where it gets complicated. Speaking as a librarian, our obligations with regard to keeping research data secure
  3854.  
  3855. 964
  3856. 03:28:59.730 --> 03:29:09.000
  3857. Rick Anderson : really vary from situation to situation. We are not always (in fact, not usually) stewards of research data, though in some cases we may be.
  3858.  
  3859. 965
  3860. 03:29:10.710 --> 03:29:17.850
  3861. Rick Anderson : So I can't really speak very well to the ethical obligations of researchers to keep their data secure.
  3862.  
  3863. 966
  3864. 03:29:18.360 --> 03:29:31.140
  3865. Rick Anderson : Another complicating factor is the fact that, in some cases, researchers may be under an ethical obligation to keep their research data publicly available. So it all depends on the terms under which they conducted the research and accepted the funding.
  3866.  
  3867. 967
  3868. 03:29:32.820 --> 03:29:36.060
  3869. Rick Anderson : What are other people's thoughts on those questions?
  3870.  
  3871. 968
  3872. 03:29:39.540 --> 03:29:50.340
  3873. Tim Lloyd : My experience has been that open access has muddied the waters, and this is a topic of endless conversation in our industry, I know. But there's this idea among people who don't understand that open access is
  3874.  
  3875. 969
  3876. 03:29:50.790 --> 03:30:00.840
  3877. Tim Lloyd : publishing with a different model, but who think it somehow disintermediates the publishers, and who view it as, you know, if everything's going open access, why do we need to worry about controlling access to anything?
  3878.  
  3879. 970
  3880. 03:30:01.320 --> 03:30:08.940
  3881. Tim Lloyd : And this is a very widespread misunderstanding, I think, of what an open access publishing model actually means for authors and publishers and users.
  3882.  
  3883. 971
  3884. 03:30:10.320 --> 03:30:15.180
  3885. Rick Anderson : I know that's a conversation I've had with colleagues on multiple occasions, where they said, look, the
  3886.  
  3887. 972
  3888. 03:30:15.390 --> 03:30:24.960
  3889. Rick Anderson : the answer to all these network security problems is not to lock down the information more effectively. The answer is to make the information free and make Sci-Hub unnecessary.
  3890.  
  3891. 973
  3892. 03:30:25.770 --> 03:30:33.870
  3893. Rick Anderson : Which is, okay, fine, to the degree that you're talking strictly about access to content, but it certainly doesn't address any of these network security issues themselves.
  3894.  
  3895. 974
  3896. 03:30:36.930 --> 03:30:39.180
  3897. Rick Anderson : Any other thoughts from the panelists?
  3898.  
  3899. 975
  3900. 03:30:39.930 --> 03:30:43.560
  3901. Crane Hassold : One thing that I'll say is, so you mentioned some examples of
  3902.  
  3903. 976
  3904. 03:30:44.160 --> 03:30:53.790
  3905. Crane Hassold : you know, what types of security measures should be taking place and what types should not be taken. So I know one of the examples that was there is having sticky notes with passwords.
  3906.  
  3907. 977
  3908. 03:30:54.090 --> 03:30:59.910
  3909. Crane Hassold : Now, that's something you see in libraries. I'll tell you what, I'm fine with that as a cybersecurity person.
  3910.  
  3911. 978
  3912. 03:31:00.210 --> 03:31:09.930
  3913. Crane Hassold : That is totally fine in my book. I know that there are a lot of folks in, you know, information technology that look down on things like hard-copy password books.
  3914.  
  3915. 979
  3916. 03:31:10.320 --> 03:31:18.780
  3917. Crane Hassold : Those are great in my mind because they allow for the general hygiene that's necessary, you know, having different passwords for different websites.
  3918.  
  3919. 980
  3920. 03:31:19.200 --> 03:31:33.240
  3921. Crane Hassold : You can do that. And the primary threat to credentials is not someone coming to your desk and stealing your password. It's going to a malicious website and getting your credentials stolen there.
  3922.  
  3923. 981
  3924. 03:31:33.630 --> 03:31:38.130
  3925. Rick Anderson : And it's hard for an Iranian hacker to get into my physical notebook of passwords.
  3926.  
  3927. 982
  3928. 03:31:38.430 --> 03:31:46.980
  3929. Crane Hassold : Yeah, exactly. And especially when you think about the expectations of, you know, and I don't mean to stereotype librarians,
  3930.  
  3931. 983
  3932. 03:31:47.640 --> 03:31:53.280
  3933. Crane Hassold : but they're probably not the types of people who are going to be knowledgeable in the cybersecurity realm anyway.
  3934.  
  3935. 984
  3936. 03:31:53.730 --> 03:32:02.670
  3937. Crane Hassold : And so you need to at least set the expectation that we need to be doing a base level of, you know, protecting
  3938.  
  3939. 985
  3940. 03:32:03.000 --> 03:32:13.800
  3941. Crane Hassold : Their data and everyone else's data they have access to and not thinking that everyone should be locked down 100% all the time because that's just going to end up in failure every single time.
  3942.  
  3943. 986
  3944. 03:32:17.400 --> 03:32:18.420
  3945. Rick Anderson : Great, thank you, Crane.
  3946.  
  3947. 987
  3948. 03:32:21.300 --> 03:32:22.770
  3949. Rick Anderson : Other thoughts or questions.
  3950.  
  3951. 988
  3952. 03:32:24.930 --> 03:32:25.200
  3953. Rick Anderson : Okay.
  3954.  
  3955. 989
  3956. 03:32:26.640 --> 03:32:30.180
  3957. Rick Anderson : Here's another longish question, and then I've got a handful of shorter ones.
  3958.  
  3959. 990
  3960. 03:32:32.430 --> 03:32:42.120
  3961. Rick Anderson : One attendee says: I like Corey's reference to the chain and the weakest link problem. Let me see, I'm not sure we've got Corey back yet. I don't think we do.
  3962.  
  3963. 991
  3964. 03:32:42.780 --> 03:32:49.920
  3965. Rick Anderson : But others may be able to address this comment as well. I like Corey's reference to the chain and the weakest link problem.
  3966.  
  3967. 992
  3968. 03:32:50.490 --> 03:33:01.740
  3969. Rick Anderson : Obviously robust long-term solutions to security threats, while important to pursue vigorously, are costly and time-consuming to effectively implement on a global scale.
  3970.  
  3971. 993
  3972. 03:33:02.370 --> 03:33:12.600
  3973. Rick Anderson : If major universities struggle to contend with economic priorities and scarcity of resources, what about the tens of thousands of libraries around the world for which this would be but a pipe dream?
  3974.  
  3975. 994
  3976. 03:33:13.020 --> 03:33:16.200
  3977. Rick Anderson : This maybe goes to some of Tim's comments about smaller publishers too.
  3978.  
  3979. 995
  3980. 03:33:17.130 --> 03:33:22.620
  3981. Rick Anderson : For publishers, as their content is distributed worldwide, that represents many, many weak links.
  3982.  
  3983. 996
  3984. 03:33:22.980 --> 03:33:39.210
  3985. Rick Anderson : Can the panel discuss what can be done in the short term, to address that immediate problem. For example, encouraging truly widespread adoption and participation in the development of known block lists, even if that sort of approach isn't perfect, in and of itself.
  3986.  
  3987. 997
  3988. 03:33:40.560 --> 03:33:56.490
  3989. Rick Anderson : So it sounds like this is kind of a question about, you know, suboptimal solutions that are achievable in the short term, as opposed to optimal solutions that are unavailable to many organizations and maybe only practically available in the long term.
  3990.  
  3991. 998
  3992. 03:33:59.580 --> 03:34:02.640
  3993. Tim Lloyd : I've got one thought, which is just to start building these into infrastructure.
  3994.  
  3995. 999
  3996. 03:34:03.180 --> 03:34:11.490
  3997. Tim Lloyd : And one of the challenges if you're a small publisher or a small institution is that it's expensive and hard for you to make changes to your own infrastructure, especially upgrades.
  3998.  
  3999. 1000
  4000. 03:34:11.880 --> 03:34:24.210
  4001. Tim Lloyd : But to the extent we use shared infrastructure, building security into that is much easier. So things like block lists could be applied at an infrastructure level, and to the extent that we have shared infrastructure, that makes it easier.
  4002.  
  4003. 1001
  4004. 03:34:27.360 --> 03:34:30.180
  4005. Rick Anderson : You're talking about shared infrastructure within the industry?
  4006.  
  4007. 1002
  4008. 03:34:30.210 --> 03:34:30.630
  4009. Yeah.
  4010.  
  4011. 1003
  4012. 03:34:33.600 --> 03:34:41.490
  4013. Tim Lloyd : You know, either existing ones or potential new ones. I mean, I could imagine a scenario where, so Shibboleth is open source software.
  4014.  
  4015. 1004
  4016. 03:34:42.360 --> 03:34:50.190
  4017. Tim Lloyd : What if there was a project to take Shibboleth and upgrade it, so that it featured much better security options that, you know,
  4018.  
  4019. 1005
  4020. 03:34:50.760 --> 03:34:54.450
  4021. Tim Lloyd : could be turned on and off by different institutions, depending on their level of security need,
  4022.  
  4023. 1006
  4024. 03:34:54.870 --> 03:35:07.650
  4025. Tim Lloyd : but then became available to the community. I mean, the problem with open source software is it's like puppies for Christmas: you still need to look after it. But it would at least solve the software problem and allow institutions to use open source software solutions like Shibboleth.
  4026.  
  4027. 1007
  4028. 03:35:09.990 --> 03:35:10.590
  4029. Tim Lloyd : Now here's Corey.
  4030.  
  4031. 1008
  4032. 03:35:11.400 --> 03:35:13.980
  4033. Rick Anderson : Corey, you came in just in time to miss that question.
  4034.  
  4035. 1009
  4036. 03:35:15.000 --> 03:35:16.500
  4037. Corey Roach: Ah, timing. Sorry, everyone.
  4038.  
  4039. 1010
  4040. 03:35:17.040 --> 03:35:25.710
  4041. Rick Anderson : No, no worries. Um, since we do have some extra time, I'm going to go ahead and read it again really quick, because I think Corey will probably have some good thoughts.
  4042.  
  4043. 1011
  4044. 03:35:27.570 --> 03:35:37.350
  4045. Rick Anderson : The question was: obviously robust long-term solutions to security threats, while important to pursue vigorously, are costly and time-consuming to effectively implement on a global scale.
  4046.  
  4047. 1012
  4048. 03:35:37.710 --> 03:35:47.220
  4049. Rick Anderson : If major universities struggle to contend with economic priorities and scarcity of resources, what about the tens of thousands of libraries around the world for which this would be but a pipe dream?
  4050.  
  4051. 1013
  4052. 03:35:47.700 --> 03:35:52.680
  4053. Rick Anderson : For publishers, as their content is distributed worldwide, that represents many, many weak links.
  4054.  
  4055. 1014
  4056. 03:35:53.010 --> 03:36:06.960
  4057. Rick Anderson : Can the panel discuss what can be done in the short term, to address that immediate problem. For example, encouraging truly widespread adoption and participation in the development of known block lists, even if that sort of approach isn't perfect, in and of itself.
  4058.  
  4059. 1015
  4060. 03:36:08.550 --> 03:36:14.880
  4061. Corey Roach: I caught the tail end of Tim's answer, and I think the question itself actually kind of leads into somewhat of an answer, and that is,
  4062.  
  4063. 1016
  4064. 03:36:15.390 --> 03:36:31.440
  4065. Corey Roach: it's not an all-or-nothing effort by any means. There are many different things we can do, some of which are very expensive and some of which are not. Block lists are an example of something that's fairly easy to implement, and it's something that we can do fairly inexpensively.
  4066.  
  4067. 1017
  4068. 03:36:32.460 --> 03:36:39.510
  4069. Corey Roach: The efficacy of it is kind of the low-hanging fruit. I mean, if it's someone who is sophisticated, they're going to get around that.
  4070.  
  4071. 1018
  4072. 03:36:39.960 --> 03:36:50.670
  4073. Corey Roach: But most security controls are that way. They are layered one on top of the other, because any one individual control doesn't really take care of the threat entirely by itself.
  4074.  
  4075. 1019
  4076. 03:36:52.470 --> 03:36:55.080
  4077. Corey Roach: I agree, it's turning the Titanic.
  4078.  
  4079. 1020
  4080. 03:36:56.280 --> 03:37:01.050
  4081. Corey Roach: But I think there are tools that are out there, and I mentioned in my talk that
  4082.  
  4083. 1021
  4084. 03:37:01.620 --> 03:37:11.160
  4085. Corey Roach: some of the open source tools are not up to par with commercial tools, but they are coming along, and, you know, even if they're not 100%, they get you 30%. And
  4086.  
  4087. 1022
  4088. 03:37:11.460 --> 03:37:17.880
  4089. Corey Roach: Tim's point about, you know, feeding a puppy is absolutely true. They are not free; if anybody tells you open source is free, it's not.
  4090.  
  4091. 1023
  4092. 03:37:18.300 --> 03:37:31.800
  4093. Corey Roach: But it is less expensive. So I think there's kind of a graduated scale there. I also mentioned in the presentation building community, and I think that is one of the biggest resources. In my mind, that would be
  4094.  
  4095. 1024
  4096. 03:37:33.000 --> 03:37:34.890
  4097. Corey Roach: Really, making sure that
  4098.  
  4099. 1025
  4100. 03:37:36.000 --> 03:37:50.070
  4101. Corey Roach: we can share that information between organizations, because the risk absolutely is shared between tens of thousands of organizations. So getting the word out, getting ideas out there, that can be done relatively inexpensively. I think it would be a big help.
  4102.  
  4103. 1026
  4104. 03:37:51.120 --> 03:38:02.190
  4105. Rick Anderson : So Corey, in your response you talked about building community, and Tim, in your response you talked about shared infrastructure. Both of you are pointing at
  4106.  
  4107. 1027
  4108. 03:38:02.880 --> 03:38:25.560
  4109. Rick Anderson : community sorts of responses that would require collaboration and coordination between entities, some of whom are competitors. Do you guys, or anybody else, have any thoughts on what we could do to foster that kind of community action in a way
  4110.  
  4111. 1028
  4112. 03:38:27.180 --> 03:38:42.750
  4113. Rick Anderson : that sort of gets us around both the inertia, because inertia is always the biggest enemy of community action, and also the sort of competitive complicating factors that we might encounter? Sure.
  4114.  
  4115. 1029
  4116. 03:38:43.860 --> 03:38:58.950
  4117. Corey Roach: Um, so my first whack at that would be, although I think very much we need to build community for this group as well, I would also recommend looking around and making sure that we're not reinventing the wheel anywhere, particularly
  4118.  
  4119. 1030
  4120. 03:38:58.980 --> 03:39:01.110
  4121. Rick Anderson : Adjacent sites that already exist.
  4122.  
  4123. 1031
  4124. 03:39:01.140 --> 03:39:07.590
  4125. Corey Roach: Right. And on the education side in particular, there's things like REN-ISAC. I think that was mentioned in another one of the presentations,
  4126.  
  4127. 1032
  4128. 03:39:08.970 --> 03:39:19.620
  4129. Corey Roach: where some of that community has already been built. There's already vetting processes in place, there's already sharing guidelines in place, those kinds of things. And, you know, even if we don't use those, we can model from them.
  4130.  
  4131. 1033
  4132. 03:39:20.700 --> 03:39:31.350
  4133. Corey Roach: But as an industry. It's not uncommon in security for competitors to collaborate on the technical side of security.
  4134.  
  4135. 1034
  4136. 03:39:33.570 --> 03:39:41.880
  4137. Corey Roach: We do that both within peer organizations on the education side, but also on the healthcare side. Here in Utah, we are
  4138.  
  4139. 1035
  4140. 03:39:43.380 --> 03:39:46.440
  4141. Corey Roach: neck and neck with one other healthcare provider, Intermountain Healthcare,
  4142.  
  4143. 1036
  4144. 03:39:46.830 --> 03:39:55.380
  4145. Corey Roach: and they are one of my biggest collaborators. You know, in the middle of an incident, am I going to call them up and give them information about, you know, things that are at risk in my organization? No, but
  4146.  
  4147. 1037
  4148. 03:39:56.340 --> 03:40:04.530
  4149. Corey Roach: on a day-to-day basis, as far as sharing threats, sharing approaches, sharing playbooks, absolutely. We collaborate, and I think we both gain from it.
  4150.  
  4151. 1038
  4152. 03:40:06.120 --> 03:40:08.130
  4153. Rick Anderson : Excellent. Tim, what are your thoughts on that?
  4154.  
  4155. 1039
  4156. 03:40:09.090 --> 03:40:22.770
  4157. Tim Lloyd : I think my whole experience of the Seamless Access project in the last few years has been one where organizations from across the industry have come together because there's a shared concern that they have, and I think security is absolutely a shared concern.
  4158.  
  4159. 1040
  4160. 03:40:24.330 --> 03:40:30.090
  4161. Tim Lloyd : I don't see any reason why competitors would be concerned about trying to build a better security infrastructure, because it floats all boats.
  4162.  
  4163. 1041
  4164. 03:40:30.420 --> 03:40:39.090
  4165. Tim Lloyd : You know, we all want more use of these resources. It doesn't matter what side of the authentication equation we are on, we recognize that leakage of usage is to our detriment. So I think there's
  4166.  
  4167. 1042
  4168. 03:40:39.840 --> 03:40:47.190
  4169. Tim Lloyd : everything to play for there, and, you know, whether that goal is open source software or just better education,
  4170.  
  4171. 1043
  4172. 03:40:48.060 --> 03:41:01.140
  4173. Tim Lloyd : or some sort of cloud-based solutions that can help, or part-time security experts that can be loaned out to an institution, you know, there's all these opportunities. I think the competitive concerns are probably the least of the concerns.
  4174.  
  4175. 1044
  4176. 03:41:02.460 --> 03:41:14.520
  4177. Tim Lloyd : But can I just throw a question back? You know, one of the threads throughout this summit has been that the costs of fraudulent access can be much greater than just stolen content.
  4178.  
  4179. 1045
  4180. 03:41:16.770 --> 03:41:32.010
  4181. Tim Lloyd : Maybe this is for you, Corey: if that's the case, why aren't institutions more concerned about this, and maybe, you know, putting it higher up in their security infrastructure? I mean, yours obviously has. But do you think enough institutions recognize this?
  4182.  
  4183. 1046
  4184. 03:41:32.760 --> 03:41:35.490
  4185. Tim Lloyd : Or is the problem that they don't think it's the libraries themselves?
  4186.  
  4187. 1047
  4188. 03:41:35.910 --> 03:41:52.710
  4189. Corey Roach: To be fair, my institution does have a fairly sophisticated security group. However, I would, if I'm being perfectly honest, say that security around our library infrastructure is probably not in the top 10 places I'm worried about securing.
  4190.  
  4191. 1048
  4192. 03:41:53.490 --> 03:42:09.690
  4193. Corey Roach: And that was actually brought up in, I think, one of the comments or questions during the presentations: you know, why did I make an argument for most of the risk being on the publisher side for this? And it really has to do with the consequence of a breach.
  4194.  
  4195. 1049
  4196. 03:42:10.830 --> 03:42:20.550
  4197. Corey Roach: In my organization, as I mentioned, there are graduated security controls, depending on the risk for that particular area. So if a student were to give away their
  4198.  
  4199. 1050
  4200. 03:42:20.550 --> 03:42:24.840
  4201. Corey Roach: credential, it certainly is not going to allow access to my medical record system.
  4202.  
  4203. 1051
  4204. 03:42:26.850 --> 03:42:35.070
  4205. Corey Roach: If that's used to scrape Elsevier or another publisher, the loss is on their side, and, you know, as much as I want to help prevent that,
  4206.  
  4207. 1052
  4208. 03:42:35.490 --> 03:42:42.750
  4209. Corey Roach: at the end of the day, if I have to spend the dollars, I'm going to spend them on the regulated data that's over here in the healthcare space, that I'm required to protect by law,
  4210.  
  4211. 1053
  4212. 03:42:43.590 --> 03:42:52.800
  4213. Corey Roach: rather than protecting those resources that belong to somebody else. But, you know, we also have a contractual obligation to do our best and not be negligent either.
  4214.  
  4215. 1054
  4216. 03:42:53.850 --> 03:43:02.970
  4217. Corey Roach: And I think some of the controls that we're talking about today probably are table stakes. I mean, we really ought to be doing these just to be playing the game. We ought to be doing them
  4218.  
  4219. 1055
  4220. 03:43:04.560 --> 03:43:10.890
  4221. Corey Roach: to be, you know, a good member of the community and not be spreading that risk without reason.
  4222.  
  4223. 1056
  4224. 03:43:12.120 --> 03:43:21.780
  4225. Corey Roach: So our access to our resources is tied to our identity and access management system. Ours, for what it's worth, is SailPoint, but
  4226.  
  4227. 1057
  4228. 03:43:23.610 --> 03:43:42.450
  4229. Corey Roach: it is a full-time job to do that. Doing federation, doing two-factor, assigning roles, doing all the things that come along with identity management is not easy. So I would just caution not to think that doing federation or doing 2FA is the end of the road.
  4230.  
  4231. 1058
  4232. 03:43:43.950 --> 03:43:46.620
  4233. Corey Roach: Again, I'd say that's the minimum we ought to be doing.
  4234.  
  4235. 1059
  4236. 03:43:49.740 --> 03:43:55.470
  4237. Rick Anderson : Do we still have Linda with us? I do have a question that's directed to her, but it looks like she's not with us.
  4238.  
  4239. 1060
  4240. 03:43:55.860 --> 03:43:56.910
  4241. Okere, Kelechi N. (ELS-NYC): No, no, she had to leave.
  4242.  
  4243. 1061
  4244. 03:43:57.270 --> 03:43:57.660
  4245. Okay.
  4246.  
  4247. 1062
  4248. 03:43:59.760 --> 03:44:10.920
  4249. Rick Anderson : The question, and she really may be the best one to answer it, but I'll throw it out there in case anybody, and Corey actually may have some comments on this, because he's working closely with a medical
  4250.  
  4251. 1063
  4252. 03:44:12.630 --> 03:44:20.640
  4253. Rick Anderson : health sciences facility. The question is: is anonymity necessary for all researchers, for all subjects?
  4254.  
  4255. 1064
  4256. 03:44:23.220 --> 03:44:36.780
  4257. Corey Roach: I would actually pitch that one back over to the library folks to give a stronger opinion on. In my opinion, probably not, but it is kind of a foundational principle behind things like libraries.
  4258.  
  4259. 1065
  4260. 03:44:38.370 --> 03:44:42.300
  4261. Corey Roach: So I don't see that going away anytime soon.
  4262.  
  4263. 1066
  4264. 03:44:42.930 --> 03:44:45.720
  4265. Tim Lloyd : Oh, is that anonymity to the library,
  4266.  
  4267. 1067
  4268. 03:44:46.140 --> 03:44:47.340
  4269. Tim Lloyd : or externally, Rick?
  4270.  
  4271. 1068
  4272. 03:44:47.580 --> 03:44:48.060
  4273. Rick Anderson : I'm sorry?
  4274.  
  4275. 1069
  4276. 03:44:48.570 --> 03:44:51.150
  4277. Tim Lloyd : Is it anonymity internally or externally?
  4278.  
  4279. 1070
  4280. 03:44:51.420 --> 03:44:53.370
  4281. Rick Anderson : Well, I'm not sure. And actually,
  4282.  
  4283. 1071
  4284. 03:44:53.580 --> 03:44:54.480
  4285. Tim Lloyd : That's a big difference.
  4286.  
  4287. 1072
  4288. 03:44:54.840 --> 03:45:03.630
  4289. Rick Anderson : Yeah. And yeah, when I read the question originally, I understood it to mean medical researchers, not people conducting research in the library.
  4290.  
  4291. 1073
  4292. 03:45:05.070 --> 03:45:11.460
  4293. Corey Roach: Well, and I guess the way I interpreted it was if they are a student or an academic of some kind, doing research.
  4294.  
  4295. 1074
  4296. 03:45:11.910 --> 03:45:21.330
  4297. Corey Roach: The library is always going to know who they are, even if that's, you know, a punch card for when you've got to return a resource. They're going to know. That's part of what they are obligated to protect.
  4298.  
  4299. 1075
  4300. 03:45:22.050 --> 03:45:29.970
  4301. Corey Roach: Whether that gets passed on to third-party partners, including people like publishers, is a more interesting question for the librarians.
  4302.  
  4303. 1076
  4304. 03:45:30.360 --> 03:45:45.300
  4305. Rick Anderson : Well, actually, that's a pretty simple question for the librarians. As a standard rule, we don't pass along information to publishers about patrons. And not only that, but we actually don't keep that information ourselves. So, for example, at
  4306.  
  4307. 1077
  4308. 03:45:46.890 --> 03:45:55.650
  4309. Rick Anderson : the University of Utah, I know that when a patron returns a book that has been checked out, certainly the library retains the information about who has the book at a given moment,
  4310.  
  4311. 1078
  4312. 03:45:56.010 --> 03:46:10.860
  4313. Rick Anderson : but 30 days after that book is returned, that information is expunged from the system, so that if somebody were to come to us in the future and say, who checked out this book from, you know, 2015 to 2020, we can't tell them, because we don't know.
  4314.  
  4315. 1079
  4316. 03:46:11.250 --> 03:46:21.120
  4317. Corey Roach: And, I kind of went over this a little bit in the presentation, but much of the controls and the data we're talking about in order to improve the security posture can be treated the same way. Yeah.
  4318.  
  4319. 1080
  4320. 03:46:21.780 --> 03:46:27.630
  4321. Corey Roach: You know, I'm not doing modeling on events that happened 18 months ago. I'm doing it in the last 30 days.
  4322.  
  4323. 1081
  4324. 03:46:28.320 --> 03:46:39.210
  4325. Corey Roach: And, you know, much of the information that I would use for context is information that my organization already has. You know, I know what your major is, I know where you call home, you know, I know when you're in class and when you're not.
  4326.  
  4327. 1082
  4328. 03:46:40.530 --> 03:46:48.450
  4329. Corey Roach: So that data I'm already safeguarding. Things that we create new or synthesize, things that might be sensitive, like what material you are researching,
  4330.  
  4331. 1083
  4332. 03:46:49.140 --> 03:47:02.430
  4333. Corey Roach: you know, we can set the parameters around: do we tokenize or anonymize it? Do we delete it? How long do we keep it? What do we use it for? Now, I think those are definitely worthwhile discussions, even within this community, you know, perhaps upholding some community standards.
  4334.  
  4335. 1084
  4336. 03:47:03.420 --> 03:47:13.320
  4337. Rick Anderson : Yeah, but the more difficult, controversial question among librarians historically has been: what do we do if a content provider offers to make
  4338.  
  4339. 1085
  4340. 03:47:14.340 --> 03:47:20.610
  4341. Rick Anderson : Research material available to our patrons, but only if our patrons agree to provide an email address.
  4342.  
  4343. 1086
  4344. 03:47:22.680 --> 03:47:36.210
  4345. Rick Anderson : Do we as a library take the stance that we are not going to enter into an agreement with this publisher, because we think that they are encroaching on your privacy?
  4346.  
  4347. 1087
  4348. 03:47:36.780 --> 03:47:43.860
  4349. Rick Anderson : Or do we say, when it comes to something like giving away your email address, your privacy is up to you?
  4350.  
  4351. 1088
  4352. 03:47:44.190 --> 03:47:51.570
  4353. Rick Anderson : And if you're telling me I want access to this content and I'm willing to share my email address in exchange for getting that access
  4354.  
  4355. 1089
  4356. 03:47:51.870 --> 03:47:58.500
  4357. Rick Anderson : is it up to us as librarians to say, no, that's not a wise privacy decision, so we're not going to broker that?
  4358.  
  4359. 1090
  4360. 03:47:58.980 --> 03:48:06.600
  4361. Rick Anderson : There's a lot of disagreement, or historically has been a lot of disagreement, among librarians on what the right balance
  4362.  
  4363. 1091
  4364. 03:48:07.200 --> 03:48:17.850
  4365. Rick Anderson : of those two issues is. Those with a more libertarian leaning tend to say, hey, you know, it's your email address, you can do what you want; our job is to get you the information you need. Those with a more,
  4366.  
  4367. 1092
  4368. 03:48:19.650 --> 03:48:30.900
  4369. Rick Anderson : I don't know, activist leaning tend to say, we are not going to insert ourselves into an exchange that we think is fundamentally improper.
  4370.  
  4371. 1093
  4372. 03:48:31.950 --> 03:48:32.340
  4373. Corey Roach: I guess.
  4374.  
  4375. 1094
  4376. 03:48:33.060 --> 03:48:40.710
  4377. Corey Roach: Not being in that field, I don't have a strong opinion one way or the other, although I would add to the conversation:
  4378.  
  4379. 1095
  4380. 03:48:42.180 --> 03:48:48.240
  4381. Corey Roach: In my experience, a lot of time when people make those decisions. They're not fully informed about what the consequences are.
  4382.  
  4383. 1096
  4384. 03:48:48.630 --> 03:49:01.230
  4385. Corey Roach: So for me to feel good about it, I would want to explain to you: why are you giving them this email address? Does it link to your actual usage of their resources? How long are they going to keep it? Are they going to sell it to third parties? What's going to be done with it?
  4386.  
  4387. 1097
  4388. 03:49:02.490 --> 03:49:05.940
  4389. Corey Roach: A lot of times on the internet, we don't think about that until it's far too late. It's a great point.
  4390.  
  4391. 1098
  4392. 03:49:06.390 --> 03:49:19.890
  4393. Tim Lloyd : I think it varies by segment as well. I mean, when we talk about libraries, if we're talking about academic libraries, that's definitely the case. If we're talking about corporate libraries, it's very different. We deal with lots of corporate libraries who don't give a second thought to privacy.
  4394.  
  4395. 1099
  4396. 03:49:20.490 --> 03:49:22.260
  4397. Corey Roach: Well, while you're writing
  4398.  
  4399. 1100
  4400. 03:49:22.710 --> 03:49:34.800
  4401. Tim Lloyd : But privacy, they don't care. You know, the vendor knows which person called up the article. Now, they don't want those articles shared with competitors who are doing research, but the individuals are just employees. It's a very, very different conversation, and different language as well.
  4402.  
  4403. 1101
  4404. 03:49:35.160 --> 03:49:38.910
  4405. Corey Roach: And while you're operating as an employee, the expectation for privacy is very different.
  4406.  
  4407. 1102
  4408. 03:49:39.690 --> 03:49:51.960
  4409. Corey Roach: You know, research firms, so Gartner research: every time I download a paper, it is watermarked and stamped with my login, for the same reason. They don't want me to share that information outside of the license.
  4410.  
  4411. 1103
  4412. 03:49:54.630 --> 03:50:05.310
  4413. Tim Lloyd : It makes for an interesting scenario, because often you have these two circles that intersect, where you've got corporate libraries and academic libraries, and then you have these medical libraries that sit in between,
  4414.  
  4415. 1104
  4416. 03:50:05.820 --> 03:50:13.080
  4417. Tim Lloyd : where they're part of an institution, and, you know, in my day job I deal with a lot of libraries that are in that intersection.
  4418.  
  4419. 1105
  4420. 03:50:13.500 --> 03:50:23.370
  4421. Tim Lloyd : And it's very interesting, because the people within the hospital unit spend a lot of time dealing with commercial hospitals and sort of act the same way, but they're within an institution that's got very different privacy approaches.
  4422.  
  4423. 1106
  4424. 03:50:23.880 --> 03:50:30.870
  4425. Tim Lloyd : And, you know, it almost leads to a split personality of, you know, what should we do? And that's where you see the most stark differences between those two sectors.
  4426.  
  4427. 1107
  4428. 03:50:31.410 --> 03:50:40.200
  4429. Corey Roach: And I can even see staff maybe straddling that line for different roles. You know, if I'm providing clinical care as a pharmacist, I really don't care who sees what I look up
  4430.  
  4431. 1108
  4432. 03:50:40.680 --> 03:50:49.890
  4433. Corey Roach: when I'm looking up interactions or reference material. But in my research, what I'm going to publish, I mean, I don't want everybody to see everything I'm looking up.
  4434.  
  4435. 1109
  4436. 03:50:50.190 --> 03:50:50.490
  4437. Right.
  4438.  
  4439. 1110
  4440. 03:50:52.320 --> 03:50:57.240
  4441. Okere, Kelechi N. (ELS-NYC): I have a question for you, Corey, just as a follow-up to one of the things you said.
  4442.  
  4443. 1111
  4444. 03:50:58.710 --> 03:51:08.970
  4445. Okere, Kelechi N. (ELS-NYC): And that is, when you are up against security threats, you know, historically, breaches of, let's say, student records or medical records,
  4446.  
  4447. 1112
  4448. 03:51:09.450 --> 03:51:25.320
  4449. Okere, Kelechi N. (ELS-NYC): as opposed to, you know, a breach potentially coming through library resources, you're going to spend your efforts on the first one. And I wonder, is that because of a historical thing? That maybe
  4450.  
  4451. 1113
  4452. 03:51:26.490 --> 03:51:39.420
  4453. Okere, Kelechi N. (ELS-NYC): there's not enough evidence that breaches that come through that channel, you know, go beyond library resources?
  4454.  
  4455. 1114
  4456. 03:51:40.140 --> 03:51:41.250
  4457. Corey Roach: You know, the short answer is no.
  4458.  
  4459. 1115
  4460. 03:51:41.850 --> 03:51:51.810
  4461. Corey Roach: I don't think that's the case. I think that's a fantastic question, but I don't think that's the case. I think it is, and I should probably have qualified this the first time, but I think it's somewhat particular to my organization.
  4462.  
  4463. 1116
  4464. 03:51:52.290 --> 03:52:00.660
  4465. Corey Roach: Because we do have a lot of graduated controls based on risk. So having that credential by itself doesn't buy you a whole lot.
  4466.  
  4467. 1117
  4468. 03:52:01.170 --> 03:52:11.310
  4469. Corey Roach: Whereas, you know, if I were a less mature organization or, you know, even the University of Utah 10 years ago that username and password would have gotten you into a whole lot more than it does today.
  4470.  
  4471. 1118
  4472. 03:52:12.780 --> 03:52:20.310
  4473. Corey Roach: So yeah, that profile changes. Those risks change a lot if that is your only control over your data.
  4474.  
  4475. 1119
  4476. 03:52:21.540 --> 03:52:24.660
  4477. Corey Roach: But that it does kind of depend on what you have in place.
  4478.  
  4479. 1120
  4480. 03:52:29.430 --> 03:52:37.890
  4481. Rick Anderson : Here's a question that came through. There are many pros to federated and seamless access. Are there any cons?
  4482.  
  4483. 1121
  4484. 03:52:42.150 --> 03:52:46.770
  4485. Corey Roach: I'll give my two-second answer, but that's not really my specialty.
  4486.  
  4487. 1122
  4488. 03:52:48.150 --> 03:52:55.380
  4489. Corey Roach: The only thing I would particularly say is, it needs to be done well. It does have a lot of overhead. But also, you are joining an ecosystem.
  4490.  
  4491. 1123
  4492. 03:52:56.190 --> 03:53:05.430
  4493. Corey Roach: You are sharing those credentials with other systems, you know, so if if my login only ever gets me into library resources, then
  4494.  
  4495. 1124
  4496. 03:53:06.060 --> 03:53:11.760
  4497. Corey Roach: You know, that risk is contained within one little ecosystem. If I start chaining it into other things, then, you know,
  4498.  
  4499. 1125
  4500. 03:53:12.120 --> 03:53:20.100
  4501. Corey Roach: it kind of goes to Chelsea's point just a minute ago: if that credential in one place gets me into another, you're joining a greater area. And
  4502.  
  4503. 1126
  4504. 03:53:20.730 --> 03:53:28.830
  4505. Corey Roach: honestly, federation by itself is not a control. It just gives more uniformity and allows you to plug in controls better across the board.
  4506.  
  4507. 1127
  4508. 03:53:30.270 --> 03:53:32.040
  4509. Corey Roach: I'd love to hear somebody's opposing view of it.
  4510.  
  4511. 1128
  4512. 03:53:34.890 --> 03:53:36.060
  4513. Tim Lloyd : So I can talk to this one.
  4514.  
  4515. 1129
  4516. 03:53:37.440 --> 03:53:41.190
  4517. Tim Lloyd : I think the biggest con is simply the effort involved in setting it up.
  4518.  
  4519. 1130
  4520. 03:53:42.720 --> 03:53:50.580
  4521. Tim Lloyd : It can be expensive if you don't have people with appropriate experience, who are increasingly hard to find and expensive to hire.
  4522.  
  4523. 1131
  4524. 03:53:51.570 --> 03:54:01.740
  4525. Tim Lloyd : It involves a wholesale change in attitude. Typically the system cuts across your campus, so it's not something that just the library implements, not something that just IT implements; it involves collaboration across the board.
  4526.  
  4527. 1132
  4528. 03:54:02.850 --> 03:54:12.540
  4529. Tim Lloyd : And, you know, as Corey hinted, if you get it wrong, you can do a lot more damage than, you know, a traditional system which is based on older-style methods.
  4530.  
  4531. 1133
  4532. 03:54:13.950 --> 03:54:15.690
  4533. Tim Lloyd : So, you know, don't go into it
  4534.  
  4535. 1134
  4536. 03:54:16.740 --> 03:54:30.570
  4537. Tim Lloyd : thinking it's going to be a simple thing. You need to have people that know what they're doing set it up. But I would say, as someone that maintains these once set up: the maintenance is much lower if the setup is done well and you've got people in place to maintain it.
  4538.  
  4539. 1135
  4540. 03:54:30.960 --> 03:54:33.570
  4541. Corey Roach: If you can get people to stop integrating stuff. Yeah.
  4542.  
  4543. 1136
  4544. 03:54:34.770 --> 03:54:35.040
  4545. Tim Lloyd : Yeah.
  4546.  
  4547. 1137
  4548. 03:54:37.350 --> 03:54:38.700
  4549. Tim Lloyd : presale wants want to buy because we're
  4550.  
  4551. 1138
  4552. 03:54:40.260 --> 03:54:44.790
  4553. Tim Lloyd : We do this every time we bring a new product on board. Pretty much anywhere in my institution, it gets
  4554.  
  4555. 1139
  4556. 03:54:45.000 --> 03:54:52.980
  4557. Corey Roach: integrated into the identity and access management. A quarter of my staff are identity and access management specialists. So yeah, absolutely.
  4558.  
  4559. 1140
  4560. 03:54:53.880 --> 03:55:08.040
  4561. Rick Anderson : Crane, you're originally coming from a law enforcement slash national security type of perspective. What's your take on the question of what the cons of federated and seamless access are?
  4562.  
  4563. 1141
  4564. 03:55:09.000 --> 03:55:18.300
  4565. Crane Hassold : The biggest downside is that you have a single point of failure. Like Corey was saying, the more things I can get access to by entering
  4566.  
  4567. 1142
  4568. 03:55:18.870 --> 03:55:31.230
  4569. Crane Hassold : a little bit of information, the more valuable it is to me. So, you know, if I can get into one thing with a set of credentials, that's great. If I can get into 10 things,
  4570.  
  4571. 1143
  4572. 03:55:31.890 --> 03:55:37.530
  4573. Crane Hassold : then that's going to be significantly more valuable, and I'm going to, you know, put a bit more effort into actually
  4574.  
  4575. 1144
  4576. 03:55:38.430 --> 03:55:45.540
  4577. Crane Hassold : getting access to those. And so, you know, we're going through something similar with single sign-on right now across three different products, and
  4578.  
  4579. 1145
  4580. 03:55:46.110 --> 03:55:52.530
  4581. Crane Hassold : we've gone through the same issue: the security that has to be in place for our products
  4582.  
  4583. 1146
  4584. 03:55:52.980 --> 03:56:03.720
  4585. Crane Hassold : has to be completely fail-safe, because you now have one set of credentials that has access to three different things. So to me, from a risk perspective, that's the biggest downside.
  4586.  
  4587. 1147
  4588. 03:56:04.470 --> 03:56:13.350
  4589. Corey Roach: The other thing I would point out is that federated identity is not purely a security function. It's kind of the way the world is going at this point.
  4590.  
  4591. 1148
  4592. 03:56:14.160 --> 03:56:24.810
  4593. Corey Roach: I have all sorts of different constituents within my environment that have logins to the University of Utah. Not all of them are students, and they shouldn't all fall under the student license with a publisher.
  4594.  
  4595. 1149
  4596. 03:56:25.350 --> 03:56:35.160
  4597. Corey Roach: I have to be applying roles and attributes to be able to figure out who gets access to what; applying only an IP address or only a username doesn't give you that level of insight.
  4598.  
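[Editor's aside: the role- and attribute-based decision Corey describes can be sketched in a few lines. This is an illustrative sketch only. The attribute names follow the real eduPerson schema, but the entitlement URI, resource groups, and policy below are invented, not any institution's actual configuration.]

```python
# Sketch of attribute-based access, as opposed to IP- or username-only
# checks. eduPersonAffiliation / eduPersonEntitlement are standard
# eduPerson attribute names; the values and the policy are made up.

STUDENT_LICENSED = {"library-basic"}
STAFF_LICENSED = {"library-basic", "clinical-reference"}

def allowed_resources(attributes):
    """Map a user's released federation attributes to licensed resource groups."""
    affiliations = set(attributes.get("eduPersonAffiliation", []))
    entitlements = set(attributes.get("eduPersonEntitlement", []))

    granted = set()
    if "student" in affiliations:
        granted |= STUDENT_LICENSED
    if affiliations & {"faculty", "staff"}:
        granted |= STAFF_LICENSED
    # A specific entitlement grants extra access regardless of affiliation.
    if "urn:example:entitlement:research-datasets" in entitlements:
        granted.add("research-datasets")
    return granted
```

Under this toy policy, a staff member holding the hypothetical research entitlement would get all three groups, while a bare student login gets only `library-basic`.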
  4599. 1150
  4600. 03:56:36.930 --> 03:56:43.080
  4601. Crane Hassold : I'll give you a good example of this, not in the library world: social media.
  4602.  
  4603. 1151
  4604. 03:56:43.530 --> 03:56:51.180
  4605. Crane Hassold : I mean, most people probably have smartphones today, and everyone has apps. How many people have logged in to something using your Facebook account
  4606.  
  4607. 1152
  4608. 03:56:51.480 --> 03:56:59.190
  4609. Crane Hassold : or your Twitter account? I probably have a dozen or more different apps that are tied to my social media profile,
  4610.  
  4611. 1153
  4612. 03:56:59.580 --> 03:57:09.270
  4613. Crane Hassold : simply because that's an easy and seamless way to access things. And yet, from a risk perspective, you know, I certainly wouldn't do that for, like, my banking,
  4614.  
  4615. 1154
  4616. 03:57:09.810 --> 03:57:18.660
  4617. Crane Hassold : my banking app. But, you know, if anyone wanted to get access to something that was tied to one of my social media profiles, you know, it's
  4618.  
  4619. 1155
  4620. 03:57:18.690 --> 03:57:24.750
  4621. Corey Roach: Your comment right there is interesting, because you're doing that federation for applications that are of equal risk profile. Yeah.
  4622.  
  4623. 1156
  4624. 03:57:24.780 --> 03:57:27.480
  4625. Corey Roach: You're choosing to do a different credential for things that are higher risk.
  4626.  
  4627. 1157
  4628. 03:57:27.540 --> 03:57:27.900
  4629. Yep.
  4630.  
  4631. 1158
  4632. 03:57:30.030 --> 03:57:38.340
  4633. Tim Lloyd : But that is a great example of what can go wrong with federated authentication, because there's plenty of approaches to solve that problem, Crane.
  4634.  
  4635. 1159
  4636. 03:57:38.910 --> 03:57:44.970
  4637. Tim Lloyd : But if it's not implemented well and people apply the same thing across the board, then there's a great example
  4638.  
  4639. 1160
  4640. 03:57:45.390 --> 03:57:52.740
  4641. Tim Lloyd : I've cited at conferences, of a major US research institution that was using federated authentication for research collaborations.
  4642.  
  4643. 1161
  4644. 03:57:53.250 --> 03:58:00.930
  4645. Tim Lloyd : In research collaborations, you've got different scientists across different organizations collaborating with each other, so they're sharing attributes like names and email addresses.
  4646.  
  4647. 1162
  4648. 03:58:01.500 --> 03:58:06.630
  4649. Tim Lloyd : And the IT people on campus just applied that same model
  4650.  
  4651. 1163
  4652. 03:58:07.110 --> 03:58:14.520
  4653. Tim Lloyd : to library resources. So they switched on federated authentication for a whole bunch of library resources, and all the publishers concerned got oodles of personally identifiable data.
  4654.  
  4655. 1164
  4656. 03:58:15.360 --> 03:58:23.850
  4657. Tim Lloyd : Not a problem with the technology; the problem is in understanding the technology and configuring it properly, which you need some expertise to make sure you get right.
  4658.  
  4659. 1165
  4660. 03:58:25.290 --> 03:58:29.910
  4661. Crane Hassold : The other side of it is, you know, depending on how you set up authentication.
  4662.  
  4663. 1166
  4664. 03:58:30.420 --> 03:58:39.480
  4665. Crane Hassold : across different applications, if you're using a third party, your trust is now being put into that third party. So, like, today,
  4666.  
  4667. 1167
  4668. 03:58:39.900 --> 03:58:59.640
  4669. Crane Hassold : do I feel comfortable that Facebook has access to a dozen or so applications that I'm using? Now, you know, that was something that I ended up doing, and none of those applications have control over how Facebook uses any of that data.
  4670.  
  4671. 1168
  4672. 03:59:05.610 --> 03:59:07.140
  4673. Rick Anderson : Okay, um,
  4674.  
  4675. 1169
  4676. 03:59:08.850 --> 03:59:19.590
  4677. Rick Anderson : Here's another longish question. This is a question in response to Corey's remarks regarding stolen student credentials
  4678.  
  4679. 1170
  4680. 03:59:20.400 --> 03:59:28.890
  4681. Rick Anderson : not posing that much of a risk: what perplexes me is why universities don't view a publisher's report of suspicious activity
  4682.  
  4683. 1171
  4684. 03:59:29.340 --> 03:59:43.470
  4685. Rick Anderson : as a potential security breach that warrants immediate investigation, not from a library slash publisher contract perspective, but from a university risk perspective. It seems just as likely as the fished credentials publishers...
  4686.  
  4687. 1172
  4688. 03:59:46.260 --> 04:00:00.030
  4689. Rick Anderson : I think that's meant to say: it seems just as likely that the fished credentials publishers might help you detect could be for one of your medical researchers, who has access to regulated data, as fished student credentials.
  4690.  
  4691. 1173
  4692. 04:00:01.140 --> 04:00:06.990
  4693. Corey Roach: So, two-part answer. And actually, I like this one, because it was similar to another one that I saw go by in the chat, and that was,
  4694.  
  4695. 1174
  4696. 04:00:07.320 --> 04:00:18.600
  4697. Corey Roach: I was actually shocked to hear that there are universities who don't take those reports seriously. My biggest problem with them is they come very, very late, but we absolutely take them seriously; those accounts are suspended the second we see them.
  4698.  
  4699. 1175
  4700. 04:00:20.790 --> 04:00:27.870
  4701. Corey Roach: But if there are organizations that are not, I am a little perplexed as to why they would do that. Although the second part of that answer is,
  4702.  
  4703. 1176
  4704. 04:00:28.620 --> 04:00:38.670
  4705. Corey Roach: it doesn't matter as much whether they're a medical researcher or a student, because in my case, all they've got to do to get to the research
  4706.  
  4707. 1177
  4708. 04:00:39.300 --> 04:00:49.440
  4709. Corey Roach: resources is put in their username and password. In order to get into our medical records, they either have to be on site or coming through our VPN, and they have to two-factor to be able to get in.
  4710.  
  4711. 1178
  4712. 04:00:50.010 --> 04:01:00.480
  4713. Corey Roach: Excuse me. So having that password alone doesn't get them very far. But also, if I'm brutally honest, I think we somewhat use it as a canary in a coal mine.
  4714.  
  4715. 1179
  4716. 04:01:00.960 --> 04:01:09.390
  4717. Corey Roach: I mean, we scan the internet for people posting our credentials on dark websites and other places. We want to know if they've gotten compromised.
  4718.  
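[Editor's aside: one public building block for the leaked-credential checking Corey mentions is the Have I Been Pwned "Pwned Passwords" range API, which uses k-anonymity: you send only the first five hex characters of a password's SHA-1 digest and match the suffix locally. The sketch below covers only the offline parts; the HTTP call itself is omitted, and the example range body in the test is synthetic.]

```python
import hashlib

def sha1_prefix_suffix(password):
    """Split the uppercase SHA-1 hex digest into the 5-character prefix
    sent to the range API and the 35-character suffix matched locally."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def breach_count(range_body, suffix):
    """range_body: the 'SUFFIX:COUNT' lines returned for a given prefix
    (e.g. by GET https://api.pwnedpasswords.com/range/<prefix>).
    Returns how many breaches contained the password, 0 if none."""
    for line in range_body.splitlines():
        candidate, _, count = line.strip().partition(":")
        if candidate == suffix:
            return int(count)
    return 0
```

Because only the five-character prefix ever leaves your network, the service learns nothing usable about which password was checked.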
  4719. 1180
  4720. 04:01:10.920 --> 04:01:20.010
  4721. Corey Roach: So, as I say, I'm a little stunned that some simply might not take that seriously. But if they're not, they're definitely missing out on a potential resource.
  4722.  
  4723. 1181
  4724. 04:01:20.940 --> 04:01:40.560
  4725. Rick Anderson : I think one point of failure might, you know, honestly, might be in the library, because the suspicious activity reports, in my experience, are typically sent to the library in its role as the licensee, rather than sent directly to campus IT. And so
  4726.  
  4727. 1182
  4728. 04:01:42.240 --> 04:01:49.110
  4729. Rick Anderson : libraries are, you know, we are typically very good about passing those along, in part because,
  4730.  
  4731. 1183
  4732. 04:01:50.010 --> 04:02:04.680
  4733. Rick Anderson : in compliance with our licenses, we have to get those IP addresses or accounts shut down, and we can only do that by going to IT. But if that communication link breaks, that would be one of the things that would
  4734.  
  4735. 1184
  4736. 04:02:05.940 --> 04:02:06.180
  4737. Rick Anderson : That
  4738.  
  4739. 1185
  4740. 04:02:07.650 --> 04:02:16.410
  4741. Corey Roach: So there are several weak steps in that interface, though. One is, by only allowing the publisher to see the proxy IP address,
  4742.  
  4743. 1186
  4744. 04:02:17.430 --> 04:02:23.130
  4745. Corey Roach: their controls are much more coarse-grained; they have to see a huge spike in activity to realize something crazy went on.
  4746.  
  4747. 1187
  4748. 04:02:24.360 --> 04:02:32.010
  4749. Corey Roach: Whereas, if you're looking at each individual user, you can get a lot more fine-grained: this guy suddenly looked up 100 PDFs in five minutes. What's going on?
  4750.  
  4751. 1188
  4752. 04:02:34.020 --> 04:02:35.820
  4753. Rick Anderson : So that varies from publisher to publisher.
  4754.  
  4755. 1189
  4756. 04:02:36.450 --> 04:02:36.930
  4757. Has
  4758.  
  4759. 1190
  4760. 04:02:38.220 --> 04:02:48.390
  4761. Corey Roach: Some of them seem to come back pretty quick, and some of them seem to come back really slow. But then, to go back to our conversation about federation and tying some of these more sophisticated controls in:
  4762.  
  4763. 1191
  4764. 04:02:49.590 --> 04:02:57.540
  4765. Corey Roach: right now, as a library, you receive that complaint, you go back into your proxy logs, and you're lining up, okay, this date and time with this resource, and,
  4766.  
  4767. 1192
  4768. 04:02:57.750 --> 04:03:11.100
  4769. Corey Roach: oh, that's this user logged in. And you've got to go down the line and, hand by hand, figure that stuff out. If that's part of a larger monitoring system, that's all automated, so you get rid of that time lag as well. Yeah, good point.
  4770.  
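[Editor's aside: the fine-grained check Corey describes, "this user pulled 100 PDFs in five minutes," is a simple sliding-window count once per-user events are available from the proxy logs. A minimal sketch; the event shape and the threshold are illustrative, not any product's actual rule.]

```python
# Sketch of per-user spike detection over proxy-style download events.
from collections import defaultdict, deque
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)
THRESHOLD = 100  # downloads allowed per user within the window

def find_spikes(events):
    """events: (timestamp, user) tuples in time order.
    Returns the set of users exceeding THRESHOLD inside any WINDOW."""
    recent = defaultdict(deque)
    flagged = set()
    for ts, user in events:
        q = recent[user]
        q.append(ts)
        while ts - q[0] > WINDOW:  # drop events older than the window
            q.popleft()
        if len(q) > THRESHOLD:
            flagged.add(user)
    return flagged
```

Fed one user with 150 downloads in two minutes and another with a handful spread over ten, only the first is flagged. Attaching something like this to automated log ingestion is exactly what removes the hand-by-hand time lag.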
  4771. 1193
  4772. 04:03:12.960 --> 04:03:14.370
  4773. Rick Anderson : Other thoughts in response to that.
  4774.  
  4775. 1194
  4776. 04:03:15.480 --> 04:03:15.900
  4777. Question I
  4778.  
  4779. 1195
  4780. 04:03:17.580 --> 04:03:25.680
  4781. Tim Lloyd : I think it goes back to the extent to which the institution has people that it can fund to do this sort of work. I mean, Corey has what sounds like
  4782.  
  4783. 1196
  4784. 04:03:26.700 --> 04:03:28.320
  4785. Tim Lloyd : a fairly large and experienced team.
  4786.  
  4787. 1197
  4788. 04:03:30.120 --> 04:03:38.670
  4789. Tim Lloyd : A lot of institutions don't have any. And so even if the librarian is on the ball and forwards it, they may just be forwarding it to a campus IT person whose role is mainly just
  4790.  
  4791. 1198
  4792. 04:03:39.180 --> 04:03:46.740
  4793. Tim Lloyd : basic things, like going to check that the IP addresses are correct on campus. And, you know, maybe there isn't someone around who's paying particular attention to this.
  4794.  
  4795. 1199
  4796. 04:03:47.070 --> 04:03:55.920
  4797. Tim Lloyd : Or they just view it as, okay, a credential got hacked; reset the credential and move on. Rather than, you know, digging deeper into what's the cause of this and what else could have happened.
  4798.  
  4799. 1200
  4800. 04:03:57.510 --> 04:04:11.340
  4801. Corey Roach: And I thought about that a lot, actually, as I was putting together my presentation. I realized, if I was designing controls that only fit my organization, that's a very small fraction of the problem.
  4802.  
  4803. 1201
  4804. 04:04:12.180 --> 04:04:23.850
  4805. Corey Roach: But it goes back to, you know, if we can get some low-cost tools, and we can get some rudimentary training, and we can get some community resources where librarians can ask questions, then, you know, maybe we can raise all boats.
  4806.  
  4807. 1202
  4808. 04:04:24.810 --> 04:04:36.930
  4809. Rick Anderson : Yeah, this goes to something that Tim just touched on, which is that, you know, Corey has this amazing team and facility in large part because he oversees
  4810.  
  4811. 1203
  4812. 04:04:37.950 --> 04:04:46.920
  4813. Rick Anderson : network security at an institution that operates not only a major medical research facility, but also a major healthcare network.
  4814.  
  4815. 1204
  4816. 04:04:47.760 --> 04:05:03.360
  4817. Rick Anderson : And so that not only generates a lot of revenue to support this kind of infrastructure, but also creates an enormous risk profile that makes this kind of infrastructure absolutely essential. If you're at, you know, Greenbrier College,
  4818.  
  4819. 1205
  4820. 04:05:04.440 --> 04:05:06.750
  4821. Rick Anderson : it's going to be more of a struggle
  4822.  
  4823. 1206
  4824. 04:05:07.770 --> 04:05:11.580
  4825. Rick Anderson : to shake loose the kinds of resources necessary to put really
  4826.  
  4827. 1207
  4828. 04:05:12.990 --> 04:05:24.090
  4829. Rick Anderson : really effective and pervasive security in place. And this goes to one of the questions that I prepared as a pump-primer, if necessary.
  4830.  
  4831. 1208
  4832. 04:05:25.500 --> 04:05:37.830
  4833. Rick Anderson : So if an institution were to come to you and say, look, we want to increase our network security as much as possible, but we have no money, what solution would give us the most benefit at the least cost?
  4834.  
  4835. 1209
  4836. 04:05:40.410 --> 04:05:43.770
  4837. Rick Anderson : And I realize that question may be too broad. But, you know,
  4838.  
  4839. 1210
  4840. 04:05:44.760 --> 04:05:47.460
  4841. Rick Anderson : pretend somebody comes up to you at a cocktail party and asks you that.
  4842.  
  4843. 1211
  4844. 04:05:48.270 --> 04:05:56.070
  4845. Corey Roach: My answer, as a C-level executive rather than a, you know, technical security person, would be: you don't get anything for free.
  4846.  
  4847. 1212
  4848. 04:05:57.120 --> 04:06:02.010
  4849. Corey Roach: If you are making major structural changes and you think it comes for free, you're disillusioned.
  4850.  
  4851. 1213
  4852. 04:06:03.720 --> 04:06:04.980
  4853. Corey Roach: You're going to be disillusioned.
  4854.  
  4855. 1214
  4856. 04:06:08.340 --> 04:06:10.050
  4857. Corey Roach: But that said,
  4858.  
  4859. 1215
  4860. 04:06:10.710 --> 04:06:12.900
  4861. Corey Roach: There are ways to go about it.
  4862.  
  4863. 1216
  4864. 04:06:13.170 --> 04:06:24.930
  4865. Corey Roach: I mean, I think you can do things like use open-source tools and training. You know, most IT people that I interact with, in the library space or elsewhere, are fascinated by security. They want to learn about it.
  4866.  
  4867. 1217
  4868. 04:06:25.530 --> 04:06:30.540
  4869. Corey Roach: Many of them want to move their career that way, because there is a gap in skills there and it's an opportunity.
  4870.  
  4871. 1218
  4872. 04:06:31.080 --> 04:06:44.220
  4873. Corey Roach: So, you know, I think we can upskill some of the people that are already doing some of that work. I was pleased to see that somebody pasted in the chat earlier that EZproxy is incorporating security controls into their product, which,
  4874.  
  4875. 1219
  4876. 04:06:44.880 --> 04:06:53.670
  4877. Corey Roach: to be clear, I didn't mean to pick on EZproxy. I even thought about whether to put them in my presentation at all, but I realized such a large number of people use them, it was probably an important thing to talk about.
  4878.  
  4879. 1220
  4880. 04:06:55.830 --> 04:07:03.720
  4881. Corey Roach: And the problem there is that their product does exactly what it was designed to do when it was designed. But now it appears they're putting security controls into it, which is great.
  4882.  
  4883. 1221
  4884. 04:07:04.470 --> 04:07:18.690
  4885. Corey Roach: I did go and look at the product release for it, and they're pretty rudimentary right now, but, you know, it's a first step. So that's great. And I wouldn't necessarily want them to put super-complex controls in there until there's a support infrastructure. But
  4886.  
  4887. 1222
  4888. 04:07:20.250 --> 04:07:24.270
  4889. Corey Roach: hopefully, things like that. Things like, you know, there are
  4890.  
  4891. 1223
  4892. 04:07:26.790 --> 04:07:41.580
  4893. Corey Roach: open-source intrusion detection systems. For example, there's one called Zeek that started out as an academic project called Bro. It's free, you know, and it scales to the size of institutions like mine. You do have to put some hardware behind it, but the product is free.
  4894.  
  4895. 1224
  4896. 04:07:43.140 --> 04:07:47.460
  4897. Corey Roach: So, you know, there are creative ways to stretch your dollar and get some security in there.
  4898.  
  4899. 1225
  4900. 04:07:49.830 --> 04:07:52.860
  4901. Rick Anderson : Good, thanks. Other thoughts about low-cost, high-yield
  4902.  
  4903. 1226
  4904. 04:07:55.470 --> 04:07:56.310
  4905. Rick Anderson : Solutions.
  4906.  
  4907. 1227
  4908. 04:07:57.600 --> 04:08:01.620
  4909. Tim Lloyd : It's the "free puppy" point again. So yeah, free software, but if you don't
  4910.  
  4911. 1228
  4912. 04:08:01.770 --> 04:08:08.520
  4913. Tim Lloyd : configure it correctly, you know, you think you've got security, and actually what's happening is just horrible, but you don't even know it, because you don't
  4914.  
  4915. 1229
  4916. 04:08:09.420 --> 04:08:17.430
  4917. Tim Lloyd : have the right people managing it. And maybe it could be bad. I think we've all heard about the Zscaler example, right, where campus IT gets a product on to
  4918.  
  4919. 1230
  4920. 04:08:17.790 --> 04:08:24.090
  4921. Tim Lloyd : block IP addresses, no one mentions to IT that libraries rely on IP authentication, and all access gets shut off.
  4922.  
  4923. 1231
  4924. 04:08:24.510 --> 04:08:36.030
  4925. Tim Lloyd : And eventually someone figures out: oh, it's this new thing that people put in, Zscaler, that randomizes IP addresses. And that's a perfect example of bringing in a solution that no one really understood.
  4926.  
  4927. 1232
  4928. 04:08:39.420 --> 04:08:41.340
  4929. Tim Lloyd : So I think, yeah, there's not much free here. There really isn't.
  4930.  
  4931. 1233
  4932. 04:08:41.580 --> 04:08:41.910
  4933. Rick Anderson : Yeah.
  4934.  
  4935. 1234
  4936. 04:08:41.970 --> 04:08:46.950
  4937. Tim Lloyd : I mean, the free solutions are very dodgy, and, you know, you get what you pay for with free.
  4938.  
  4939. 1235
  4940. 04:08:47.070 --> 04:08:50.280
  4941. Corey Roach: They tend to take a higher level of expertise, if I'm perfectly
  4942.  
  4943. 1236
  4944. 04:08:50.280 --> 04:08:50.760
  4945. Tim Lloyd : Honest.
  4946.  
  4947. 1237
  4948. 04:08:50.850 --> 04:08:55.770
  4949. Corey Roach: I mean, Zeek is an amazing tool; you almost have to be a data scientist to use it. So yeah.
  4950.  
  4951. 1238
  4952. 04:08:55.830 --> 04:09:00.900
  4953. Rick Anderson : It's like any other open-source solution: the cost comes on the back end.
  4954.  
  4955. 1239
  4956. 04:09:01.770 --> 04:09:05.820
  4957. Tim Lloyd : And that's why the concept of shared infrastructure makes sense, you know, whether it's a
  4958.  
  4959. 1240
  4960. 04:09:06.660 --> 04:09:16.830
  4961. Tim Lloyd : service provided by a private business or whether it's a community shared service. You know, there are common costs incurred in building security infrastructure.
  4962.  
  4963. 1241
  4964. 04:09:17.220 --> 04:09:27.330
  4965. Tim Lloyd : We can all incur them individually, one by one, or the industry can try and share them. There are lots of models for sharing it, but that's just an obvious type of solution to this sort of problem.
  4966.  
  4967. 1242
  4968. 04:09:27.600 --> 04:09:35.220
  4969. Corey Roach: Many of those information-sharing groups I mentioned, the ISACs: there are probably five or six of them our organization is a member of.
  4970.  
  4971. 1243
  4972. 04:09:35.820 --> 04:09:39.630
  4973. Corey Roach: Many of them have automated sharing mechanisms where, you know,
  4974.  
  4975. 1244
  4976. 04:09:40.620 --> 04:09:52.980
  4977. Corey Roach: to get a little nerdy about it, it's called STIX and TAXII. It's a standardized format for those threat indicators to come into my organization, and those get plugged into the tools automatically. That's a bunch of research and work that my guys don't have to do.
  4978.  
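[Editor's aside: STIX 2.x content is JSON, so pulling the detection patterns out of an indicator feed is straightforward. The bundle below is a hand-written minimal example, not from any real feed, and real indicators would arrive over a TAXII service rather than as an inline string.]

```python
import json

# Hand-written minimal STIX 2.1 bundle for illustration.
BUNDLE = """{
  "type": "bundle",
  "id": "bundle--11111111-1111-4111-8111-111111111111",
  "objects": [{
    "type": "indicator",
    "spec_version": "2.1",
    "id": "indicator--22222222-2222-4222-8222-222222222222",
    "created": "2020-10-01T00:00:00.000Z",
    "modified": "2020-10-01T00:00:00.000Z",
    "pattern": "[ipv4-addr:value = '203.0.113.7']",
    "pattern_type": "stix",
    "valid_from": "2020-10-01T00:00:00.000Z"
  }]
}"""

def indicator_patterns(bundle_json):
    """Extract the detection patterns from a STIX bundle so they can be
    handed to blocking or alerting tools."""
    bundle = json.loads(bundle_json)
    return [obj["pattern"] for obj in bundle.get("objects", [])
            if obj.get("type") == "indicator"]
```

This standardized shape is what lets indicators flow into tools automatically, with no per-feed parsing work.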
  4979. 1245
  4980. 04:09:55.500 --> 04:10:03.900
  4981. Tim Lloyd : So one of the points that you made, Corey, that I thought was fascinating was this issue of, you know, the level of the threats that you worry about. Let's say
  4982.  
  4983. 1246
  4984. 04:10:04.290 --> 04:10:13.020
  4985. Tim Lloyd : now the library is very low. And I just wonder whether the problem underlying this whole conversation is that, yes, there are people out there who
  4986.  
  4987. 1247
  4988. 04:10:13.380 --> 04:10:25.320
  4989. Tim Lloyd : Focus on security. There are tools out there to address security, but in the grand scheme of things, the library use case is just not that important, and there just doesn't seem to be a clear enough tie between
  4990.  
  4991. 1248
  4992. 04:10:25.860 --> 04:10:31.650
  4993. Tim Lloyd : the losses incurred by the library and bigger losses within the institution to, you know, make it a priority.
  4994.  
  4995. 1249
  4996. 04:10:31.980 --> 04:10:32.550
  4997. Tim Lloyd : So we look
  4998.  
  4999. 1250
  5000. 04:10:33.120 --> 04:10:37.470
  5001. Corey Roach: And I'm not sure if I'm pitching toward my own interests or not in this particular statement, but
  5002.  
  5003. 1251
  5004. 04:10:39.780 --> 04:10:48.420
  5005. Corey Roach: one of the things I mentioned toward the very tail end of the presentation was, I agree. I think there should probably be some risk sharing and incentivizing
  5006.  
  5007. 1252
  5008. 04:10:49.110 --> 04:10:59.790
  5009. Corey Roach: between the publishers and the organizations. You know, I think if some of these publishers came back and said, look, we're going to knock five points off your license,
  5010.  
  5011. 1253
  5012. 04:11:00.270 --> 04:11:08.190
  5013. Corey Roach: and we're going to give you this community, and we're going to help you set up these security controls, and we're going to give you three compromises in a year,
  5014.  
  5015. 1254
  5016. 04:11:08.640 --> 04:11:15.570
  5017. Corey Roach: but every compromise after that, we're going to put a point back on your license; you're going to have an increased cost for the next cycle
  5018.  
  5019. 1255
  5020. 04:11:15.900 --> 04:11:26.430
  5021. Corey Roach: if you have security compromises. Well, now, as the CISO, money comes into it. I've got to go, well, is it worth offsetting that with some controls and effort so that I get a better price?
  5022.  
  5023. 1256
  5024. 04:11:27.570 --> 04:11:38.910
  5025. Corey Roach: You know, it's not a regulatory control. I'm not getting slapped with a fine by Health and Human Services. But, you know, a few points off from the publisher could have a pretty good return.
  5026.  
  5027. 1257
  5028. 04:11:39.600 --> 04:11:46.380
  5029. Okere, Kelechi N. (ELS-NYC): Yeah, so a bit of, I don't know if social engineering is the right term, but a bit of creativity around that.
  5030.  
  5031. 1258
  5032. 04:11:46.740 --> 04:11:50.520
  5033. Corey Roach: It just creates a shared interest. I mean, I'm getting a better deal. So I'm happy.
  5034.  
  5035. 1259
  5036. 04:11:50.640 --> 04:11:53.940
  5037. Corey Roach: Now, the publisher is getting better security. So they're happy.
  5038.  
  5039. 1260
  5040. 04:11:54.240 --> 04:11:54.510
  5041. Yeah.
  5042.  
  5043. 1261
  5044. 04:11:55.590 --> 04:12:05.340
  5045. Crane Hassold : That's the whole thing with any cyber security. From a vendor's perspective, that's what you have to sell. You're selling an ROI. If there's
  5046.  
  5047. 1262
  5048. 04:12:06.060 --> 04:12:11.850
  5049. Crane Hassold : A cost associated with something, then I have to figure out, well, is the risk, you know,
  5050.  
  5051. 1263
  5052. 04:12:12.360 --> 04:12:23.790
  5053. Crane Hassold : Am I willing to accept this risk based on what it could cost me if I get exploited? If I am, then I'm not going to pay a ton of money for security. If I'm not,
  5054.  
  5055. 1264
  5056. 04:12:24.180 --> 04:12:39.540
  5057. Crane Hassold : Then I'm going to want to pay and prioritize it over other things. And so that's the whole name of the game, regardless of whether you're at a massive hundred-thousand-employee company or a library with, you know, one IT person.
  5058.  
  5059. 1265
  5060. 04:12:40.710 --> 04:12:40.890
  5061. Crane Hassold : Yeah.
  5062.  
  5063. 1266
  5064. 04:12:41.160 --> 04:12:54.000
  5065. Okere, Kelechi N. (ELS-NYC): And Crane, one question for you, just kind of taking Tim's question a step further: you said that congressional investigations on threats to universities often focus on physical threats,
  5066.  
  5067. 1267
  5068. 04:12:54.960 --> 04:13:04.470
  5069. Okere, Kelechi N. (ELS-NYC): But not cyber threats. Is that a historical view, where they may be thinking that the cyber threats don't exist, and do you think that's changing?
  5070.  
  5071. 1268
  5072. 04:13:04.980 --> 04:13:17.940
  5073. Crane Hassold : So that was my experience when I testified before that house committee, and it honestly was weird to me, because it felt like everyone was acting like we're still in the 1980s Cold War,
  5074.  
  5075. 1269
  5076. 04:13:18.240 --> 04:13:29.490
  5077. Crane Hassold : Where the cyber threats didn't exist and all the threats were Chinese students and Russian students coming from overseas. It was very strange to me.
  5078.  
  5079. 1270
  5080. 04:13:30.090 --> 04:13:44.760
  5081. Crane Hassold : There were a couple of congressmen who understood what the problem was. But that was really my experience two years ago, when I testified in front of that house committee.
  5082.  
  5083. 1271
  5084. 04:13:46.320 --> 04:13:56.460
  5085. Crane Hassold : Honestly, I haven't really seen much to show me that they don't still think like that, because from a legislative perspective there hasn't been a push,
  5086.  
  5087. 1272
  5088. 04:13:56.700 --> 04:14:14.310
  5089. Crane Hassold : Regardless of whether it's the academic community or anywhere else, to really substantially increase cyber security in anything. So yeah, I think that's sort of the thought process for a lot of the folks that control the purse strings on Capitol Hill.
  5090.  
  5091. 1273
  5092. 04:14:14.790 --> 04:14:25.140
  5093. Corey Roach: Yeah, and I don't really have enough experience to speak to the legislative level of that. I do think there is a growing interest kind of more at the regional and tactical level. I mean, I
  5094.  
  5095. 1274
  5096. 04:14:25.800 --> 04:14:35.010
  5097. Corey Roach: I've got my FBI cyber security contact at their local Salt Lake office on speed dial, you know. There are various things that they are interested in hearing about,
  5098.  
  5099. 1275
  5100. 04:14:36.840 --> 04:14:45.630
  5101. Corey Roach: And they've had even some good resources for us, where they've offered to come in and do audits and other things. So tactically maybe they're more interested, maybe not strategically.
  5102.  
  5103. 1276
  5104. 04:14:47.190 --> 04:14:50.130
  5105. Tim Lloyd : Quick question for you, Rick. So
  5106.  
  5107. 1277
  5108. 04:14:51.810 --> 04:14:52.770
  5109. Tim Lloyd : A couple minutes left.
  5110.  
  5111. 1278
  5112. 04:14:53.040 --> 04:15:06.420
  5113. Tim Lloyd : Okay, yeah, I'll make it quick. So do you think you could go to your administration and say, okay, a bunch of major publishers are going to start penalizing us for infractions. I think we should therefore invest more in security. Do you think you could sell them.
  5114.  
  5115. 1279
  5116. 04:15:06.930 --> 04:15:09.600
  5117. Rick Anderson : Yes, depending on what the dollar figures are
  5118.  
  5119. 1280
  5120. 04:15:11.580 --> 04:15:15.300
  5121. Rick Anderson : I mean, if it's a publisher, with whom we do $1,000 of business a year.
  5122.  
  5123. 1281
  5124. 04:15:15.300 --> 04:15:15.810
  5125. Tim Lloyd : And there might
  5126.  
  5127. 1282
  5128. 04:15:16.020 --> 04:15:23.010
  5129. Rick Anderson : Last $100? No, but if it's Elsevier, and they're going to penalize us 10% of our annual expenditure, then yes.
  5130.  
  5131. 1283
  5132. 04:15:23.520 --> 04:15:39.900
  5133. Corey Roach: Well, and I think it would be important how Elsevier frames that, and if they come in next year and say, you know, hey, here's your normal increase in cost over the year. But if we set up this risk-sharing program, I've got a 5% discount I can get you.
  5134.  
  5135. 1284
  5136. 04:15:40.920 --> 04:15:41.700
  5137. Corey Roach: You know, you've got to look
  5138.  
  5139. 1285
  5140. 04:15:41.970 --> 04:15:47.730
  5141. Tim Lloyd : A rewards model seems to be a good fit here, doesn't it, when you get a benefit from good behavior and a penalty for bad behavior.
  5142.  
  5143. 1286
  5144. 04:15:48.240 --> 04:15:55.500
  5145. Rick Anderson : But the other thing that it depends on is, if a penalty is going to be paid, out of what budget is it going to be paid?
  5146.  
  5147. 1287
  5148. 04:15:55.770 --> 04:16:04.320
  5149. Rick Anderson : If it's going to be paid out of the library's budget that's already been allocated to it by the university. The university stance could be. You just have to manage this.
  5150.  
  5151. 1288
  5152. 04:16:04.830 --> 04:16:17.490
  5153. Rick Anderson : Now, if we come back to the university and say this is significantly undermining our ability to provide content that the university needs in order to do its work, okay, but we make that argument to the university all the time. And we don't always win.
  5154.  
  5155. 1289
  5156. 04:16:18.030 --> 04:16:26.430
  5157. Corey Roach: Well, and I would argue, sorry, I want to be quick because we're really down to minutes, but I would argue also that it's kind of a prod; it's not the entirety of the equation. So,
  5158.  
  5159. 1290
  5160. 04:16:27.270 --> 04:16:34.830
  5161. Corey Roach: It's the thing that would get your CISO thinking about, okay, but now I've got to factor in the reputational risk. And do I have economies of scale where this doesn't cost me much, and
  5162.  
  5163. 1291
  5164. 04:16:35.160 --> 04:16:44.490
  5165. Corey Roach: You know, regulatory things. But it's that initial first push to say, you know, security is actually starting to matter over here on these library resources, you'd better take a look at it.
  5166.  
  5167. 1292
  5168. 04:16:44.760 --> 04:16:45.810
  5169. Rick Anderson : Yeah, totally agree.
  5170.  
  5171. 1293
  5172. 04:16:47.040 --> 04:16:55.890
  5173. Rick Anderson : Kelechi, first of all, thanks so much to everybody for submitting the questions, and thanks to our panelists who were able to stay with us for offering such great answers.
  5174.  
  5175. 1294
  5176. 04:16:56.310 --> 04:17:05.280
  5177. Rick Anderson : I think it's been a really, really useful and certainly a very interesting conversation. Kelechi, what should we say to the folks who submitted questions that we didn't have time to get to?
  5178.  
  5179. 1295
  5180. 04:17:07.440 --> 04:17:08.730
  5181. Okere, Kelechi N. (ELS-NYC): I think we could
  5182.  
  5183. 1296
  5184. 04:17:11.670 --> 04:17:18.630
  5185. Okere, Kelechi N. (ELS-NYC): Put it this way if if there is someone on the panel that has an answer to any of these questions, we can just
  5186.  
  5187. 1297
  5188. 04:17:19.770 --> 04:17:23.520
  5189. Okere, Kelechi N. (ELS-NYC): Put that answer in the email that we're going to send to
  5190.  
  5191. 1298
  5192. 04:17:24.720 --> 04:17:38.880
  5193. Okere, Kelechi N. (ELS-NYC): You know, to our attendees. So I think by tomorrow or Monday we will be sending an email to attendees with a link to this recording, and we can also put the answers to these remaining questions in that email.
  5194.  
  5195. 1299
  5196. 04:17:39.990 --> 04:17:47.730
  5197. Corey Roach: Great. Well, I can also say that if this organization is interested in setting up things like birds of a feather or
  5198.  
  5199. 1300
  5200. 04:17:48.180 --> 04:17:58.140
  5201. Corey Roach: Community sharing stuff I would be interested in participating or sending one of my engineers to participate. So I think you could get those type of questions answered on a regular basis.
  5202.  
  5203. 1301
  5204. 04:18:01.020 --> 04:18:06.360
  5205. Okere, Kelechi N. (ELS-NYC): That's very good to know. Yeah. And I think that's also one of the
  5206.  
  5207. 1302
  5208. 04:18:07.410 --> 04:18:20.490
  5209. Okere, Kelechi N. (ELS-NYC): Intents for this forum, right, to kind of think about ways that we can collaborate, how publishers and universities can be working more closely together. You know, SNSI is a
  5210.  
  5211. 1303
  5212. 04:18:21.300 --> 04:18:33.450
  5213. Okere, Kelechi N. (ELS-NYC): SNSI, Rick, I just corrected myself. It's a relatively new coalition, right? So I think, in essence, this gives us a lot of food for thought.
  5214.  
  5215. 1304
  5216. 04:18:35.850 --> 04:18:42.510
  5217. Rick Anderson : I give it a hard time because I'm a huge reggae fan and every time I hear the word sensi it startles me in this context.
  5218.  
  5219. 1305
  5220. 04:18:45.720 --> 04:18:47.280
  5221. Corey Roach: Not something I would have guessed.
  5222.  
  5223. 1306
  5224. 04:18:51.570 --> 04:19:09.180
  5225. Okere, Kelechi N. (ELS-NYC): So I want to thank all the panelists. I want to thank the audience, the attendees that are still remaining. Thank you all for the fantastic presentations all throughout the day, lots of thought-provoking ideas and insights that you provided to us today.
  5226.  
  5227. 1307
  5228. 04:19:10.890 --> 04:19:12.060
  5229. Okere, Kelechi N. (ELS-NYC): As we now.
  5230.  
  5231. 1308
  5232. 04:19:13.410 --> 04:19:17.700
  5233. Okere, Kelechi N. (ELS-NYC): Let me just hand it back to Daniel to then take us through to close.
  5234.  
  5235. 1309
  5236. 04:19:31.980 --> 04:19:36.480
  5237. Daniel Ascher: Thank you to Rick for moderating the panel, and to our panelists Corey,
  5238.  
  5239. 1310
  5240. 04:19:36.510 --> 04:19:37.560
  5241. Daniel Ascher: Tim and Crane,
  5242.  
  5243. 1311
  5244. 04:19:37.830 --> 04:19:42.450
  5245. Daniel Ascher: And to our question askers and attendees; a lot of great discussion just occurred.
  5246.  
  5247. 1312
  5248. 04:19:43.470 --> 04:19:55.770
  5249. Daniel Ascher: So with that we will go to our closing remarks by Steven Inchcoombe, the chief publishing and solutions officer at Springer Nature as well as the co-chair of SNSI.
  5250.  
  5251. 1313
  5252. 04:19:59.280 --> 04:20:00.930
  5253. Steven Inchcoombe: Dan, what a
  5254.  
  5255. 1314
  5256. 04:20:02.310 --> 04:20:12.900
  5257. Steven Inchcoombe: What a wonderful discussion we've just had the benefit of. And if I reflect for the last couple of minutes on what we've had over today's sessions:
  5258.  
  5259. 1315
  5260. 04:20:14.250 --> 04:20:21.030
  5261. Steven Inchcoombe: Corey started with the keynote and gave us the perspective from a university's chief information security officer.
  5262.  
  5263. 1316
  5264. 04:20:21.900 --> 04:20:37.080
  5265. Steven Inchcoombe: He shone a light for us on what's going on with bad bots, which pretend to be customers, why real-time intervention is needed and how to balance that with privacy. Following that, he gave us some advice on how to strengthen the system.
  5266.  
  5267. 1317
  5268. 04:20:39.060 --> 04:20:48.570
  5269. Steven Inchcoombe: Crane utilized his FBI experience and his more recent work to cover the attacks by Silent Librarian on 300-plus universities around the world,
  5270.  
  5271. 1318
  5272. 04:20:49.170 --> 04:21:04.050
  5273. Steven Inchcoombe: And their links to the Iranian government, with frankly shocking consequences. Crane also explained how Sci-Hub has taken advantage of economic opportunities resulting from their piracy in places like Iran,
  5274.  
  5275. 1319
  5276. 04:21:05.070 --> 04:21:16.860
  5277. Steven Inchcoombe: Its political backing in Russia, and how it has cynically exploited the open access movement to claim a societal mission, with many consequences that are simply unknown today.
  5278.  
  5279. 1320
  5280. 04:21:19.170 --> 04:21:35.190
  5281. Steven Inchcoombe: In her role at the university library, Linda gave us her perspective on her priorities and the tension she sees between security, privacy and the various regulations they have to comply with in education and in the medical spheres.
  5282.  
  5283. 1321
  5284. 04:21:36.300 --> 04:21:45.540
  5285. Steven Inchcoombe: Linda highlighted the responsibility we all have to take reasonable steps to protect each other and went on to explain how her library undertook this
  5286.  
  5287. 1322
  5288. 04:21:45.990 --> 04:21:58.050
  5289. Steven Inchcoombe: By moving to federated access utilizing OpenAthens, so security could be dealt with across the university by experts with highly targeted approaches.
  5290.  
  5291. 1323
  5292. 04:22:01.110 --> 04:22:18.060
  5293. Steven Inchcoombe: Jeremy focused on the treasure trove of information accessible in academia. Cyber criminals, or hackers acting on behalf of hostile governments, take advantage of this and greatly hurt their victims, which is all of us, our families and our friends.
  5294.  
  5295. 1324
  5296. 04:22:19.380 --> 04:22:33.720
  5297. Steven Inchcoombe: Jeremy was very clear that Sci-Hub and others are using the concerns that some academics have over copyright as a cover to get into university networks, and that the consequences of these breaches can be truly appalling.
  5298.  
  5299. 1325
  5300. 04:22:35.670 --> 04:22:41.880
  5301. Steven Inchcoombe: Tim pointed us toward a solution: federated authentication and its extension, Seamless Access.
  5302.  
  5303. 1326
  5304. 04:22:42.990 --> 04:22:49.200
  5305. Steven Inchcoombe: Tim explained how this works, enabling authentication and control whilst protecting privacy.
  5306.  
  5307. 1327
  5308. 04:22:50.430 --> 04:22:57.570
  5309. Steven Inchcoombe: He went on to show how Seamless Access overcomes much of the landscape fragmentation and terminology, and some of the friction,
  5310.  
  5311. 1328
  5312. 04:22:58.740 --> 04:23:09.270
  5313. Steven Inchcoombe: And, in terms of security and privacy, how it has applied the Shield data protection code of conduct, which aligns closely with the
  5314.  
  5315. 1329
  5316. 04:23:10.080 --> 04:23:22.500
  5317. Steven Inchcoombe: Standards. He then went on to contrast this with why relying on IP addresses is likely to result in an inconsistent customer experience and security enforcement challenges.
  5318.  
  5319. 1330
  5320. 04:23:24.720 --> 04:23:36.660
  5321. Steven Inchcoombe: And lastly, in what I think has been one of the most interesting panel discussions I've experienced for a long time, Rick chaired a roundtable discussion with all the speakers.
  5322.  
  5323. 1331
  5324. 04:23:38.070 --> 04:23:50.760
  5325. Steven Inchcoombe: This brought to life the day-to-day challenges, the limits of the technical solutions and the vulnerability of the human factor, especially in organizations that cannot easily help their staff.
  5326.  
  5327. 1332
  5328. 04:23:52.470 --> 04:24:04.170
  5329. Steven Inchcoombe: The discussion turned to the level of risk and responsibility people, and librarians in particular, feel over security when coupled with access to copyrighted material.
  5330.  
  5331. 1333
  5332. 04:24:05.490 --> 04:24:09.540
  5333. Steven Inchcoombe: Rick asked us about whether competitors will work together to solve this.
  5334.  
  5335. 1334
  5336. 04:24:10.890 --> 04:24:13.410
  5337. Steven Inchcoombe: I would say that the co-chairs of
  5338.  
  5339. 1335
  5340. 04:24:15.900 --> 04:24:27.030
  5341. Steven Inchcoombe: SNSI come from Elsevier and Springer Nature, and I can assure you that our organizations compete strongly elsewhere, but when it comes to security we actively try to avoid competing.
  5342.  
  5343. 1336
  5344. 04:24:29.100 --> 04:24:41.040
  5345. Steven Inchcoombe: The discussion then turned to what is sufficient to meet contractual reasonable requirements, and then the focus was on what Tim and Corey referred to as table stakes.
  5346.  
  5347. 1337
  5348. 04:24:42.870 --> 04:24:58.140
  5349. Steven Inchcoombe: Moving to federated access was described as a way to enable better control, but the disciplines around its use are critical in increasing the control that is needed in order to reduce these risks.
  5350.  
  5351. 1338
  5352. 04:25:01.080 --> 04:25:09.150
  5353. Steven Inchcoombe: Towards the end of the discussion, we turned to the different capabilities, resources and attitudes of libraries, institutional IT departments and publishers
  5354.  
  5355. 1339
  5356. 04:25:09.780 --> 04:25:21.630
  5357. Steven Inchcoombe: In identifying and dealing with security breaches. Trying to find entry-level solutions, and the understanding that is needed to apply them, caused much debate.
  5358.  
  5359. 1340
  5360. 04:25:22.650 --> 04:25:34.140
  5361. Steven Inchcoombe: The conclusion seemed to be that shared infrastructure may be a better way, and I thank Corey for his idea that publishers could incentivize their customers with a risk-reward sharing approach.
  5362.  
  5363. 1341
  5364. 04:25:35.430 --> 04:25:47.100
  5365. Steven Inchcoombe: Overall, I think it was a fantastic meeting of minds, ideas and experience that helps really inform what may be our paths going forward.
  5366.  
  5367. 1342
  5368. 04:25:49.980 --> 04:25:59.850
  5369. Steven Inchcoombe: Now, as we finish the event and think about our next steps, I would just leave you with two thoughts. I would urge you, when you
  5370.  
  5371. 1343
  5372. 04:26:01.080 --> 04:26:21.060
  5373. Steven Inchcoombe: Interact with colleagues and peers in the coming days, to take this conversation further, to share with them what you've learned, and to have an open debate about where the lines between reasonable and bad behaviors and actions are, and what it is that is our common
  5374.  
  5375. 1344
  5376. 04:26:22.230 --> 04:26:24.660
  5377. Steven Inchcoombe: significant risks that we all face.
  5378.  
  5379. 1345
  5380. 04:26:26.160 --> 04:26:33.180
  5381. Steven Inchcoombe: And secondly, I would ask you to become involved in initiatives like the Scholarly Networks Security Initiative,
  5382.  
  5383. 1346
  5384. 04:26:33.720 --> 04:26:49.020
  5385. Steven Inchcoombe: To inform your own strategies, your plans and your actions for tackling the cyber threats, which ultimately are a threat to the whole research enterprise. We're all interconnected, so let's share and collaborate wherever we can.
  5386.  
  5387. 1347
  5388. 04:26:51.420 --> 04:27:01.620
  5389. Steven Inchcoombe: Finally, on behalf of Nick Fowler, my co-chair, and my colleagues at the Scholarly Networks Security Initiative, I want to thank all of our speakers
  5390.  
  5391. 1348
  5392. 04:27:02.190 --> 04:27:11.250
  5393. Steven Inchcoombe: For all of their contributions to this summit, they've been incredibly interesting and insightful and I'd like to thank everyone that has joined us online and participated today.
  5394.  
  5395. 1349
  5396. 04:27:12.630 --> 04:27:20.280
  5397. Steven Inchcoombe: I wish you all and your families a safe and healthy fall season, and I'll hand over now to Dan to wrap up today's event.
  5398.  
  5399. 1350
  5400. 04:27:21.390 --> 04:27:22.260
  5401. Steven Inchcoombe: Thank you very much.
  5402.  
  5403. 1351
  5404. 04:27:24.240 --> 04:27:24.960
  5405. Daniel Ascher: Thank you very much.
  5406.  
  5407. 1352
  5408. 04:27:28.230 --> 04:27:30.840
  5409. Daniel Ascher: So for our final poll question here.
  5410.  
  5411. 1353
  5412. 04:27:46.560 --> 04:27:50.580
  5413. Okere, Kelechi N. (ELS-NYC): I'll leave the poll open for just a minute.
  5414.  
  5415. 1354
  5416. 04:27:52.290 --> 04:27:58.290
  5417. Okere, Kelechi N. (ELS-NYC): About 42% of people have voted, and panelists can vote this time too.
  5418.  
  5419. 1355
  5420. 04:28:09.900 --> 04:28:13.170
  5421. Okere, Kelechi N. (ELS-NYC): Hoping to get 100% of everyone voting.
  5422.  
  5423. 1356
  5424. 04:28:23.400 --> 04:28:27.570
  5425. Okere, Kelechi N. (ELS-NYC): Alright nine more seconds. Any final votes.
  5426.  
  5427. 1357
  5428. 04:28:33.540 --> 04:28:33.990
  5429. Okere, Kelechi N. (ELS-NYC): Alright.
  5430.  
  5431. 1358
  5432. 04:28:35.400 --> 04:28:53.100
  5433. Okere, Kelechi N. (ELS-NYC): Just sharing the results. So 89% of you found the discussion very useful, and 11% found it somewhat useful. It's also good to see that no one thought that it was not useful. So
  5434.  
  5435. 1359
  5436. 04:28:54.180 --> 04:28:57.000
  5437. Okere, Kelechi N. (ELS-NYC): Thank you all again for participating.
  5438.  
  5439. 1360
  5440. 04:29:00.480 --> 04:29:00.780
  5441. Okere, Kelechi N. (ELS-NYC): Dan?
  5442.  
  5443. 1361
  5444. 04:29:01.110 --> 04:29:02.250
  5445. Daniel Ascher: Yes, thank you.
  5446.  
  5447. 1362
  5448. 04:29:03.810 --> 04:29:13.800
  5449. Daniel Ascher: To all attendees and panelists and speakers: after this, there will be an email with a link to the recording,
  5450.  
  5451. 1363
  5452. 04:29:14.580 --> 04:29:32.070
  5453. Daniel Ascher: Along with a survey, which we would really appreciate you filling out once it's received. And as we mentioned at the end of the roundtable, the panelists will also be answering some of the questions that we were not able to cover during the roundtable session.
  5454.  
  5455. 1364
  5456. 04:29:35.340 --> 04:29:38.580
  5457. Daniel Ascher: So anything else to add before we close this out, Kelechi?
  5458.  
  5459. 1365
  5460. 04:29:38.940 --> 04:29:49.620
  5461. Okere, Kelechi N. (ELS-NYC): No, I think, as I said, it's really been a fantastic day. It's always a deep sigh of relief, you know, when you come to the end of
  5462.  
  5463. 1366
  5464. 04:29:50.100 --> 04:29:59.910
  5465. Okere, Kelechi N. (ELS-NYC): A session like this. And so I want to thank everyone. I want to thank my colleagues who contributed to planning this event, putting it together. I want to thank
  5466.  
  5467. 1367
  5468. 04:30:00.690 --> 04:30:09.840
  5469. Okere, Kelechi N. (ELS-NYC): The speakers for all your thought provoking presentations and all the time that you put into it and also want to thank all the attendees for making the time
  5470.  
  5471. 1368
  5472. 04:30:10.500 --> 04:30:27.270
  5473. Okere, Kelechi N. (ELS-NYC): Today. So do visit the SNSI website at snsi.info for more information, and also share what you've learned today with colleagues. Thank you again very much. Bye bye now, have a good day.
  5474.  
  5475. 1369
  5476. 04:30:28.080 --> 04:30:28.470
  5477. Daniel Ascher: Thank you.
  5478.  