ZUCKERBERG

- Thank you. Chairman Walden, Ranking Member Pallone and members of the committee, we face a number of important issues around privacy, security and democracy. And you will rightfully have some hard questions for me to answer. Before I talk about the steps we're taking to address them, I want to talk for a minute about how we got there. Facebook is an idealistic and optimistic company. For most of our existence, we focused on all the good that connecting people can bring. And, as Facebook has grown, people everywhere have gotten a powerful new tool for staying connected to the people they care about most, for making their voices heard and for building community and businesses. Just recently, we've seen the “Me Too” movement and the March for Our Lives organized, at least in part, on Facebook. After Hurricane Harvey, people came together and raised more than $20 million for relief. And there are more than 70 million small businesses around the world that use our tools to grow and create jobs.
- But it's clear now that we didn't do enough to prevent these tools from being used for harm, as well. And that goes for fake news, foreign interference in elections and hate speech, as well as developers and data privacy. We didn't take a broad enough view of our responsibility, and that was a big mistake. It was my mistake, and I am sorry. I started Facebook, I run it, and, at the end of the day, I am responsible for what happens here. So, now, we have to go through every part of our relationship with people to make sure that we're taking a broad enough view of our responsibility. It's not enough to just connect people. We have to make sure those connections are positive. It's not enough to just give people a voice. We need to make sure that voice isn't used to harm other people or spread misinformation. And it's not enough to just give people control of their information. We need to make sure that the developers that they share it with protect their information too. Across the board, we have a responsibility to not just give people tools, but to make sure that those tools are used for good. It's going to take some time to work through all the changes we need to make. But I am committed to getting this right, and that includes the basic responsibility of protecting people's information, which we failed to do with Cambridge Analytica. So here are a few key things that we're doing to address this situation and make sure that this doesn't happen again. First, we're getting to the bottom of exactly what Cambridge Analytica did, and telling everyone who may have been affected. What we know now is that Cambridge Analytica improperly obtained some information about millions of Facebook members by buying it from an app developer that people had shared it with. This information was generally information that people share publicly on their profile pages, like their name and profile picture and the list of pages that they follow. When we first contacted Cambridge Analytica, they told us that they had deleted the data. And then, about a month ago, we heard a new report that suggested that this was not true. So now we're working with governments in the U.S., the U.K. and around the world to do a full audit of what they've done and to make sure that they get rid of any data that they still have. Second, to make sure that no other app developers are out there misusing data, we're now investigating every single app that had access to a large amount of people's information on Facebook in the past. And, if we find someone that improperly used data, we're going to ban them from our platform and tell everyone affected. Third, to prevent this from ever happening again, we're making sure developers can't access as much information, going forward. The good news here is that we made some big changes to our platform in 2014 that would prevent this specific instance with Cambridge Analytica from happening again today. But there's more to do, and you can find more of the details of the other steps we're taking in the written statement I provided. My top priority has always been our social mission of connecting people, building community and bringing the world closer together. Advertisers and developers will never take priority over that for as long as I am running Facebook. I started Facebook when I was in college. We've come a long way since then. We now serve more than 2 billion people around the world, and, every day, people use our services to stay connected with the people that matter to them most. I believe deeply in what we're doing, and I know that, when we address these challenges, we'll look back and view helping people connect and giving more people a voice as a positive force in the world. I realize the issues we're talking about today aren't just issues for Facebook and our community; they're challenges for all of us as Americans. Thank you for having me here today, and I am ready to take your questions.
- Thank you, Mr. Chairman. I consider us to be a technology company, because the primary thing that we do is have engineers who write code and build products and services for other people. There are certainly other things that we do, too. We — we do pay to help produce content. We build enterprise software, although I don't consider us an enterprise software company. We build planes to help connect people, and I don't consider ourselves to be an aerospace company. But, overall, when people ask us if we're a media company, what — what I hear is, “Do we have a responsibility for the content that people share on Facebook?” And I believe the answer to that question is yes.
- Mr. Chairman, I do not consider ourselves to be a financial institution, although you're right that we do provide tools for people to send money.
- Well, Mr. Chairman, I think we've evolved quite a bit as a company. When I started it, I certainly didn't think that we would be the ones building this broad of a community around the world. I thought someone would do it. I didn't think it was going to be us. So we've definitely grown.
- Mr. Chairman, you're right that we don't sell any data. And I would say that we do try to explain what we do as — as time goes on. It's a — it's a broad system. You know, every day, about 100 billion times a day, people come to one of our products, whether it's Facebook or Messenger or Instagram or WhatsApp, to put in a piece of content, whether it's a — a photo that they want to share or a message they want to send someone. And, every time, there's a control right there about who you want to share it with. Do you want to share it publicly, to broadcast it out to everyone? Do you want to share it with your friends, a specific group of people? Do you want to message it to just one — one person or a couple of people? That's the most important thing that we do. And I think that, in the product, that's quite clear. I do think that we can do a better job of explaining how advertising works. There is a common misperception, as you say, that is just reported — often keeps on being reported, that, for some reason, we sell data.
- I can't be clearer on this topic: We don't sell data. That's not how advertising works, and I do think we could probably be doing a clearer job explaining that, given the misperceptions that are out there.
- Congressman, yes. We limit a lot of the data that we collect and use.
- Congressman, yes. In — in response to these issues, we've changed a lot of the way that our platform works, so, that way, developers can't get access to as much information.
- Congressman, we try to collect and — and give people the ability ...
- Congressman, this is a complex issue that I think is — deserves more than a one-word answer.
- Yes.
- Congressman, what we allowed — what we allow with our developer platform is for people to choose to sign into other apps and bring their data with them. That's something a lot of people want to be able to do. The reason why we built the developer platform in the first place was because we thought it would be great if more experiences that people had could be more social, so if you could have a calendar that showed your friends' birthdays; if you could have an address book that had pictures of your friends in it; if you could have a map that showed your friends' addresses on it. In order to do that, you need to be able to sign into an app, bring some of your data and some of your friends' data. And that's what we built. Now, since then, we have recognized that that can be used for abuse, too. So we've limited it, so now people can only bring their data when they go to an app. But that's something that a lot of people do on a day-to-day basis — is sign into apps and websites with their — with Facebook. And that's something that we're ...
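
The sign-in flow described above follows the general OAuth pattern used by social login. A minimal sketch, assuming a generic OAuth-style platform; the endpoint URLs and scope names below are hypothetical, not Facebook's actual API:

```python
# Illustrative sketch of a generic "sign in with a social platform" flow
# with scoped permissions. AUTH_URL and the scope names are invented.
import secrets
from urllib.parse import urlencode

AUTH_URL = "https://platform.example.com/oauth/authorize"  # hypothetical endpoint

def build_login_url(client_id: str, redirect_uri: str, scopes: list[str]) -> str:
    """Send the user to the platform to approve exactly these scopes."""
    state = secrets.token_urlsafe(16)  # CSRF token; a real app would persist it
    params = {
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",
        "scope": " ".join(scopes),
        "state": state,
    }
    return f"{AUTH_URL}?{urlencode(params)}"

# A birthday-calendar app would request only the scopes it needs:
print(build_login_url("calendar-app", "https://cal.example.com/cb",
                      ["public_profile", "friends_birthdays"]))
```

The key design point is that the user approves a named list of permissions at sign-in time, rather than the app receiving data implicitly.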
- Congressman, in that specific case, our team made an enforcement error. And we have already gotten in touch with them to reverse it.
- Congressman, I do agree that we should work to give people the fullest free expression that is possible. That's what — when I talk about giving people a voice, that's what I care about.
- Congressman, that's correct.
- Congressman, we have a number of measures in place to protect minors specifically. We make it so that adults can't contact minors who they — they aren't already friends with. We make it so that certain content that may be inappropriate for minors, we don't show. The reality that we see is that teens often do want to share their opinions publicly, and that's a service that ...
- Yes, we do.
- Congressman, every time that someone chooses to share something on Facebook — you go to the app; right there, it says, “Who do you want to share with?” When you sign up for a Facebook account, it starts off sharing with just your friends. If you want to share publicly, you have to specifically go and change that setting to be sharing publicly. Every time ...
- Congressman, this is an important question because I think people often ask what the difference is between surveillance and what we do. And I think that the difference is extremely clear, which is that, on Facebook, you have control over your information. The content that you share, you put there. You can take it down at any time. The information that we collect, you can choose to have us not collect. You can delete any of it, and, of course, you can leave Facebook if you want. I know of no surveillance organization that gives people the option to delete the data that they have, or even know what — what they're collecting.
- Congressman, as I've said, every time that a person chooses to share something on Facebook, they're proactively going to the service and choosing that they want to share a photo, write a message to someone. And, every time, there is a control right there — not buried in settings somewhere, but right there, when they're — when they're posting ...
- ... about who they want to share it with.
- Congressman, since we learned about that, we've removed the option for advertisers to exclude ethnic groups from targeting.
- Congressman, thank you, and let me say a couple of things on this. First, to your point about competition, the average American uses about eight different apps to communicate and stay connected to people. So there's a lot of competition that we feel every day. And — and that — that's — that's an important force that — that we — that we definitely feel when running the company. Second, on your point about regulation, the Internet is growing in importance around the world in people's lives, and I think that it is inevitable that there will need to be some regulation. So my position is not that there should be no regulation. But I also think that you have to be careful about what regulation you put in place for a lot of the reasons that you're saying. I think, a lot of times, regulation, by definition, puts in place rules that a company that is larger, that has resources like ours, can easily comply with, but that might be more difficult for a smaller start-up to — to comply with.
- So I think those are all things that need to be thought through very carefully when — when thinking through what — what rules we want to put in place.
- Congressman, I'm not sure either. I'm not familiar with that specific case. It's quite possible that we made a mistake, and we'll follow up afterward to — on that.
- Overall — yeah, we have — by the end of this year, we'll have about 20,000 people at the company who work on security and content-review-related issues. But there's a lot of content flowing through the systems and a lot of reports, and, unfortunately, we don't always get these things right when people report it to us.
- Congresswoman, yes.
- Yes. We are starting to notify people this week. We started Monday, I believe.
- Congresswoman, yes. That's how our platform works. You have to opt in to sign in to any app before you use it.
- Congresswoman, no, although we are currently going through the process of investigating every ...
- ... that had access to a large amount of data.
- It means that we're going to look into every app that had a large amount of access to data in the past, before we lock down the platform. I ...
- ... because there are tens of thousands of apps, we will find some ...
- ... and, when we find them ...
- Yes.
- Congresswoman, we are — have made and are continuing to make changes to reduce the amount of ...
- Congresswoman, I'm not sure what that means.
- Congresswoman, it might be useful to clarify what actually happened here. A developer does research ...
- Congresswoman, yes. When we learned in 2015 that a Cambridge University researcher associated with the academic institution that built an app that people chose to share their data with ...
- Yes. I'm answering your question.
- When — when we learned about that, we ...
- Yes.
- We shut down the app.
- We got in touch with them, and we asked them to — to — we demanded that they delete any of the data that they had, and their chief data officer told us that they had.
- Congressman, this is — this is an important question to clarify. So, in 2007, we launched the platform in order to make it so that people could sign in to other apps, bring some of their information and some of their friends' information, to have social experiences. This created a lot of innovative experiences — new games, companies like Zynga. There were companies that you're — that you're familiar with, like Netflix and Spotify — had integrations with this that allowed social experiences in their apps. But, unfortunately, there were also a number of apps that used this for abuse, to collect people's data ...
- Yeah, there was abuse. And that's why, in 2014, we took the step of fundamentally changing how the platform works. So, now, when you sign into an app, you can bring your information, and, if a friend has also signed into the app, then we'll — then the app can know that you're friends, so you can have a social experience in that app. But, when you sign into an app, it now no longer brings information from other people.
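
A minimal sketch of the 2014-era policy change described here, under invented names: an app may learn that two of its own users are friends, but a friend's profile data no longer flows through someone else's login:

```python
# Illustrative only; the User schema and app names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    profile: dict = field(default_factory=dict)
    authorized_apps: set = field(default_factory=set)  # apps this user signed into

def visible_friends(app: str, friends: list[User]) -> list[User]:
    """The app may see only those friends who also signed into this same app."""
    return [f for f in friends if app in f.authorized_apps]

def fetch_profile(target: User, app: str) -> dict:
    """Post-2014 rule: profile data flows only with the owner's own consent."""
    if app in target.authorized_apps:
        return target.profile
    raise PermissionError("friends' data no longer flows through another user's login")

alice = User("alice", {"birthday": "1990-01-01"}, {"calendar-app"})
bob = User("bob", {"birthday": "1992-05-17"})
print(visible_friends("calendar-app", [bob]))  # [] — bob never opted in himself
```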
- Yes, Congressman. Good question. So we're going to start by doing an investigation, internally, of every single app that had access to a large amount of information, before we lock down the platform. If we detect any suspicious activity at all, we are working with third-party auditors — I imagine there will have to be a number of them, because there are a lot of apps — and they will conduct the audit for us.
- Yes.
- Yes, Congressman. Thank you for giving me the opportunity to clarify that. So one — one of the questions is — is, what information do we track, and why, about people who are not signed into Facebook. We track certain information for security reasons and for ads reasons. For security, it's to make sure that people who are not signed into Facebook can't scrape people's public information. You can — even when you're not signed in, you can look up the information that people have chosen to make public on their page, because they wanted to share it with everyone. So there's no reason why you should have to be logged in. But, nonetheless, we don't want someone to be able to go through and download every single public piece of information. Even if someone chose to make it public, that doesn't mean that it's good to allow someone to aggregate it. So, even if someone isn't logged in, we track certain information, like how many pages they're accessing, as a security measure. The second thing that we do is we provide an ad network that third-party websites and apps can run in order to help them make money. And those ads — you know, similar to what Google does and what the rest of the industry does — it's not limited to people who are just on Facebook. So, for the purposes of that, we may also collect information to make it so that those ads are more relevant and work better on those websites. There's a control that — for that second class of information around ad targeting — anyone can turn off, has complete control over it. For obvious reasons, we do not allow people to turn off the — the measurement that we do around security.
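
The per-visitor page-access counting described here is essentially rate limiting. A minimal sketch, assuming a sliding-window counter keyed by visitor; the window size and threshold are invented for illustration:

```python
# Illustrative anti-scraping rate limiter; not Facebook's actual system.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60
MAX_PAGE_VIEWS = 100  # hypothetical threshold for "no real person browses this fast"

_hits: dict[str, deque] = defaultdict(deque)

def allow_request(visitor_key: str, now: float | None = None) -> bool:
    """Count recent page views per visitor; refuse likely bulk scrapers."""
    now = time.monotonic() if now is None else now
    window = _hits[visitor_key]
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()              # drop hits that fell out of the window
    if len(window) >= MAX_PAGE_VIEWS:
        return False                  # looks like automated aggregation
    window.append(now)
    return True
```

Note the design point in the testimony: this security measurement applies to everyone, logged in or not, which is why it cannot be turned off the way ad collection can.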
- Congressman, it's something that we're looking into. We already took action by banning him from the platform, and we're going to be doing a full audit to make sure that he gets rid of all the data that — that he — that he has, as well. To your point about Cambridge University, what we've found now is that there's a whole program associated with Cambridge University where a number of researchers, not just Aleksandr Kogan — although, to our current knowledge, he's the only one who's sold the data to Cambridge Analytica — there were a number of other researchers who were building similar apps. So we do need to understand whether there was something bad going on at Cambridge University overall that will require a stronger action from us.
- Congressman, we're not aware of any specific groups like that, that have — that have engaged in this. We are, as I've said, conducting a full investigation of any apps that had access to a large amount of data. And, if we find anything suspicious, we'll tell everyone affected. We do not allow hate groups on Facebook, overall. So, if — if there's a group that — their primary purpose or — or a large part of what they do is spreading hate, we will ban them from the platform, overall.
- Sorry. Can you repeat that?
- Congressman, yes. That's certainly an important thing that — that we need to do.
- Congressman, yes. This is an extremely important area. After we were slow to identify the Russian information operations in 2016, this has become a top priority for our company — to prevent that from ever happening again, especially this year, in 2018, which is such an important election year with the U.S. midterms, but also major elections in India, Brazil, Mexico, Hungary, Pakistan and a number of other places. So we're doing a number of things that — that I'm — that I'm happy to talk about, or follow up with afterward, around deploying new A.I. tools that can proactively catch fake accounts that Russia or others might create to spread misinformation. And one thing that I'll — that I'll end on here, just because I — I know we're — we're running low on time, is, since the 2016 election, there have been a number of significant elections, including the French presidential election, the German election and, last year, the U.S. Senate Alabama special election.
- And the A.I. tools that we deployed in those elections were able to proactively take down tens of thousands of fake accounts that may have been trying to do the activity that you're — that you're talking about. So our tools are getting better. For as long as Russia has people who are employed, who are trying to perpetrate this kind of interference, it will be hard for — for us to guarantee that we're going to fully stop everything. But it's an arms race, and I think that we're making ground and are — are doing better and better and are confident about how we're going to be able to do ...
- Congressman, yes. I think that it's really important for the service that people understand what they are doing and signing up for and how the service works. We have laid out all of what we do in the terms of service, because that's what is legally required of us.
- Congressman, yes. We have a developer terms of service, which is separate from the normal terms of service for — for individuals using the service.
- Congressman, I'm not sure what you mean by that.
- Congressman, I think you're raising an important point, which is that I think, if someone wanted to know, they could. But I think that a lot of people probably just accept terms of service without taking the time to read through it. I view our responsibility not as just legally complying with laying it out and getting that consent, but actually trying to make sure that people understand what's happening throughout the product. That's why, every single time that you share something on Facebook or one of our services, right there is a control in line, where you control who — who you want to share with, because I don't just think that this is about a terms of service. It's contextual. You — you want to present people with the information about what — what they might be doing and give them the relevant controls in line, at the time that they're making those decisions, not just have it be in the background sometime, or up front — make a one-time decision.
- That is — I'm not sure what you mean by extrapolating data.
- Congressman, as you know, the FTC is investigating this. And we are certainly going to be complying with them and working with them on that investigation.
- Yes, Congressman. All the same controls will be available around the world.
- Yes, Congressman. We believe that everyone around the world deserves good privacy controls. We've had a lot of these controls in place for years. The GDPR requires us to do a few more things, and we're going to extend that to the world.
- Congressman, we're going to put, at the top of everyone's app when they sign in, a tool that walks people through the settings and gives people the choices and — and asks them to make decisions on how they want their settings set.
- Congressman, I think we may be updating it a little bit. But, as you say, we've had the ability to download your information for years now. And people have the ability to see everything that — that they have in Facebook, to take that out, delete their account and move their data anywhere that they want.
- Congressman, I believe that all of your information is in that — that file.
- Congressman, I'm not sure how we're going to implement that yet. Let me follow up with you on that.
- Congresswoman, I believe that everyone owns their own content online. And that's — the first line of our terms of service, if you read it, says that.
- Congresswoman, giving people control of their information and how they want to set their privacy is foundational to the whole service. It's not just a — kind of an add-on feature, something we have to ...
- ... comply with.
- The reality is, if you have a photo — if you just think about this in your day-to-day life ...
- Congresswoman, I'm not directly familiar with the details of what you just said. But I certainly think that regulation in this area ...
- Congresswoman, we don't think about what we're doing as censoring speech. I think that there are — there are types of content like terrorism that I think that we all agree we do not want to have on our service. So we build systems that can identify those and can remove that content, and we're very proud of that work.
- Sorry, Congresswoman, I'm not familiar with ...
- Of over that?
- The market cap of the company was greater than that, yes.
- Yes.
- Yes.
- Yes, that's correct.
- Congresswoman, I'm not familiar with the details of that.
- Yes.
- Congresswoman, I — I get briefed on — on these things ...
- I'm not familiar with the details of it.
- Congresswoman, I'm not familiar with ...
- I — I ...
- ... I discuss them with — with our team, but I don't remember the exact details of them.
- The FTC investigation?
- Yes.
- Congresswoman, I don't remember if we had a financial penalty.
- I — I remember the consent decree. The consent decree is extremely important to how we operate the company.
- Congressman, yes.
- Congressman, I believe that those are — are — that we collect different data for those. But I can follow up on the details of — of that.
- Congressman, let me follow up with you on that. That situation developed while I was here, preparing to testify, so I'm not ...
- ... details on it.
- Congressman, this is a really important question. There is absolutely no directive in any of the changes that we make to have a bias in anything that we do. To the contrary, our goal is to be a platform for all ideas ...
- Congressman, we didn't allow the Obama campaign to do anything that any developer on the platform wouldn't have otherwise been able to do.
- Yes, I ...
- Congressman, we pride ourselves on — on doing good technical work, yes.
- Among other things.
- Congressman, in 2015, when we heard that the developer on our platform, Aleksandr Kogan ...
- That — that Aleksandr Kogan had ...
- ... sold data to Cambridge Analytica?
- Yes.
- Congressman, sometimes we do. I generally think that ...
- Congressman, I disagree with that assessment. I do think that, going forward, we need to take a more proactive view of — of policing what the developers do. But, looking back, we've had an app review process. We investigate ...
- ... tens of thousands of apps a year.
- Congressman, we have a consent decree, yes.
- Congressman, I'm not — I'm not familiar with all of the things that the FTC said, although I'm very familiar with the FTC ...
- ... order, itself.
- Congressman, respectfully, I disagree with that characterization. We've had a review process for apps for years. We've reviewed tens of thousands of apps a year and taken action against a number of them. Our process was not enough to catch a developer who sold data ...
- ... that they had in their ...
- ... outside of our system.
- Congressman, we have not seen that activity.
- I — not that I am aware of.
- There are tens of thousands of apps that had access to a large amount of people's information before we locked down the platform in 2014. So we're going to do an investigation that first involves looking at their patterns of API access and what those companies were doing. And then, if we find anything suspicious, then we're going to bring in third-party auditors to go through their technical and physical systems to understand what they did. And, if they — we find that they misused any data, then we'll ban them from our platform, make sure they delete the data and tell everyone affected.
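
A minimal sketch of the first pass described above, scanning summarized API logs for apps whose access pattern looks suspicious; the features and thresholds are invented for illustration, not Facebook's actual criteria:

```python
# Illustrative triage over historical API-access summaries.
from dataclasses import dataclass

@dataclass
class AppUsage:
    app_id: str
    users_authorized: int   # people who actually signed into the app
    profiles_fetched: int   # distinct profiles its API calls touched
    fields_requested: int   # distinct profile fields it pulled

def needs_audit(u: AppUsage, fanout_limit: float = 50.0) -> bool:
    """Flag apps that reached far more profiles than users who opted in."""
    fanout = u.profiles_fetched / max(u.users_authorized, 1)
    return fanout > fanout_limit or u.fields_requested > 30

def apps_to_audit(summaries: list[AppUsage]) -> list[str]:
    """First-pass list to hand to third-party auditors for a deeper review."""
    return [u.app_id for u in summaries if needs_audit(u)]
```

The two-stage shape (cheap pattern scan over everything, expensive audit only for flagged apps) mirrors the process as described in the answer above.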
- Yes, Congressman. It's going to take many months to do this full process.
- And it's going to — it's going to be an expensive process with a lot of auditors. But we think that this is the right thing to do at this point. You know, before, we'd thought that, when developers told us that they weren't going to sell data, that that was — that that was a good representation. But one of the big lessons that we've learned here is that, clearly, we cannot just take developers' word for it. We need to go and enforce that.
- Yes, Congressman, I'm — I'm aware of the audits that we do. We do audits every other year. They're ongoing. The audits have not found material issues with our privacy programs in place at the company. I think the broader question here is — we have had this FTC consent decree, but we take a broader view of what our responsibility for people's privacy is. And our — our view is that this — what a developer did — that they represented to us that they were going to use the data in a certain way, and then, in their own systems, went out and sold it — we do not believe is a violation of the consent decree. But it's clearly a breach of people's trust. And the standard that we hold ourselves to is not just following the laws that are in place. But we also — we just want to take a broader view of this in protecting people's information.
- Sorry, can you repeat that?
- Congressman, I believe we do provide the audits to the FTC.
- Congressman, not personally, although I'm briefed on all of the audits by our team.
- Congresswoman, we expect it to take many months.
- I hope not.
- Congresswoman, we can follow up with you to make sure you get all that information.
- I don't believe it was a large number. But, as we complete the audits, we will know more.
- A handful.
- Yes, Congresswoman. In 2015, when we first learned about it, we immediately demanded that the app developer and the firms that he sold it to delete the data. And they all represented to us that they had. It wasn't until about a month ago that new reports surfaced that suggested that they hadn't, which is what has kicked off us needing to now go do this full audit and investigation and investigate all these other apps that have come up.
- Congresswoman, we need to complete the investigation and audit before I can confirm that.
- What they represented to us is that they have. But we need to now get into their systems and confirm that before I want to stand up here confidently and say what they've done.
- Congresswoman, the GDPR has a bunch of different, important pieces. One is around offering controls over specific — over every use of people's data.
- That, we're doing. The second is around pushing for affirmative consent and putting a control in front of people that walks people through their — their choices.
- We're going to do that too. The second — although that might be different, depending on the laws in specific countries and different places — but we're going to put a tool at the top of everyone's app that walks them through their settings and helps them understand what is going on.
- Congresswoman, yes, I feel like we have a very important responsibility to outline what the content policies are and the community standards are. This is one of the areas that, frankly, I'm worried we're not doing a good enough job at right now, especially because, as an American-based company where about 90 percent of the people in our community are outside of the U.S., where there are different social norms and different cultures, it's not clear to me that our current situation of how we define community standards is going to be effective for articulating that around the world. So we're looking at different ways to evolve that, and I think that this is one of the more important things that we will do.
- Yes, Congresswoman. I'm not sure specifically what that person was referring to, but I can walk you through what the algorithm change was, if that's useful.
- Congresswoman, the principle that we're a platform for all ideas is something that I care very deeply about. I'm worried about bias, and we take a number of steps to make sure that none of the changes that we make are targeted at — in any kind of biased way. And I'd be happy to follow up with you and go into more detail on that, because I agree that this is a serious issue.
- Congresswoman, it sounds like we made a mistake there, and I apologize for that. And, unfortunately, with the amount of content in our systems and the current systems that we have in place to review, we make a relatively small percent of mistakes in content review. But that can be — that's — that's too many. And this is an area where we need to improve. What I — what I will say is that I wouldn't extrapolate from a few examples, to assuming that the overall system is biased. I — I get how people can — can look at that and draw that conclusion, but I don't think that that reflects the — the way that we're trying to build the system or what we've seen.
- I agree.
- Congressman, I think that that's a good idea and we should follow up on it. From the conversations that I have with my fellow leaders in the tech industry, I — I know that this is something that we all understand that the whole industry is behind on. And Facebook is certainly a big part of that issue. And we care about this not just from the justice angle, but because we know that having diverse, different viewpoints is what will help us serve our community better, which is ultimately what we're here to do. And I think we know that the industry is behind on this and want to ...
- Congressman, this is an issue that we're — we're focused on. We have a broader leadership than just five people. I mean ...
- I understand that.
- Congressman, we will certainly work with you. This is an important issue.
- Congressman, we — we try to include a lot of important information in the diversity updates. I will go discuss that with my team after I get back from this hearing.
- Congressman, that's correct. And a different developer could have built that app.
- Congressman, the big difference between these cases is that, in — in the Kogan case, people signed into that app expecting to share the data with Kogan, and then he turned around and, in violation of our policies and in violation of people's expectations, sold it to a third-party firm — to Cambridge Analytica, in this case.
- I — I think that we — we were very clear about how the platform worked at the time — that anyone could sign into an app and they'd be able to bring their information, if they wanted, and some information from their friends. People had control over that. So, if you wanted, you could — you could turn off the ability to sign into apps, or turn off the ability for your friends to be able to bring your information. The platform worked the way that we had designed it at the time. I think we now know that we should have a more restrictive platform where people cannot also bring information from their friends, and can only bring their own information. But that's the way that system worked at the time.
- Congressman, what I think people are — are rightfully very upset about is that an app developer that people had shared data with sold it to someone else and, frankly, we didn't do enough to prevent that or understand it soon enough.
- And now we have to go through and — and put in place systems that prevent that from happening again and — making sure that we have sufficient controls in place in our ecosystem so, that way, developers can't abuse people's data.
- Congresswoman, I — I believe that people own all of their own content. Where this gets complicated is — let's say I take a photo and I share it with you. Now, is that my photo, or is it your photo? I — I would take the position that it's our photo, which is why we make it so that you can bring — it's — that I can bring that — that photo to another app, if I want, but you can't.
- Sorry. Can you clarify that?
- Congresswoman, all the data that you put in, all the content that you share on Facebook is yours. You control how it's used. You can remove it at any time. You can get rid of your account and get rid of all of it at once. You can ...
- Congresswoman, I — I disagree with that, because one core tenet of our advertising system is that we don't sell data to advertisers. Advertisers don't get access to your data. There's a — there's a core misunderstanding about how that system works, which is that — let's say if you're — if you're a shop, and you're selling muffins, right, it's — you might want to target people in a specific town who might be interested in baking, or — or some demographic. But we don't send that information to you. We just show the message to the right people. And that's a really important, I think, common misunderstanding ...
- ... about how this system works.
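
A minimal sketch of the separation described in the muffin-shop example, with invented structures: the advertiser supplies targeting criteria and a creative, the matching happens inside the platform, and the advertiser never receives the matched users' data:

```python
# Illustrative ad-delivery boundary; schema and data are hypothetical.
from dataclasses import dataclass

@dataclass
class Ad:
    advertiser: str
    creative: str
    town: str       # targeting criterion supplied by the advertiser
    interest: str   # targeting criterion supplied by the advertiser

def deliver(ad: Ad, users: list[dict]) -> list[str]:
    """Match users to the ad inside the platform. The advertiser never sees
    this list or any profile fields, only aggregates such as impression counts."""
    return [u["id"] for u in users
            if u["town"] == ad.town and ad.interest in u["interests"]]

users = [{"id": "u1", "town": "Springfield", "interests": {"baking"}},
         {"id": "u2", "town": "Shelbyville", "interests": {"cycling"}}]
ad = Ad("muffin_shop", "Fresh muffins daily!", "Springfield", "baking")
print(len(deliver(ad, users)))  # advertiser learns reach (1), not who was matched
```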
- Yes, Congresswoman, we run ads. That's the — the business model is running ads. And we use the data that people put into the system in order to make the ads more relevant, which also makes them more valuable. But it's — what we hear from people is that, if they're going to see ads, they want them to be good and relevant ...
- No, you have complete control over that.
- Congressman, yes. This is extremely important. And I think the — the point that you raise is particularly important — that we've heard in — today a number of examples of — where we may have made content review mistakes on conservative content. But I can assure you that there are a lot of folks who think that we make content moderation or content review mistakes on liberal content, as well.
- We will review it and get back to you.
- Yes, Congressman, I do. We were trying to balance two equities: on the one hand, making it so that people had data portability, the ability to bring their data to another app in order to have new experiences in other places, which I think is a value that we all care about. On the other hand, we also need to balance making sure that everyone's information is protected. And I think that we — we didn't get that balance right up front.
- We do not believe it did. But, regardless, we take a broader view of what our responsibility is to protect people's privacy. And, if a developer who people gave their information to — in this case, Aleksandr Kogan — then goes and, in violation of — of his agreement with us, sells the data to Cambridge Analytica, that's a big issue. And I think people have a right to be very upset. I'm upset that that happened. And we need to make sure that we put in place the systems to prevent that from happening again.
- Congresswoman, I believe that we ...
- Congresswoman, I — I'm not sure — I don't think that that's what we're tracking.
- Congresswoman ...
- That's right, that we — that we understand, in order to show which of your friends liked a page ...
- Congressman — Congresswoman ...
- I — I — I actually — if they share it with us. But Congresswoman, overall, I — I'm ...
- Congresswoman, I don't think any of those buttons share transaction data. But broadly, I — I disagree with the characterization.
- Congresswoman, yes, we collect some data for security purposes, and ...
- Congresswoman, everyone has control over how that works.
- Congresswoman, I disagree with that characterization.
- Congresswoman, the primary way that Facebook works is that people choose to share data, and they share content because they're trying to communicate.
- Congresswoman, we just announced two weeks ago that we were going to stop interacting with data brokers, and even though that's an industry norm, to make it so that the advertising can be more relevant ...
- No, Congressman. You're — you're right. I mean, this is ad-based business models have been a common way that people have been able to offer free services for a long time. And our social mission of trying to help connect everyone in the world relies on having a service that can be affordable for everyone; that everyone can use. And that's why the ads business model is in service of the social mission that we have, and you know, I think sometimes that gets lost, but I think that's a really important point.
- Well, Congressman, it would make the ads less relevant. So what we ...
- And — yeah. It would — it would reduce — it would have a number of effects. For people using the services, it would make the ads less relevant to them. For businesses, like the small businesses that use advertising, it would make advertising more expensive, because now they would have to reach — they would have to pay more to reach more people, and efficiently, because targeting helps small businesses be able to afford and — and reach — and reach people as effectively as big companies have typically had the ability to do for a long time. It would affect our revenue some amount too, but I think one — there are a couple of points here that are lost. One is that we already give people a control to not use that data and ads, if they want. Most people don't do that. I think part of the reason for that is that people get that if they are going to see ads, that they want them to be relevant. But the other thing is that our — a lot of what our business — what makes the ads work, or what makes the business good is just that people are very engaged with Facebook. We have more than a billion people who spend almost an hour a day across all our services.
- If you delete your account, we immediately make it so that your account is — is no longer available, once you're — once you're done deleting it. So no one can find you on the service. We wouldn't be able to re-create your account from that. We do have data centers and systems that are redundant, and we have backups in case something bad happens. And, over a number of days, we'll — we'll go through and make sure that we flush all the content out of the system. But, as soon as you delete your account, effectively, that content is — is dismantled and we wouldn't be able to put your account back together if we wanted to.
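
The deletion flow described here, immediate unavailability followed by a multi-day flush of redundant stores, matches the common soft-delete-then-purge pattern. A minimal sketch under that assumption, with an invented schema and grace period:

```python
# Illustrative tombstone-then-purge deletion; all details are hypothetical.
import time

PURGE_DELAY_SECONDS = 14 * 24 * 3600  # invented grace period for the flush

accounts: dict[str, dict] = {}
purge_queue: list[tuple[float, str]] = []

def delete_account(user_id: str) -> None:
    accounts[user_id]["tombstoned"] = True  # instantly invisible on the service
    purge_queue.append((time.time() + PURGE_DELAY_SECONDS, user_id))

def lookup(user_id: str):
    acct = accounts.get(user_id)
    return None if acct is None or acct.get("tombstoned") else acct

def run_purge(now: float) -> None:
    """Background job: irreversibly remove content from primaries, replicas
    and backups once the grace period has elapsed."""
    for due, user_id in list(purge_queue):
        if now >= due:
            accounts.pop(user_id, None)
            purge_queue.remove((due, user_id))

accounts["u1"] = {"name": "example"}
delete_account("u1")
assert lookup("u1") is None  # gone from the product immediately
```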
- Do you want me to ...
- Congressman, I can quickly respond to the first point, too.
- Congressman ...
- ... we ...
- We offer sales support to every campaign.
- Congressman, the — the Trump campaign had sales support ...
- Congressman, I do not, sitting here off the top of my head.
- Congressman, we apply the same standard to all campaigns.
- Congressman ...
- ... what I'm — yes. What I'm saying is that ...
- ... following the same standards.
- Mr. Chairman, do you mind, for the record, if I just answer the first point for — for ...
- ... take 10 seconds.
- When I was referring to the campaigns yesterday, I meant the DNC and RNC. So I may have misspoken, and maybe, technically, that's called the committees. But that — those were the folks who I was referring to.
- Well, Congressman, I view our responsibility as not just building services that people like to use, but making sure that those services are also good for people and good for society overall. At the time, there were a number of questions about whether people seeing content that was either positive or negative on social networks was affecting their mood. And we felt like we had a responsibility to understand whether that was the case, because we don't want to have that effect, right? We don't want to have it so that — we want use of social media and our products to be good for people's well-being. I mean, we continually make changes to — to that effect, including, just recently, this year, we did a number of research projects that showed that when social media is used for building relationships — and so when you're interacting with people, it's associated with a lot of positive effects of — of well-being that you'd expect. It — it makes you feel more connected, less lonely, it correlates with long-term measures of happiness and health. Whereas if you're using social media or the Internet just to passively consume content, then that doesn't have those same positive effects or can even be negative. So we've tried to shift the product more towards helping people interact with friends and family as a result of that. So that's the kind of — an example of the kind of work that we — that we do.
- Yes.
- Yes. The 27,000 number is full-time employees. And the security and content review includes contractors, of which there are tens of thousands. Or will be. Will be by the time that we hire those.
- Well, Congressman, the — the issue with Cambridge Analytica and Aleksandr Kogan happened before we ramped those programs up dramatically. But one thing that I think is important to understand overall is just the sheer volume of content on Facebook makes it so that we can't — no amount of people that we can hire will be enough to review all of the content. We need to rely on and build sophisticated A.I. tools that can help us flag certain content. And we're getting good in certain areas. One of the areas that I mentioned earlier was terrorist content, for example, where we now have A.I. systems that can identify and — and take down 99 percent of the al-Qaeda and ISIS-related content in our system before someone — a human even flags it to us. I think we need to do more of that.
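
A simplified sketch of a proactive review pipeline of the kind described, where a classifier scores content, high-confidence matches come down before any user report, and borderline cases go to human reviewers; the model and thresholds below are placeholders, not Facebook's systems:

```python
# Illustrative content triage; score_violation is a stand-in for a trained model.
REMOVE_THRESHOLD = 0.97   # high confidence: remove proactively
REVIEW_THRESHOLD = 0.60   # borderline: queue for the human review team

def score_violation(text: str) -> float:
    """Placeholder for an ML classifier returning P(policy violation)."""
    return 0.0  # a real system would run model inference here

def triage(post: str) -> str:
    p = score_violation(post)
    if p >= REMOVE_THRESHOLD:
        return "remove"        # taken down before anyone flags it
    if p >= REVIEW_THRESHOLD:
        return "human_review"  # routed to the 20,000-person review operation
    return "keep"              # left up unless users report it
```

The threshold split captures the testimony's point: automation handles the volume no human team could review, while people still decide the ambiguous cases.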
- Yes, Congressman. We have a “download your information” tool. We've had it for years. You can go to it in your settings and download all of the content that you have on Facebook.
- Congressman, that would be correct. If — if we don't have content in there, then that means that — that you don't have it on Facebook. Or you haven't put it there.
- Congressman, my understanding is that all of your information is included in your “download your information.”
- Congressman, we're working on doing that as quickly as possible. I don't have the exact date yet.
- We're working on it.
- Well, Congressman, let me first just set aside that my position isn't that there should be no regulation.
- But regardless of what the laws are that are in place, we have a very strong incentive to protect people's information. This is the core thing that Facebook is, is about 100 billion times a day people come to our service to share a photo or share a message or ...
- Congressman, this is an incredibly high priority for us. What I was saying before, that the core use of the product every day, about 100 billion times, is that people come and try to share something with a specific set of people. That works because people have confidence that if they send a message, it's going to go to the person that they want. If they want to share a photo with their friends, it's going to go to the people they want. That's incredibly important. We've built a — a robust privacy program. We have a chief privacy officer ...
- Congressman, I believe ...
- No, of course not.
- Congressman, I'm not ...
- ... aware of his quote, but I heard that he — that he said something. And let me just speak to this for a second ...
- Congressman, I think that there are a number of areas of content that we need to do a better job policing on our service. Today, the primary way that content (inaudible) — regulation works here, and review, is that people can share what they want openly on the service, and then, if someone sees an issue, they can flag it to us, and then we will review it. Over time, we're shifting to a mode where ...
- Congressman, right now, when people report the posts to us, we will take them down and have people ...
- Congressman, I agree that this is a terrible issue, and, respectfully, when there are tens of billions or 100 billion pieces of content that are shared every day, even 20,000 people reviewing it can't look at everything. What we need to do is build more A.I. tools that can proactively find that content.
- Yes.
- Congressman, yes, of course.
- Yes, Congressman.
- Congressman, that seems like a reasonable principle to me.
- Congressman, that one might be more interesting to debate, because ...
- Yes, Congressman, and they have that ability.
- Congressman, I certainly think that that's an area where we should discuss some sort of oversight.
- Congressman, I think that's — this is an area where some regulation makes sense. You proposed a very specific thing, and I think the details matter.
- Congressman, yes, and I'll make sure that we work with — with you to flesh this out.
- Congressman, in — in general, the way we approach data and law enforcement is, if we have knowledge of imminent harm — physical harm that might happen to someone, we try to reach out to local law enforcement in order to help prevent that. I think that that is less built out around the world. It is more built out in the U.S. So, for example, on that example, we built out specific programs in the U.S.
- We have 3,000 people that are help — that are focused on making sure that, if we detect that someone is at risk of harming themselves, we can get them the appropriate ...
- The — the second category of — of information is when there is a valid legal process served to us. In general, if a government puts something out that's overly broad, we're going to fight back on it. We view our duty as protecting people's information. But, if there is valid service, especially in the U.S., we will, of course, work with law enforcement. In general, we are not in the business of providing a lot of information to the Russian government.
- Sorry, can you repeat that?
- Well, Congressman, in general, countries do not have jurisdiction to have any valid legal reason to request data of someone outside of their country.
- We don't store any data in Russia.
- Yes.
- Sorry, Congressman, could you repeat that?
- Yes.
- Congressman, let me be more precise in my testimony.
- I have no specific knowledge of any data that we've ever given to Russia. In general, we'll work with valid law enforcement requests in different countries, and we can get back to you on what that might mean with Russia, specifically. But I have no knowledge, sitting here, of any time that we would have given them information.
- Yes, Congressman. This is an important issue, and it's — fake accounts, overall, are a big issue, because that's how a lot of the — the other issues that we see around fake news and foreign election interference are happening, as well. So, long-term, the solution here is to build more A.I. tools that find patterns of people using the services that no real person would do. And we've been able to do that in order to take down tens of thousands of accounts, especially related to election interference leading up to the French election, the German election and, last year, the U.S. Alabama Senate state election — Senate election — special election. And that's an area where we should be able to extend that work and develop more A.I. tools that can do this more broadly.
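
A minimal sketch of scoring accounts on behavioral “patterns no real person would do,” using invented features and thresholds; a production system would use learned models over far richer signals:

```python
# Illustrative heuristic scoring of account behavior; all numbers are invented.
from dataclasses import dataclass

@dataclass
class AccountStats:
    account_id: str
    posts_per_hour: float
    friend_requests_per_day: int
    account_age_days: int

def fake_account_score(s: AccountStats) -> float:
    score = 0.0
    if s.posts_per_hour > 20:
        score += 0.5   # inhumanly fast, sustained posting
    if s.friend_requests_per_day > 200:
        score += 0.3   # mass outreach no real person performs
    if s.account_age_days < 2:
        score += 0.2   # brand-new accounts created in bursts
    return score

def takedown_queue(stats: list[AccountStats]) -> list[str]:
    return [s.account_id for s in stats if fake_account_score(s) >= 0.7]
```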
  910. - Congressman, I'm not specifically familiar with that. The
  911. feature that we identified — I think it was a few weeks ago, or
  912. a couple weeks ago, at this point — was a search feature that
  913. allowed people to look up some information that people had
  914. publicly shared on their profiles.
  915. - So names, profile pictures, public information.
  916. - Congressman, in general, we collect data of people who have not
  917. signed up for Facebook for security purposes, to prevent the
  918. kind of scraping that you were just referring to.
  919. - Congressman, I'm not — I'm not familiar with that ...
  920. - I do not know off the top of my head.
  921. - Congressman, I do not off the top of my head, but I can have
  922. our team get back to you afterwards.
  923. - Congressman, anyone can turn off and opt out of any data
  924. collection for ads, whether they use our services or not. But,
  925. in order to prevent people from scraping public information,
  926. which — again, the search feature you brought up only showed
  927. public information — people's names and profiles and things that
  928. they had made public. But, nonetheless, we don't want people
  929. aggregating even public information.
  930. - ... block that, so we need to know when someone is trying to
  931. repeatedly access our services ...
  932. - Congressman, we're working with the right authorities on that,
  933. and I'm happy to answer specific questions here, as well.
  934. - Yes, Congressman. We will certainly follow up with you on this.
  935. Part of the mission of connecting everyone around the world
  936. means that everyone needs to be able to be on the Internet. And,
  937. unfortunately, too much of the Internet infrastructure today is
  938. too expensive for the current business models of carriers to
  939. support a lot of rural communities with the quality of service
  940. that they deserve. So we are building a number of specific
  941. technologies, from planes that can beam down Internet access, to
  942. repeaters and mesh networks to make it so that — that all these
  943. communities can be served. And we'd be happy to follow up with
  944. you on this to ...
  945. - Congressman, without weighing in on that specific piece of
  946. content, let me outline the way that we approach fighting fake
  947. news in general. There are three categories of fake news that we
  948. fight. One are basically spammers. They're economic actors, like
  949. — like the Macedonian trolls that I think we have all heard
  950. about — basically, folks who do not have an ideological goal.
  951. They're just trying to write the most sensational thing they
  952. can, in order to get people to click on it so they can make
  953. money on ads. It's all economics. So the way to fight that is we
  954. make it so they can't run our ads, they can't make money. We
  955. make it so we can detect what they're doing and show it less
  956. in news feed, so they can make less money. When they stop
  957. making money, they just go and do something else, because
  958. they're economically inclined. The second category are basically
  959. state actors, right, so what we've found with Russian
  960. interference. And those people are setting up fake accounts. So,
  961. for that, we need to build A.I. systems that can go and identify
  962. a number of their fake account networks. And, just last week, we
  963. traced the Russian activity back to a specific fake
  964. account network that Russia ran inside Russia to influence Russian
  965. culture and other Russian-speaking countries around them. And we
  966. took down a number of their fake accounts and pages, including a
  967. news organization that was sanctioned by the
  968. Russian government as a Russian state news organization. So
  969. that's a pretty big action. But removing fake accounts is the
  970. other way that we can stop the spread of false
  971. information.
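
[Editor's illustration, not part of the testimony: a minimal Python sketch of the economic lever described above — demoting a known spam publisher in feed ranking and cutting off its monetization so the clicks stop paying. The demotion factor and function names are invented for illustration.]

```python
SPAM_DEMOTION_FACTOR = 0.1   # hypothetical multiplier; a real system would tune this

def adjusted_rank_score(base_score: float, from_known_spammer: bool) -> float:
    """Shrink the feed-ranking score of content from for-profit spam publishers."""
    return base_score * SPAM_DEMOTION_FACTOR if from_known_spammer else base_score

def may_run_ads(from_known_spammer: bool) -> bool:
    """Spam publishers are barred from monetizing entirely."""
    return not from_known_spammer

print(adjusted_rank_score(10.0, from_known_spammer=True))  # 1.0: shown far less
print(may_run_ads(from_known_spammer=True))                # False: no ad revenue
```
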
  972. - Yes, Congressman. That's actually the third category that I was
  973. going to get to next, after economic spammers and state actors
  974. with fake accounts. One of the things we're doing is working
  975. with a number of third parties who — so, if people flag things
  976. as — as false news or — or incorrect, we run them by third-party
  977. fact checkers, who are all accredited by the Poynter
  978. Institute of Journalism. There are ...
  979. - ... firms of all — of all leanings around this, who do this
  980. work, and that's — that's an important part of the effort.
  981. - Congressman, my understanding is that, if there's — if we have
  982. information from you visiting other places, then you have a way
  983. of getting access to that and deleting it and making sure that
  984. we don't store it anymore. In the specific question that the —
  985. the other congressman asked, I think it's possible that we just
  986. didn't have the information that he was asking about in the
  987. first place, and that's why it wasn't there.
  988. - Congressman, I think we're responsible for protecting people's
  989. information, for sure. But one thing that you said that I — that
  990. I want to provide some clarity on ...
  991. - Well, you said earlier — you referenced that you thought that
  992. we were only taking action after this came to light. Actually,
  993. we made significant changes to the platform in 2014 that would
  994. have made this incident with Cambridge Analytica impossible
  995. today. I wish we'd made those changes a couple of
  996. years earlier, because this poll app got people to use it back
  997. in 2013 and 2014. And, if we had made the changes a couple of
  998. years earlier, then we would have — then we ...
  999. - Congressman, if people flag those ads for us, we will take them
  1000. down now.
  1001. - Yes.
  1002. - If people flag them for us, we will look at them as quickly as
  1003. we can ...
  1004. - The ads that are flagged for us, we will review and take down,
  1005. if they violate our policies, which I believe the ones ...
  1006. - ... but — but what I think really needs to happen here is not
  1007. just us reviewing content that gets flagged for us. We need to
  1008. be able to build tools that can proactively go out and identify
  1009. what might be these — these ads for — for opioids, before people
  1010. even have to flag them for us to review.
  1011. - And that's — that's going to be a longer term thing, in order
  1012. to build that solution. So — but, today, if someone flags the
  1013. ads for us, we will take them down.
  1014. - Congressman, that clearly sounds like a big issue and something
  1015. that would violate our policies. I don't have specific knowledge
  1016. of that case, but what I imagine happened, given what you just
  1017. said, is that they reported it to us and one of the people who
  1018. reviews content probably made an enforcement error. And then,
  1019. when you reached out, we probably looked at it again and
  1020. realized that it — that it violated the policies, and took it
  1021. down. We have a number of steps that we need to take to improve
  1022. the accuracy of our enforcement.
  1023. - That's — that's a big issue. And we have to check content
  1024. faster ...
  1025. - ... and we need to — to be able to do better at this. I think
  1026. the same solution to the opioid question that you raised
  1027. earlier, of doing more with automated tools, will lead to both
  1028. faster response times, and more accurate enforcement of the
  1029. policies.
  1030. - Congresswoman, I agree that we need to work on diversity. In
  1031. this specific case, I don't think that that was the issue,
  1032. because we were, frankly, slow to identify the whole Russian
  1033. misinformation operation, and not just that specific example.
  1034. Going forward, we're going to address this by verifying the
  1035. identity of every single advertiser who's running political or
  1036. issue-oriented ads, to make it so that foreign actors or people
  1037. trying to spoof their identity or say that they're someone that
  1038. they're not cannot run political ads or run large pages of the
  1039. type you're talking about.
  1040. - Congresswoman, we announced a change in how we're going to
  1041. review ads and big pages so that, now, going forward, we're
  1042. going to verify the identity and location of every advertiser
  1043. who's running political or issue ads or — and the identities ...
  1044. - That will be in place for these elections.
  1045. - Yes, Congresswoman.
  1046. - No, Congresswoman, it did not.
  1047. - Of course.
  1048. - Congressman, it's a combination of both. So, at the end of the
  1049. day, we have — we have community standards that are written out,
  1050. and try to be very clear about what's — what is acceptable. And
  1051. we have a large team of people. As I said, by the end of this
  1052. year, we're going to have about 20,000 — more than 20,000 people
  1053. working on security and content review across the company. But,
  1054. in order to flag some content quickly, we also build technical
  1055. systems in order to take things down. So, if we see terrorist
  1056. content, for example, we'll flag that, and we can — we can take
  1057. that down.
  1058. - Congressman, for content reviewers specifically, their
  1059. performance is going to be measured by whether they do their job
  1060. accurately, and ...
  1061. - I — I'm — I'm sure we do. As is part of the normal course of —
  1062. of running a company, you — you're hiring and firing people all
  1063. the time to grow your capacity, and — and to ...
  1064. - Congressman, I'm not specifically aware of that case.
  1065. - We will.
  1066. - Yes.
  1067. - Yes.
  1068. - Congressman, we don't sell people's data. So I think that
  1069. that's an important thing to clarify up front. And then, in
  1070. terms of collecting data, I mean, the whole purpose of the
  1071. service is that you can share the things that you want with the
  1072. people around you, right, or — and your friends. So ...
  1073. - Well, Congressman, it would be possible for our business to
  1074. exist without having a developer platform. It would not be
  1075. possible for our business — or our products or our
  1076. services or anything that we do — to exist without having the
  1077. opportunity for people to go to Facebook, put in the content
  1078. that they want to share and who they want to share it with, and
  1079. then go do that. That's the core thing that ...
  1080. - Congressman, for the developer platform changes that we
  1081. announced, they're implemented. We're putting those into place.
  1082. We announced a bunch of specific things. It's on our — our blog,
  1083. and I wrote it in my written testimony, and that stuff is
  1084. happening. We're also going back and investigating every single
  1085. app that had access to a large amount of data before we locked
  1086. down the platform in the past. We will tell people if we find
  1087. anything that misused their data, and we will tell people when
  1088. the investigation is complete.
  1089. - Congressman, part of what I just said is that we're going to do
  1090. an investigation of every single app that had access to a large
  1091. amount of people's data. If you — if you signed into another
  1092. app, then that probably has access to some of your data. And
  1093. part of the investigation that we're going to do is — is to
  1094. determine whether those app developers did anything improper, or
  1095. shared that data further, beyond that. And, if we find anything
  1096. like that, we will tell people that their — that their data was
  1097. misused.
  1098. - No, Congressman. FaceMash was a — a prank website that I
  1099. launched in college, in my dorm room, before I started Facebook.
  1100. There was a movie about this — or it said it was about this. It
  1101. was of unclear truth. And the — the claim that FaceMash was
  1102. somehow connected to the development of Facebook — it isn't. It
  1103. wasn't.
  1104. - It was in 2003.
  1105. - ... took it down, and it actually has nothing to do with
  1106. Facebook.
  1107. - Congressman, that is an accurate description of the prank
  1108. website that I made when I was a sophomore in college.
  1109. - I do.
  1110. - I — I believe — is that Diamond and Silk?
  1111. - Well, Congressman, nothing is unsafe about that. The specifics
  1112. of — of this situation, I — I'm not as up to speed on as — as I
  1113. probably would be ...
  1114. - Congressman, so you're right that, in 2015, when we found out
  1115. that the app developer, Aleksandr Kogan, had sold data to
  1116. Cambridge Analytica, we reached out to them. At that point, we
  1117. demanded that they delete all the data that they had. They told
  1118. us, at that point, that they had done that. And then, a month
  1119. ago, we heard a new report that said that they actually hadn't
  1120. done that.
  1121. - The audit team that we are sending in?
  1122. - The first order of business is to understand exactly what
  1123. happened. And ...
  1124. - Congressman, I do not believe that we have. And ...
  1125. - ... one specific point on this is that our audit of
  1126. Cambridge Analytica — we have paused that in order to cede to
  1127. the U.K. government, which is conducting its own
  1128. investigation, which, of course ...
  1129. - Congressman, yes. What I'm saying is that the U.K. government
  1130. is going to complete its investigation before we go in and do
  1131. our audit. So they will have full access to all the information.
  1132. - Yes, we've — we've — we've paused it, pending theirs.
  1133. - Congressman, yes. We have a document retention policy at the
  1134. company where, for some people, we delete emails after a period
  1135. of time, but we, of course, preserve anything that there's a
  1136. legal hold on.
  1137. - Well, Congressman, I would disagree that we allow it. We
  1138. actually expressly prohibit any developer that people ...
  1139. - Yes, Congressman. Some of it is — is in response to reports
  1140. that we get, and some of it is we do spot checks to make sure
  1141. that the apps are actually doing what they — what they say
  1142. they're doing. And, going forward, we're going to increase the
  1143. number of audits that we do, as well.
  1144. - Congressman ...
  1145. - ... Congressman, you have control over what we do for — for ads
  1146. and the information collection around that. On security, there
  1147. may be specific things about how you use Facebook, even if
  1148. you're not logged in, that we — that we keep track of, to make
  1149. sure that people aren't abusing the systems.
  1155. - Congressman, we're not collecting any information verbally on
  1156. the microphone, and we don't have contracts with anyone else who
  1157. is. The only time that we might use the microphone is when
  1158. you're recording a video or doing something where you
  1159. intentionally are trying to record audio. But we don't have
  1160. anything that is trying to listen to what's going on in the
  1161. background.
  1162. - Congressman, we do. I don't think we have a policy that says
  1163. that your phone can't be on. And, again,
  1164. Facebook doesn't do this, and I'm not familiar
  1165. with other companies that do, either. My understanding is
  1166. that a lot of these cases that you're talking about are a
  1167. coincidence, or someone is — might be talking about something,
  1168. but then they also go to a website or interact with it on
  1169. Facebook, because they were talking about it, and then maybe
  1170. they'll see the ad because of that, which is a much clearer
  1171. statement of the — the intent.
  1172. - Yes.
  1173. - Congressman, the way this works is — let's say you have a
  1174. business that is selling skis, Okay, and you have on your
  1175. profile that you are interested in skiing. But let's say you
  1176. haven't made that public, but you share it with your — with your
  1177. friends, all right? So, broadly, we don't tell the advertiser
  1178. that — “Here's a list of people who like skis.” They just say,
  1179. “Okay, we're trying to sell skis. Can you reach people who like
  1180. skis?” And then we match that up on our side, without sharing
  1181. any of that information with the advertisers.
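
[Editor's illustration, not part of the testimony: a minimal Python sketch of the matching flow described in the skiing example — the advertiser supplies only a targeting criterion, the platform resolves it against private profile data on its own side, and the advertiser gets back aggregate numbers, never a list of users. All names and data structures are hypothetical.]

```python
# Private interest data stays on the platform's side.
user_interests = {
    "alice": {"skiing", "cooking"},   # shared with friends only, not public
    "bob":   {"chess"},
    "carol": {"skiing"},
}

def match_audience(interest: str) -> set:
    """Resolve the advertiser's criterion against private data, server-side."""
    return {user for user, interests in user_interests.items()
            if interest in interests}

def run_campaign(interest: str) -> int:
    """The advertiser receives only an aggregate count, never identities."""
    audience = match_audience(interest)
    # ... ads are delivered to `audience` entirely inside the platform ...
    return len(audience)

print(run_campaign("skiing"))  # 2 -- 'alice' and 'carol' are never revealed
```
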
  1182. - Congressman, no. And I — I also would push back on the idea
  1183. that we're giving them access to the data. We allow them to
  1184. reach people who have said that on Facebook, but we're not
  1185. giving them access to data.
  1186. - Congressman, I'm not sure I understand the question. Can you —
  1187. can you give me an example of what you mean?
  1188. - Yes. So, Congressman, my understanding is that the targeting
  1189. options that are — that are available for advertisers are
  1190. generally things that are based on what people share. Now, once
  1191. an advertiser chooses how they want to target something,
  1192. Facebook also does its own work to help rank and determine which
  1193. ads are going to be interesting to which people.
  1194. - So we may use metadata or other behavioral signals about what
  1195. you've shown you're interested in on news feed or other places, in order
  1196. to make our systems more relevant to you. But that's a little
  1197. bit different from giving that as an option to an advertiser, if
  1198. that makes sense.
  1199. - Congressman, I — I agree that we should be a platform for all
  1200. ideas, and that we should focus on that.
  1201. - I ...
  1202. - Congressman, yes. In general, I mean, I think that people own
  1203. their ...
  1204. - Congressman, these sound relatively accurate.
  1205. - Congressman, I don't think so. There are — there are a couple
  1206. of big issues here. One is what happened specifically with
  1207. Cambridge Analytica — how were they able to buy data from a
  1208. developer that people chose to share it with? And how do we make
  1209. sure that that can't happen again?
  1210. - People had it on Facebook, and then chose to share theirs and
  1211. some of their friends' information with this developer, yes.
  1212. - Congressman, we just recently announced that we were stopping
  1213. working with data brokers as part of the ad system. It's ...
  1214. - It's an industry-standard ad practice, and, recently,
  1215. upon examining all of our systems, we decided that's not a thing
  1216. that we want to be a part of, even if everyone else is doing it.
  1217. - Yes, until we announced that we're shutting it down. Yes.
  1218. - Congressman, I don't believe that. I think that there may have
  1219. been a specific factual inaccuracy that we ...
  1220. - ... that specific point, yes.
  1221. - Congressman, you're right that we apologized after they posted
  1222. the story. They had most of the details of what was
  1223. right there.
  1224. - And I don't think we objected to that.
  1225. - There was a specific thing ...
  1226. - Congressman, I'm — I am definitely committed to taking a
  1227. broader view of our responsibility. That's what my testimony is
  1228. about, making sure that we don't just give people tools, but
  1229. make sure that they're used for good.
  1230. - Congresswoman, thanks for the question. Terrorist content and
  1231. propaganda have no place in our network, and we have developed a
  1232. number of tools that have now made it so that 99 percent of the
  1233. ISIS and al-Qaeda content that we take down is identified by our
  1234. systems and taken down before anyone even flags it
  1235. for us. So that's an example of removing harmful content that
  1236. we're proud of, and I think is a model for other types of
  1237. harmful content as well.
  1238. - Congressman, it's a good question, and it's a combination of
  1239. technology and people. We have a counterterrorism team at
  1240. Facebook.
  1241. - Two hundred people are just focused on counterterrorism, and
  1242. there are other content reviewers who are reviewing content that
  1243. gets flagged to them as well. So those are folks who are working
  1244. specifically on that. I think we have capacity in 30 languages
  1245. that we're working on. In addition to that we have a number of
  1246. A.I. tools that we're developing, like the ones that I mentioned
  1247. that can proactively go flag the content.
  1248. - Yes so there's ...
  1249. - Yes.
  1250. - So we identify what might be the patterns of communication or
  1251. messaging that they might put out and then design systems that
  1252. can proactively identify that and flag those for our teams. That
  1253. way we can go and take those down.
  1254. - Thank you. We will. And, Mr. Chairman, if you don't mind before
  1255. we go to the next question, there was something I wanted to
  1256. correct in my testimony from earlier, when I went back and
  1257. talked to my team afterwards.
  1258. - This was in response to a question
  1259. about whether web logs that we had about a person would
  1260. be in “download your information.” I had said that they were.
  1261. And I clarified with my team that, in fact, the web logs are not
  1262. in “download your information.” We only store them temporarily,
  1263. and we convert the web logs into a set of ad interests that you
  1264. might be interested in, and we put that in the
  1265. “download your information” instead, and you have complete
  1266. control over that. So I just wanted to clarify that one for the
  1267. record.
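
[Editor's illustration, not part of the testimony: a minimal Python sketch of the correction above — raw browsing logs are held only temporarily and reduced to coarse ad-interest categories, and it is the derived interests, not the logs, that appear in “download your information.” The site-to-category mapping is invented for illustration.]

```python
SITE_TO_CATEGORY = {                     # hypothetical mapping
    "skis.example.com":    "winter sports",
    "recipes.example.com": "cooking",
}

def derive_ad_interests(web_logs: list) -> set:
    """Reduce raw visited-site logs to coarse ad-interest categories."""
    return {SITE_TO_CATEGORY[url] for url in web_logs if url in SITE_TO_CATEGORY}

logs = ["skis.example.com", "recipes.example.com", "other.example.com"]
ad_interests = derive_ad_interests(logs)  # this derived set is what is kept,
                                          # surfaced to the user, and controllable
del logs                                  # the raw logs are only held temporarily
print(ad_interests)                       # {'winter sports', 'cooking'}
```
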
  1268. - Congressman, in retrospect, it was a mistake, and
  1269. I wish we had notified and told people about it.
  1270. - The reason why we didn't ...
  1271. - Yes, Congressman, I don't believe that — that we necessarily
  1272. had a legal obligation to do so. I just think it was probably
  1273. ...
  1274. - ... I think that it was the right thing to have done. The
  1275. reason why we didn't do it at the time ...
  1276. - Absolutely.
  1277. - Congressman, regardless of what the laws or regulations are
  1278. that are in place, we take a broader view of our
  1279. responsibilities around privacy, and I think that we should have
  1280. notified people, because it would have been the right thing to
  1281. do, and I've committed ...
  1282. - Congressman, I think it's an idea that deserves a lot of
  1283. consideration. I think — I — I'm not the type of person who
  1284. thinks that there should be no regulation, especially because
  1285. the Internet is getting to be so important in people's lives
  1286. around the world. But I think the details on this really matter,
  1287. and whether it's an agency, or a law that is passed, or the FTC
  1288. has certain abilities — I think that is all something that
  1289. we should be ...
  1290. - Congressman, we look forward to following up, too.
  1291. - Congressman, I believe that people should have the ability to
  1292. choose to share their data how they want, and they need to
  1293. understand how that's working. But I — I agree with what you're
  1294. saying, that people want to have the ability to move their data
  1295. to another app, and we want to give them the tools to — to do
  1296. that.
  1297. - Yes, Congressman. On — on most devices, the way the operating
  1298. system is architected would prevent something that you do in
  1299. another app like Google from being visible to — to the Facebook
  1300. app.
  1301. - Congressman, yes, we — we collect information to make sure that
  1302. the ad experience on Facebook can be relevant and valuable to
  1303. small businesses ...
  1304. - ... and — and others who want to reach people.
  1305. - Congressman, yes, there is. There is a setting, so if you don't
  1306. want any data to be collected around advertising, you can — you
  1307. can turn that off, and then we won't do it. In general, we offer
  1308. a lot of settings over every type of information that you might
  1309. want to share on Facebook, in every way that you might interact
  1310. with the system, from here's the content that you put on your
  1311. page, to here is who can see your interests, to here's how you
  1312. might show up in — in search results if people look for you, to
  1313. here's how the — how you might be able to sign into developer
  1314. apps, and login with Facebook, and — and advertising. And we —
  1315. we try to make the controls as easy to understand as possible.
  1316. You know, it's a — it's a broad service. People use it for a lot
  1317. of things, so there are a number of controls, but we try to make
  1318. it as easy as possible, and — and to put those controls in front
  1319. of people so that they can configure the experience in a way
  1320. that they want.
  1321. - Thank you.
  1322. - Congressman, I think that that makes sense to discuss, and I
  1323. agree with the broader point that I think you're making, which
  1324. is that the Internet and technology overall is just becoming a
  1325. much more important part of all of our lives. The — the
  1326. companies in the technology industry are — are growing ...
  1327. - Congressman, it's certainly something that we can consider,
  1328. although one thing that I would push back on is I think it is
  1329. often characterized as maybe these mistakes happen because
  1330. there's some conflict between what people want and our business
  1331. interests. I actually don't think that's the case. I think a lot
  1332. of these hard decisions come down to different interests between
  1333. different people. So for example, on the one hand people want
  1334. the ability to sign into apps and bring some of their
  1335. information and bring some of their friends' information in
  1336. order to have a social experience. And on the other hand,
  1337. everyone wants their information locked down and completely
  1338. private. And the question is — it's not a business question as
  1339. much as which of those equities do you weigh more?
  1340. - Congressman, well there are — there are a lot of things that
  1341. the — that the Europeans do, and — and I think that — I think
  1342. that GDPR in general is — is going to be a very positive step
  1343. for the Internet, and a lot of the things it codifies
  1344. are things that we've done for a long time. Some of them are
  1345. things that — that I think would be — would be good steps for us
  1346. to take. So for example, the controls that this requires
  1347. are generally privacy controls that we've offered
  1348. around the world for years. Putting the tools in front of people
  1349. repeatedly, not just having them in settings, but putting them
  1350. in front of people, making sure that people
  1351. understand what the controls are, and getting affirmative
  1352. consent — I think it's a good thing to do that we've done
  1353. periodically in the past, but I think it makes sense to do more,
  1354. and I think that's something the GDPR will — will require us to
  1355. do and — and will be positive.
  1356. - I would — I need to think about that more.
  1357. - I did.
  1358. - Congressman, I'm not — I'm not specifically aware of — of that
  1359. threat, but in general, there are a number of national security
  1360. and election integrity-type issues that we focus on, and we try
  1361. to take a very broad view of that. And the more input that we
  1362. can get from the intelligence community as well, encouraging us
  1363. to — to look into specific things, the more effectively we can
  1364. do that work.
  1365. - Congressman, this is an important question. So there are a
  1366. couple of standards. The strongest one is things that will cause
  1367. physical harm, or threats of physical harm, but then there is a
  1368. broader standard of — of hate speech and speech that might make
  1369. people feel just broadly uncomfortable or unsafe in the
  1370. community.
  1371. - Congressman, that's a very important question, and I think is —
  1372. is one that we struggle with continuously, and the question of,
  1373. what is hate speech versus what is legitimate political speech
  1374. is, I — I think, something that we get criticized both from the
  1375. left and the right on what the definitions are that we have.
  1376. It's — it is — it's nuanced, and what we try to — we try to lay
  1377. this out in our community standards, which are public documents,
  1378. that we can make sure that you and your — your office get to
  1379. look through the definitions on this, but this is an area where
  1380. I think society's sensibilities are also shifting quickly, and
  1381. it's also very different and ...
  1382. - I agree.
  1383. - Congressman, thank you. So, before 2014, when we announced the
  1384. change, someone could sign into an app and share some of
  1385. their data, but also could share some basic information about
  1386. their friends. And in 2014 the major change was we said, now
  1387. you're not going to be able to share any information about your
  1388. friends. So if you and your friend both happen to be playing a
  1389. game together, or using an app to listen to music together,
  1390. then that app could have some information from both of you
  1391. because you both had signed in and authorized that app. But
  1392. other than that, people wouldn't be able to share information
  1393. from their friends. So the basic issue here — where 300,000
  1394. people used this poll app, the developer ultimately
  1395. sold the data to Cambridge Analytica, and Cambridge Analytica had
  1396. access to as many as 87 million people's information — wouldn't be
  1397. possible today. Today if 300,000 people used an app, the app
  1398. might have information about 300,000 people.
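
[Editor's illustration, not part of the testimony: a minimal Python sketch contrasting the pre- and post-2014 platform rules described above — before the change, an app a user authorized could also read basic data about that user's friends; after it, the app sees only the people who themselves signed in. The data shapes are hypothetical.]

```python
friends = {"alice": ["bob", "carol"], "bob": ["alice"], "carol": ["alice"]}

def app_visible_users(signed_in: set, pre_2014_rules: bool) -> set:
    """Whose data can an app touch under each policy era?"""
    visible = set(signed_in)                 # always: the people who consented
    if pre_2014_rules:
        for user in signed_in:
            visible.update(friends[user])    # pre-2014: plus their friends
    return visible

print(app_visible_users({"alice"}, pre_2014_rules=True))   # {'alice', 'bob', 'carol'}
print(app_visible_users({"alice"}, pre_2014_rules=False))  # {'alice'}
```
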
  1399. - Thank you.
  1400. - Congressman, sitting here today, I don't remember a lot of
  1401. the specifics of early on, but we saw generally a bunch of
  1402. app developers who were asking for permissions to access
  1403. people's data in ways that weren't connected to the functioning
  1404. of an app. So they'd just say, Okay, if you want to log in to my
  1405. app, you would have to share all this content, even though
  1406. the app doesn't actually use that in any reasonable way. So we
  1407. looked at that and said, hey, this isn't right. We
  1408. should review these apps and make sure that if an app
  1409. developer's going to ask someone to access their data, they
  1410. actually have a reason why they want access to it. And over
  1411. time, we made a series of changes that culminated in
  1412. the major change in 2014 that I referenced before, where
  1413. ultimately we made it so now a person could sign in but not
  1414. bring their friends' information with them anymore.
  1415. - Congressman, it would be difficult to ever guarantee
  1416. that there are no bad
  1417. actors. Every problem around security is sort of an arms
  1418. race, where you have people who are trying to abuse systems, and
  1419. our responsibility is to make that as hard as possible and to
  1420. take the — the necessary precautions for a company of our scale.
  1421. And I think that the responsibility that we have is growing with
  1422. our scale and we need to make sure that we ...
  1423. - Congressman, yes, politically. Although, when I hear that,
  1424. what I hear is kind of normal political
  1425. speech. We certainly are not going to allow ads for terrorist
  1426. content, for example, so ...
  1427. - ... banning those views.
  1428. - Sorry, could you repeat that?
  1429. - Congresswoman, so when you're using the service, if you share a
  1430. photo, for example, and you say “I only want my friends to see
  1431. it,” then in news feed and Facebook, only your friends are going
  1432. to see it. If you then go to a website and then you want to sign
  1433. into that website, that website can ask you and say “Hey, here
  1434. are the things that — that I want to get access to in order for
  1435. you to use the website.” If you sign in after seeing that screen
  1436. where the website is asking for certain information, then you
  1437. are also authorizing that website to have access to that
  1438. information. If you've turned off the platform completely, which
  1439. is what the control is that you have on the left, then you
  1440. wouldn't be able to sign in to another website. You'd have to go
  1441. reactivate this before that would even work.
  1442. - Congresswoman, I think that these, that the settings when
  1443. you're signing into an app are quite clear in terms of, every
  1444. time you go to sign into an app, you have to go through a whole
  1445. screen that says “Here's the app, here's your friends who use
  1446. it, here are the pieces of information that it would like to
  1447. have access to.” You make a decision whether you sign in, yes or
  1448. no. And until you say “I want to sign in,” nothing gets shared.
  1449. Similarly, in terms of sharing content, every single time that
  1450. you go to upload a photo, you have to make a decision — it's
  1451. right there at the top, it says “are you sharing this with your
  1452. friends or publicly or with some group,” and every single time
  1453. that's — that's quite clear. So in those cases, yes, I think
  1454. that this is quite clear.
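
[Editor's illustration, not part of the testimony: a minimal Python sketch of the sign-in gate described above — an app declares the permissions it wants, the user sees that exact list on a consent screen, and nothing is shared until the user affirmatively accepts; turning the platform off disables third-party sign-in altogether. All identifiers are hypothetical.]

```python
def request_sign_in(app_name, requested_permissions, platform_enabled, user_accepts):
    """Return the granted data scopes, or None if sign-in cannot proceed."""
    if not platform_enabled:
        return None                  # platform turned off: no third-party sign-in
    # The consent screen lists exactly what the app asked for.
    if not user_accepts(app_name, requested_permissions):
        return None                  # user declined: nothing is shared
    return {perm: True for perm in requested_permissions}

grant = request_sign_in(
    "SkiTracker",                             # hypothetical app
    ["public_profile", "email"],
    platform_enabled=True,
    user_accepts=lambda app, perms: True,     # stands in for the consent dialog
)
print(grant)  # {'public_profile': True, 'email': True}
```
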
  1455. - Congresswoman, we typically do two things. We have a settings
  1456. page that has all of your settings in one place in case you want
  1457. to go and play around or configure your settings. But the more
  1458. important thing is putting the settings in line when you're
  1459. trying to make a decision. So if you're going to share a photo
  1460. now, we think that your setting about who you want to share that
  1461. photo with should be in line right there. If you're going to
  1462. sign into an app, we think that the — it should be very clear
  1463. right in line when you're signing into the app what permissions
  1464. that app is asking for. So we do both. It's both in one place in
  1465. settings if you want to go to it, and it's in line in the
  1466. relevant place.
  1467. - Can you repeat that?
  1468. - What was the other piece?
  1469. - Well, Congresswoman, I think that privacy is not something that
  1470. you can ever fully solve — our understanding of the issues between
  1471. people and how they interact online only grows over time. So I
  1472. think we'll figure out what the social norms are and the rules
  1473. that we want to put in place. Then five years from now, we'll
  1474. come back and we'll have learned more things and either that'll
  1475. just be that social norms have evolved and the company's
  1476. practices have evolved or we'll put rules in place. But I think
  1477. that our understanding of this is going to evolve over quite a
  1478. long time. So I would expect that even if a state like
  1479. California's forward-leaning, that's not necessarily going to
  1480. mean that we fully understand everything or have solved all the
  1481. issues.
  1482. - Congresswoman, I don't know the answer to that off the top of my
  1483. head, but we'll get back to you.
  1484. - I believe we've served the like button on pages more than that,
  1485. but I don't know the number of pages that actively have the like
  1486. button on.
  1487. - I don't know the answer to that exactly off the top of my head
  1488. either, but that's something that we can follow up with you on.
  1489. - Congresswoman, you're asking some specific stats that I don't
  1490. know off the top of my head, but we can follow up with you and
  1491. get back to you on all of these.
  1492. - Congresswoman, I will talk to my team and we will follow up.
  1493. - Congresswoman, as I've said a number of times, we're now going
  1494. to investigate every single app that had access to a large amount of
  1495. people's information in the past, before we locked down the
  1496. platform. I do imagine that we will find some apps that
  1497. were either doing something suspicious or misused people's data.
  1498. If we find them, then we will ban them from the platform, take
  1499. action to make sure they delete the data and make sure that
  1500. everyone involved is informed.
  1501. - As soon as we find them.
  1502. - Yes, Congressman. So there are a few parts of GDPR that I think
  1503. are important and — and good. One is making sure that people
  1504. have control over how each piece of information that they share
  1505. is used. So people should have the ability to know what a company
  1506. knows about them, to control and have a setting about who can
  1507. see it and to be able to delete it whenever they want. The
  1508. second set of things is making sure that people actually
  1509. understand what the tools are that are available. So not just
  1510. having it in some settings page somewhere, but put the tools in
  1511. front of people so that they can make a decision. And that both
  1512. builds trust and makes sure people's experiences are
  1513. configured in the way that they want. That's something that
  1514. we've done a number of times over the years at Facebook. But
  1515. with GDPR, we will now be doing more of that around the whole world.
  1516. The third piece is there are some very sensitive technologies
  1517. that I think are important to enable innovation around, like face
  1518. recognition, but that you want to make sure that you get special
  1519. consent for. Right? If we make it too hard for
  1520. American companies to innovate in areas like facial recognition,
  1521. then we will lose to Chinese companies and other companies
  1522. around the world that are able to innovate in that area.
  1523. - Congressman, I think that that's a — that's a good question.
  1524. And I think that this is something that
  1525. people should have control over — how it is used —
  1526. and we're going to be rolling out and asking people whether
  1527. they want us to use it for them around the world as part of this
  1528. upcoming push. But I think in general, for
  1529. sensitive technologies like that, I do think you want a special
  1530. consent.
  1531. - And I think that's a — that would be a valuable thing to
  1532. consider.
  1533. - Congressman ...
  1534. - Congressman, I'm not familiar with how the term is legally
  1535. used.
  1536. - Well, Congressman, let me put it this way: there is content
  1537. that we fund, specifically in video today.
  1538. - And when we're commissioning a video to be created, then I
  1539. certainly think we have full responsibility ...
  1540. - ... of owning — of owning that content.
  1541. - But the vast majority of the content on Facebook is not
  1542. something that we commissioned. For that, I think our
  1543. responsibility is to make sure that the content on Facebook is
  1544. not harmful, that people are seeing things that are relevant to
  1545. them and that encourage interaction and building relationships
  1546. with the people around them. And that, I think, is — is the
  1547. primary responsibility that we have.
  1548. - Thank you.
  1549. - I did not know that specifically.
  1550. - Yes.
  1551. - Yes, especially among certain demographics.
  1552. - Congressman, I will make sure that someone is there.
  1553. (Inaudible).
  1554. - Congressman, I was not specifically aware of that, but I think
  1555. we — we know that — that there are issues with content like
  1556. this, that we need more proactive monitoring for.
  1557. - Congressman, I have not heard that.
  1558. - Congressman, I believe that has been an issue for a long time.
  1559. - Congressman, yes, we take this very seriously. That's a big part
  1560. of the reason why, on these content issues overall, by the end of
  1561. this year, we're going to have more than 20,000 people working
  1562. on security and content review. And we need to build more tools,
  1563. too.
  1564. - Well Congressman, I think that we can all agree that certain
  1565. content like terrorist propaganda should have no place on our
  1566. network. And the First Amendment, my understanding of it, is
  1567. that that kind of speech is allowed in the world. I just don't
  1568. think that it is the kind of thing that we want to allow to
  1569. spread on the Internet. So once you get into that, you're
  1570. already deciding that you take this value that
  1571. you care about — safety — and that we don't want people to be
  1572. able to spread information that can cause harm. And I think that
  1573. our general responsibility is to allow the
  1574. broadest spectrum of free expression that we can ...
  1575. - Well Congressman, I think that we — we make a number of
  1576. mistakes in content review today that I don't think only focus
  1577. on one political persuasion. And I think it's unfortunate that
  1578. when those happen, people think that we're focused on them. And
  1579. it happens in different political groups, and it's — we have ...
  1580. - Thank you.
  1581. - Congressman, I agree that this is very important, and I — I
  1582. miscommunicated if I left the impression that we weren't
  1583. proactively going to work on tools to take down this content,
  1584. and were only going to rely on people to flag it for us. Right
  1585. now, we have efforts underway to focus not only on ads,
  1586. which have been the subject of the majority of the questions, but
  1587. a lot of people share this stuff in groups, too, and the — the
  1588. free part of the products that aren't paid, and we need to get
  1589. that content down, too. I understand how big of an issue this
  1590. is. Unfortunately, the enforcement isn't — isn't perfect. We do
  1591. need to make it more proactive, and I'm committed to doing that.
  1592. - Congressman, let me answer that in a second, and before —
  1593. before I get to that, on your last point, the content reviewers
  1594. who we have are not primarily located in Silicon
  1595. Valley. So I think that was an important point,
  1596. and ...
  1597. - ... I do worry about the general bias of people in Silicon
  1598. Valley. But the — the majority of the folks doing content review
  1599. are — are around the world in different places. To your question
  1600. about net neutrality, I think that there's a big difference
  1601. between Internet service providers and platforms on top of them.
  1602. And the big reason is that, well, I just think about my own
  1603. experience. When I was starting Facebook, I had one choice of an
  1604. Internet service provider. And if I had to potentially pay extra
  1605. in order to make it so that people could have Facebook as an
  1606. option for something that they used, then I'm not sure that we'd
  1607. be here today. For platforms, there are just many more. So it may be
  1608. true that a lot of people choose to use Facebook. The average
  1609. American, I think, uses about eight different communication and
  1610. social network apps to stay connected to people. And it is just as
  1611. clearly true that there are more choices among
  1612. platforms. So even though they can reach large scale, I think
  1613. the pressure of just having one or two in a place does require
  1614. us to think a little bit ...
  1615. REP. MARSHA BLACKBURN (R-TENN.)
  1616. - Thank you, Mr. Chairman. Mr. Zuckerberg, I tell you, I think
  1617. your cozy community, as Dr. Mark Jameson recently said, is
  1618. beginning to look a whole lot like “The Truman Show,” where
  1619. people's identities and relationships are made available to
  1620. people that they don't know. And then that data is crunched and
  1621. it is used and they are fully unaware of this. So I've got to
  1622. ask you, I think what we're getting to here is, who owns the
  1623. virtual you? Who owns your presence online? And I'd like for you
  1624. to comment. Who do you think owns an individual's presence
  1625. online? Who owns their virtual you? Is it you or is it them?
  1626. SCHAKOWSKY
  1627. - Years?
  1628. - Okay. I want to ask you — yesterday — following up on your
  1629. response to Senator Baldwin's question, you said yesterday that
  1630. Kogan also sold data to other firms. You named Eunoia
  1631. Technologies. How many are there total? And what are their
  1632. names? Can we get that? And how many are there
  1633. total?
  1634. - Yeah, but order of magnitude?
  1635. - What's a large number?
  1636. - Has Facebook tried to get those firms to delete user data and
  1637. its derivatives?
  1638. - And were derivatives deleted?
  1639. - You are looking at the ...
  1640. - So Mr. Green asked about the General Data Protection Regulation
  1641. that's going to go into effect in the E.U. on May 25th. And your
  1642. response was — let me ask: Is your response that exactly the
  1643. protections that are guaranteed, not the — what did he say?
  1644. Yeah, not just the controls, but all the rights that are
  1645. guaranteed under the General Data Protection Regulation will be
  1646. applied to Americans, as well?
  1647. - Right, that's one. Yes.
  1648. - Exactly.
  1649. - It sounds like it will not be exact. And let me say, as we look
  1650. at the distribution of information ...
  1651. - ... that who's going to protect us from Facebook is also a
  1652. question. Thank you. I yield back.
  1653. - Okay, I'm going to — I consider Billy Long a good friend. Let
  1654. me just say that I don't think it was a breach of decorum, and I
  1655. just take issue with his saying that a very modest bill that
  1656. I've introduced is an overreach. That's all.
  1657. LUJAN
  1658. - Well ...
  1659. - If I may, Mr. Zuckerberg, I will recognize that Facebook did
  1660. turn this feature off. My question, and the reason I'm asking
  1661. about 2013 and 2015, is Facebook knew about this in 2013 and
  1662. 2015, but you didn't turn the feature off until Wednesday of
  1663. last week — the same feature that Mr. Kinzinger just talked
  1664. about, where this is essentially a tool for these malicious
  1665. actors to go and steal someone's identity and put the finishing
  1666. touches on it. So, again, you know, one of your mentors, Roger
  1667. McNamee, recently said your business is based on trust, and you
  1668. are losing trust. This is a trust question. Why did it take so
  1669. long, especially when we're talking about some of the other
  1670. pieces that we need to get to the bottom of? Your failure to act
  1671. on this issue has made billions of people potentially vulnerable
  1672. to identity theft and other types of harmful, malicious actors.
  1673. So, on to another subject, Facebook has detailed profiles on
  1674. people who have never signed up for Facebook. Yes or no?
  1675. - So these are called shadow profiles? Is that what they've been
  1676. referred to by some?
  1677. - I'll refer — I'll refer to them as shadow profiles for today's
  1678. hearing. On average, how many data points does Facebook have on
  1679. each Facebook user?
  1680. - So the average for non-Facebook platforms is 1,500. It's been
  1681. reported that Facebook has as many as 29,000 data points for an
  1682. average Facebook user. Do you know how many points of data
  1683. Facebook has on the average non-Facebook user?
  1684. - I appreciate that. It's been admitted by Facebook that you do
  1685. collect data points on non-Facebook users. So my question is, can
  1686. someone who does not have a Facebook account opt out of
  1687. Facebook's involuntary data collection?
  1688. - But — so ...
  1689. - If I may, Mr. Zuckerberg, I'm about out of time. It may
  1690. surprise you that we have not talked about this a lot today. You
  1691. said everyone controls their data, but you're collecting data on
  1692. people that are not even Facebook users, that have never signed
  1693. a consent, a privacy agreement — and you're collecting their
  1694. data. And it may surprise you that, on Facebook's page, when you
  1695. go to “I don't have a Facebook account and would like to request
  1696. all my personal data stored by Facebook,” it takes you to a form
  1697. that says, “Go to your Facebook page, and then, on your account
  1698. settings, you can download your data.” So you're directing
  1699. people who don't have access — don't even have a Facebook page
  1700. to have to sign up for a page to reach their data. We've got to
  1701. fix that. The last question that I have is have you disclosed to
  1702. this committee or to anyone all the information Facebook has
  1703. uncovered about Russian interference on your platform?
  1704. - Thank you Mr. Chair.
  1705. OLSON
  1706. - One last question. I believe I've heard you employ 27,000
  1707. people thereabouts. Is that correct?
  1708. - I've also been told that about 20,000 of those people,
  1709. including contractors, do work on data security. Is that
  1710. correct?
  1711. - Okay, so roughly at least half your employees are dedicated to
  1712. security practices. How can Cambridge Analytica happen with so
  1713. much of your workforce dedicated to these causes? How'd
  1714. that happen?
  1715. PALLONE
  1716. - And their justification that those protections were not needed
  1717. because the Federal Trade Commission has everything under
  1718. control — well, this latest disaster shows just how wrong the
  1719. Republicans are. The FTC used every tool Republicans have been
  1720. willing to give it, and those tools weren't enough. And that's
  1721. why Facebook acted like so many other companies, and reacted
  1722. only when it got bad press. We all know this cycle by now. Our
  1723. data is stolen. The company looks the other way. Eventually,
  1724. reporters find out, publish a negative story, and the company
  1725. apologizes. And Congress then holds a hearing, and then nothing
  1726. happens. By not doing its job, this Republican-controlled
  1727. Congress has become complicit in this nonstop cycle of privacy
  1728. by press release. And this cycle must stop, because the current
  1729. system is broken. So I was happy to hear that Mr. Zuckerberg
  1730. conceded that his industry needs to be regulated, and I agree.
  1731. We need comprehensive privacy and data security legislation. We
  1732. need baseline protections that stretch from Internet service
  1733. providers, to data brokers, to app developers and to anyone else
  1734. who makes a living off our data. We need to figure out how to
  1735. make sure these companies act responsibly, even before the press
  1736. finds out. But, while securing our privacy is necessary, it's
  1737. not sufficient. We need to take steps immediately to secure our
  1738. democracy. We can't let what happened in 2016 happen again. And,
  1739. to do that, we need to learn how Facebook was caught so flat-
  1740. footed in 2016. How was it so blind to what the Russians and
  1741. others were doing on its systems? Red flags were everywhere. Why
  1742. didn't anyone see them? Or were they ignored? So today's hearing
  1743. is a good start. But we also need to hold additional hearings
  1744. where we hold accountable executives from other tech companies,
  1745. Internet service providers, data brokers and anyone else that
  1746. collects our information. Now, Congresswoman Schakowsky from
  1747. Illinois and I introduced a bill last year that would require
  1748. companies to implement baseline data security standards. And I
  1749. plan to work with my colleagues to draft additional legislation.
  1750. But I have to say, Mr. Chairman, it's time for this committee
  1751. and this Congress to pass comprehensive legislation to prevent
  1752. incidents like this in the future. My great fear is that we have
  1753. this hearing today, there's a lot of press attention — and, Mr.
  1754. Zuckerberg, you know, appreciate your being here once again —
  1755. but, if all we do is have a hearing and then nothing happens,
  1756. then that's not accomplishing anything. And — and I — you know,
  1757. I know I sound very critical of the Republicans and their
  1758. leadership on this — on these privacy issues. But I've just seen
  1759. it — I've just seen it over and over again — that we have the
  1760. hearings, and nothing happens. So excuse me for being so
  1761. pessimistic, Mr. Chairman, but that's where I am. I yield back.
  1762. - Thank you. I — Mr. Zuckerberg, you talk about how positive and
  1763. optimistic you are, and I'm — I guess I'm sorry, because I'm
  1764. not. I don't have much faith in corporate America, and I
  1765. certainly don't have much faith in their GOP allies here in
  1766. Congress. I really look at everything in — that this committee
  1767. does, or most of what this committee does, in terms of the right
  1768. to know. In other words, they — I always fear that people, you
  1769. know, that go on Facebook — they don't necessarily know what's
  1770. happening or what's going on with their data. And so, to the
  1771. extent that we could pass legislation, which I think we need —
  1772. and you said that we probably should have some legislation — I
  1773. want that legislation to give people the right to know, to
  1774. empower them, to — to, you know, provide more transparency, I
  1775. guess, is the best way to put it. So I'm looking at everything
  1776. through that sort of lens. So just let me ask you three quick
  1777. questions. And I'm going to ask you to answer yes or no, because
  1778. of the time. Yes or no: Is Facebook limiting the amount or type
  1779. of data Facebook itself collects or uses?
  1780. - But, see, I — I don't see that in the announcements you've
  1781. made. Like, you've made all these announcements the last few
  1782. days about the changes you're going to make. And I don't really
  1783. see how that — how those announcements or changes limit the
  1784. amount or type of data that Facebook collects or uses in an
  1785. effective way. But let me go to the second one. Again, this is
  1786. my concern — that users currently may not know or take
  1787. affirmative action to protect their own privacy. Yes or no: Is
  1788. Facebook changing any user default settings to be more privacy-
  1789. protective?
  1790. - But see, again, I don't see that in — in the changes you — that
  1791. you propose. I don't really see any way that these user default
  1792. settings — you're changing these user default settings in a way
  1793. that is going to be more privacy-protective. But let me
  1794. go to the third one. Yes or no: Will you
  1795. commit to changing all user default settings to minimize, to the
  1796. greatest extent possible, the collection and use of
  1797. users' data? Can you make that commitment?
  1798. - But I'd like you to answer yes or no, if you could. Will you
  1799. make the commitment to change all the user — to changing all the
  1800. user default settings to minimize, to the greatest extent
  1801. possible, the collection and use of users' data? That's — I
  1802. don't think that's hard for you to say yes to, unless I'm
  1803. missing something.
  1804. - Well, again, that's disappointing to me, because I think you
  1805. should make that commitment. And maybe what we could do is
  1806. follow up with you on this, if possible — if that's okay. We can
  1807. do that follow-up?
  1808. - All right. Now, you said yesterday that each of us owns the
  1809. content that we put on Facebook and that Facebook gives some
  1810. control to consumers over their content. But we know about the
  1811. problems with Cambridge Analytica.
  1812. - I know you changed your rules in 2014 and again this week, but
  1813. you still allow third parties to have access to personal data.
  1814. How can consumers have control over their data when Facebook
  1815. doesn't have control over the data itself? That's my concern.
  1816. Last question.
  1817. - I still don't ...
  1818. - Yeah, I know. I still think that there's not enough — people
  1819. aren't empowered enough to really make those decisions in a
  1820. positive way.
  1821. JOHNSON
  1822. - I got a lot of those folks in my district. You know, you're a —
  1823. you're a real American success story. There's no question that
  1824. you and Facebook have revolutionized the way Americans — in
  1825. fact, the world — communicate and interconnect with one another.
  1826. I think the reason that — one of the reasons that you were able
  1827. to do that is because nowhere other than here in America, where
  1828. a young man in college can pursue his dreams and ambitions on
  1829. his own terms without a big federal government overregulating
  1830. them and telling them what they can and cannot do, could you
  1831. have achieved something like this. But, in the absence of — of
  1832. federal regulations that would reel that in, the only way it
  1833. works for the betterment of society and people is with a high
  1834. degree of responsibility and trust. And you've acknowledged that
  1835. there have been some breakdowns in responsibility. And I think,
  1836. sometimes — and I'm a technology guy. I have two degrees in
  1837. computer science. I'm a software engineer. I'm a patent holder.
  1838. So I know the challenges that you face in terms of managing the
  1839. technology. But, oftentimes, technology folks spend so much time
  1840. thinking about what they can do, and little time thinking about
  1841. what they should do. And so I want to talk about some of those
  1842. “should do” kind of things. You heard earlier about faith-based
  1843. material that had been — that had been taken down, ads that had
  1844. been taken down. You admitted that it was a mistake. That was in
  1845. my district, by the way — Franciscan University, a faith-based
  1846. university, was the one that did that.
  1847. - How is your content filtered and determined to be appropriate,
  1848. or not appropriate, and policy-compliant? Is it an algorithm
  1849. that does it? Or is there a team of a gazillion people that sit
  1850. there and look at each and every ad, that make that
  1851. determination?
  1852. - What do — what do you do when you — when you find someone or
  1853. something that's made a mistake? I mean, I've heard you say
  1854. several times today that you know a mistake has been made. What
  1855. — what kind of accountability is there when mistakes are made?
  1856. Because, every time a mistake like that is made, it's a little
  1857. bit of a chip away from the trust and the responsibility
  1858. factors. How do you hold people accountable in Facebook, when
  1859. they make those kind of mistakes of taking stuff down that
  1860. shouldn't be taken down, or leaving stuff up that should not be
  1861. left up?
  1862. - Do you ever fire anybody when they do stuff like that?
  1863. - What happened to the — what happened to the person that took
  1864. down the Franciscan University ad and didn't put it back up
  1865. until the media started getting involved?
  1866. - Could you take that question for me? My time is expired. Can
  1867. you take that question for me and — and get me that answer back,
  1868. please?
  1869. - Okay, thank you very much. I yield back.
  1870. GUTHRIE
  1871. - But — but you're different in that instead of getting just a
  1872. broad — When I'm watching the — the Hilltoppers on basketball,
  1873. the person advertising me doesn't know anything about me. I'm
  1874. just watching the ad, so there's no data, no agreement, or no
  1875. risk, I guess, there. But with you, there — there is consumer-
  1876. driven data. But if we were to greatly reduce or stop — or just
  1877. greatly reduce, through legislation, the use of consumer-driven
  1878. data for targeting ads, what do you think that would do to the
  1879. Internet, just — and when I say Internet, I mean everything, not
  1880. just Facebook.
  1881. - So if you had less revenue, what would that do to ...
  1882. - I have 30 seconds, so I appreciate the answer to that. But if —
  1883. so — so I didn't opt out, and so forth, and all of a sudden, I
  1884. say, “You know, this just doesn't work for me, so I want to
  1885. delete — ” You told Congressman Rush that you could delete. What
  1886. happens to the data? I — I've already — it's fair. It's been
  1887. used. It's — Cambridge Analytica may have it. So what happens
  1888. when I say, “Facebook, take my data off your platform”?
  1889. - Thank you. My time's expired. I appreciate it.
  1890. REP. GREGG HARPER (R-MISS.)
  1891. - Thank you, Mr. Chairman. Thank you, Mr. Zuckerberg for being
  1892. here. And we don't lose sight of the fact that you're a great
  1893. American success story. It is a part of everyone's life and
  1894. business — sometimes, maybe too often. But I thank you for
  1895. taking the time to be here. And our concern is to make sure that
  1896. it's — it's fair. We worry because we're — we're looking at
  1897. possible government regulation here. Certainly, this self-
  1898. governing, which has had some issues and how you factor that —
  1899. and — and we — you know, we're trying to keep up with the
  1900. algorithm changes on — on how you determine the prioritization
  1901. of the news feeds. And you look at, well, it's got to be — it
  1902. needs to be trustworthy and reliable and relevant — well, who's
  1903. going to determine that? That also has an impact. And, even
  1904. though you say you don't want the bias, it does — it is
  1905. dependent upon who's setting what those standards are in that.
  1906. And so I want to ask you a couple questions, if I may. And this
  1907. is a quote from Paul Grewal, Facebook's V.P. and general counsel
  1908. — said, “Like all app developers, Mr. Aleksandr Kogan requested
  1909. and gained access to information from people after they chose to
  1910. download his app.” Now, under Facebook policy, in 2013, if
  1911. Cambridge Analytica had developed the This is Your Digital Life
  1912. app, they would have had access to the same data they purchased
  1913. from Mr. Kogan. Would that be correct?
  1914. MCNERNEY
  1915. - Well, my staff just this morning downloaded their information
  1916. and their browsing history is not in there. So are you saying
  1917. that Facebook does not have browsing history?
  1918. - So I'm — I'm — I'm not quite on board with this. Is there any
  1919. other information that Facebook has obtained about me, whether
  1920. Facebook collected it or obtained it from a third party that
  1921. would not be included in the download?
  1922. - Okay, I'm going to follow up with this afterwards. Mr.
  1923. Zuckerberg, you indicated that the European users will have GDPR
  1924. protection on May 25th, and the American users will have those
  1925. similar protections. When will the American users have those
  1926. protections?
  1927. - So it will not be on May 25th?
  1928. - Thank you. Your company and many companies with an online
  1929. presence have a staggering amount of personal information. The
  1930. customer is not really in the driver's seat about how their
  1931. information is used or monetized. The data collectors are in the
  1932. driver's seat. Today, Facebook is governed by weak federal privacy
  1933. protections. I've introduced legislation that would help address
  1934. this issue. The My Data Act would give the FTC rulemaking
  1935. authority to provide consumers with strong data privacy and
  1936. security protections. Without this kind of legislation, how can
  1937. we be sure that Facebook won't continue to be careless with
  1938. users' information?
  1939. - Correct.
  1940. - Well, I mean I hear — I hear — I hear you saying this, but the
  1941. history isn't there. So I — I think we need to make sure that
  1942. there are regulations in place to give you the proper motivation
  1943. to — to stay in line with data protection. One of the problems
  1944. here in my mind is that Facebook's history, the privacy — user
  1945. privacy and security have not been given as high priority as
  1946. corporate growth. And you've admitted as much. Is Facebook
  1947. considering changing its management structure to ensure that
  1948. privacy and security have sufficient priority to prevent these
  1949. problems in the future?
  1950. - That's a — that's a little bit off — off track from what I'm
  1951. trying to get at. The privacy protections clearly failed in a
  1952. couple of cases that are high profile right now. And part of the
  1953. blame that — that seems to be out there is that the management
  1954. structure for privacy and security doesn't have the right level
  1955. — of profile in — in Facebook to get your attention to make sure
  1956. that they get the proper resources.
  1957. PETERS
  1958. - Thank you, Mr. Chairman. Thank you, Mr. Zuckerberg, for being
  1959. with us today, and I — you know, it's been a long day. I want to
  1960. — I — I think we can all agree that technology has outpaced the
  1961. law, with respect to the protection of private information. I
  1962. wonder if you think it would be reasonable for Congress to
  1963. define the legal duty of privacy that's owed by private
  1964. companies to their customers, with respect to their personal
  1965. information.
  1966. - Right, that's what I mean by it's outpaced, and I — I wonder, I
  1967. want to take — I would also want to take you at your word, I
  1968. believe you're sincere that you personally place a high value on
  1969. consumer privacy and that — that personal commitment is
  1970. significant at Facebook today coming from you, given your
  1971. position, but I also observe, and you'd agree, that the
  1972. performance on privacy has been inconsistent. I wonder, you
  1973. know, myself whether that's because it's not a bottom line
  1974. issue. It — it — it appears that the shareholders are interested
  1975. in — in maximizing profits, privacy neither — certainly doesn't
  1976. drive profits I don't think, but also may interfere with profits
  1977. if you have to sacrifice your ad revenues because of privacy
  1978. concerns. Would it not be appropriate for — for us once we
  1979. define this — this duty to assess financial penalties in a way
  1980. that would sufficiently send a signal to the shareholders and to
  1981. your employees — who you must be frustrated with too — that the
  1982. privacy you're so concerned about is a bottom line issue at
  1983. Facebook?
  1984. - I think part of it is that, but — but part of it is also what
  1985. happened with Cambridge Analytica, some of this data got away
  1986. from us, and I'd suggest to you that if — if there were
  1987. financial consequences to that that made a difference to the
  1988. business, not people dropping their Facebook accounts, they
  1989. would get more attention. And it's not so much a — a business
  1990. model choice — I congratulate you on your business model — but
  1991. it's that these issues aren't getting the — the bottom line
  1992. attention that — that I think would have given — made them a
  1993. priority with respect to Facebook. Let me just follow up in my
  1994. final time on a — on an exchange you had with Senator Graham
  1995. yesterday about regulation and — and I — I think the Senator said,
  1996. do you as a company welcome regulation, and you said, if it's
  1997. the right regulation, then yes. Question, do you think that the
  1998. Europeans have it right? And you said, I think they get some
  1999. things right. I wanted you to elaborate on what the Europeans
  2000. got right, and what do you think they got wrong?
  2001. - Anything you think they got wrong?
  2002. - Well I would appreciate it if you could respond in writing. I
  2003. really — again, really appreciate you being here. Thank you Mr.
  2004. Chairman.
  2005. REP. BEN RAY LUJÁN (D-N.M.)
  2006. - Thank you, Mr. Chairman, and I want to pick up where Mr.
  2007. Kinzinger dropped off, here. Mr. Zuckerberg, Facebook recently
  2008. announced that a search feature allowed malicious actors to
  2009. scrape data on virtually all of Facebook's 2 billion users. Yes
  2010. or no: In 2013, Brandon Copley, the CEO of Giftnix, demonstrated
  2011. that this feature could easily be used to gather information at
  2012. scale. Well, the answer to that question is yes. Yes or no: This
  2013. issue of scraping data was again raised in 2015 by a
  2014. cybersecurity researcher, correct?
  2015. KINZINGER
  2016. - What about, like — what about Russian intel agencies?
  2017. - Do you know — is this data only from accounts located in or
  2018. operated from these individual countries? Or does it include
  2019. Facebook's global data?
  2020. - Yeah. Is the data only from the accounts located in or operated
  2021. from those countries, in terms of Russia or anything? Or does it
  2022. include Facebook's global data?
  2023. - But where is it stored? Where is the data — do they have access
  2024. to data only stored in ...
  2025. - Okay, so it's the global data.
  2026. - So let me just ask — you mentioned a few times that we're in an
  2027. arms race with Russia, but is it one-sided if Facebook, as an
  2028. American-based company, has given the opposition everything it
  2029. needs in terms of, you know, where it's storing its data?
  2030. - So you mentioned a few times that we're in an arms race with
  2031. Russia.
  2032. - If you're giving Russian intelligence service agencies,
  2033. potentially, even on a valid request, access to global data
  2034. that's not in Russia, is that kind of a disadvantage to us and
  2035. an advantage to them?
  2036. - Sure. Yeah, please.
  2037. - That would be great. Now, I've got another unique one I want to
  2038. bring up. So I was just today — and I'm not saying this as a
  2039. “Woe is me,” but I think this happens to a lot of people — there
  2040. have been — my pictures have been stolen and used in fake
  2041. accounts all around, and, in many cases, people have been
  2042. extorted for money. We report it when we can, but we're in a
  2043. tail chase. In fact, today, I just Googled — or I just put on
  2044. your website, “Andrew Kinzinger,” and he looks a lot like me,
  2045. but it says he's from London and lives in L.A. and went to Locke
  2046. High School, which isn't anything like me at all. These accounts
  2047. pop up a lot, and, again, it's using my pictures, but extorting
  2048. people for money. And we hear about it from people that call and
  2049. say, “Hey, I was duped,” or whatever. Can I — I know you can't
  2050. control everything. I mean, it's — you have a huge platform, and
  2051. — but can you talk about, maybe, some movements into the future
  2052. to try to prevent that, in terms of maybe recognizing somebody's
  2053. picture and if it's fake?
  2054. - Okay. Thank you.
  2055. DOYLE
  2056. - ... reported by The Guardian?
  2057. MCKINLEY
  2058. - That's — that's a yes — yes or no. Do you think you should be
  2059. able to do —
  2060. - And — there — there are 35,000 online pharmacies operating, and
  2061. according to the FDA, they think 96 percent of them may be
  2062. operating illegally. And in November of last year, CNBC had an
  2063. article saying that you were surprised by the breadth of this
  2064. opioid crisis. And as you can see from these photographs,
  2065. opioids are still available on your site — that they're available
  2066. without a prescription on your site. So that contradicts what you
  2067. just said, just a minute ago. And — and when, last week, FDA
  2068. Commissioner Scott Gottlieb testified before our office, he
  2069. said that the Internet firms simply aren't taking practical
  2070. steps to find and remove these illegal opioid listings. And he
  2071. specifically mentioned Facebook. Are you aware of that, his
  2072. quote?
  2073. - Answer yes or no ...
  2074. - If I could — no, we don't — so, in your opening statement — and
  2075. I appreciated your remark — you said, “It's not enough to give
  2076. people a voice. We have to make sure that people aren't using
  2077. it” — Facebook — “to hurt people.” Now, America's in the midst
  2078. of one of the worst epidemics that it's ever experienced, with
  2079. this — with this drug epidemic. And it's all across this
  2080. country; it's not just in West Virginia. But your platform is
  2081. still being used to circumvent the law and allow people to buy
  2082. highly addictive drugs without a prescription. With all due
  2083. respect, Facebook is actually enabling an illegal activity, and
  2084. in so doing, you are hurting people. Would you agree with that
  2085. statement?
  2086. - You can — you can find out, Mr. Zuckerberg. You know which
  2087. pharmacies are operating legally and illegally. But you're still
  2088. continuing to take that — allow that to be posted on — on
  2089. Facebook and allow people to get this — this scourge that's
  2090. ravaging this country — is being enabled because of Facebook. So
  2091. my question to you, as we close, on this — you've said before
  2092. you were going to take down those ads, but you didn't do it.
  2093. We've got statement after statement about things — you're going
  2094. to take those down within days, and they haven't gone down.
  2095. That, what I just put up, was just from yesterday. It's still
  2096. up. So my question to you is, when are you going to stop — take
  2097. down these posts that are done — on — with illegal digital
  2098. pharmacies? When are you going to take them down?
  2099. - Why do they have to — if you got all these 20,000 people — you
  2100. know that they're up there. Where is your require — where is
  2101. your accountability to allow this to be occurring — this —
  2102. ravaging this country?
  2103. - If — you have — you said before you were going to take them
  2104. down, and you haven't. And they're still up.
  2105. BILIRAKIS
  2106. - Now?
  2107. - By the end of the day?
  2108. - Well, you have knowledge now, obviously. You have knowledge —
  2109. you have knowledge of those ads. Will you begin to take them out
  2110. — down today?
  2111. - They clearly do. I — if they're illegal, they clearly violate
  2112. your laws.
  2113. - I agree.
  2114. - Work on those tools as soon as possible, please. Okay. Next
  2115. question. A constituent of mine in District 12 of Florida, the
  2116. Tampa Bay area, came to me recently with what was clear — a
  2117. clear violation of your privacy policy. In this case, a third-
  2118. party organization publicly posted personal information about my
  2119. constituent on his Facebook page. This included his home
  2120. address, voting record, degrading photos and other information.
  2121. In my opinion, this is cyberbullying. For weeks, my constituent
  2122. tried reaching out to Facebook on multiple occasions through its
  2123. report feature, but the offending content remained. It was only
  2124. when my office got involved that the posts were removed almost
  2125. immediately for violating Facebook policy.
  2126. - How does Facebook's self-reporting policy work to prevent
  2127. misuse? And why did it take an act of Congress — a member of
  2128. Congress to get, again, a clear privacy violation removed from
  2129. Facebook? If you can answer that question, I'd appreciate it,
  2130. please.
  2131. - Absolutely.
  2132. - It has to be consistent.
  2133. - Can you give us a timeline as to when this will be done? I
  2134. mean, this is very critical for — I mean, listen, my family uses
  2135. Facebook, my friends, my constituents. We all use Facebook. I
  2136. use Facebook. It's wonderful ...
  2137. - ... for us seniors to connect with our relatives.
  2138. - Yeah, I'm sorry. Can I submit for the record my additional
  2139. questions?
  2140. - Thank you. Thank you so much ...
  2141. CLARKE
  2142. - So, were they — whether they were Russian or not, when you have
  2143. propaganda, how are you addressing that? Because this was
  2144. extremely harmful during the last election cycle and it — and
  2145. can continue to be so in the — in the upcoming elections and
  2146. throughout the year, right? I'm concerned that there are not
  2147. eyes that are culturally competent looking at these things and
  2148. being able to see how this would impact on civil society. If
  2149. everyone within the organization is monolithic, then you can
  2150. miss these things very easily. And we've talked about diversity
  2151. forever, with your organization. What can you say today, when
  2152. you look at how all of this operates, that you can do
  2153. immediately to make sure that we have the types of viewing or
  2154. reviewing that could enable us to catch this in its tracks?
  2155. - Good. We — we'd like you to get back to us with a timeline on
  2156. that. This is ...
  2157. - Okay. Fabulous. When Mr. Kogan sold the Facebook-based data
  2158. that he acquired through the quiz app to Cambridge Analytica,
  2159. did he violate Facebook's policies at the time?
  2160. - When the Obama campaign collected millions of Facebook users'
  2161. data through their own app during the 2012 election, did it
  2162. violate Facebook's policies at the time?
  2163. - I hope you understand that this distinction provides little
  2164. comfort to those of us concerned about our privacy online.
  2165. Regardless of political party, Americans desperately need to be
  2166. protected. Democrats on this committee ...
  2167. - ... have been calling for strong privacy and data security
  2168. legislation for years. We really can't wait. Mr. Chairman, I
  2169. yield back. Thank you, Mr. Zuckerberg.
  2170. MATSUI
  2171. - But, once it gets to the data broker, though — so there are
  2172. certain algorithms and certain assumptions made. What happens
  2173. after that?
  2174. - Well, what I mean is — is that, if you supplement this data —
  2175. you know, you say you're owning it, but you supplement this —
  2176. when other data brokers, you know, use their own algorithms to
  2177. supplement this and make their own assumptions, then what
  2178. happens there? Because that is — to me, somebody else is taking
  2179. that over. How can you say that we own that data?
  2180. - So — but you can't claw it back once it gets out there, right?
  2181. I mean, that's really — we might own our own data, but, once
  2182. it's used in advertising, we lose control over it. Is that not
  2183. right?
  2184. - Yeah. I understand that.
  2185. - But Facebook sells ads based at least in part on data users
  2186. provide to Facebook. That's right. And the more data that
  2187. Facebook collects — allows you to better target ads to users or
  2188. classes of users. So, even if Facebook doesn't earn money from
  2189. selling data, doesn't Facebook earn money from advertising based
  2190. on that data?
  2191. - But we're not controlling that data.
  2192. BUCSHON
  2193. - Okay, because, I mean — like I said, I mean, you've talked to
  2194. people that this has happened to. My son who lives in Chicago
  2195. was — him and his colleagues were talking about a certain type
  2196. of suit, because they're business guys, and, the next day, he
  2197. had a bunch of ads for different suits on — on that, when he
  2198. went onto the Internet. So it's pretty obvious to me that
  2199. someone is — is listening to the audio on — on our phones, and
  2200. that — I see that as a pretty big issue. And the reason is — is
  2201. because — and you may not be, but I see this as a pretty big
  2202. issue for — because, for example, if you're in your doctor's
  2203. office, if you're in your corporate boardroom, your office or
  2204. even personal areas of your home, that's potentially an issue.
  2205. And I'm glad to hear that Facebook isn't listening, but — but
  2206. I'm skeptical that someone isn't. And I — I see this as an
  2207. industry-wide issue that you could potentially help address. And
  2208. the final thing I'll just ask is that, when you have, say, an
  2209. executive session or whatever, your corporate board, and you
  2210. have decisions to be made, do you allow the people in the room
  2211. to have their phones on them?
  2212. - Okay. Because, if — if that's the case, then — I mean, I know,
  2213. for convenience, companies have developed things like Alexa, and
  2214. I don't want to — and other companies are developing things like
  2215. that. But it just seems to me that the whole — part of the whole
  2216. point of those products is not just for your own convenience,
  2217. but, when you're verbally talking about things and then you're
  2218. not on the Internet, they're able to collect information on the
  2219. type of activities that — that you're engaging in. So I'd — I'd
  2220. implore the industry to — to look into that and make sure that,
  2221. in addition to physical — exploring the Internet and collecting
  2222. data, that data being ...
  2223. - ... taken verbally not be allowed. Thank you.
  2224. REP. SUSAN BROOKS (R-IND.)
  2225. - Thank you, Mr. Chairman, and thank you, Mr. Zuckerberg, for
  2226. being here today. It's so critically important that we hear from
  2227. you and your company because we do believe that it is critically
  2228. important for you to be a leader in these solutions. One thing
  2229. that has been talked about just very little, but I think is
  2230. very important and I want to make sure there is appropriate
  2231. attention on how the platform of Facebook but even other
  2232. platforms — and you've mentioned it a little bit — how you help
  2233. us in this country keep our country safe from terrorists. And so
  2234. it's a — I talked with lots of people who actually continue to
  2235. remain very concerned about recruitment of their younger family
  2236. members, and now we're seeing, around the globe, an enhanced
  2237. recruitment of women as well to join terrorist organizations.
  2238. And so I'm very, very concerned. I'm a former U.S. attorney. And
  2239. so when 9/11 happened, you didn't exist. Facebook did not exist,
  2240. but since the evolution, after 9/11, we know that al-Shabab, al-
  2241. Qaeda, ISIS, have used social media like we could not even
  2242. imagine. So can you please talk about — and then you talked
  2243. about the fact that if there is content that is objectionable or
  2244. is a danger that people report it to you, but what if they
  2245. don't? What if everybody assumes that someone is reporting
  2246. something to you? So I need you to help assure us as well as the
  2247. American people, what is Facebook's role, leadership role, in
  2248. helping us fight terrorism and help us stop the recruitment,
  2249. because it is still a grave danger around the world?
  2250. LOEBSACK
  2251. - ... I think trust — that has been the issue today. There's no
  2252. question about it. I think that's what — what I'm hearing from
  2253. my constituents. That's what we're hearing from our colleagues.
  2254. That's really the question: How can we be guaranteed that, for
  2255. example, when you agree to some things today, that you're going
  2256. to follow through, and that we're going to be able to hold you
  2257. accountable? And — and without, perhaps, constructing too many
  2258. rules and regulations — we'd like to keep that to a minimum if
  2259. we possibly can. But I do understand that you have agreed that
  2260. we're going to have to have some rules and regulations so that
  2261. we can protect people's privacy, so that we can protect that use
  2262. of the consumer data. So, going forward from there, I've just
  2263. got a — a few questions I'll probably have an opportunity to get
  2264. to. The first one goes to the business model issue, because
  2265. you're publicly traded. Is that correct?
  2266. - And you're the CEO.
  2267. - Right. And so I've got Lauren from Solon who asks, “Is it
  2268. possible for Facebook to exist without collecting and selling
  2269. our data?” Is it possible to exist?
  2270. - Is it — is it possible for you to be in business without
  2271. sharing the data? Because that's what you have done, whether it
  2272. was selling or not — sharing the data, providing it to Cambridge
  2273. Analytica and other folks along the way. Is it possible for your
  2274. business to exist without doing that?
  2275. - Okay, thank you. I — I appreciate that. And then Brenda from
  2276. Muscatine — she has a question, obviously, related to trust, as
  2277. well, and that is, how will changes promised this time be proven
  2278. to be completed? She'd like to know. How's that going to happen?
  2279. If there are changes — you said there have been some changes —
  2280. how can she and those folks in our districts, and throughout
  2281. America — not just members of Congress, but how can folks in our
  2282. districts hold you accountable? How do they know that those
  2283. changes are, in fact, going to happen? That's what that
  2284. question's about.
  2285. - Thank you. And, finally, Chad from Scott County wants to know,
  2286. “Who has my data, other than Cambridge Analytica?”
  2287. - ... thank you, Mr. Chair.
  2288. CARDENAS
  2289. - Okay. It — just so you know, this was just brought to my
  2290. attention — my staff texted me a little while ago that the CEO of
  2291. Cambridge Analytica apparently stepped down sometime today. I don't
  2292. know if anybody on your team there whispered that to you, but my
  2293. staff just reported that. That's interesting. The fact that the
  2294. CEO of Cambridge Analytica stepped down — does that in and of
  2295. itself solve the issue and the controversy around what they did?
  2296. - But some of that information did originate with Facebook,
  2297. correct?
  2298. - Something was brought to my attention most recently that
  2299. apparently Facebook does, in fact, actually buy
  2300. information to add or augment the information that you have on
  2301. some of your users, to build, around them, their profile.
  2302. - But you did do that to build your company, in the past?
  2303. - But you did engage in that, as well — not just everybody else,
  2304. but Facebook yourselves — you did engage in that?
  2305. - Okay. It's my understanding that, when The Guardian decided to
  2306. report on the Cambridge Analytica consumer data issue, Facebook
  2307. threatened to sue them if they went forward with their — their
  2308. story. It appears — did something like that happen? Facebook
  2309. kind of warned them, like, “Hey, maybe you don't want to do
  2310. that”?
  2311. - So, in other words, you checking The Guardian and saying,
  2312. “You're not going to want to go out with that story because it's
  2313. not 100 percent factual” — that's ...
  2314. - Okay. Now — but, however, they did go through with their story,
  2315. regardless of the warnings or the threats of Facebook saying
  2316. that “You don't — not going to want to do that.” When they did —
  2317. did do that — and only then did Facebook actually apologize for
  2318. that incident, for that 89 million users' information,
  2319. unfortunately, ending up in their hands. Isn't that the case?
  2320. - Okay.
  2321. - Thank you.
  2322. - Okay. But I only have a few more seconds. My — my main point is
  2323. this: I think it's time that you, Facebook — if you want to
  2324. truly be a leader in every sense of the word and recognize
  2325. that you can, in fact, do right by American users of Facebook
  2326. and when it comes to information, unfortunately, getting in the
  2327. wrong hands — you can be a leader. Are you committed to actually
  2328. being a leader in that sense?
  2329. - Can you give a two second answer?
  2330. - Thank you very much. Thank you, Mr. Chairman.
  2331. REP. JOHN SHIMKUS (R-ILL.)
  2332. - Thank you, Mr. Chairman. Thank you for being here, Mr.
  2333. Zuckerberg. Two things: First of all, I want to thank Facebook.
  2334. You streamed our Congressional Baseball Game last year. We've
  2335. got the managers here, and I was told that, because of that, we
  2336. raised an additional $100,000 for D.C. literacy and feeding kids
  2337. and stuff. So that's a — the other thing is, I — I usually put
  2338. my stuff up on the TV. I don't want to do it very much, because
  2339. my dad — and he'd be mad if he went international, like you are
  2340. — and he's been on Facebook for a long time. He's 88. It's been
  2341. good for connecting with kids and grandkids. I just got my
  2342. mother involved on an iPad and — because she can't handle a
  2343. keyboard. And so — and I did this last week. So the — in this
  2344. world — activity — I still think there is a positive benefit for
  2345. my parents to be engaged on this platform. So — but there's
  2346. issues, as being raised today. And so I'm going to go into a
  2347. couple of those. Facebook made — developed access to user and
  2348. friend data back in — your main update was in 2014. So the
  2349. question is, what triggered that update?
  2350. CASTOR
  2351. - Yes or no?
  2352. - No, you're collecting — you have already acknowledged that you
  2353. are doing that for security purposes, and commercial purposes.
  2354. So you are — you're collecting data outside of Facebook. When
  2355. someone goes to a website, and it has the Facebook like or
  2356. share, that data is being collected by Facebook, correct?
  2357. - Yes or no.
  2358. - Yeah, so for people who don't even have Facebook — I don't
  2359. think that the average American really understands that today,
  2360. something that fundamental, and that you're tracking everyone's
  2361. online activities. Their searches, you can track what people
  2362. buy, correct?
  2363. - You're collecting that data, what people purchase online, yes
  2364. or no?
  2365. - Because it has a share button, so it's — it's — it's gathering.
  2366. Facebook has the application. In fact, you've patented
  2367. applications to do just that, isn't that correct? To collect
  2368. that data?
  2369. - But they — they track you. You want — you're collecting medical
  2370. data, correct, on — on people that — that are on the Internet,
  2371. whether they're Facebook users or not, right?
  2372. - And you're collecting — you watch where we go. Senator Durbin
  2373. had a — had a funny question yesterday about where you're
  2374. staying, and you didn't want to share that, but you — Facebook
  2375. also gathers that data about where we travel, isn't that
  2376. correct?
  2377. - I'm going to get to that, but yes, you are — would you just
  2378. acknowledge if yes, Facebook is — that's the business you're in,
  2379. gathering data and aggregating that data, right?
  2380. - You're not — are you saying you do not gather data on — on
  2381. where people travel, based upon their Internet, and the — the
  2382. ways they sign in, and things like that?
  2383. - Primary, but the — the other way that Facebook gathers data is
  2384. you buy data from data brokers, outside of the platform,
  2385. correct?
  2386. - But I think in the end, I think what — see, it's — it's
  2387. practically impossible these days to remain untracked in
  2388. America. For all the benefits Facebook has brought, and — and
  2389. the Internet, that's not part of the bargain. And current
  2390. laws have not evolved, and the Congress has not adopted laws to
  2391. — to address digital surveillance, and Congress should act. And
  2392. I do not believe that the controls — the opaque agreements,
  2393. consent agreements and settings — are an adequate substitute for
  2394. fundamental privacy protections for consumers. Now some ...
  2395. - Thank you. I yield back my time.
  2396. - Let that stand. And I'd like to ask unanimous consent that I
  2397. put my constituents' questions in the record.
  2398. - Thank you.
  2399. REP. BILL JOHNSON (R-OHIO)
  2400. - Thank you, Mr. Chairman. Mr. Zuckerberg, thanks for joining us
  2401. today. Let me add my list — my name to the list of folks that
  2402. you're going to get back to on the rural broadband Internet
  2403. access question. Please add my name to that list.
  2404. ESHOO
  2405. - So these are a series of just yes-no questions. Do you think
  2406. you have a moral responsibility to run a platform that protects
  2407. our democracy? Yes or no.
  2408. - Have users of Facebook who are caught up in the Cambridge
  2409. Analytica debacle been notified?
  2410. - Will Facebook offer to all of its users a blanket opt-in to
  2411. share their privacy data with any third-party users?
  2412. - Well, let — let me just add that it is a minefield in order to
  2413. do that. And you have to make it transparent, clear, in
  2414. pedestrian language, just once, “This is what we will do with
  2415. your data. Do you want this to happen, or not?” So I — I think
  2416. that this is being blurred. I — I think you know what I mean by
  2417. it. Are you aware of other third-party information mishandlings
  2418. that have not been disclosed?
  2419. - So you're not sure?
  2420. - What does that mean?
  2421. - So you're not aware.
  2422. - All right. I — I only have four minutes.
  2423. - Was your data included in the data sold to the malicious third
  2424. parties? Your personal data?
  2425. - It was. Are you willing to change your business model in the
  2426. interest of protecting individual privacy?
  2427. - No, are you willing to change your business model in the
  2428. interest of protecting individual privacy?
  2429. - Well, I'll follow up with you on it. When did Facebook learn
  2430. that Cambridge Analytica's research project was actually for
  2431. targeted psychographic political campaign work?
  2432. - Well, no. I — I don't have time for a long answer, though. When
  2433. did Facebook learn that? And, when you learned it, did you
  2434. contact their CEO immediately? And, if not, why not?
  2435. - We know what happened with them. But I'm asking you.
  2436. - Yes. All right.
  2437. - So, in 2015, you learned about it?
  2438. - And you spoke to their CEO immediately?
  2439. - Did you speak to their CEO immediately?
  2440. - Thank you.
  2441. REP. RAUL RUIZ (D-CALIF.)
  2442. - Thank you, Mr. Chairman, and thank you, Mr. Zuckerberg, for
  2443. appearing before the committee today. The fact is, Mr.
  2444. Zuckerberg, Facebook failed its customers. You said as much
  2445. yourself. You've apologized and we appreciate that. We as
  2446. Congress have a responsibility to figure out what went wrong
  2447. here and what could be done differently to better protect
  2448. consumers' private digital data in the future. So my first
  2449. question for you, Mr. Zuckerberg, is why did Facebook not notify
  2450. the FTC in 2015 when you first discovered this had happened, and
  2451. was it the legal opinion of your current company that you are
  2452. under no obligation to notify the FTC, even with the 2011
  2453. consent order in place?
  2454. REP. H. MORGAN GRIFFITH (R-VA.)
  2455. - Thank you very much, Mr. Chairman. I appreciate — appreciate
  2456. you being here. Let me state up front that I share the privacy
  2457. concerns that you've heard from a lot of us, and I appreciate
  2458. your statements and willingness to, you know, help us figure out
  2459. a solution that's good for the American people. So I appreciate
  2460. that. Secondly, I have to say that it's my understanding that,
  2461. yesterday, Senator Shelley Moore Capito, my friend in my
  2462. neighboring state of West Virginia, asked you about Facebook's
  2463. plans with rural broadband, and you agreed to share that
  2464. information with her at some point in time, get her up to date
  2465. and up to speed. I was excited to hear that you were excited
  2466. about that and passionate about it. My district is very similar
  2467. to West Virginia, as it borders it and we have a lot of rural
  2468. areas. Can you also agree, yes or no, to update me on that when
  2469. the information is available?
  2470. REP. JOHN SARBANES (D-MD.)
  2471. - Thank you, Mr. Chairman. Good morning, Mr. Zuckerberg. I wanted
  2472. to get something in the record quickly, before I move to some
  2473. questions. You had suggested in your testimony over the last
  2474. couple of days that Facebook notified the Trump and Clinton
  2475. campaigns of Russian attempts to hack into those campaigns. But
  2476. representatives of both campaigns, in the last 24 hours, have
  2477. said that didn't happen. So we're going to follow up on that and
  2478. find out what the real story is.
  2479. BROOKS
  2480. - Can I ask though — and I appreciate, and I heard you say 99
  2481. percent — and yet I didn't go out and, you know, look for this,
  2482. but yet, as recently as March 29th ISIS content was discovered
  2483. on Facebook, which included an execution video, March 29th. On
  2484. April 9th there were five pages located, on April 9th, of
  2485. Hezbollah content, and so forth. And so, what is the mechanism
  2486. that you're using? Is it artificial intelligence? Is it the
  2487. 20,000 people? What are you using to — because it's not — I
  2488. appreciate that no system is perfect, but yet this is just
  2489. within a week.
  2490. - How large is it?
  2491. - And so you might have those people looking for the content. How
  2492. are they helping block the recruiting?
  2493. - Is it still — your platform as well as Twitter and then
  2494. WhatsApp is how they then begin to communicate, which I
  2495. understand you own. Is that correct?
  2496. - So how are we stopping the recruiting and the communications?
  2497. - Thank you. My time is up. I thank you and please continue to
  2498. work with us and all the governments who are trying to fight
  2499. terrorism around the world.
  2500. GRIFFITH
  2501. - I appreciate that. And we've got a lot of drone activity going
  2502. on in our district, whether it's University of Virginia in Wise,
  2503. or Virginia Tech. So we'd be happy to help out there, too. Let
  2504. me — let me switch gears. You talked about trying to ferret out
  2505. misinformation. And the question becomes, who decides what is
  2506. misinformation? So, when the — some of my political opponents
  2507. put on Facebook that, you know, they think Morgan Griffith is a
  2508. bum, I think that's misinformation. What say you? (LAUGHTER)
  2509. - And I appreciate that. My time is running out. I do want to
  2510. point this out, though, as part of that: You know, who is going
  2511. to decide what is misinformation? We've heard about the Catholic
  2512. University and the cross. We've heard about a candidate. We've
  2513. heard about the conservative ladies; a firearms shop, lawful, in
  2514. my district had a similar problem. It has also been corrected.
  2515. And so I wonder if the industry has thought about — not only are
  2516. we looking at it, but has the industry thought about doing
  2517. something like Underwriters Laboratories, which was set up when
  2518. electricity was new to determine whether or not the devices were
  2519. safe? Have you all thought about doing something like that, so
  2520. it's not Facebook alone, but the industry, saying, “Wait a
  2521. minute, this is probably misinformation,” and setting up
  2522. guidelines that everybody can agree are fair?
  2523. - I yield back.
  2524. SARBANES
  2525. - No, I'd like — I'd like to move on. You can provide a response
  2526. to that in writing, if you would. Let me ask you, is it true
  2527. that Facebook offered to provide what I guess you referred to as
  2528. “dedicated campaign embeds” to both of the presidential
  2529. campaigns?
  2530. - Just say yes or no, were there embeds ...
  2531. - ... I need to get to that because I don't have time. Were there
  2532. embeds in the two campaigns, or offers of embeds?
  2533. - Yes or no.
  2534. - Were there embeds offered to the Trump campaign and the Clinton
  2535. campaign?
  2536. - Okay. So sales support — I'm going to refer to that as embeds.
  2537. And I gather that Mr. Trump's campaign ultimately accepted that
  2538. offer. Is that correct? Yes or no.
  2539. - Okay. So they had embeds.
  2540. - I'm going to refer to those as embeds. What I'd like you to do,
  2541. if you could — we're not going to have time for you to do this
  2542. now — but, if you could provide to the committee both the
  2543. initial offer terms, and then any subsequent offer terms that
  2544. were presented to each candidate, in terms of what the embed
  2545. services would be, that would be very helpful. Do you know how
  2546. many ads were approved for display on Facebook for each of the
  2547. presidential candidates — by Facebook?
  2548. - Okay. Let me tell you what they were, because I do. President
  2549. Trump's campaign had an estimated 5.9 million ads approved, and
  2550. Secretary Clinton, 66,000 ads. So that's a delta of about 90
  2551. times as much on the Trump campaign, which raises some questions
  2552. about whether the ad approval processes were maybe not processed
  2553. correctly or inappropriately bypassed in the final months and
  2554. weeks of the election by the Trump campaign. And what I'm
  2555. worried about is that the embeds may have helped to facilitate
  2556. that. Can you say with absolute certainty that Facebook or any
  2557. of the Facebook employees working as campaign embeds did not
  2558. grant any special approval rights to the Trump campaign to allow
  2559. them to upload a very large number of Facebook ads in that final
  2560. stretch?
  2561. - Can you say that there were not special approval rights
  2562. granted? Is that what you're saying — there were not special
  2563. approval rights granted by any of the embeds — or support folks,
  2564. as you call them — in that Trump campaign?
  2565. - Yes or no.
  2566. - Okay. All right. If you're saying yes ...
  2567. - ... if you're saying yes, then I'll take you at your word. The
  2568. reason this is important and the reason we need to get to the
  2569. bottom of it is because it could be a serious problem if these
  2570. kinds of services were provided beyond what is offered in the
  2571. normal course, because that could result in violation of
  2572. campaign finance law, because it would be construed as an in-
  2573. kind contribution — corporate contribution from Facebook, beyond
  2574. what — the sort of ad-buy opportunity you would typically
  2575. provide. The reason I'm asking you these questions is because
  2576. I'm worried that that embed program has the potential to become
  2577. a tool for Facebook to solicit — solicit favor from
  2578. policymakers, and that, then, creates the potential for real
  2579. conflict of interest. And I think a lot of Americans are waking
  2580. up to the fact that Facebook is becoming sort of a self-
  2581. regulated superstructure for political discourse. And the
  2582. question is, are we, the people, going to regulate our political
  2583. dialogue? Or are you, Mark Zuckerberg, going to end up
  2584. regulating the political discourse?
  2585. - So we need to be free of that undue influence. I thank you for
  2586. being here ...
  2587. - ... and I yield back my time.
  2588. SCALISE
  2589. - That's a public service announcement we just made, so
  2590. appreciate you ... (LAUGHTER) ... joining me in that. And Mr.
  2591. Shimkus's question — it was really a follow-up to a question
  2592. yesterday that — that you weren't able to answer, but it was
  2593. dealing with how Facebook tracks users, especially after they
  2594. log off. And you had said, in relation to Congressman Shimkus's
  2595. question, that there is data mining, but it goes on for security
  2596. purposes. So my question would be, is that data that is mined
  2597. for security purposes also used to sell as part of the business
  2598. model?
  2599. - All right. If you could follow up, I would appreciate that.
  2600. Getting into this — this new realm of content review, I know
  2601. some of the people that work for Facebook — Campbell Brown said,
  2602. for example, “This is changing our relationship with publishers
  2603. and emphasizing something that Facebook has never done before:
  2604. It's having a point of view.” And you mentioned the Diamond and
  2605. Silk example, where there — you — you, I think, described it as
  2606. a mistake. Were the people who made that mistake held
  2607. accountable in any way?
  2608. - Okay.
  2609. - I do want to ask you about a study that was done dealing with
  2610. the algorithm that Facebook uses to determine what is fed to
  2611. people through the news feed. And what they found was, after
  2612. this new algorithm was implemented, that there was a tremendous
  2613. bias against conservative news and content, and a favorable bias
  2614. toward liberal content. And, if you can look at that, that shows
  2615. a 16-point disparity, which is concerning. I would imagine
  2616. you're not going to want to share the algorithm itself with us.
  2617. I'd encourage you if you wanted to do that. But who develops the
  2618. algorithm? I wrote algorithms before, and you can determine
  2619. whether or not you want to write an algorithm to sort data, to
  2620. compartmentalize data; but you can also put a bias in, if that's
  2621. the directive. Was there a directive to put a bias in? And,
  2622. first, are you aware of this bias that many people have looked
  2623. at and analyzed and seen?
  2624. - And I know we're — we're almost out of time. So, if you can go
  2625. back and look and determine if there was a bias — whoever
  2626. developed that software — you have 20,000 people that work on
  2627. some of this data analysis — if you can look and see if there is
  2628. a bias and let us know if there is and what you're doing about
  2629. it, because that is disturbing, when you see that kind of
  2630. disparity. Finally, there has been a lot of talk about Cambridge
  2631. and what they've done and the last campaign. In 2008 and 2012,
  2632. there was also a lot of this done. One of the lead digital heads
  2633. of the Obama campaign said recently, “Facebook was surprised we
  2634. were able to suck out the whole social graph, but they didn't
  2635. stop us once they realized that was what we were doing. They
  2636. came to office in the days following the election recruiting and
  2637. were very candid that they allowed us to do things they wouldn't
  2638. have allowed someone else to do, because they were on our side.”
  2639. That's a direct quote from one of the heads of the Obama digital
  2640. team. What — what would she mean by they — Facebook — were on
  2641. our side?
  2642. - So she was making an inaccurate statement, in your point of
  2643. view?
  2644. - ... the comments and look forward to those answers. Yield back
  2645. the balance of my time.
  2646. CARTER
  2647. - Thank you, Mr. Chairman. Thank you, Mr. Zuckerberg for being
  2648. here. You're almost done. When you get to me, that means you're
  2649. getting close to the end. So congratulations. Thank you for
  2650. being here. We do appreciate it. You know, you wouldn't be here
  2651. if it wasn't for the — the privacy — people's information and
  2652. the privacy, and — and the fact that we had — you had this lapse.
  2653. You know all about fake news, you know all about foreign
  2654. intervention. I know you're concerned about that. I want to talk
  2655. about just a — a few different subjects, if you will. And I'd
  2656. like to ask you just some yes or no questions; please excuse my
  2657. redundancy. I know that some members have already asked you
  2658. about some of these subjects, but I would like to ask you. Mr.
  2659. Zuckerberg, did you know that 91 people die every day because of
  2660. opioid addiction? Yes or no, did you know that? Ninety-one
  2661. people every day.
  2662. - Did you know that there's — it's estimated to be between two
  2663. and a half and 11 and a half million people in this country right
  2664. now who are addicted to opioids?
  2665. - Okay, did you know that the average age of Americans has
  2666. decreased for the first time in decades as a result of — what
  2667. people are saying is a result of the opioid epidemic?
  2668. - Absolutely. I ask you this because some of the other members
  2669. have mentioned that — about the ads for fentanyl and other
  2670. illicit drugs that are on the Internet, and the — where you can
  2671. buy them, and about your responsibility to — to monitor that and
  2672. make sure that's not happening. I had the opportunity this past
  2673. week to speak at the Prescription Drug Abuse and Heroin Summit
  2674. in Atlanta that Representative Hal Rogers started some years
  2675. ago. Also we had the FDA Commissioner there, and he mentioned
  2676. the fact that he's going to be meeting with CEOs of Internet
  2677. companies to discuss this problem. I hope that you will be
  2678. willing to at least have someone there to meet with him so that
  2679. we can get your help in this; this is extremely important.
  2680. - Okay, let me ask you another question. Mr. Zuckerberg, did you
  2681. know that there are groups of conservations — there are
  2682. conservation groups that have provided evidence to the
  2683. Securities and Exchange Commission that endangered wildlife
  2684. goods, in particular ivory is extensively traded on closed
  2685. groups on Facebook?
  2686. - Okay, let me — all right, well let me ask you, did you know
  2687. that there are some conservation groups that assert that there's
  2688. so much ivory being sold on Facebook that it's literally
  2689. contributing to the extent — to the extinction of the elephant
  2690. species?
  2691. - Okay, and — and did you know that the American — or excuse me,
  2692. the Motion Picture Association of America is having problems
  2693. with piracy of movies and of their products, and that not only
  2694. is this challenging their profits, but their very existence. Did
  2695. you know that that was a problem?
  2696. - It has been. It has been, so you did know that. Well, the reason
  2697. I ask you this is that I just want to make sure that I
  2698. understand you have an understanding and a commitment. Look, I —
  2699. you said earlier, may have been yesterday that hate speech is
  2700. difficult to discern. And I get that, I understand that and
  2701. you're absolutely right. But these things are not and we need
  2702. your help with this. Now I will tell you there are members of
  2703. this body who would like to see the Internet monitored as a
  2704. utility. I am not one of those; I believe that that would be the
  2705. worst thing we could do. I believe it would stifle innovation. I
  2706. don't think you can legislate morality, and I don't want to try
  2707. and do that. But we need a commitment from you that these things
  2708. that can be controlled like this, that you will help us. And
  2709. that you'll work with law enforcement to — to help us with this.
  2710. Look, you love America, I know that, we all know that. We need
  2711. your help here. We don't — I don't want Congress to have to act.
  2712. You — you want to see a mess, you let the federal government get
  2713. into this. You'll see a mess, I assure you.
  2714. - Please, we — we need your help with this. And I just need that
  2715. commitment, can I get that commitment?
  2716. - Thank you very much.
  2717. HARPER
  2718. - Okay. Now according to PolitiFact.com, and this is a quote,
  2719. “The Obama campaign and Cambridge Analytica both gained access
  2720. to huge amounts of information about Facebook users and their
  2721. friends, and in neither case did the friends of app users
  2722. consent,” close quote. This data that Cambridge Analytica
  2723. acquired was used to target voters with political messages, much
  2724. as the same type of data was used by the Obama campaign to
  2725. target voters in 2012. Would that be correct?
  2726. - Sure.
  2727. - And — and, whether in violation of the agreement or not, you —
  2728. you agree that users have an expectation that their information
  2729. would be protected and remain private, and not be sold. And so
  2730. that's something — the — the reason that we're here today. You
  2731. know, and I can certainly understand the general public's
  2732. outrage if they're concerned regarding the way Cambridge
  2733. Analytica acquired their information. But, if people are
  2734. outraged because they used that for political reasons, would that
  2735. be hypocritical? Shouldn't they be equally outraged that the
  2736. Obama campaign used the — the data of Facebook users without
  2737. their consent in 2012?
  2738. - Thank you.
  2739. - Thank you, Mr. Zuckerberg. My time is expired — yield back.
  2740. REP. JOE BARTON (R-TEX.)
  2741. - Well, thank you. And thank you, Mr. Zuckerberg for being here.
  2742. People need to know that you're here voluntarily. You're not
  2743. here because you've been subpoenaed. So we appreciate that.
  2744. Sitting behind you, you have a gentleman who used to be counsel
  2745. the committee, Mr. Jim Barnett. And, if he's affiliated with
  2746. Facebook, you've got a good one. If he's not, he's just got a
  2747. great seat. I don't know ... (LAUGHTER) ... know what it is. I'm
  2748. going to read you a question that I was asked. I got this
  2749. through Facebook, and I've got dozens like this. So, my first
  2750. question: “Please ask Mr. Zuckerberg, why is Facebook censoring
  2751. conservative bloggers such as Diamond and Silk? Facebook called
  2752. them unsafe to the community. That is ludicrous. They hold
  2753. conservative views. That isn't unsafe.” What's your response to
  2754. ...
  2755. REP. G.K. BUTTERFIELD (D-N.C.)
  2756. - Thank you, Mr. Chairman, and thank you, Mr. Zuckerberg, for
  2757. your testimony here today. Mr. Zuckerberg, you have stated that
  2758. your goal with Facebook is to build strong communities. And,
  2759. certainly, that sounds good. You've stated here today, on the
  2760. record, that you did not live up to the privacy expectations.
  2761. And I appreciate that. But this committee — and you must know
  2762. this — this committee is counting on you to right a wrong. And I
  2763. hope you get it. In my opinion, Facebook is here to stay, and so
  2764. you have an obligation to protect the data that you collect and
  2765. the data that you use. And Congress has the power to regulate
  2766. your industry, and we have the power to penalize misconduct. But
  2767. I want to go in a different direction today, sir. You and your
  2768. team certainly know how I feel about racial diversity in
  2769. corporate America. And Sheryl Sandberg and I talk about that all
  2770. of the time. Let me ask you this — and — and the Congressional
  2771. Black Caucus has been very focused on — on holding your industry
  2772. accountable — not just Facebook, your industry — accountable for
  2773. increasing African American inclusion at all levels of the
  2774. industry. And I know you have a number of diversity
  2775. initiatives. In 2017, you increased your black representation
  2776. from 2 percent to 3 percent. While this is a small increase,
  2777. it's better than none. And this does not nearly meet the
  2778. definition of building a racially diverse community. CEO
  2779. leadership — and I have found this to be absolutely true — CEO
  2780. leadership on issues of diversity is the only way that the
  2781. technology industry will change. So will you commit, sir, to
  2782. convene — personally convene a meeting of CEOs in — in your
  2783. sectors, many of them — all of them, perhaps — are your
  2784. friends — and to do this very quickly to develop a strategy to
  2785. increase racial diversity in the technology industry?
  2786. MCMORRIS RODGERS
  2787. - Okay. And, even focusing on content for here in America, I'd
  2788. like to shift gears just a little bit and talk about Facebook's
  2789. recent changes to its news feed algorithm. Your head of news
  2790. partnerships recently said that Facebook is, quote, “taking a
  2791. step to define what quality news looks like and give that a
  2792. boost so that, overall, there is less
  2793. competition from news.” Can you tell me what she means by “less
  2794. competition from news”? And also, how does Facebook objectively
  2795. determine what is acceptable news and what safeguards exist to
  2796. ensure that, say, religious or conservative content is treated
  2797. fairly?
  2798. - Well, maybe I'll just go on to my other questions, then.
  2799. There's an issue of content discrimination, and it's not a
  2800. problem unique to Facebook. There's a number of high-profile
  2801. examples of edge providers engaging in blocking and censoring
  2802. religious and conservative political content. In November, FCC
  2803. Chairman Pai even said that edge providers routinely block or
  2804. discriminate against content they don't like. This is obviously
  2805. a serious allegation. How would you respond to such an
  2806. allegation? And what is Facebook doing to ensure that its users
  2807. are being treated fairly and objectively by content reviewers?
  2808. - Over Easter, a Catholic university's ad with a picture of a
  2809. historic San Damiano cross was rejected by Facebook. Though
  2810. Facebook addressed the error within days, that it happened at
  2811. all is deeply disturbing. Could you tell me what was so
  2812. shocking, sensational or excessively violent about the ad to
  2813. cause it to be initially censored? Given that your company has
  2814. since said that it did not violate terms of service, how can
  2815. users know that their content is being viewed and judged
  2816. according to objective standards?
  2817. - Thank you. And I — I just — this is an important issue in
  2818. building trust.
  2819. - And that is going to be important as we move forward. Thank
  2820. you, and I yield back.
  2821. REP. ELIOT L. ENGEL (D-N.Y.)
  2822. - Thank you, Mr. Chairman. Mr. Zuckerberg, you have roots in my
  2823. district, the 16th congressional district of New York. I know
  2824. that you attended Ardsley High School and — and grew up in
  2825. Westchester County. As you know, Westchester has a lot to offer,
  2826. and I hope that you might commit to returning to Westchester
  2827. County, perhaps to do a forum on — on this and some other
  2828. things. I hope you would consider that. We'll — we'll be in
  2829. touch with you. But I know that Ardsley High School's
  2830. very proud of you. You mentioned yesterday that Facebook was
  2831. deceived by Aleksandr Kogan when he sold user information to
  2832. Cambridge Analytica. Does Facebook, therefore, plan to sue
  2833. Aleksandr Kogan, Cambridge University or Cambridge Analytica,
  2834. perhaps, for unauthorized access to computer networks, exceeding
  2835. access to computer networks or breach of contract? And why or
  2836. why not?
  2837. BUTTERFIELD
  2838. - Well, we've talked with you over the years about this. And,
  2839. while there has been some marginal improvement, we — we must do
  2840. better than we have done. Recently, you appointed an African-
  2841. American — our friend, Ken Chenault — to your board. And, of
  2842. course, Erskine Bowles is already on your board, who is also a
  2843. friend. But — but we've — we've got to concentrate more on board
  2844. membership for African Americans, and also minorities at the
  2845. entry level within your company. I was looking at your
  2846. website a few minutes ago, and it looks like you list five
  2847. individuals as leadership in your company, but none of them is
  2848. African American. I was just looking at it — not only you and
  2849. Sheryl, but David (sic), Mike and Chris — that is your
  2850. leadership team. And this does not reflect America. Can you
  2851. improve the numbers on your leadership team to be more diverse?
  2852. - Not on your website.
  2853. - We can do better than that, Mr. Zuckerberg. We certainly can.
  2854. Do you plan to add an African-American to your leadership team
  2855. in the foreseeable future? And will you commit that you will
  2856. continue to work with us, the Congressional Black Caucus, to
  2857. increase diversity within your company that you're so proud of?
  2858. - We also find that companies' failure to retain black employees
  2859. contributes to their low presence at technology companies. And
  2860. there is little transparency in retention numbers. So will you
  2861. commit to providing numbers on your retention — that's the big
  2862. word — retention of your employees, disaggregated by race, in
  2863. your diversity update, starting this year? Can we get that data?
  2864. That — that's — that's the starting point.
  2865. - I'm out of time, sir. I'll take this up with your team in
  2866. another setting.
  2867. - We'll be out there in a few weeks. Thank you. I yield back.
  2868. GREEN
  2869. - Okay. And you commit today that Facebook will extend the same
  2870. protections to Americans that European users will
  2871. receive under the GDPR?
  2872. - There are many requirements in the GDPR, so I'm just going to
  2873. focus on a few of them. The GDPR requires that a company's
  2874. request for user consent be presented in a clear and
  2875. concise way, using language that is understandable, and be
  2876. clearly distinguishable from other pieces of information,
  2877. including terms and conditions. How will that requirement be
  2878. implemented in the United States?
  2879. - One of the GDPR's requirements is data portability. Users must
  2880. be permitted to request a full copy of their
  2881. information and be able to share that information with any
  2882. companies that they want to. I know Facebook allows users in the
  2883. U.S. to download their Facebook data. Does Facebook plan to use
  2884. the currently existing ability of users to download their
  2885. Facebook data as the means to comply with the GDPR's data
  2886. portability requirement?
  2887. - Does that download file include all the information Facebook
  2888. has collected about any given individual? In other words, if I
  2889. download my Facebook information, is there other information
  2890. accessible to you within Facebook that I wouldn't see on that
  2891. document, such as browsing history or other inferences that
  2892. Facebook has drawn from users for advertising purposes?
  2893. - GDPR also gives users the right to object to the processing of
  2894. their personal data for marketing purposes, which, according to
  2895. Facebook's website, includes custom micro-target audiences for
  2896. advertising. Will the same right to object be
  2897. available to Facebook users in the United States? And how will
  2898. that be implemented?
  2899. - Okay. Thank you, Mr. Chairman. And again —
  2900. Facebook conducted, a couple of years ago, an effort in our
  2901. district in Houston for our small businesses. And it was one of
  2902. the most successful outreach efforts I've seen. So I appreciate that
  2903. outreach to helping small businesses use Facebook to market
  2904. their products. Thank you, Mr. Chairman.
  2905. BARTON
  2906. - Well, Facebook does tremendous good. When — when I met you in
  2907. my office, eight years ago — you don't remember that. But I've
  2908. got a picture of you when you had curly hair and Facebook had
  2909. 500 million users. Now, it's got over 2 billion. That's a
  2910. success story in — in anybody's book. It's such an integral part
  2911. of, certainly, young Americans' lives that you need to work with
  2912. Congress and the community to ensure that it is a neutral, safe
  2913. and, to the largest extent possible, private platform. Do you
  2914. agree with that?
  2915. - Okay. Let's talk about children. Children can get a Facebook
  2916. account of their own, I believe, starting at age 13. Is that not
  2917. correct?
  2918. - Okay. Is there any reason that we couldn't have just a no-data-
  2919. sharing policy, period, until you're 18? Just — if you're a
  2920. child with your own Facebook account, until you reach the age of
  2921. 18, you know, it's — it's — you know, you can't share anything.
  2922. It's — it's their data, their picture — it doesn't — it doesn't
  2923. go anywhere. Nobody gets to scrape it; nobody gets to access it.
  2924. It's absolutely, totally private. Well, it's — for children.
  2925. What's wrong with that?
  2926. - Will we let them opt in to do that?
  2927. - But don't — you know, unless they specifically allow it, then
  2928. don't allow it. That's my point.
  2929. - I'm — I'm about out of time. I — I actually use Facebook, and,
  2930. you know, I know, if you take the time, you can go to your
  2931. privacy and click on that. You can go to your settings and click
  2932. on that. You can pretty well set up your Facebook account to —
  2933. to be almost totally private. But you have to really work at it.
  2934. And my time's expired. Hopefully we can do some questions in
  2935. writing as a follow-up. Thanks, Mr. Chairman.
  2936. REP. GENE GREEN (D-TEX.)
  2937. - Thank you, Mr. Chairman, and welcome to our committee. I want
  2938. to follow up on what my — my friend from North Texas talked
  2939. about on — on his cartoon. Next month, the General Data
  2940. Protection Regulation — the GDPR — goes into effect in the
  2941. European Union. The GDPR is pretty
  2942. prescriptive on how companies treat consumer data. And it makes
  2943. it clear that consumers need to be in control of their own data.
  2944. Mr. Zuckerberg, Facebook has committed to abiding by these
  2945. consumer protections in Europe, and faces large penalties if
  2946. it doesn't. In recent days, you've said that Facebook intends to
  2947. make the same settings available to users everywhere, not only
  2948. in Europe. Did I understand correctly that Facebook would not
  2949. only make the same settings available, but that it will make the
  2950. same protections available to Americans that it will make
  2951. available to the Europeans?
  2952. REP. BOBBY L. RUSH (D-ILL.)
  2953. - Thank you, Mr. Chairman. Mr. Zuckerberg, welcome. In the 1960s,
  2954. our government, acting through the FBI and local police,
  2955. maliciously tricked individuals and organizations into
  2956. participating in something called COINTELPRO, which was a
  2957. counterintelligence program where they tracked and shared
  2958. information amongst civil rights activists, their political,
  2959. social, city, even religious affiliations. And I personally was
  2960. a victim of COINTELPRO. Your organization, your methodology, in
  2961. my opinion, is similar. You're truncating the basic rights of
  2962. the American promise of life, liberty and the pursuit of
  2963. happiness by the wholesale invasion and manipulation of their
  2964. right to privacy. Mr. Zuckerberg, what is the difference between
  2965. Facebook's methodology and the methodology of the American
  2966. political pariah, J. Edgar Hoover?
  2967. REP. FRANK PALLONE JR. (D-N.J.)
  2968. - Thank you, Mr. Chairman. And I also want to thank you Mr.
  2969. Zuckerberg for being here today. Facebook has become integral to
  2970. our lives. We don't just share pictures of our families, we use
  2971. it to connect for school, to organize events and to watch
  2972. baseball games. Facebook has enabled everyday people to spur
  2973. national political movements. Most of us in Congress use
  2974. Facebook to reach our constituents in ways that were
  2975. unimaginable 10 years ago, and this is certainly a good thing.
  2976. But it also means that many of us can't give it up easily. Many
  2977. businesses have their only web presence on Facebook, and, for
  2978. professions like journalism, people's jobs depend on posting on
  2979. the site. And this ubiquity comes with a price; for all the good
  2980. it brings, Facebook can be a weapon for those, like Russia and
  2981. Cambridge Analytica, that seek to harm us and hack our
  2982. democracy. Facebook made it too easy for a single person — in
  2983. this instance, Aleksandr Kogan — to get extensive personal
  2984. information about 87 million people. He sold this data to
  2985. Cambridge Analytical [sic], who used it to try to sway the 2016
  2986. presidential election for the Trump campaign. And Facebook made
  2987. itself a powerful tool for things like voter suppression, in
  2988. part by opening its platform to app developers with little or no
  2989. oversight. But it gets worse. The fact is no one knows how many
  2990. people have access to the Cambridge Analytical [sic] data, and
  2991. no one knows how many other Cambridge Analyticas are still out
  2992. there. Shutting down access to data to third parties isn't
  2993. enough, in my opinion. Facebook and many other companies are
  2994. doing the same thing: They're using people's personal
  2995. information to do highly targeted product and political
  2996. advertising. And Facebook is just the latest in a never-ending
  2997. string of companies that vacuum up our data, but fail to keep it
  2998. safe. And this incident demonstrates yet again that our laws are
  2999. not working. Making matters worse, Republicans here in Congress
  3000. continue to block or even repeal the few privacy protections we
  3001. have. In this era of nonstop data breaches, last year,
  3002. Republicans eliminated existing privacy and data security
  3003. protections at the FCC.
  3004. DEGETTE
  3005. - At the end of 2017, Facebook had a total shareholder equity of
  3006. over $74 billion, correct?
  3007. - That's correct. You're the CEO, do you know ...
  3008. - Greater than $74 billion. Last year, Facebook earned a profit
  3009. of $15.9 billion on $40.7 billion in revenue, correct? Yes or
  3010. no.
  3011. - Now, since the revelations surrounding Cambridge Analytica,
  3012. Facebook has not noticed a significant increase in users
  3013. deactivating their accounts. Is that correct?
  3014. - Now, since the revelations surrounding Cambridge Analytica,
  3015. Facebook has also not noticed a decrease in user interaction on
  3016. Facebook. Correct?
  3017. - Okay. Now, I want to take a minute to talk about some of the
  3018. civil and regulatory penalties that we've been seeing. I'm aware
  3019. of two class-action lawsuits that Facebook has settled relating
  3020. to privacy concerns: Lane v. Facebook was settled in 2010. That
  3021. case resulted in no money being awarded to Facebook users. Is
  3022. that correct?
  3023. - Do you — you're — you're the CEO of the company, correct?
  3024. - Now, there — this — this major lawsuit was settled. Do you know
  3025. — do you know about the lawsuit?
  3026. - Do you know about this lawsuit, Lane v. Facebook? Yes or no?
  3027. - Okay. If you can supplement — I'll just tell you, there was
  3028. this lawsuit, and the users got nothing. In another case, Fraley
  3029. v. Facebook, it resulted in a 2013 settlement fund of $20
  3030. million being established, with $15 individual payouts
  3031. to Facebook users, beginning in 2016. Is that correct?
  3032. - You don't know about that one either.
  3033. - Okay. Well, I'll tell you it happened.
  3034. - Okay. Now, as the result of a 2011 FTC investigation into
  3035. Facebook's privacy policy — do you know about that one?
  3036. - Yes.
  3037. - Okay. You entered into a consent decree with the FTC which
  3038. carried no financial penalty for Facebook. Is that correct?
  3039. - You're the CEO of the company, you entered into a consent
  3040. decree, and you don't remember if you had a financial penalty?
  3041. - Yes. I would think a financial penalty would be, too. Okay,
  3042. well, the reason you probably don't remember is because the FTC
  3043. doesn't have the authority to issue financial penalties for
  3044. first-time violations. The reason I'm asking these questions,
  3045. sir, is because we continue to have these abuses and these — and
  3046. these data breaches, but, at the same time, it doesn't seem like
  3047. future activities are prevented. And so I think one of the
  3048. things that we need to look at in the future, as we work with
  3049. you and others in the industry, is putting really robust
  3050. penalties in place in case of — of improper actions. And that's
  3051. why I ask these questions.
  3052. REP. JOSEPH KENNEDY III (D-MASS.)
  3053. - Thank you, Mr. Chairman. Mr. Zuckerberg, thank you for being
  3054. here. Thank you for your patience and — over both days of
  3055. testimony. You spoke about the framing of your testimony about
  3056. privacy, security, and democracy. I want to ask you about
  3057. privacy and democracy, because I think, obviously, those are
  3058. linked. You have said over the course of questioning yesterday
  3059. and today that users own all of their data. So I want to make
  3060. sure that we drill down on that a little bit, but I think our
  3061. colleagues have tried. That includes, I believe,
  3062. the information that Facebook requires users
  3063. to make public — so that would be a profile picture, gender, age
  3064. range — all of which is public-facing information. That's right?
  3065. COSTELLO
  3066. - Thank you, Mr. Chairman. I would echo Congressman Collins's
  3067. comments as well. Mr. Zuckerberg, I think that we as Americans
  3068. have a concept of digital privacy rights and privacy that aren't
  3069. necessarily codified. And we're trying to sift through how we
  3070. actually make privacy rights in a way that is intelligible for
  3071. tech and understandable to the community at large. And so my
  3072. questions are oriented in that fashion. First, if you look at
  3073. GDPR — the E.U. law that's about to take effect — what
  3074. pieces of that do you feel would be properly placed in American
  3075. jurisprudence? In other words, right to erasure, right to get
  3076. our data back, right to rectify, could you share with us how you
  3077. see that playing out, not just for you, but for the smaller
  3078. companies? Because I do believe you have a sincere interest in
  3079. seeing small tech companies prosper.
  3080. - Do you feel you should be able to deploy AI for facial
  3081. recognition for a non-FB user?
  3082. - Right.
  3083. - Two — two quick ones. Is Facebook, in utilizing that
  3084. platform, ever a publisher in your mind?
  3085. - You would say you're responsible for content, right? You said
  3086. that yesterday. Are you ever a publisher, as the term is legally
  3087. used?
  3088. - Would you ever be legally responsible for the content that is
  3089. put onto your platform?
  3090. - Right.
  3091. - Agreed.
  3092. - Which is what I think Chairman Walden's question was upfront.
  3093. Right.
  3094. - My big concern — I'm going to run out of time — is that
  3095. someone limits their data to not being used for something that
  3096. it might potentially be used for, with no idea
  3097. how it might actually provide social benefit. And I'm out of time,
  3098. but I would like for you to share, at a later point in time, how
  3099. the data that you get might be limited by a user, and how your
  3100. inability to use that data may actually prevent the kind of
  3101. innovation that would bring about positive social change in this
  3102. country. Because I do believe that was the intention and
  3103. objective to — of your company. And I do believe you perform it
  3104. very, very, very well in a lot of ways. Thank you. I yield back.
  3105. REP. KATHY CASTOR (D-FLA.)
  3106. - Thank you, Mr. Chairman. Welcome, Mr. Zuckerberg. For all of
  3107. the benefits that Facebook has provided in building communities
  3108. and connecting families, I think a devil's bargain has been
  3109. struck. And, in the end, Americans do not like to be
  3110. manipulated. They do not like to be spied on. We don't like it
  3111. when someone is outside of our home, watching. We don't like it
  3112. when someone is following us around the neighborhood or, even
  3113. worse, following our kids or stalking our children. Facebook now
  3114. has evolved to a place where you are tracking everyone. You are
  3115. collecting data on just about everybody. Yes, we understand that
  3116. Facebook users who proactively sign in are part
  3117. of that platform, but you're following Facebook users even
  3118. after they log off of that platform and application, and you are
  3119. collecting personal information on people who do not even have
  3120. Facebook accounts. Isn't that right?
  3121. SHIMKUS
  3122. - So, if I can interrupt, it's just — you identified that there
  3123. was possibly social scraping going on?
  3124. - Yeah. Let me go to your announcement of audits. Who's going to
  3125. conduct the audit? We're talking about — are there other
  3126. Cambridge Analytics [sic] out there?
  3127. - Yeah, I think we would hope that you would bring in a third
  3128. party to help us ...
  3129. - ... clarify and have more confidence. The last question I have
  3130. is, in yesterday's hearing, you talked a — a little about
  3131. Facebook tracking in different scenarios, including logged-off
  3132. users. Can you please clarify how that works? And how does
  3133. tracking work across different devices?
  3134. REP. LEONARD LANCE (R-N.J.)
  3135. - Thank you very much, Mr. Chairman. Mr. Zuckerberg, you are here
  3136. today because you are the face of Facebook, and you have come
  3137. here voluntarily. And our questions are based upon our concern
  3138. about what has occurred and how to move forward. I'm sure you
  3139. have concluded, based upon what we've asked, that we are deeply
  3140. offended by inappropriate censoring of content by Facebook.
  3141. Examples have been raised: a Roman Catholic university, a
  3142. state senate candidate in Michigan. I would be offended if this
  3143. censoring were occurring on the left, as well as the right, and
  3144. I want you to know that. And do you take from what we have
  3145. indicated so far that, in a bipartisan fashion, Congress is
  3146. offended by inappropriate censoring of content?
  3147. REP. STEVE SCALISE (R-LA.)
  3148. - Thank you, Mr. Chairman. And, Mr. Zuckerberg, I appreciate you
  3149. coming here. I know, as some of my colleagues mentioned, you
  3150. came here voluntarily, and we appreciate the opportunity to have
  3151. this discussion, because, clearly, what your company's been able
  3152. to do has revolutionized the way that people can connect. And
  3153. there's a tremendous benefit to our country. Now it's a
  3154. worldwide platform, and it's — it's helped create a shortage of
  3155. computer programmers. So, as a former computer programmer, I
  3156. think we would both agree that we need to encourage more people
  3157. to go into the computer sciences, because our country is a world
  3158. leader, thanks to your company and so many others. But it
  3159. obviously raises questions about privacy and data and how the
  3160. data is shared and what is a user's expectation of where that
  3161. data goes. So I want to ask a few questions. First, would you
  3162. agree that we need more computer programmers and people to go
  3163. into that field?
  3164. SCHRADER
  3165. - But I'm talking about the direction you've given your forensic
  3166. team. Now, if they find stuff, they are not to delete it at this
  3167. point in time? Or are they going to go ahead and delete it?
  3168. - Right.
  3169. - I'm worried about the — the information being deleted without
  3170. law enforcement having an opportunity to actually review that.
  3171. Will you commit to this committee that neither Facebook nor its
  3172. agents have removed any information or evidence from Cambridge
  3173. Analytica's offices?
  3174. - How about Mr. Kogan's office, if I may ask?
  3175. - Yes, where I'm — with all due respect, what I'm getting at is
  3176. I'd like to have the information available for the U.K. or U.S.
  3177. law enforcement officials, and I did not hear you commit to
  3178. that. Will you commit to the committee that Facebook has not
  3179. destroyed any data or records that may be relevant to any
  3180. federal, state or international law enforcement investigation?
  3181. - You suspended your audit, pending the U.K.'s investigation?
  3182. - So it's my understanding that you and other Facebook executives
  3183. have the ability to rescind or delete messages that are on
  3184. people's websites. To be clear, I just want to make sure that,
  3185. if that is indeed the case — that, after you've deleted that
  3186. information — that, somehow, law enforcement — particularly
  3187. relevant to this case — would still have access to those
  3188. messages.
  3189. - Great. Well, I appreciate that. While you've testified very
  3190. clearly that you do not sell information — it's not Facebook's
  3191. model; you do the advertising and obviously have other means of
  3192. revenue — but it's pretty clear others do sell that information.
  3193. Doesn't that make you somewhat complicit in what they're doing,
  3194. your allowing them to sell the information that they glean from
  3195. your website?
  3196. - How do you — how do you enforce that? That's my concern. How do
  3197. you enforce that? Complaint only is what I've heard so far
  3198. tonight.
  3199. - So last question is it's my understanding, based on the
  3200. testimony here today, that, even after I'm off of Facebook —
  3201. that you guys still have the ability to follow my web
  3202. interactions. Is that correct?
  3203. - I've logged out of Facebook. Do you still have the ability to
  3204. follow my interactions on the web?
  3205. RUIZ
  3206. - Did you think that ...
  3207. - ... the rules were kind of lax, that you were sort of debating
  3208. whether you needed to or something?
  3209. - Okay.
  3210. - Well — well — well, you answered my question. Would you agree
  3211. that for Facebook to continue to be successful, it needs to
  3212. continue to have the trust of its users?
  3213. - Great. So does this not, perhaps, strike you as a weakness with
  3214. the current system; that you are not required to notify the FTC
  3215. of a potential violation of your own consent decree with them,
  3216. and that you did not have clear guidelines for what you as a
  3217. company needed to do in this situation to maintain the public's
  3218. trust, and act in their best interest?
  3219. - I'm just trying to think of another CEO who might not have
  3220. such a broad view, and might interpret the different legal
  3221. requirements, maybe, differently. So that's why I'm asking these
  3222. questions. I'm — I'm — I'm also taking a broad view as a
  3223. Congressman here, to try to fix this problem. So from what we've
  3224. learned over the past two days of hearings, it just doesn't seem
  3225. like the FTC has the necessary tools to do what needs to be done
  3226. to protect consumer data and consumer privacy, and we can't
  3227. exclusively rely on companies to self-regulate in the best
  3228. interest of consumers. So Mr. Zuckerberg, would — would it be
  3229. helpful if there was an entity clearly tasked with overseeing
  3230. how consumer data is being collected, shared and used, and which
  3231. could offer guidelines, at least guidelines for companies like
  3232. yours to ensure your business practices are not in violation of
  3233. the law, something like a digital consumer protection agency?
  3234. - Well, one of the things that we're realizing is that there are a
  3235. lot of holes in the system; that, you know, we don't have
  3236. the toolbox — you don't have the toolbox — to monitor 9 million
  3237. apps and tens of thousands of data collectors, and there's
  3238. no specific mechanism for you to collaborate with those that can
  3239. help you prevent these things from happening. And so I think
  3240. that perhaps we should start having these
  3241. discussions about what would have been helpful for you to build
  3242. your toolbox, and for us to build our toolbox, so that we can
  3243. prevent things like Cambridge Analytica, things like identity
  3244. theft, things like what we're seeing — what
  3245. we've heard about today. So, you know, I just want to
  3246. thank you for your thoughts and testimony. So it's clear to me
  3247. that this is the beginning of many, many conversations on the
  3248. topic, and I look forward to working with you and the committee
  3249. to — to better protect consumer privacy.
  3250. - Thank you.
  3251. DOYLE
  3252. - Thank you. And — and you use these technologies to flag spam,
  3253. identify offensive content and track user activity, right?
  3254. - But, in 2015, when The Guardian first reported on Cambridge
  3255. Analytica using Facebook user data — was that the first time
  3256. Facebook learned about these allegations?
  3257. - Was that the first time you heard about it, when it was ...
  3258. - When The Guardian made the report, was that the first time you
  3259. heard about it?
  3260. - Thank you. So you learned about these
  3261. violations through the press?
  3262. - Let me ask you this. You have the capability to audit
  3263. developers' use of Facebook user data and — and do more to
  3264. prevent these abuses. But the problem at Facebook not only
  3265. persisted; it proliferated. In fact, relative to other
  3266. types of problems you had on your platform, it — it seems as
  3267. though you turned a blind eye to this. Correct?
  3268. - But, Mr. Zuckerberg ...
  3269. - ... it seems to us that — that — it seems like you were more
  3270. concerned with attracting and retaining developers on your
  3271. platform than you were with ensuring the security of Facebook
  3272. user data. Let me switch gears. Your company has been subject to a
  3273. 20-year consent decree with the FTC since 2011. Correct?
  3274. - And that decree emerged out of a number of practices that
  3275. Facebook engaged in that the FTC deemed to be unfair and
  3276. deceptive. Among those practices were making Facebook users' private
  3277. information public without sufficient notice or consent;
  3278. claiming that Facebook certified the security and integrity of
  3279. certain apps when, in fact, it did not; and enabling developers
  3280. to access excessive information about a user and their friends.
  3281. Is that correct?
  3282. - But these were part of the — the consent decree.
  3283. - So I think — I'm — I'm just concerned that, despite this
  3284. consent decree, Facebook allowed developers access to an unknown
  3285. number of user profiles on Facebook for years — potentially
  3286. hundreds of millions, potentially more — and not only allowed,
  3287. but partnered with individuals and app developers such as
  3288. Aleksandr Kogan, who turned around and sold that data on the
  3289. open market and to companies like Cambridge Analytica. Mr.
  3290. Zuckerberg, you've said that you plan to audit tens of thousands
  3291. of developers that may have improperly harvested Facebook user
  3292. data. You also said that you planned to give all Facebook users
  3293. access to some user controls that will be made available in the
  3294. E.U. under the GDPR. But it strikes me that there's a real trust
  3295. gap here. This developer data issue is just one example. But why
  3296. should we trust you to follow through on these promises when you
  3297. have demonstrated repeatedly that you're willing to flout both
  3298. your own internal policies and government oversight when the
  3299. need suits you?
  3300. - I see my time is almost over.
  3301. - I just want to say, Mr. Chairman ...
  3302. - ... that, to my mind, the only way we're going to close this
  3303. trust gap is through legislation that creates and empowers a
  3304. sufficiently resourced expert oversight agency with rulemaking
  3305. authority to protect digital privacy and ensure ...
  3306. - ... that companies protect our users' data. With that, I yield
  3307. back.
  3308. REP. ADAM KINZINGER (R-ILL.)
  3309. - Thank you, Chairman. And, Mr. Zuckerberg, thank you for being
  3310. here. Given the global reach of Facebook, I'd like to know about
  3311. the company's policies and practices with respect to information
  3312. sharing with foreign governments, if you don't mind. What
  3313. personal data does Facebook make available from Facebook,
  3314. Instagram, WhatsApp to Russian state agencies, including intel
  3315. and security agencies?
  3316. REP. LARRY BUCSHON (R-IND.)
  3317. - Thank you, Mr. Chairman. Thank you, Mr. Zuckerberg, for being
  3318. here. There are plenty of anecdotal examples, including from
  3319. family members of mine, where people will be verbally discussing
  3320. items, never having actively been on the Internet at the time,
  3321. and then, the next time they get on Facebook or other online
  3322. apps, ads for things that they were verbally discussing with
  3323. each other will show up. And I know you said in the Senate that
  3324. Facebook doesn't listen — specifically listen to what people are
  3325. saying through their — through their phone, whether that's a
  3326. Google phone or whether it's Apple or another one. However, the
  3327. other day, my mother-in-law and I were discussing her brother,
  3328. who had been deceased for about 10 years, and, later on that
  3329. evening, on — on her Facebook site, she had, set to
  3330. music, kind of an in memoriam picture collage that came up on
  3331. Facebook, specifically to her brother. And that happened the
  3332. other night. So, if you don't — if you're not listening to us on
  3333. the phone, who is? And do you have specific contracts with —
  3334. with these companies that will provide data that is being
  3335. acquired verbally through our — through our phones or, now,
  3336. through things like Alexa or other — other products?
  3337. REP. BILL FLORES (R-TEX.)
  3338. - Thank you Mr. Chairman. Mr. Zuckerberg, thank you for being
  3339. here today. I'm up here, top row. I'm certain there are other
  3340. things you'd rather be doing. The activities of Facebook and
  3341. other technology companies should not surprise us. I mean, we've
  3342. seen it before — and again, don't take this critically. But we
  3343. saw a large oil company become a monopoly back in the late
  3344. 1800s, early 1900s. We saw a large telecommunications company
  3345. become a near-monopoly in the '60s, '70s and '80s. And, just as
  3346. Facebook was, these companies were founded by bright
  3347. entrepreneurs. Their companies grew. And, eventually, they
  3348. sometimes became detached from everyday Americans. And what
  3349. happened is policymakers then had to step in and reestablish the
  3350. balance between those — those folks and everyday Americans. You
  3351. didn't intend for this to happen. It did happen, and I
  3352. appreciate that you've apologized for it. And one of the things
  3353. I appreciate about Facebook — it appears you're proactively
  3354. trying to address the situation. Just as we addressed those
  3355. monopolies in the past, we're faced with a similar
  3356. situation today. We need to — and this — this goes beyond
  3357. Facebook. This has to do with the edge providers. It has to do
  3358. with social media organizations and also with ISPs. Back to — to
  3359. Facebook in particular, though, we heard examples yesterday,
  3360. during the Senate hearing, and also today, during this hearing,
  3361. so far, about ideological bias among the users of Facebook. In
  3362. my Texas district, I have a retired schoolteacher whose
  3363. conservative postings were banned or stopped. The good news is I
  3364. was able to work with Facebook's personnel and get her
  3365. reinstated. That said, the Facebook censors still seem to be
  3366. trying to stop her postings. And I — anything you can do in that
  3367. regard to fix that bias will go a long way. I want to move in a
  3368. different direction; that is, to talk about the future. Congress
  3369. needs to consider policy responses, as I said earlier. And I
  3370. want to call this policy response Privacy 2.0 and Fairness 2.0.
  3371. With respect to fairness, I think the technology companies
  3372. should be ideologically agnostic regarding their users' public-
  3373. facing activities. The only exception would be for potentially
  3374. violent behavior. I'll ask — my — my question is, on this, do
  3375. you agree that Facebook and other technology platforms should be
  3376. ideologically neutral?
  3377. TONKO
  3378. - Well, 3 billion user accounts were breached at Yahoo in 2013,
  3379. 145 million at eBay in 2014, 143 million at Equifax in 2017, 78
  3380. million at Anthem in 2015, 76 million at JPMorgan Chase in 2014
  3381. — the list goes on and on. The security of all that private data
  3382. is gone, likely sold many times over to the highest bidder on
  3383. the dark web. We live in an information age. Data breaches and
  3384. privacy hacks are not a question of if. They are a question of
  3385. when. But the case with Facebook is slightly different. The 87
  3386. million accounts extracted by Cambridge Analytica are just the
  3387. beginning, with, likely, dozens of other third parties that have
  3388. accessed this information. As far as we know, the dam is still
  3389. broken. As you have noted, Mr. Zuckerberg, Facebook's business
  3390. model is based on capitalizing on the private personal
  3391. information of your users. Data security should be a central
  3392. pillar of this model. And, with your latest vast breach of
  3393. privacy and the widespread political manipulation that followed
  3394. it, the question that this committee must ask itself is what
  3395. role the federal government should play in protecting the
  3396. American people and the democratic institutions that your
  3397. platform, and others like it, have put at risk. In this case you
  3398. gave permission to mine the data of some 87 million users, based
  3399. on the deceptive consent of just a fraction of that
  3400. number. When they found out I was going to be speaking with you
  3401. today, my constituents asked me to share some of their concerns
  3402. in person. How can they protect themselves on your platform? Why
  3403. should they trust you again with their likes, their loves, their
  3404. lives? Users trusted Facebook to prioritize user privacy and
  3405. data security, and that trust has been shattered. I'm encouraged
  3406. that Facebook is committed to making changes, but I am indeed
  3407. wary that you are only acting now out of concern for your brand
  3408. and only making changes that should have been made a long time
  3409. ago. We have described this as an arms race, but, every time we
  3410. see what precautions you have or, in most cases, have not taken,
  3411. your company is caught unprepared and ready to issue another
  3412. apology. I'm left wondering why Congress should trust you
  3413. again. We'll be watching you closely to ensure that Facebook
  3414. follows through on these commitments. Many of my constituents
  3415. have asked about your business model, where users are the
  3416. product. Mary of Half Moon, in my district, called it
  3417. infuriating. Andy of Schenectady, New York, asked, “Why doesn't
  3418. Facebook pay its users for their incredibly valuable data?”
  3419. Facebook claims that users rightly own and control their data,
  3420. yet their data keeps being exposed on your platform, and these
  3421. breaches cause more and more harm each time. You have said that
  3422. Facebook was built to empower its users. Instead, users are
  3423. having their information abused with absolutely no recourse. In
  3424. light of this harm, what liability should Facebook have? When
  3425. users' data is mishandled, who is responsible and what recourse
  3426. do users have? Do you bear that liability?
  3427. - Do you bear the liability?
  3428. - Mr. Chairman, if I might ask that other questions that my
  3429. constituents have be answered by unanimous consent.
  3430. REP. PAUL TONKO (D-N.Y.)
  3431. - Thank you. Mr. Zuckerberg, I want to follow up on a question
  3432. asked by Mr. McNerney, where he talked about visiting websites
  3433. and the fact that Facebook can track you, and, as you visit
  3434. those websites, you can have that deleted. I'm informed that
  3435. there's not a way to do that. Or are you telling us that you are
  3436. announcing a new policy?
  3437. REP. DORIS MATSUI (D-CALIF.)
  3438. - Thank you, Mr. Chairman, and welcome, Mr. Zuckerberg. Thank you
  3439. very much for being here. You know, I was just thinking about Facebook and
  3440. how you developed your platform — first, from a social platform
  3441. amongst friends and colleagues, joining a community.
  3442. And a lot of that was based upon trust, because you knew your
  3443. friends, right? But that evolved into this business platform,
  3444. and one of the pillars still was trust. And I think you would
  3445. all — I think everybody here would agree that trust is in short
  3446. supply here, and that's why we're here today. Now, you've
  3447. constantly maintained that consumers own the data they provided
  3448. to Facebook and should have control over it. And I appreciate
  3449. that, and I just want to understand more about what that means.
  3450. To me, if you own something, you ought to have a say about
  3451. how and when it's used. But, to be clear, I don't just mean
  3452. pictures, email addresses, Facebook groups or pages. I
  3453. understand the data and information consumers provided to
  3454. Facebook can be, and perhaps is, used by algorithms to form
  3455. assumptions and inferences about users to better target ads to
  3456. the individuals. Now, do you believe that consumers actually own
  3457. their data, even when that data has been supplemented by a data
  3458. broker, by assumptions algorithms have made about that user, or
  3459. otherwise? And this is kind of the question that Ms. Blackburn
  3460. has come up with — our own comprehensive profile, which is kind
  3461. of our virtual self.
  3462. ENGEL
  3463. - You mentioned before, in your remarks, hate speech. We've seen
  3464. the scale and reach of extremism balloon in the last decade,
  3465. partially because of the expansion of social platforms. Whether
  3466. it's a white supremacist rally in Charlottesville that turned
  3467. violent, or it's ethnic cleansing in Burma that resulted in the
  3468. second-largest refugee crisis in the world, are you aware of any
  3469. foreign or domestic terrorist organizations, hate groups,
  3470. criminal networks or other extremist networks that have scraped
  3471. Facebook user data? And, if they have, and if they do it in the
  3472. future, how would you go about getting it back or deleting it?
  3473. - So do you adjust your — your algorithms to prevent individuals
  3474. interested in violence or nefarious activities from being
  3475. connected with other like-minded individuals?
  3476. - Do you adjust your algorithms to prevent individuals interested
  3477. in violence or bad activities from being connected with other
  3478. like-minded individuals?
  3479. - Okay. And, finally, let me say this. Many of us are very angry
  3480. about Russian influence in the — in the 2016 presidential
  3481. elections and Russian influence over our presidential elections.
  3482. Does Facebook have the ability to detect when a foreign entity
  3483. is attempting to buy a political ad? And is that process
  3484. automated? Do you have procedures in place to inform key
  3485. government players when a foreign entity is attempting to buy a
  3486. political ad or when it might be taking other steps to interfere
  3487. in an election?
  3488. - Thank you.
  3489. REP. JAN SCHAKOWSKY (D-ILL.)
  3490. - Thank you, Mr. Chairman. You know, you have a long history of
  3491. growth and success, but you also have a long list of apologies.
  3492. In 2003, it started at Harvard. “I apologize for any harm done
  3493. as a result of my neglect.” 2006: “We really messed this one
  3494. up.” 2007: “We simply did a bad job. I apologize for it.” 2010:
  3495. “Sometimes we move too fast.” 2011: “I'm the first to admit that
  3496. we've made a bunch of mistakes.” 2017 — this
  3497. is in — in connection with the Russian manipulation of the
  3498. election and the data that was — came from Facebook initially:
  3499. “I am — I ask for forgiveness. I will work to do better.” So it
  3500. seems to me from this history — this is
  3501. proof to me — that self-regulation simply does not work. I have a
  3502. bill — the Secure and Protect Americans' Data Act — that I hope
  3503. you will take a look at — a very simple bill about setting
  3504. standards for how you have to make sure that the data is
  3505. protected, and deadlines on when you have to release that
  3506. information to the public. Certainly, it ought to go to the FTC,
  3507. as well. But, in response to the questions about the apps and
  3508. the investigation that you're going to do, you said you don't
  3509. necessarily know how long. Have you set any deadline for that?
  3510. Because we know, as my colleague said, that there are tens of
  3511. thousands — there's actually been 9 million apps. How long do we
  3512. have to wait for that kind of investigation?
  3513. - Mr. Chairman, since my name was mentioned, can I just respond?
  3514. REP. ANNA G. ESHOO (D-CALIF.)
  3515. - Thank you, Mr. Chairman. Good morning, Mr. Zuckerberg. First, I
  3516. believe that our democratic institutions are undergoing a stress
  3517. test in our country. And I believe that American companies owe
  3518. something to America. I think the damage done to our democracy,
  3519. relative to Facebook and its platform being weaponized, is
  3520. incalculable. Enabling the cynical manipulation of American
  3521. citizens for the purpose of influencing an election is deeply
  3522. offensive, and it's very dangerous. Putting our private
  3523. information on offer without concern for possible misuses, I
  3524. think, is simply irresponsible. I invited my constituents, going
  3525. into the weekend, to participate in this hearing today by
  3526. submitting what they want to ask you. And so my questions are
  3527. theirs. And, Mr. Chairman, I'd like unanimous consent to place
  3528. all of their questions in the record.
  3529. COLLINS
  3530. - Thank you, Mr. Chairman. And I wasn't sure where I would be
  3531. going with this, but when you're number 48 out of 54 members you
  3532. know you can do a lot of listening, and I've tried to do that
  3533. today. And to — to frame where I am now, I think, first of all,
  3534. thank you for coming. And there's a saying, you don't know what
  3535. you know until you know it. And I really think you've done a — a
  3536. great benefit to Facebook and yourself in particular as we now
  3537. have heard, without a doubt, Facebook doesn't sell data. I think
  3538. the narrative would be, of course you sell data. And now we all
  3539. know across America you don't sell data. I think that's very
  3540. good for you, a very good clarification. The other one is that
  3541. the whole situation we're here for is because a third-party app
  3542. developer, Aleksandr Kogan, didn't follow through on the rules.
  3543. He was told he couldn't sell the data. He gathered the data, and
  3544. then he did what he wasn't supposed to do and sold that data. And
  3545. it's very hard to anticipate a bad actor doing what they're
  3546. doing until after they've done it, and clearly you took actions
  3547. after 2014. So one real quick question is — in,
  3548. you know, 10 or 20 or 30 seconds — what data was being collected
  3549. before you locked down the platform, and how did that change to
  3550. today?
  3551. - And — and I think that's a very good clarification as well
  3552. because people were wondering how does 300,000 become 87
  3553. million. So that — that's also something that's good to know.
  3554. And — and you know, I guess my last minute as I've heard the
  3555. tone here, I've got to give you all the credit in the world. You
  3556. — I could tell from the tone, we would say the other side
  3557. sometimes when we point to our left, but when the representative
  3558. from Illinois, to quote her, said, “Who is going to protect us
  3559. from Facebook?” I mean that threw me back in my chair. I mean,
  3560. that was certainly an aggressive — we'll use the polite
  3561. word “aggressive” — but, I think, out-of-bounds kind of comment.
  3562. Just my opinion. And I've said I was interviewed by a couple of
  3563. folks in the break and I said, you know, as I'm listening to you
  3564. today I'm quite confident that you truly are doing good. You
  3565. believe in what you're doing. 2.2 billion people are using your
  3566. platform. And I sincerely know in my heart that you do believe
  3567. in — in keeping all ideas equal, and you may vote a certain way
  3568. or not but that doesn't matter. You've got 27,000 employees and
  3569. I think the fact is that you're operating under a Federal Trade
  3570. Commission consent decree from 2011. That's a real thing, and it
  3571. goes for 20 years. So when someone said, do we need more
  3572. regulations, or do we need more legislation? I said no. Right
  3573. now what we have is Facebook with a CEO whose mind is in
  3574. the right place doing the best you can with 27,000 people, but
  3575. the consent decree does what it does. I mean, there would be
  3576. significant financial penalties were Facebook to ignore that
  3577. consent decree. So I think as I'm hearing this meeting going
  3578. back and forth, I for one think it was beneficial. It's good. I
  3579. don't think we need more regulations and legislation now, and I
  3580. want to congratulate you, I think, on doing a good job here
  3581. today in presenting your case, and we now know what we didn't know
  3582. beforehand. So thank you again.
  3583. REP. MARKWAYNE MULLIN (R-OKLA.)
  3584. - Thank you, Mr. Chairman, and sir, thank you for being here. I
  3585. appreciate you using the term “Congressman” and “Congresswoman.”
  3586. My name's Markwayne Mullin, and feel free to use that name. Sir,
  3587. I — I just want to tell you, first of all, I want to commend you
  3588. on your ability to not just invent something, but to see it
  3589. through its — through its growth. We see a lot of venturers who have
  3590. the ability to do that, but to manage it, and to see
  3591. it through its tremendous growth period, takes a lot of talent,
  3592. and, by your showing here today, you — you handle
  3593. yourself well, so — so thank you on that. And you also do that
  3594. by hiring the right people, so I commend you on doing that,
  3595. also. You hire people, obviously, based on their ability to get
  3596. the job done. Real quick, a couple questions I have, and I'll
  3597. give you time to answer it. Isn't it the consumers'
  3598. responsibility to some degree to control the content that
  3599. they release?
  3600. REP. MCNERNEY (D-CALIF.)
  3601. - I thank the Chairman. Mr. Zuckerberg, I — I thank you for
  3602. agreeing to testify before the House and Senate committees. I
  3603. know it's a long, grueling process and I appreciate your
  3604. cooperation. I'm a mathematician that spent 20 years in industry
  3605. and government, developing technology including algorithms.
  3606. Moreover, my constituents are impacted by these issues. So I'm
  3607. deeply committed and invested here. I'm going to follow up on an
  3608. earlier question. Is there currently a place that I can download
  3609. all of the Facebook information about me, including the websites
  3610. that I have visited?
  3611. REP. PETE OLSON (R-TEX.)
  3612. - I thank the chair. And, Mr. Zuckerberg, I know we both wish we
  3613. met under a different set of circumstances. When the story
  3614. broke, you were quoted as saying, “I started Facebook. I run it.
  3615. I'm responsible for what happens here,” end quote. You said
  3616. those same words in your opening statement an hour and a half
  3617. ago. I know you believe that in your heart. It's not just some
3618. talking point, some canned speech, because, in my four years —
3619. five — I'm sorry, nine years in the Navy, I know the best commanding
3620. officers, the best skippers, the best CEOs have that exact same
3621. attitude. If Facebook were a Navy ship, your privacy has taken a
  3622. direct hit. Your trust is severely damaged. You're taking on
  3623. water and your future may be a fine with a number, per The
3624. Washington Post, with four commas in it. Today, over $1 trillion
3625. in fines coming your way. As you know, you have to reinforce
  3626. your words with actions. I have a few questions about some
  3627. anomalies that have happened in the past. First of all, back in
  3628. 2012, apparently, Facebook did an experiment on 689,003 Facebook
  3629. users. You reduced positive posts from users' friends and
  3630. limited so-called “downer” posts from other friends. They see —
  3631. fed positive information to one group, and, another group,
  3632. negative information. The goal was to see how the tone of these
  3633. posts would affect behavior. I look at this Forbes article, the
  3634. L.A. Times, about un-legal — illegal human experimentation
  3635. without permission. I want to talk about that. It seems that
  3636. this is disconnecting people, in stark contrast to your mission
  3637. to connect people. Explain to us how you guys thought this idea
  3638. was a good idea — experimenting with people, giving them more
  3639. negative information, positive information.
  3640. REP. BILLY LONG (R-MO.)
  3641. - Thank you, Mr. Chairman, and thank you, Mr. Zuckerberg, for
  3642. being here today on a voluntary basis. I want to put that out
  3643. here — you were not subpoenaed to be here, as Mr. Barton offered
  3644. up a little bit ago. We've had — you're the only witness at the
  3645. table today. We've had 10 people at that table, to give you an
  3646. idea of what kind of hearings we've had in here. Not too long
  3647. ago, we had 10, and I'd say that, if we invited everyone that
  3648. had read your terms of agreement — terms of service, we could
3649. probably fit them at that table. I also would say that I
3650. represent 751,000 people, and, out of that 751,000 people, the
  3651. people in my area that are really worked up about this —
  3652. Facebook, and about this hearing today — would also fit with you
  3653. there at the table. So I'm not getting the outcry from my
  3654. constituents about what's going on with Cambridge Analytica and
  3655. — and this user agreement and everything else. But there are
  3656. some things that I think you need to be concerned about. One
  3657. question I'd like to ask before I move into my questioning is
  3658. what was FaceMash, and is it still up and running?
  3659. REP. DIANA DEGETTE (D-COLO.)
  3660. - Thank you very much, Mr. Chairman. Mr. Zuckerberg, we
  3661. appreciate your contrition. And we appreciate your commitment to
  3662. resolving these past problems. From my perspective, though, and
  3663. my colleagues on both sides of the aisle in this committee,
  3664. we're interested in looking forward to preventing this kind of
  3665. activity; not just with Facebook but with others in your
  3666. industry. And as has been noted by many people already, we've
  3667. been relying on self-regulation in your industry for the most
  3668. part. We're trying to explore what we can do to prevent further
  3669. breaches. So I'm going to ask you a whole series of fairly quick
  3670. questions. They should only require yes-or-no answers. Mr.
  3671. Zuckerberg, at the end of 2017, Facebook had a total shareholder
  3672. equity of just over $74 billion. Is that correct?
  3673. KENNEDY
  3674. - Okay. So can advertisers, then — understanding that you,
  3675. Facebook, maintain the data; you're not selling that to anybody
  3676. else — but advertisers clearly end up having access through that
  3677. — through agreements with you about how they, then, target ads
  3678. to me, to you, to any other user. Can advertisers in any way use
  3679. nonpublic data — so data that individuals would not think is
  3680. necessarily public — so that they can target their ads?
  3681. - Understood. They don't — you don't share that, but they get
  3682. access to that information so that — if they know — they want to
  3683. market skis to me, because I like skis. On the realm of data
  3684. that is accessible to them, does that include — does Facebook
  3685. include deleted data?
  3686. - Fair, fair. So can advertisers, either directly or indirectly,
  3687. get access to or use the metadata that Facebook collects in
  3688. order to more specifically target ads? So that would include — I
  3689. know you've talked a lot about how Facebook would use access to
  3690. information for folks that — well, I might be able to opt in or
3691. out about your ability to track me across other websites. Is that
3692. used by those advertisers, as well?
  3693. - So does — essentially, does — the advertisers that are using
  3694. your platform — do they get access to information that the user
  3695. doesn't actually think is either, one, being generated, or, two,
  3696. is public? Understanding that, yes, if you dive into the details
  3697. of your — your platform, users might be able to shut that off,
  3698. but I think one of the challenges with trust here is that
  3699. there's an awful lot of information that's generated, that
  3700. people don't think that they're generating, and that advertisers
  3701. are being able to target because Facebook collects it.
  3702. - Right. But, then, I guess, the question back to — and I've only
  3703. got 20 seconds. I think one of the rubs that you're hearing is I
  3704. don't understand how users, then, own that data. I think that's
  3705. part of the rub. Second, you focus a lot of your testimony and
  3706. the questions on the individual privacy aspects of this. But we
  3707. haven't talked about the societal implication of it. And I
  3708. think, while I applaud some of the reforms that you're putting
  3709. forward, the underlying issue here is that your platform has
  3710. become a — a ...
  3711. - ... mix of — two seconds — news, entertainment, social media
  3712. that is up for manipulation. We've seen that with a foreign
  3713. actor. If the changes to individual privacy don't seem to be
  3714. sufficient to address that underlying issue ...
  3715. - ... I'd love your comments on that at the appropriate time.
  3716. Thank you.
  3717. LANCE
  3718. - Fair enough. My point is that we don't favor censoring in any
  3719. way, so long as it doesn't involve hate speech or violence or
  3720. terrorism. And, of course, the examples today indicate quite the
  3721. contrary, number one. Number two, Congresswoman Blackburn has
  3722. mentioned her legislation. I'm a co-sponsor of the BROWSER
  3723. legislation. I commend it to your attention, to the attention of
  3724. your company. It is for the entire ecosystem. It is for ISPs and
  3725. edge providers. It is not just for one or the other. It is an
  3726. opt-in system, similar to the system that exists in your — might
  3727. I respectfully request of you, Mr. Zuckerberg, that you and your
  3728. company review the BROWSER legislation? And I would like your
  3729. support for that legislation after your review of it.
  3730. - Thank you very much. Your COO, Sheryl Sandberg, last week,
  3731. appeared on the Today program. And she admitted the possibility
  3732. that additional breaches in personal information could be
  3733. discovered by the current audits. Quote, “We're doing an
  3734. investigation. We're going to do the audits. And, yes, we think
  3735. it's possible. That's why we're doing the audits.” Then the COO
  3736. went on to say, “Facebook cared about privacy all along, but I
  3737. think we got the balance wrong.” Do you agree with the statement
  3738. of your COO?
  3739. - Thank you. I — I certainly concur with the statement of the
  3740. COO, as affirmed by you today, that you got the balance wrong.
  3741. And then, regarding Cambridge Analytica, the fact that 300,000
  3742. individuals or so gave consent, but that certainly didn't mean
  3743. they gave consent to — to 87 million friends — do you believe
  3744. that that action violated your consent agreement with the
  3745. Federal Trade Commission?
  3746. - Thank you. I think you may have violated the agreement with the
  3747. Federal Trade Commission, and I'm sure that will be determined
  3748. in the future. Thank you, Mr. Chairman.
  3749. MULLIN
3750. - Right. And — and do the device settings really help you
3751. protect what information is released? For instance, there's
  3752. been a lot of talk about them searching for something, maybe on
  3753. Google, and then the advertisement pops up on Facebook. Isn't
  3754. there a setting on most devices to where you can close out the
  3755. browser without Facebook interacting with that?
  3756. - See, I — I come from the — from the background of believing
  3757. that everything I do, I assume is open for anybody to take when
3758. I'm on the Internet. I — I understand that there are privacy
3759. concerns, but you're still releasing it to something farther
  3760. than a pen and pad. So once I'm — once I'm on the Web, or I'm on
  3761. an app, then that information is subject to — to going, really,
  3762. anyplace. All I can do is protect it the best I can by my
  3763. settings. And so what I'm trying to get to is, as a — as an
  3764. individual, as a user of Facebook, how can someone control
  3765. keeping the content within the realm that they want to keep it,
  3766. without it being collected? You say that, you know, you don't
  3767. sell it. However, you do — you do sell advertisement. As a
  3768. business owner, I have a demographic that I go after, and I
  3769. search advertisers that — that market to that demographic. So
  3770. you collect information for that purpose, right?
  3771. - Sure.
  3772. - Value-based. But if I don't — If I'm a customer or a user of
  3773. Facebook, and I don't want that information to be shared, how do
3774. I keep that from happening? Are there settings within the app
  3775. that I need to go to to set — to block all that?
  3776. - Would that have kept apps from seeking our information, if
  3777. that's ...
  3778. - Thank you. I appreciate it. Thank you, Chairman.
  3779. BLACKBURN
  3780. - And where does privacy rank as a corporate value for Facebook?
  3781. - Okay.
  3782. - Well ...
  3783. - No, I can't let you filibuster right now. A constituent of mine
  3784. who's a benefits manager brought up a great question in a
  3785. meeting at her company last week. And she said, you know, health
  3786. care, you've got HIPAA, you've got Gramm-Leach-Bliley, you've
  3787. got the Fair Credit Reporting Act. These are all compliance
  3788. documents for privacy for other sectors of the industry. She was
  3789. stunned, stunned, that there are no privacy documents that apply
  3790. to — to you all. And we've heard people say that — you know, and
  3791. you've said you're considering, maybe you need more regulation.
  3792. What we think is, we need for you to look at new legislation.
  3793. And you're hearing there'll be more bills brought out in the
  3794. next few weeks. But we have had a bill. The BROWSER Act, and I'm
  3795. certain that you're familiar with this, is bipartisan. And I
  3796. thank Mr. Lipinski and Mr. Lance and Mr. Flores for their good
  3797. work on this legislation. We've had it for over a year and
  3798. certainly we've been working on this issue for about four years.
  3799. And what this would do is have one regulator, one set of rules
  3800. for the entire ecosystem. And will you commit to working with us
  3801. to pass privacy legislation, to pass the BROWSER Act? Will you
  3802. commit to doing that?
  3803. - Okay, let's get — let's get familiar with the details. As you
  3804. have heard, we need some rules and regulations. This is only 13
  3805. pages. The BROWSER Act is 13 pages, so you can easily become
  3806. familiar with it. And we would appreciate your help. And I've
  3807. got to tell you, as Mr. Green just said, as you look at the E.U.
  3808. privacy policies, you're already doing much of that, if you're
  3809. doing everything you claim. Because you will have to allow
  3810. consumers to control their data, to change, to erase it. You
  3811. have to give consumers opt-in so that mothers know — my
  3812. constituents in Tennessee want to know that they have a right to
  3813. privacy. And we would hope that that's important to you all. I
  3814. want to move on and ask you something else. And please get back
  3815. to me once you've reviewed the BROWSER Act. I would appreciate
  3816. hearing from you. We've done one hearing on the algorithms. I
3817. chair the Communications and Technology Subcommittee here. We're
  3818. getting ready to do a second one on the algorithms. We're going
  3819. to do one next week on prioritization. So I'd like to ask you,
  3820. do you subjectively manipulate your algorithms to prioritize or
  3821. censor speech?
  3822. - Let me tell you something right now: I — Diamond and Silk is
  3823. not terrorism.
  3824. REP. KURT SCHRADER (D-ORE.)
  3825. - Thank you, Mr. Chairman, I appreciate that. Mr. Zuckerberg,
  3826. again, thank you for being here — appreciate your — your good
  3827. offices and voluntarily coming before us. You have testified
  3828. that you voluntarily took Cambridge Analytica's word that they
  3829. had deleted information, found out subsequently that they did
  3830. not delete that information, have sent in your own forensics
  3831. team, which I — I applaud. I just want to make sure — get some
3832. questions answered here. Can you tell us that they were told —
3833. they were told not to destroy any data — misappropriated data —
3834. that they may find?
  3835. CRAMER
  3836. - Thank you, and thanks for being here, Mr. Zuckerberg. And you
  3837. know, “Don't eat the fruit of this tree” is the only regulation
  3838. that was ever initiated before people started abusing freedom.
  3839. Since then, millions of regulations, laws and rules have been
  3840. created in response to an abuse of freedom. Oftentimes, that
  3841. response is a — is more extreme than the abuse, and that's what
  3842. I fear could happen, based on some of the things I've heard
  3843. today in response to this. So this national discussion is very
  3844. important. First of all, it's not — not only for these two days,
  3845. but that it continues, lest we over-respond, Okay? Now, that
  3846. said, I think that the consumer and industry, and whatever
  3847. industry it is, your company or others — others like yours,
  3848. share that responsibility. So I appreciate both your patience
  3849. and your preparation coming in today. But in response to the
3850. questions from a few of my colleagues related to the — the illegal
  3851. drug ads, I have to admit that there were times when I was
  3852. thinking, “His answers aren't very reassuring to me.” And I'm
  3853. wondering what your answer would be as to how quickly you could
3854. take down an illegal drug site, if there were a million-dollar,
3855. per-post, per-day regulatory fine tied to it. In other words,
  3856. give it your best. I mean, don't wait for somebody to flag it.
  3857. Look for it. Make it a priority. It's certainly far more
  3858. dangerous than a couple of conservative Christian women on — on
  3859. TV. So please, be better than this.
  3860. - And I don't expect it to be perfect, but I do expect it to be a
  3861. higher priority than conservative thought. Speaking of that, I
  3862. think in — in some of your responses to Senator Cruz yesterday,
  3863. and some responses today, related to liberal bias, you've —
  3864. you've sort of implied the fact that while you have these 20,000
  3865. enforcement folks, you've implied that the Silicon Valley —
  3866. perhaps this was more yesterday — that Silicon Valley is a very
  3867. liberal place, and so the talent pool perhaps leans left, and
  3868. it's biased. Let me suggest that you look someplace perhaps in
  3869. the middle of the North American continent for some people,
  3870. maybe even your next big investment of — of capital could be in
  3871. — in some place like, say, Bismarck, North Dakota, or Williston,
  3872. where you have visited, where people tend to be pretty common
  3873. sense, and probably, perhaps, even more diverse than Facebook in
  3874. — in some respects. If the talent pool is a problem, then let's
  3875. look for a different talent pool, and maybe we can even have a
  3876. nice, big center someplace. I want to then close with this,
  3877. because you testified yesterday, and the opening statement by
  3878. the ranking member of the committee bothered me, in that
  3879. suddenly there is this great concern that the providers,
  3880. particularly Facebook, other large ads providers, and — and
  3881. content providers should be hyper-regulated, when all along, we
  3882. — we, as Republicans, have been talking about net neutrality. We
  3883. — we talked about earlier this year, when we — or last year,
  3884. when we rolled back the Internet service provider privacy stuff
  3885. that seemed tilted heavily in your favor, and against them.
  3886. Don't you think that ubiquitous platforms like Google, and
  3887. Facebook, and — and many others have — should have the same
  3888. responsibility to privacy as an Internet service provider?
  3889. - It is.
3890. - I would submit to you that I have fewer choices on — on your
3891. type of platform than I do among Internet
3892. service providers, even in rural North Dakota. With that, thank
  3893. you, Mr. Chairman.
  3894. - Isn't he funny?
  3895. FLORES
  3896. - Good.
  3897. - I've got to — I've got limited time. With respect to privacy, I
  3898. think that we need to set a baseline. When we talk about a
  3899. virtual person that each technology user establishes online —
3900. their name, address, their online purchases, geolocation data,
3901. websites visited, pictures, et cetera — I think that the
  3902. individual owns the virtual person they set up online. My second
  3903. question is this. You've said earlier that each user owns their
  3904. virtual presence. Do you think that this concept should apply to
  3905. all technology providers, including social media platforms, edge
  3906. providers and ISPs?
3907. - Thank you. I'm not trying to cut you off. You can provide
  3908. more information supplementally, after, if you don't mind. In
3909. this regard, I believe that, if Congress enacts
3910. privacy standards for technology providers, just as we have for
3911. financial institutions, health care, employee benefits, et
  3912. cetera, the policy should state that the data of technology
  3913. users should be held privately unless they specifically consent
  3914. to the use of the data by others. This release should be based
  3915. on the absolute transparency as to what data will be used, how
  3916. it will be processed, where — how — where it will be stored,
  3917. what algorithms will be applied to it, who will have access to
  3918. it, if it will be sold and to whom it might be sold. The
  3919. disclosure of this information and the associated opt-in actions
3920. should be easy to understand and easy for nontechnical users
  3921. to execute. The days of the long-scrolling fine-print
  3922. disclosures with a single check mark at the bottom should end.
  3923. In this regard, based on my use of ...
  3924. - ... Facebook, I think you've come a long way toward meeting
  3925. that objective. I think we must move further. I'll have two
  3926. questions to submit later. And thank you — if you can expand on
  3927. your responses to my earlier questions later, thank you.
  3928. REP. DAVID LOEBSACK (D-IOWA)
  3929. - Thank you, Mr. Chairman. I want to thank you and the ranking
  3930. member for holding this hearing today, and I want to thank Mr.
  3931. Zuckerberg for being here today, as well. Add my name to the
  3932. rural broadband list, as well. I have one-fourth of Iowa, the
  3933. southeast part of Iowa. We definitely need more help on that
  3934. front. Thank you. You may recall, last year, Mr. Zuckerberg,
  3935. that you set out to visit every state in the country, to meet
  3936. different people, and one of those places you visited was, in
  3937. fact, Iowa — my home state of Iowa. And you did visit the
3938. district that I proudly represent, and you met some of my
  3939. constituents. As you began your tour, you said that you believed
3940. in connecting the world and giving everyone a voice, and that
3941. you wanted, quote, “to personally hear more of those
  3942. voices.” I'm going to do the same thing in just a second that a
  3943. number of my colleagues did, and just ask you some questions
  3944. that were submitted to my Facebook page by some of my
  3945. constituents. I do want to say at the outset, though — and I do
  3946. ask for unanimous consent to enter all those questions on the
  3947. record, Mr. Chair ...
  3948. REP. DAVID B. MCKINLEY (R-W.VA.)
  3949. - Thank you for coming, Mr. Zuckerberg. I've got a yes or no
  3950. question, if you could give that. Should Facebook — should
  3951. Facebook enable illegal online pharmacies to sell drugs such as
  3952. Oxycodone, Percocet, Vicodin without a prescription?
  3953. REP. BRETT GUTHRIE (R-KY.)
  3954. - Thank you, Mr. Chairman. Thanks for being here. When I first
  3955. got into public office, the Internet was really kicking off, and
3956. I had a lot of people complain about ads, just the inconvenience
3957. of ads and the cumbersomeness of the Internet.
  3958. I remember telling someone one time, being from Kentucky, a
3959. basketball fan. I said, “There's nothing I hate worse than the
3960. four-minute timeout, the TV timeout. It breaks the flow of the
3961. game and everything. But because of the four-minute timeout, I
3962. get to watch the game for free, so that's something I'm willing
3963. to accept.” What you're not really willing to
  3964. accept is that your data's just out there, and it — it's being
  3965. used. But it's being used in the — in the right way, and it's —
  3966. it's funny, because I was going to ask this question anyway. My
3967. — my friend and I were planning a family trip to Florida, and I
  3968. searched a town in Florida, and all of a sudden, I started
  3969. getting ads for a brand of hotel that I typically stay in, and a
3970. great hotel at a price, available to the public because it was
3971. on the Internet, that I was willing to pay to stay there. So I
  3972. thought it was actually convenient. Instead of getting just an
  3973. ad to someplace I'll never go, I got an ad specifically to a
  3974. place I was — I was looking to go, so I thought that was
  3975. convenient. And it wasn't Facebook, although my wife used
  3976. Facebook to message my mother-in-law this weekend for where
  3977. we're meeting up, so it's very valuable. We get to do that for
  3978. free, because your business model relies on consumer-driven
  3979. data. This wasn't Facebook. It was a search engine, but they use
  3980. consumer — consumer-driven data to target an ad to me, so you're
  3981. not unique in Silicon Valley, or in this Internet world in doing
  3982. this type of targeted ads, are you?
  3983. REP. TONY CÁRDENAS (D-CALIF.)
  3984. - Thank you very much. Seems like we've been here forever, don't
  3985. you think? Well, thank you, Mr. Chairman, Ranking Member, for
  3986. holding this important hearing. I'm of the opinion that,
  3987. basically, we're hearing from one of the leaders — the CEO of
3988. one of the biggest corporations in the world — yet almost
  3989. entirely in an environment that is unregulated, or, for basic
  3990. terms, that — the lanes in which you're supposed to operate in
  3991. are very wide and broad, unlike other industries. Yet, at the
  3992. same time, I have a chart here of the growth of Facebook.
  3993. Congratulations to you and your shareholders. It shows that, in
  3994. 2009, your net value of the company was less than — or revenue
  3995. was less than a billion dollars. And then you look all the way
  3996. over to 2016 — it was in excess of $26 billion. And then, in
  3997. 2017, apparently, you're about close to $40 billion. Are those
  3998. numbers relatively accurate about the growth and the phenomenon
  3999. of Facebook?
  4000. REP. MICHAEL C. BURGESS (R-TEX.)
  4001. - Thank you, Mr. Chairman, and thanks to our witness for — for
  4002. being here today. Mr. Chairman, I have a number of articles that
  4003. I ask unanimous consent to insert into the record. I know I
  4004. won't have time to get to all of my questions.
  4005. WALDEN
  4006. - Before my opening statement, just as a reminder to our
  4007. committee members on both sides, it's another busy day at Energy
  4008. and Commerce. In addition, as you will recall, to this morning's
  4009. Facebook hearing, later today, our Health Subcommittee will hold
  4010. its third in the series of legislative hearings on solutions to
  4011. combat the opioid crisis. And, remember, Oversight and
  4012. Investigations Subcommittee will hold a hearing where we will
  4013. get an update on the restoration of Puerto Rico's electric
  4014. infrastructure following last year's hurricane season. So, just
  4015. a reminder: When this hearing concludes, I think we have votes
  4016. on the House floor. Our intent is to get through every — every
  4017. member before that point, to be able to ask questions. But then,
  4018. after the votes, we will come back into our subcommittees to do
  4019. that work. As Ray Baum used to say, “The fun never stops.” The
  4020. chair now recognizes himself for five minutes for purposes of an
  4021. opening statement. Good morning. Welcome, Mr. Zuckerberg, to the
  4022. Energy and Commerce Committee in the House. We've called you
  4023. here today for two reasons. One is to examine the alarming
  4024. reports regarding breaches of trust between your company, one of
  4025. the biggest and most powerful in the world, and its users. And
  4026. the second reason is to widen our lens to larger questions about
  4027. the fundamental relationship tech companies have with their
  4028. users. The incident involving Cambridge Analytica and the
  4029. compromised personal information of approximately 87 million
  4030. American users — or mostly American users — is deeply disturbing
  4031. to this committee. The American people are concerned about how
  4032. Facebook protects and profits from its users' data. In short,
  4033. does Facebook keep its end of the agreement with its users? How
  4034. should we, as policymakers, evaluate and respond to these
  4035. events? Does Congress need to clarify whether or not consumers
  4036. own or have any real power over their online data? Have edge
  4037. providers grown to the point that they need federal supervision?
  4038. You and your co-founders started a company in your dorm room
  4039. that's grown to one — be one of the biggest and most successful
  4040. businesses in the entire world. Through innovation and
  4041. quintessentially American entrepreneurial spirit, Facebook and
  4042. the tech companies that have flourished in Silicon Valley join
  4043. the legacy of great American companies who built our nation,
  4044. drove our economy forward, and created jobs and opportunity. And
  4045. you did it all without having to ask permission from the federal
  4046. government and with very little regulatory involvement. The
  4047. company you created disrupted entire industries and has become
  4048. an integral part of our daily lives. Your success story is an
  4049. American success story, embodying our shared values of freedom
  4050. of speech, freedom of association and freedom of enterprise.
  4051. Facebook also provides jobs for thousands of Americans,
4052. including in my own congressional district, with data centers in
  4053. Prineville. Many of our constituents feel a genuine sense of
  4054. pride and gratitude for what you've created, and you're rightly
  4055. considered one of the era's greatest entrepreneurs. This
  4056. unparalleled achievement is why we look to you with a special
  4057. sense of obligation and hope for deep introspection. While
  4058. Facebook has certainly grown, I worry it may not have matured. I
  4059. think it's time to ask whether Facebook may have moved too fast
  4060. and broken too many things. There are critical unanswered
  4061. questions surrounding Facebook's business model and the entire
  4062. digital ecosystem regarding online privacy and consumer
  4063. protection. What exactly is Facebook? Social platform? Data
  4064. company? Advertising company? A media company? A common carrier
  4065. in the information age? All of the above? Or something else?
4066. - Users trust Facebook with a great deal of information: their
  4067. name, home town, email, phone number, photos, private messages,
  4068. and much, much more. But, in many instances, users are not
  4069. purposefully providing Facebook with data. Facebook collects
  4070. this information while users simply browse other websites, shop
  4071. online or use a third-party app. People are willing to share
  4072. quite a bit about their lives online, based on the belief they
  4073. can easily navigate and control privacy settings and trust that
  4074. their personal information is in good hands. If a company fails
  4075. to keep its promises about how personal data are being used,
  4076. that breach of trust must have consequences. Today we hope to
  4077. shed light on Facebook's policies and practices surrounding
  4078. third-party access to and use of user data. We also hope you can
  4079. help clear up the considerable confusion that exists about how
  4080. people's Facebook data are used outside of the platform. We hope
  4081. you can help Congress, but, more importantly, the American
  4082. people better understand how Facebook user information has been
  4083. accessed by third parties, from Cambridge Analytica and Cubeyou,
  4084. to the Obama for America presidential campaign. And we ask that
  4085. you share any suggestions you have for ways policymakers can
  4086. help reassure our constituents that data they believe was only
  4087. shared with friends or certain groups remains private to those
  4088. circles. As policymakers, we want to be sure that consumers are
  4089. adequately informed about how their online activities and
  4090. information are used. These issues apply not just to Facebook,
  4091. but equally to the other internet-based companies that collect
  4092. information about users online. So, Mr. Zuckerberg, your
  4093. expertise in this field is without rival. So thank you for
  4094. joining us today to help us learn more about these vital matters
  4095. and to answer our questions. With that, I yield now to the
  4096. gentleman from New Jersey, the ranking member of the Energy and
  4097. Commerce Committee, my friend, Mr. Pallone, for five minutes for
  4098. purposes of an opening statement.
  4099. - I think I thank the gentleman for his opening comments.
  4100. (LAUGHTER) With that, we now conclude with member opening
  4101. statements. The chair would like to remind members that,
  4102. pursuant to the committee rules, all members' opening statements
  4103. will be made part of the record. Today, we have Mr. Mark
  4104. Zuckerberg, Chairman and CEO of Facebook Incorporated, here to
  4105. testify before the full Energy and Commerce Committee. Mr.
  4106. Zuckerberg will have the opportunity to give a five-minute
  4107. opening statement, followed by a round of questioning from our
  4108. members. So thank you for taking the time to be here, and you
  4109. are now recognized for five minutes.
  4110. - Thank you, Mr. Zuckerberg. I'll start out, and we'll go into
  4111. the questioning phase. We'll go back and forth, as we always do.
  4112. Remember, it's four minutes today, so we can get to everyone.
  4113. Mr. Zuckerberg, you've described Facebook as a company that
  4114. connects people and as a company that's idealistic and
  4115. optimistic. I have a few questions about what other types of
  4116. companies Facebook may be. Facebook has created its own video
  4117. series, starring Tom Brady, that ran for six episodes and has
4118. over 50 million views. That's twice the number of viewers
  4119. that watched the Oscars last month. Also, Facebook's obtained
  4120. exclusive broadcasting rights for 25 major league baseball games
  4121. this season. Is Facebook a media company?
  4122. - All right, let me ask the next one. You can send money to
  4123. friends on Facebook Messenger using a debit card or a PayPal
  4124. account to, quote, “split meals, pay rent and more,” close
  4125. quote. People can also send money via Venmo or their bank app.
  4126. Is Facebook a financial institution?
  4127. - So you've mentioned several times that you started Facebook in
  4128. your dorm room in 2004; 15 years, 2 billion users and several —
  4129. unfortunately — breaches of trust later, Facebook's today — is
  4130. Facebook today the same kind of company you started with a
  4131. Harvard.edu email address?
  4132. - And — and you've recently said that you and Facebook have not
  4133. done a good job of explaining what Facebook does. And so, back
  4134. in 2012 and 2013, when a lot of this scraping of user and friend
  4135. data was happening, did it ever cross your mind that you should
  4136. be communicating more clearly with users about how Facebook is
  4137. monetizing their data? I understand that Facebook does not sell
  4138. user data, per se, in the traditional sense, but it's also just
  4139. as true that Facebook's user data is probably the most valuable
  4140. thing about Facebook. In fact, it may be the only truly valuable
  4141. thing about Facebook. Why wasn't explaining what Facebook does
  4142. with users' data a higher priority for you as a co-founder and —
  4143. and now as CEO?
  4144. - Given the situation, are — can you manage the issues that are
  4145. before you? Or does Congress need to intercede? I'm going to
4146. leave that, because I'm over my time — that, and
  4147. an issue the Vietnam Veterans of America have raised, too. And
  4148. we'll get back with your staff on that about some fake pages
  4149. that are up. But I want to stay on schedule, so, with that, I'll
  4150. yield to Mr. Pallone for four minutes.
  4151. - We're going to have to move on to our next question.
  4152. - The chair now recognizes a former chairman of the committee,
  4153. Mr. Barton of Texas, for four minutes.
  4154. - Absolutely. The chair now recognizes the gentleman from
  4155. Illinois, Mr. Rush, for four minutes for questions.
  4156. - The gentleman's time has expired. We need to go now to the
  4157. gentleman from Michigan, Mr. Upton, for four minutes.
  4158. - Gentleman's time's expired. Chair recognizes the gentlelady
  4159. from California, Ms. Eshoo, for four minutes.
  4160. - Without objection.
  4161. - The gentlelady's time is expired.
4162. - Chair now recognizes the gentleman from Illinois, Mr. Shimkus, for
  4163. four minutes.
  4164. - The gentleman's time has expired. We now turn to the gentleman
  4165. from New York, Mr. Engel, for four minutes.
  4166. - Gentleman's time has expired.
  4167. - Chair recognizes the chairman of the Health Subcommittee, Mr. —
  4168. Dr. Burgess of Texas, for four minutes.
  4169. - Without objection. And we put the slide up you requested.
  4170. - Without objection.
  4171. - It's time.
  4172. - Gentleman's time has expired. Chair recognizes the gentleman
  4173. from Texas, Mr. Green, for four minutes.
  4174. - Thank the gentleman. The chair now recognizes the gentlelady
  4175. from Tennessee, Ms. Blackburn, for four minutes.
  4176. - Gentlelady's time's expired. Chair recognizes gentlelady from
  4177. Colorado, Ms. DeGette, for four minutes.
  4178. - The gentlelady's time is expired. Chair recognizes the
  4179. gentleman from Louisiana, the whip of the House, Mr. Scalise,
  4180. for four minutes.
  4181. - Gentleman's time has expired.
  4182. - Chair now recognizes the gentleman from Pennsylvania, Mr.
  4183. Doyle, for four minutes.
  4184. - Gentleman's ...
  4185. - ... Gentleman's time's expired. Chair recognizes the chairman
  4186. of the Subcommittee on Digital Commerce and Consumer Protection,
  4187. Mr. Latta of Ohio, for four minutes.
  4188. - Gentleman yields back. Chair recognizes the gentlelady from
  4189. Illinois, Ms. Schakowsky, for four minutes.
  4190. - The gentlelady's time ...
  4191. - Gentlelady's time's expired. Chair recognizes the gentlelady
  4192. from Washington state, the conference chairman.
  4193. - Gentlelady's ...
  4194. - Gentlelady's time is expired. Chair recognizes the gentleman
  4195. from North Carolina, Mr. Butterfield, for four minutes.
  4196. - The gentleman's time has expired. Chair now recognizes the
  4197. chairman of the Oversight and Investigations Subcommittee,
  4198. gentleman from Mississippi, Mr. Harper, for four minutes.
  4199. - Gentleman yields back the balance of his time. Gentlelady from
  4200. California, Ms. Matsui, is recognized for four minutes.
  4201. - The gentlelady's time is expired. As previously agreed, we will
  4202. now take a five-minute recess, and committee members and — and
  4203. our witness need to plan to be back in about five minutes. We
  4204. stand in recess. (RECESS)
  4205. - We'll call the Energy and Commerce Committee back to order and
  4206. recognize the gentleman from New Jersey, Mr. Lance, for four
  4207. minutes for purposes of questions.
  4208. - Thank the gentleman from New Jersey, recognize the gentlelady
  4209. from Florida, Ms. Castor, for four minutes.
  4210. - The gentle — the gentlelady's time.
  4211. - The gentlelady's time ...
  4212. - Without objection.
  4213. - Chair now recognizes the gentlemen from Kentucky, Mr. Guthrie,
  4214. for (inaudible) minutes.
  4215. - Gentleman's time ...
  4216. - Recognize the gentleman from Maryland, Mr. Sarbanes, for four
  4217. minutes.
  4218. - Gentleman's time ...
  4219. - ... gentleman's time's expired.
  4220. - Chair recognizes the gentleman from Texas, Mr. Olson, for four
  4221. minutes.
  4222. - That's fine.
  4223. - Go ahead.
  4224. - Thank you for that clarification. We'll now go to Mr. Olson
  4225. from Texas for four minutes.
  4226. - Gentleman's time is expired. Chair recognizes the gentleman
  4227. from California, Mr. McNerney for four minutes.
  4228. - Gentleman's time — gentleman's time is expired. Chair
  4229. recognizes the gentleman from West Virginia, Mr. McKinley, for
  4230. four minutes.
  4231. - Gentleman's time has expired. Chair recognizes the gentleman
  4232. from Vermont, Mr. Welch, for four minutes.
  4233. - Gentleman yields back. Chair recognizes the gentleman from
  4234. Illinois, Mr. Kinzinger, for four minutes.
  4235. - The gentleman's time has expired. Chair recognizes the
  4236. gentleman from New Mexico, Mr. Lujan, for four minutes.
  4237. - The gentleman's time is expired.
  4238. - The chair now recognizes the gentleman from Virginia, Mr.
  4239. Griffith, for four minutes.
  4240. - Gentleman's time ...
  4241. - Gentleman's time is expired.
  4242. - Chair now recognizes the gentleman from New York, Mr. Tonko,
  4243. for four minutes.
  4244. - Gentleman's time has expired. Chair recognizes ...
  4245. - Sure. Without objection, of course. That's — that goes for all
  4246. members. Chair recognizes the gentleman from Florida, Mr.
  4247. Bilirakis, for four minutes.
  4248. - Gentleman's time ...
  4249. - ... gentleman's time has expired.
  4250. - Yes, sir.
  4251. - Without objection. The chair recognizes the gentlelady from New
  4252. York, Ms. Clarke, for four minutes.
  4253. - Gentlelady's time ...
  4254. - Gentlelady's time has expired. Chair recognizes the gentleman
  4255. from Ohio, Mr. Johnson, for four minutes.
  4256. - The gentleman's time's expired. The chair recognizes the
  4257. gentleman from Iowa, Mr. Loebsack.
  4258. - Without objection.
  4259. - The gentleman's time is expired.
  4260. - Chair recognizes the gentleman from Missouri, Mr. Long, for
  4261. four minutes.
  4262. - Gentleman's time ...
  4263. - ... gentleman's time's expired.
  4264. - Gentleman's time has expired.
  4265. - Well I — I'd tell you, I'd — if we could move on, just because
4266. we're going to run out of time for members down the dais to be able
  4267. to ask their questions ...
  4268. - I now recognize the gentleman from Oregon, Mr. Schrader, for
  4269. questions for four minutes.
  4270. - Gentleman's time has expired. And, just for our — our members
  4271. who haven't had a chance to ask questions, we will pause at 1:30
  4272. — well, we will have votes at 1:40. We will continue the hearing
  4273. after a — a brief pause, and we'll — we'll coordinate that.
  4274. We'll go now to Dr. Bucshon.
  4275. - The gentleman's time is expired. Chair recognizes the gentleman
  4276. from Massachusetts, Mr. Kennedy, for four minutes.
  4277. - Gentleman's time ...
  4278. - Gentleman's time has expired.
  4279. - Chair recognizes the gentleman from Texas, Mr. Flores, for four
  4280. minutes.
  4281. - Gentleman's ...
  4282. - Gentleman's time has expired. Chair recognizes the gentleman
  4283. from California for four minutes, Mr. Cardenas.
  4284. - Chairman — the gentleman's time.
  4285. - Sure.
  4286. - And, with that, we will recess for about five minutes, 10
  4287. minutes. We'll recess for 10 minutes and then resume the
  4288. hearing. (RECESS)
  4289. - All right, we're going to reconvene the Energy and Commerce
  4290. Committee, and we will go next to the gentlelady from Indiana,
  4291. Ms. Brooks, for four minutes to resume questioning.
  4292. - Sure.
  4293. - I appreciate that. Thank you. We go now to the gentleman from
  4294. California, Mr. Ruiz.
  4295. - Now go to gentleman from Oklahoma, Mr. Mullin, for four
  4296. minutes.
  4297. - The gentleman's time.
  4298. - Recognize now the gentleman from California for four minutes.
  4299. - Thank you. We'll go now to the gentleman from North Carolina,
  4300. Mr. Hudson, for four minutes.
  4301. - Gentleman's time's expired. We now go to the gentleman from New
  4302. York, Mr. Collins for four minutes.
  4303. - Okay. Now I think we go next in order to Mr. Walberg actually,
  4304. who was here when the gavel dropped. So we will go to Mr.
  4305. Walberg for four minutes.
  4306. - Gentleman's time ...
  4307. - Yes. Now recognize the gentlelady from California, Ms. Walters,
  4308. for four minutes.
  4309. - Gentle — gentlelady's time has expired. Recognize the
  4310. gentlelady from Michigan, Ms. Dingell for four minutes.
  4311. - Gentlelady's time has expired. Chair recognizes the gentleman
  4312. from Pennsylvania, Mr. Costello, for four minutes.
4313. - Gentleman yields back. We go now to the gentleman from Georgia, Mr.
  4314. Carter, for four minutes.
  4315. - Gentleman's ...
4316. - Gentleman's time has expired. Chair recognizes Mr. Duncan for
  4317. four minutes.
  4318. - Only by two minutes, did he come in late. (LAUGHTER)
4319. - And our final four minutes of questioning comes from Mr.
  4320. Cramer, North Dakota, former head of the Public Utility
  4321. Commission there. We welcome your comments. Go ahead.
  4322. - I suppose you don't want to hang around for another round of
  4323. questions? Just kidding. Mr. Zuckerberg ...
  4324. - Staff, several of them, just passed out behind you. You know,
  4325. on a serious note as we close, I would welcome your suggestions
  4326. of other technology CEOs we might benefit from hearing from in
  4327. the future for a hearing on these issues, as we look at net
4328. neutrality, as we look at privacy issues. These are all
  4329. important. They are very controversial. We're fully cognizant of
  4330. that. We want to get it right, and — and so we appreciate your
  4331. comments and — and testimony today. There are no other members
  4332. that haven't asked you questions, and we're not doing a second
  4333. round, so seeing that, I just want to thank you for being here.
  4334. I know we agreed to be respectful of your time. You have been
  4335. respectful of our questions, and we appreciate your answers and
  4336. your candor. As you know, some of our members weren't able to
  4337. ask all the questions they had, so they'll probably submit those
  4338. in — in writing, and we would — we would like to get answers to
4339. those back in a timely manner. I'd also ask that the
4340. following documents be submitted for the record by unanimous
  4341. consent: a letter from the American Civil Liberties Union, a
  4342. letter from NetChoice, a letter from the Vietnam Veterans of
4343. America, which I referenced in my opening remarks, a letter from
  4344. Public Knowledge, a letter and an FTC complaint from Electronic
  4345. Privacy Information Center, a letter from the Motion Picture
  4346. Association of America, a letter from ACT, the App Association,
  4347. a letter from the Committee for Justice, a letter from the
4348. Transatlantic Consumer Dialogue, a letter from the Civil
  4349. Society Groups, and a letter from the National Council of Negro
  4350. Women. Pursuant to committee rules, I remind members they have
  4351. 10 business days to submit additional questions for the record,
  4352. and I ask that the witness submit their responses within 10
  4353. business days upon receipt of those questions. Without
  4354. objections, our — our committee is now adjourned.
  4355. REP. FRED UPTON (R-MICH.)
  4356. - Thank you, Mr. Chairman, and welcome to the committee. A number
  4357. of times in the last day or two, you've indicated that, in fact,
  4358. you're now open to some type of regulation. And we know, of
  4359. course, that you're the dominant social media platform without
  4360. any true competitor, in all frankness. And you have hundreds, if
  4361. not thousands, of folks that are — would be required to help
  4362. navigate any type of regulatory environment. Some would argue
  4363. that a more regulatory environment might ultimately stifle new
4364. platforms and innovators — what some might describe as desperately
  4365. needed competition; i.e., regulatory complexity helps protect
  4366. those folks like you. It could create a harmful barrier to entry
  4367. for some start-ups, particularly ones that might want to compete
  4368. with you. So should we policymakers up here be more focused on
  4369. the needs of start-ups, over large incumbents? And what kind of
  4370. policy regulation — regulatory environment would you want,
  4371. instead of managing, maybe, a Fortune 500 company, if you were
4372. launching a start-up to take on the big guy?
  4373. REP. MIKE DOYLE (D-PA.)
  4374. - Thank you, Mr. Chairman. Mr. Zuckerberg, welcome. Facebook uses
  4375. some of the most advanced data processing techniques and
  4376. technologies on the planet, correct?
  4377. REP. PETER WELCH (D-VT.)
  4378. - Thank you, Mr. Chairman. Mr. Zuckerberg, you acknowledge
  4379. candidly that Facebook made a mistake. You did an analysis of
  4380. how it happened. You've promised action. We're at the point
  4381. where the action will speak much louder than the words. But, Mr.
  4382. Chairman, this Congress has made a mistake. This event that
  4383. happened, whether it was Facebook or some other platform, was
  4384. foreseeable and inevitable. And we did nothing about it.
  4385. Congresswoman Blackburn and I had a — a group, a privacy working
  4386. group, six meetings with many of the industry players. There was
  4387. an acknowledgment on both sides that privacy was not being
  4388. protected, that there was no reasonable safeguard for Americans'
  4389. privacy. But there was an inability to come to a conclusion. So
  4390. we also have an obligation. And, in an effort to move forward,
  4391. Mr. Zuckerberg, I've framed some questions that hopefully will
  4392. allow a reasonable yes or no answer to see if there's some
  4393. common ground to achieve the goal you assert you have, and we
  4394. certainly have: the obligation to protect the privacy of
  4395. American consumers. First, do you believe that consumers have a
  4396. right to know and control what personal data companies collect
  4397. from them?
  4398. RUSH
  4399. - Mr. Zuckerberg, you should be commended that Facebook has grown
  4400. so big, so fast. It is no longer the company that you started in
4401. your dorm room. Instead, it's one of the great American success
  4402. stories. That much influence comes with enormous social
  4403. responsibility, on which you have failed to act and to protect
  4404. and to consider. Shouldn't Facebook, by default, protect users'
  4405. information? Why is the onus on the user to opt in to privacy
  4406. and security settings?
  4407. - All right.
  4408. - Mr. Zuckerberg, I only have a few more seconds. In November
4409. 2017, (inaudible) reported that Facebook still allowed
4410. housing and work advertisements to systematically exclude
4411. specific racial groups, an explicitly
  4412. prohibited practice. This is just one example where Facebook has
4413. allowed race to improperly play a role. What
  4414. has Facebook done, and what are you doing, to ensure that you
  4415. are — that your targeted advertisements and other components of
  4416. your platform are in compliance with federal laws such as the
  4417. Civil Rights Act of 1968?
  4418. - When did you do that?
  4419. DINGELL
  4420. - Thank you, Mr. Chairman. Mr. Zuckerberg, thank you for your
  4421. patience. I am a daily Facebook user. Much to my staff's
  4422. distress, I do it myself. And because we need a little humor,
  4423. I'm even married to a 91-year-old man that's thinking of
  4424. Twitter. But I know Facebook's value. I've used it for a long
  4425. time. But with that value also comes obligation. We've all been
  4426. sitting here for more than four hours. Some things are striking
  4427. during this conversation. As CEO, you didn't know some key
  4428. facts. You didn't know about major court cases regarding your
  4429. privacy policies against your company. You didn't know that the
  4430. FTC doesn't have fining authority and that Facebook could not
  4431. have received fines for the 2011 consent order. You didn't know
  4432. what a shadow profile was. You didn't know how many apps you
  4433. need to audit. You did not know how many other firms have been
  4434. sold data by Dr. Kogan other than Cambridge Analytica and Eunoia
  4435. Technologies, even though you were asked that question
  4436. yesterday. And yes, we were all paying attention yesterday. You
  4437. don't even know all the kinds of information Facebook is
  4438. collecting from its own users. Here's what I do know. You have
  4439. trackers all over the Web.
  4440. - On practically every website you go to, we all see the Facebook
  4441. Like or Facebook Share buttons. And with the Facebook pixel,
  4442. people browsing the Internet may not even see that Facebook
  4443. logo. It doesn't matter whether you have a Facebook account.
  4444. Through those tools, Facebook is able to collect information
4445. from all of us. So I want to ask you, how many Facebook Like
4446. buttons are there on non-Facebook Web pages?
4447. - Is the number over a hundred million?
  4448. - How many share buttons are there on non-Facebook Web pages?
4449. - And do we think that's likely over 100 million? How many chunks
4450. of Facebook pixel code are there on non-Facebook Web pages?
4451. - Can you commit to getting that to the committee? The European
4452. Union is asking for 72 hours on transparency. Do you think we
4453. could get that back to the committee in 72 hours?
  4454. - I know you're still reviewing, but do you know now whether
  4455. there are other fourth parties that had access to the data from
  4456. someone other than Dr. Kogan? Or is this something we're going
  4457. to find out in a press release down the road? I think what
  4458. worries all of us and you've heard it today is it has taken
  4459. almost three years to hear about that. And I am convinced that
  4460. there are other people out there.
  4461. - And you will make it public quickly? Not three years.
4462. - So I just — I'm going to conclude, because my time's almost up,
  4463. that I worry that when I hear companies value our privacy, it's
  4464. meant in monetary terms, not the moral obligation to protect it.
4465. Data protection and privacy are like clean air and clean water;
  4466. there need to be clear rules of the road.
  4467. WALBERG
  4468. - Well, thank you, Mr. Chairman. I appreciate that. And I — Mr.
  4469. Zuckerberg, I appreciate you being here as well. It has been
  4470. interesting to listen to all of the comments from both sides of
  4471. the aisle. To get an idea of the breadth, length, depth, the
  4472. vastness of our World Wide Web, social media and more
  4473. specifically Facebook. I want to ask three starter questions.
  4474. Don't think they'll take a long answer but I'll let you — let
  4475. you answer. Earlier you indicated that there were bad actors,
  4476. and that triggered your platform policy changes in 2014, but you
4477. didn't identify who those bad actors were. Who were they?
4478. - Secondly, is there any way, any way, that Facebook can, with any
4479. level of certainty, assure Facebook users that every single app
4480. on its platform is not misusing their data?
  4481. - And I think that — I think that's an adequate answer. It's a
  4482. truthful answer. Can you assure me that ads and content are not
  4483. being denied based on particular views?
  4484. - Let me — let me ...
  4485. - And I wanted to bring up a — a screen grab that we had, again
4486. going back to Representative Upton earlier — it was his
4487. constituent, who was my legislative director for a time. It was
4488. his campaign ad — he was going to boost his post, and he was
4489. rejected. It was rejected as — it said here, the ad wasn't
4490. approved because it doesn't follow advertising
4491. policies: we don't allow ads that contain shocking,
  4492. disrespectful or sensational content, including ads that depict
  4493. violence or threats of violence. Now, as I read that, I also
  4494. know that you have since — or Facebook has since declared no,
  4495. that was a mistake; an algorithm problem that went on there. But
  4496. that's our concern that we have, that it wouldn't be because he
  4497. had his picture with a veteran, it wouldn't be because he wanted
4498. to reduce spending, but pro-life, Second Amendment — those things,
4499. and conservative — that causes us some concerns. So I guess what
  4500. I'm saying here, I believe that we ought to have a light touch
  4501. in regulation. And when I hear some of my friends on the other
  4502. side of the aisle decry what's going on now, when they were
  4503. high-fiving what took place in 2012 with President Obama and
  4504. what he was capable of doing in bringing in and grabbing data
  4505. for use in a political way, I would say the best thing
  4506. we can do is have these light-of-day hearings, let you self-
  4507. regulate as much as possible with a light touch coming from us
  4508. but recognizing that, in the end, your Facebook users or subscribers
  4509. are going to tell you what you need to do ...
  4510. - So thank you for your time and thank you for the time you've
  4511. given me.
  4512. DUNCAN
  4513. - Thank you Mr. Chairman. Usually I'm last, but today I think we
  4514. have one behind me that came in late. Mr. Zuckerberg, I want to
  4515. ...
  4516. - I want to thank you for all the work you've done. And I want to
  4517. let you know that I've been on Facebook since 2007. Started as a
  4518. state legislator, used Facebook to communicate with my
  4519. constituents. And it has been an invaluable tool for me in
  4520. communicating. We can actually address multiple issues in real
  4521. time as we deal with them here in Congress, answer questions. It's
  4522. almost like a town hall in real time. I also want to tell you
  4523. that your staff here at the Governmental Affairs Office, Chris
  4524. Herndon and others do a fabulous job in keeping us informed. So
  4525. I want to thank you for that. Before this hearing when we heard
  4526. about it, we asked our constituents and our friends on Facebook,
  4527. what would they want me to ask you? And the main response was
  4528. addressing the perceived, and in many instances confirmed, bias
  4529. and viewpoint discrimination against Christians and
  4530. conservatives on your platform. Today, listening to this, I
  4531. think the two main issues are user privacy and censorship.
  4532. Constitution of the United States and the First Amendment says,
  4533. “Congress shall make no law respecting an establishment of
  4534. religion, nor prohibiting the free exercise thereof. Nor will
  4535. they abridge the freedom of speech or of the press, the right of
  4536. people to assemble or address the Congress for redress of
  4537. grievances — or petition Congress for redress of grievances.”
  4538. I've got a copy of the Constitution I want to give you at the
  4539. end of this hearing. The reason I say all that, this is maybe a
  4540. rhetorical question but why not have a community standard for
  4541. free speech and free exercise of religion that is simply a
  4542. mirror of the First Amendment, with algorithms that are viewed —
  4543. that have a viewpoint that is neutral? Why not do that?
  4544. - And I appreciate — I appreciate that answer. You're right about
  4545. propaganda and other issues there. And I believe the
  4546. Constitution generally applies to government and says that
  4547. Congress shall make no law respecting — talks about religion.
  4548. And then we don't want to abridge the freedom of speech or the
  4549. press. But the standard has been applied to private businesses,
  4550. whether those are newspapers or other media platforms. And I
  4551. would argue that social media has now become a media platform to
  4552. be considered in a lot of ways the same as other press media. So
  4553. I think the First Amendment probably does apply and will apply.
  4554. What will you do — and let me ask you this, what will you do to
  4555. restore the First Amendment rights of Facebook users and ensure
  4556. that all users are treated equally, regardless of whether
  4557. they're conservative, moderate, liberal or whatnot?
  4558. - In the interest of time, conservatives are the ones that raise
  4559. the awareness that their content has been pulled. I don't see
  4560. the same awareness being raised by liberal organizations,
  4561. liberal candidates or liberal policy statements. So I think —
  4562. and I think you've been made aware of this over the last two
  4563. days, you probably need to go back and make sure that those things
  4564. are treated equally. And I would appreciate it if you do that. Again,
  4565. I appreciate the platform, I appreciate the work that you do.
  4566. And we stand willing and able to help you here in Congress,
  4567. because Facebook is an invaluable part of what we do and how we
  4568. communicate, so thanks for being here.
  4569. - I yield back.
  4570. UPTON
  4571. - And, to follow up a question with — that Mr. Barton asked about
  4572. Diamond and Silk — I don't know whether you know about this
  4573. particular case — I have a former state rep who's running for
  4574. state senate. He's the former Michigan Lottery commissioner, so
  4575. he's a guy of — of fairly good political prominence. He is a —
  4576. he announced for state senate just in the last week, and he had
  4577. what I thought was a rather positive announcement. It's — and
  4578. I'll read to you precisely what it was. “I'm proud to announce
  4579. my candidacy for state senate. Lansing needs conservative west
  4580. Michigan values, and, as our next state senator, I will work to
  4581. strengthen our economy, limit government, lower our auto
  4582. insurance rates, balance the budget, stop sanctuary cities, pay
  4583. down government debt, be a pro-life, pro-2nd-Amendment
  4584. lawmaker.” And it was rejected. And the response from you all
  4585. was it wasn't approved because it doesn't follow our advertising
  4586. policies. We don't allow ads that contain shocking,
  4587. disrespectful or sensational content, including ads that depict
  4588. violence or threats of violence. I'm not sure where the threat
  4589. was, based on what he tried to post.
  4590. - Okay.
  4591. - Okay. Thank you.
  4592. REP. YVETTE D. CLARKE (D-N.Y.)
  4593. - I thank you, Mr. Chairman. And thank you for coming before us,
  4594. Mr. Zuckerman (sic). Today, I want to take the opportunity to
  4595. represent the concerns of the newly formed Tech Accountability
  4596. Caucus, in which I serve as a co-chair with my colleagues,
  4597. Representative Robin Kelly, Congressman Emanuel Cleaver and
  4598. Congresswoman Bonnie Watson Coleman, but, most importantly,
  4599. people in our country and around the globe who are in vulnerable
  4600. populations, including those who look just like me. My first
  4601. question to you is, as you may be aware, there have been
  4602. numerous media reports about how more than 3,000 Russian ads
  4603. were bought on Facebook to incite racial and religious division
  4604. and chaos in the U.S. during the 2016 election. Those ads
  4605. specifically characterized and weaponized African American
  4606. groups like Black Lives Matter, with ads suggesting, through
  4607. propaganda — or fake news, as people call it these days — that
  4608. they were a rising threat. Do you think that the lack of
  4609. diverse, culturally competent personnel in your C-suite and
  4610. throughout your organization, given that your company did not
  4611. detect, disrupt or investigate these claims, is a problem in
  4612. this regard?
  4613. LATTA
  4614. - None at all?
  4615. - Okay. Let me ask this question. You know, it's a little bit
  4616. that's been going on — when you made your opening statement in
  4617. regards to what you'd like to see done with the — with the
  4618. company and — and steps going — moving forward, there's been a
  4619. couple of questions, you know, about the fact that you're going to be
  4620. investigating the apps. How many apps are there out there that
  4621. you'd have to investigate?
  4622. - Just to follow up on that, then, how long would it take to then
  4623. investigate each of those apps, once you're doing that? Because,
  4624. again, when you're talking about tens of thousands and you're
  4625. going through that entire process, then how long will it take to
  4626. go through each one of those apps?
  4627. - Okay.
  4628. - Okay. We were talking about audits, as there have been some
  4629. questions about this. On the audits, in 2011, Facebook signed —
  4630. it did sign that consent order with the Federal Trade Commission
  4631. for the privacy violations. Part of that consent order requires
  4632. Facebook to submit third-party privacy audits to the FTC every
  4633. two years. First, are you aware of the audits? And, second, why
  4634. didn't the audits disclose or find these issues with the
  4635. developer's access to users' data?
  4636. - Let me — I'm about out of time here. Are you aware that
  4637. Facebook did provide the auditors with all the information they
  4638. requested for — when doing the FTC audits?
  4639. - Yeah. Did we — did Facebook provide the auditors with all the
  4640. information they requested when they were preparing the audit for
  4641. the FTC?
  4642. - Okay. So — but all the information is provided. And were you
  4643. ever personally asked to provide information or feedback in
  4644. these audits to the FTC?
  4645. - Okay. Mr. Chair, my time's expired and I yield back.
  4646. REP. ROBERT E. LATTA (R-OHIO)
  4647. - Well thank you, Mr. Chairman. And — and, Mr. Zuckerberg, thanks
  4648. very much for being with us today. First question I have is, can
  4649. you tell the Facebook users that the Russians and the Chinese
  4650. have not used the same methods as other third parties to scrape
  4651. the entire social network for their gain?
  4652. WALTERS
  4653. - Thank you. Thank you, Mr. Chairman. And thank you, Mr.
  4654. Zuckerberg, for being here. One of my biggest concerns is the
  4655. misuse of consumer data and what controls users have over their
  4656. information. You have indicated that Facebook users have
  4657. granular control over their own contact — content and who can
  4658. see it. As you can see on the screen, on the left is a
  4659. screenshot of the on-off choice for apps which must be on for
  4660. users to use apps that require a Facebook login and which allows
  4661. apps to collect your information. On the right is a screenshot
  4662. of what a user sees when they want to change the privacy
  4663. settings on a post, photo or other content. Same account, same
  4664. user. But which control governs? The app platform access or the
  4665. user's decision as to who they want to see a particular post?
  4666. - So, which — which app governs, okay? Or which control governs?
  4667. The app platform access or the user's decision as to who they
  4668. want to see a particular post? So if you look up there on the
  4669. screen.
  4670. - Okay, do you think that the average Facebook user understands
  4671. that that is how it works? And how would they find this out?
  4672. - Okay, so these user control options are in different locations.
  4673. And it seems to me that putting all privacy control options in a
  4674. single location would be more user-friendly. Why aren't they in
  4675. the same location?
  4676. - Okay. California has been heralded by many on this committee
  4677. for its privacy initiatives. Given that you and other major tech
  4678. companies are in California and we are still experiencing
  4679. privacy issues, how do you square the two?
  4680. - So, given that you and other major tech companies are in
  4681. California and we're still experiencing privacy issues, how do
  4682. you square the two?
  4683. - California's been heralded by many in this committee for its
  4684. privacy initiatives.
  4685. REP. GUS BILIRAKIS (R-FLA.)
  4686. - Thank you. Thank you, Mr. Chairman — appreciate it. And thanks
  4687. for your testimony, Mr. Zuckerberg. Well, first of all, I wanted
  4688. to follow up with Mr. — Mr. McKinley's testimony. This is bad
  4689. stuff, Mr. Zuckerberg, with regard to the illegal online
  4690. pharmacies. When are the — those ads — I mean, when are you
  4691. going to take those off? I think we need an answer to that. I
  4692. think they need to get off — we need to get these off as soon as
  4693. possible. Can you give us an answer, a clear answer as to when
  4694. these pharmacies — we have an epidemic here with regard to the
  4695. opioids. I think we're owed a clear answer, a definitive answer
  4696. as to when these ads will be off — offline.
  4697. LONG
  4698. - It's coincidental. The timing was the same, right? Just
  4699. coincidental.
  4700. - You put up pictures of two women, and decided which one was the
  4701. better — more attractive of the two, is that right?
  4702. - Okay. Okay, I just — but, from that beginning — whether it was
  4703. actually the beginning of Facebook or not — you've come a long
  4704. way. Jan Schakowsky — Congresswoman Schakowsky, this morning,
  4705. said self-regulation simply does not work. Mr. Butterfield,
  4706. Representative Butterfield, said that you need more African
  4707. American inclusion on your board of directors. If I was you — a
  4708. little bit of advice — Congress is good at two things: doing
  4709. nothing, and overreacting. So far, we've done nothing on
  4710. Facebook. Since your inception in that Harvard dorm room, many
  4711. years ago, we've done nothing on Facebook. We're getting ready
  4712. to overreact. So take that as just a shot across the bow,
  4713. warning to you. You've got a good outfit there, on your front
  4714. row, behind you, that — they're very bright folks. You're
  4715. Harvard-educated. I have a Yale hat that cost me $160,000 —
  4716. that's as close as I ever got to an Ivy League school. But I'd
  4717. like to show you, right now, a — a little picture here. You
  4718. recognize these folks?
  4719. - Who are they?
  4720. - That is Diamond and Silk, two biological sisters from North
  4721. Carolina. I might point out they're African American. And their
  4722. content was deemed by your folks to be unsafe. So, you know, I
  4723. don't know what type of picture this is — if it was taken in a
  4724. police station, or what, in a lineup — but apparently they've
  4725. been deemed unsafe. Diamond and Silk have a question for you,
  4726. and that question is, what is unsafe about two black women
  4727. supporting President Donald J. Trump?
  4728. - ... you have 20,000 employees, as you said, to check content.
  4729. And I would suggest, as good as you are with analytics, that
  4730. those 20,000 people use some analytical research and see how
  4731. many conservative websites have been pulled down, and how many
  4732. liberal websites. One of our talk show hosts at home — Nick Reed
  4733. — this morning, on the radio, said that, if Diamond and Silk
  4734. were liberal, they'd be on the late-night talk show circuit,
  4735. back and forth. They're humorous, they have their opinion, not
  4736. that you have to agree or that I have to agree — to agree —
  4737. don't agree — with them. But the fact that they're conservative
  4738. — and I would just remember — if you don't remember anything
  4739. else from this hearing here today, remember we do nothing and we
  4740. overreact.
  4741. - And we're getting ready to overreact. So I would suggest you go
  4742. home and review all these other things people have accused you
  4743. of today, get with your good team — they're behind you ...
  4744. - ... you're the guy to fix this. We're not. You need to save
  4745. your ship. Thank you.
  4746. - I didn't say it was an overreach. All I said was that — I was
  4747. just letting — reminding with several ...
  4748. WELCH
  4749. - Do you believe that consumers have a right to control how and
  4750. with whom their personal information is shared with third
  4751. parties?
  4752. - And do you believe that consumers have a right to secure and
  4753. responsible handling of their personal data?
  4754. - And do you believe that consumers should be able to easily
  4755. place limits on the personal data that companies collect and
  4756. retain?
  4757. - Okay. And do you believe that consumers should be able to
  4758. correct or delete inaccurate personal data that companies have
  4759. obtained?
  4760. - Well, then, let's get — you get back to us with specifics on
  4761. that. I think they do have that right. Do you believe that
  4762. consumers should be able to have their data deleted immediately
  4763. from Facebook when they stop using the service?
  4764. - Good. And do you believe that the Federal Trade Commission, or
  4765. another properly resourced governmental agency with rulemaking
  4766. authority, should be able to determine on a regular basis what
  4767. is considered personal information, to provide certainty for
  4768. consumers and companies what information needs to be protected
  4769. most tightly?
  4770. - There's not a big discussion here. Who gets the final say? Is
  4771. it the private market companies, like yours? Or is there a
  4772. governmental function here that defines what privacy is?
  4773. - All right. Let me ask you this. I've appreciated your
  4774. testimony. Will you work with this committee to help put us — to
  4775. help the U.S. put in place our own privacy regulation that
  4776. private — prioritizes consumers' right to privacy, just as the
  4777. E.U. has done?
  4778. - All right. And you have indicated that Facebook has not always
  4779. protected the privacy of its users throughout the company's
  4780. history. And it seems, though, from your answers, that consumers
  4781. — you agree that consumers do have a fundamental right to
  4782. privacy that empowers them to control the collection, the use,
  4783. the sharing of their personal information online. And, Mr.
  4784. Chairman — and thank you. Mr. Chairman, privacy cannot be based
  4785. just on company policies, whether it's Facebook or any other
  4786. company. There has to be a willingness on the part of this
  4787. Congress to step up and provide policy protection to the privacy
  4788. rights of every American consumer. I yield back.
  4789. HUDSON
  4790. - Thank you. Thank you, Mr. Zuckerberg, for being here. This is a
  4791. long day. You're here voluntarily, and we sure appreciate you —
  4792. you being here. I can say from my own experience, I've hosted
  4793. two events with Facebook in my district in North Carolina
  4794. working with small business and finding ways they can increase
  4795. their customer base on Facebook, and it's been very beneficial
  4796. to us, so I thank you for that. I do want to pivot slightly
  4797. and frame the discussion in another light for my question. One of
  4798. the greatest honors I have is representing the men and women of
  4799. Fort Bragg, epicenter of the universe, home of the airborne and
  4800. special operations, which you visited last year.
  4801. - Very well received, so you understand that due to the sensitive
  4802. nature of some of the operations these soldiers conduct, many
  4803. are discouraged or even prohibited from having a social media
  4804. presence. However, there are others who — who still have
  4805. profiles or some who may have deleted their profiles upon
  4806. entering military service. Many have family members who have
  4807. Facebook profiles. And as we've learned, each one of these
  4808. user's information may have been shared without their consent.
  4809. There's no way that Facebook can guarantee the safety of this
  4810. information on another company's server that they sell this
  4811. information. If private information can be gathered by apps
  4812. without explicit consent of the user, they're almost asking to
  4813. be hacked. Are you aware of the national security concerns that
  4814. would come from allowing those who seek to harm our nation
  4815. access to information such as the geographical location of
  4816. members of our Armed Services? Is this something that you're —
  4817. you're looking at?
  4818. - Great, well I'd love to follow up with you on that. It's been
  4819. said many times here that you refer to Facebook as a platform of
  4820. all ideas — or a platform for all ideas. I know you've heard
  4821. from many yesterday and today about concerns regarding Facebook
  4822. censorship of content, particularly content that may promote
  4823. Christian beliefs or conservative political beliefs. I have to
  4824. bring up Diamond & Silk again because they're actually from my
  4825. district, but — but I think you've addressed these concerns, but
  4826. I think it's also become very apparent, and I hope it's become
  4827. very apparent to you, that this is a very serious concern. I
  4828. actually asked on my Facebook page for my constituents to give
  4829. me ideas of things they'd like for me to ask you today, and the
  4830. most common question was about personal privacy. So this is
  4831. something that I — I think there is an issue here; there are
  4832. issues that your company has, in terms of trust with consumers,
  4833. that I think you need to deal with. I think you recognize that based
  4834. on your testimony today. But my question to you is, what is the
  4835. standard that Facebook uses to determine what is offensive or
  4836. controversial, and how has that standard been applied across
  4837. Facebook's platform?
  4838. - That's probably the most difficult to define, so I guess my
  4839. question is how do you apply — what standards do you apply to
  4840. try to determine what's hate speech versus what's just speech
  4841. you may disagree with?
  4842. - I'm just running out of time here. I hate to cut you off. But
  4843. let me just say that, you know, based on the statistics Mr.
  4844. Scalise shared and the anecdotes we can provide you, it seems
  4845. like there's still a challenge when it comes to conservative
  4846. (inaudible), and I hope you will address that.
  4847. - With that, Mr. Chairman, I'll stop talking.
  4848. BURGESS
  4849. - And so I'm going to be submitting some questions for the record
  4850. that are referencing these articles. One is “Friended: How the
  4851. Obama Campaign Connected with Young Voters,” by Michael Scherer;
  4852. “We Already Know How to Protect Ourselves from Facebook,” and I
  4853. hope I get this name right — Zeynep Tufekci; and “It's Time to
  4854. Break Up Facebook,” by Eric Wilson, who, in the interest of full
  4855. disclosure ...
  4856. - ... was a former staffer. And I will be referencing those
  4857. articles in — in some written questions. I consulted my
  4858. technology guru, Scott Adams, in the form of Dilbert, going back
  4859. 21 years ago. And, when you took the shrink-wrap off of a piece
  4860. of software that you bought, you were automatically agreeing to
  4861. be bound by the terms and conditions. So we've gone a long way
  4862. from taking the shrimp wrap — shrink wrap off of a — off of an
  4863. app. But I don't know that things have changed so much. And, I
  4864. guess, does Facebook have a position — a — a position that you
  4865. recommend for elements of a company's terms and conditions that
  4866. you encourage consumers to look at before they click on the
  4867. acceptance?
  4868. - Let me just ask you, because we're going to run short on time,
  4869. do you have — have you laid out for people what it — would be
  4870. indicative of a good actor, versus a less-than-good actor, in
  4871. someone who's developed a — one of these applications?
  4872. - Is the average consumer able to determine what elements would
  4873. indicate poor or weak consumer protections, just by their
  4874. evaluation of the terms and conditions? Do you think that's
  4875. possible?
  4876. - Well, can you — can someone — can the average person — the
  4877. average layperson look at the terms and conditions and make the
  4878. evaluation, “Is this a strong enough protection for me to enter
  4879. into this arrangement?” Look, I'm as bad as anyone else. I see
  4880. an app, I want it, I download it, I breeze through the stuff.
  4881. Just take me to the — to the good stuff in the app. But, if a
  4882. consumer wanted to know, could they know?
  4883. - Yeah, let me move on to something else. Mr. Pallone brought up
  4884. the issue of — he wanted to see more regulation. We actually
  4885. passed a bill through this committee last Congress dealing with
  4886. data breach notification — not so much for Facebook, but for the
  4887. credit card breaches — a good bill. Many of the friends on the
  4888. other side of the dais voted against it. But it was Ms.
  4889. Blackburn's bill, and I think it's one we should consider again,
  4890. in light of what is going on here. But you also signed a consent
  4891. decree back in 2011. And, you know, when I read through that
  4892. consent decree, it's — it's pretty explicit. And there is a
  4893. significant fine of $40,000 per violation, per day. And, if
  4894. you've got 2 billion users, you can see how those fines would
  4895. mount up pretty quickly. So, in the course of your audit, are
  4896. you — are you extrapolating data for the people at the Federal
  4897. Trade Commission for that — the terms and conditions of the
  4898. consent decree?
  4899. - Well, you're — you've said — you've referenced there are audits
  4900. that are ongoing. Are you making that information from those
  4901. audits available to our friends at the — at the agency, at the
  4902. Federal Trade Commission?
  4903. REP. CATHY MCMORRIS RODGERS (R-WASH.)
  4904. - Yeah, turn on the — thank you. And thank you, Mr. Zuckerberg,
  4905. for joining us. Today's hearing is clearly timely. There are a number of
  4906. extremely important questions Americans have about Facebook,
  4907. including questions about safety and security of their data,
  4908. about the process by which their data is made available to third
  4909. parties, about what Facebook is doing to protect consumer
  4910. privacy as we move forward. But one of the issues that is
  4911. concerning me and I'd like to dig a little deeper into is how
  4912. Facebook treats content on its platform. So, Mr. Zuckerberg,
  4913. given the extensive reach of Facebook and its widespread use as
  4914. a tool of public expression, do you think Facebook has a unique
  4915. responsibility to ensure that it has clear standards regarding
  4916. the censorship of content on its platform? And do you think
  4917. Facebook adequately and clearly defines what these standards are
  4918. for its users?