- 1
- 00:00:15,580 --> 00:00:25,547
- ♪♪
- 2
- 00:00:25,590 --> 00:00:35,035
- ♪♪
- 3
- 00:00:35,078 --> 00:00:41,302
- What we're on the brink of is
- a world of increasingly intense,
- 4
- 00:00:41,345 --> 00:00:45,219
- sophisticated
- artificial intelligence.
- 5
- 00:00:45,262 --> 00:00:48,396
- Man: Technology is evolving
- so much faster than our society
- 6
- 00:00:48,439 --> 00:00:51,181
- has the ability
- to protect us as citizens.
- 7
- 00:00:51,486 --> 00:00:55,707
- The robots are coming, and they
- will destroy our livelihoods.
- 8
- 00:00:55,751 --> 00:01:01,844
- ♪♪
- 9
- 00:01:01,887 --> 00:01:04,238
- You have a networked
- intelligence that watches us,
- 10
- 00:01:04,281 --> 00:01:08,590
- knows everything about us,
- and begins to try to change us.
- 11
- 00:01:08,633 --> 00:01:12,768
- Man #2: Twitter has become the
- world's number-one news site.
- 12
- 00:01:12,811 --> 00:01:15,205
- Man #3:
- Technology is never good or bad.
- 13
- 00:01:15,249 --> 00:01:18,948
- It's what we do
- with the technology.
- 14
- 00:01:18,991 --> 00:01:22,734
- Eventually, millions of people
- are gonna be thrown out of jobs
- 15
- 00:01:22,778 --> 00:01:25,737
- because their skills
- are going to be obsolete.
- 16
- 00:01:25,781 --> 00:01:27,435
- Woman: Mass unemployment...
- 17
- 00:01:27,478 --> 00:01:32,527
- greater inequalities,
- even social unrest.
- 18
- 00:01:32,570 --> 00:01:35,530
- Man #4: Regardless of whether
- to be afraid or not afraid,
- 19
- 00:01:35,573 --> 00:01:38,185
- the change is coming,
- and nobody can stop it.
- 20
- 00:01:38,228 --> 00:01:44,539
- ♪♪
- 21
- 00:01:44,582 --> 00:01:46,323
- Man #5: We've invested
- huge amounts of money,
- 22
- 00:01:46,367 --> 00:01:49,283
- and so it stands to reason
- that the military,
- 23
- 00:01:49,326 --> 00:01:50,893
- with their own desires,
- 24
- 00:01:50,936 --> 00:01:53,330
- are gonna start to use
- these technologies.
- 25
- 00:01:53,374 --> 00:01:55,419
- Man #6:
- Autonomous weapons systems
- 26
- 00:01:55,463 --> 00:01:57,552
- could lead to a global arms race
- 27
- 00:01:57,595 --> 00:02:00,032
- to rival the Nuclear Era.
- 28
- 00:02:00,076 --> 00:02:02,296
- ♪♪
- 29
- 00:02:02,339 --> 00:02:04,036
- Man #7:
- We know what the answer is.
- 30
- 00:02:04,080 --> 00:02:05,429
- They'll eventually
- be killing us.
- 31
- 00:02:05,473 --> 00:02:10,782
- ♪♪
- 32
- 00:02:10,826 --> 00:02:12,349
- Man #8:
- These technology leaps
- 33
- 00:02:12,393 --> 00:02:15,874
- are gonna yield
- incredible miracles...
- 34
- 00:02:15,918 --> 00:02:18,181
- and incredible horrors.
- 35
- 00:02:18,225 --> 00:02:24,231
- ♪♪
- 36
- 00:02:24,274 --> 00:02:29,323
- Man #9: We created it,
- so I think, as we move forward,
- 37
- 00:02:29,366 --> 00:02:33,762
- this intelligence
- will contain parts of us.
- 38
- 00:02:33,805 --> 00:02:35,981
- And I think the question is --
- 39
- 00:02:36,025 --> 00:02:39,463
- Will it contain
- the good parts...
- 40
- 00:02:39,507 --> 00:02:41,378
- or the bad parts?
- 41
- 00:02:41,422 --> 00:02:47,079
- ♪♪
- 42
- 00:02:57,742 --> 00:03:04,793
- ♪♪
- 43
- 00:03:04,836 --> 00:03:08,840
- Sarah: The survivors
- called the war "Judgment Day."
- 44
- 00:03:08,884 --> 00:03:12,583
- They lived only to face
- a new nightmare --
- 45
- 00:03:12,627 --> 00:03:14,019
- the war against the machines.
- 46
- 00:03:14,063 --> 00:03:15,412
- Aah!
- 47
- 00:03:15,456 --> 00:03:18,023
- Nolan: I think
- we've completely fucked this up.
- 48
- 00:03:18,067 --> 00:03:21,549
- I think Hollywood has managed
- to inoculate the general public
- 49
- 00:03:21,592 --> 00:03:24,247
- against this question --
- 50
- 00:03:24,291 --> 00:03:28,251
- the idea of machines
- that will take over the world.
- 51
- 00:03:28,295 --> 00:03:30,645
- Open the pod bay doors, HAL.
- 52
- 00:03:30,688 --> 00:03:33,561
- I'm sorry, Dave.
- 53
- 00:03:33,604 --> 00:03:35,911
- I'm afraid I can't do that.
- 54
- 00:03:37,434 --> 00:03:38,696
- HAL?
- 55
- 00:03:38,740 --> 00:03:40,437
- Nolan:
- We've cried wolf enough times...
- 56
- 00:03:40,481 --> 00:03:42,483
- HAL?
- ...that the public
- has stopped paying attention,
- 57
- 00:03:42,526 --> 00:03:43,962
- because it feels like
- science fiction.
- 58
- 00:03:44,006 --> 00:03:45,486
- Even sitting here talking
- about it right now,
- 59
- 00:03:45,529 --> 00:03:48,228
- it feels a little bit silly,
- a little bit like,
- 60
- 00:03:48,271 --> 00:03:51,666
- "Oh, this is an artifact
- of some cheeseball movie."
- 61
- 00:03:51,709 --> 00:03:56,584
- The WOPR spends all its time
- thinking about World War III.
- 62
- 00:03:56,627 --> 00:03:59,064
- But it's not.
- 63
- 00:03:59,108 --> 00:04:02,111
- The general public is about
- to get blindsided by this.
- 64
- 00:04:02,154 --> 00:04:11,512
- ♪♪
- 65
- 00:04:11,555 --> 00:04:13,514
- As a society and as individuals,
- 66
- 00:04:13,557 --> 00:04:18,954
- we're increasingly surrounded
- by machine intelligence.
- 67
- 00:04:18,997 --> 00:04:22,653
- We carry this pocket device
- in the palm of our hand
- 68
- 00:04:22,697 --> 00:04:24,829
- that we use to make
- a striking array
- 69
- 00:04:24,873 --> 00:04:26,831
- of life decisions right now,
- 70
- 00:04:26,875 --> 00:04:29,007
- aided by a set
- of distant algorithms
- 71
- 00:04:29,051 --> 00:04:30,748
- that we have no understanding of.
- 72
- 00:04:30,792 --> 00:04:34,143
- ♪♪
- 73
- 00:04:34,186 --> 00:04:36,537
- We're already pretty jaded
- about the idea
- 74
- 00:04:36,580 --> 00:04:37,929
- that we can talk to our phone,
- 75
- 00:04:37,973 --> 00:04:40,062
- and it mostly understands us.
- 76
- 00:04:40,105 --> 00:04:42,456
- Woman: I found quite a number
- of action films.
- 77
- 00:04:42,499 --> 00:04:44,327
- Five years ago -- no way.
- 78
- 00:04:44,371 --> 00:04:47,678
- Markoff: Robotics.
- Machines that see and speak...
- 79
- 00:04:47,722 --> 00:04:48,897
- Woman: Hi, there.
- ...and listen.
- 80
- 00:04:48,940 --> 00:04:50,202
- All that's real now.
- 81
- 00:04:50,246 --> 00:04:51,639
- And these technologies
- 82
- 00:04:51,682 --> 00:04:55,686
- are gonna fundamentally
- change our society.
- 83
- 00:04:55,730 --> 00:05:00,212
- Thrun: Now we have this great
- movement of self-driving cars.
- 84
- 00:05:00,256 --> 00:05:01,953
- Driving a car autonomously
- 85
- 00:05:01,997 --> 00:05:06,088
- can move people's lives
- into a better place.
- 86
- 00:05:06,131 --> 00:05:07,916
- Horvitz: I've lost
- a number of family members,
- 87
- 00:05:07,959 --> 00:05:09,570
- including my mother,
- 88
- 00:05:09,613 --> 00:05:11,876
- my brother and sister-in-law
- and their kids,
- 89
- 00:05:11,920 --> 00:05:14,009
- to automobile accidents.
- 90
- 00:05:14,052 --> 00:05:18,405
- It's pretty clear we could
- almost eliminate car accidents
- 91
- 00:05:18,448 --> 00:05:20,102
- with automation.
- 92
- 00:05:20,145 --> 00:05:21,843
- 30,000 lives in the U.S. alone.
- 93
- 00:05:21,886 --> 00:05:25,455
- About a million around the world
- per year.
- 94
- 00:05:25,499 --> 00:05:27,501
- Ferrucci:
- In healthcare, early indicators
- 95
- 00:05:27,544 --> 00:05:29,503
- are the name of the game
- in that space,
- 96
- 00:05:29,546 --> 00:05:33,158
- so that's another place where
- it can save somebody's life.
- 97
- 00:05:33,202 --> 00:05:35,726
- Dr. Herman: Here in
- the breast-cancer center,
- 98
- 00:05:35,770 --> 00:05:38,381
- all the things that
- the radiologist's brain
- 99
- 00:05:38,425 --> 00:05:43,386
- does in two minutes, the
- computer does instantaneously.
- 100
- 00:05:43,430 --> 00:05:47,303
- The computer has looked
- at 1 billion mammograms,
- 101
- 00:05:47,347 --> 00:05:49,261
- and it takes that data
- and applies it
- 102
- 00:05:49,305 --> 00:05:51,438
- to this image instantaneously,
- 103
- 00:05:51,481 --> 00:05:54,441
- so the medical application
- is profound.
- 104
- 00:05:56,399 --> 00:05:57,705
- Zilis:
- Another really exciting area
- 105
- 00:05:57,748 --> 00:05:59,402
- that we're seeing
- a lot of development in
- 106
- 00:05:59,446 --> 00:06:03,275
- is actually understanding
- our genetic code
- 107
- 00:06:03,319 --> 00:06:06,104
- and using that
- to both diagnose disease
- 108
- 00:06:06,148 --> 00:06:07,758
- and create
- personalized treatments.
- 109
- 00:06:07,802 --> 00:06:11,632
- ♪♪
- 110
- 00:06:11,675 --> 00:06:14,112
- Kurzweil:
- The primary application
- of all these machines
- 111
- 00:06:14,156 --> 00:06:17,246
- will be to extend
- our own intelligence.
- 112
- 00:06:17,289 --> 00:06:19,422
- We'll be able to make
- ourselves smarter,
- 113
- 00:06:19,466 --> 00:06:22,643
- and we'll be better
- at solving problems.
- 114
- 00:06:22,686 --> 00:06:24,775
- We don't have to age.
- We'll actually understand aging.
- 115
- 00:06:24,819 --> 00:06:27,125
- We'll be able to stop it.
- 116
- 00:06:27,169 --> 00:06:29,519
- Man: There's really no limit
- to what intelligent machines
- 117
- 00:06:29,563 --> 00:06:30,868
- can do for the human race.
- 118
- 00:06:30,912 --> 00:06:36,265
- ♪♪
- 119
- 00:06:36,308 --> 00:06:39,399
- How could a smarter machine
- not be a better machine?
- 120
- 00:06:42,053 --> 00:06:44,708
- It's hard to say exactly
- when I began to think
- 121
- 00:06:44,752 --> 00:06:46,971
- that that was a bit naive.
- 122
- 00:06:47,015 --> 00:06:56,459
- ♪♪
- 123
- 00:06:56,503 --> 00:06:59,288
- Stuart Russell,
- he's basically a god
- 124
- 00:06:59,331 --> 00:07:00,898
- in the field
- of artificial intelligence.
- 125
- 00:07:00,942 --> 00:07:04,380
- He wrote the book that almost
- every university uses.
- 126
- 00:07:04,424 --> 00:07:06,948
- Russell: I used to say it's the
- best-selling AI textbook.
- 127
- 00:07:06,991 --> 00:07:10,255
- Now I just say "It's the PDF
- that's stolen most often."
- 128
- 00:07:10,299 --> 00:07:13,650
- ♪♪
- 129
- 00:07:13,694 --> 00:07:17,306
- Artificial intelligence is
- about making computers smart,
- 130
- 00:07:17,349 --> 00:07:19,830
- and from the point
- of view of the public,
- 131
- 00:07:19,874 --> 00:07:21,484
- what counts as AI
- is just something
- 132
- 00:07:21,528 --> 00:07:23,268
- that's surprisingly intelligent
- 133
- 00:07:23,312 --> 00:07:25,488
- compared to what
- we thought computers
- 134
- 00:07:25,532 --> 00:07:28,404
- would typically be able to do.
- 135
- 00:07:28,448 --> 00:07:33,801
- AI is a field of research
- to try to basically simulate
- 136
- 00:07:33,844 --> 00:07:36,717
- all kinds of human capabilities.
- 137
- 00:07:36,760 --> 00:07:38,719
- We're in the AI era.
- 138
- 00:07:38,762 --> 00:07:40,503
- Silicon Valley
- has the ability to focus
- 139
- 00:07:40,547 --> 00:07:42,462
- on one bright, shiny thing.
- 140
- 00:07:42,505 --> 00:07:43,767
- It was social networking
- 141
- 00:07:43,811 --> 00:07:45,508
- and social media
- over the last decade,
- 142
- 00:07:45,552 --> 00:07:48,119
- and it's pretty clear
- that the bit has flipped.
- 143
- 00:07:48,163 --> 00:07:50,557
- And it starts
- with machine learning.
- 144
- 00:07:50,600 --> 00:07:54,343
- Nolan: When we look back at this
- moment, what was the first AI?
- 145
- 00:07:54,386 --> 00:07:55,736
- It's not sexy,
- and it isn't the thing
- 146
- 00:07:55,779 --> 00:07:57,389
- we could see at the movies,
- 147
- 00:07:57,433 --> 00:08:00,741
- but you'd make a great case
- that Google created,
- 148
- 00:08:00,784 --> 00:08:03,395
- not a search engine,
- but a godhead.
- 149
- 00:08:03,439 --> 00:08:06,486
- A way for people to ask
- any question they wanted
- 150
- 00:08:06,529 --> 00:08:08,270
- and get the answer they needed.
- 151
- 00:08:08,313 --> 00:08:11,273
- Russell: Most people are not
- aware that what Google is doing
- 152
- 00:08:11,316 --> 00:08:13,710
- is actually a form of
- artificial intelligence.
- 153
- 00:08:13,754 --> 00:08:16,234
- They just go there,
- they type in a thing.
- 154
- 00:08:16,278 --> 00:08:18,323
- Google gives them the answer.
- 155
- 00:08:18,367 --> 00:08:21,544
- Musk: With each search,
- we train it to be better.
- 156
- 00:08:21,588 --> 00:08:23,851
- Sometimes we're typing a search,
- and it tells us the answer
- 157
- 00:08:23,894 --> 00:08:27,419
- before you've finished
- asking the question.
- 158
- 00:08:27,463 --> 00:08:29,944
- You know, who is the president
- of Kazakhstan?
- 159
- 00:08:29,987 --> 00:08:31,685
- And it'll just tell you.
- 160
- 00:08:31,728 --> 00:08:33,600
- You don't have to go to the
- Kazakhstan national website
- 161
- 00:08:33,643 --> 00:08:34,818
- to find out.
- 162
- 00:08:34,862 --> 00:08:37,081
- You didn't used to be
- able to do that.
- 163
- 00:08:37,125 --> 00:08:39,475
- Nolan:
- That is artificial intelligence.
- 164
- 00:08:39,519 --> 00:08:42,783
- Years from now when we try
- to understand, we will say,
- 165
- 00:08:42,826 --> 00:08:44,567
- "How did we miss it?"
- 166
- 00:08:44,611 --> 00:08:47,527
- Markoff: It's one of
- the striking contradictions
- 167
- 00:08:47,570 --> 00:08:48,484
- that we're facing.
- 168
- 00:08:48,528 --> 00:08:50,051
- Google and Facebook, et al.,
- 169
- 00:08:50,094 --> 00:08:52,053
- have built businesses
- on giving us,
- 170
- 00:08:52,096 --> 00:08:54,185
- as a society, free stuff.
- 171
- 00:08:54,229 --> 00:08:56,013
- But it's a Faustian bargain.
- 172
- 00:08:56,057 --> 00:09:00,017
- They're extracting something
- from us in exchange,
- 173
- 00:09:00,061 --> 00:09:01,628
- but we don't know
- 174
- 00:09:01,671 --> 00:09:03,760
- what code is running
- on the other side and why.
- 175
- 00:09:03,804 --> 00:09:06,546
- We have no idea.
- 176
- 00:09:06,589 --> 00:09:08,591
- It does strike
- right at the issue
- 177
- 00:09:08,635 --> 00:09:11,028
- of how much we should
- trust these machines.
- 178
- 00:09:14,162 --> 00:09:18,166
- I use computers
- literally for everything.
- 179
- 00:09:18,209 --> 00:09:21,386
- There are so many
- computer advancements now,
- 180
- 00:09:21,430 --> 00:09:23,824
- and it's become such
- a big part of our lives.
- 181
- 00:09:23,867 --> 00:09:26,174
- It's just incredible
- what a computer can do.
- 182
- 00:09:26,217 --> 00:09:29,090
- You can actually carry
- a computer in your purse.
- 183
- 00:09:29,133 --> 00:09:31,571
- I mean, how awesome is that?
- 184
- 00:09:31,614 --> 00:09:35,052
- I think most technology is meant
- to make things easier
- 185
- 00:09:35,096 --> 00:09:37,315
- and simpler for all of us,
- 186
- 00:09:37,359 --> 00:09:40,362
- so hopefully that just
- remains the focus.
- 187
- 00:09:40,405 --> 00:09:43,147
- I think everybody loves
- their computers.
- 188
- 00:09:44,409 --> 00:09:51,678
- ♪♪
- 189
- 00:09:51,721 --> 00:09:53,810
- People don't realize
- they are constantly
- 190
- 00:09:53,854 --> 00:09:59,076
- being negotiated with
- by machines,
- 191
- 00:09:59,120 --> 00:10:02,993
- whether that's the price
- of products in your Amazon cart,
- 192
- 00:10:03,037 --> 00:10:05,517
- whether you can get
- on a particular flight,
- 193
- 00:10:05,561 --> 00:10:08,912
- whether you can reserve
- a room at a particular hotel.
- 194
- 00:10:08,956 --> 00:10:11,959
- What you're experiencing
- are machine-learning algorithms
- 195
- 00:10:12,002 --> 00:10:14,265
- that have determined
- that a person like you
- 196
- 00:10:14,309 --> 00:10:15,919
- is willing to pay 2 cents more
- 197
- 00:10:15,963 --> 00:10:17,791
- and is changing the price.
- 198
- 00:10:17,834 --> 00:10:21,751
- ♪♪
- 199
- 00:10:21,795 --> 00:10:24,014
- Kosinski: Now, a computer looks
- at millions of people
- 200
- 00:10:24,058 --> 00:10:28,105
- simultaneously for
- very subtle patterns.
- 201
- 00:10:28,149 --> 00:10:31,369
- You can take seemingly
- innocent digital footprints,
- 202
- 00:10:31,413 --> 00:10:34,677
- such as someone's playlist
- on Spotify,
- 203
- 00:10:34,721 --> 00:10:37,201
- or stuff that they
- bought on Amazon,
- 204
- 00:10:37,245 --> 00:10:40,291
- and then use algorithms
- to translate this
- 205
- 00:10:40,335 --> 00:10:44,513
- into a very detailed and a
- very accurate, intimate profile.
- 206
- 00:10:47,603 --> 00:10:50,911
- Kaplan: There is a dossier on
- each of us that is so extensive
- 207
- 00:10:50,954 --> 00:10:52,695
- it would be possibly
- accurate to say
- 208
- 00:10:52,739 --> 00:10:55,698
- that they know more about you
- than your mother does.
- 209
- 00:10:55,742 --> 00:11:04,054
- ♪♪
- 210
- 00:11:04,098 --> 00:11:06,883
- Tegmark: The major cause
- of the recent AI breakthrough
- 211
- 00:11:06,927 --> 00:11:08,580
- isn't just that some dude
- 212
- 00:11:08,624 --> 00:11:11,583
- had a brilliant insight
- all of a sudden,
- 213
- 00:11:11,627 --> 00:11:14,325
- but simply that we have
- much bigger data
- 214
- 00:11:14,369 --> 00:11:18,242
- to train them on
- and vastly better computers.
- 215
- 00:11:18,286 --> 00:11:19,940
- el Kaliouby:
- The magic is in the data.
- 216
- 00:11:19,983 --> 00:11:21,463
- It's a ton of data.
- 217
- 00:11:21,506 --> 00:11:23,726
- I mean, it's data
- that's never existed before.
- 218
- 00:11:23,770 --> 00:11:26,686
- We've never had
- this data before.
- 219
- 00:11:26,729 --> 00:11:30,733
- We've created technologies
- that allow us to capture
- 220
- 00:11:30,777 --> 00:11:33,040
- vast amounts of information.
- 221
- 00:11:33,083 --> 00:11:35,738
- If you think of a billion
- cellphones on the planet
- 222
- 00:11:35,782 --> 00:11:38,393
- with gyroscopes
- and accelerometers
- 223
- 00:11:38,436 --> 00:11:39,786
- and fingerprint readers...
- 224
- 00:11:39,829 --> 00:11:42,005
- couple that with the GPS
- and the photos they take
- 225
- 00:11:42,049 --> 00:11:43,964
- and the tweets that you send,
- 226
- 00:11:44,007 --> 00:11:47,750
- we're all giving off huge
- amounts of data individually.
- 227
- 00:11:47,794 --> 00:11:50,274
- Cars that drive, as the cameras
- on them suck up information
- 228
- 00:11:50,318 --> 00:11:52,059
- about the world around them.
- 229
- 00:11:52,102 --> 00:11:54,844
- The satellites the size of a toaster
- that are now in orbit.
- 230
- 00:11:54,888 --> 00:11:57,629
- The infrared about
- the vegetation on the planet.
- 231
- 00:11:57,673 --> 00:11:59,109
- The buoys that are out
- in the oceans
- 232
- 00:11:59,153 --> 00:12:01,024
- to feed into the climate models.
- 233
- 00:12:01,068 --> 00:12:05,028
- ♪♪
- 234
- 00:12:05,072 --> 00:12:08,902
- And the NSA, the CIA,
- as they collect information
- 235
- 00:12:08,945 --> 00:12:12,644
- about the
- geopolitical situations.
- 236
- 00:12:12,688 --> 00:12:15,604
- The world today is literally
- swimming in this data.
- 237
- 00:12:15,647 --> 00:12:20,565
- ♪♪
- 238
- 00:12:20,609 --> 00:12:22,480
- Kosinski: Back in 2012,
- 239
- 00:12:22,524 --> 00:12:25,875
- IBM estimated
- that an average human being
- 240
- 00:12:25,919 --> 00:12:31,098
- leaves 500 megabytes
- of digital footprints every day.
- 241
- 00:12:31,141 --> 00:12:34,841
- If you wanted to back up
- one day's worth of data
- 242
- 00:12:34,884 --> 00:12:36,494
- that humanity produces
- 243
- 00:12:36,538 --> 00:12:39,062
- and print it out
- on letter-sized paper,
- 244
- 00:12:39,106 --> 00:12:43,806
- double-sided, font size 12,
- and you stack it up,
- 245
- 00:12:43,850 --> 00:12:46,113
- it would reach from
- the surface of the Earth
- 246
- 00:12:46,156 --> 00:12:49,116
- to the sun four times over.
- 247
- 00:12:49,159 --> 00:12:51,292
- That's every day.
- 248
- 00:12:51,335 --> 00:12:53,816
- Kaplan: The data itself
- is not good or evil.
- 249
- 00:12:53,860 --> 00:12:55,470
- It's how it's used.
- 250
- 00:12:55,513 --> 00:12:58,342
- We're relying, really,
- on the goodwill of these people
- 251
- 00:12:58,386 --> 00:13:01,171
- and on the policies
- of these companies.
- 252
- 00:13:01,215 --> 00:13:03,870
- There is no legal requirement
- for how they can
- 253
- 00:13:03,913 --> 00:13:06,307
- and should use
- that kind of data.
- 254
- 00:13:06,350 --> 00:13:09,266
- That, to me, is at the heart
- of the trust issue.
- 255
- 00:13:11,007 --> 00:13:13,793
- Barrat: Right now there's a
- giant race for creating machines
- 256
- 00:13:13,836 --> 00:13:15,751
- that are as smart as humans.
- 257
- 00:13:15,795 --> 00:13:17,971
- Google -- They're working on
- what's really the kind of
- 258
- 00:13:18,014 --> 00:13:20,016
- Manhattan Project
- of artificial intelligence.
- 259
- 00:13:20,060 --> 00:13:22,671
- They've got the most money.
- They've got the most talent.
- 260
- 00:13:22,714 --> 00:13:27,067
- They're buying up AI companies
- and robotics companies.
- 261
- 00:13:27,110 --> 00:13:29,069
- Urban: People still think
- of Google as a search engine
- 262
- 00:13:29,112 --> 00:13:30,722
- and their e-mail provider
- 263
- 00:13:30,766 --> 00:13:33,943
- and a lot of other things
- that we use on a daily basis,
- 264
- 00:13:33,987 --> 00:13:39,383
- but behind that search box
- are 10 million servers.
- 265
- 00:13:39,427 --> 00:13:42,299
- That makes Google the most
- powerful computing platform
- 266
- 00:13:42,343 --> 00:13:43,910
- in the world.
- 267
- 00:13:43,953 --> 00:13:47,217
- Google is now working
- on an AI computing platform
- 268
- 00:13:47,261 --> 00:13:50,133
- that will have
- 100 million servers.
- 269
- 00:13:52,179 --> 00:13:53,963
- So when you're interacting
- with Google,
- 270
- 00:13:54,007 --> 00:13:56,052
- we're just seeing
- the toenail of something
- 271
- 00:13:56,096 --> 00:13:58,881
- that is a giant beast
- in the making.
- 272
- 00:13:58,925 --> 00:14:00,622
- And the truth is,
- I'm not even sure
- 273
- 00:14:00,665 --> 00:14:02,798
- that Google knows
- what it's becoming.
- 274
- 00:14:02,842 --> 00:14:11,502
- ♪♪
- 275
- 00:14:11,546 --> 00:14:14,114
- Phoenix: If you look inside of
- what algorithms are being used
- 276
- 00:14:14,157 --> 00:14:15,811
- at Google,
- 277
- 00:14:15,855 --> 00:14:20,076
- it's technology
- largely from the '80s.
- 278
- 00:14:20,120 --> 00:14:23,863
- So these are models that you
- train by showing them a 1, a 2,
- 279
- 00:14:23,906 --> 00:14:27,344
- and a 3, and it learns not
- what a 1 is or what a 2 is --
- 280
- 00:14:27,388 --> 00:14:30,434
- It learns what the difference
- between a 1 and a 2 is.
- 281
- 00:14:30,478 --> 00:14:32,436
- It's just a computation.
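The discriminative training Phoenix describes fits in a few lines. Here is a minimal, hypothetical sketch in plain NumPy (not anything from Google's codebase; the digit templates, noise level, and learning rate are illustrative assumptions): a logistic-regression classifier is shown noisy 5x5 images of a "1" and a "2", and its weights converge toward the boundary between the two classes rather than a description of either digit.

import numpy as np

rng = np.random.default_rng(0)

# Idealized 5x5 pixel templates for "1" and "2", flattened to vectors.
ONE = np.array([[0, 0, 1, 0, 0]] * 5, dtype=float).ravel()
TWO = np.array([[1, 1, 1, 0, 0],
                [0, 0, 1, 0, 0],
                [0, 1, 0, 0, 0],
                [1, 0, 0, 0, 0],
                [1, 1, 1, 0, 0]], dtype=float).ravel()

def sample(template, label, n=200):
    # Noisy copies of a template stand in for handwritten examples.
    x = template + 0.3 * rng.standard_normal((n, template.size))
    return x, np.full(n, label)

x1, y1 = sample(ONE, 0)
x2, y2 = sample(TWO, 1)
X, y = np.vstack([x1, x2]), np.concatenate([y1, y2])

# Logistic regression by gradient descent: the weight vector ends up
# encoding the difference between the classes, not what a "1" is.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted P(label == "2")
    w -= 0.1 * X.T @ (p - y) / len(y)        # gradient step on weights
    b -= 0.1 * np.mean(p - y)                # gradient step on bias

print("training accuracy:", np.mean((X @ w + b > 0) == y))

Nothing in w corresponds to "one-ness" or "two-ness" on its own; train the same code on a different pair of templates and it simply learns a different boundary, which is exactly the "just a computation" point.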
- 282
- 00:14:32,480 --> 00:14:35,396
- In the last half decade, where
- we've made this rapid progress,
- 283
- 00:14:35,439 --> 00:14:38,268
- it has all been
- in pattern recognition.
- 284
- 00:14:38,312 --> 00:14:41,184
- Tegmark: Most of
- the good, old-fashioned AI
- 285
- 00:14:41,228 --> 00:14:44,057
- was when we would tell
- our computers
- 286
- 00:14:44,100 --> 00:14:46,798
- how to play a game like chess...
- 287
- 00:14:46,842 --> 00:14:49,584
- from the old paradigm where
- you just tell the computer
- 288
- 00:14:49,627 --> 00:14:52,195
- exactly what to do.
- 289
- 00:14:54,502 --> 00:14:57,505
- Announcer:
- This is "Jeopardy!"
- 290
- 00:14:57,548 --> 00:14:59,376
- ♪♪
- 291
- 00:14:59,420 --> 00:15:02,510
- "The IBM Challenge"!
- 292
- 00:15:02,553 --> 00:15:05,730
- Ferrucci: No one at the time
- had thought that a machine
- 293
- 00:15:05,774 --> 00:15:08,298
- could have the precision
- and the confidence
- 294
- 00:15:08,342 --> 00:15:09,952
- and the speed
- to play "Jeopardy!"
- 295
- 00:15:09,996 --> 00:15:11,475
- well enough against
- the best humans.
- 296
- 00:15:11,519 --> 00:15:14,609
- Let's play "Jeopardy!"
- 297
- 00:15:18,569 --> 00:15:20,354
- Watson.
- Watson: What is "shoe"?
- 298
- 00:15:20,397 --> 00:15:21,877
- You are right.
- You get to pick.
- 299
- 00:15:21,921 --> 00:15:24,836
- Literary Character APB
- for $800.
- 300
- 00:15:24,880 --> 00:15:28,014
- Answer --
- the Daily Double.
- 301
- 00:15:28,057 --> 00:15:31,539
- Watson actually got its
- knowledge by reading Wikipedia
- 302
- 00:15:31,582 --> 00:15:34,672
- and 200 million pages
- of natural-language documents.
- 303
- 00:15:34,716 --> 00:15:36,674
- Ferrucci:
- You can't program every line
- 304
- 00:15:36,718 --> 00:15:38,502
- of how the world works.
- 305
- 00:15:38,546 --> 00:15:40,722
- The machine has to learn
- by reading.
- 306
- 00:15:40,765 --> 00:15:42,202
- Now we come to Watson.
- 307
- 00:15:42,245 --> 00:15:43,986
- "Who is Bram Stoker?"
- 308
- 00:15:44,030 --> 00:15:45,988
- And the wager?
- 309
- 00:15:46,032 --> 00:15:49,165
- Hello! $17,973.
- 310
- 00:15:49,209 --> 00:15:50,993
- $41,413.
- 311
- 00:15:51,037 --> 00:15:53,343
- And a two-day total
- of $77--
- 312
- 00:15:53,387 --> 00:15:56,694
- Phoenix: Watson's trained
- on huge amounts of text,
- 313
- 00:15:56,738 --> 00:15:59,828
- but it's not like it
- understands what it's saying.
- 314
- 00:15:59,871 --> 00:16:02,309
- It doesn't know that water makes
- things wet by touching water
- 315
- 00:16:02,352 --> 00:16:04,441
- and by seeing the way
- things behave in the world
- 316
- 00:16:04,485 --> 00:16:06,182
- the way you and I do.
- 317
- 00:16:06,226 --> 00:16:10,143
- A lot of language AI today
- is not building logical models
- 318
- 00:16:10,186 --> 00:16:11,622
- of how the world works.
- 319
- 00:16:11,666 --> 00:16:15,365
- Rather, it's looking at
- how the words appear
- 320
- 00:16:15,409 --> 00:16:18,238
- in the context of other words.
- 321
- 00:16:18,281 --> 00:16:20,196
- Barrat: David Ferrucci
- developed IBM's Watson,
- 322
- 00:16:20,240 --> 00:16:23,547
- and somebody asked him,
- "Does Watson think?"
- 323
- 00:16:23,591 --> 00:16:27,160
- And he said,
- "Does a submarine swim?"
- 324
- 00:16:27,203 --> 00:16:29,031
- And what they meant was,
- when they developed submarines,
- 325
- 00:16:29,075 --> 00:16:32,992
- they borrowed basic principles
- of swimming from fish.
- 326
- 00:16:33,035 --> 00:16:35,037
- But a submarine swims
- farther and faster than fish
- 327
- 00:16:35,081 --> 00:16:36,125
- and can carry a huge payload.
- 328
- 00:16:36,169 --> 00:16:39,911
- It out-swims fish.
- 329
- 00:16:39,955 --> 00:16:41,870
- Ng: Watson winning the game
- of "Jeopardy!"
- 330
- 00:16:41,913 --> 00:16:43,741
- will go down
- in the history of AI
- 331
- 00:16:43,785 --> 00:16:46,570
- as a significant milestone.
- 332
- 00:16:46,614 --> 00:16:49,269
- We tend to be amazed
- when the machine does so well.
- 333
- 00:16:49,312 --> 00:16:52,663
- I'm even more amazed when the
- computer beats humans at things
- 334
- 00:16:52,707 --> 00:16:55,188
- that humans are
- naturally good at.
- 335
- 00:16:55,231 --> 00:16:58,060
- This is how we make progress.
- 336
- 00:16:58,104 --> 00:17:00,671
- In the early days of
- the Google Brain project,
- 337
- 00:17:00,715 --> 00:17:02,804
- I gave the team a very
- simple instruction,
- 338
- 00:17:02,847 --> 00:17:05,807
- which was, "Build the biggest
- neural network possible,
- 339
- 00:17:05,850 --> 00:17:08,157
- like 1,000 computers."
- 340
- 00:17:08,201 --> 00:17:09,724
- Musk: A neural net is
- something very close
- 341
- 00:17:09,767 --> 00:17:12,161
- to a simulation
- of how the brain works.
- 342
- 00:17:12,205 --> 00:17:16,818
- It's very probabilistic,
- but with contextual relevance.
- 343
- 00:17:16,861 --> 00:17:18,298
- Urban: In your brain,
- you have long neurons
- 344
- 00:17:18,341 --> 00:17:20,256
- that connect to thousands
- of other neurons,
- 345
- 00:17:20,300 --> 00:17:22,519
- and you have these pathways
- that are formed and forged
- 346
- 00:17:22,563 --> 00:17:24,739
- based on what
- the brain needs to do.
- 347
- 00:17:24,782 --> 00:17:28,960
- When a baby tries something and
- it succeeds, there's a reward,
- 348
- 00:17:29,004 --> 00:17:32,312
- and that pathway that created
- the success is strengthened.
- 349
- 00:17:32,355 --> 00:17:34,662
- If it fails at something,
- the pathway is weakened,
- 350
- 00:17:34,705 --> 00:17:36,794
- and so, over time,
- the brain becomes honed
- 351
- 00:17:36,838 --> 00:17:40,320
- to be good at
- the environment around it.
- 352
- 00:17:40,363 --> 00:17:43,279
- Ng: Really, it's just getting
- machines to learn by themselves.
- 353
- 00:17:43,323 --> 00:17:45,238
- This is called "deep learning,"
- and "deep learning"
- 354
- 00:17:45,281 --> 00:17:48,676
- and "neural networks"
- mean roughly the same thing.
- 355
- 00:17:48,719 --> 00:17:52,375
- Tegmark: Deep learning
- is a totally different approach
- 356
- 00:17:52,419 --> 00:17:55,161
- where the computer learns
- more like a toddler,
- 357
- 00:17:55,204 --> 00:17:56,466
- by just getting a lot of data
- 358
- 00:17:56,510 --> 00:18:00,340
- and eventually
- figuring stuff out.
- 359
- 00:18:00,383 --> 00:18:03,125
- The computer just gets
- smarter and smarter
- 360
- 00:18:03,169 --> 00:18:05,997
- as it has more experiences.
- 361
- 00:18:06,041 --> 00:18:08,130
- Ng: So, imagine, if you will,
- a neural network, you know,
- 362
- 00:18:08,174 --> 00:18:09,697
- like 1,000 computers.
- 363
- 00:18:09,740 --> 00:18:11,438
- And it wakes up
- not knowing anything.
- 364
- 00:18:11,481 --> 00:18:14,093
- And we made it watch YouTube
- for a week.
- 365
- 00:18:14,136 --> 00:18:16,704
- ♪♪
- 366
- 00:18:18,706 --> 00:18:20,360
- ♪ Oppan Gangnam style
- 367
- 00:18:20,403 --> 00:18:23,189
- Ow!
- 368
- 00:18:25,408 --> 00:18:28,194
- Charlie!
- That really hurt!
- 369
- 00:18:28,237 --> 00:18:30,152
- ♪♪
- 370
- 00:18:30,196 --> 00:18:31,327
- ♪ Gangnam style
- 371
- 00:18:31,371 --> 00:18:33,286
- ♪ Op, op, op, op
- 372
- 00:18:33,329 --> 00:18:36,202
- ♪ Oppan Gangnam style
- 373
- 00:18:36,245 --> 00:18:38,508
- Ng: And so, after watching
- YouTube for a week,
- 374
- 00:18:38,552 --> 00:18:39,988
- what would it learn?
- 375
- 00:18:40,031 --> 00:18:41,903
- We had a hypothesis that
- it would learn to detect
- 376
- 00:18:41,946 --> 00:18:44,384
- commonly occurring objects
- in videos.
- 377
- 00:18:44,427 --> 00:18:47,517
- And so, we know that human faces
- appear a lot in videos,
- 378
- 00:18:47,561 --> 00:18:49,302
- so we looked,
- and, lo and behold,
- 379
- 00:18:49,345 --> 00:18:51,608
- there was a neuron that had
- learned to detect human faces.
- 380
- 00:18:51,652 --> 00:18:56,265
- Leave Britney alone!
- 381
- 00:18:56,309 --> 00:18:58,354
- Well, what else
- appears in videos a lot?
- 382
- 00:19:00,095 --> 00:19:01,792
- So, we looked,
- and to our surprise,
- 383
- 00:19:01,836 --> 00:19:04,882
- there was actually a neuron
- that had learned to detect cats.
- 384
- 00:19:04,926 --> 00:19:14,849
- ♪♪
- 385
- 00:19:14,892 --> 00:19:17,068
- I still remember
- seeing that recognition:
- 386
- 00:19:17,112 --> 00:19:18,635
- "Wow, that's a cat.
- Okay, cool.
- 387
- 00:19:18,679 --> 00:19:20,071
- Great."
- 388
- 00:19:23,162 --> 00:19:24,859
- Barrat:
- It's all pretty innocuous
- 389
- 00:19:24,902 --> 00:19:26,295
- when you're thinking
- about the future.
- 390
- 00:19:26,339 --> 00:19:29,733
- It all seems kind of
- harmless and benign.
- 391
- 00:19:29,777 --> 00:19:31,605
- But we're making
- cognitive architectures
- 392
- 00:19:31,648 --> 00:19:33,520
- that will fly farther
- and faster than us
- 393
- 00:19:33,563 --> 00:19:35,086
- and carry a bigger payload,
- 394
- 00:19:35,130 --> 00:19:37,437
- and they won't be
- warm and fuzzy.
- 395
- 00:19:37,480 --> 00:19:39,656
- Ferrucci: I think that,
- in three to five years,
- 396
- 00:19:39,700 --> 00:19:41,702
- you will see a computer system
- 397
- 00:19:41,745 --> 00:19:45,401
- that will be able
- to autonomously learn
- 398
- 00:19:45,445 --> 00:19:49,013
- how to understand,
- how to build understanding,
- 399
- 00:19:49,057 --> 00:19:51,364
- not unlike the way
- the human mind works.
- 400
- 00:19:53,931 --> 00:19:56,891
- Whatever that lunch was,
- it was certainly delicious.
- 401
- 00:19:56,934 --> 00:19:59,807
- Simply some of
- Robby's synthetics.
- 402
- 00:19:59,850 --> 00:20:01,635
- He's your cook, too?
- 403
- 00:20:01,678 --> 00:20:04,551
- Even manufactures
- the raw materials.
- 404
- 00:20:04,594 --> 00:20:06,944
- Come around here, Robby.
- 405
- 00:20:06,988 --> 00:20:09,773
- I'll show you
- how this works.
- 406
- 00:20:11,122 --> 00:20:13,342
- One introduces
- a sample of human food
- 407
- 00:20:13,386 --> 00:20:15,344
- through this aperture.
- 408
- 00:20:15,388 --> 00:20:17,738
- Down here there's a small
- built-in chemical laboratory,
- 409
- 00:20:17,781 --> 00:20:19,218
- where he analyzes it.
- 410
- 00:20:19,261 --> 00:20:21,263
- Later, he can reproduce
- identical molecules
- 411
- 00:20:21,307 --> 00:20:22,482
- in any shape or quantity.
- 412
- 00:20:22,525 --> 00:20:24,614
- Why, it's
- a housewife's dream.
- 413
- 00:20:24,658 --> 00:20:26,834
- Announcer: Meet Baxter,
- 414
- 00:20:26,877 --> 00:20:29,445
- a revolutionary
- new category of robots,
- 415
- 00:20:29,489 --> 00:20:30,490
- with common sense.
- 416
- 00:20:30,533 --> 00:20:31,839
- Baxter...
- 417
- 00:20:31,882 --> 00:20:33,449
- Barrat: Baxter is
- a really good example
- 418
- 00:20:33,493 --> 00:20:36,887
- of the kind of competition
- we face from machines.
- 419
- 00:20:36,931 --> 00:20:42,676
- Baxter can do almost anything
- we can do with our hands.
- 420
- 00:20:42,719 --> 00:20:45,722
- Baxter costs about
- what a minimum-wage worker
- 421
- 00:20:45,766 --> 00:20:47,507
- makes in a year.
- 422
- 00:20:47,550 --> 00:20:48,769
- But Baxter won't be
- taking the place
- 423
- 00:20:48,812 --> 00:20:50,118
- of one minimum-wage worker --
- 424
- 00:20:50,161 --> 00:20:51,772
- He'll be taking
- the place of three,
- 425
- 00:20:51,815 --> 00:20:55,515
- because they never get tired,
- they never take breaks.
- 426
- 00:20:55,558 --> 00:20:57,865
- Gourley: That's probably the
- first thing we're gonna see --
- 427
- 00:20:57,908 --> 00:20:59,475
- displacement of jobs.
- 428
- 00:20:59,519 --> 00:21:01,651
- They're gonna be done
- quicker, faster, cheaper
- 429
- 00:21:01,695 --> 00:21:04,088
- by machines.
- 430
- 00:21:04,132 --> 00:21:07,657
- Our ability to even stay current
- is so insanely limited
- 431
- 00:21:07,701 --> 00:21:10,138
- compared to
- the machines we build.
- 432
- 00:21:10,181 --> 00:21:13,446
- For example, now we have this
- great movement of Uber and Lyft
- 433
- 00:21:13,489 --> 00:21:15,056
- kind of making
- transportation cheaper
- 434
- 00:21:15,099 --> 00:21:16,405
- and democratizing
- transportation,
- 435
- 00:21:16,449 --> 00:21:17,711
- which is great.
- 436
- 00:21:17,754 --> 00:21:19,321
- The next step is gonna be
- 437
- 00:21:19,365 --> 00:21:21,149
- that they're all gonna be
- replaced by driverless cars,
- 438
- 00:21:21,192 --> 00:21:22,411
- and then all the Uber
- and Lyft drivers
- 439
- 00:21:22,455 --> 00:21:25,936
- have to find
- something new to do.
- 440
- 00:21:25,980 --> 00:21:28,156
- Barrat: There are
- 4 million professional drivers
- 441
- 00:21:28,199 --> 00:21:29,723
- in the United States.
- 442
- 00:21:29,766 --> 00:21:31,638
- They're unemployed soon.
- 443
- 00:21:31,681 --> 00:21:34,075
- 7 million people
- that do data entry.
- 444
- 00:21:34,118 --> 00:21:37,339
- Those people
- are gonna be jobless.
- 445
- 00:21:37,383 --> 00:21:40,342
- A job isn't just about money,
- right?
- 446
- 00:21:40,386 --> 00:21:42,605
- On a biological level,
- it serves a purpose.
- 447
- 00:21:42,649 --> 00:21:45,391
- It becomes a defining thing.
- 448
- 00:21:45,434 --> 00:21:48,350
- When the jobs go away
- in any given civilization,
- 449
- 00:21:48,394 --> 00:21:50,787
- it doesn't take long
- until that turns into violence.
- 450
- 00:21:53,355 --> 00:21:57,011
- ♪♪
- 451
- 00:21:59,622 --> 00:22:02,016
- We face a giant divide
- between rich and poor,
- 452
- 00:22:02,059 --> 00:22:05,019
- because that's what automation
- and AI will provoke --
- 453
- 00:22:05,062 --> 00:22:08,588
- a greater divide between
- the haves and the have-nots.
- 454
- 00:22:08,631 --> 00:22:10,807
- Right now, it's working
- into the middle class,
- 455
- 00:22:10,851 --> 00:22:12,896
- into white-collar jobs.
- 456
- 00:22:12,940 --> 00:22:15,334
- IBM's Watson does
- business analytics
- 457
- 00:22:15,377 --> 00:22:20,600
- that we used to pay a business
- analyst $300 an hour to do.
- 458
- 00:22:20,643 --> 00:22:23,037
- Gourley: Today, you're going
- to college to be a doctor,
- 459
- 00:22:23,080 --> 00:22:25,082
- to be an accountant,
- to be a journalist.
- 460
- 00:22:25,126 --> 00:22:28,608
- It's unclear that there's
- gonna be jobs there for you.
- 461
- 00:22:28,651 --> 00:22:32,612
- Ng: If someone's planning for
- a 40-year career in radiology,
- 462
- 00:22:32,655 --> 00:22:34,222
- just reading images,
- 463
- 00:22:34,265 --> 00:22:35,745
- I think that could be
- a challenge
- 464
- 00:22:35,789 --> 00:22:36,920
- to the new graduates of today.
- 465
- 00:22:39,270 --> 00:22:49,193
- ♪♪
- 466
- 00:22:50,847 --> 00:22:58,464
- ♪♪
- 467
- 00:22:58,507 --> 00:23:02,729
- Dr. Herman: The da Vinci robot
- is currently utilized
- 468
- 00:23:02,772 --> 00:23:07,516
- by a variety of surgeons
- for its accuracy and its ability
- 469
- 00:23:07,560 --> 00:23:12,303
- to avoid the inevitable
- fluctuations of the human hand.
- 470
- 00:23:12,347 --> 00:23:17,787
- ♪♪
- 471
- 00:23:17,831 --> 00:23:23,358
- ♪♪
- 472
- 00:23:23,402 --> 00:23:28,494
- Anybody who watches this
- feels the amazingness of it.
- 473
- 00:23:30,931 --> 00:23:34,674
- You look through the scope,
- and you're seeing the claw hand
- 474
- 00:23:34,717 --> 00:23:36,893
- holding that woman's ovary.
- 475
- 00:23:36,937 --> 00:23:42,638
- Humanity was resting right here
- in the hands of this robot.
- 476
- 00:23:42,682 --> 00:23:46,947
- People say it's the future,
- but it's not the future --
- 477
- 00:23:46,990 --> 00:23:50,516
- It's the present.
- 478
- 00:23:50,559 --> 00:23:52,474
- Zilis: If you think about
- a surgical robot,
- 479
- 00:23:52,518 --> 00:23:54,737
- there's often not a lot
- of intelligence in these things,
- 480
- 00:23:54,781 --> 00:23:56,783
- but over time, as we put
- more and more intelligence
- 481
- 00:23:56,826 --> 00:23:58,567
- into these systems,
- 482
- 00:23:58,611 --> 00:24:02,441
- the surgical robots can actually
- learn from each robot surgery.
- 483
- 00:24:02,484 --> 00:24:04,181
- They're tracking the movements,
- they're understanding
- 484
- 00:24:04,225 --> 00:24:05,966
- what worked
- and what didn't work.
- 485
- 00:24:06,009 --> 00:24:08,708
- And eventually, the robot
- for routine surgeries
- 486
- 00:24:08,751 --> 00:24:12,320
- is going to be able to perform
- that entirely by itself...
- 487
- 00:24:12,363 --> 00:24:13,756
- or with human supervision.
- 488
- 00:24:32,558 --> 00:24:34,995
- ♪♪
- 489
- 00:24:35,038 --> 00:24:37,214
- Dr. Herman: It seems that we're
- feeding it and creating it,
- 490
- 00:24:37,258 --> 00:24:42,785
- but, in a way, we are a slave
- to the technology,
- 491
- 00:24:42,829 --> 00:24:45,701
- because we can't go back.
- 492
- 00:24:50,053 --> 00:24:52,882
- Gourley: The machines are taking
- bigger and bigger bites
- 493
- 00:24:52,926 --> 00:24:57,147
- out of our skill set
- at an ever-increasing speed.
- 494
- 00:24:57,191 --> 00:24:59,236
- And so we've got to run
- faster and faster
- 495
- 00:24:59,280 --> 00:25:00,890
- to keep ahead of the machines.
- 496
- 00:25:02,675 --> 00:25:04,677
- How do I look?
- 497
- 00:25:04,720 --> 00:25:06,374
- Good.
- 498
- 00:25:10,030 --> 00:25:11,553
- Are you attracted to me?
- 499
- 00:25:11,597 --> 00:25:14,251
- What?
- Are you attracted to me?
- 500
- 00:25:14,295 --> 00:25:17,777
- You give me indications
- that you are.
- 501
- 00:25:17,820 --> 00:25:20,562
- I do?
- Yes.
- 502
- 00:25:20,606 --> 00:25:22,608
- Nolan: This is the future
- we're headed into.
- 503
- 00:25:22,651 --> 00:25:26,046
- We want to design
- our companions.
- 504
- 00:25:26,089 --> 00:25:29,266
- We're gonna want to see
- a human face on AI.
- 505
- 00:25:29,310 --> 00:25:33,967
- Therefore, gaming our emotions
- will be depressingly easy.
- 506
- 00:25:34,010 --> 00:25:35,272
- We're not that complicated.
- 507
- 00:25:35,316 --> 00:25:38,101
- We're simple.
- Stimulus-response.
- 508
- 00:25:38,145 --> 00:25:43,063
- I can make you like me basically
- by smiling at you a lot.
- 509
- 00:25:43,106 --> 00:25:45,674
- AIs are gonna be fantastic
- at manipulating us.
- 510
- 00:25:45,718 --> 00:25:54,640
- ♪♪
- 511
- 00:25:54,683 --> 00:25:56,946
- So, you've developed
- a technology
- 512
- 00:25:56,990 --> 00:26:00,036
- that can sense
- what people are feeling.
- 513
- 00:26:00,080 --> 00:26:01,472
- Right.
- We've developed technology
- 514
- 00:26:01,516 --> 00:26:03,387
- that can read
- your facial expressions
- 515
- 00:26:03,431 --> 00:26:06,521
- and map that to a number
- of emotional states.
- 516
- 00:26:06,565 --> 00:26:08,697
- el Kaliouby: 15 years ago,
- I had just finished
- 517
- 00:26:08,741 --> 00:26:11,482
- my undergraduate studies
- in computer science,
- 518
- 00:26:11,526 --> 00:26:15,008
- and it struck me that I was
- spending a lot of time
- 519
- 00:26:15,051 --> 00:26:17,793
- interacting with my laptops
- and my devices,
- 520
- 00:26:17,837 --> 00:26:23,582
- yet these devices had absolutely
- no clue how I was feeling.
- 521
- 00:26:23,625 --> 00:26:26,802
- I started thinking, "What if
- this device could sense
- 522
- 00:26:26,846 --> 00:26:29,326
- that I was stressed
- or I was having a bad day?
- 523
- 00:26:29,370 --> 00:26:31,067
- What would that open up?"
- 524
- 00:26:32,721 --> 00:26:34,418
- Hi, first-graders!
- 525
- 00:26:34,462 --> 00:26:35,855
- How are you?
- 526
- 00:26:35,898 --> 00:26:37,813
- Can I get a hug?
- 527
- 00:26:37,857 --> 00:26:40,773
- We had kids interact
- with the technology.
- 528
- 00:26:40,816 --> 00:26:42,862
- A lot of it
- is still in development,
- 529
- 00:26:42,905 --> 00:26:44,472
- but it was just amazing.
- 530
- 00:26:44,515 --> 00:26:46,648
- Who likes robots?
- Me!
- 531
- 00:26:46,692 --> 00:26:48,911
- Who wants to have a robot
- in their house?
- 532
- 00:26:48,955 --> 00:26:51,479
- What would you use
- a robot for, Jack?
- 533
- 00:26:51,522 --> 00:26:56,353
- I would use it to ask my mom
- very hard math questions.
- 534
- 00:26:56,397 --> 00:26:58,181
- Okay.
- What about you, Theo?
- 535
- 00:26:58,225 --> 00:27:02,272
- I would use it
- for scaring people.
- 536
- 00:27:02,316 --> 00:27:04,666
- All right.
- So, start by smiling.
- 537
- 00:27:04,710 --> 00:27:06,625
- Nice.
- 538
- 00:27:06,668 --> 00:27:09,018
- Brow furrow.
- 539
- 00:27:09,062 --> 00:27:10,890
- Nice one.
- Eyebrow raise.
- 540
- 00:27:10,933 --> 00:27:12,587
- This generation, technology
- 541
- 00:27:12,631 --> 00:27:15,068
- is just surrounding them
- all the time.
- 542
- 00:27:15,111 --> 00:27:17,853
- It's almost like they expect
- to have robots in their homes,
- 543
- 00:27:17,897 --> 00:27:22,336
- and they expect these robots
- to be socially intelligent.
- 544
- 00:27:22,379 --> 00:27:25,252
- What makes robots smart?
- 545
- 00:27:25,295 --> 00:27:29,648
- Put them in, like, a math
- or biology class.
- 546
- 00:27:29,691 --> 00:27:32,259
- I think you would
- have to train it.
- 547
- 00:27:32,302 --> 00:27:35,218
- All right.
- Let's walk over here.
- 548
- 00:27:35,262 --> 00:27:37,394
- So, if you smile and you
- raise your eyebrows,
- 549
- 00:27:37,438 --> 00:27:39,005
- it's gonna run over to you.
- 550
- 00:27:39,048 --> 00:27:40,833
- Woman: It's coming over!
- It's coming over! Look.
- 551
- 00:27:43,183 --> 00:27:45,272
- But if you look angry,
- it's gonna run away.
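Underneath this demo is a plain stimulus-response table: an expression classifier labels the face, and the label indexes into a small set of motor commands. A minimal, hypothetical sketch of that control loop follows; the expression labels, function names, and robot interface are illustrative assumptions, not Affectiva's actual API.

from enum import Enum, auto

class Expression(Enum):
    SMILE_WITH_RAISED_BROWS = auto()
    ANGRY = auto()
    NEUTRAL = auto()

# The rule as stated in the demo: approach on a smile with raised
# eyebrows, retreat from an angry face, otherwise stay put.
POLICY = {
    Expression.SMILE_WITH_RAISED_BROWS: "drive_toward_face",
    Expression.ANGRY: "drive_away_from_face",
    Expression.NEUTRAL: "hold_position",
}

def control_step(detect_expression, robot):
    # One tick of the loop: classify the face, then act on the label.
    # detect_expression stands in for a real expression classifier;
    # robot is any object exposing the three motor methods named above.
    command = POLICY.get(detect_expression(), "hold_position")
    getattr(robot, command)()

All of the hard work hides inside detect_expression; the mapping from feeling to action is trivial, which is what makes Nolan's earlier point about gaming our emotions so plausible.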
- 552
- 00:27:46,534 --> 00:27:48,797
- -Awesome!
- -Oh, that was good.
- 553
- 00:27:48,841 --> 00:27:52,366
- We're training computers to read
- and recognize emotions.
- 554
- 00:27:52,409 --> 00:27:53,846
- Ready? Set? Go!
- 555
- 00:27:53,889 --> 00:27:57,414
- And the response so far
- has been really amazing.
- 556
- 00:27:57,458 --> 00:27:59,590
- People are integrating this
- into health apps,
- 557
- 00:27:59,634 --> 00:28:04,465
- meditation apps, robots, cars.
- 558
- 00:28:04,508 --> 00:28:06,728
- We're gonna see
- how this unfolds.
- 559
- 00:28:06,772 --> 00:28:09,426
- ♪♪
- 560
- 00:28:09,470 --> 00:28:11,602
- Zilis:
- Robots can contain AI,
- 561
- 00:28:11,646 --> 00:28:14,388
- but the robot is just
- a physical instantiation,
- 562
- 00:28:14,431 --> 00:28:16,782
- and the artificial
- intelligence is the brain.
- 563
- 00:28:16,825 --> 00:28:19,872
- And so brains can exist purely
- in software-based systems.
- 564
- 00:28:19,915 --> 00:28:22,483
- They don't need to have
- a physical form.
- 565
- 00:28:22,526 --> 00:28:25,094
- Robots can exist without
- any artificial intelligence.
- 566
- 00:28:25,138 --> 00:28:28,097
- We have a lot of
- dumb robots out there.
- 567
- 00:28:28,141 --> 00:28:31,753
- But a dumb robot can be
- a smart robot overnight,
- 568
- 00:28:31,797 --> 00:28:34,103
- given the right software,
- given the right sensors.
- 569
- 00:28:34,147 --> 00:28:38,629
- Barrat: We can't help but impute
- motive into inanimate objects.
- 570
- 00:28:38,673 --> 00:28:40,327
- We do it with machines.
- 571
- 00:28:40,370 --> 00:28:41,502
- We'll treat them like children.
- 572
- 00:28:41,545 --> 00:28:43,330
- We'll treat them
- like surrogates.
- 573
- 00:28:43,373 --> 00:28:45,027
- -Goodbye!
- -Goodbye!
- 574
- 00:28:45,071 --> 00:28:48,204
- And we'll pay the price.
- 575
- 00:28:49,292 --> 00:28:58,998
- ♪♪
- 576
- 00:28:59,041 --> 00:29:08,572
- ♪♪
- 577
- 00:29:08,616 --> 00:29:10,792
- Okay, welcome to ATR.
- 578
- 00:29:10,836 --> 00:29:18,060
- ♪♪
- 579
- 00:29:25,067 --> 00:29:30,594
- ♪♪
- 580
- 00:29:30,638 --> 00:29:36,122
- ♪♪
- 581
- 00:29:47,786 --> 00:29:51,485
- ♪♪
- 582
- 00:29:51,528 --> 00:29:52,791
- Konnichiwa.
- 583
- 00:30:24,170 --> 00:30:29,436
- ♪♪
- 584
- 00:30:53,677 --> 00:30:56,942
- ♪♪
- 585
- 00:30:56,985 --> 00:30:58,682
- Gourley: We build
- artificial intelligence,
- 586
- 00:30:58,726 --> 00:31:02,948
- and the very first thing
- we want to do is replicate us.
- 587
- 00:31:02,991 --> 00:31:05,341
- I think the key point will come
- 588
- 00:31:05,385 --> 00:31:09,258
- when all the major senses
- are replicated --
- 589
- 00:31:09,302 --> 00:31:11,130
- sight...
- 590
- 00:31:11,173 --> 00:31:12,871
- touch...
- 591
- 00:31:12,914 --> 00:31:14,611
- smell.
- 592
- 00:31:14,655 --> 00:31:17,919
- When we replicate our senses,
- is that when it becomes alive?
- 593
- 00:31:17,963 --> 00:31:22,010
- ♪♪
- 594
- 00:31:24,795 --> 00:31:27,581
- ♪♪
- 595
- 00:31:27,624 --> 00:31:29,104
- Nolan:
- So many of our machines
- 596
- 00:31:29,148 --> 00:31:31,019
- are being built
- to understand us.
- 597
- 00:31:32,847 --> 00:31:34,805
- But what happens when
- an anthropomorphic creature
- 598
- 00:31:34,849 --> 00:31:37,417
- discovers that they can
- adjust their loyalty,
- 599
- 00:31:37,460 --> 00:31:40,028
- adjust their courage,
- adjust their avarice,
- 600
- 00:31:40,072 --> 00:31:42,291
- adjust their cunning?
- 601
- 00:31:42,335 --> 00:31:44,815
- ♪♪
- 602
- 00:31:44,859 --> 00:31:47,166
- Musk: The average person,
- they don't see killer robots
- 603
- 00:31:47,209 --> 00:31:48,645
- going down the streets.
- 604
- 00:31:48,689 --> 00:31:50,996
- They're like, "What are
- you talking about?"
- 605
- 00:31:51,039 --> 00:31:53,955
- Man, we want to make sure
- that we don't have killer robots
- 606
- 00:31:53,999 --> 00:31:57,045
- going down the street.
- 607
- 00:31:57,089 --> 00:31:59,439
- Once they're going down
- the street, it is too late.
- 608
- 00:31:59,482 --> 00:32:05,010
- ♪♪
- 609
- 00:32:05,053 --> 00:32:07,099
- Russell: The thing
- that worries me right now,
- 610
- 00:32:07,142 --> 00:32:08,578
- that keeps me awake,
- 611
- 00:32:08,622 --> 00:32:11,842
- is the development
- of autonomous weapons.
- 612
- 00:32:11,886 --> 00:32:19,850
- ♪♪
- 613
- 00:32:19,894 --> 00:32:27,771
- ♪♪
- 614
- 00:32:27,815 --> 00:32:32,733
- Up to now, people have expressed
- unease about drones,
- 615
- 00:32:32,776 --> 00:32:35,127
- which are remotely
- piloted aircraft.
- 616
- 00:32:35,170 --> 00:32:39,783
- ♪♪
- 617
- 00:32:39,827 --> 00:32:43,309
- If you take a drone's camera
- and feed it into the AI system,
- 618
- 00:32:43,352 --> 00:32:47,443
- it's a very easy step from here
- to fully autonomous weapons
- 619
- 00:32:47,487 --> 00:32:50,881
- that choose their own targets
- and release their own missiles.
- 620
- 00:32:50,925 --> 00:32:58,150
- ♪♪
- 621
- 00:32:58,193 --> 00:33:05,374
- ♪♪
- 622
- 00:33:05,418 --> 00:33:12,686
- ♪♪
- 623
- 00:33:12,729 --> 00:33:15,080
- The expected life-span
- of a human being
- 624
- 00:33:15,123 --> 00:33:16,516
- in that kind of
- battle environment
- 625
- 00:33:16,559 --> 00:33:20,520
- would be measured in seconds.
- 626
- 00:33:20,563 --> 00:33:23,740
- Singer: At one point,
- drones were science fiction,
- 627
- 00:33:23,784 --> 00:33:28,832
- and now they've become
- the normal thing in war.
- 628
- 00:33:28,876 --> 00:33:33,402
- There's over 10,000 in
- U.S. military inventory alone.
- 629
- 00:33:33,446 --> 00:33:35,274
- But they're not
- just a U.S. phenomenon.
- 630
- 00:33:35,317 --> 00:33:39,060
- There's more than 80 countries
- that operate them.
- 631
- 00:33:39,104 --> 00:33:41,932
- Gourley: It stands to reason
- that people making some
- 632
- 00:33:41,976 --> 00:33:44,587
- of the most important and
- difficult decisions in the world
- 633
- 00:33:44,631 --> 00:33:46,328
- are gonna start to use
- and implement
- 634
- 00:33:46,372 --> 00:33:48,591
- artificial intelligence.
- 635
- 00:33:48,635 --> 00:33:50,724
- ♪♪
- 636
- 00:33:50,767 --> 00:33:53,596
- The Air Force just designed
- a $400-billion jet program
- 637
- 00:33:53,640 --> 00:33:55,555
- to put pilots in the sky,
- 638
- 00:33:55,598 --> 00:34:01,300
- and a $500 AI, designed by
- a couple of graduate students,
- 639
- 00:34:01,343 --> 00:34:03,432
- is beating the best human pilots
- 640
- 00:34:03,476 --> 00:34:05,782
- with a relatively
- simple algorithm.
- 641
- 00:34:05,826 --> 00:34:09,395
- ♪♪
- 642
- 00:34:09,438 --> 00:34:13,399
- AI will have as big an impact
- on the military
- 643
- 00:34:13,442 --> 00:34:17,490
- as the combustion engine
- had at the turn of the century.
- 644
- 00:34:17,533 --> 00:34:18,839
- It will literally touch
- 645
- 00:34:18,882 --> 00:34:21,233
- everything
- that the military does,
- 646
- 00:34:21,276 --> 00:34:25,324
- from driverless convoys
- delivering logistical supplies,
- 647
- 00:34:25,367 --> 00:34:27,021
- to unmanned drones
- 648
- 00:34:27,065 --> 00:34:30,764
- delivering medical aid,
- to computational propaganda,
- 649
- 00:34:30,807 --> 00:34:34,246
- trying to win the hearts
- and minds of a population.
- 650
- 00:34:34,289 --> 00:34:38,337
- And so it stands to reason
- that whoever has the best AI
- 651
- 00:34:38,380 --> 00:34:41,688
- will probably achieve
- dominance on this planet.
- 652
- 00:34:45,561 --> 00:34:47,650
- At some point in
- the early 21st century,
- 653
- 00:34:47,694 --> 00:34:51,219
- all of mankind was
- united in celebration.
- 654
- 00:34:51,263 --> 00:34:53,830
- We marveled
- at our own magnificence
- 655
- 00:34:53,874 --> 00:34:56,833
- as we gave birth to AI.
- 656
- 00:34:56,877 --> 00:34:58,966
- AI?
- 657
- 00:34:59,009 --> 00:35:00,489
- You mean
- artificial intelligence?
- 658
- 00:35:00,533 --> 00:35:01,751
- A singular consciousness
- 659
- 00:35:01,795 --> 00:35:05,886
- that spawned
- an entire race of machines.
- 660
- 00:35:05,929 --> 00:35:09,716
- We don't know
- who struck first -- us or them,
- 661
- 00:35:09,759 --> 00:35:12,980
- but we know that it was us
- that scorched the sky.
- 662
- 00:35:14,677 --> 00:35:16,766
- Singer: There's a long history
- of science fiction,
- 663
- 00:35:16,810 --> 00:35:19,987
- not just predicting the future,
- but shaping the future.
- 664
- 00:35:20,030 --> 00:35:26,820
- ♪♪
- 665
- 00:35:26,863 --> 00:35:30,389
- Arthur Conan Doyle
- writing before World War I
- 666
- 00:35:30,432 --> 00:35:34,393
- on the danger of how
- submarines might be used
- 667
- 00:35:34,436 --> 00:35:38,048
- to carry out civilian blockades.
- 668
- 00:35:38,092 --> 00:35:40,399
- At the time
- he's writing this fiction,
- 669
- 00:35:40,442 --> 00:35:43,402
- the Royal Navy made fun
- of Arthur Conan Doyle
- 670
- 00:35:43,445 --> 00:35:45,230
- for this absurd idea
- 671
- 00:35:45,273 --> 00:35:47,623
- that submarines
- could be useful in war.
- 672
- 00:35:47,667 --> 00:35:53,412
- ♪♪
- 673
- 00:35:53,455 --> 00:35:55,370
- One of the things
- we've seen in history
- 674
- 00:35:55,414 --> 00:35:58,243
- is that our attitude
- towards technology,
- 675
- 00:35:58,286 --> 00:36:01,942
- but also ethics,
- are very context-dependent.
- 676
- 00:36:01,985 --> 00:36:03,726
- For example, the submarine...
- 677
- 00:36:03,770 --> 00:36:06,468
- nations like Great Britain
- and even the United States
- 678
- 00:36:06,512 --> 00:36:09,863
- found it horrifying
- to use the submarine.
- 679
- 00:36:09,906 --> 00:36:13,214
- In fact, the German use of the
- submarine to carry out attacks
- 680
- 00:36:13,258 --> 00:36:18,480
- was the reason why the United
- States joined World War I.
- 681
- 00:36:18,524 --> 00:36:20,613
- But move the timeline forward.
- 682
- 00:36:20,656 --> 00:36:23,529
- Man: The United States
- of America was suddenly
- 683
- 00:36:23,572 --> 00:36:28,403
- and deliberately attacked
- by the empire of Japan.
- 684
- 00:36:28,447 --> 00:36:32,190
- Five hours after Pearl Harbor,
- the order goes out
- 685
- 00:36:32,233 --> 00:36:36,498
- to commit unrestricted
- submarine warfare against Japan.
- 686
- 00:36:39,936 --> 00:36:44,289
- So Arthur Conan Doyle
- turned out to be right.
- 687
- 00:36:44,332 --> 00:36:46,856
- Nolan: That's the great old line
- about science fiction --
- 688
- 00:36:46,900 --> 00:36:48,336
- It's a lie that tells the truth.
- 689
- 00:36:48,380 --> 00:36:51,470
- Fellow executives,
- it gives me great pleasure
- 690
- 00:36:51,513 --> 00:36:54,821
- to introduce you to the future
- of law enforcement...
- 691
- 00:36:54,864 --> 00:36:56,562
- ED-209.
- 692
- 00:36:56,605 --> 00:37:03,612
- ♪♪
- 693
- 00:37:03,656 --> 00:37:05,919
- This isn't just a question
- of science fiction.
- 694
- 00:37:05,962 --> 00:37:09,488
- This is about what's next, about
- what's happening right now.
- 695
- 00:37:09,531 --> 00:37:13,927
- ♪♪
- 696
- 00:37:13,970 --> 00:37:17,496
- The role of intelligent systems
- is growing very rapidly
- 697
- 00:37:17,539 --> 00:37:19,324
- in warfare.
- 698
- 00:37:19,367 --> 00:37:22,152
- Everyone is pushing
- in the unmanned realm.
- 699
- 00:37:22,196 --> 00:37:26,374
- ♪♪
- 700
- 00:37:26,418 --> 00:37:28,898
- Gourley: Today, the Secretary of
- Defense is very, very clear --
- 701
- 00:37:28,942 --> 00:37:32,337
- We will not create fully
- autonomous attacking vehicles.
- 702
- 00:37:32,380 --> 00:37:34,643
- Not everyone
- is gonna hold themselves
- 703
- 00:37:34,687 --> 00:37:36,515
- to that same set of values.
- 704
- 00:37:36,558 --> 00:37:40,693
- And when China and Russia start
- deploying autonomous vehicles
- 705
- 00:37:40,736 --> 00:37:45,611
- that can attack and kill, what's
- the move that we're gonna make?
- 706
- 00:37:45,654 --> 00:37:49,963
- ♪♪
- 707
- 00:37:50,006 --> 00:37:51,617
- Russell: You can't say,
- "Well, we're gonna use
- 708
- 00:37:51,660 --> 00:37:53,967
- autonomous weapons
- for our military dominance,
- 709
- 00:37:54,010 --> 00:37:56,796
- but no one else
- is gonna use them."
- 710
- 00:37:56,839 --> 00:38:00,495
- If you make these weapons,
- they're gonna be used to attack
- 711
- 00:38:00,539 --> 00:38:03,324
- human populations
- in large numbers.
- 712
- 00:38:03,368 --> 00:38:12,507
- ♪♪
- 713
- 00:38:12,551 --> 00:38:14,596
- Autonomous weapons are,
- by their nature,
- 714
- 00:38:14,640 --> 00:38:16,468
- weapons of mass destruction,
- 715
- 00:38:16,511 --> 00:38:19,862
- because they don't need a human
- being to guide them or carry them.
- 716
- 00:38:19,906 --> 00:38:22,517
- You only need one person
- to, you know,
- 717
- 00:38:22,561 --> 00:38:25,781
- write a little program.
- 718
- 00:38:25,825 --> 00:38:30,220
- It just captures
- the complexity of this field.
- 719
- 00:38:30,264 --> 00:38:32,571
- It is cool.
- It is important.
- 720
- 00:38:32,614 --> 00:38:34,573
- It is amazing.
- 721
- 00:38:34,616 --> 00:38:37,053
- It is also frightening.
- 722
- 00:38:37,097 --> 00:38:38,968
- And it's all about trust.
- 723
- 00:38:42,102 --> 00:38:44,583
- It's an open letter about
- artificial intelligence,
- 724
- 00:38:44,626 --> 00:38:47,063
- signed by some of
- the biggest names in science.
- 725
- 00:38:47,107 --> 00:38:48,413
- What do they want?
- 726
- 00:38:48,456 --> 00:38:50,763
- Ban the use of
- autonomous weapons.
- 727
- 00:38:50,806 --> 00:38:52,373
- Woman: The author stated,
- 728
- 00:38:52,417 --> 00:38:54,375
- "Autonomous weapons
- have been described
- 729
- 00:38:54,419 --> 00:38:56,595
- as the third revolution
- in warfare."
- 730
- 00:38:56,638 --> 00:38:58,553
- Woman #2: ...thousand
- artificial-intelligence
- specialists
- 731
- 00:38:58,597 --> 00:39:01,817
- calling for a global ban
- on killer robots.
- 732
- 00:39:01,861 --> 00:39:04,342
- Tegmark:
- This open letter basically says
- 733
- 00:39:04,385 --> 00:39:06,344
- that we should redefine the goal
- 734
- 00:39:06,387 --> 00:39:07,954
- of the field of
- artificial intelligence
- 735
- 00:39:07,997 --> 00:39:11,610
- away from just creating pure,
- undirected intelligence,
- 736
- 00:39:11,653 --> 00:39:13,655
- towards creating
- beneficial intelligence.
- 737
- 00:39:13,699 --> 00:39:16,092
- The development of AI
- is not going to stop.
- 738
- 00:39:16,136 --> 00:39:18,094
- It is going to continue
- and get better.
- 739
- 00:39:18,138 --> 00:39:19,835
- If the international community
- 740
- 00:39:19,879 --> 00:39:21,968
- isn't putting
- certain controls on this,
- 741
- 00:39:22,011 --> 00:39:24,666
- people will develop things
- that can do anything.
- 742
- 00:39:24,710 --> 00:39:27,365
- Woman: The letter says
- that we are years, not decades,
- 743
- 00:39:27,408 --> 00:39:28,714
- away from these weapons
- being deployed.
- 744
- 00:39:28,757 --> 00:39:30,106
- So first of all...
- 745
- 00:39:30,150 --> 00:39:32,413
- We had 6,000 signatories
- of that letter,
- 746
- 00:39:32,457 --> 00:39:35,155
- including many of
- the major figures in the field.
- 747
- 00:39:37,026 --> 00:39:39,942
- I'm getting a lot of visits
- from high-ranking officials
- 748
- 00:39:39,986 --> 00:39:42,989
- who wish to emphasize that
- American military dominance
- 749
- 00:39:43,032 --> 00:39:45,731
- is very important,
- and autonomous weapons
- 750
- 00:39:45,774 --> 00:39:50,083
- may be part of
- the Defense Department's plan.
- 751
- 00:39:50,126 --> 00:39:52,433
- That's very, very scary,
- because a value system
- 752
- 00:39:52,477 --> 00:39:54,479
- of military developers
- of technology
- 753
- 00:39:54,522 --> 00:39:57,307
- is not the same as a value
- system of the human race.
- 754
- 00:39:57,351 --> 00:40:00,746
- ♪♪
- 755
- 00:40:00,789 --> 00:40:02,922
- Markoff: Out of the concerns
- about the possibility
- 756
- 00:40:02,965 --> 00:40:06,665
- that this technology might be
- a threat to human existence,
- 757
- 00:40:06,708 --> 00:40:08,144
- a number of the technologists
- 758
- 00:40:08,188 --> 00:40:09,972
- have funded
- the Future of Life Institute
- 759
- 00:40:10,016 --> 00:40:12,192
- to try to grapple
- with these problems.
- 760
- 00:40:13,193 --> 00:40:14,847
- All of these guys are secretive,
- 761
- 00:40:14,890 --> 00:40:16,805
- and so it's interesting
- to me to see them,
- 762
- 00:40:16,849 --> 00:40:20,635
- you know, all together.
- 763
- 00:40:20,679 --> 00:40:24,030
- Everything we have is a result
- of our intelligence.
- 764
- 00:40:24,073 --> 00:40:26,641
- It's not the result
- of our big, scary teeth
- 765
- 00:40:26,685 --> 00:40:29,470
- or our large claws
- or our enormous muscles.
- 766
- 00:40:29,514 --> 00:40:32,473
- It's because we're actually
- relatively intelligent.
- 767
- 00:40:32,517 --> 00:40:35,520
- And among my generation,
- we're all having
- 768
- 00:40:35,563 --> 00:40:37,086
- what we call "holy cow,"
- 769
- 00:40:37,130 --> 00:40:39,045
- or "holy something else"
- moments,
- 770
- 00:40:39,088 --> 00:40:41,003
- because we see
- that the technology
- 771
- 00:40:41,047 --> 00:40:44,180
- is accelerating faster
- than we expected.
- 772
- 00:40:44,224 --> 00:40:46,705
- I remember sitting
- around the table there
- 773
- 00:40:46,748 --> 00:40:50,099
- with some of the best and
- the smartest minds in the world,
- 774
- 00:40:50,143 --> 00:40:52,058
- and what really
- struck me was,
- 775
- 00:40:52,101 --> 00:40:56,149
- maybe the human brain
- is not able to fully grasp
- 776
- 00:40:56,192 --> 00:40:58,673
- the complexity of the world
- that we're confronted with.
- 777
- 00:40:58,717 --> 00:41:01,415
- Russell:
- As it's currently constructed,
- 778
- 00:41:01,459 --> 00:41:04,766
- the road that AI is following
- heads off a cliff,
- 779
- 00:41:04,810 --> 00:41:07,595
- and we need to change
- the direction that we're going
- 780
- 00:41:07,639 --> 00:41:10,729
- so that we don't take
- the human race off the cliff.
- 781
- 00:41:13,558 --> 00:41:17,126
- Musk: Google acquired DeepMind
- several years ago.
- 782
- 00:41:17,170 --> 00:41:18,737
- DeepMind operates
- 783
- 00:41:18,780 --> 00:41:22,088
- as a semi-independent
- subsidiary of Google.
- 784
- 00:41:22,131 --> 00:41:24,960
- The thing that makes
- DeepMind unique
- 785
- 00:41:25,004 --> 00:41:26,919
- is that DeepMind
- is absolutely focused
- 786
- 00:41:26,962 --> 00:41:30,313
- on creating digital
- superintelligence --
- 787
- 00:41:30,357 --> 00:41:34,056
- an AI that is vastly smarter
- than any human on Earth
- 788
- 00:41:34,100 --> 00:41:36,624
- and ultimately smarter than
- all humans on Earth combined.
- 789
- 00:41:36,668 --> 00:41:40,715
- This is from the DeepMind
- reinforcement learning system.
- 790
- 00:41:40,759 --> 00:41:43,544
- Basically wakes up
- like a newborn baby
- 791
- 00:41:43,588 --> 00:41:46,852
- and is shown the screen
- of an Atari video game
- 792
- 00:41:46,895 --> 00:41:50,508
- and then has to learn
- to play the video game.
- 793
- 00:41:50,551 --> 00:41:55,600
- It knows nothing about objects,
- about motion, about time.
- 794
- 00:41:57,602 --> 00:41:59,604
- It only knows that there's
- an image on the screen
- 795
- 00:41:59,647 --> 00:42:02,563
- and there's a score.
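- As a rough sketch of the learning loop being described -- an agent given
- nothing but an observation and a score -- here is a toy tabular Q-learning
- example. The TinyGame environment, the 0..4 positions, and all the constants
- are invented stand-ins, not DeepMind's actual DQN.

import random
from collections import defaultdict

class TinyGame:
    """Hypothetical stand-in for the Atari screen: the agent is told only a
    position 0..4, and the score rises only on reaching the rightmost cell."""
    def reset(self):
        self.pos = 0
        return self.pos
    def step(self, action):  # action: 0 = left, 1 = right
        self.pos = max(0, min(4, self.pos + (1 if action == 1 else -1)))
        return self.pos, (1.0 if self.pos == 4 else 0.0), self.pos == 4

q = defaultdict(float)   # q[(state, action)]: learned estimate of future score
alpha, gamma, epsilon = 0.5, 0.9, 0.1
env = TinyGame()
for episode in range(500):
    state, done = env.reset(), False
    while not done:
        # Trial and error: explore at random (also on ties), otherwise
        # exploit whatever the score has taught us so far.
        if random.random() < epsilon or q[(state, 0)] == q[(state, 1)]:
            action = random.randrange(2)
        else:
            action = 0 if q[(state, 0)] > q[(state, 1)] else 1
        nxt, reward, done = env.step(action)
        q[(state, action)] += alpha * (
            reward + gamma * max(q[(nxt, 0)], q[(nxt, 1)]) - q[(state, action)])
        state = nxt

print("learned first move:", "right" if q[(0, 1)] > q[(0, 0)] else "left")

- After a few hundred episodes the learned values steer the agent right,
- purely because "right" is what made the score go up.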
- 796
- 00:42:02,607 --> 00:42:06,436
- So, if your baby woke up
- the day it was born
- 797
- 00:42:06,480 --> 00:42:08,090
- and, by late afternoon,
- 798
- 00:42:08,134 --> 00:42:11,093
- was playing
- 40 different Atari video games
- 799
- 00:42:11,137 --> 00:42:15,315
- at a superhuman level,
- you would be terrified.
- 800
- 00:42:15,358 --> 00:42:19,101
- You would say, "My baby
- is possessed. Send it back."
- 801
- 00:42:19,145 --> 00:42:23,584
- Musk: The DeepMind system
- can win at any game.
- 802
- 00:42:23,628 --> 00:42:27,588
- It can already beat all
- the original Atari games.
- 803
- 00:42:27,632 --> 00:42:29,155
- It is superhuman.
- 804
- 00:42:29,198 --> 00:42:31,636
- It plays the games at superspeed
- in less than a minute.
- 805
- 00:42:35,640 --> 00:42:37,032
- ♪♪
- 806
- 00:42:37,076 --> 00:42:38,643
- DeepMind turned
- to another challenge,
- 807
- 00:42:38,686 --> 00:42:40,558
- and the challenge
- was the game of Go,
- 808
- 00:42:40,601 --> 00:42:42,603
- which people
- have generally argued
- 809
- 00:42:42,647 --> 00:42:45,084
- has been beyond
- the power of computers
- 810
- 00:42:45,127 --> 00:42:48,304
- to play with
- the best human Go players.
- 811
- 00:42:48,348 --> 00:42:51,264
- First, they challenged
- a European Go champion.
- 812
- 00:42:53,222 --> 00:42:55,834
- Then they challenged
- a Korean Go champion.
- 813
- 00:42:55,877 --> 00:42:57,836
- Man:
- Please start the game.
- 814
- 00:42:57,879 --> 00:42:59,838
- And they were able
- to win both times
- 815
- 00:42:59,881 --> 00:43:02,797
- in kind of striking fashion.
- 816
- 00:43:02,841 --> 00:43:05,017
- Nolan: You were reading articles
- in the New York Times years ago
- 817
- 00:43:05,060 --> 00:43:09,761
- talking about how Go would take
- 100 years for us to solve.
- 818
- 00:43:09,804 --> 00:43:11,110
- Urban:
- People said, "Well, you know,
- 819
- 00:43:11,153 --> 00:43:13,460
- but that's still just a board game.
- 820
- 00:43:13,503 --> 00:43:15,027
- Poker is an art.
- 821
- 00:43:15,070 --> 00:43:16,419
- Poker involves reading people.
- 822
- 00:43:16,463 --> 00:43:18,073
- Poker involves lying
- and bluffing.
- 823
- 00:43:18,117 --> 00:43:19,553
- It's not an exact thing.
- 824
- 00:43:19,597 --> 00:43:21,381
- That will never be,
- you know, a computer.
- 825
- 00:43:21,424 --> 00:43:22,861
- You can't do that."
- 826
- 00:43:22,904 --> 00:43:24,732
- They took the best
- poker players in the world,
- 827
- 00:43:24,776 --> 00:43:27,387
- and it took seven days
- for the computer
- 828
- 00:43:27,430 --> 00:43:30,520
- to start demolishing the humans.
- 829
- 00:43:30,564 --> 00:43:32,261
- So it's the best poker player
- in the world,
- 830
- 00:43:32,305 --> 00:43:34,655
- it's the best Go player in the
- world, and the pattern here
- 831
- 00:43:34,699 --> 00:43:37,440
- is that AI might take
- a little while
- 832
- 00:43:37,484 --> 00:43:40,443
- to wrap its tentacles
- around a new skill,
- 833
- 00:43:40,487 --> 00:43:44,883
- but when it does, when it
- gets it, it is unstoppable.
- 834
- 00:43:44,926 --> 00:43:51,977
- ♪♪
- 835
- 00:43:52,020 --> 00:43:55,110
- DeepMind's AI has
- administrator-level access
- 836
- 00:43:55,154 --> 00:43:57,156
- to Google's servers
- 837
- 00:43:57,199 --> 00:44:00,768
- to optimize energy usage
- at the data centers.
- 838
- 00:44:00,812 --> 00:44:04,816
- However, this could be
- an unintentional Trojan horse.
- 839
- 00:44:04,859 --> 00:44:07,253
- DeepMind has to have complete
- control of the data centers,
- 840
- 00:44:07,296 --> 00:44:08,950
- so with a little
- software update,
- 841
- 00:44:08,994 --> 00:44:10,691
- that AI could take
- complete control
- 842
- 00:44:10,735 --> 00:44:12,214
- of the whole Google system,
- 843
- 00:44:12,258 --> 00:44:13,607
- which means
- they can do anything.
- 844
- 00:44:13,651 --> 00:44:14,913
- They could look
- at all your data.
- 845
- 00:44:14,956 --> 00:44:16,131
- They could do anything.
- 846
- 00:44:16,175 --> 00:44:18,917
- ♪♪
- 847
- 00:44:20,135 --> 00:44:23,051
- We're rapidly heading towards
- digital superintelligence
- 848
- 00:44:23,095 --> 00:44:24,313
- that far exceeds any human.
- 849
- 00:44:24,357 --> 00:44:26,402
- I think it's very obvious.
- 850
- 00:44:26,446 --> 00:44:27,708
- Barrat:
- The problem is, we're not gonna
- 851
- 00:44:27,752 --> 00:44:29,710
- suddenly hit
- human-level intelligence
- 852
- 00:44:29,754 --> 00:44:33,105
- and say,
- "Okay, let's stop research."
- 853
- 00:44:33,148 --> 00:44:34,715
- It's gonna go beyond
- human-level intelligence
- 854
- 00:44:34,759 --> 00:44:36,195
- into what's called
- "superintelligence,"
- 855
- 00:44:36,238 --> 00:44:39,459
- and that's anything
- smarter than us.
- 856
- 00:44:39,502 --> 00:44:41,287
- Tegmark:
- AI at the superhuman level,
- 857
- 00:44:41,330 --> 00:44:42,810
- if we succeed with that,
- will be
- 858
- 00:44:42,854 --> 00:44:46,553
- by far the most powerful
- invention we've ever made
- 859
- 00:44:46,596 --> 00:44:50,296
- and the last invention
- we ever have to make.
- 860
- 00:44:50,339 --> 00:44:53,168
- And if we create AI
- that's smarter than us,
- 861
- 00:44:53,212 --> 00:44:54,735
- we have to be open
- to the possibility
- 862
- 00:44:54,779 --> 00:44:57,520
- that we might actually
- lose control to them.
- 863
- 00:44:57,564 --> 00:45:00,741
- ♪♪
- 864
- 00:45:00,785 --> 00:45:02,612
- Russell: Let's say
- you give it some objective,
- 865
- 00:45:02,656 --> 00:45:04,745
- like curing cancer,
- and then you discover
- 866
- 00:45:04,789 --> 00:45:06,965
- that the way
- it chooses to go about that
- 867
- 00:45:07,008 --> 00:45:08,444
- is actually in conflict
- 868
- 00:45:08,488 --> 00:45:12,405
- with a lot of other things
- you care about.
- 869
- 00:45:12,448 --> 00:45:16,496
- Musk: AI doesn't have to be evil
- to destroy humanity.
- 870
- 00:45:16,539 --> 00:45:20,674
- If AI has a goal, and humanity
- just happens to be in the way,
- 871
- 00:45:20,718 --> 00:45:22,894
- it will destroy humanity
- as a matter of course,
- 872
- 00:45:22,937 --> 00:45:25,113
- without even thinking about it.
- No hard feelings.
- 873
- 00:45:25,157 --> 00:45:27,072
- It's just like
- if we're building a road
- 874
- 00:45:27,115 --> 00:45:29,770
- and an anthill happens
- to be in the way...
- 875
- 00:45:29,814 --> 00:45:31,467
- We don't hate ants.
- 876
- 00:45:31,511 --> 00:45:33,165
- We're just building a road.
- 877
- 00:45:33,208 --> 00:45:34,557
- And so goodbye, anthill.
- 878
- 00:45:34,601 --> 00:45:37,952
- ♪♪
- 879
- 00:45:37,996 --> 00:45:40,172
- It's tempting
- to dismiss these concerns,
- 880
- 00:45:40,215 --> 00:45:42,783
- 'cause it's, like,
- something that might happen
- 881
- 00:45:42,827 --> 00:45:47,396
- in a few decades or 100 years,
- so why worry?
- 882
- 00:45:47,440 --> 00:45:50,704
- Russell: But if you go back
- to September 11, 1933,
- 883
- 00:45:50,748 --> 00:45:52,401
- Ernest Rutherford,
- 884
- 00:45:52,445 --> 00:45:54,795
- who was the most well-known
- nuclear physicist of his time,
- 885
- 00:45:54,839 --> 00:45:56,318
- said that the possibility
- 886
- 00:45:56,362 --> 00:45:58,668
- of ever extracting
- useful amounts of energy
- 887
- 00:45:58,712 --> 00:46:00,801
- from the transmutation
- of atoms, as he called it,
- 888
- 00:46:00,845 --> 00:46:03,151
- was moonshine.
- 889
- 00:46:03,195 --> 00:46:04,849
- The next morning, Leo Szilard,
- 890
- 00:46:04,892 --> 00:46:06,502
- who was a much
- younger physicist,
- 891
- 00:46:06,546 --> 00:46:09,984
- read this and got really annoyed
- and figured out
- 892
- 00:46:10,028 --> 00:46:11,943
- how to make
- a nuclear chain reaction
- 893
- 00:46:11,986 --> 00:46:13,379
- just a few months later.
- 894
- 00:46:13,422 --> 00:46:20,560
- ♪♪
- 895
- 00:46:20,603 --> 00:46:23,693
- We have spent more
- than $2 billion
- 896
- 00:46:23,737 --> 00:46:27,523
- on the greatest
- scientific gamble in history.
- 897
- 00:46:27,567 --> 00:46:30,222
- Russell: So when people say
- that, "Oh, this is so far off
- 898
- 00:46:30,265 --> 00:46:32,528
- in the future, we don't have
- to worry about it,"
- 899
- 00:46:32,572 --> 00:46:36,271
- it might only be three, four
- breakthroughs of that magnitude
- 900
- 00:46:36,315 --> 00:46:40,275
- that will get us from here
- to superintelligent machines.
- 901
- 00:46:40,319 --> 00:46:42,974
- Tegmark: If it's gonna take
- 20 years to figure out
- 902
- 00:46:43,017 --> 00:46:45,237
- how to keep AI beneficial,
- 903
- 00:46:45,280 --> 00:46:48,849
- then we should start today,
- not at the last second
- 904
- 00:46:48,893 --> 00:46:51,460
- when some dudes
- drinking Red Bull
- 905
- 00:46:51,504 --> 00:46:53,332
- decide to flip the switch
- and test the thing.
- 906
- 00:46:53,375 --> 00:46:56,770
- ♪♪
- 907
- 00:46:56,814 --> 00:46:58,859
- Musk:
- We have five years.
- 908
- 00:46:58,903 --> 00:47:00,600
- I think
- digital superintelligence
- 909
- 00:47:00,643 --> 00:47:03,864
- will happen in my lifetime.
- 910
- 00:47:03,908 --> 00:47:05,735
- 100%.
- 911
- 00:47:05,779 --> 00:47:07,215
- Barrat: When this happens,
- 912
- 00:47:07,259 --> 00:47:09,696
- it will be surrounded
- by a bunch of people
- 913
- 00:47:09,739 --> 00:47:13,091
- who are really just excited
- about the technology.
- 914
- 00:47:13,134 --> 00:47:15,571
- They want to see it succeed,
- but they're not anticipating
- 915
- 00:47:15,615 --> 00:47:16,964
- that it can get out of control.
- 916
- 00:47:17,008 --> 00:47:24,450
- ♪♪
- 917
- 00:47:25,494 --> 00:47:28,584
- Oh, my God, I trust
- my computer so much.
- 918
- 00:47:28,628 --> 00:47:30,195
- That's an amazing question.
- 919
- 00:47:30,238 --> 00:47:31,457
- I don't trust
- my computer.
- 920
- 00:47:31,500 --> 00:47:32,937
- If it's on,
- I take it off.
- 921
- 00:47:32,980 --> 00:47:34,242
- Like, even when it's off,
- 922
- 00:47:34,286 --> 00:47:35,896
- I still think it's on.
- Like, you know?
- 923
- 00:47:35,940 --> 00:47:37,637
- Like, you really cannot tru--
- Like, the webcams,
- 924
- 00:47:37,680 --> 00:47:39,595
- you don't know if, like,
- someone might turn it...
- 925
- 00:47:39,639 --> 00:47:41,249
- You don't know, like.
- 926
- 00:47:41,293 --> 00:47:42,903
- I don't trust my computer.
- 927
- 00:47:42,947 --> 00:47:46,907
- Like, in my phone,
- every time they ask me
- 928
- 00:47:46,951 --> 00:47:49,475
- "Can we send your
- information to Apple?"
- 929
- 00:47:49,518 --> 00:47:50,998
- every time, I...
- 930
- 00:47:51,042 --> 00:47:53,087
- So, I don't trust my phone.
- 931
- 00:47:53,131 --> 00:47:56,743
- Okay. So, part of it is,
- yes, I do trust it,
- 932
- 00:47:56,786 --> 00:48:00,660
- because it would be really
- hard to get through the day
- 933
- 00:48:00,703 --> 00:48:04,011
- in the way our world is
- set up without computers.
- 934
- 00:48:04,055 --> 00:48:05,360
- ♪♪
- 935
- 00:48:10,975 --> 00:48:13,368
- Dr. Herman: Trust is
- such a human experience.
- 936
- 00:48:13,412 --> 00:48:21,246
- ♪♪
- 937
- 00:48:21,289 --> 00:48:25,119
- I have a patient coming in
- with an intracranial aneurysm.
- 938
- 00:48:25,163 --> 00:48:29,994
- ♪♪
- 939
- 00:48:30,037 --> 00:48:31,691
- They want to look
- in my eyes and know
- 940
- 00:48:31,734 --> 00:48:34,955
- that they can trust
- this person with their life.
- 941
- 00:48:34,999 --> 00:48:39,394
- I'm not horribly concerned
- about anything.
- 942
- 00:48:39,438 --> 00:48:40,830
- Good.
- Part of that
- 943
- 00:48:40,874 --> 00:48:42,920
- is because
- I have confidence in you.
- 944
- 00:48:42,963 --> 00:48:50,710
- ♪♪
- 945
- 00:48:50,753 --> 00:48:52,233
- This procedure
- we're doing today
- 946
- 00:48:52,277 --> 00:48:57,151
- 20 years ago
- was essentially impossible.
- 947
- 00:48:57,195 --> 00:49:00,328
- We just didn't have the
- materials and the technologies.
- 948
- 00:49:04,202 --> 00:49:13,385
- ♪♪
- 949
- 00:49:13,428 --> 00:49:22,655
- ♪♪
- 950
- 00:49:22,698 --> 00:49:26,485
- So, the coil is barely
- in there right now.
- 951
- 00:49:26,528 --> 00:49:29,923
- It's just a feather
- holding it in.
- 952
- 00:49:29,967 --> 00:49:32,012
- It's nervous time.
- 953
- 00:49:32,056 --> 00:49:36,147
- ♪♪
- 954
- 00:49:36,190 --> 00:49:37,626
- We're just in purgatory,
- 955
- 00:49:37,670 --> 00:49:40,673
- intellectual,
- humanistic purgatory,
- 956
- 00:49:40,716 --> 00:49:43,632
- and AI might know
- exactly what to do here.
- 957
- 00:49:43,676 --> 00:49:50,596
- ♪♪
- 958
- 00:49:50,639 --> 00:49:52,554
- We've got the coil
- into the aneurysm.
- 959
- 00:49:52,598 --> 00:49:54,556
- But it wasn't in
- tremendously well
- 960
- 00:49:54,600 --> 00:49:56,428
- that I knew that it would stay,
- 961
- 00:49:56,471 --> 00:50:01,041
- so with a maybe 20% risk
- of a very bad situation,
- 962
- 00:50:01,085 --> 00:50:04,436
- I elected
- to just bring her back.
- 963
- 00:50:04,479 --> 00:50:05,959
- Because of my relationship
- with her
- 964
- 00:50:06,003 --> 00:50:08,222
- and knowing the difficulties
- of coming in
- 965
- 00:50:08,266 --> 00:50:11,051
- and having the procedure,
- I consider things,
- 966
- 00:50:11,095 --> 00:50:14,272
- when I should only consider
- the safest possible route
- 967
- 00:50:14,315 --> 00:50:16,361
- to achieve success.
- 968
- 00:50:16,404 --> 00:50:19,755
- But I had to stand there for
- 10 minutes agonizing about it.
- 969
- 00:50:19,799 --> 00:50:21,757
- The computer feels nothing.
- 970
- 00:50:21,801 --> 00:50:24,760
- The computer just does
- what it's supposed to do,
- 971
- 00:50:24,804 --> 00:50:26,284
- better and better.
- 972
- 00:50:26,327 --> 00:50:30,288
- ♪♪
- 973
- 00:50:30,331 --> 00:50:32,551
- I want to be AI in this case.
- 974
- 00:50:35,945 --> 00:50:38,861
- But can AI be compassionate?
- 975
- 00:50:38,905 --> 00:50:43,040
- ♪♪
- 976
- 00:50:43,083 --> 00:50:47,827
- I mean, it's everybody's
- question about AI.
- 977
- 00:50:47,870 --> 00:50:51,961
- We are the sole
- embodiment of humanity,
- 978
- 00:50:52,005 --> 00:50:55,269
- and it's a stretch for us
- to accept that a machine
- 979
- 00:50:55,313 --> 00:50:58,794
- can be compassionate
- and loving in that way.
- 980
- 00:50:58,838 --> 00:51:05,105
- ♪♪
- 981
- 00:51:05,149 --> 00:51:07,281
- Part of me
- doesn't believe in magic,
- 982
- 00:51:07,325 --> 00:51:09,805
- but part of me has faith
- that there is something
- 983
- 00:51:09,849 --> 00:51:11,546
- beyond the sum of the parts,
- 984
- 00:51:11,590 --> 00:51:15,637
- that there is at least a oneness
- in our shared ancestry,
- 985
- 00:51:15,681 --> 00:51:20,338
- our shared biology,
- our shared history.
- 986
- 00:51:20,381 --> 00:51:23,210
- Some connection there
- beyond machine.
- 987
- 00:51:23,254 --> 00:51:30,304
- ♪♪
- 988
- 00:51:30,348 --> 00:51:32,567
- So, then, you have
- the other side of that, is,
- 989
- 00:51:32,611 --> 00:51:34,047
- does the computer
- know it's conscious,
- 990
- 00:51:34,091 --> 00:51:37,137
- or can it be conscious,
- or does it care?
- 991
- 00:51:37,181 --> 00:51:40,009
- Does it need to be conscious?
- 992
- 00:51:40,053 --> 00:51:42,011
- Does it need to be aware?
- 993
- 00:51:42,055 --> 00:51:47,365
- ♪♪
- 994
- 00:51:47,408 --> 00:51:52,848
- ♪♪
- 995
- 00:51:52,892 --> 00:51:56,417
- I do not think that a robot
- could ever be conscious.
- 996
- 00:51:56,461 --> 00:51:58,376
- Unless they programmed it
- that way.
- 997
- 00:51:58,419 --> 00:52:00,639
- Conscious? No.
- 998
- 00:52:00,682 --> 00:52:03,163
- No.
- No.
- 999
- 00:52:03,207 --> 00:52:06,035
- I mean, I think a robot could be
- programmed to be conscious.
- 1000
- 00:52:06,079 --> 00:52:09,648
- How are they programmed
- to do everything else?
- 1001
- 00:52:09,691 --> 00:52:12,390
- That's another big part
- of artificial intelligence,
- 1002
- 00:52:12,433 --> 00:52:15,741
- is to make them conscious
- and make them feel.
- 1003
- 00:52:17,003 --> 00:52:22,400
- ♪♪
- 1004
- 00:52:22,443 --> 00:52:26,230
- Lipson: Back in 2005, we started
- trying to build machines
- 1005
- 00:52:26,273 --> 00:52:27,709
- with self-awareness.
- 1006
- 00:52:27,753 --> 00:52:33,062
- ♪♪
- 1007
- 00:52:33,106 --> 00:52:37,284
- This robot, to begin with,
- didn't know what it was.
- 1008
- 00:52:37,328 --> 00:52:40,244
- All it knew was that it needed
- to do something like walk.
- 1009
- 00:52:40,287 --> 00:52:44,073
- ♪♪
- 1010
- 00:52:44,117 --> 00:52:45,597
- Through trial and error,
- 1011
- 00:52:45,640 --> 00:52:49,731
- it figured out how to walk
- using its imagination,
- 1012
- 00:52:49,775 --> 00:52:54,040
- and then it walked away.
- 1013
- 00:52:54,083 --> 00:52:56,390
- And then we did
- something very cruel.
- 1014
- 00:52:56,434 --> 00:52:58,653
- We chopped off a leg
- and watched what happened.
- 1015
- 00:52:58,697 --> 00:53:03,005
- ♪♪
- 1016
- 00:53:03,049 --> 00:53:07,749
- At the beginning, it didn't
- quite know what had happened.
- 1017
- 00:53:07,793 --> 00:53:13,233
- But over about a period
- of a day, it then began to limp.
- 1018
- 00:53:13,277 --> 00:53:16,845
- And then, a year ago,
- we were training an AI system
- 1019
- 00:53:16,889 --> 00:53:20,240
- for a live demonstration.
- 1020
- 00:53:20,284 --> 00:53:21,763
- We wanted to show how we wave
- 1021
- 00:53:21,807 --> 00:53:24,113
- all these objects
- in front of the camera
- 1022
- 00:53:24,157 --> 00:53:27,334
- and the AI could
- recognize the objects.
- 1023
- 00:53:27,378 --> 00:53:29,031
- And so, we're preparing
- this demo,
- 1024
- 00:53:29,075 --> 00:53:31,251
- and we had on a side screen
- this ability
- 1025
- 00:53:31,295 --> 00:53:36,778
- to watch what certain
- neurons were responding to.
- 1026
- 00:53:36,822 --> 00:53:39,041
- And suddenly we noticed
- that one of the neurons
- 1027
- 00:53:39,085 --> 00:53:41,087
- was tracking faces.
- 1028
- 00:53:41,130 --> 00:53:45,483
- It was tracking our faces
- as we were moving around.
- 1029
- 00:53:45,526 --> 00:53:48,616
- Now, the spooky thing about this
- is that we never trained
- 1030
- 00:53:48,660 --> 00:53:52,490
- the system
- to recognize human faces,
- 1031
- 00:53:52,533 --> 00:53:55,710
- and yet, somehow,
- it learned to do that.
- 1032
- 00:53:57,973 --> 00:53:59,584
- Even though these robots
- are very simple,
- 1033
- 00:53:59,627 --> 00:54:02,500
- we can see there's
- something else going on there.
- 1034
- 00:54:02,543 --> 00:54:05,851
- It's not just programming.
- 1035
- 00:54:05,894 --> 00:54:08,462
- So, this is just the beginning.
- 1036
- 00:54:10,377 --> 00:54:14,294
- Horvitz: I often think about
- that beach in Kitty Hawk,
- 1037
- 00:54:14,338 --> 00:54:18,255
- the 1903 flight
- by Orville and Wilbur Wright.
- 1038
- 00:54:21,214 --> 00:54:24,348
- It was kind of a canvas plane,
- and it's wood and iron,
- 1039
- 00:54:24,391 --> 00:54:26,828
- and it gets off the ground for,
- what, a minute and 20 seconds,
- 1040
- 00:54:26,872 --> 00:54:29,091
- on this windy day
- 1041
- 00:54:29,135 --> 00:54:31,006
- before touching back down again.
- 1042
- 00:54:33,270 --> 00:54:37,143
- And it was
- just around 65 summers or so
- 1043
- 00:54:37,186 --> 00:54:43,149
- after that moment that you have
- a 747 taking off from JFK...
- 1044
- 00:54:43,192 --> 00:54:50,156
- ♪♪
- 1045
- 00:54:50,199 --> 00:54:51,984
- ...where a major concern
- of someone on the airplane
- 1046
- 00:54:52,027 --> 00:54:55,422
- might be whether or not
- their salt-free diet meal
- 1047
- 00:54:55,466 --> 00:54:56,902
- is gonna be coming to them
- or not.
- 1048
- 00:54:56,945 --> 00:54:58,469
- We have a whole infrastructure,
- 1049
- 00:54:58,512 --> 00:55:01,385
- with travel agents
- and tower control,
- 1050
- 00:55:01,428 --> 00:55:03,778
- and it's all casual,
- and it's all part of the world.
- 1051
- 00:55:03,822 --> 00:55:07,042
- ♪♪
- 1052
- 00:55:07,086 --> 00:55:09,523
- Right now, as far
- as we've come with machines
- 1053
- 00:55:09,567 --> 00:55:12,134
- that think and solve problems,
- we're at Kitty Hawk now.
- 1054
- 00:55:12,178 --> 00:55:13,745
- We're in the wind.
- 1055
- 00:55:13,788 --> 00:55:17,052
- We have our tattered-canvas
- planes up in the air.
- 1056
- 00:55:17,096 --> 00:55:20,882
- ♪♪
- 1057
- 00:55:20,926 --> 00:55:23,885
- But what happens
- in 65 summers or so?
- 1058
- 00:55:23,929 --> 00:55:27,889
- We will have machines
- that are beyond human control.
- 1059
- 00:55:27,933 --> 00:55:30,457
- Should we worry about that?
- 1060
- 00:55:30,501 --> 00:55:32,590
- ♪♪
- 1061
- 00:55:32,633 --> 00:55:34,853
- I'm not sure it's going to help.
- 1062
- 00:55:40,337 --> 00:55:44,036
- Kaplan: Nobody has any idea
- today what it means for a robot
- 1063
- 00:55:44,079 --> 00:55:46,430
- to be conscious.
- 1064
- 00:55:46,473 --> 00:55:48,649
- There is no such thing.
- 1065
- 00:55:48,693 --> 00:55:50,172
- There are a lot of smart people,
- 1066
- 00:55:50,216 --> 00:55:53,088
- and I have a great deal
- of respect for them,
- 1067
- 00:55:53,132 --> 00:55:57,528
- but the truth is, machines
- are natural psychopaths.
- 1068
- 00:55:57,571 --> 00:55:59,225
- Man:
- Fear came back into the market.
- 1069
- 00:55:59,268 --> 00:56:01,706
- Man #2: Went down 800,
- nearly 1,000, in a heartbeat.
- 1070
- 00:56:01,749 --> 00:56:03,360
- I mean,
- it is classic capitulation.
- 1071
- 00:56:03,403 --> 00:56:04,796
- There are some people
- who are proposing
- 1072
- 00:56:04,839 --> 00:56:07,146
- it was some kind
- of fat-finger error.
- 1073
- 00:56:07,189 --> 00:56:09,583
- Take the Flash Crash of 2010.
- 1074
- 00:56:09,627 --> 00:56:13,413
- In a matter of minutes,
- $1 trillion in value
- 1075
- 00:56:13,457 --> 00:56:15,415
- was lost in the stock market.
- 1076
- 00:56:15,459 --> 00:56:18,984
- Woman: The Dow dropped nearly
- 1,000 points in a half-hour.
- 1077
- 00:56:19,027 --> 00:56:22,553
- Kaplan:
- So, what went wrong?
- 1078
- 00:56:22,596 --> 00:56:26,644
- By that point in time,
- more than 60% of all the trades
- 1079
- 00:56:26,687 --> 00:56:29,124
- that took place
- on the stock exchange
- 1080
- 00:56:29,168 --> 00:56:32,693
- were actually being
- initiated by computers.
- 1081
- 00:56:32,737 --> 00:56:34,216
- Man:
- Panic selling on the way down,
- 1082
- 00:56:34,260 --> 00:56:35,783
- and all of a sudden
- it stopped on a dime.
- 1083
- 00:56:35,827 --> 00:56:37,611
- Man #2: This is all happening
- in real time, folks.
- 1084
- 00:56:37,655 --> 00:56:39,526
- Wisz: The short story of what
- happened in the Flash Crash
- 1085
- 00:56:39,570 --> 00:56:42,399
- is that algorithms
- responded to algorithms,
- 1086
- 00:56:42,442 --> 00:56:45,358
- and it compounded upon itself
- over and over and over again
- 1087
- 00:56:45,402 --> 00:56:47,012
- in a matter of minutes.
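- A deliberately crude sketch of "algorithms responding to algorithms":
- two hypothetical selling bots, each treating the other's selling as a
- signal to sell harder. Every number here is invented; the point is only
- the self-compounding feedback loop.

price = 100.0
for tick in range(10):
    drop_a = 0.15 * (100.0 - price) + 0.5  # bot A sells harder the further price has fallen
    drop_b = 1.2 * drop_a                  # bot B reacts to A's selling and amplifies it
    price -= drop_a + drop_b
    print(f"tick {tick}: price {price:.2f}")

- Each bot's output is the other's input, so the drops feed on themselves
- within a handful of ticks -- no malice, no understanding, just the loop.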
- 1088
- 00:56:47,055 --> 00:56:50,972
- Man: At one point, the market
- fell as if down a well.
- 1089
- 00:56:51,016 --> 00:56:54,323
- There is no regulatory body
- that can adapt quickly enough
- 1090
- 00:56:54,367 --> 00:56:57,979
- to prevent potentially
- disastrous consequences
- 1091
- 00:56:58,023 --> 00:57:01,243
- of AI operating
- in our financial systems.
- 1092
- 00:57:01,287 --> 00:57:03,898
- They are so prime
- for manipulation.
- 1093
- 00:57:03,942 --> 00:57:05,639
- Let's talk about the speed
- with which
- 1094
- 00:57:05,683 --> 00:57:08,076
- we are watching
- this market deteriorate.
- 1095
- 00:57:08,120 --> 00:57:11,602
- That's the type of AI-run-amuck
- that scares people.
- 1096
- 00:57:11,645 --> 00:57:13,560
- Kaplan:
- When you give them a goal,
- 1097
- 00:57:13,604 --> 00:57:17,825
- they will relentlessly
- pursue that goal.
- 1098
- 00:57:17,869 --> 00:57:20,393
- How many computer programs
- are there like this?
- 1099
- 00:57:20,437 --> 00:57:23,483
- Nobody knows.
- 1100
- 00:57:23,527 --> 00:57:27,444
- Kosinski: One of the fascinating
- aspects about AI in general
- 1101
- 00:57:27,487 --> 00:57:31,970
- is that no one really
- understands how it works.
- 1102
- 00:57:32,013 --> 00:57:36,975
- Even the people who create AI
- don't really fully understand.
- 1103
- 00:57:37,018 --> 00:57:39,804
- Because it has millions
- of elements,
- 1104
- 00:57:39,847 --> 00:57:41,675
- it becomes completely impossible
- 1105
- 00:57:41,719 --> 00:57:45,113
- for a human being
- to understand what's going on.
- 1106
- 00:57:45,157 --> 00:57:52,512
- ♪♪
- 1107
- 00:57:52,556 --> 00:57:56,037
- Grassegger: Microsoft had set up
- this artificial intelligence
- 1108
- 00:57:56,081 --> 00:57:59,127
- called Tay on Twitter,
- which was a chatbot.
- 1109
- 00:58:00,912 --> 00:58:02,696
- They started out in the morning,
- 1110
- 00:58:02,740 --> 00:58:06,526
- and Tay was starting to tweet
- and learning from stuff
- 1111
- 00:58:06,570 --> 00:58:10,835
- that was being sent to him
- from other Twitter people.
- 1112
- 00:58:10,878 --> 00:58:13,272
- Because some people,
- like trolls, attacked him,
- 1113
- 00:58:13,315 --> 00:58:18,582
- within 24 hours, the Microsoft
- bot became a terrible person.
- 1114
- 00:58:18,625 --> 00:58:21,367
- They had to literally
- pull Tay off the Net
- 1115
- 00:58:21,410 --> 00:58:24,718
- because he had turned
- into a monster.
- 1116
- 00:58:24,762 --> 00:58:30,550
- A misanthropic, racist, horrible
- person you'd never want to meet.
- 1117
- 00:58:30,594 --> 00:58:32,857
- And nobody had foreseen this.
- 1118
- 00:58:35,337 --> 00:58:38,602
- The whole idea of AI is that
- we are not telling it exactly
- 1119
- 00:58:38,645 --> 00:58:42,780
- how to achieve a given
- outcome or a goal.
- 1120
- 00:58:42,823 --> 00:58:46,435
- AI develops on its own.
- 1121
- 00:58:46,479 --> 00:58:48,829
- Nolan: We're worried about
- superintelligent AI,
- 1122
- 00:58:48,873 --> 00:58:52,790
- the master chess player
- that will outmaneuver us,
- 1123
- 00:58:52,833 --> 00:58:55,923
- but AI won't have to
- actually be that smart
- 1124
- 00:58:55,967 --> 00:59:00,145
- to have massively disruptive
- effects on human civilization.
- 1125
- 00:59:00,188 --> 00:59:01,886
- We've seen over the last century
- 1126
- 00:59:01,929 --> 00:59:05,150
- it doesn't necessarily take
- a genius to knock history off
- 1127
- 00:59:05,193 --> 00:59:06,804
- in a particular direction,
- 1128
- 00:59:06,847 --> 00:59:09,589
- and it won't take a genius AI
- to do the same thing.
- 1129
- 00:59:09,633 --> 00:59:13,158
- Bogus election news stories
- generated more engagement
- 1130
- 00:59:13,201 --> 00:59:17,075
- on Facebook
- than top real stories.
- 1131
- 00:59:17,118 --> 00:59:21,079
- Facebook really is
- the elephant in the room.
- 1132
- 00:59:21,122 --> 00:59:23,777
- Kosinski:
- AI running Facebook news feed --
- 1133
- 00:59:23,821 --> 00:59:28,347
- The task for AI
- is keeping users engaged,
- 1134
- 00:59:28,390 --> 00:59:29,827
- but no one really understands
- 1135
- 00:59:29,870 --> 00:59:34,832
- exactly how this AI
- is achieving this goal.
- 1136
- 00:59:34,875 --> 00:59:38,792
- Nolan: Facebook is building an
- elegant mirrored wall around us.
- 1137
- 00:59:38,836 --> 00:59:41,665
- A mirror that we can ask,
- "Who's the fairest of them all?"
- 1138
- 00:59:41,708 --> 00:59:45,016
- and it will answer, "You, you,"
- time and again
- 1139
- 00:59:45,059 --> 00:59:48,193
- and slowly begin
- to warp our sense of reality,
- 1140
- 00:59:48,236 --> 00:59:53,502
- warp our sense of politics,
- history, global events,
- 1141
- 00:59:53,546 --> 00:59:57,028
- until determining what's true
- and what's not true
- 1142
- 00:59:57,071 --> 00:59:58,943
- is virtually impossible.
- 1143
- 01:00:01,032 --> 01:00:03,861
- The problem is that AI
- doesn't understand that.
- 1144
- 01:00:03,904 --> 01:00:08,039
- AI just had a mission --
- maximize user engagement,
- 1145
- 01:00:08,082 --> 01:00:10,041
- and it achieved that.
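- A toy version of the objective being described: rank posts purely by
- predicted engagement. The posts and p_click numbers are fabricated; note
- that accuracy is stored but never consulted by the sort.

posts = [
    {"headline": "Calm, accurate report", "p_click": 0.03, "accurate": True},
    {"headline": "Outrageous fabrication", "p_click": 0.21, "accurate": False},
]
feed = sorted(posts, key=lambda p: p["p_click"], reverse=True)
for post in feed:
    print(post["headline"])  # the fabrication earns the top slot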
- 1146
- 01:00:10,084 --> 01:00:13,653
- Nearly 2 billion people
- spend nearly one hour
- 1147
- 01:00:13,697 --> 01:00:17,831
- on average a day
- basically interacting with AI
- 1148
- 01:00:17,875 --> 01:00:21,530
- that is shaping
- their experience.
- 1149
- 01:00:21,574 --> 01:00:24,664
- Even Facebook engineers,
- they don't like fake news.
- 1150
- 01:00:24,708 --> 01:00:26,666
- It's very bad business.
- 1151
- 01:00:26,710 --> 01:00:28,015
- They want to get rid
- of fake news.
- 1152
- 01:00:28,059 --> 01:00:29,974
- It's just very difficult
- to do because,
- 1153
- 01:00:30,017 --> 01:00:32,324
- how do you recognize news
- as fake
- 1154
- 01:00:32,367 --> 01:00:34,456
- if you cannot read
- all of that news personally?
- 1155
- 01:00:34,500 --> 01:00:39,418
- There's so much
- active misinformation
- 1156
- 01:00:39,461 --> 01:00:41,115
- and it's packaged very well,
- 1157
- 01:00:41,159 --> 01:00:44,553
- and it looks the same when
- you see it on a Facebook page
- 1158
- 01:00:44,597 --> 01:00:47,426
- or you turn on your television.
- 1159
- 01:00:47,469 --> 01:00:49,210
- Nolan:
- It's not terribly sophisticated,
- 1160
- 01:00:49,254 --> 01:00:51,691
- but it is terribly powerful.
- 1161
- 01:00:51,735 --> 01:00:54,346
- And what it means is
- that your view of the world,
- 1162
- 01:00:54,389 --> 01:00:56,435
- which, 20 years ago,
- was determined,
- 1163
- 01:00:56,478 --> 01:01:00,004
- if you watched the nightly news,
- by three different networks,
- 1164
- 01:01:00,047 --> 01:01:02,528
- the three anchors who endeavored
- to try to get it right.
- 1165
- 01:01:02,571 --> 01:01:04,225
- Might have had a little bias
- one way or the other,
- 1166
- 01:01:04,269 --> 01:01:05,923
- but, largely speaking,
- we could all agree
- 1167
- 01:01:05,966 --> 01:01:08,273
- on an objective reality.
- 1168
- 01:01:08,316 --> 01:01:10,754
- Well, that objectivity is gone,
- 1169
- 01:01:10,797 --> 01:01:13,757
- and Facebook has
- completely annihilated it.
- 1170
- 01:01:13,800 --> 01:01:17,064
- ♪♪
- 1171
- 01:01:17,108 --> 01:01:19,197
- If most of your understanding
- of how the world works
- 1172
- 01:01:19,240 --> 01:01:20,807
- is derived from Facebook,
- 1173
- 01:01:20,851 --> 01:01:23,418
- facilitated
- by algorithmic software
- 1174
- 01:01:23,462 --> 01:01:27,118
- that tries to show you
- the news you want to see,
- 1175
- 01:01:27,161 --> 01:01:28,815
- that's a terribly
- dangerous thing.
- 1176
- 01:01:28,859 --> 01:01:33,080
- And the idea that we have not
- only set that in motion,
- 1177
- 01:01:33,124 --> 01:01:37,258
- but allowed bad-faith actors
- access to that information...
- 1178
- 01:01:37,302 --> 01:01:39,565
- I mean, this is a recipe
- for disaster.
- 1179
- 01:01:39,608 --> 01:01:43,134
- ♪♪
- 1180
- 01:01:43,177 --> 01:01:45,876
- Urban: I think that there will
- definitely be lots of bad actors
- 1181
- 01:01:45,919 --> 01:01:48,922
- trying to manipulate the world
- with AI.
- 1182
- 01:01:48,966 --> 01:01:52,143
- 2016 was a perfect example
- of an election
- 1183
- 01:01:52,186 --> 01:01:55,015
- where there was lots of AI
- producing lots of fake news
- 1184
- 01:01:55,059 --> 01:01:58,323
- and distributing it
- for a purpose, for a result.
- 1185
- 01:01:59,890 --> 01:02:02,283
- Ladies and gentlemen,
- honorable colleagues...
- 1186
- 01:02:02,327 --> 01:02:04,546
- it's my privilege
- to speak to you today
- 1187
- 01:02:04,590 --> 01:02:07,985
- about the power of big data
- and psychographics
- 1188
- 01:02:08,028 --> 01:02:09,682
- in the electoral process
- 1189
- 01:02:09,726 --> 01:02:12,206
- and, specifically,
- to talk about the work
- 1190
- 01:02:12,250 --> 01:02:14,513
- that we contributed
- to Senator Cruz's
- 1191
- 01:02:14,556 --> 01:02:16,558
- presidential primary campaign.
- 1192
- 01:02:16,602 --> 01:02:19,910
- Nolan: Cambridge Analytica
- emerged quietly as a company
- 1193
- 01:02:19,953 --> 01:02:21,563
- that, according to its own hype,
- 1194
- 01:02:21,607 --> 01:02:26,307
- has the ability to use
- this tremendous amount of data
- 1195
- 01:02:26,351 --> 01:02:30,137
- in order
- to effect societal change.
- 1196
- 01:02:30,181 --> 01:02:33,358
- In 2016, they had
- three major clients.
- 1197
- 01:02:33,401 --> 01:02:34,794
- Ted Cruz was one of them.
- 1198
- 01:02:34,838 --> 01:02:37,884
- It's easy to forget
- that, only 18 months ago,
- 1199
- 01:02:37,928 --> 01:02:41,148
- Senator Cruz was one of
- the less popular candidates
- 1200
- 01:02:41,192 --> 01:02:42,846
- seeking nomination.
- 1201
- 01:02:42,889 --> 01:02:47,241
- So, what was not possible maybe,
- like, 10 or 15 years ago,
- 1202
- 01:02:47,285 --> 01:02:49,374
- was that you can send fake news
- 1203
- 01:02:49,417 --> 01:02:52,420
- to exactly the people
- that you want to send it to.
- 1204
- 01:02:52,464 --> 01:02:56,685
- And then you could actually see
- how he or she reacts on Facebook
- 1205
- 01:02:56,729 --> 01:02:58,905
- and then adjust that information
- 1206
- 01:02:58,949 --> 01:03:01,778
- according to the feedback
- that you got.
- 1207
- 01:03:01,821 --> 01:03:03,257
- So you can start developing
- 1208
- 01:03:03,301 --> 01:03:06,130
- kind of a real-time management
- of a population.
- 1209
- 01:03:06,173 --> 01:03:08,697
- In this case, we've zoned in
- 1210
- 01:03:08,741 --> 01:03:10,699
- on a group
- we've called "Persuasion."
- 1211
- 01:03:10,743 --> 01:03:13,746
- These are people who are
- definitely going to vote,
- 1212
- 01:03:13,790 --> 01:03:16,705
- to caucus, but they need
- moving from the center
- 1213
- 01:03:16,749 --> 01:03:18,490
- a little bit more
- towards the right
- 1214
- 01:03:18,533 --> 01:03:19,708
- in order to support Cruz.
- 1215
- 01:03:19,752 --> 01:03:22,059
- They need a persuasion message.
- 1216
- 01:03:22,102 --> 01:03:23,800
- "Gun rights," I've selected.
- 1217
- 01:03:23,843 --> 01:03:25,802
- That narrows the field
- slightly more.
- 1218
- 01:03:25,845 --> 01:03:29,066
- And now we know that we need
- a message on gun rights,
- 1219
- 01:03:29,109 --> 01:03:31,111
- it needs to be
- a persuasion message,
- 1220
- 01:03:31,155 --> 01:03:32,591
- and it needs to be nuanced
- 1221
- 01:03:32,634 --> 01:03:34,201
- according to
- the certain personality
- 1222
- 01:03:34,245 --> 01:03:36,029
- that we're interested in.
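- What the demo narration walks through is, in effect, a filter plus a
- message branch. This reconstruction is entirely hypothetical -- invented
- fields, thresholds, and trait -- but it shows the shape of the workflow.

voters = [
    {"id": 1, "will_vote": 0.92, "persuadable": True,
     "top_issue": "gun rights", "neuroticism": 0.8},
    {"id": 2, "will_vote": 0.15, "persuadable": True,
     "top_issue": "gun rights", "neuroticism": 0.2},
]
# Step 1: definitely voting, movable, and matched to the chosen issue.
audience = [v for v in voters
            if v["will_vote"] > 0.5 and v["persuadable"]
            and v["top_issue"] == "gun rights"]
# Step 2: nuance the message by personality.
for v in audience:
    tone = "fear-based" if v["neuroticism"] > 0.5 else "tradition-based"
    print(f"voter {v['id']}: send {tone} gun-rights persuasion message")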
- 1223
- 01:03:36,073 --> 01:03:39,946
- Through social media, there's an
- infinite amount of information
- 1224
- 01:03:39,990 --> 01:03:42,514
- that you can gather
- about a person.
- 1225
- 01:03:42,557 --> 01:03:45,734
- We have somewhere close
- to 4,000 or 5,000 data points
- 1226
- 01:03:45,778 --> 01:03:48,563
- on every adult
- in the United States.
- 1227
- 01:03:48,607 --> 01:03:51,915
- Grassegger: It's about targeting
- the individual.
- 1228
- 01:03:51,958 --> 01:03:54,352
- It's like a weapon,
- which can be used
- 1229
- 01:03:54,395 --> 01:03:55,962
- in the totally wrong direction.
- 1230
- 01:03:56,006 --> 01:03:58,051
- That's the problem
- with all of this data.
- 1231
- 01:03:58,095 --> 01:04:02,229
- It's almost as if we built the
- bullet before we built the gun.
- 1232
- 01:04:02,273 --> 01:04:04,362
- Ted Cruz employed our data,
- 1233
- 01:04:04,405 --> 01:04:06,407
- our behavioral insights.
- 1234
- 01:04:06,451 --> 01:04:09,541
- He started from a base
- of less than 5%
- 1235
- 01:04:09,584 --> 01:04:15,590
- and had a very slow-and-steady-
- but-firm rise to above 35%,
- 1236
- 01:04:15,634 --> 01:04:17,157
- making him, obviously,
- 1237
- 01:04:17,201 --> 01:04:20,465
- the second most threatening
- contender in the race.
- 1238
- 01:04:20,508 --> 01:04:23,120
- Now, clearly, the Cruz
- campaign is over now,
- 1239
- 01:04:23,163 --> 01:04:24,904
- but what I can tell you
- 1240
- 01:04:24,948 --> 01:04:28,168
- is that of the two candidates
- left in this election,
- 1241
- 01:04:28,212 --> 01:04:30,867
- one of them is using
- these technologies.
- 1242
- 01:04:32,564 --> 01:04:35,959
- I, Donald John Trump,
- do solemnly swear
- 1243
- 01:04:36,002 --> 01:04:38,222
- that I will faithfully execute
- 1244
- 01:04:38,265 --> 01:04:42,226
- the office of President
- of the United States.
- 1245
- 01:04:42,269 --> 01:04:46,273
- ♪♪
- 1246
- 01:04:48,275 --> 01:04:50,234
- Nolan: Elections are
- a marginal exercise.
- 1247
- 01:04:50,277 --> 01:04:53,237
- It doesn't take
- a very sophisticated AI
- 1248
- 01:04:53,280 --> 01:04:57,719
- in order to have
- a disproportionate impact.
- 1249
- 01:04:57,763 --> 01:05:02,550
- Before Trump, Brexit was
- another supposed client.
- 1250
- 01:05:02,594 --> 01:05:04,726
- Well, at 20 minutes to 5:00,
- 1251
- 01:05:04,770 --> 01:05:08,730
- we can now say
- the decision taken in 1975
- 1252
- 01:05:08,774 --> 01:05:10,950
- by this country to join
- the common market
- 1253
- 01:05:10,994 --> 01:05:15,999
- has been reversed by this
- referendum to leave the EU.
- 1254
- 01:05:16,042 --> 01:05:19,828
- Nolan: Cambridge Analytica
- allegedly used AI
- 1255
- 01:05:19,872 --> 01:05:23,267
- to push through two of
- the most ground-shaking pieces
- 1256
- 01:05:23,310 --> 01:05:27,967
- of political change
- in the last 50 years.
- 1257
- 01:05:28,011 --> 01:05:30,709
- These are epochal events,
- and if we believe the hype,
- 1258
- 01:05:30,752 --> 01:05:33,755
- they are connected directly
- to a piece of software,
- 1259
- 01:05:33,799 --> 01:05:37,194
- essentially, created
- by a professor at Stanford.
- 1260
- 01:05:37,237 --> 01:05:41,415
- ♪♪
- 1261
- 01:05:41,459 --> 01:05:43,635
- Kosinski:
- Back in 2013, I described
- 1262
- 01:05:43,678 --> 01:05:45,593
- that what they are doing
- is possible
- 1263
- 01:05:45,637 --> 01:05:49,293
- and warned against this
- happening in the future.
- 1264
- 01:05:49,336 --> 01:05:51,382
- Grassegger:
- At the time, Michal Kosinski
- 1265
- 01:05:51,425 --> 01:05:52,949
- was a young Polish researcher
- 1266
- 01:05:52,992 --> 01:05:54,994
- working at the
- Psychometrics Centre.
- 1267
- 01:05:55,038 --> 01:06:00,217
- So, what Michal had done was to
- gather the largest-ever data set
- 1268
- 01:06:00,260 --> 01:06:03,481
- of how people
- behave on Facebook.
- 1269
- 01:06:03,524 --> 01:06:07,789
- Kosinski:
- Psychometrics is trying
- to measure psychological traits,
- 1270
- 01:06:07,833 --> 01:06:09,922
- such as personality,
- intelligence,
- 1271
- 01:06:09,966 --> 01:06:11,880
- political views, and so on.
- 1272
- 01:06:11,924 --> 01:06:15,058
- Now, traditionally,
- those traits were measured
- 1273
- 01:06:15,101 --> 01:06:17,712
- using tests and questions.
- 1274
- 01:06:17,756 --> 01:06:19,410
- Nolan: Personality test --
- the most benign thing
- 1275
- 01:06:19,453 --> 01:06:20,715
- you could possibly think of.
- 1276
- 01:06:20,759 --> 01:06:22,065
- Something that doesn't
- necessarily have
- 1277
- 01:06:22,108 --> 01:06:24,197
- a lot of utility, right?
- 1278
- 01:06:24,241 --> 01:06:27,331
- Kosinski: Our idea was that
- instead of tests and questions,
- 1279
- 01:06:27,374 --> 01:06:30,029
- we could simply look at the
- digital footprints of behaviors
- 1280
- 01:06:30,073 --> 01:06:32,553
- that we are all leaving behind
- 1281
- 01:06:32,597 --> 01:06:34,903
- to understand openness,
- 1282
- 01:06:34,947 --> 01:06:37,732
- conscientiousness,
- neuroticism.
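- A minimal sketch of the footprints-instead-of-questionnaires idea,
- assuming scikit-learn is available: fit a model from page likes to a
- trait label. The likes matrix and labels are fabricated for illustration.

from sklearn.linear_model import LogisticRegression

# Rows are users; columns mark whether each of four pages was liked.
likes = [[1, 0, 1, 0],
         [1, 1, 0, 0],
         [0, 0, 1, 1],
         [0, 1, 0, 1]]
introvert = [1, 1, 0, 0]  # trait labels gathered once from a questionnaire

model = LogisticRegression().fit(likes, introvert)
print(model.predict([[1, 0, 0, 0]]))  # inferred trait for an unseen user

- Once trained, the questionnaire is no longer needed -- the digital
- footprint alone yields the prediction.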
- 1283
- 01:06:37,776 --> 01:06:39,560
- Grassegger: You can easily buy
- personal data,
- 1284
- 01:06:39,604 --> 01:06:43,129
- such as where you live, what
- club memberships you've tried,
- 1285
- 01:06:43,173 --> 01:06:45,044
- which gym you go to.
- 1286
- 01:06:45,088 --> 01:06:47,873
- There are actually marketplaces
- for personal data.
- 1287
- 01:06:47,916 --> 01:06:49,918
- Nolan: It turns out, we can
- discover an awful lot
- 1288
- 01:06:49,962 --> 01:06:51,442
- about what you're gonna do
- 1289
- 01:06:51,485 --> 01:06:55,750
- based on a very, very tiny
- set of information.
- 1290
- 01:06:55,794 --> 01:06:58,275
- Kosinski: We are training
- deep-learning networks
- 1291
- 01:06:58,318 --> 01:07:01,278
- to infer intimate traits,
- 1292
- 01:07:01,321 --> 01:07:04,759
- people's political views,
- personality,
- 1293
- 01:07:04,803 --> 01:07:07,806
- intelligence,
- sexual orientation
- 1294
- 01:07:07,849 --> 01:07:10,504
- just from an image
- from someone's face.
- 1295
- 01:07:10,548 --> 01:07:17,033
- ♪♪
- 1296
- 01:07:17,076 --> 01:07:20,645
- Now think about countries which
- are not so free and open-minded.
- 1297
- 01:07:20,688 --> 01:07:23,300
- If you can reveal people's
- religious views
- 1298
- 01:07:23,343 --> 01:07:25,954
- or political views
- or sexual orientation
- 1299
- 01:07:25,998 --> 01:07:28,740
- based on only profile pictures,
- 1300
- 01:07:28,783 --> 01:07:33,310
- this could be literally
- an issue of life and death.
- 1301
- 01:07:33,353 --> 01:07:36,965
- ♪♪
- 1302
- 01:07:37,009 --> 01:07:39,751
- I think there's no going back.
- 1303
- 01:07:42,145 --> 01:07:44,321
- Do you know what
- the Turing test is?
- 1304
- 01:07:44,364 --> 01:07:48,977
- It's when a human interacts
- with a computer,
- 1305
- 01:07:49,021 --> 01:07:50,805
- and if the human doesn't know
- they're interacting
- 1306
- 01:07:50,849 --> 01:07:52,546
- with a computer,
- 1307
- 01:07:52,590 --> 01:07:54,026
- the test is passed.
- 1308
- 01:07:54,070 --> 01:07:57,247
- And over the next few days,
- 1309
- 01:07:57,290 --> 01:07:59,684
- you're gonna be the human
- component in a Turing test.
- 1310
- 01:07:59,727 --> 01:08:02,295
- Holy shit.
- Yeah, that's right, Caleb.
- 1311
- 01:08:02,339 --> 01:08:04,080
- You got it.
- 1312
- 01:08:04,123 --> 01:08:06,865
- 'Cause if that test
- is passed,
- 1313
- 01:08:06,908 --> 01:08:10,825
- you are dead center of
- the greatest scientific event
- 1314
- 01:08:10,869 --> 01:08:12,958
- in the history of man.
- 1315
- 01:08:13,001 --> 01:08:14,612
- If you've created
- a conscious machine,
- 1316
- 01:08:14,655 --> 01:08:17,615
- it's not the history
- of man--
- 1317
- 01:08:17,658 --> 01:08:19,356
- That's the history
- of gods.
- 1318
- 01:08:19,399 --> 01:08:26,798
- ♪♪
- 1319
- 01:08:26,841 --> 01:08:28,452
- Nolan: It's almost like
- technology is a god
- 1320
- 01:08:28,495 --> 01:08:29,975
- in and of itself.
- 1321
- 01:08:30,018 --> 01:08:33,152
- ♪♪
- 1322
- 01:08:33,196 --> 01:08:35,241
- Like the weather.
- We can't impact it.
- 1323
- 01:08:35,285 --> 01:08:39,593
- We can't slow it down.
- We can't stop it.
- 1324
- 01:08:39,637 --> 01:08:43,249
- We feel powerless.
- 1325
- 01:08:43,293 --> 01:08:44,685
- Kurzweil:
- If we think of God
- 1326
- 01:08:44,729 --> 01:08:46,687
- as an unlimited amount
- of intelligence,
- 1327
- 01:08:46,731 --> 01:08:48,167
- the closest we can get to that
- 1328
- 01:08:48,211 --> 01:08:50,474
- is by evolving
- our own intelligence
- 1329
- 01:08:50,517 --> 01:08:55,566
- by merging with the artificial
- intelligence we're creating.
- 1330
- 01:08:55,609 --> 01:08:58,003
- Musk:
- Today, our computers, phones,
- 1331
- 01:08:58,046 --> 01:09:01,615
- applications give us
- superhuman capability.
- 1332
- 01:09:01,659 --> 01:09:04,662
- So, as the old maxim says,
- if you can't beat 'em, join 'em.
- 1333
- 01:09:06,968 --> 01:09:09,971
- el Kaliouby: It's about
- a human-machine partnership.
- 1334
- 01:09:10,015 --> 01:09:11,669
- I mean, we already see
- how, you know,
- 1335
- 01:09:11,712 --> 01:09:14,933
- our phones, for example, act
- as memory prosthesis, right?
- 1336
- 01:09:14,976 --> 01:09:17,196
- I don't have to remember
- your phone number anymore
- 1337
- 01:09:17,240 --> 01:09:19,198
- 'cause it's on my phone.
- 1338
- 01:09:19,242 --> 01:09:22,070
- It's about machines
- augmenting our human abilities,
- 1339
- 01:09:22,114 --> 01:09:25,248
- as opposed to, like,
- completely displacing them.
- 1340
- 01:09:25,291 --> 01:09:27,380
- Nolan: If you look at all the
- objects that have made the leap
- 1341
- 01:09:27,424 --> 01:09:30,122
- from analog to digital
- over the last 20 years...
- 1342
- 01:09:30,166 --> 01:09:32,080
- it's a lot.
- 1343
- 01:09:32,124 --> 01:09:35,388
- We're the last analog object
- in a digital universe.
- 1344
- 01:09:35,432 --> 01:09:36,911
- And the problem with that,
- of course,
- 1345
- 01:09:36,955 --> 01:09:40,567
- is that the data input/output
- is very limited.
- 1346
- 01:09:40,611 --> 01:09:42,613
- It's this.
- It's these.
- 1347
- 01:09:42,656 --> 01:09:45,355
- Zilis:
- Our eyes are pretty good.
- 1348
- 01:09:45,398 --> 01:09:48,445
- We're able to take in a lot
- of visual information.
- 1349
- 01:09:48,488 --> 01:09:52,536
- But our information output
- is very, very, very low.
- 1350
- 01:09:52,579 --> 01:09:55,669
- The reason this is important --
- If we envision a scenario
- 1351
- 01:09:55,713 --> 01:09:59,543
- where AI's playing a more
- prominent role in societies,
- 1352
- 01:09:59,586 --> 01:10:02,023
- we want good ways to interact
- with this technology
- 1353
- 01:10:02,067 --> 01:10:04,983
- so that it ends up
- augmenting us.
- 1354
- 01:10:05,026 --> 01:10:07,812
- ♪♪
- 1355
- 01:10:07,855 --> 01:10:09,553
- Musk: I think
- it's incredibly important
- 1356
- 01:10:09,596 --> 01:10:12,295
- that AI not be "other."
- 1357
- 01:10:12,338 --> 01:10:14,862
- It must be us.
- 1358
- 01:10:14,906 --> 01:10:18,605
- And I could be wrong
- about what I'm saying.
- 1359
- 01:10:18,649 --> 01:10:20,216
- I'm certainly open to ideas
- 1360
- 01:10:20,259 --> 01:10:23,915
- if anybody can suggest
- a path that's better.
- 1361
- 01:10:23,958 --> 01:10:27,266
- But I think we're gonna really
- have to either merge with AI
- 1362
- 01:10:27,310 --> 01:10:28,963
- or be left behind.
- 1363
- 01:10:29,007 --> 01:10:36,362
- ♪♪
- 1364
- 01:10:36,406 --> 01:10:38,756
- Gourley: It's hard to kind of
- think of unplugging a system
- 1365
- 01:10:38,799 --> 01:10:41,802
- that's distributed
- everywhere on the planet,
- 1366
- 01:10:41,846 --> 01:10:45,806
- that's distributed now
- across the solar system.
- 1367
- 01:10:45,850 --> 01:10:49,375
- You can't just, you know,
- shut that off.
- 1368
- 01:10:49,419 --> 01:10:51,290
- Nolan:
- We've opened Pandora's box.
- 1369
- 01:10:51,334 --> 01:10:55,642
- We've unleashed forces that
- we can't control, we can't stop.
- 1370
- 01:10:55,686 --> 01:10:57,296
- We're in the midst
- of essentially creating
- 1371
- 01:10:57,340 --> 01:10:59,516
- a new life-form on Earth.
- 1372
- 01:10:59,559 --> 01:11:05,826
- ♪♪
- 1373
- 01:11:05,870 --> 01:11:07,611
- Russell:
- We don't know what happens next.
- 1374
- 01:11:07,654 --> 01:11:10,353
- We don't know what shape
- the intellect of a machine
- 1375
- 01:11:10,396 --> 01:11:14,531
- will be when that intellect is
- far beyond human capabilities.
- 1376
- 01:11:14,574 --> 01:11:17,360
- It's just not something
- that's possible.
- 1377
- 01:11:17,403 --> 01:11:24,715
- ♪♪
- 1378
- 01:11:24,758 --> 01:11:26,934
- The least scary future
- I can think of is one
- 1379
- 01:11:26,978 --> 01:11:29,633
- where we have at least
- democratized AI.
- 1380
- 01:11:31,548 --> 01:11:34,159
- Because if one company
- or small group of people
- 1381
- 01:11:34,202 --> 01:11:37,031
- manages to develop godlike
- digital superintelligence,
- 1382
- 01:11:37,075 --> 01:11:40,339
- they can take over the world.
- 1383
- 01:11:40,383 --> 01:11:42,210
- At least when there's
- an evil dictator,
- 1384
- 01:11:42,254 --> 01:11:44,343
- that human is going to die,
- 1385
- 01:11:44,387 --> 01:11:46,998
- but, for an AI,
- there would be no death.
- 1386
- 01:11:47,041 --> 01:11:49,392
- It would live forever.
- 1387
- 01:11:49,435 --> 01:11:51,916
- And then you have
- an immortal dictator
- 1388
- 01:11:51,959 --> 01:11:53,570
- from which we can never escape.
- 1389
- 01:11:53,613 --> 01:12:02,100
- ♪♪
- 1390
- 01:12:02,143 --> 01:12:10,587
- ♪♪
- 1391
- 01:12:10,630 --> 01:12:19,160
- ♪♪
- 1392
- 01:12:19,204 --> 01:12:27,647
- ♪♪
- 1393
- 01:12:27,691 --> 01:12:36,221
- ♪♪
- 1394
- 01:12:36,264 --> 01:12:44,838
- ♪♪
- 1395
- 01:12:51,845 --> 01:12:53,717
- Woman on P.A.:
- Alan. Macchiato.
- 1396
- 01:13:10,951 --> 01:13:17,610
- ♪♪
- 1397
- 01:13:17,654 --> 01:13:24,269
- ♪♪
- 1398
- 01:13:24,312 --> 01:13:30,884
- ♪♪
- 1399
- 01:13:30,928 --> 01:13:37,543
- ♪♪
- 1400
- 01:13:37,587 --> 01:13:44,245
- ♪♪
- 1401
- 01:13:44,289 --> 01:13:50,817
- ♪♪
- 1402
- 01:13:50,861 --> 01:13:57,520
- ♪♪
- 1403
- 01:13:57,563 --> 01:14:04,178
- ♪♪
- 1404
- 01:14:04,222 --> 01:14:10,794
- ♪♪
- 1405
- 01:14:10,837 --> 01:14:17,496
- ♪♪
- 1406
- 01:14:17,540 --> 01:14:24,111
- ♪♪
- 1407
- 01:14:24,155 --> 01:14:30,727
- ♪♪
- 1408
- 01:14:30,770 --> 01:14:37,473
- ♪♪
- 1409
- 01:14:37,516 --> 01:14:44,044
- ♪♪
- 1410
- 01:14:44,088 --> 01:14:47,570
- Woman:
- Hello?
- 1411
- 01:14:47,613 --> 01:14:54,838
- ♪♪
- 1412
- 01:14:54,881 --> 01:15:02,236
- ♪♪
- 1413
- 01:15:02,280 --> 01:15:09,548
- ♪♪
- 1414
- 01:15:09,592 --> 01:15:16,773
- ♪♪
- 1415
- 01:15:16,816 --> 01:15:18,688
- ♪ Yeah, yeah
- 1416
- 01:15:18,731 --> 01:15:20,080
- ♪ Yeah, yeah
- 1417
- 01:15:20,124 --> 01:15:27,261
- ♪♪
- 1418
- 01:15:27,305 --> 01:15:34,442
- ♪♪
- 1419
- 01:15:34,486 --> 01:15:41,580
- ♪♪
- 1420
- 01:15:41,624 --> 01:15:43,234
- ♪ Yeah, yeah
- 1421
- 01:15:43,277 --> 01:15:45,541
- ♪ Yeah, yeah
- 1422
- 01:15:45,584 --> 01:15:51,764
- ♪♪
- 1423
- 01:15:51,808 --> 01:15:57,988
- ♪♪
- 1424
- 01:15:58,031 --> 01:16:04,342
- ♪♪
- 1425
- 01:16:04,385 --> 01:16:10,609
- ♪♪
- 1426
- 01:16:10,653 --> 01:16:13,046
- Hello?
- 1427
- 01:16:13,090 --> 01:16:22,012
- ♪♪
- 1428
- 01:16:22,055 --> 01:16:31,021
- ♪♪
- 1429
- 01:16:31,064 --> 01:16:39,986
- ♪♪
- 1430
- 01:16:40,030 --> 01:16:48,952
- ♪♪
- 1431
- 01:16:48,995 --> 01:16:57,961
- ♪♪
- 1432
- 01:16:58,004 --> 01:17:06,926
- ♪♪
- 1433
- 01:17:06,970 --> 01:17:15,892
- ♪♪
- 1434
- 01:17:15,935 --> 01:17:24,901
- ♪♪
- 1435
- 01:17:24,944 --> 01:17:33,910
- ♪♪
- 1436
- 01:17:33,953 --> 01:17:40,960
- ♪♪