1
00:00:00,000 --> 00:00:10,160
When Yuval Noah Harari published his first book, Sapiens, in 2014 about the history of

2
00:00:10,160 --> 00:00:15,439
the human species, it became a global bestseller, turning the little-known Israeli

3
00:00:15,439 --> 00:00:20,120
professor into one of the most popular writers and thinkers on the planet.

4
00:00:20,120 --> 00:00:23,920
But when we met with Harari in Tel Aviv this summer, it wasn't our species' past that

5
00:00:23,920 --> 00:00:24,920
concerned him.

6
00:00:24,920 --> 00:00:26,560
It was our future.

7
00:00:26,559 --> 00:00:31,019
Harari believes we may be on the brink of creating not just a new, enhanced species

8
00:00:31,019 --> 00:00:37,439
of human, but an entirely new kind of being, one that's far more intelligent than we are.

9
00:00:37,439 --> 00:00:41,920
It sounds like science fiction, but Yuval Noah Harari says it's actually much more

10
00:00:41,920 --> 00:00:43,519
dangerous than that.

11
00:00:50,519 --> 00:00:53,280
You said we are one of the last generations of Homo sapiens.

12
00:00:53,280 --> 00:00:57,399
In a century or two, Earth will be dominated by entities that are more different from

13
00:00:57,399 --> 00:01:01,160
us than we are different from chimpanzees.

14
00:01:01,160 --> 00:01:02,840
What the hell does that mean?

15
00:01:02,840 --> 00:01:03,840
That freaked me out.

16
00:01:03,840 --> 00:01:10,480
You know, we'll soon have the power to re-engineer our bodies and brains, whether by

17
00:01:10,480 --> 00:01:18,579
genetic engineering or by directly connecting brains to computers or by creating completely

18
00:01:18,579 --> 00:01:25,019
non-organic entities, artificial intelligence, which is not based at all on the organic

19
00:01:25,019 --> 00:01:27,539
body and the organic brain.

20
00:01:27,539 --> 00:01:31,039
And these technologies are developing at breakneck speed.

21
00:01:31,039 --> 00:01:35,959
If that is true, then it creates a whole other species.

22
00:01:35,959 --> 00:01:41,259
This is something which is way beyond just another species.

23
00:01:41,259 --> 00:01:46,140
Yuval Noah Harari is talking about the race to develop artificial intelligence, as well

24
00:01:46,140 --> 00:01:51,219
as other technologies like gene editing that could one day enable parents to create smarter

25
00:01:51,219 --> 00:01:56,700
or more attractive children and brain-computer interfaces that could result in human-computer

26
00:01:56,700 --> 00:01:58,420
hybrids.

27
00:01:58,420 --> 00:02:01,420
What does that do to a society?

28
00:02:01,420 --> 00:02:05,960
It seems like the rich will have access, whereas others wouldn't.

29
00:02:05,960 --> 00:02:13,500
One of the dangers is that we will see in the coming decades a process of greater inequality

30
00:02:13,860 --> 00:02:19,219
than in any previous time in history, because for the first time it will be real biological

31
00:02:19,219 --> 00:02:20,219
inequality.

32
00:02:20,219 --> 00:02:26,419
If the new technologies are available only to the rich or only to people from a certain

33
00:02:26,419 --> 00:02:34,379
country, then Homo sapiens will split into different biological castes because they will

34
00:02:34,379 --> 00:02:38,060
have different bodies and different abilities.

35
00:02:38,620 --> 00:02:43,740
Harari has spent the last few years lecturing and writing about what may lie ahead for

36
00:02:43,740 --> 00:02:44,740
humankind.

37
00:02:44,740 --> 00:02:53,379
In the coming generations, we will learn how to engineer bodies and brains and minds.

38
00:02:53,379 --> 00:02:58,740
He's written two books about the challenges we face in the future, Homo Deus and 21 Lessons

39
00:02:58,740 --> 00:03:04,280
for the 21st Century, which along with Sapiens have sold more than 35 million copies and

40
00:03:04,280 --> 00:03:07,420
been translated into 65 languages.

41
00:03:07,780 --> 00:03:11,539
His writings have been recommended by President Barack Obama, as well as tech leaders Bill

42
00:03:11,539 --> 00:03:13,939
Gates and Mark Zuckerberg.

43
00:03:13,939 --> 00:03:16,539
You raise warnings about technology.

44
00:03:16,539 --> 00:03:20,179
You're also embraced by a lot of folks in Silicon Valley.

45
00:03:20,179 --> 00:03:22,299
Isn't that sort of a contradiction?

46
00:03:22,299 --> 00:03:30,199
They are a bit afraid of their own power, that they have realized the immense influence

47
00:03:30,199 --> 00:03:34,060
they have over the world, over the course of evolution, really.

48
00:03:34,060 --> 00:03:39,460
And I think that spooks at least some of them, and that's a good thing.

49
00:03:39,460 --> 00:03:45,759
And this is why they are kind of, to some extent, open to listening.

50
00:03:45,759 --> 00:03:49,340
You started as a history professor.

51
00:03:49,340 --> 00:03:50,340
What do you call yourself now?

52
00:03:50,340 --> 00:03:55,980
I'm still a historian, but I think history is a study of change, not just the study

53
00:03:55,980 --> 00:04:00,319
of the past, but it covers the future as well.

54
00:04:00,319 --> 00:04:05,400
Harari got his Ph.D. in history at Oxford and lives in Israel, where the past is still

55
00:04:05,400 --> 00:04:06,840
very present.

56
00:04:06,840 --> 00:04:09,840
He took us to this archaeological site called Tel Gezer.

57
00:04:09,840 --> 00:04:16,159
Four, five thousand years ago, this was one of the biggest cities in the area.

58
00:04:16,159 --> 00:04:21,839
Harari says cities like this were only possible because about 70,000 years ago, our species,

59
00:04:21,839 --> 00:04:27,279
Homo sapiens, experienced a cognitive change that helped us create language, which then

60
00:04:27,279 --> 00:04:32,459
made it possible for us to cooperate in large groups and drive Neanderthals and all other

61
00:04:32,459 --> 00:04:36,479
less cooperative human species into extinction.

62
00:04:36,479 --> 00:04:42,519
Harari fears we are now the ones at risk of being dominated by artificial intelligence.

63
00:04:42,519 --> 00:04:48,719
Maybe the biggest thing that we are facing is really a kind of evolutionary divergence.

64
00:04:48,719 --> 00:04:54,199
For millions of years, intelligence and consciousness went together.

65
00:04:54,199 --> 00:04:59,599
Consciousness is the ability to feel things like pain and pleasure and love and hate.

66
00:04:59,599 --> 00:05:02,920
Intelligence is the ability to solve problems.

67
00:05:02,920 --> 00:05:07,060
But computers or artificial intelligence, they don't have consciousness.

68
00:05:07,060 --> 00:05:08,980
They just have intelligence.

69
00:05:08,980 --> 00:05:12,839
They solve problems in a completely different way than us.

70
00:05:12,839 --> 00:05:17,560
Now in science fiction, it's often assumed that as computers become more and more

71
00:05:17,560 --> 00:05:22,079
intelligent, they will inevitably also gain consciousness.

72
00:05:22,079 --> 00:05:25,479
But actually it's much more frightening than that in a way.

73
00:05:25,479 --> 00:05:30,919
They will be able to solve more and more problems better than us without having any consciousness,

74
00:05:30,919 --> 00:05:32,120
any feelings.

75
00:05:32,120 --> 00:05:34,259
And they will have power over us?

76
00:05:34,259 --> 00:05:36,839
They are already gaining power over us.

77
00:05:36,839 --> 00:05:42,039
Some lenders routinely use complex artificial intelligence algorithms to determine who qualifies

78
00:05:42,039 --> 00:05:43,719
for loans.

79
00:05:43,719 --> 00:05:48,759
Global financial markets are moved by decisions made by machines analyzing huge amounts of

80
00:05:48,759 --> 00:05:53,159
data in ways even their programmers don't always understand.

81
00:05:53,159 --> 00:05:58,159
Harari says the countries and companies that control the most data will in the future

82
00:05:58,159 --> 00:06:00,199
be the ones that control the world.

83
00:06:00,199 --> 00:06:03,680
Today in the world, data is worth much more than money.

84
00:06:03,680 --> 00:06:09,279
Ten years ago, you had these big corporations paying billions and billions for WhatsApp,

85
00:06:09,279 --> 00:06:12,420
for Instagram, and people wondered, are they crazy?

86
00:06:12,420 --> 00:06:18,519
Why do they pay billions to get this application that doesn't produce any money?

87
00:06:18,519 --> 00:06:20,060
And the reason why?

88
00:06:20,060 --> 00:06:21,639
Because it produced data.

89
00:06:21,639 --> 00:06:22,879
And data is the key.

90
00:06:22,879 --> 00:06:31,159
The world is increasingly kind of cut up into spheres of data collection, of data harvesting.

91
00:06:31,159 --> 00:06:33,519
In the Cold War, you had the Iron Curtain.

92
00:06:33,519 --> 00:06:37,359
Now you have the Silicon Curtain between the USA and China.

93
00:06:37,359 --> 00:06:39,359
And where does the data go?

94
00:06:39,359 --> 00:06:40,359
California?

95
00:06:40,359 --> 00:06:44,799
Or does it go to Shenzhen and to Shanghai and to Beijing?

96
00:06:44,800 --> 00:06:48,720
Harari is concerned the pandemic has opened the door for more intrusive kinds of data

97
00:06:48,720 --> 00:06:52,000
collection, including biometric data.

98
00:06:52,000 --> 00:06:53,600
What is biometric data?

99
00:06:53,600 --> 00:06:56,520
It's data about what's happening inside my body.

100
00:06:56,520 --> 00:07:01,560
What we have seen so far, it's corporations and governments collecting data about where

101
00:07:01,560 --> 00:07:05,360
we go, who we meet, what movies we watch.

102
00:07:05,360 --> 00:07:10,360
The next phase is the surveillance going under our skin.

103
00:07:10,480 --> 00:07:16,280
I'm wearing a tracker that tracks my heart rate, my sleep.

104
00:07:16,280 --> 00:07:17,759
I don't know where that information is going.

105
00:07:17,759 --> 00:07:20,840
You wear the KGB agent on your wrist willingly.

106
00:07:20,840 --> 00:07:22,840
And I think it's benefiting me.

107
00:07:22,840 --> 00:07:23,840
And it is benefiting.

108
00:07:23,840 --> 00:07:27,100
I mean, the whole thing is that it's not just dystopian.

109
00:07:27,100 --> 00:07:28,360
It's also utopian.

110
00:07:28,360 --> 00:07:36,080
I mean, this kind of data can also enable us to create the best healthcare system in history.

111
00:07:36,120 --> 00:07:42,199
The question is, what else is being done with that data and who supervises it?

112
00:07:42,199 --> 00:07:44,199
Who regulates it?

113
00:07:44,199 --> 00:07:48,319
Earlier this year, the Israeli government gave its citizens' health data to Pfizer to

114
00:07:48,319 --> 00:07:51,279
get priority access to their vaccine.

115
00:07:51,279 --> 00:07:55,199
The data did not include individual citizens' identities.

116
00:07:55,199 --> 00:07:59,719
So what does Pfizer want the data of all Israelis for?

117
00:07:59,719 --> 00:08:04,879
Because to develop new medicines, new treatments, you need the medical data.

118
00:08:04,879 --> 00:08:09,399
Increasingly, that's the basis for medical research.

119
00:08:09,399 --> 00:08:11,159
And of course, it's not all bad.

120
00:08:11,159 --> 00:08:15,680
Harari's been criticized for pointing out problems without offering solutions.

121
00:08:15,680 --> 00:08:19,959
But he does have some ideas about how to limit the misuse of data.

122
00:08:19,959 --> 00:08:27,879
One key rule is that if you get my data, the data should be used to help me and not to

123
00:08:27,879 --> 00:08:29,399
manipulate me.

124
00:08:29,399 --> 00:08:37,559
Another key rule is that whenever you increase surveillance of individuals, you should also

125
00:08:37,559 --> 00:08:43,539
increase surveillance of the corporations and governments and the people at the top.

126
00:08:43,539 --> 00:08:49,299
And the third principle is that never allow all the data to be concentrated in one place.

127
00:08:49,299 --> 00:08:52,079
That's the recipe for a dictatorship.

128
00:08:52,079 --> 00:08:57,480
Netflix tells us what to watch and Amazon tells us what to buy.

129
00:08:57,480 --> 00:09:03,759
Eventually, within 10 or 20 or 30 years, such algorithms could also tell you what

130
00:09:03,759 --> 00:09:11,159
to study at college and where to work and whom to marry and even whom to vote for.

131
00:09:11,159 --> 00:09:16,200
Without greater regulation, Harari believes we're at risk of becoming what he calls hacked

132
00:09:16,200 --> 00:09:17,200
humans.

133
00:09:17,200 --> 00:09:18,200
What does that mean?

134
00:09:18,200 --> 00:09:23,920
To hack a human being is to get to know that person better than they know themselves.

135
00:09:24,120 --> 00:09:27,639
And based on that, to increasingly manipulate you.

136
00:09:27,639 --> 00:09:32,360
This outside system, it has the potential to remember everything, everything you ever

137
00:09:32,360 --> 00:09:41,120
did and to analyze and find patterns in this data and to get a much better idea of who

138
00:09:41,120 --> 00:09:42,559
you really are.

139
00:09:42,559 --> 00:09:45,199
I came out as gay when I was 21.

140
00:09:45,199 --> 00:09:49,139
It should have been obvious to me when I was 15 that I'm gay.

141
00:09:49,139 --> 00:09:51,719
But something in the mind blocked it.

142
00:09:51,720 --> 00:09:57,160
Now if you think about a teenager today, Facebook can know that they are gay or Amazon

143
00:09:57,160 --> 00:10:04,019
can know that they are gay long before they do, just based on analyzing patterns.

144
00:10:04,019 --> 00:10:08,320
And based on that, you can tell somebody's sexual orientation.

145
00:10:08,320 --> 00:10:09,320
Completely.

146
00:10:09,320 --> 00:10:14,200
And what does it mean if you live in Iran or if you live in Russia or in some other

147
00:10:14,200 --> 00:10:18,899
homophobic country and the police knows that you're gay even before you know it?

148
00:10:18,899 --> 00:10:25,779
When people think about data, they think about companies finding out what their likes

149
00:10:25,779 --> 00:10:27,139
and dislikes are.

150
00:10:27,139 --> 00:10:31,220
But the data that you're talking about, it goes much deeper than that.

151
00:10:31,220 --> 00:10:38,459
Like think in 20 years, when the entire personal history of every journalist, every judge,

152
00:10:38,459 --> 00:10:45,139
every politician, every military officer is held by somebody in Beijing or in Washington,

153
00:10:45,139 --> 00:10:51,439
their ability to manipulate them is like nothing before in history.

154
00:10:51,439 --> 00:10:55,059
Harari lives outside Tel Aviv with his husband, Itzik Yahav.

155
00:10:55,059 --> 00:10:57,939
They've been together for nearly 20 years.

156
00:10:57,939 --> 00:11:02,340
It was Yahav who read Harari's lecture notes for a history course and convinced him to

157
00:11:02,340 --> 00:11:05,299
turn them into his first book, Sapiens.

158
00:11:05,299 --> 00:11:07,899
I read the lessons.

159
00:11:07,899 --> 00:11:09,639
I couldn't stop talking about it.

160
00:11:09,639 --> 00:11:13,919
For me, it was clear that it could be a huge bestseller.

161
00:11:14,000 --> 00:11:15,759
Yahav is now Harari's agent.

162
00:11:15,759 --> 00:11:18,959
Together they started a company called Sapienship.

163
00:11:18,959 --> 00:11:22,779
They're creating an interactive exhibit that'll take visitors through the history of human

164
00:11:22,779 --> 00:11:28,439
evolution and challenge them to think about the future of mankind.

165
00:11:28,439 --> 00:11:33,519
Harari also just published the second installment of a graphic novel based on Sapiens and is

166
00:11:33,519 --> 00:11:39,519
teaching courses at Israel's Hebrew University in ethics and philosophy for coders

167
00:11:39,519 --> 00:11:41,360
and bioengineers.

168
00:11:41,360 --> 00:11:48,279
When people write code, they are reshaping politics and economics and ethics and the

169
00:11:48,279 --> 00:11:50,840
structure of human society.

170
00:11:50,840 --> 00:11:55,840
When I think of coders and engineers, I don't think of philosophers and poets.

171
00:11:55,840 --> 00:12:01,120
It's not the case now, but it should be the case because they are increasingly solving

172
00:12:01,120 --> 00:12:03,960
philosophical and poetical riddles.

173
00:12:03,960 --> 00:12:09,440
If you're designing, you know, a self-driving car, so the self-driving car will need to

174
00:12:09,440 --> 00:12:11,880
make ethical decisions.

175
00:12:11,880 --> 00:12:16,720
Like suddenly a kid jumps in front of the car and the only way to prevent running over

176
00:12:16,720 --> 00:12:22,560
the kid is to swerve to the side and be hit by a truck and your owner, who is asleep

177
00:12:22,560 --> 00:12:25,380
in the back seat, might be killed.

178
00:12:25,380 --> 00:12:29,800
You need to tell the algorithm what to do in this situation.

179
00:12:29,800 --> 00:12:34,680
So you need to actually solve the philosophical question, who to kill.

180
00:12:34,680 --> 00:12:39,280
Last month, the United Nations suggested a moratorium on artificial intelligence systems

181
00:12:39,319 --> 00:12:43,679
that seriously threaten human rights until safeguards are agreed upon.

182
00:12:43,679 --> 00:12:48,039
And advisors to President Biden are proposing what they call a bill of rights to guard

183
00:12:48,039 --> 00:12:50,679
against some of the new technologies.

184
00:12:50,679 --> 00:12:54,959
Harari says just as Homo sapiens learned to cooperate with each other many thousands

185
00:12:54,959 --> 00:12:57,959
of years ago, we need to cooperate now.

186
00:12:57,959 --> 00:13:02,519
Certainly, now we are at a point when we need global cooperation.

187
00:13:02,519 --> 00:13:08,879
You cannot regulate the explosive power of artificial intelligence on a national level.

188
00:13:08,879 --> 00:13:11,879
I'm not trying to kind of prophesy what will happen.

189
00:13:11,879 --> 00:13:18,480
I'm trying to warn people about the most dangerous possibilities in the hope that

190
00:13:18,480 --> 00:13:21,480
we will do something in the present to prevent them.