The implications of the new technologies, especially artificial intelligence and bioengineering, undermine the most basic assumptions of the liberal order: about human free will, about individualism, about slogans like "the customer is always right" and "the voter knows best." The new technologies really undermine these assumptions. The crucial point is what happens when an outside system, an outside algorithm, knows you better than you know yourself: knows how you feel, can predict your emotions, can manipulate your emotions, can predict your decisions and your choices, can make choices on your behalf.

And this is true of the marketplace, where a corporation knows your choices better than you do and can predict and also manipulate those choices. For more and more crucial decisions in people's lives, what to study, where to work, whom to marry, whom to vote for, there is an algorithm out there that can tell you better than you can tell yourself.

People think it can never happen. Humans are too complicated; we have souls, we have spirits; no algorithm can ever figure out these mysterious things like the human soul or free will. But I think this is eighteenth-century mythology, which held on for two hundred years because there was no technology to do it.
But now, or very soon, we will have the technology to do it. And it will force us to rethink the fundamentals of things like the free market or democratic politics.

How do you convince people of your vision, and enough people, enough countries, to actually make it work?

So in terms of the first question, of formulating a new vision, I don't think that's impossible. The first step is to acknowledge the realities, the biological realities of human beings: how humans make decisions, and where human desires and choices really come from. And the enormous potential, for both good and bad, of the new technologies to really hack human beings. Very soon we will have the technology for a really total surveillance regime, in which you can survey the entire population down to the level of what is happening to your blood pressure and to your brain activity every minute of the day.

We might soon reach a point when all the positions of power are still occupied by human beings, not by computers. You still have a prime minister, you still have a CEO. But the prime minister chooses from a menu written by AI. And envision a situation in 20 or 30 years when the system is so complicated and so fast-moving that no human being is really able to understand it.
Technology will, on the one hand, make it possible to start enhancing and upgrading humans, and on the other hand, especially with the rise of AI, it will make more and more humans economically useless and therefore also politically powerless. And different parts of humanity might have different futures; we might really see a process of some kind of speciation.