Interviewer: Let's talk about US-China relations. Obviously, when you talk about the US and China, for businesses it's going to be a new paradigm in which everyone operates. There is going to be a certain degree of instability, sometimes a lack of clarity. But if we set aside the vagaries of US domestic politics, and set aside the challenges that China may be going through (you've seen this for many years already), how do you foresee a long-term, stable situation between the US and China that at least the business sector, governments, even the people can start preparing for? What's the likelihood?

Schwab: The two economies are so intertwined, and I think we all have to work to create situations where the tensions do not spin out of control. But if I come to the fundamental issue, it's not so much a question of which country has the larger economy. I think the question is: who masters the fourth industrial revolution? Because it is industrial superiority, innovative superiority (just think of artificial intelligence, quantum computing, and so on) which actually gives power to a country. So what we will see is a race for leadership in the fourth industrial revolution.

Interviewer: You also see other countries forming their own separate alliances, allegiances, clubs amongst themselves. Even within Geneva, you have plurilateral groupings, some forming their own little circles of influence. Do you see this as a positive development in global geopolitics?

Schwab: It's really multipolar, because you have middle states like Saudi Arabia, Indonesia, and of course India exercising great influence on the global situation: Saudi Arabia thanks to its energy reserves, India thanks to the fact that it is the fastest-growing economy at the moment. You also have, I would say, the industrial powers. You could argue today that some companies, like Google, Microsoft, and so on, are truly multinational power factors. So it's a very complex world. And it's not a very stable world, because we are now witnessing a kind of dynamic system which is constantly changing. And I should also add small countries, by the way, to the power factors. I would argue that in the world of tomorrow it's not the big fish eating the small fish, but the fast fish eating the slow fish.
So you have countries like Singapore, and I would add Switzerland, or Israel, and so on, also playing an essential role in the global system. We have to confront a very fragmented and, I would say, to a certain extent turbulent system, because those countries compete not only in economic terms. They compete with different values, with different systems. So the world is full of complexities and certainly also uncertainties. Now, what we have to do in order to stick together is to look at those issues where we have real global interdependence. If we do not cooperate, we will have a lose-lose situation. I'm thinking, for example, of the environmental challenge which we have, of migration challenges, and so on. And we should focus our collaboration efforts on those touch points.

Interviewer: What's your advice to the young Singaporeans, the young Qataris, the young Swiss who see this fast-fish phenomenon? Is there a mindset shift they need in order to compete, to be able to eat the slow fish?

Schwab: Yes, I think to embrace change. What we see now, and we will come back to it, is this complex world, and also now the new factor of the technological revolution, particularly artificial intelligence. People have difficulty understanding it, and what we see is a certain fear of the future. I think it's the first time in global history that people are so pessimistic about the future. Now, my advice would be: embrace change. Change will be a constant factor in our lives, and those people who see change as an opportunity and not as a threat will succeed.

Interviewer: Some people also fear, or think of this move into AI as a fundamental transition point in human history. Do you see any prospect of larger global cooperation, setting guidelines, guardrails?

Schwab: I feel that artificial intelligence is a game changer not only for the future but also for business. It could be a game changer for societies. We have to make every effort to use the tremendous opportunities offered by artificial intelligence, but there are social and, as I mentioned, existential challenges. The social challenges include, for example, the capability to influence elections, and public opinion in general, in a much stronger way than we have seen in the past, which is a danger for democracy. We have the impact on the workforce.
A study which the World Economic Forum did shows that about one quarter of jobs will disappear or be replaced by artificial intelligence, and another quarter will require reskilling and upskilling. And then you have the existential question, because first you have the shift of power from governments to business, and you have the question of whether the whole system may go out of control. So here is what the World Economic Forum has done: we created an Artificial Intelligence Governance Alliance. In the alliance we have all the big companies cooperating: Google, Microsoft, Meta, IBM, and so on. And on the other hand, we have integrated into those efforts the G7 effort under the leadership of Japan. We work together with the US, we work together with the European Union, and in November we will hold our second meeting on artificial intelligence, with all the experts, on our campus in San Francisco, as preparation for the next annual meeting in Davos, which we will make a true global summit to look at all the aspects of artificial intelligence and also to develop, based on proposals, the necessary safeguards which we need and which require global collaboration. We cannot have different safeguards in different global regions.

Interviewer: Are you confident that the collaboration between businesses, and between business and government, can progress as rapidly as needed, given that the technology is developing extremely rapidly? Just your initial sense: do you think there's room for optimism?

Schwab: You need global collaboration. You need a kind of resetting of government-business relations, because in the traditional way, governments looked at a new technology and set the necessary rules for its further development. But here, the technology is moving so fast, and it is also very difficult to really understand the technology. So the danger is that governments will be too late in creating the necessary borders and borderlines around the technology, and business, which is in a competitive battle, will just move ahead; that means the ghost will leave the bottle.
So we have to put much more emphasis here on self-regulation, and that's what the World Economic Forum is doing: making sure that the self-regulation is also, maybe sometimes afterwards, approved and endorsed by governments and civil society.

Interviewer: As everyone knows, we are soon to be a super-aged society. Many societies in Asia as well, Japan and Korea, are moving in that direction. What's your sense of how we change the narrative around longevity and an aged society, from a slightly negative, challenge-focused narrative to a more positive one?

Schwab: We should not just speak about longevity; we should speak about healthy old age. And this starts, in my opinion, at an early age. We should even introduce the teaching of health literacy in schools, because today, in many parts of the world and in many sectors of the population, we have unhealthy behavior which afterwards has a big impact, not only on health insurance but also on quality of life. Our effort should be not only to prolong life, but to make sure that we have a healthy life for as long as possible. And this starts with health literacy: knowing which factors actually make your life healthier.

A second element is the question of retirement. We created this retirement-age principle when people had much more money and life expectancies were much shorter. I think we should be very flexible about the integration of older people into the work process.

And finally, I think it's also a question of the meaning of life. What mission do people have when they get older? Do they feel idle, or do they still have a mission? I would say my own most important resource for staying active is this notion of having a mission in life.

Interviewer: Thank you, Professor Schwab.

Schwab: Thank you very much.