1 00:00:00,000 --> 00:00:01,440 Welcome back. They always say
2 00:00:01,440 --> 00:00:04,640 the camera never lies, but is that true?
3 00:00:04,640 --> 00:00:06,240 Take a look at this photograph.
4 00:00:06,240 --> 00:00:09,680 It's the winner of a Sony World Photography Award,
5 00:00:09,680 --> 00:00:12,640 but the artist who created it ended up rejecting his prize,
6 00:00:12,640 --> 00:00:13,680 admitting
7 00:00:13,680 --> 00:00:16,880 he made it using artificial intelligence.
8 00:00:16,880 --> 00:00:19,920 Well, the use of AI is seemingly everywhere these days,
9 00:00:19,920 --> 00:00:22,560 but the more it's used, the more debate it creates.
10 00:00:22,560 --> 00:00:27,760 Just this afternoon, a song that uses AI to clone the voices of Drake and The Weeknd,
11 00:00:27,760 --> 00:00:29,920 two of the biggest-selling artists in the world,
12 00:00:29,920 --> 00:00:33,200 has been removed from streaming services after stinging criticism
13 00:00:33,200 --> 00:00:36,320 from their music company over copyright laws.
14 00:00:36,320 --> 00:00:39,760 First, just have a listen to what they normally sound like.
15 00:00:53,760 --> 00:00:57,680 Well, now here's the AI version of Drake and The Weeknd.
16 00:00:57,920 --> 00:00:58,880 I'm absolutely fascinated by this.
17 00:00:58,880 --> 00:01:13,760 Joining me now in the studio is a music industry legend,
18 00:01:13,760 --> 00:01:15,760 and now the chief executive of TCAT,
19 00:01:15,760 --> 00:01:18,800 or To Catch A Thief, an anti-piracy business.
20 00:01:18,800 --> 00:01:22,320 And from Israel, a historian and best-selling author of Sapiens,
21 00:01:22,320 --> 00:01:23,840 Yuval Noah Harari.
22 00:01:23,840 --> 00:01:25,040 Well, welcome to both of you.
23 00:01:25,040 --> 00:01:27,520 And first to you: there was a great piece in the Telegraph today
24 00:01:27,520 --> 00:01:32,320 in which you were quoted about this song and about that photograph,
25 00:01:32,320 --> 00:01:37,680 really, I think, demonstrating what a threat AI is to the creative arts.
26 00:01:37,680 --> 00:01:41,200 My question for you is, how serious do you think that threat is?
27 00:01:41,200 --> 00:01:44,160 And should it really matter, or is this, in its own way,
28 00:01:44,160 --> 00:01:46,400 another form of art?
29 00:01:46,400 --> 00:01:47,760 Well, the threat is huge.
30 00:01:47,760 --> 00:01:51,120 We're on the edge of a precipice, and I'm not trying to be too dramatic here.
31 00:01:51,120 --> 00:01:54,000 It's come at us so quickly as well. That's the point.
32 00:01:55,040 --> 00:01:58,560 These two tracks that you played, I think people need to understand:
33 00:01:58,560 --> 00:02:02,000 the original was done by Drake and The Weeknd.
34 00:02:02,000 --> 00:02:07,520 The AI generator listens to all of Drake and all of The Weeknd,
35 00:02:07,520 --> 00:02:12,720 and then puts something together that comes out as the second track you played.
36 00:02:12,720 --> 00:02:14,960 So there is no artist involved.
37 00:02:14,960 --> 00:02:17,040 It is completely generated.
38 00:02:17,040 --> 00:02:18,720 So here's my question for you.
39 00:02:18,720 --> 00:02:22,880 I saw you tapping your fingers to the AI version.
40 00:02:22,880 --> 00:02:23,840 That was spontaneous, yeah.
41 00:02:23,920 --> 00:02:25,360 It's a good tune.
42 00:02:25,360 --> 00:02:26,400 Yeah.
43 00:02:26,400 --> 00:02:27,680 So why should we care?
44 00:02:27,680 --> 00:02:28,480 Why does it matter?
45 00:02:28,480 --> 00:02:30,880 Why couldn't Drake and The Weeknd actually think, you know what,
46 00:02:30,880 --> 00:02:32,000 this saves us a lot of work?
47 00:02:32,000 --> 00:02:33,280 Okay.
48 00:02:33,280 --> 00:02:34,000 Two things.
49 00:02:34,000 --> 00:02:38,800 One is, there's British copyright law, which dates back to 1988.
50 00:02:38,800 --> 00:02:41,920 So you could say it's a bit out of date, and it's being tinkered with.
51 00:02:41,920 --> 00:02:46,080 The other thing is passing off, which is not actually on the statute book.
52 00:02:46,080 --> 00:02:49,280 Passing off has been handed down through precedent by judges,
53 00:02:49,280 --> 00:02:51,040 and passing off is what this is.
54 00:02:51,040 --> 00:02:52,800 Which is why it's been removed. Exactly.
55 00:02:53,760 --> 00:02:55,280 What if Drake and The Weeknd, like I said,
56 00:02:55,280 --> 00:02:57,920 what if they decided to adopt this technology
57 00:02:57,920 --> 00:03:00,080 to save them the work of coming up with new songs?
58 00:03:00,080 --> 00:03:00,960 Is that possible?
59 00:03:00,960 --> 00:03:02,960 Could it not be a force for good?
60 00:03:02,960 --> 00:03:03,600 No.
61 00:03:03,600 --> 00:03:05,440 Artists don't do that.
62 00:03:05,440 --> 00:03:06,560 Even if it was a big hit?
63 00:03:06,560 --> 00:03:07,680 It's a good song.
64 00:03:07,680 --> 00:03:10,400 Well, what if it was? That's kind of what you're suggesting.
65 00:03:10,400 --> 00:03:11,200 Yeah.
66 00:03:11,200 --> 00:03:13,360 Artists, and I'm sure if Drake was here, he would say this,
67 00:03:13,360 --> 00:03:17,920 and he got very cross about another track that had been similarly dealt with,
68 00:03:17,920 --> 00:03:19,680 artists want to do their own stuff.
69 00:03:19,680 --> 00:03:21,920 That's what they're in the game for.
70 00:03:22,000 --> 00:03:26,080 We've had situations in the past with groups who have been doing
71 00:03:26,080 --> 00:03:28,960 rip-offs and samples and all the rest of it.
72 00:03:28,960 --> 00:03:31,760 Real artists want to do their own music.
73 00:03:31,760 --> 00:03:32,320 I want to make it.
74 00:03:32,320 --> 00:03:33,120 I don't want anybody else to.
75 00:03:33,120 --> 00:03:35,920 Well, what about the photograph, which was a beautiful photograph?
76 00:03:35,920 --> 00:03:38,800 So good, it won an award before the guy who did it
77 00:03:38,800 --> 00:03:41,440 said, well, actually, AI created this.
78 00:03:41,440 --> 00:03:45,120 As a lover of great photography, why should we care if that's been done
79 00:03:45,120 --> 00:03:47,680 by artificial intelligence or by an actual human being?
80 00:03:47,680 --> 00:03:48,960 Because they've infringed.
81 00:03:48,960 --> 00:03:51,200 It was done from the Getty library, wasn't it?
82 00:03:51,200 --> 00:03:52,800 And what they did, they went into the Getty library,
83 00:03:52,800 --> 00:03:54,000 and they just took it.
84 00:03:54,000 --> 00:03:56,320 They scraped it all out and up it came.
85 00:03:56,320 --> 00:03:57,520 And they scraped it so well.
86 00:03:57,520 --> 00:03:59,360 So it's theft, basically.
87 00:03:59,360 --> 00:04:00,000 It's theft.
88 00:04:00,000 --> 00:04:01,040 Yeah. Okay.
89 00:04:01,040 --> 00:04:02,240 Let me bring in Yuval.
90 00:04:02,240 --> 00:04:04,960 I want to start, if I may, Yuval, by playing this.
91 00:04:04,960 --> 00:04:08,080 This is a bit from Elon Musk talking to Tucker Carlson
92 00:04:08,080 --> 00:04:09,280 on Fox News last night.
93 00:04:09,280 --> 00:04:11,440 Very interesting, what he said.
94 00:04:11,440 --> 00:04:16,560 I'm going to start something which I think I'll call TruthGPT, or a
95 00:04:16,560 --> 00:04:21,040 maximum truth-seeking AI that tries to understand the nature of the universe.
96 00:04:21,040 --> 00:04:27,120 And I think this might be the best path to safety, in the sense that an AI that
97 00:04:27,120 --> 00:04:32,000 cares about understanding the universe is unlikely to annihilate humans,
98 00:04:32,000 --> 00:04:35,360 because we are an interesting part of the universe.
99 00:04:35,360 --> 00:04:38,320 So, you know, this came after he said that actually AI
100 00:04:38,320 --> 00:04:42,560 could lead to the end of humanity as we know it if we allow it,
101 00:04:42,560 --> 00:04:46,000 because it would be a superior intelligent force which, if it
102 00:04:46,000 --> 00:04:49,280 managed to break out from human control,
103 00:04:49,280 --> 00:04:50,400 could be the end of it.
104 00:04:50,400 --> 00:04:54,320 And that reminded me of a conversation I had with Professor Stephen Hawking
105 00:04:54,320 --> 00:04:56,880 in what was his last television interview, in which he said this.
106 00:04:58,160 --> 00:05:01,600 Is artificial intelligence going to be the end of us?
107 00:05:02,640 --> 00:05:06,960 And if it's not, how do we best work with it?
108 00:05:08,560 --> 00:05:11,920 Ever since the start of the industrial revolution,
109 00:05:11,920 --> 00:05:16,960 there have been fears of mass unemployment as machines replace humans.
110 00:05:18,320 --> 00:05:24,800 Instead, demand for goods and services has risen in line with their increased capabilities.
111 00:05:25,840 --> 00:05:30,320 Whether this can continue indefinitely is an open question,
112 00:05:30,320 --> 00:05:34,400 but there is a greater danger from artificial intelligence
113 00:05:34,400 --> 00:05:37,520 if we allow it to become self-designing,
114 00:05:37,600 --> 00:05:41,920 for then it can improve itself rapidly and we may lose control.
115 00:05:43,280 --> 00:05:48,080 So, really interesting that Elon Musk is now saying what Professor Hawking told me.
116 00:05:48,880 --> 00:05:55,440 You're an expert in this field. How worried should we be about artificial intelligence actually
117 00:05:55,440 --> 00:06:01,520 taking control? We should be very worried, because what we need to understand about
118 00:06:01,520 --> 00:06:07,520 AI, artificial intelligence, is that it is the first tool that can make decisions by itself.
119 00:06:08,160 --> 00:06:15,920 All previous inventions in human history always empowered us. They always gave us more power,
120 00:06:15,920 --> 00:06:22,240 because the decisions were always made by humans. If you invent a knife, the knife cannot decide
121 00:06:22,240 --> 00:06:28,080 whether to use it to cut salad, or to murder somebody, or to save a life in surgery.
122 00:06:28,160 --> 00:06:34,880 If you invent an atom bomb, similarly, the atom bomb cannot decide who to attack and when and where.
123 00:06:35,280 --> 00:06:41,680 AI is the first technology that can actually make decisions by itself. It can make decisions
124 00:06:41,680 --> 00:06:48,560 about its own usage and development. Nukes cannot make better nukes. But AI can make better
125 00:06:49,200 --> 00:06:57,200 AI, and also AI can make, and does make, decisions about us. Increasingly, when you apply to a bank
126 00:06:57,200 --> 00:07:04,640 to get a loan, or you apply to get a job, it's an AI making crucial decisions about your life.
127 00:07:04,640 --> 00:07:11,440 And we haven't seen anything yet. AI is just making its first tiny baby steps. It's something
128 00:07:11,440 --> 00:07:17,520 like 10 years old. You know, to really understand it, think about it like this: this is the beginning
129 00:07:17,600 --> 00:07:24,480 of organic life four billion years ago. This is the first amoeba crawling out of the organic
130 00:07:24,480 --> 00:07:31,360 soup. Can you imagine what Tyrannosaurus rex would look like, or what Homo sapiens would look like,
131 00:07:31,360 --> 00:07:35,760 and what they will be able to do? Okay, well, look,
132 00:07:35,760 --> 00:07:40,720 you've successfully terrified me and probably all my viewers. So how do we save ourselves
133 00:07:40,800 --> 00:07:48,000 from the T. rex of AI? Well, first of all, it can also, of course, be used for good, and so far
134 00:07:48,000 --> 00:07:54,240 we are still in control, but we don't know for how many years. And therefore we need to,
135 00:07:54,240 --> 00:08:01,360 first of all, understand the capabilities of AI and slow down its deployment to make sure that we
136 00:08:01,360 --> 00:08:09,200 use it wisely and safely. You know, the same way that a drug company cannot just release a new medicine
137 00:08:09,280 --> 00:08:15,840 to the public without going through very rigorous safety checks, it should be the same with AI.
138 00:08:15,840 --> 00:08:20,080 Right, because here's the problem, it seems to me, which is, it's fine to be
139 00:08:20,080 --> 00:08:24,160 well-meaning. I mean, Elon Musk, I think a lot of what he does, he's a force for good in many ways,
140 00:08:24,160 --> 00:08:29,440 you know, all the stuff he's been involved with is to try and, I think, help rather than damage the
141 00:08:29,440 --> 00:08:34,000 planet. But this AI is going to be getting into the hands of some pretty bad people,
142 00:08:34,000 --> 00:08:38,240 some pretty bad regimes, and they're not going to have any qualms about trying to get the edge
143 00:08:38,880 --> 00:08:42,800 over the West or America or wherever it may be. That's where I see the real danger.
144 00:08:42,800 --> 00:08:47,200 Is it, you know, a bit like nuclear weapons getting into rogue states' hands?
145 00:08:47,200 --> 00:08:52,000 Is it that once the wrong people have control of this, then all hell could break loose?
146 00:08:53,120 --> 00:08:59,120 In the hands of the wrong people, AI could be the end of democracy. AI could also be the basis
147 00:08:59,120 --> 00:09:05,680 for the worst totalitarian regimes in human history, because, you know, dictators always dreamt
148 00:09:05,760 --> 00:09:11,920 about following everybody and monitoring everybody all the time. But they could never do it.
149 00:09:11,920 --> 00:09:19,040 Because even in the Soviet Union, you had 200 million Soviet citizens; Stalin didn't have 200
150 00:09:19,040 --> 00:09:24,880 million KGB officers to follow everybody around all the time, and he didn't have
151 00:09:24,880 --> 00:09:30,160 millions of analysts to analyze all the data you collect. Okay, so now, with AI,
152 00:09:30,160 --> 00:09:35,520 now it is becoming possible. You don't need human agents to follow everyone around.
153 00:09:36,400 --> 00:09:42,480 Well, that was exciting. Our sound desk, which is only two days old, just crashed.
154 00:09:42,480 --> 00:09:47,040 So that's why we went off air a little suddenly in the middle of that debate about AI.
155 00:09:47,040 --> 00:09:53,520 As my mother texted me: oh my god, has AI taken over? Which actually was a pretty good question.
156 00:09:53,520 --> 00:09:58,080 So I'm rejoined, I hope, by our two guests: one is with me in the studio,
157 00:09:58,080 --> 00:10:03,440 and Yuval is in Israel. Yuval, my apologies; the technology deserted us there.
158 00:10:03,440 --> 00:10:08,000 We were just talking about it in the break. Yeah. I mean,
159 00:10:08,000 --> 00:10:12,480 I'll come back to you in a moment, Yuval, while we've got the link. Just to round off:
160 00:10:12,480 --> 00:10:15,920 you were telling me a story in the break there about Oasis. Tell me what that was about.
161 00:10:15,920 --> 00:10:19,920 As I'm sure your viewers know, Oasis, the Gallagher brothers, broke up.
162 00:10:19,920 --> 00:10:24,720 Okay, there has been talk of a reconciliation, but at the moment that's not on the table.
163 00:10:24,880 --> 00:10:29,120 If you go on YouTube, you will find something called AISIS, which is AI-SIS,
164 00:10:30,000 --> 00:10:33,680 a lost album that an AI generator has made.
165 00:10:33,680 --> 00:10:39,520 What's it like? It's uncannily good. Right, so if you were an Oasis fan, why would you not want that?
166 00:10:39,520 --> 00:10:46,720 Well, the answer is: yes, you would. But I mean, in terms of bands that have ceased to
167 00:10:46,800 --> 00:10:50,800 work, will people go down the route of trying to, I don't know,
168 00:10:50,800 --> 00:10:51,680 bring back a bit of a hero of theirs?
169 00:11:03,360 --> 00:11:07,520 I mean, it's fascinating. Let's go to Yuval finally. Yuval, give us
170 00:11:07,520 --> 00:11:12,000 some hope here, all right? I need to feel more hopeful than
171 00:11:12,000 --> 00:11:15,600 I do right now, because I basically think the world's going to end quite soon,
172 00:11:15,680 --> 00:11:21,840 when the Terminator becomes real. Do you believe the planet has the right
173 00:11:21,840 --> 00:11:27,520 people in the right place to actually stop that happening? I hope so. What we know about
174 00:11:27,520 --> 00:11:33,280 technology is that, you know, you can use the same technology to build completely different societies.
175 00:11:33,280 --> 00:11:40,720 In the 20th century, some people used trains and radio and electricity to build totalitarian
176 00:11:40,720 --> 00:11:45,440 regimes like the Soviet Union, and other people used exactly the same technology to build
177 00:11:45,680 --> 00:11:51,200 democracies. It's the same with AI and with the technologies of the 21st century;
178 00:11:51,200 --> 00:11:58,400 we still have a choice about how to employ them. I think that AI is nowhere near its full potential.
179 00:11:58,400 --> 00:12:04,960 Yeah. But also human beings are nowhere near our full potential. We don't really understand
180 00:12:04,960 --> 00:12:12,880 the full potential of our brains or our minds. If, for every dollar and every minute
181 00:12:12,880 --> 00:12:20,480 that we invest in developing AI, we invest another dollar and minute in developing our own minds
182 00:12:20,480 --> 00:12:25,440 and our own consciousness, I think we will be okay. Fascinating. Yuval Noah Harari, thank you
183 00:12:25,440 --> 00:12:29,520 so much, honestly.
A gripping discussion. Apologies for the interruption.