1 00:00:01,000 --> 00:00:10,320 This, this moment right now, this feels real, but is it? 2 00:00:20,840 --> 00:00:26,220 They are the fake, fake, disgusting news. 3 00:00:26,220 --> 00:00:32,220 People look like they have a much better life than they really do. 4 00:00:32,220 --> 00:00:37,220 We can't reliably distinguish true memories from false memories. 5 00:00:37,220 --> 00:00:43,220 I'm not sure you've ever experienced anything real your entire life. 6 00:00:43,220 --> 00:00:55,680 There's a famous quote that says a lie gets halfway around the world before the truth 7 00:00:55,680 --> 00:00:58,220 has a chance to get its pants on. 8 00:00:58,220 --> 00:01:03,220 This quote is widely attributed to Winston Churchill, which is a lie, ironically. 9 00:01:03,220 --> 00:01:05,220 There's no evidence he ever said that. 10 00:01:05,220 --> 00:01:11,220 But it seems the quote itself has some truth, as an MIT study found fake news spread six 11 00:01:11,220 --> 00:01:13,220 times faster than true news. 12 00:01:13,220 --> 00:01:18,220 And online in particular, where anyone can post anything with virtually no consequences, 13 00:01:18,220 --> 00:01:23,220 it's no wonder that the media we consume is infested with lies. 14 00:01:23,220 --> 00:01:29,220 However, the actual term fake news is believed to have originated in just 2016 from a small 15 00:01:29,220 --> 00:01:35,220 town in Macedonia, where teenagers were creating websites that claimed to be news sources, but 16 00:01:35,220 --> 00:01:39,220 were actually just filled with fabricated and sensationalized stories. 17 00:01:39,220 --> 00:01:40,220 Why? 18 00:01:40,220 --> 00:01:41,220 Ad revenue. 19 00:01:41,220 --> 00:01:43,220 The more clicks, the more money they made. 20 00:01:43,220 --> 00:01:48,220 And of course, it's a lot easier to get clicks when you can just make up whatever tantalizing 21 00:01:48,220 --> 00:01:49,220 story you want. 
22 00:01:49,220 --> 00:01:55,220 The thing is, whilst these were just teenagers looking to make a quick buck, the media conglomerates 23 00:01:55,220 --> 00:01:58,220 were coming to a similar realization. 24 00:01:58,220 --> 00:02:04,220 As printed news died out and they lost their loyal subscribers, they faced a new sea of competition 25 00:02:04,220 --> 00:02:07,220 online, and so the clickbait arms race began. 26 00:02:07,220 --> 00:02:12,220 After all, the business model of the news and the media is selling your attention to advertisers. 27 00:02:12,220 --> 00:02:18,220 You are the product, and they have to keep you watching, reading, clicking, keep you consumed. 28 00:02:18,220 --> 00:02:23,220 Their primary goal is not to deliver the most accurate news, it's to attract more attention 29 00:02:23,220 --> 00:02:24,220 and make more money. 30 00:02:24,220 --> 00:02:29,220 And it's been proven that polarizing stories perform best, so no wonder we are more divided 31 00:02:29,220 --> 00:02:34,220 than ever when it's in the media's best interest to manufacture fear and outrage. 32 00:02:34,220 --> 00:02:37,220 It is better for them that you are scared and angry. 33 00:02:37,220 --> 00:02:40,220 And that's just when the goal is money. 34 00:02:40,220 --> 00:02:44,220 What about when powerful groups plant fake news stories to deliberately pit us against 35 00:02:44,220 --> 00:02:47,220 each other and create a certain narrative? 36 00:02:47,220 --> 00:02:50,220 Or countries who plant fake stories to influence elections? 37 00:02:50,220 --> 00:02:56,220 It's no wonder fake news is so prevalent when there is so much money and power to be gained 38 00:02:56,220 --> 00:02:58,220 from manipulating us like this. 
39 00:02:58,220 --> 00:03:04,220 And even when these false stories get debunked, the damage is already done, because enough 40 00:03:04,220 --> 00:03:09,220 people will have already read and believed the lies, and the truth just isn't as exciting, 41 00:03:09,220 --> 00:03:11,220 so it doesn't spread as widely. 42 00:03:11,220 --> 00:03:13,220 Of course, fake news is a whole spectrum. 43 00:03:13,220 --> 00:03:19,220 Yes, there still are stories that are totally made up of just blatant lies and propaganda. 44 00:03:19,220 --> 00:03:24,220 But if we're talking about mainstream news, there's actually something more subtle going 45 00:03:24,220 --> 00:03:33,220 on at the other end of the spectrum that's perhaps even more worrying. 46 00:03:33,220 --> 00:03:38,220 Half-truths are statements that are partially true but not fully true, or not the whole story. 47 00:03:38,220 --> 00:03:42,220 For example, if someone's pulled over for drunk driving, they may tell the officer, 48 00:03:42,220 --> 00:03:48,220 I only had one beer, which is true, but doesn't mention they also drank a bottle of wine. 49 00:03:48,220 --> 00:03:53,220 In the media, half-truths allow you to make something seem believable and avoid libel, 50 00:03:53,220 --> 00:03:59,220 but remove all the context and nuance so you can distort reality to suit the narrative you want. 51 00:03:59,220 --> 00:04:04,220 And so, this is why two news stations can talk about the same issue but have such wildly 52 00:04:04,220 --> 00:04:05,220 different arguments. 53 00:04:05,220 --> 00:04:10,220 You're always just seeing a carefully constructed snapshot of what someone wants you to see, 54 00:04:10,220 --> 00:04:13,220 never the full picture with all the details. 55 00:04:13,220 --> 00:04:18,220 And at least with Fox and CNN, you know they have clear biases towards certain political 56 00:04:18,220 --> 00:04:20,220 parties and ideologies. 
57 00:04:20,220 --> 00:04:24,220 The real problem is half-truths are inescapable everywhere. 58 00:04:24,220 --> 00:04:28,220 There is infinite information out there that can fit almost any narrative. 59 00:04:28,220 --> 00:04:33,220 And so when it's a human decision of which stories to report and which stats and details 60 00:04:33,220 --> 00:04:36,220 to include, there will always be bias. 61 00:04:36,220 --> 00:04:42,220 Even pictures and videos can be so easily edited or taken out of context to imply something 62 00:04:42,220 --> 00:04:46,220 totally different, that fits the narrative someone wants. 63 00:04:46,220 --> 00:04:51,220 Like how an obvious joke can be edited out of context to demonise someone. 64 00:04:51,220 --> 00:04:53,220 And we're not just talking about news here. 65 00:04:53,220 --> 00:04:57,220 Six companies own almost all of the world's media. 66 00:04:57,220 --> 00:04:59,220 Disney own all of this. 67 00:04:59,220 --> 00:05:01,220 News Corp own all of this. 68 00:05:01,220 --> 00:05:04,220 And Comcast own all of this. 69 00:05:04,220 --> 00:05:11,220 So you essentially have a very small and powerful group owning and controlling most of the media 70 00:05:11,220 --> 00:05:16,220 and quite literally being able to pull the strings of what we hear and what we don't. 71 00:05:16,220 --> 00:05:20,220 Putting whatever spin on information they want that fits their agenda. 72 00:05:20,220 --> 00:05:25,220 After all, controlling our media is the closest thing there is to controlling how we think, 73 00:05:25,220 --> 00:05:30,220 how we vote, how we act, how we feel. 74 00:05:30,220 --> 00:05:35,220 Just like Inception, they can plant an idea in your mind without you really being aware of it 75 00:05:35,220 --> 00:05:38,220 and it can spread like a virus. 76 00:05:38,220 --> 00:05:42,220 A lie told a thousand times becomes the truth. 77 00:05:42,220 --> 00:05:44,220 But it gets worse.
78 00:05:44,220 --> 00:05:53,220 If we're honest, we all know that there's malicious forces out there planting fake news stories 79 00:05:53,220 --> 00:05:58,220 and we all know that the media only give us half-truths that suit their agenda. 80 00:05:58,220 --> 00:06:02,220 But we still like to pretend that we don't fall for that. 81 00:06:02,220 --> 00:06:05,220 We get our information from unbiased sources. 82 00:06:05,220 --> 00:06:07,220 But we don't. 83 00:06:07,220 --> 00:06:10,220 Because unbiased doesn't exist. 84 00:06:10,220 --> 00:06:15,220 Even if it's the most honest journalists trying to be fair and impartial, 85 00:06:15,220 --> 00:06:20,220 they've been indoctrinated just like the rest of us and have their own subconscious biases. 86 00:06:20,220 --> 00:06:26,220 Even if it's not intentional, their choice of wording and what to include is unavoidably going to have bias. 87 00:06:26,220 --> 00:06:31,220 Their worldview will always influence what they show you and how they say it. 88 00:06:31,220 --> 00:06:35,220 It's the same with sites that claim to deliver impartial balanced news. 89 00:06:35,220 --> 00:06:40,220 It's still down to a small group of people who have their own biases who decide what to show. 90 00:06:40,220 --> 00:06:44,220 And subconsciously, they will want to focus on the issues important to them 91 00:06:44,220 --> 00:06:48,220 and they'll want to discredit the views and ideas they don't agree with 92 00:06:48,220 --> 00:06:51,220 and pick stories that show them in a less favorable light. 93 00:06:51,220 --> 00:06:56,220 But we're not just battling fake news, we're battling human psychology. 94 00:06:56,220 --> 00:06:59,220 It's like a game of Chinese whispers or telephone. 95 00:06:59,220 --> 00:07:05,220 By the time you hear any information, it's been passed through multiple other biased sources 96 00:07:05,220 --> 00:07:08,220 who've distorted it, whether intentionally or not. 
97 00:07:08,220 --> 00:07:14,220 And so you start to realize, to a certain extent, all news is fake news 98 00:07:14,220 --> 00:07:18,220 because 100% true and objective news is impossible. 99 00:07:18,220 --> 00:07:23,220 Whether it's fake news, half-truths or simply just a subconscious bias, 100 00:07:23,220 --> 00:07:27,220 in one way or another, you're being lied to. 101 00:07:27,220 --> 00:07:30,220 And when you think about it, this isn't surprising. 102 00:07:30,220 --> 00:07:34,220 What is surprising is that we don't really seem to care. 103 00:07:42,220 --> 00:07:49,220 Confirmation bias is the tendency to process information in a way that supports the ideas and beliefs we already have. 104 00:07:49,220 --> 00:07:52,220 In other words, we want our worldview to make sense 105 00:07:52,220 --> 00:07:58,220 and so we instinctively seek out and believe the information that aligns with our views 106 00:07:58,220 --> 00:08:00,220 and disregard the rest. 107 00:08:00,220 --> 00:08:04,220 And this is just one of many cognitive biases we all have 108 00:08:04,220 --> 00:08:07,220 that make us care less about the accuracy of information 109 00:08:07,220 --> 00:08:09,220 and more about being right. 110 00:08:09,220 --> 00:08:12,220 It's why two people will hear the same information 111 00:08:12,220 --> 00:08:16,220 and both will interpret it to fit their current beliefs. 112 00:08:16,220 --> 00:08:21,220 For example, two people with opposing views can hear about a mass shooting 113 00:08:21,220 --> 00:08:23,220 and the one who believes in tighter gun control 114 00:08:23,220 --> 00:08:26,220 will see it as evidence of why their opinion is right. 115 00:08:26,220 --> 00:08:30,220 And the one who believes civilians need guns to defend against attackers 116 00:08:30,220 --> 00:08:33,220 will see it as evidence of why their opinion is right.
117 00:08:33,220 --> 00:08:38,220 Or another example is how if you hear something bad about someone you don't like 118 00:08:38,220 --> 00:08:40,220 you believe it immediately 119 00:08:40,220 --> 00:08:42,220 whereas if it's something bad about someone you do like 120 00:08:42,220 --> 00:08:44,220 you are much more skeptical of it. 121 00:08:44,220 --> 00:08:50,220 Biases like these are why nobody ever really changes their opinion in a debate or argument 122 00:08:50,220 --> 00:08:54,220 they just become even more entrenched in their original views. 123 00:08:54,220 --> 00:08:58,220 Our minds simplify things to try and make sense of the world 124 00:08:58,220 --> 00:09:02,220 and just like with the media, accuracy is not the priority. 125 00:09:02,220 --> 00:09:07,220 Even our own memories can't be trusted. 126 00:09:07,220 --> 00:09:12,220 Our minds want to fill in the gaps, which leads us to invent or remove details 127 00:09:12,220 --> 00:09:14,220 without being aware of it. 128 00:09:14,220 --> 00:09:18,220 Which is why two people can remember the same story differently. 129 00:09:18,220 --> 00:09:21,220 We think of our brain like a database but it's not. 130 00:09:21,220 --> 00:09:24,220 It looks for patterns and generalizations 131 00:09:24,220 --> 00:09:26,220 and rather than remembering exact details 132 00:09:26,220 --> 00:09:29,220 it remembers fragments of memories. 133 00:09:29,220 --> 00:09:30,220 So when you recall a memory 134 00:09:30,220 --> 00:09:34,220 your brain is just trying to piece together some of those fragments 135 00:09:34,220 --> 00:09:38,220 and the memory can easily be distorted in ways we don't even realize. 136 00:09:38,220 --> 00:09:40,220 Just look at the Mandela effect 137 00:09:40,220 --> 00:09:42,220 where huge groups of people remembered 138 00:09:42,220 --> 00:09:45,220 Nelson Mandela dying in prison in the 80s 139 00:09:45,220 --> 00:09:48,220 even though he didn't die until 2013.
140 00:09:48,220 --> 00:09:52,220 Same way that one of the most iconic lines in film is remembered as 141 00:09:52,220 --> 00:09:56,220 Luke, I am your father, when it was actually 142 00:09:56,220 --> 00:09:58,220 No, I am your father. 143 00:09:58,220 --> 00:10:00,220 Both collectively and individually 144 00:10:00,220 --> 00:10:02,220 our memories fail us constantly. 145 00:10:02,220 --> 00:10:04,220 And as if that wasn't bad enough 146 00:10:04,220 --> 00:10:08,220 then something came along that made the truth even harder to decipher. 147 00:10:08,220 --> 00:10:20,220 Just like the media conglomerates, social media platforms profit from sucking you in. 148 00:10:20,220 --> 00:10:24,220 The more addicted you are the better for them and the more money they make. 149 00:10:24,220 --> 00:10:30,220 And we already know about how they exploit human vulnerabilities as they call it. 150 00:10:30,220 --> 00:10:34,220 Because they've publicly talked about how they created the endless scroll effect, 151 00:10:34,220 --> 00:10:40,220 the dopamine hits and an algorithm designed to continually keep you on the platform. 152 00:10:40,220 --> 00:10:48,220 Just like with the media, these social platforms know that polarizing and controversial content performs best. 153 00:10:48,220 --> 00:10:51,220 The stuff that triggers an emotional reaction in people. 154 00:10:51,220 --> 00:10:55,220 Reliability is not a factor the algorithm cares about. 155 00:10:55,220 --> 00:10:58,220 And this helps fuel the huge divide between us all. 156 00:10:58,220 --> 00:11:03,220 Everyone's feed is different and tailored to them and their beliefs. 157 00:11:03,220 --> 00:11:09,220 You essentially get put in an echo chamber that shows you the articles that align with your views 158 00:11:09,220 --> 00:11:12,220 and articles that dismiss the alternatives. 
159 00:11:12,220 --> 00:11:16,220 And because these social platforms are tracking everything you click, read and see, 160 00:11:16,220 --> 00:11:21,220 the algorithm very quickly learns exactly how to keep you there longer 161 00:11:21,220 --> 00:11:25,220 which in turn makes you become more and more entrenched in your beliefs. 162 00:11:25,220 --> 00:11:29,220 Leaving you to wonder, how can the other side be so stupid? 163 00:11:29,220 --> 00:11:32,220 How are they not seeing the information I'm seeing? 164 00:11:32,220 --> 00:11:35,220 But they're often not being shown it. 165 00:11:35,220 --> 00:11:38,220 It's difficult to have empathy when you literally cannot see 166 00:11:38,220 --> 00:11:40,220 where the other side are coming from. 167 00:11:40,220 --> 00:11:44,220 We all think we're smarter than average, which means most of us must be wrong. 168 00:11:44,220 --> 00:11:48,220 The reality is that it's easier to believe that we're on the smart side 169 00:11:48,220 --> 00:11:51,220 and that the others who disagree with us must be idiots 170 00:11:51,220 --> 00:11:53,220 when often we're just consuming different media 171 00:11:53,220 --> 00:11:56,220 and none of us are seeing the full picture. 172 00:11:56,220 --> 00:12:01,220 In fact, some scholars have argued that because there are so many media sources to choose from, 173 00:12:01,220 --> 00:12:08,220 we now have selective exposure and can just follow the sources that align with the views we want to hear. 174 00:12:08,220 --> 00:12:12,220 Which may be why the divide between the right and left grows wider 175 00:12:12,220 --> 00:12:16,220 and why there's a reported increase in extremist views. 176 00:12:16,220 --> 00:12:22,220 This is only compounded by the fact that social media is designed for sound bites, not details.
177 00:12:22,220 --> 00:12:27,220 We are living in an age where information is distilled into 280 characters or less 178 00:12:27,220 --> 00:12:31,220 and where most people just read the exaggerated headlines anyway. 179 00:12:31,220 --> 00:12:38,220 And whilst it's easy to blame the media, blame the platforms, at some point we have to blame ourselves too. 180 00:12:38,220 --> 00:12:45,220 The reason fake news spreads so fast on social media is because we share it without questioning how valid or true it really is. 181 00:12:45,220 --> 00:12:51,220 If it aligns with our views, we rarely question the bias or lack of context it may have. 182 00:12:51,220 --> 00:12:57,220 After all, the world is incredibly complicated and almost every issue is nuanced and intricate. 183 00:12:57,220 --> 00:13:01,220 So, we simplify things to such basic, two-dimensional versions. 184 00:13:01,220 --> 00:13:05,220 We pretend things are more black and white than they ever are. 185 00:13:05,220 --> 00:13:11,220 We look to tear down or ignore people with opposing views, not to try and understand where they're coming from. 186 00:13:11,220 --> 00:13:19,220 And then we contribute to the fakeness further by presenting our own lives to look the way we want them to look rather than reality. 187 00:13:19,220 --> 00:13:28,220 We fill our social media with filtered photos and fake friends until the sites that promise to connect the world have left us more disconnected than ever. 188 00:13:28,220 --> 00:13:33,220 It's hardly surprising certain influencers exploit this concept for profit. 189 00:13:33,220 --> 00:13:39,220 By showing you a glimpse of their seemingly perfect life so that you buy the products they allegedly recommend, 190 00:13:39,220 --> 00:13:43,220 it's only when you start to look deeper that the illusion starts to crumble. 191 00:13:43,220 --> 00:13:57,220 In fact, when you start to question what's actually real, everything starts to unravel.
192 00:13:57,220 --> 00:14:04,220 We're entering an era in which our enemies can make it look like anyone is saying anything at any point in time. 193 00:14:04,220 --> 00:14:07,220 As you may know, that was not Obama really saying that. 194 00:14:07,220 --> 00:14:12,220 That was a deepfake video that was artificially created several years ago. 195 00:14:12,220 --> 00:14:22,220 So just imagine how accurate this technology will be in a few more years when video can be created depicting anyone saying or doing anything without their consent. 196 00:14:22,220 --> 00:14:26,220 And it will be entirely indistinguishable from the real thing. 197 00:14:26,220 --> 00:14:31,220 It won't be long until recordings won't be considered evidence in a court of law. 198 00:14:31,220 --> 00:14:35,220 So again, you start to question what can we trust? 199 00:14:35,220 --> 00:14:36,220 How about documentaries? 200 00:14:36,220 --> 00:14:42,220 The very definition of a documentary is to provide a factual report on a particular subject. 201 00:14:42,220 --> 00:14:46,220 But if you look at some of the most popular documentaries in recent times, 202 00:14:46,220 --> 00:14:51,220 you'll see they are filled with the same half-truths as everything else. 203 00:14:51,220 --> 00:14:58,220 They need views to be successful and what better way to get views than making them as sensationalized as possible? 204 00:14:58,220 --> 00:15:04,220 Yes, they may still have a lot of truth in them, but they're still never going to give you a full, balanced perspective. 205 00:15:04,220 --> 00:15:09,220 They're still designed to grab your attention and create an emotional response. 206 00:15:09,220 --> 00:15:12,220 Entertainment is the priority, not accuracy. 207 00:15:12,220 --> 00:15:15,220 Okay, so how about statistics? 208 00:15:15,220 --> 00:15:17,220 We think of them as being very factual.
209 00:15:17,220 --> 00:15:24,220 But of course, they're so often twisted and wildly extrapolated to suit a specific agenda as well. 210 00:15:24,220 --> 00:15:29,220 Like, I can tell you that statistically, the safest place in the world to be born is Antarctica, 211 00:15:29,220 --> 00:15:32,220 because the stats show they have the lowest infant mortality rate. 212 00:15:32,220 --> 00:15:35,220 Zero babies have ever died there. 213 00:15:35,220 --> 00:15:39,220 But I won't mention that only 11 babies have ever been born there. 214 00:15:39,220 --> 00:15:41,220 It's the same with studies. 215 00:15:41,220 --> 00:15:46,220 How often does someone quote the result of a study without ever looking into what the sample size was, 216 00:15:46,220 --> 00:15:51,220 who paid for it, or whether they had vested interests in the results? 217 00:15:51,220 --> 00:15:55,220 It's for this reason that every other week we hear this thing is bad for you, 218 00:15:55,220 --> 00:15:58,220 and then the next week it's supposedly good for you. 219 00:15:58,220 --> 00:16:03,220 For example, there was a study that found women are more attracted to men who don't smile. 220 00:16:03,220 --> 00:16:05,220 Except, did they find that? 221 00:16:05,220 --> 00:16:10,220 What actually happened is in a one-off experiment that was not repeated, 222 00:16:10,220 --> 00:16:16,220 1,000 people from a specific region rated the attractiveness of certain photographs. 223 00:16:16,220 --> 00:16:19,220 If the experiment was repeated with different people or photos, 224 00:16:19,220 --> 00:16:24,220 there's definitely no guarantee of the same results because there's so many variables involved. 225 00:16:24,220 --> 00:16:31,220 And so, can you conclude from that small study that universally women are more attracted to men who don't smile? 226 00:16:31,220 --> 00:16:33,220 No, not at all.
227 00:16:33,220 --> 00:16:39,220 But that's what was reported very widely by places like Business Insider and Psychology Today, 228 00:16:39,220 --> 00:16:42,220 because that made for a better headline. 229 00:16:42,220 --> 00:16:46,220 And once one source starts sharing it, others share it too, 230 00:16:46,220 --> 00:16:49,220 and then people start citing it in conversation, 231 00:16:49,220 --> 00:16:54,220 and just like with the media, all context and detail is stripped away and forgotten. 232 00:16:54,220 --> 00:16:58,220 Even look at one of the most famous psychology studies of all time, 233 00:16:58,220 --> 00:17:01,220 the Stanford Prison Experiment. 234 00:17:01,220 --> 00:17:06,220 This was where participants were assigned roles as either inmates or guards in a mock prison. 235 00:17:06,220 --> 00:17:09,220 And soon after beginning, the guards started mistreating the prisoners, 236 00:17:09,220 --> 00:17:14,220 with the researchers concluding that throwing innocent people into a situation of power 237 00:17:14,220 --> 00:17:16,220 would lead them to abuse it. 238 00:17:16,220 --> 00:17:21,220 This is taught in schools and universities, documentaries, there's even a movie about it. 239 00:17:21,220 --> 00:17:23,220 But it was all a sham. 240 00:17:23,220 --> 00:17:26,220 There's audio showing that the guards were told to act rough, 241 00:17:26,220 --> 00:17:32,220 and one of the prisoners has even admitted that he faked a breakdown because he wanted to leave early for an exam, 242 00:17:32,220 --> 00:17:35,220 and thought that's what the researchers wanted to see. 243 00:17:35,220 --> 00:17:41,220 So it's no surprise, this experiment's been repeated several times, but the results never matched. 244 00:17:41,220 --> 00:17:44,220 And yet, it's often quoted as fact.
245 00:17:44,220 --> 00:17:51,220 And so again, we get to the real heart of the problem, which is that just like the media or even our own memories, 246 00:17:51,220 --> 00:17:56,220 there's greater incentive to twist things to fit the narrative you want. 247 00:17:56,220 --> 00:18:01,220 Because if the experiment doesn't show something new and exciting, there's no story, 248 00:18:01,220 --> 00:18:03,220 and it's not going to help your career. 249 00:18:03,220 --> 00:18:09,220 So there is every incentive to make the results more interesting and exciting, not accurate. 250 00:18:09,220 --> 00:18:18,220 But here's the thing, if we're saying we can't trust videos and documentaries, we can't trust stats and studies, 251 00:18:18,220 --> 00:18:23,220 suddenly you really do have to start questioning everything. 252 00:18:23,220 --> 00:18:28,220 Quotes, almost all wrongly attributed to a small group of famous people. 253 00:18:28,220 --> 00:18:34,220 Reviews, it's estimated well over half are totally fake to help sell more products. 254 00:18:34,220 --> 00:18:40,220 YouTube, the highest ranked videos are the ones that are best at search engine optimization, not the ones that are most accurate. 255 00:18:40,220 --> 00:18:43,220 Wikipedia, literally editable by anyone. 256 00:18:43,220 --> 00:18:44,220 Money isn't real. 257 00:18:44,220 --> 00:18:49,220 If we all stopped believing that this paper or those digits on a screen meant anything, then they wouldn't. 258 00:18:49,220 --> 00:18:54,220 Our senses, I can tell you this is orange, but we could be seeing two totally different things. 259 00:18:54,220 --> 00:18:59,220 Our choices, based on emotion and cognitive biases, not logic. 260 00:18:59,220 --> 00:19:04,220 Our feelings, manipulated constantly by the media and advertisers. Our reality. 261 00:19:06,220 --> 00:19:09,220 This is where things start to get really worrying. 262 00:19:09,220 --> 00:19:23,220 So here we are, divided, confused and unsure who or what to believe.
263 00:19:23,220 --> 00:19:31,220 It's clear that almost everything has some bias, even if not intentional, which leads to beliefs that we never actually stop to question. 264 00:19:31,220 --> 00:19:37,220 For example, there are about 10,000 religions worldwide, and yet almost everyone follows the one they were born into. 265 00:19:37,220 --> 00:19:42,220 If they'd been born somewhere else, to different parents, they'd believe something different. 266 00:19:42,220 --> 00:19:45,220 And this is not to pick on any singular belief. 267 00:19:45,220 --> 00:19:49,220 Almost all beliefs are based on second-hand information. 268 00:19:49,220 --> 00:19:56,220 There's a quote that if it weren't for movies, the average person would probably have no idea what an elevator shaft looks like. 269 00:19:56,220 --> 00:20:01,220 We say we know what they look like, but actually we've just seen movies that tell us what they look like. 270 00:20:01,220 --> 00:20:11,220 The reality is that even things that we think of as being incredibly obvious and unquestionable, like terrorists are evil, it's always more nuanced. 271 00:20:11,220 --> 00:20:13,220 To them, we're the evil ones. 272 00:20:13,220 --> 00:20:19,220 It's all just down to the very different experiences and beliefs that have been instilled in us. 273 00:20:19,220 --> 00:20:23,220 We're all just focused on our own half-truths. 274 00:20:23,220 --> 00:20:29,220 Because even with hard science, we're still just trusting that the scientists have got it right. 275 00:20:29,220 --> 00:20:34,220 How many of us have actually read all the papers on an issue and checked for ourselves? 276 00:20:34,220 --> 00:20:39,220 And I'm not saying we should doubt science, but we do know that science often proves itself wrong. 277 00:20:39,220 --> 00:20:42,220 And anything can change when new evidence comes to light. 278 00:20:42,220 --> 00:20:46,220 So, if it's been wrong before, how do we know it's not wrong right now?
279 00:20:46,220 --> 00:20:52,220 How do we know that what we've been told about our world won't get disproved in the future? 280 00:20:52,220 --> 00:21:01,220 Let's look at Elon Musk, regarded by many as one of the smartest people alive, creating electric cars, chips for the brain and reusable rockets. 281 00:21:01,220 --> 00:21:06,220 He says there's about a one in a billion chance we're even in base reality. 282 00:21:06,220 --> 00:21:13,220 In other words, he proposes that right now, the chances we are just in some kind of simulation is almost certain. 283 00:21:13,220 --> 00:21:20,220 His argument is that if you think of the progress that has been made in video game technology in just the last 50 years, 284 00:21:20,220 --> 00:21:28,220 even if that improvement rate slows down, it seems inevitable we will get to a point where games are indistinguishable from reality. 285 00:21:28,220 --> 00:21:34,220 We're not far off with virtual reality at the moment, let alone in another 50 years. 286 00:21:34,220 --> 00:21:45,220 So, if we can create a world that's indistinguishable from reality, it seems highly likely we are already in some kind of game or simulation or matrix type world. 287 00:21:45,220 --> 00:21:49,220 I mean, if you magnify anything enough, it does become pixelated. 288 00:21:49,220 --> 00:21:54,220 And yes, it may sound far fetched, but how can you prove otherwise? 289 00:21:54,220 --> 00:22:03,220 And is it really that much harder than believing we're in an infinitely large universe spinning on a giant space rock with around 8.7 million other species, 290 00:22:03,220 --> 00:22:10,220 all created by a big bang or a god or whatever you want to believe in based on the beliefs you were indoctrinated with when you grew up? 291 00:22:10,220 --> 00:22:17,220 And suddenly you start to think about a dream you had that just felt so real, and then you woke up. 
292 00:22:17,220 --> 00:22:22,220 And you start to question, could you be in some kind of extended dream right now? 293 00:22:22,220 --> 00:22:26,220 Your senses have deceived you before, why is this time any different? 294 00:22:26,220 --> 00:22:31,220 Solipsism is the belief that there is nothing outside of one's own mind. 295 00:22:31,220 --> 00:22:36,220 If you are a brain in a vat, you have no way of knowing you're just a brain in a vat. 296 00:22:36,220 --> 00:22:44,220 You can never rule out the possibility that the entire world you experience, including all other people, isn't real. 297 00:22:44,220 --> 00:22:48,220 It's a figment of your imagination, all in your mind. 298 00:22:48,220 --> 00:22:54,220 When you really start to question the world around you, from the news you hear to the reality you experience, 299 00:22:54,220 --> 00:22:58,220 it's easy to conclude that everything is a lie. 300 00:22:58,220 --> 00:23:00,220 And I'd like to end this video with solutions. 301 00:23:00,220 --> 00:23:04,220 I'd like to tell you that you can still hold a strong position on something. 302 00:23:04,220 --> 00:23:08,220 You just need to be able to argue the other side of the argument first. 303 00:23:08,220 --> 00:23:15,220 That you should seek out the smartest people on the other side of the debate to get their opinion before jumping to conclusions. 304 00:23:15,220 --> 00:23:18,220 I'd like to tell you that you can still get a good idea of the truth. 305 00:23:18,220 --> 00:23:25,220 You just have to dig a little deeper to get a balanced perspective and not rely on a small pool of media sources. 306 00:23:25,220 --> 00:23:32,220 But I can't tell you that, because of course by now you're starting to see that even this video, 307 00:23:32,220 --> 00:23:39,220 the very video that's been on your side pointing out the lies, is in itself a lie. 308 00:23:39,220 --> 00:23:42,220 It's all based on my biased point of view.
309 00:23:42,220 --> 00:23:48,220 I've used stats that you just took at face value and even if I tell you I got them from a reliable source, 310 00:23:48,220 --> 00:23:52,220 did the person creating the stats have a bias? 311 00:23:52,220 --> 00:23:55,220 Even the title of this video, it's a paradox. 312 00:23:55,220 --> 00:23:59,220 If I'm telling you the truth that everything is a lie, then everything isn't a lie. 313 00:23:59,220 --> 00:24:01,220 Which means I was lying. 314 00:24:01,220 --> 00:24:06,220 The more you think about this whole problem of truth and lies, the worse it gets. 315 00:24:06,220 --> 00:24:10,220 The more confusing and terrifying everything becomes. 316 00:24:10,220 --> 00:24:11,220 Is choice an illusion? 317 00:24:11,220 --> 00:24:19,220 Are we being manipulated by a hidden hand that's choosing our politicians and keeping us divided so we fight each other instead of them? 318 00:24:19,220 --> 00:24:25,220 Are our own minds tricking us into making biased judgements on the entire world around us? 319 00:24:25,220 --> 00:24:31,220 So much so that we can't be sure of anything outside of the existence of our own mind? 320 00:24:31,220 --> 00:24:37,220 Eventually you start to wonder if questioning everything is worth it or whether blissful ignorance is better. 321 00:24:37,220 --> 00:24:42,220 Because if you really are going to question anything that you can't prove for certain, 322 00:24:42,220 --> 00:24:47,220 then you have to accept that you could be in a Truman-style reality show right now, 323 00:24:47,220 --> 00:24:51,220 where the world around you isn't real and everyone else but you is in on it. 324 00:24:51,220 --> 00:24:53,220 Which means even I'm in on it. 325 00:24:53,220 --> 00:24:57,220 And they're probably going to cut this video off soon because I've already said too much and over-