1 00:00:00,000 --> 00:00:05,000 We don't have everyone in yet, but we'll get started. 2 00:00:05,000 --> 00:00:12,000 I will very briefly introduce our guest this afternoon. 3 00:00:12,000 --> 00:00:17,000 He's a blogger; you must have read him at Schneier.com. 4 00:00:17,000 --> 00:00:23,000 He's an author of many works, including Liars and Outliers, which we will be talking about today. 5 00:00:23,000 --> 00:00:27,000 Ladies and gentlemen, this is Bruce Schneier. 6 00:00:28,000 --> 00:00:30,000 How are you? 7 00:00:32,000 --> 00:00:36,000 I'm going to put that in the back; it's all good. 8 00:00:36,000 --> 00:00:41,000 Hi, thanks. I'm not going to talk about my book, because I figured if you want to hear about it, 9 00:00:41,000 --> 00:00:46,000 you can read it. I'd rather talk about stuff I've been thinking about since then. 10 00:00:46,000 --> 00:00:50,000 This is very much ideas in progress, 11 00:00:50,000 --> 00:00:52,000 which makes it good for a talk here. 12 00:00:52,000 --> 00:00:56,000 I'm interested in feedback and comments, and there'll be time for that. 13 00:00:56,000 --> 00:01:00,000 What I want to talk about is security and power. 14 00:01:00,000 --> 00:01:05,000 I think that is a lot of what's interesting and going on right now. 15 00:01:05,000 --> 00:01:10,000 So basically, technologies are disruptive. 16 00:01:10,000 --> 00:01:16,000 They disrupt society by disrupting power balances. 17 00:01:17,000 --> 00:01:22,000 You can look at the history of, I don't know, the stirrup or gunpowder, 18 00:01:22,000 --> 00:01:28,000 the printing press, telegraph, radio, airplane, container shipping, 19 00:01:28,000 --> 00:01:34,000 disease- and drought-resistant wheat, and see how those technologies changed the balance of power. 20 00:01:34,000 --> 00:01:36,000 There's a lot written about this, 21 00:01:36,000 --> 00:01:38,000 written as history. 22 00:01:38,000 --> 00:01:44,000 It's harder to do this in the present, which is really what I'm thinking about with the internet. 23 00:01:45,000 --> 00:01:48,000 The internet is incredibly disruptive. 24 00:01:48,000 --> 00:01:50,000 We've seen entire industries disappear. 25 00:01:50,000 --> 00:01:53,000 We've seen new industries created, we've seen industries upended, 26 00:01:53,000 --> 00:01:55,000 we've seen the computer industry 27 00:01:55,000 --> 00:01:58,000 itself upended several times. 28 00:01:58,000 --> 00:02:02,000 Government has changed a lot. 29 00:02:02,000 --> 00:02:06,000 We see governments losing power as citizens get organized. 30 00:02:06,000 --> 00:02:09,000 We're seeing political movements become easier. 31 00:02:09,000 --> 00:02:12,000 We're seeing totalitarian states use it for power. 32 00:02:12,000 --> 00:02:22,000 The Obama campaign was revolutionary in how it used the internet to organize and engage people. 33 00:02:22,000 --> 00:02:28,000 You could look at how technology has changed the media, ranging from the 24-hour news cycle 34 00:02:28,000 --> 00:02:35,000 to bloggers and citizen journalism, to two-way communications and the huge explosion of media sources. 35 00:02:35,000 --> 00:02:37,000 Social power. 36 00:02:37,000 --> 00:02:39,000 There's a lot here. 37 00:02:39,000 --> 00:02:46,000 Personal publishing, the internet, email. Criminal power: 38 00:02:46,000 --> 00:02:51,000 certain crimes become easier, like identity theft, which is really impersonation fraud 39 00:02:51,000 --> 00:02:53,000 done at scale. 40 00:02:53,000 --> 00:02:58,000 That's how the internet has changed things.
41 00:02:58,000 --> 00:03:02,000 I think about how this affects computer security, 42 00:03:02,000 --> 00:03:06,000 basically what I do, and then how that affects the rest of the world. 43 00:03:06,000 --> 00:03:14,000 Traditionally, computer security has had the model of: the user takes care of it. 44 00:03:14,000 --> 00:03:16,000 That's been the traditional model. 45 00:03:16,000 --> 00:03:19,000 It's actually a very strange model. 46 00:03:19,000 --> 00:03:25,000 We are selling products that aren't secure, aren't any good, 47 00:03:25,000 --> 00:03:29,000 and expect the user to make them good. 48 00:03:29,000 --> 00:03:32,000 I think of it as an automobile manufacturer selling you a car and saying, 49 00:03:32,000 --> 00:03:34,000 that car doesn't come with brakes. 50 00:03:34,000 --> 00:03:37,000 But brakes are really important, and we think you should have them. 51 00:03:37,000 --> 00:03:41,000 There are some good aftermarket dealers, but you should get some brakes installed pretty quickly, 52 00:03:41,000 --> 00:03:42,000 maybe on the drive home. 53 00:03:42,000 --> 00:03:44,000 It's a much safer car that way. 54 00:03:44,000 --> 00:03:50,000 In a lot of ways, that's what we did with anti-virus, with firewalls. 55 00:03:50,000 --> 00:03:56,000 We would sell these products and expect the user to do it themselves, 56 00:03:56,000 --> 00:04:01,000 to have the level of expertise necessary to secure their environment. 57 00:04:01,000 --> 00:04:04,000 There are a lot of reasons why we did this. 58 00:04:04,000 --> 00:04:06,000 It's the speed of our industry, 59 00:04:06,000 --> 00:04:11,000 it's the youth of our industry, but it was the norm. 60 00:04:11,000 --> 00:04:14,000 That model is breaking. 61 00:04:14,000 --> 00:04:17,000 That model is becoming less the norm. 62 00:04:17,000 --> 00:04:24,000 It's changing, actually, and it's not because we've realized there are better ways to do security, 63 00:04:24,000 --> 00:04:28,000 but because of how computers and the net are working today. 64 00:04:28,000 --> 00:04:35,000 There are two trends that I think push this, that change this model. 65 00:04:35,000 --> 00:04:37,000 The first is cloud computing. 66 00:04:37,000 --> 00:04:41,000 On the one hand, cloud computing isn't anything new. 67 00:04:41,000 --> 00:04:43,000 In the 60s, we called it time-sharing. 68 00:04:43,000 --> 00:04:46,000 In the 80s, we called it client-server. 69 00:04:46,000 --> 00:04:51,000 In the 90s, I had a company; we called it managed security, or managed services. 70 00:04:51,000 --> 00:04:57,000 It's fundamentally a balance between the cost of computation and the cost of data transport. 71 00:04:57,000 --> 00:05:03,000 In the 60s, computation was very expensive, so it made sense to centralize computers in their own rooms 72 00:05:03,000 --> 00:05:06,000 with their own air conditioning, and give people badges. 73 00:05:06,000 --> 00:05:09,000 In the 80s, what became expensive was large storage. 74 00:05:09,000 --> 00:05:12,000 So you end up with a client-server model. 75 00:05:12,000 --> 00:05:15,000 In the 90s, it was more about services. 76 00:05:15,000 --> 00:05:20,000 Right now, the cost of computing is really dropping towards free. 77 00:05:20,000 --> 00:05:23,000 The cost of transport is dropping towards free. 78 00:05:23,000 --> 00:05:35,000 So what makes sense economically is to put your computers in the places on the planet where they can be run the most cheaply, and access them from wherever you are.
79 00:05:35,000 --> 00:05:37,000 That seems to be the end game. 80 00:05:37,000 --> 00:05:41,000 There's nothing cheaper than free. 81 00:05:41,000 --> 00:05:48,000 The only places you see computation pushed to the edges are places where you have relatively low bandwidth, 82 00:05:48,000 --> 00:05:52,000 maybe mobile applications, or a relatively high need for local computation, 83 00:05:52,000 --> 00:05:55,000 like gaming. 84 00:05:55,000 --> 00:06:00,000 But even those are becoming more of a cloud model. 85 00:06:00,000 --> 00:06:06,000 So that's the first trend. The second trend is locked-down endpoints. 86 00:06:06,000 --> 00:06:11,000 And I think this is more a trend in business than in technology. 87 00:06:11,000 --> 00:06:18,000 But nowadays, we have much less control over the computing platforms we buy. 88 00:06:19,000 --> 00:06:21,000 Right? So I have an iPhone. 89 00:06:21,000 --> 00:06:24,000 I can't clear my cookies on an iPhone. 90 00:06:24,000 --> 00:06:27,000 I can't get a program that does that. 91 00:06:27,000 --> 00:06:30,000 I can't even get a program that erases files, 92 00:06:30,000 --> 00:06:33,000 because I don't have direct control over the memory map. 93 00:06:33,000 --> 00:06:36,000 If there are weird things going on in your system, 94 00:06:36,000 --> 00:06:38,000 I can't help with it. 95 00:06:38,000 --> 00:06:39,000 Wow. 96 00:06:39,000 --> 00:06:41,000 Something's happening on her laptop. 97 00:06:41,000 --> 00:06:41,000 98 00:06:41,000 --> 00:06:42,000 All right. 99 00:06:42,000 --> 00:06:43,000 Go. 100 00:06:44,000 --> 00:06:46,000 You want to borrow it? 101 00:06:46,000 --> 00:06:54,000 So these end-user devices, whether they are tablets or phones or Kindles, 102 00:06:54,000 --> 00:06:57,000 the user has much less control over. 103 00:06:57,000 --> 00:07:01,000 On a Kindle, updates are downloaded automatically. 104 00:07:01,000 --> 00:07:03,000 I can't even say yes. 105 00:07:03,000 --> 00:07:06,000 At least on the iPhone, I can say yes or no. 106 00:07:06,000 --> 00:07:09,000 But I still don't have any of the control I have in my OSes. 107 00:07:09,000 --> 00:07:11,000 And OSes are moving in that direction as well. 108 00:07:11,000 --> 00:07:18,000 Right? Both Windows 8 and Mountain Lion are moving in the direction of these mobile platforms, 109 00:07:18,000 --> 00:07:20,000 giving the user less control. 110 00:07:20,000 --> 00:07:22,000 And I think this is just purely economics. 111 00:07:22,000 --> 00:07:28,000 The companies have realized that the more they can control the supply chain, the better they'll do. 112 00:07:28,000 --> 00:07:34,000 So, whether it's Apple and the App Store or however the system works, 113 00:07:34,000 --> 00:07:40,000 you're just better off if you can control as much of the environment as possible. 114 00:07:41,000 --> 00:07:44,000 So this brings us to a new model of security. 115 00:07:44,000 --> 00:07:48,000 And the model is: someone else takes care of it. 116 00:07:48,000 --> 00:07:53,000 The model is, it just happens automatically, by magic. 117 00:07:53,000 --> 00:07:58,000 This happens on my Gmail account. 118 00:07:58,000 --> 00:08:01,000 I have no control over Gmail security. 119 00:08:01,000 --> 00:08:04,000 I have to simply trust that Google does it. 120 00:08:04,000 --> 00:08:08,000 I have no control over my pictures on Flickr or my Facebook account. 121 00:08:08,000 --> 00:08:11,000 I have no control over any of that stuff.
122 00:08:11,000 --> 00:08:17,000 And I'm given less and less control over the devices where I view these things. 123 00:08:17,000 --> 00:08:20,000 So, users have to trust vendors, 124 00:08:20,000 --> 00:08:23,000 and to a degree we haven't before. 125 00:08:23,000 --> 00:08:25,000 And there are a lot of good reasons why we do this, 126 00:08:25,000 --> 00:08:28,000 I mean, all the reasons why these models make sense: 127 00:08:28,000 --> 00:08:33,000 convenience, redundancy, automation, the ability to share things. 128 00:08:33,000 --> 00:08:38,000 And the trust can be surprisingly complete. 129 00:08:38,000 --> 00:08:44,000 We're living in a world where Facebook mediates all of our friend interactions. 130 00:08:44,000 --> 00:08:50,000 Already, Google knows more about my interests than my wife does. 131 00:08:50,000 --> 00:08:53,000 Which is a little bit freaky. 132 00:08:53,000 --> 00:08:56,000 Google knows what kind of porn every American likes. 133 00:08:56,000 --> 00:08:59,000 Which is really freaky. 134 00:08:59,000 --> 00:09:01,000 But it's a trade-off, 135 00:09:01,000 --> 00:09:03,000 a trade-off we actually make pretty willingly. 136 00:09:03,000 --> 00:09:10,000 We give up some control in exchange for an environment that works well for us, 137 00:09:10,000 --> 00:09:16,000 and we trust that the vendors will treat us well and protect us from harm. 138 00:09:16,000 --> 00:09:20,000 On the other hand, we're running out of other options. 139 00:09:20,000 --> 00:09:25,000 For most everybody, there aren't any real viable alternatives. 140 00:09:26,000 --> 00:09:31,000 I run Eudora, but I'm increasingly a freak. 141 00:09:31,000 --> 00:09:36,000 My mother has a way better time on her computer since she got an Apple. 142 00:09:36,000 --> 00:09:40,000 She has Apple handling every part of her computing 143 00:09:40,000 --> 00:09:42,000 environment. 144 00:09:42,000 --> 00:09:46,000 She loses a phone, she gets a new one. It just works great. 145 00:09:46,000 --> 00:09:49,000 And most of us can't do it ourselves. 146 00:09:49,000 --> 00:09:52,000 This stuff is becoming more and more complex. 147 00:09:52,000 --> 00:09:56,000 I can't offer advice telling people to run their own mail servers. 148 00:09:56,000 --> 00:09:59,000 That made sense 20 years ago; 149 00:09:59,000 --> 00:10:01,000 it really doesn't make sense now. 150 00:10:01,000 --> 00:10:05,000 And you can't run your own Facebook. 151 00:10:05,000 --> 00:10:11,000 So the model I think of when I think of this type of computing environment is feudal security. 152 00:10:11,000 --> 00:10:15,000 And that's feudal with a D and not with a T. 153 00:10:16,000 --> 00:10:27,000 It's that we as users have to pledge our allegiance to some powerful company, who in turn promises to protect us. 154 00:10:27,000 --> 00:10:29,000 And I like it as a metaphor, 155 00:10:29,000 --> 00:10:36,000 both because there's a real rich historical metaphor, and because everyone's watching Game of Thrones, 156 00:10:36,000 --> 00:10:40,000 so you can pull from both sources. 157 00:10:40,000 --> 00:10:48,000 And if you go back to classic medieval feudalism, it was a system designed for a dangerous environment. 158 00:10:48,000 --> 00:10:52,000 You needed someone more powerful than you to protect you. 159 00:10:52,000 --> 00:10:54,000 It was a series of hierarchical relationships. 160 00:10:54,000 --> 00:10:56,000 There were obligations in both directions. 161 00:10:56,000 --> 00:11:00,000 It was actually a pretty complex political system.
162 00:11:00,000 --> 00:11:07,000 And I see more of it permeating the environment that we work in today. 163 00:11:07,000 --> 00:11:08,000 Right? 164 00:11:08,000 --> 00:11:11,000 It has advantages. 165 00:11:11,000 --> 00:11:12,000 Right? 166 00:11:12,000 --> 00:11:18,000 You know, for most people, the cloud providers are better at security than they are. 167 00:11:18,000 --> 00:11:19,000 Right? 168 00:11:19,000 --> 00:11:21,000 Automatic cloud backup is fantastic. 169 00:11:21,000 --> 00:11:23,000 Automatic updates are fantastic. 170 00:11:23,000 --> 00:11:25,000 All these things are good. 171 00:11:25,000 --> 00:11:26,000 Right? 172 00:11:26,000 --> 00:11:31,000 So feudal security provides this level of security that most everybody is below, 173 00:11:31,000 --> 00:11:36,000 so it'll raise them up to whatever level the providers are providing. 174 00:11:36,000 --> 00:11:38,000 And for those up here, it lowers them down. 175 00:11:38,000 --> 00:11:47,000 And the barriers you see to people adopting this are things like the banks, who naturally have a higher level of security 176 00:11:47,000 --> 00:11:49,000 and don't want to go down. 177 00:11:49,000 --> 00:11:55,000 I assume that at some point we are going to see a business model of a high-security cloud vendor. 178 00:11:55,000 --> 00:11:56,000 Right? 179 00:11:56,000 --> 00:12:05,000 Whether it's a Dropbox or an email service, just something for some of these more high-assurance users. 180 00:12:06,000 --> 00:12:07,000 Right? 181 00:12:07,000 --> 00:12:10,000 We also have the problem of regulation. 182 00:12:10,000 --> 00:12:14,000 A lot of companies have auditing and reporting requirements. 183 00:12:14,000 --> 00:12:23,000 And, you know, if you go to, I don't know, Dropbox and say, you know, we're using you for our company, 184 00:12:23,000 --> 00:12:25,000 we need to audit your system, 185 00:12:25,000 --> 00:12:27,000 they will say, go away. 186 00:12:27,000 --> 00:12:29,000 Or Rackspace. 187 00:12:30,000 --> 00:12:40,000 I assume we're going to see some waterfall auditing model where, I don't know, the Rackspace audit flows down to whatever service runs on top of that, 188 00:12:40,000 --> 00:12:43,000 which flows down to whatever company then uses that service, 189 00:12:43,000 --> 00:12:50,000 because I think we have to solve the regulatory barriers here. 190 00:12:50,000 --> 00:12:50,000 191 00:12:50,000 --> 00:12:52,000 Feudal security has risks. 192 00:12:52,000 --> 00:12:55,000 The vendors are going to act in their self-interest. 193 00:12:55,000 --> 00:13:02,000 You hope that their self-interest dovetails with your self-interest, but that's not always the case. 194 00:13:02,000 --> 00:13:06,000 It's much less the case when you're not paying for the service, 195 00:13:06,000 --> 00:13:10,000 when in fact you're a user, not a customer. 196 00:13:10,000 --> 00:13:14,000 As we've seen, vendors will make side deals with the government. 197 00:13:14,000 --> 00:13:19,000 And the legal regime is different 198 00:13:19,000 --> 00:13:23,000 if the data is on your premises than if it's on their premises. 199 00:13:23,000 --> 00:13:24,000 Vendors can act arbitrarily. 200 00:13:24,000 --> 00:13:26,000 Vendors can make mistakes. 201 00:13:26,000 --> 00:13:31,000 And vendors have an incentive to keep users tied to themselves. 202 00:13:31,000 --> 00:13:36,000 You guys are an exception, by allowing users to take their data and leave.
203 00:13:36,000 --> 00:13:39,000 Most companies don't do that, 204 00:13:39,000 --> 00:13:49,000 because tying the data to the company increases lock-in, increases the value of the company. 205 00:13:49,000 --> 00:13:56,000 So this model is inherently based on trust. 206 00:13:56,000 --> 00:14:09,000 It's inherently based on the companies, the feudal lords, convincing the users to trust them with their data, their photos, their friends, with everything. 207 00:14:09,000 --> 00:14:18,000 And unfortunately, the business model for a lot of these companies is basically betraying that trust for profit. 208 00:14:18,000 --> 00:14:26,000 And that is, depending on which company, more or less transparent, more or less salient. 209 00:14:26,000 --> 00:14:34,000 A lot of effort does go into hiding that fact, into pretending it's not true. 210 00:14:34,000 --> 00:14:41,000 And as it turns out, these companies have a side business betraying that trust to the government, too. 211 00:14:42,000 --> 00:14:48,000 So there's a little bit, or sometimes a lot, of deceit that this is all based on. 212 00:14:48,000 --> 00:14:54,000 And I do worry about how long that can be sustained. 213 00:14:54,000 --> 00:15:00,000 Some of it seems able to be sustained indefinitely; for others, I'm not so sure. 214 00:15:00,000 --> 00:15:05,000 The feudal model is also inherently based on power. 215 00:15:05,000 --> 00:15:09,000 And that's what I think is interesting right now. 216 00:15:09,000 --> 00:15:14,000 And it does dovetail very nicely with the current alignment of power on the internet: 217 00:15:14,000 --> 00:15:21,000 the rise of the controlled endpoints and the third party holding your data. 218 00:15:21,000 --> 00:15:25,000 I mean, those are the two poles. 219 00:15:25,000 --> 00:15:32,000 So I started the talk by mentioning the internet changing power. 220 00:15:32,000 --> 00:15:38,000 And if you look back at the history of the internet, a lot of us thought that it would flow in a certain direction. 221 00:15:38,000 --> 00:15:48,000 Right? When the internet was designed, it was really designed in the way that made the most technical sense. 222 00:15:48,000 --> 00:15:53,000 There wasn't a lot of agenda placed on the net in that first design. 223 00:15:53,000 --> 00:15:59,000 And if you look back at the literature around that time, you read about the natural laws of the internet, 224 00:15:59,000 --> 00:16:05,000 the idea that the internet works a certain way because it's like gravity: 225 00:16:05,000 --> 00:16:09,000 it's just the way it has to work, it's the way it makes sense. 226 00:16:09,000 --> 00:16:13,000 And a lot of us thought this was inevitable. This was the way the world had to work. 227 00:16:13,000 --> 00:16:17,000 I have two quotes. One is John Perry Barlow, 1996. 228 00:16:17,000 --> 00:16:20,000 He's addressing the World Economic Forum. 229 00:16:20,000 --> 00:16:26,000 He wrote something called A Declaration of the Independence of Cyberspace, which is a great document to read. 230 00:16:26,000 --> 00:16:31,000 And he's telling governments things like, you have no moral right to rule us, 231 00:16:31,000 --> 00:16:36,000 nor do you possess any methods of enforcement we have reason to fear. 232 00:16:36,000 --> 00:16:45,000 About three years earlier, John Gilmore wrote that the internet interprets censorship as damage and routes around it. 233 00:16:45,000 --> 00:16:51,000 These are very utopian quotes, but we all believed them back then.
234 00:16:51,000 --> 00:16:55,000 We believed that is how the internet works: 235 00:16:55,000 --> 00:17:06,000 that the internet takes the masses and makes them powerful, takes the governments and makes them powerless. 236 00:17:06,000 --> 00:17:13,000 It turns out that's just not true. That's not the way it works. 237 00:17:13,000 --> 00:17:19,000 What it does, like many technologies, is magnify power. 238 00:17:19,000 --> 00:17:23,000 It magnifies power in general. 239 00:17:23,000 --> 00:17:31,000 And what happened is, when the powerless discovered the internet, suddenly they had power. 240 00:17:31,000 --> 00:17:36,000 Right? The hackers, the dissidents, the criminals, the disenfranchised. 241 00:17:36,000 --> 00:17:41,000 You know, as those marginal groups discovered the net, suddenly they had power they didn't have before. 242 00:17:41,000 --> 00:17:47,000 And the change was fast and it was stark. 243 00:17:47,000 --> 00:17:54,000 But when powerful interests realized the potential of the internet, they had more power to magnify. 244 00:17:54,000 --> 00:18:04,000 They were much slower, but their ability to use the internet to increase their power is greater. 245 00:18:04,000 --> 00:18:12,000 The unorganized were more nimble and quick, and the institutions were slower but more effective. 246 00:18:12,000 --> 00:18:15,000 And that's where we are today. 247 00:18:15,000 --> 00:18:22,000 So I look around and I see four classes of internet tools of power. 248 00:18:22,000 --> 00:18:30,000 And what's interesting about them is that they all are tools by which a totalitarian government can increase its power, 249 00:18:30,000 --> 00:18:36,000 but they all have viable market reasons for existing. 250 00:18:36,000 --> 00:18:42,000 So censorship is also content filtering, or data loss prevention. 251 00:18:42,000 --> 00:18:45,000 Propaganda is marketing. 252 00:18:45,000 --> 00:18:49,000 Surveillance is, 253 00:18:49,000 --> 00:18:51,000 I guess, personal data collection. 254 00:18:51,000 --> 00:18:54,000 Surveillance is the business model of the internet. 255 00:18:54,000 --> 00:18:56,000 Right? And use control. 256 00:18:56,000 --> 00:19:02,000 In China, programs have to be certified by the government in order to be used on computers there, 257 00:19:02,000 --> 00:19:05,000 which sounds an awful lot like the Apple App Store. 258 00:19:06,000 --> 00:19:10,000 Right? I mean, we laugh, but this is important. 259 00:19:10,000 --> 00:19:19,000 Right? We're building tools that have very different sorts of uses depending on who's using them and why. 260 00:19:19,000 --> 00:19:27,000 And in both the government and the corporate sphere, powerful interests are gaining power with these tools. 261 00:19:27,000 --> 00:19:33,000 Right? Censorship and surveillance are both on the rise. The internet censorship project 262 00:19:33,000 --> 00:19:38,000 that tracks censorship around the world finds more of it every year. 263 00:19:38,000 --> 00:19:46,000 We see more surveillance by governments every year, even before the United States stuff that happened two weeks ago. 264 00:19:46,000 --> 00:19:51,000 Right? More personal data is being collected and correlated. 265 00:19:51,000 --> 00:19:54,000 More control over hardware and software. 266 00:19:54,000 --> 00:19:59,000 Right? Less purchasing, more licensing; we saw Adobe move to that model. 267 00:19:59,000 --> 00:20:06,000 Right, this is getting hard. I'm trying to find a task-list productivity tool, and I can't find a good one
268 00:20:06,000 --> 00:20:10,000 that doesn't require me to use the cloud. 269 00:20:10,000 --> 00:20:13,000 Right. And then we have corporations. 270 00:20:13,000 --> 00:20:20,000 I think Facebook is one of the examples. They're actually changing social norms. 271 00:20:20,000 --> 00:20:28,000 They're affecting what people think is normal, is regular. 272 00:20:28,000 --> 00:20:33,000 And with a for-profit motive. 273 00:20:33,000 --> 00:20:39,000 Right? I think propaganda is something we don't talk about a lot, but it's both companies and governments. 274 00:20:39,000 --> 00:20:43,000 I mean, we've had it called viral marketing; there are some cute names for it. 275 00:20:43,000 --> 00:20:46,000 Basically, it's propaganda. 276 00:20:46,000 --> 00:20:49,000 And we're seeing more and more of it. 277 00:20:49,000 --> 00:20:55,000 And now we're at the point where power basically controls everyone's data, 278 00:20:56,000 --> 00:21:00,000 because in a lot of ways personal data equals power, 279 00:21:00,000 --> 00:21:02,000 both on the government side and the corporate side. 280 00:21:02,000 --> 00:21:10,000 Even in non-internet businesses, the need to own the customer relationship, to know more about the customer, is driving a lot of data collection, 281 00:21:10,000 --> 00:21:14,000 and all that back-end correlation. 282 00:21:14,000 --> 00:21:23,000 And I worry a lot about the commingling of corporate and government interests here. 283 00:21:23,000 --> 00:21:28,000 Right? We live in a world of ubiquitous surveillance; I don't have to go through the details. 284 00:21:28,000 --> 00:21:32,000 Basically everything is collected. 285 00:21:32,000 --> 00:21:36,000 I mean, Charlie Stross has written about this as the end of prehistory: 286 00:21:36,000 --> 00:21:40,000 that sometime in our lifetime we're going to switch from prehistory, 287 00:21:40,000 --> 00:21:45,000 where only some things were saved, to actual history, where everything is saved. 288 00:21:45,000 --> 00:21:51,000 And now we're in a world where most everything is saved. 289 00:21:52,000 --> 00:22:02,000 And what's happening now, something I'm not happy about but am trying to understand, is how the powerful are trying to steer this. 290 00:22:02,000 --> 00:22:14,000 I mentioned Facebook changing social norms, but we're also seeing industries lobbying for laws to make their business models more profitable. 291 00:22:15,000 --> 00:22:25,000 So that's laws to prevent digital copying, laws to reduce privacy, laws allowing different businesses to control bandwidth. 292 00:22:25,000 --> 00:22:36,000 And on the government side, we're seeing international bodies trying to get rulings to make the internet easier to surveil and to censor. 293 00:22:36,000 --> 00:22:41,000 Right? I've heard this called cyber nationalism. Last November, I think, 294 00:22:41,000 --> 00:22:47,000 in Dubai there was a meeting of the ITU, that's the International Telecommunications Union; those are the guys that run the phone system. 295 00:22:47,000 --> 00:22:51,000 They're not really very tech savvy, but they are very international. 296 00:22:51,000 --> 00:22:54,000 They are very non-US-centric. 297 00:22:54,000 --> 00:23:01,000 And they want to wrest control of the internet from the US. 298 00:23:01,000 --> 00:23:05,000 For a lot of reasons, I think this would be a disaster.
299 00:23:05,000 --> 00:23:15,000 But there's this strong push, and unfortunately, I wrote about it on my blog today, I think all the Snowden documents make their case a lot easier. 300 00:23:15,000 --> 00:23:21,000 Because now when they say, well, you can't trust the Americans, people are going to say, oh yeah, you're right, you can't trust the Americans. 301 00:23:21,000 --> 00:23:25,000 So these things are happening now. 302 00:23:25,000 --> 00:23:35,000 We're seeing a large rise in the militarization of cyberspace, which will push more and more government control. 303 00:23:35,000 --> 00:23:40,000 I mean, I very much believe we are in the middle of a cyber arms race. 304 00:23:40,000 --> 00:23:44,000 And it's heated up a little bit in the past couple of weeks, 305 00:23:44,000 --> 00:23:49,000 because we've been complaining about China for the past few years. 306 00:23:49,000 --> 00:23:51,000 I've always assumed we've been giving as good as we're getting. 307 00:23:51,000 --> 00:23:57,000 And now we're getting data that we are giving as good as we're getting, which is just going to make things worse. 308 00:23:57,000 --> 00:24:12,000 And we're pretty sure that the cyberattack against the Saudi oil company Aramco was launched by Iran as a retaliation for Stuxnet, which sounds complicated, but I don't know geopolitics. 309 00:24:12,000 --> 00:24:16,000 Maybe that makes sense. 310 00:24:16,000 --> 00:24:22,000 And we're seeing a lot of alignment of corporate and government power. 311 00:24:22,000 --> 00:24:30,000 I'm pretty sure I'm quoted in the New York Times today as calling Facebook the NSA's wet dream. 312 00:24:30,000 --> 00:24:37,000 I'm surprised I used those words; it was probably a long interview. 313 00:24:37,000 --> 00:24:41,000 Right, so here's a way to think of it. 314 00:24:41,000 --> 00:24:45,000 In our country, we have two different types of law. 315 00:24:45,000 --> 00:24:49,000 There's constitutional law that regulates what governments do. 316 00:24:49,000 --> 00:24:56,000 And there's regulatory law that constrains what corporations do. 317 00:24:56,000 --> 00:24:58,000 And they're kind of separate. 318 00:24:58,000 --> 00:25:07,000 We're now living in a world where each group has learned to use the other's law to get around its own restrictions. 319 00:25:07,000 --> 00:25:14,000 If the government said you all have to carry tracking devices 24/7, 320 00:25:14,000 --> 00:25:18,000 that would be unconstitutional. They could never get away with it. 321 00:25:18,000 --> 00:25:21,000 Yet we all carry cellphones. 322 00:25:21,000 --> 00:25:28,000 If they said you must register whenever you meet a new friend, 323 00:25:28,000 --> 00:25:33,000 we'd never allow it. Yet we all go on Facebook. 324 00:25:33,000 --> 00:25:39,000 I played this earlier. Two years ago, the Onion did a video. 325 00:25:39,000 --> 00:25:43,000 Just go to YouTube and type "the Onion Facebook CIA." 326 00:25:43,000 --> 00:25:50,000 It's a short news video about Facebook being the new CIA program. 327 00:25:50,000 --> 00:25:55,000 It's hysterical. And it's two years old, which makes it kind of sad. 328 00:25:55,000 --> 00:26:02,000 On the other hand, we're seeing corporations use the governments to enforce their business models. 329 00:26:03,000 --> 00:26:11,000 If, I don't know, the movie industry said that we're going to go into people's computers and trash them if we think they're copying files,
330 00:26:11,000 --> 00:26:14,000 that would be wrong. 331 00:26:14,000 --> 00:26:18,000 But they've tried to get laws to do a lot of the same thing. 332 00:26:18,000 --> 00:26:27,000 Copyright is full of examples where industries are trying to solve their own problems by going through government. 333 00:26:27,000 --> 00:26:31,000 And I think this only gets exacerbated as there's more technology, 334 00:26:31,000 --> 00:26:34,000 as the feudal lords get more powerful. 335 00:26:34,000 --> 00:26:39,000 And some of that is just the natural order of bigness in our industry right now. 336 00:26:39,000 --> 00:26:43,000 We know that the way technology is right now, it favors the big. 337 00:26:43,000 --> 00:26:45,000 It doesn't favor the many small. 338 00:26:45,000 --> 00:26:50,000 It favors two or three on top and nobody else. 339 00:26:50,000 --> 00:26:54,000 And it's true in geopolitics too. 340 00:26:55,000 --> 00:27:02,000 In any climate change negotiation on the planet, who do you think has more power, Exxon or Bolivia? 341 00:27:02,000 --> 00:27:07,000 It's not even close. Who has more power, Exxon or the United States? 342 00:27:07,000 --> 00:27:09,000 That's actually a discussion. 343 00:27:09,000 --> 00:27:13,000 Which is weird. 344 00:27:13,000 --> 00:27:16,000 Okay, so that's one trajectory. 345 00:27:16,000 --> 00:27:20,000 There's another trajectory, a counterbalancing one, 346 00:27:20,000 --> 00:27:26,000 based on different natural laws of technology. 347 00:27:26,000 --> 00:27:29,000 So in the book, Liars and Outliers, 348 00:27:29,000 --> 00:27:32,000 I discussed something called the security gap. 349 00:27:32,000 --> 00:27:38,000 And in that book, I'm talking about, effectively, the arms race between attackers and defenders. 350 00:27:38,000 --> 00:27:43,000 Technology causes disruptions in that arms race, 351 00:27:43,000 --> 00:27:46,000 and then there's a rebalancing. 352 00:27:46,000 --> 00:27:49,000 So, you know, firearms are invented. 353 00:27:49,000 --> 00:27:51,000 Fingerprint technologies are invented. 354 00:27:51,000 --> 00:27:56,000 All those things upset the balance between attackers and defenders. 355 00:27:56,000 --> 00:28:05,000 And one of the things I point out is that as technology advances, attackers have a natural advantage. 356 00:28:05,000 --> 00:28:08,000 Some of it's a basic first-mover advantage. 357 00:28:08,000 --> 00:28:15,000 But in general, unorganized attackers can make use of innovations faster. 358 00:28:15,000 --> 00:28:21,000 So, I don't know, imagine someone invents the motorcar. 359 00:28:21,000 --> 00:28:24,000 And the police say, well, that's a really interesting thing. 360 00:28:24,000 --> 00:28:25,000 We can use one of those. 361 00:28:25,000 --> 00:28:29,000 So they have a group study the automobile and they produce an RFP. 362 00:28:29,000 --> 00:28:31,000 And they get bids and they pick an automobile manufacturer. 363 00:28:31,000 --> 00:28:33,000 They get a car, they have a training program. 364 00:28:33,000 --> 00:28:36,000 Meanwhile, the burglar says, oh, look, a new getaway vehicle, 365 00:28:36,000 --> 00:28:40,000 and much more quickly makes use of it. 366 00:28:40,000 --> 00:28:42,000 Now, we saw that on the internet. 367 00:28:42,000 --> 00:28:50,000 If you remember, as the internet became a commercial entity, we saw a new breed of cybercriminal appear, like organically out of the ground, 368 00:28:50,000 --> 00:28:51,000 right? 369 00:28:51,000 --> 00:28:54,000 Immediately able to commit crimes and fraud and identity theft,
370 00:28:54,000 --> 00:28:56,000 and all of these new things that showed up. 371 00:28:56,000 --> 00:29:01,000 Meanwhile, the police, who were trained on Agatha Christie novels, took, what, 372 00:29:01,000 --> 00:29:04,000 10 years to figure it out? 373 00:29:04,000 --> 00:29:06,000 And they have figured it out. 374 00:29:06,000 --> 00:29:11,000 But if you were around during that time, it was really painful, 375 00:29:11,000 --> 00:29:17,000 as they had no idea what cybercrime was or how it worked. 376 00:29:17,000 --> 00:29:18,000 Right? 377 00:29:18,000 --> 00:29:22,000 So there's this delay when a new technology appears. 378 00:29:22,000 --> 00:29:25,000 And that's what I think of as the security gap: 379 00:29:25,000 --> 00:29:31,000 the delay between when the non-powerful can make use of the new technology, 380 00:29:31,000 --> 00:29:38,000 the fast and nimble, and when the powerful, the big and ponderous, 381 00:29:38,000 --> 00:29:40,000 can make use of the technology. 382 00:29:40,000 --> 00:29:45,000 And that gap gives attackers a natural advantage. 383 00:29:45,000 --> 00:29:48,000 And I'll spare you the details. 384 00:29:48,000 --> 00:29:53,000 But basically, that gap tends to be greater when there's more technology, 385 00:29:53,000 --> 00:29:54,000 right? 386 00:29:54,000 --> 00:29:55,000 When the curve is steeper. 387 00:29:55,000 --> 00:29:58,000 And it's greater in times of rapid technological change. 388 00:29:58,000 --> 00:30:03,000 Actually, it's greater in times of rapid social change due to technological change. 389 00:30:03,000 --> 00:30:09,000 And today, we're living in a world with more technology than ever before, 390 00:30:09,000 --> 00:30:16,000 and a greater rate of social change due to technological change than ever before. 391 00:30:16,000 --> 00:30:21,000 So we're seeing an ever-increasing security gap. 392 00:30:21,000 --> 00:30:25,000 So this is the big question that I do not have an answer to. 393 00:30:25,000 --> 00:30:27,000 Who wins? 394 00:30:27,000 --> 00:30:28,000 Who wins? 395 00:30:28,000 --> 00:30:30,000 And in what circumstance? 396 00:30:30,000 --> 00:30:31,000 Right? 397 00:30:31,000 --> 00:30:36,000 Is it just that big slow power beats small nimble power? 398 00:30:37,000 --> 00:30:42,000 Or is there going to be some David and Goliath metaphor, or Robin Hood and the Sheriff, 399 00:30:42,000 --> 00:30:45,000 I guess, if I need a more medieval metaphor. 400 00:30:45,000 --> 00:30:51,000 Right? But that seems like an open question, 401 00:30:51,000 --> 00:30:52,000 one we don't know the answer to. 402 00:30:52,000 --> 00:30:59,000 So for example, in Syria recently, we saw the Syrian dissidents use Facebook to organize. 403 00:30:59,000 --> 00:31:04,000 We saw the Syrian government use Facebook to arrest dissidents. 404 00:31:04,000 --> 00:31:07,000 So right now, it's kind of a mess. 405 00:31:07,000 --> 00:31:12,000 As this shakes out, who gets the upper hand? 406 00:31:12,000 --> 00:31:16,000 I mean, right now, it seems like governments do. 407 00:31:16,000 --> 00:31:28,000 It seems like the ability to collect, to analyze, to employ police, beats dissidents. 408 00:31:28,000 --> 00:31:31,000 It seems like the big corporations win, 409 00:31:31,000 --> 00:31:36,000 that the need to have a credit card, or be on Facebook, 410 00:31:36,000 --> 00:31:41,000 to do all these things to live your life, is such that you can't shut them off. 411 00:31:41,000 --> 00:31:43,000 And they win. 412 00:31:43,000 --> 00:31:45,000 But that's not clear to me.
413 00:31:45,000 --> 00:31:52,000 I mean, it does seem clear to me that those who want to get around the systems will always be able to. 414 00:31:52,000 --> 00:31:56,000 But really, I'm now concerned about everyone in the middle. 415 00:31:56,000 --> 00:32:00,000 Right? The nimble are here, the powerful are here. 416 00:32:00,000 --> 00:32:05,000 And here's the rest of us, I guess the hapless peasants. 417 00:32:05,000 --> 00:32:14,000 And as the powerful get more control, I think we get largely left out of any negotiations. 418 00:32:14,000 --> 00:32:18,000 You see this in arbitrary rules, arbitrary terms of service. 419 00:32:18,000 --> 00:32:24,000 You see this in secret NSA spying programs, or secret overrides to rules, 420 00:32:24,000 --> 00:32:28,000 or in power aligning with power. 421 00:32:28,000 --> 00:32:32,000 It's not clear to me that these actually do catch terrorists. 422 00:32:32,000 --> 00:32:39,000 And it's pretty clear to me that they don't, actually, but they do affect the rest of us. 423 00:32:39,000 --> 00:32:49,000 And I think these power issues are going to affect all of the discussions we have about the future of the internet in the coming decade. 424 00:32:49,000 --> 00:32:54,000 And these are actually complex issues. 425 00:32:54,000 --> 00:33:00,000 We have to decide how we balance personal privacy against law enforcement, 426 00:33:00,000 --> 00:33:06,000 how we balance them when we want to enforce copy protection, or prevent child pornography. 427 00:33:06,000 --> 00:33:13,000 And we have to decide: is it acceptable for us to be judged by computer algorithms? 428 00:33:13,000 --> 00:33:23,000 Is it acceptable for algorithms to feed us search results, to loan us money for a house, to select us for searches at airports, 429 00:33:23,000 --> 00:33:27,000 to convict us of drunk driving? 430 00:33:27,000 --> 00:33:31,000 How do these algorithms affect us? 431 00:33:31,000 --> 00:33:36,000 Do we have the right to correct data about ourselves, or to delete it? 432 00:33:36,000 --> 00:33:39,000 Do we want computers to forget? 433 00:33:39,000 --> 00:33:47,000 There's a lot of social lubricant in our society that comes from the fact that we are a forgetting species. 434 00:33:47,000 --> 00:33:55,000 Do you really want... I mean, I don't want Google Glass, because I don't want my wife to be able to pull up old arguments. 435 00:33:55,000 --> 00:33:59,000 That seems bad. 436 00:34:00,000 --> 00:34:03,000 So there are a lot of power struggles, 437 00:34:03,000 --> 00:34:07,000 and there are bigger ones coming. 438 00:34:07,000 --> 00:34:14,000 I mean, Cory Doctorow writes about the coming battles having to do with 3D printing, 439 00:34:14,000 --> 00:34:18,000 very much the same as the copyright battles. 440 00:34:18,000 --> 00:34:24,000 There will be powerful interests that want to stop the execution of certain data files. 441 00:34:24,000 --> 00:34:27,000 It was music and movies. 442 00:34:27,000 --> 00:34:30,000 In the future it will be 3D files. 443 00:34:30,000 --> 00:34:32,000 It will be working guns. 444 00:34:32,000 --> 00:34:34,000 It will be the Nike swoosh. 445 00:34:34,000 --> 00:34:39,000 His favorite example is anatomically correct interchangeable Barbie torsos, 446 00:34:39,000 --> 00:34:45,000 which I'd never thought of, but would freak out Mattel. 447 00:34:45,000 --> 00:34:48,000 Probably rightly so. 448 00:34:48,000 --> 00:34:51,000 Or little statues of Mickey Mouse, 449 00:34:51,000 --> 00:34:56,000 which will freak out a very powerful company.
450 00:34:56,000 --> 00:35:01,000 Who is that? 451 00:35:01,000 --> 00:35:05,000 You know, we've seen some incredibly destabilizing technologies, 452 00:35:05,000 --> 00:35:09,000 but more incredibly destabilizing technology is coming. 453 00:35:09,000 --> 00:35:12,000 And this whole debate about weapons of mass destruction, 454 00:35:12,000 --> 00:35:17,000 nuclear, chemical, biological, is basically the idea that 455 00:35:18,000 --> 00:35:22,000 as technology magnifies power, 456 00:35:22,000 --> 00:35:25,000 society can tolerate fewer bad events. 457 00:35:25,000 --> 00:35:31,000 So if the average bad guy, and I'm making this up, can kill 10 people before he is captured, 458 00:35:31,000 --> 00:35:36,000 or rob 10 houses before he is captured, we can handle a certain number of robbers. 459 00:35:36,000 --> 00:35:39,000 But if they can now do 100 times as much damage, 460 00:35:39,000 --> 00:35:45,000 we can now tolerate only a hundredth as many of them to maintain the same security level. 461 00:35:46,000 --> 00:35:51,000 And a lot of our security is based on having some low level of badness. 462 00:35:51,000 --> 00:35:57,000 But as technology magnifies the amount of badness each individual can do, 463 00:35:57,000 --> 00:36:00,000 you suddenly start needing much more control. 464 00:36:00,000 --> 00:36:03,000 And I'm not even convinced that it will work. 465 00:36:06,000 --> 00:36:08,000 But that's going to be a huge debate. 466 00:36:08,000 --> 00:36:10,000 And that's going to push fear buttons. 467 00:36:10,000 --> 00:36:13,000 Right? 468 00:36:13,000 --> 00:36:19,000 I mean, today, largely, the powerful are winning these debates. 469 00:36:19,000 --> 00:36:23,000 And I worry, because these are actually very complicated issues. 470 00:36:23,000 --> 00:36:27,000 They require meaningful debate, international cooperation, 471 00:36:27,000 --> 00:36:36,000 innovative solutions, which doesn't sound like I just described the US government. 472 00:36:36,000 --> 00:36:39,000 But we're going to have to do this. 473 00:36:39,000 --> 00:36:43,000 I mean, in a lot of ways the internet is a fortuitous accident. 474 00:36:43,000 --> 00:36:48,000 It's a combination of lack of commercial interests, 475 00:36:48,000 --> 00:36:54,000 government benign neglect, some military requirements for survivability and resilience, 476 00:36:54,000 --> 00:36:59,000 and computer engineers with libertarian leanings doing what made technical sense. 477 00:36:59,000 --> 00:37:03,000 That was kind of the stew of the internet. 478 00:37:03,000 --> 00:37:06,000 And that stew's gone. 479 00:37:06,000 --> 00:37:11,000 And there are policy battles going on right now over the future of the internet, 480 00:37:11,000 --> 00:37:17,000 in legislatures around the world, international standards bodies, 481 00:37:17,000 --> 00:37:22,000 international organizations. 482 00:37:22,000 --> 00:37:26,000 And I'm not sure how this is all going to play out. 483 00:37:26,000 --> 00:37:32,000 But I have some suggestions for different people. 484 00:37:32,000 --> 00:37:37,000 For researchers, I want to see a lot more research into these technologies 485 00:37:37,000 --> 00:37:42,000 of social control: surveillance, censorship, propaganda, and use control. 486 00:37:42,000 --> 00:37:49,000 And especially for you guys at Google, you're in a unique position to study propaganda. 487 00:37:49,000 --> 00:37:54,000 There's very little work being done on recognizing propaganda.
488 00:37:54,000 --> 00:37:59,000 What I want is for my internet to come with all propaganda flagged with a little yellow box, 489 00:37:59,000 --> 00:38:02,000 kind of like what you do on your search pages, 490 00:38:02,000 --> 00:38:05,000 where paid commercials are flagged as such. 491 00:38:05,000 --> 00:38:08,000 I would like that to be done automatically. 492 00:38:08,000 --> 00:38:11,000 This seems vaguely impossible. 493 00:38:11,000 --> 00:38:13,000 But I think we need to start thinking about it. 494 00:38:13,000 --> 00:38:15,000 There is some research done around the edges. 495 00:38:15,000 --> 00:38:22,000 There's research done on recognizing fake Yelp reviews, recognizing fake Amazon reviews. 496 00:38:22,000 --> 00:38:28,000 But right now there are questions about whether trending topics on Twitter 497 00:38:29,000 --> 00:38:32,000 are being gamed. 498 00:38:32,000 --> 00:38:41,000 So as we lose this transparency, there are a lot of questions about the information we get. 499 00:38:41,000 --> 00:38:46,000 But I think we need research into this, because those four things are really going to become very important, 500 00:38:46,000 --> 00:38:49,000 and understanding how they work and how to get around them is really, 501 00:38:49,000 --> 00:38:51,000 very important. 502 00:38:51,000 --> 00:38:55,000 We need safe places to anonymously publish. 503 00:38:55,000 --> 00:39:01,000 I mean, WikiLeaks was great, but now seems to be no more. 504 00:39:01,000 --> 00:39:07,000 Right now, the best thing we have is something called Strongbox, which The New Yorker is running. 505 00:39:07,000 --> 00:39:11,000 I'm in the process of trying to review that system right now. 506 00:39:11,000 --> 00:39:15,000 I think we need a lot more of these all around the world. 507 00:39:15,000 --> 00:39:19,000 We do need research into use limitation. 508 00:39:20,000 --> 00:39:28,000 I mean, I believe we're going to get legislation on basically copy protection for digital objects, 509 00:39:28,000 --> 00:39:34,000 because of 3D printers, because of bioprinters, because of software-defined radio. 510 00:39:34,000 --> 00:39:38,000 And that's going to really hurt our industry, 511 00:39:38,000 --> 00:39:41,000 because lawmakers are not going to get this right. 512 00:39:41,000 --> 00:39:44,000 They'll do something draconian and it's going to be ugly. 513 00:39:44,000 --> 00:39:49,000 The more we can solve the actual problems, the less we're likely to be handed solutions that won't work 514 00:39:49,000 --> 00:39:54,000 and will hurt everything else. 515 00:39:58,000 --> 00:40:05,000 To vendors, I want people to remember that a lot of technologies we build have dual use, 516 00:40:05,000 --> 00:40:09,000 that business and military uses are basically the same. 517 00:40:09,000 --> 00:40:16,000 So you see Blue Coat used to censor the Syrian internet, or Sophos used to eavesdrop on the internet, 518 00:40:16,000 --> 00:40:20,000 or social media enabling surveillance. 519 00:40:20,000 --> 00:40:28,000 On the one hand, the FBI is trying to get laws passed to have us put back doors in our communication systems; 520 00:40:28,000 --> 00:40:31,000 on the other hand, we don't want other countries to do the same thing. 521 00:40:31,000 --> 00:40:34,000 Right, this is hard. 522 00:40:39,000 --> 00:40:43,000 The policy prescriptions, I think, are harder. 523 00:40:43,000 --> 00:40:48,000 I think in the near term, we need to keep circumvention legal and keep net neutrality.
524 00:40:48,000 --> 00:40:57,000 I think those two things give us some backstop against the powerful becoming even more powerful. 525 00:40:57,000 --> 00:41:02,000 Long-term, fundamentally, we have to recognize we can't have it both ways. 526 00:41:03,000 --> 00:41:06,000 If we want privacy, we have to want it everywhere, 527 00:41:06,000 --> 00:41:08,000 in our country and abroad. 528 00:41:08,000 --> 00:41:13,000 If we think surveillance is good, we have to accept it elsewhere. 529 00:41:13,000 --> 00:41:19,000 Fundamentally, I want to see power leveled, 530 00:41:19,000 --> 00:41:22,000 because the relationship is really unbalanced. 531 00:41:22,000 --> 00:41:26,000 If you think about historical feudalism, you read about it, 532 00:41:26,000 --> 00:41:31,000 it eventually evolved into a more balanced government relationship. 533 00:41:31,000 --> 00:41:36,000 So you had feudalism, which started out as this bilateral agreement, 534 00:41:36,000 --> 00:41:40,000 we're in a dangerous world, I need you to protect me, 535 00:41:40,000 --> 00:41:44,000 so I will pledge my allegiance to you, turn into something very unbalanced: 536 00:41:44,000 --> 00:41:50,000 I'm powerful, I can do whatever I want, I will ignore my agreements; you're powerless, you can't do anything. 537 00:41:50,000 --> 00:41:58,000 And that eventually changed with the rise of the nation state, 538 00:41:59,000 --> 00:42:09,000 with basically rules that gave the feudal lords responsibilities as well as rights, 539 00:42:09,000 --> 00:42:15,000 culminating in something like the Magna Carta. 540 00:42:15,000 --> 00:42:22,000 And I think we're going to need something like that on the internet, with the current set of powers on the internet, 541 00:42:22,000 --> 00:42:25,000 which will be both government and corporate. 542 00:42:25,000 --> 00:42:34,000 Some basic understanding that there are rights and responsibilities, some more balanced relationship. 543 00:42:34,000 --> 00:42:40,000 And whether that's limitations on what the vendors can do with our data, 544 00:42:40,000 --> 00:42:47,000 or some public scrutiny of the rules by which we are judged by our data, 545 00:42:47,000 --> 00:42:52,000 I don't expect it any time soon, but eventually these will come. 546 00:42:53,000 --> 00:42:57,000 I think this is how we get liberty in the internet world. 547 00:42:57,000 --> 00:43:01,000 And I think this is actually a very long and difficult battle. 548 00:43:01,000 --> 00:43:07,000 I think some of the results will upend your company. 549 00:43:07,000 --> 00:43:12,000 But they might not be coming for a decade or more. 550 00:43:12,000 --> 00:43:14,000 So that's what I have prepared. 551 00:43:14,000 --> 00:43:18,000 I'm happy to take questions. 552 00:43:19,000 --> 00:43:26,000 So there are some rules about the microphone that are confusing to me. 553 00:43:26,000 --> 00:43:33,000 You talked about kind of a game between governments on one hand and corporations on the other, 554 00:43:33,000 --> 00:43:38,000 using each other's power systems to essentially get at everyone in the middle. 555 00:43:38,000 --> 00:43:41,000 How long do you think that that game can play out? 556 00:43:41,000 --> 00:43:47,000 Is it indefinite, can it continue for the foreseeable future, or do you see a point at which 557 00:43:47,000 --> 00:43:51,000 some scandal, or something, so threatens the middle as to galvanize it? 558 00:43:51,000 --> 00:43:52,000 I don't know. 559 00:43:52,000 --> 00:43:55,000 And I think we're very much in uncharted territory here.
560 00:43:55,000 --> 00:44:01,000 We're living in a world where it's very hard for the middle to be galvanized, 561 00:44:01,000 --> 00:44:04,000 for lots of different reasons. 562 00:44:04,000 --> 00:44:06,000 I mean, a lot of people have written about this. 563 00:44:06,000 --> 00:44:10,000 I have trouble predicting the future because things are changing so fast. 564 00:44:10,000 --> 00:44:13,000 Right now, it all seems quite dysfunctional, 565 00:44:13,000 --> 00:44:16,000 and there's no mechanism for things to change. 566 00:44:16,000 --> 00:44:18,000 But of course, that's ridiculous. 567 00:44:18,000 --> 00:44:20,000 Things will change. 568 00:44:20,000 --> 00:44:22,000 Exactly how, I don't know. 569 00:44:22,000 --> 00:44:25,000 And which way they'll change, I don't know. 570 00:44:25,000 --> 00:44:33,000 If the world is one where terrorists might have nukes and we're all going to die unless we live under totalitarianism, 571 00:44:33,000 --> 00:44:35,000 people are going to accept that. 572 00:44:35,000 --> 00:44:38,000 When people are scared, that's what it looks like. 573 00:44:38,000 --> 00:44:42,000 And technology is getting to the point where that actually might be the world. 574 00:44:42,000 --> 00:44:45,000 But there are a lot of other ways this can play out. 575 00:44:45,000 --> 00:44:48,000 I'm not able to say. 576 00:44:48,000 --> 00:44:52,000 This is, I think, the topic of my next book. 577 00:44:52,000 --> 00:44:56,000 So I hope to explore the different avenues by which we might move out of this. 578 00:44:56,000 --> 00:45:01,000 I haven't even begun to have an idea of which is likely to be correct. 579 00:45:01,000 --> 00:45:06,000 And I'm not sure I'd trust me when I decide. 580 00:45:06,000 --> 00:45:11,000 You know, just look at search 20 years ago. 581 00:45:11,000 --> 00:45:16,000 We're really bad at predicting not the technical future, but the social future. 582 00:45:16,000 --> 00:45:23,000 Everyone can predict, you know, I don't know, the automobile, and that it would make people travel faster. 583 00:45:23,000 --> 00:45:24,000 But no one predicts the suburb. 584 00:45:24,000 --> 00:45:26,000 It's always the second-order social effects. 585 00:45:26,000 --> 00:45:28,000 And that's what this is all about. 586 00:45:28,000 --> 00:45:31,000 So I just don't know. 587 00:45:31,000 --> 00:45:33,000 Is this on? 588 00:45:33,000 --> 00:45:40,000 You mentioned the convergence of power, the convergence of objectives for the corporations and governments, 589 00:45:40,000 --> 00:45:48,000 and also sort of the convergence of capabilities, like the ExxonMobil comment. 590 00:45:48,000 --> 00:45:53,000 Do you see anything along the lines of the distinction between them vanishing? 591 00:45:53,000 --> 00:45:58,000 I mean, I think they largely are vanishing. 592 00:45:58,000 --> 00:46:03,000 And this is my main complaint with libertarianism as a philosophy: 593 00:46:03,000 --> 00:46:08,000 in the mid-1700s it was great, because it identified the fact that power imbalance is bad 594 00:46:08,000 --> 00:46:13,000 and tried to neutralize power, but it kind of got stuck there and didn't notice that power changed. 595 00:46:13,000 --> 00:46:16,000 And I think there is a lot of blurring. 596 00:46:16,000 --> 00:46:24,000 And some of it is the fact that money controls government, and powerful corporations have the money. 597 00:46:24,000 --> 00:46:28,000 You know, we have seen blurring at other times in history. 598 00:46:28,000 --> 00:46:29,000 Right?
599 00:46:29,000 --> 00:46:31,000 The Dutch East India Company in Africa. 600 00:46:31,000 --> 00:46:37,000 And you know, there are different examples of corporations that were de facto governments in the areas where they operated. 601 00:46:37,000 --> 00:46:40,000 You know, this is not as stark. 602 00:46:40,000 --> 00:46:42,000 But power is changing. 603 00:46:42,000 --> 00:46:49,000 Power is less hard power, power is more soft power, to use Nye's term. 604 00:46:49,000 --> 00:46:52,000 The nature of power is changing. 605 00:46:52,000 --> 00:46:55,000 So yeah, I do think there is a blurring. 606 00:46:55,000 --> 00:47:02,000 But it's different than we thought when we worried about this. 607 00:47:03,000 --> 00:47:07,000 I mean, the nature of social control is very, very different now than it was. 608 00:47:07,000 --> 00:47:12,000 The nature of surveillance is very different. 609 00:47:12,000 --> 00:47:13,000 And it's going to change again. 610 00:47:13,000 --> 00:47:15,000 I mean, what is the half-life of these technologies? 611 00:47:15,000 --> 00:47:16,000 10 years? 612 00:47:16,000 --> 00:47:17,000 Five? 613 00:47:17,000 --> 00:47:24,000 Yeah, so what's going to happen in five or ten years that will make this completely different? 614 00:47:24,000 --> 00:47:27,000 And I don't know. 615 00:47:28,000 --> 00:47:30,000 I really liked your feudalism analogy. 616 00:47:30,000 --> 00:47:33,000 And I see one potential flaw in it. 617 00:47:33,000 --> 00:47:34,000 And I want to run it by you. 618 00:47:34,000 --> 00:47:35,000 Oh, good. 619 00:47:35,000 --> 00:47:35,000 620 00:47:35,000 --> 00:47:37,000 I've thought about it. 621 00:47:37,000 --> 00:47:40,000 As I understand it, the feudal lords were pretty much monopolists. 622 00:47:40,000 --> 00:47:42,000 Like, the Russian serfs were bound to the land, 623 00:47:42,000 --> 00:47:45,000 so they didn't get a choice of which lord to be with, 624 00:47:45,000 --> 00:47:49,000 whereas people do, in fact, have a choice when there are two or three big guys. 625 00:47:49,000 --> 00:47:50,000 Right? 626 00:47:50,000 --> 00:47:52,000 They have a choice, but it isn't really a choice. 627 00:47:52,000 --> 00:47:56,000 I mean, if all three cell phone companies are collecting the same data and giving it to the same government under the same rules, 628 00:47:56,000 --> 00:47:58,000 it's not really a choice. 629 00:47:58,000 --> 00:48:03,000 You'd think you'd get a lot more customers by having a very clear privacy policy 630 00:48:03,000 --> 00:48:05,000 when the other guy doesn't. 631 00:48:05,000 --> 00:48:06,000 It seems not to be true. 632 00:48:06,000 --> 00:48:09,000 You seem to get more customers by obfuscating your privacy policy. 633 00:48:09,000 --> 00:48:16,000 I mean, there are a lot of great psych experiments about this: if you make privacy salient by showing 634 00:48:16,000 --> 00:48:19,000 a privacy policy, people will say, wow, that's bad. 635 00:48:19,000 --> 00:48:21,000 Facebook is a great example. 636 00:48:21,000 --> 00:48:24,000 They make the privacy policy really hard to find, 637 00:48:24,000 --> 00:48:27,000 because they don't want you to think about it. 638 00:48:27,000 --> 00:48:30,000 Because if you don't think about it, you share. 639 00:48:30,000 --> 00:48:40,000 And so, I mean, this is the problem with bigness: normal market economics, which involves multiple sellers competing on features, 640 00:48:40,000 --> 00:48:44,000 only works if you've got a lot of sellers competing on features.
641 00:48:44,000 --> 00:48:52,000 And if the three companies all do the same thing — I mean, it was between Apple and Microsoft in operating systems. 642 00:48:52,000 --> 00:48:56,000 And is it really that different where privacy matters? 643 00:48:56,000 --> 00:49:05,000 I mean, only around the edges, unless the companies choose to compete on those features. I mean, I can't fly the less secure airline 644 00:49:05,000 --> 00:49:07,000 that gets you through air security quicker. 645 00:49:07,000 --> 00:49:10,000 There is no competition on that. Or the more secure airline: 646 00:49:10,000 --> 00:49:11,000 we do a background check on everybody. 647 00:49:11,000 --> 00:49:17,000 It's probably a reasonable feature to compete on, but there isn't any competition. 648 00:49:17,000 --> 00:49:26,000 So, especially when some of this deals with government demands, you're just not going to have the competition. 649 00:49:26,000 --> 00:49:33,000 And there is a lot of reason to make that go away as much as possible, 650 00:49:33,000 --> 00:49:37,000 because these companies want people to share more. 651 00:49:37,000 --> 00:49:41,000 So, the thing about being bound — bound to the land — is interesting. 652 00:49:41,000 --> 00:49:44,000 No — well, yes and no. 653 00:49:44,000 --> 00:49:49,000 It's very hard for a regular person to leave Facebook. 654 00:49:49,000 --> 00:49:51,000 That's where your party invites come from. 655 00:49:51,000 --> 00:49:52,000 That's where your friends are. 656 00:49:52,000 --> 00:49:55,000 That's where your social interaction is. 657 00:49:55,000 --> 00:49:57,000 If you don't go on Facebook, you don't get invited 658 00:49:57,000 --> 00:50:01,000 to parties, you never get laid, you have a really sucky college experience. 659 00:50:01,000 --> 00:50:02,000 Right? 660 00:50:02,000 --> 00:50:08,000 So you're not bound, but there's a lot of social push to stay. 661 00:50:08,000 --> 00:50:13,000 It's very hard to take your data with you when you leave — again, Google is an exception here. 662 00:50:14,000 --> 00:50:18,000 Remember the whole battles about cell phone number portability? 663 00:50:18,000 --> 00:50:21,000 That was all about binding people to the cell phone companies. 664 00:50:21,000 --> 00:50:25,000 About raising the cost of switching. 665 00:50:25,000 --> 00:50:27,000 If you raise the cost of switching, 666 00:50:27,000 --> 00:50:33,000 you can do a lot more to your customers or users. 667 00:50:33,000 --> 00:50:34,000 Right? 668 00:50:34,000 --> 00:50:36,000 If the customers can't switch, 669 00:50:36,000 --> 00:50:39,000 you can piss them off a whole lot. 670 00:50:40,000 --> 00:50:47,000 The other reason I like the serf model is the notion of people doing stuff online, 671 00:50:47,000 --> 00:50:52,000 which is the raw material companies use to make profits. 672 00:50:52,000 --> 00:50:55,000 It's kind of like you're farming for your lord. 673 00:50:55,000 --> 00:50:58,000 The farm metaphor would be perfect for this, right? 674 00:50:58,000 --> 00:51:01,000 Maybe that's too much. 675 00:51:01,000 --> 00:51:05,000 The other way the metaphor works — 676 00:51:05,000 --> 00:51:09,000 and other people have written about this, the feudal metaphor — 677 00:51:09,000 --> 00:51:12,000 is that in a feudal system everything is owned. 678 00:51:12,000 --> 00:51:14,000 There's no commons. 679 00:51:14,000 --> 00:51:17,000 And we're seeing this on the internet. 680 00:51:17,000 --> 00:51:19,000 No piece of the internet is a commons. 681 00:51:19,000 --> 00:51:20,000 Right?
682 00:51:20,000 --> 00:51:23,000 We have, you know, in this country, very particular rules about the commons. 683 00:51:23,000 --> 00:51:27,000 Free speech rules, association rules, rules about protesting. 684 00:51:27,000 --> 00:51:28,000 Because you're on a street. 685 00:51:28,000 --> 00:51:30,000 You're on a public street. 686 00:51:30,000 --> 00:51:34,000 And those rules don't apply, for example, in Zuccotti Park in New York. 687 00:51:34,000 --> 00:51:37,000 Because that is a privately owned public space. 688 00:51:37,000 --> 00:51:42,000 The internet is entirely privately owned public spaces. 689 00:51:42,000 --> 00:51:43,000 Right? 690 00:51:43,000 --> 00:51:48,000 So Apple is well within its rights to say to an app developer, an app creator 691 00:51:48,000 --> 00:51:52,000 who made an app to show US drone strikes in Pakistan, 692 00:51:52,000 --> 00:51:56,000 you can't have your app in my store. 693 00:51:56,000 --> 00:51:57,000 Right? 694 00:51:57,000 --> 00:52:00,000 Because it's not a free speech issue, it's not a protest. 695 00:52:00,000 --> 00:52:04,000 It is a private space; Apple gets to decide. 696 00:52:04,000 --> 00:52:10,000 So this lack of a public sphere in the world where we are all associating 697 00:52:10,000 --> 00:52:14,000 is another way the feudal model works. 698 00:52:14,000 --> 00:52:16,000 I don't know how to fit it into what I'm doing. 699 00:52:16,000 --> 00:52:18,000 I'll probably figure it out sooner or later. 700 00:52:18,000 --> 00:52:22,000 The feudal model is really appealing at first blush. 701 00:52:22,000 --> 00:52:26,000 But another problem with it is that we actually do live in a democracy, 702 00:52:26,000 --> 00:52:27,000 at least theoretically. 703 00:52:27,000 --> 00:52:30,000 And we do have the power to vote, at least theoretically. 704 00:52:30,000 --> 00:52:34,000 The problem seems to me not that there are currently all kinds of tricks, 705 00:52:34,000 --> 00:52:37,000 like the people who obfuscate the privacy policies. 706 00:52:37,000 --> 00:52:41,000 It's more about the lax attitude of those who are being governed by the government they set up 707 00:52:41,000 --> 00:52:44,000 or the corporations they choose to do business with. 708 00:52:44,000 --> 00:52:48,000 And so ultimately the problem is we aren't looking after our own interests. 709 00:52:48,000 --> 00:52:50,000 And that seems to be what needs to be fixed. 710 00:52:50,000 --> 00:52:53,000 And it's not feudalism, because we have the opportunity to escape. 711 00:52:53,000 --> 00:52:54,000 We're just not taking advantage of it. 712 00:52:54,000 --> 00:52:56,000 I do agree with that. 713 00:52:57,000 --> 00:52:59,000 It's just getting harder and harder. 714 00:52:59,000 --> 00:53:05,000 And some of it is the fact that we are just too good at psychological manipulation. 715 00:53:05,000 --> 00:53:09,000 I mean, advertising and political speech have just gotten too good. 716 00:53:09,000 --> 00:53:13,000 So, you know, I don't know how fair the game is. 717 00:53:13,000 --> 00:53:16,000 I mean, yes, you are fundamentally right. 718 00:53:16,000 --> 00:53:19,000 The question is, does that translate to being right in practice? 719 00:53:19,000 --> 00:53:21,000 The United States is particularly hard. 720 00:53:21,000 --> 00:53:27,000 Our political system is not designed for a huge spectrum of political ideas. 721 00:53:27,000 --> 00:53:33,000 I mean, go to any other country and you just realize how narrow our political debates are.
722 00:53:33,000 --> 00:53:36,000 Just because of the way our two-party system is set up. 723 00:53:36,000 --> 00:53:41,000 So, but again, unless the parties choose to compete on these features, 724 00:53:41,000 --> 00:53:44,000 we don't really have a choice. 725 00:53:44,000 --> 00:53:47,000 And on some features they do, and on some they don't. 726 00:53:47,000 --> 00:53:50,000 But yeah, I mean, yes, you are inherently right. 727 00:53:50,000 --> 00:53:57,000 And so by the book that's correct; the question is, how does that translate into what we can actually do realistically? 728 00:53:57,000 --> 00:54:00,000 So we need to trick Facebook into becoming the EFF. 729 00:54:00,000 --> 00:54:02,000 Game on. 730 00:54:02,000 --> 00:54:14,000 Does that mean that governments have an incentive to encourage there to be a small number of companies, so that they don't compete on things like privacy? 731 00:54:14,000 --> 00:54:20,000 If there are only three, it's much harder for them to compete on something like that. 732 00:54:20,000 --> 00:54:22,000 Yeah, I don't know. 733 00:54:22,000 --> 00:54:24,000 I mean, there are a lot of examples we could look at. 734 00:54:24,000 --> 00:54:26,000 I've been poking at some of them. 735 00:54:26,000 --> 00:54:28,000 What about automobile manufacturers? 736 00:54:28,000 --> 00:54:32,000 I mean, they do have to compete on safety, and have for many years. 737 00:54:32,000 --> 00:54:36,000 I mean, whole brands are built on "our cars are safer than your cars." 738 00:54:36,000 --> 00:54:40,000 So you do see security features sometimes. 739 00:54:40,000 --> 00:54:44,000 In a lot of ways the organic food movement is a food safety movement. 740 00:54:44,000 --> 00:54:50,000 Yeah, but in the 70s, I mean, that was what they did. 741 00:54:50,000 --> 00:54:55,000 Yeah, but organic food is believed to be more pure. 742 00:54:55,000 --> 00:55:00,000 It's a food purity sell, which is inherently a health sell and a safety sell. 743 00:55:00,000 --> 00:55:05,000 You can argue whether it's real or fake, but that's sort of how the companies are competing. 744 00:55:05,000 --> 00:55:06,000 I don't know. 745 00:55:06,000 --> 00:55:12,000 I don't think the government is incenting any particular economic outcome. 746 00:55:12,000 --> 00:55:16,000 I think there's just, right now, a very happy confluence. 747 00:55:16,000 --> 00:55:19,000 Thank you very much.