1 00:00:00,625 --> 00:00:03,294 (chill electronic music) 2 00:00:10,385 --> 00:00:11,594 - I'm Baratunde Thurston 3 00:00:11,594 --> 00:00:14,431 and you're watching "Lenovo Late Night I.T." 4 00:00:14,431 --> 00:00:17,017 Where we lock industry heavyweights in a garage 5 00:00:17,017 --> 00:00:19,310 and make them tell us everything they know. 6 00:00:19,310 --> 00:00:22,147 I'm here to rouse you from your newsfeed-induced coma 7 00:00:22,147 --> 00:00:24,899 with some startling facts about cybersecurity. 8 00:00:24,899 --> 00:00:25,900 First fact. 9 00:00:25,900 --> 00:00:27,152 By the end of this show, 10 00:00:27,152 --> 00:00:30,238 there will have been 40 new cyber-attacks. 11 00:00:30,238 --> 00:00:33,408 Cybercrime is now the most profitable criminal enterprise 12 00:00:33,408 --> 00:00:34,284 in the world. 13 00:00:34,284 --> 00:00:37,245 More than drugs, more than loyalty card fraud, 14 00:00:37,245 --> 00:00:39,456 yeah, loyalty card fraud. That's right. 15 00:00:39,456 --> 00:00:42,208 So what conversations do we need to be having right now 16 00:00:42,208 --> 00:00:43,376 about security? 17 00:00:43,376 --> 00:00:46,254 How can businesses and consumers protect themselves? 18 00:00:46,254 --> 00:00:48,715 And what do we do after an attack? 19 00:00:48,715 --> 00:00:50,884 Curl up in a ball and cry, probably. 20 00:00:50,884 --> 00:00:52,510 Here to weigh in is Tim Brown, 21 00:00:52,510 --> 00:00:54,721 VP of security for SolarWinds, 22 00:00:54,721 --> 00:00:57,348 where he's responsible for internal IT security, 23 00:00:57,348 --> 00:01:00,101 product security, and security strategy. 24 00:01:00,101 --> 00:01:02,312 That's so much security. 25 00:01:02,312 --> 00:01:05,482 A former Dell Fellow, CTO, chief product officer, 26 00:01:05,482 --> 00:01:08,651 chief architect, and director of security strategy. 27 00:01:08,651 --> 00:01:10,653 Tim's got more than 20 years of experience 28 00:01:10,653 --> 00:01:12,572 developing and implementing, 29 00:01:12,572 --> 00:01:15,408 you guessed it, security technology. 30 00:01:15,408 --> 00:01:16,826 And here's a fun fact about Tim. 31 00:01:16,826 --> 00:01:19,120 He and his wife live on a 60-acre ranch, 32 00:01:19,120 --> 00:01:22,373 surrounded by horses and miniature donkeys. Oh. 33 00:01:23,625 --> 00:01:24,501 It sounds really secure. 34 00:01:24,501 --> 00:01:25,335 (laughs) 35 00:01:25,335 --> 00:01:28,797 I'll also be talking to CSO Hall of Famer, Andy Ellis, 36 00:01:28,797 --> 00:01:31,966 a 20-year veteran of Akamai Technologies. 37 00:01:31,966 --> 00:01:34,302 Now, Andy led the company's security program, 38 00:01:34,302 --> 00:01:36,387 growing it from a single individual 39 00:01:36,387 --> 00:01:38,932 to a 90-plus person team. 40 00:01:38,932 --> 00:01:41,976 He's now the advisory CSO at Orca Security 41 00:01:41,976 --> 00:01:43,353 and the founder and CEO 42 00:01:43,353 --> 00:01:45,647 of the leadership development company, Duha. 43 00:01:46,523 --> 00:01:47,357 Did I do that right? 44 00:01:47,357 --> 00:01:48,191 - Yep. 45 00:01:48,191 --> 00:01:49,025 - Yes. 46 00:01:49,025 --> 00:01:50,276 But his greatest accomplishment of all has to be 47 00:01:50,276 --> 00:01:52,529 taking first place at the Sherman Oaks 48 00:01:52,529 --> 00:01:54,364 Galleria Spelling Bee. 49 00:01:54,364 --> 00:01:55,615 That's literally around the corner 50 00:01:55,615 --> 00:01:56,616 from where we are right now.
51 00:01:56,616 --> 00:01:57,450 So Andy, 52 00:01:57,450 --> 00:02:00,161 if you want to go pop over, relive those glory years, 53 00:02:00,161 --> 00:02:00,995 we understand. 54 00:02:00,995 --> 00:02:01,955 - I'm hopping over right after this. 55 00:02:01,955 --> 00:02:04,916 - Thank you so much for joining, Andy, Tim, 56 00:02:04,916 --> 00:02:05,917 how you feeling right now? 57 00:02:05,917 --> 00:02:06,751 - Great, 58 00:02:06,751 --> 00:02:07,585 - Great? - Great. 59 00:02:07,585 --> 00:02:08,419 - Fantastic. 60 00:02:08,419 --> 00:02:11,339 - So, a lot of security at the table right now, 61 00:02:11,339 --> 00:02:12,966 I feel safer already. 62 00:02:12,966 --> 00:02:15,093 And I want to know what keeps you up at night, 63 00:02:15,093 --> 00:02:16,261 from a security perspective? 64 00:02:16,261 --> 00:02:18,680 I don't need to know your psychological issues. 65 00:02:18,680 --> 00:02:19,931 - So many things, right? 66 00:02:19,931 --> 00:02:20,765 So, you know, 67 00:02:20,765 --> 00:02:22,100 large breaches, 68 00:02:22,100 --> 00:02:24,561 breaches that are affecting the world. 69 00:02:24,561 --> 00:02:26,437 Really one of the biggest threats that we have 70 00:02:26,437 --> 00:02:28,690 is a true cyber terrorism event, right? 71 00:02:28,690 --> 00:02:31,776 That, that is what is really truly scary. 72 00:02:31,776 --> 00:02:34,279 - I'm duly terrified. Thank you. 73 00:02:34,279 --> 00:02:35,613 What keeps you up at night, Andy? 74 00:02:35,613 --> 00:02:37,157 - I think what worries me the most 75 00:02:37,157 --> 00:02:39,701 is how people don't always understand 76 00:02:39,701 --> 00:02:41,244 the risks that they take. 77 00:02:41,244 --> 00:02:42,078 And it's important that we take risks. 78 00:02:42,078 --> 00:02:43,329 We're not gonna get rid of risk. 79 00:02:43,329 --> 00:02:45,206 I'm not the person that's gonna sit here and say, 80 00:02:45,206 --> 00:02:46,374 don't take any risks. - Okay. 81 00:02:46,374 --> 00:02:49,002 - Like literally we showed up here, that was a risk. 82 00:02:49,002 --> 00:02:51,713 But the challenge sometimes is the systems we're using 83 00:02:51,713 --> 00:02:53,506 are so complex 84 00:02:53,506 --> 00:02:56,009 that there are risks that we just don't even understand. 85 00:02:56,009 --> 00:02:57,760 But we think that we're okay. 86 00:02:57,760 --> 00:02:59,095 - I mean, when you talk about 87 00:02:59,095 --> 00:03:02,098 the level of complexity involved that we don't understand, 88 00:03:02,098 --> 00:03:04,184 it reminds me of the banking system 89 00:03:05,185 --> 00:03:07,687 and how there was so much no one understood. 90 00:03:07,687 --> 00:03:08,855 And when it collapsed, 91 00:03:08,855 --> 00:03:10,607 everybody did that. - Right. 92 00:03:10,607 --> 00:03:11,608 - Is that what we're facing 93 00:03:11,608 --> 00:03:13,693 with our dependence on technology? 94 00:03:13,693 --> 00:03:14,527 - Very similar. 95 00:03:14,527 --> 00:03:15,361 Very similar. 96 00:03:15,361 --> 00:03:16,404 Although I might even argue 97 00:03:16,404 --> 00:03:18,907 the IT industry is even more complex 98 00:03:18,907 --> 00:03:20,366 than the banking industry, 99 00:03:20,366 --> 00:03:21,993 but that's a great model to start from.
100 00:03:21,993 --> 00:03:24,120 If we don't understand the banking industry 101 00:03:24,120 --> 00:03:27,207 and how everything is layered on top of something else, 102 00:03:27,207 --> 00:03:29,709 it could be some company you've never heard of 103 00:03:29,709 --> 00:03:32,420 goes out of business tomorrow, gets compromised, 104 00:03:32,420 --> 00:03:34,255 when they go down it cascades, 105 00:03:34,255 --> 00:03:36,799 somebody else goes down, but that's the price we pay. 106 00:03:36,799 --> 00:03:39,552 If we want to have this rapid advancement, 107 00:03:39,552 --> 00:03:43,181 you get rapid advancement by building on top of complexity. 108 00:03:43,181 --> 00:03:44,015 - [Baratunde] Right. 109 00:03:44,015 --> 00:03:45,058 - And none of the software that we build 110 00:03:45,058 --> 00:03:46,809 is built by ourselves anymore. 111 00:03:46,809 --> 00:03:47,727 Right? Well, we have- 112 00:03:47,727 --> 00:03:48,770 - What do you mean by that? 113 00:03:48,770 --> 00:03:52,106 - That nobody builds a full suite of software. 114 00:03:52,106 --> 00:03:54,275 We all rely on an operating system, right? 115 00:03:54,275 --> 00:03:56,402 We all rely on sub-components. 116 00:03:56,402 --> 00:03:59,530 We all rely on some open-source components. 117 00:03:59,530 --> 00:04:01,866 We all rely on other things 118 00:04:01,866 --> 00:04:04,369 which are really termed the supply chain now. Right? 119 00:04:04,369 --> 00:04:06,788 So we're all relying on other pieces. 120 00:04:06,788 --> 00:04:09,874 So if some of those critical supply chain components 121 00:04:09,874 --> 00:04:10,959 have an issue, 122 00:04:10,959 --> 00:04:11,918 that's another place 123 00:04:11,918 --> 00:04:14,295 you get that cascading effect of failure. 124 00:04:14,295 --> 00:04:16,130 - So we have like a software supply chain. 125 00:04:16,130 --> 00:04:18,800 A lot of people are familiar with our physical supply chain 126 00:04:18,800 --> 00:04:20,218 and how that can get pretty jacked up. 127 00:04:20,218 --> 00:04:23,721 I'm only imagining the complexity and the confusion 128 00:04:23,721 --> 00:04:26,015 around software supply chains getting messed up. 129 00:04:26,015 --> 00:04:26,849 - Yeah, absolutely. 130 00:04:26,849 --> 00:04:29,811 And no software today is built by, you know, an entity, 131 00:04:29,811 --> 00:04:30,645 a single entity. 132 00:04:30,645 --> 00:04:34,148 Everybody uses something else within their software. 133 00:04:34,148 --> 00:04:36,693 - Okay. So we're getting really, really real with the talk. 134 00:04:36,693 --> 00:04:39,153 And I want to pull it back another layer. 135 00:04:39,153 --> 00:04:42,240 What are the things that CIOs 136 00:04:42,240 --> 00:04:44,867 aren't telling their boards or their CEOs 137 00:04:44,867 --> 00:04:46,536 in terms of the threats and the risks 138 00:04:46,536 --> 00:04:47,412 that they're facing? 139 00:04:47,412 --> 00:04:50,373 - They don't necessarily have full scope 140 00:04:50,373 --> 00:04:53,293 of everything that's involved in an organization. 141 00:04:53,293 --> 00:04:55,545 So just think about everything you need to know 142 00:04:55,545 --> 00:04:59,090 inside of a company to address risk. Right? 143 00:04:59,090 --> 00:05:00,925 A lot. You need to know physical, 144 00:05:00,925 --> 00:05:03,261 you need to know logical. 145 00:05:03,261 --> 00:05:04,095 You need to know everything 146 00:05:04,095 --> 00:05:04,929 that people are building, 147 00:05:04,929 --> 00:05:06,973 you need to know every application that's going on.
148 00:05:06,973 --> 00:05:08,182 - You need to know the little device 149 00:05:08,182 --> 00:05:09,475 that employees are bringing in that's adding to the network 150 00:05:09,475 --> 00:05:10,935 - In the front door, right? 151 00:05:10,935 --> 00:05:13,104 So you need to know so many things 152 00:05:13,104 --> 00:05:17,191 in order to truly get to a risk assessment. 153 00:05:17,191 --> 00:05:21,529 I think they're not telling the board and others, 154 00:05:21,529 --> 00:05:23,072 the unknown unknowns. 155 00:05:23,072 --> 00:05:24,449 We don't know. Right. 156 00:05:24,449 --> 00:05:26,701 Well, you know, what do you mean you don't know? 157 00:05:26,701 --> 00:05:27,618 - The boards don't like to hear that? 158 00:05:27,618 --> 00:05:28,911 - No. 159 00:05:28,911 --> 00:05:29,954 (laughing) 160 00:05:29,954 --> 00:05:32,332 - Boards wanna hear either we're safe 161 00:05:32,332 --> 00:05:34,334 or there's a disaster and I'm on it. 162 00:05:34,334 --> 00:05:36,586 But anything in between those two things 163 00:05:36,586 --> 00:05:38,046 is a nuanced conversation 164 00:05:38,046 --> 00:05:40,256 that a lot of boards don't want to have. 165 00:05:40,256 --> 00:05:41,090 - [Tim] I think they're getting better. 166 00:05:41,090 --> 00:05:42,175 - They're getting better. 167 00:05:42,175 --> 00:05:43,926 - I think they're getting better at asking hard questions. 168 00:05:43,926 --> 00:05:45,428 Just really the unknown unknowns. 169 00:05:45,428 --> 00:05:47,805 It's like, yes, we have risk in the environment. 170 00:05:47,805 --> 00:05:49,515 - I'd even say the known unknowns 171 00:05:49,515 --> 00:05:50,808 that just get filtered out. 172 00:05:50,808 --> 00:05:53,728 So the board asks the CEO, are we safe? 173 00:05:53,728 --> 00:05:55,938 Are we, let's just take patching, 174 00:05:55,938 --> 00:05:56,939 are we patching all of our systems, 175 00:05:56,939 --> 00:05:58,274 you know, keeping them up to date? 176 00:05:58,274 --> 00:06:01,444 Did you take today's 45-minute OS10 update? 177 00:06:02,695 --> 00:06:05,114 And the challenge is the CIO and CEO 178 00:06:05,114 --> 00:06:07,033 have to give a very short answer. 179 00:06:07,033 --> 00:06:08,659 It's either yes or no. 180 00:06:08,659 --> 00:06:10,286 But when they say yes, they're thinking, 181 00:06:10,286 --> 00:06:12,038 well, yeah, we're safe, but let me go just check 182 00:06:12,038 --> 00:06:13,706 with my director of IT. 183 00:06:13,706 --> 00:06:14,624 (laughs) 184 00:06:14,624 --> 00:06:16,542 And at every level when somebody answers, yes, 185 00:06:16,542 --> 00:06:18,878 they're filtering, they're saying, yes, we're good 186 00:06:18,878 --> 00:06:21,506 in this one environment, which is 90% of our systems. 187 00:06:21,506 --> 00:06:22,840 It's what really matters. 188 00:06:22,840 --> 00:06:24,258 But the person they're drawing from 189 00:06:24,258 --> 00:06:26,386 was really only answering for 70%. 190 00:06:26,386 --> 00:06:28,763 (talking over each other) 191 00:06:28,763 --> 00:06:30,056 - Right. 192 00:06:30,056 --> 00:06:30,890 - And so what happens 193 00:06:30,890 --> 00:06:32,308 is you don't have this complete coverage. 194 00:06:32,308 --> 00:06:33,434 So when you get an answer, 195 00:06:33,434 --> 00:06:36,729 you get an answer about the best part of your business. 196 00:06:36,729 --> 00:06:38,231 It would sort of be like saying, 197 00:06:38,231 --> 00:06:40,650 is everybody in America fed tonight?
198 00:06:40,650 --> 00:06:42,151 And if you said, well sure, 199 00:06:42,151 --> 00:06:46,239 everybody who lives in the high-net-worth locations is fed, 200 00:06:46,239 --> 00:06:48,324 you know, the places that we're making sure have food. 201 00:06:48,324 --> 00:06:49,867 Yeah. They're all fed, 202 00:06:49,867 --> 00:06:51,786 but you're not answering the question about the people 203 00:06:51,786 --> 00:06:53,079 who are living out on the streets 204 00:06:53,079 --> 00:06:54,914 or who have food insecurity. 205 00:06:54,914 --> 00:06:57,125 We see that same thing within the CIO frame. 206 00:06:57,125 --> 00:06:59,502 - Well, it also seems like maybe we should stop 207 00:06:59,502 --> 00:07:02,046 asking binary questions. 208 00:07:02,046 --> 00:07:02,880 (laughs) 209 00:07:02,880 --> 00:07:05,383 This is an analog situation, ironically. 210 00:07:05,383 --> 00:07:07,510 - And that's why we stopped talking about security. 211 00:07:07,510 --> 00:07:09,387 And we talk about risk, right? 212 00:07:09,387 --> 00:07:10,763 Security is a terrible word 213 00:07:10,763 --> 00:07:11,806 because people do think it's- 214 00:07:11,806 --> 00:07:12,807 - [Baratunde] Oh, security's a terrible word? 215 00:07:12,807 --> 00:07:13,933 - It is a terrible word. 216 00:07:13,933 --> 00:07:15,852 - It was like five times in your bio. 217 00:07:15,852 --> 00:07:17,937 - I know, but it's still a terrible word. 218 00:07:17,937 --> 00:07:19,814 (laughing) 219 00:07:19,814 --> 00:07:21,107 - Because it means different things to different people. 220 00:07:21,107 --> 00:07:21,899 - Right. 221 00:07:21,899 --> 00:07:23,109 - Give me some positive examples 222 00:07:23,109 --> 00:07:26,320 of effective cybersecurity inside of an organization. 223 00:07:26,320 --> 00:07:28,197 - So when we look at security, you know, 224 00:07:28,197 --> 00:07:30,408 good security means that you're talking about risk, 225 00:07:30,408 --> 00:07:32,076 not security, not in the binary, 226 00:07:32,076 --> 00:07:34,412 not saying we are secure, we're not secure. 227 00:07:34,412 --> 00:07:39,292 It's creating an education for the executive team 228 00:07:39,292 --> 00:07:40,918 to say, here's what risk we face, 229 00:07:40,918 --> 00:07:42,462 here's how we can mitigate risk, 230 00:07:42,462 --> 00:07:46,007 here's how we can appropriately minimize the risks 231 00:07:46,007 --> 00:07:48,551 that we face by doing these things. 232 00:07:48,551 --> 00:07:51,679 So that's when you start seeing a cybersecurity program 233 00:07:51,679 --> 00:07:55,308 that is running well, because everybody faces risk. 234 00:07:55,308 --> 00:07:57,518 Everybody faces some level of risk. 235 00:07:57,518 --> 00:07:59,312 So it's more controlling it, right? 236 00:07:59,312 --> 00:08:00,605 Managing it. 237 00:08:00,605 --> 00:08:02,273 - There's been a breach. 238 00:08:02,273 --> 00:08:05,109 How do you communicate that there's been a breach? 239 00:08:05,109 --> 00:08:07,695 Do you put it on a cake and have it delivered 240 00:08:07,695 --> 00:08:10,198 with dancing and all kinds of sparkles? 241 00:08:10,198 --> 00:08:13,451 Do you make a TikTok dance about it and hope they see it? 242 00:08:13,451 --> 00:08:14,952 And you say, "I tried to tell you!" 243 00:08:14,952 --> 00:08:17,246 What's truly, what's the best way to communicate 244 00:08:17,246 --> 00:08:19,290 when something has not been secured?
245 00:08:20,500 --> 00:08:22,835 - So I think it's really important in that moment 246 00:08:22,835 --> 00:08:25,296 to not try to say, I told you so. 247 00:08:25,296 --> 00:08:26,130 (laughing) 248 00:08:26,130 --> 00:08:27,006 - It's hard though, isn't it? 249 00:08:27,006 --> 00:08:28,216 - It's really hard. 250 00:08:28,216 --> 00:08:30,968 In a great organization, you don't have blame. 251 00:08:30,968 --> 00:08:31,844 - What do you have instead? 252 00:08:31,844 --> 00:08:34,055 - What you have instead is this acceptance 253 00:08:34,055 --> 00:08:36,557 that the organization failed. 254 00:08:36,557 --> 00:08:39,227 And you want the person or the people 255 00:08:39,227 --> 00:08:41,270 who are closest to the failure 256 00:08:41,270 --> 00:08:44,815 to be willing to stand up and say, here's what I did. 257 00:08:44,815 --> 00:08:47,109 And they don't know if they're gonna be blamed or not. 258 00:08:47,109 --> 00:08:48,903 And so if they're afraid of being blamed, 259 00:08:48,903 --> 00:08:51,906 they're not going to tell you what really went wrong. 260 00:08:51,906 --> 00:08:54,659 - Shame. It's just like life. 261 00:08:54,659 --> 00:08:55,785 - [Andy] Just like life. 262 00:08:55,785 --> 00:08:56,619 (laughing) 263 00:08:56,619 --> 00:08:58,371 - Right. If you're gonna feel shame for something, 264 00:08:58,371 --> 00:09:00,122 you're not gonna come forward with it. 265 00:09:00,122 --> 00:09:00,957 Okay. I get it. 266 00:09:00,957 --> 00:09:03,417 - So you want them to know that there is safety. 267 00:09:03,417 --> 00:09:04,919 If they made an error, 268 00:09:04,919 --> 00:09:07,547 the problem was, why didn't you have a process 269 00:09:07,547 --> 00:09:11,342 to keep a human error from causing this? 270 00:09:11,342 --> 00:09:12,176 Great. 271 00:09:12,176 --> 00:09:15,054 I want every human to tell me, like they typo-ed something, 272 00:09:15,054 --> 00:09:17,056 because the answer is I should never have 273 00:09:17,056 --> 00:09:19,767 a high-quality system relying on human input 274 00:09:19,767 --> 00:09:21,394 'cause we know humans make typos. 275 00:09:21,394 --> 00:09:23,312 - Okay. Let's talk about trust. 276 00:09:23,312 --> 00:09:25,314 Zero trust. - Yep. 277 00:09:25,314 --> 00:09:26,148 - What is it? 278 00:09:26,148 --> 00:09:28,317 You're explaining it to the CEO. Go. 279 00:09:28,317 --> 00:09:29,151 - Yeah, sure. 280 00:09:29,151 --> 00:09:32,446 Zero trust means that you're moving your, 281 00:09:32,446 --> 00:09:35,491 your authentication and authorization to the edge. 282 00:09:35,491 --> 00:09:36,867 Moving them out to the applications, 283 00:09:36,867 --> 00:09:38,744 making the applications intelligent, 284 00:09:38,744 --> 00:09:40,746 making them make the decision 285 00:09:40,746 --> 00:09:43,249 so that you can segment your market, 286 00:09:43,249 --> 00:09:46,002 your environment into little spots. 287 00:09:46,002 --> 00:09:49,505 The way I like to explain zero trust is a pomegranate. 288 00:09:49,505 --> 00:09:51,799 - Oh. Okay. The annoying fruit that's delicious. 289 00:09:51,799 --> 00:09:53,342 - The annoying fruit. 290 00:09:53,342 --> 00:09:54,176 (laughing) 291 00:09:54,176 --> 00:09:55,094 Think about the pomegranate. 292 00:09:55,094 --> 00:09:56,887 What does it have? It has the seed in the middle. 293 00:09:56,887 --> 00:09:58,389 It has a little gel coating 294 00:09:58,389 --> 00:10:01,017 and then it has sections of gel coatings. 295 00:10:01,017 --> 00:10:02,893 And then it has a hard outer shell.
296 00:10:02,893 --> 00:10:07,231 So your enterprise is actually made up of many seeds. 297 00:10:07,231 --> 00:10:09,317 Your Office 365 is a seed, 298 00:10:09,317 --> 00:10:10,860 your AWS environment is a seed, 299 00:10:10,860 --> 00:10:13,237 your on-premise workstation is a seed, 300 00:10:13,237 --> 00:10:15,656 you have all of those seeds, and each one of them 301 00:10:15,656 --> 00:10:18,242 should have a gel coating around the outside, 302 00:10:18,242 --> 00:10:20,536 which is the security associated with it. 303 00:10:20,536 --> 00:10:22,872 And then a common policy around the outside, 304 00:10:22,872 --> 00:10:24,498 which is a hard shell. - Okay. 305 00:10:24,498 --> 00:10:28,169 - But if you think you can have this, 306 00:10:28,169 --> 00:10:29,879 you know, what we had before, 307 00:10:29,879 --> 00:10:33,132 that one monolithic big avocado. Right? 308 00:10:33,132 --> 00:10:35,092 It's not an avocado. It's a pomegranate. 309 00:10:35,092 --> 00:10:37,553 An avocado has a hard shell, a big seed, 310 00:10:37,553 --> 00:10:39,430 and says, oh, I've got firewalls around everything. 311 00:10:39,430 --> 00:10:40,723 And that's my environment. 312 00:10:40,723 --> 00:10:43,684 Nobody's environment looks like that anymore. 313 00:10:43,684 --> 00:10:45,728 Everybody's environment is a pomegranate. 314 00:10:45,728 --> 00:10:48,356 You know, often you can't protect everything, right? 315 00:10:48,356 --> 00:10:51,150 You can't protect everything at the same level. 316 00:10:51,150 --> 00:10:52,276 And when- 317 00:10:52,276 --> 00:10:53,778 - Wow, that sounds so honest. 318 00:10:53,778 --> 00:10:55,404 - You have to give up land. 319 00:10:55,404 --> 00:10:58,407 You have to say, I am going to protect this much better 320 00:10:58,407 --> 00:10:59,867 than I'm going to protect this. 321 00:10:59,867 --> 00:11:01,535 - I feel like you're a general in a war 322 00:11:01,535 --> 00:11:02,453 I don't want to be in. 323 00:11:02,453 --> 00:11:03,829 - There you go. 324 00:11:03,829 --> 00:11:04,664 (laughing) 325 00:11:04,664 --> 00:11:06,624 - So what are some misconceptions 326 00:11:06,624 --> 00:11:09,627 that people broadly have about cybersecurity? 327 00:11:09,627 --> 00:11:10,795 Whether it's the nature of the risk 328 00:11:10,795 --> 00:11:12,672 or how it even operates or what it is. 329 00:11:12,672 --> 00:11:13,923 What do you think some of those are? 330 00:11:13,923 --> 00:11:15,841 - So I think the biggest misconception 331 00:11:15,841 --> 00:11:18,511 is that it's a hard field. 332 00:11:18,511 --> 00:11:20,262 It's actually a really easy field. 333 00:11:20,262 --> 00:11:22,515 It's really broad. It's really complicated, 334 00:11:22,515 --> 00:11:24,684 but it's like cars a hundred years ago. 335 00:11:25,267 --> 00:11:27,520 A hundred years ago, nobody understood a car. 336 00:11:27,520 --> 00:11:29,313 Like you had to hire car specialists, 337 00:11:29,313 --> 00:11:30,690 you had to do all these things. 338 00:11:30,690 --> 00:11:33,776 And today, like mostly we all know how to drive cars. 339 00:11:33,776 --> 00:11:36,654 And I live in Boston, so I know a lot of people who don't. 340 00:11:36,654 --> 00:11:37,488 - And you definitely don't, 341 00:11:37,488 --> 00:11:39,573 but that's why I learned to drive. So same page. 342 00:11:39,573 --> 00:11:41,784 - But we're in this world 343 00:11:41,784 --> 00:11:44,912 where security is still a maturing field. 344 00:11:44,912 --> 00:11:46,664 We're still trying to hire unicorns.
345 00:11:46,664 --> 00:11:48,374 People who can do everything. 346 00:11:48,374 --> 00:11:50,418 We don't need people who can do everything. 347 00:11:50,418 --> 00:11:52,128 We need people who can understand 348 00:11:52,128 --> 00:11:54,004 how to get part of the job done. 349 00:11:54,004 --> 00:11:56,090 People who've done safety engineering 350 00:11:56,090 --> 00:11:58,634 in water supply systems. 351 00:11:58,634 --> 00:12:00,553 They understand risk trade-offs. 352 00:12:00,553 --> 00:12:02,388 Like you don't get to shut off the water 353 00:12:02,388 --> 00:12:04,390 unless it's really toxic, 354 00:12:04,390 --> 00:12:05,516 but there are things you're gonna do, 355 00:12:05,516 --> 00:12:07,268 you're gonna make risk trade-offs. 356 00:12:07,268 --> 00:12:09,562 We need people who have that kind of expertise 357 00:12:09,562 --> 00:12:11,856 to come into security and make risk decisions. 358 00:12:11,856 --> 00:12:13,607 - Keep going on this a little bit, 359 00:12:13,607 --> 00:12:16,819 the diversity of cybersecurity professionals 360 00:12:16,819 --> 00:12:20,156 sometimes isn't as appreciated as it could be, right? 361 00:12:20,156 --> 00:12:22,324 Because diversity gives you different thinking 362 00:12:22,324 --> 00:12:24,702 and you absolutely need different, you know, 363 00:12:24,702 --> 00:12:27,371 models of thinking to be able to do things. 364 00:12:27,371 --> 00:12:28,205 So for example, 365 00:12:28,205 --> 00:12:30,124 we want to convince people that security's important 366 00:12:30,124 --> 00:12:31,917 and they should do the right thing. 367 00:12:31,917 --> 00:12:34,420 So should you have a techie engineer do that? 368 00:12:34,420 --> 00:12:36,630 Or should you have a psychologist do that? 369 00:12:36,630 --> 00:12:37,465 Right? 370 00:12:37,465 --> 00:12:39,467 Should you have somebody that can relate to people 371 00:12:39,467 --> 00:12:42,136 to be able to, you know, help modify behavior? 372 00:12:42,136 --> 00:12:43,179 - You should have Beyonce do that. 373 00:12:43,179 --> 00:12:44,054 - There you go. 374 00:12:44,054 --> 00:12:44,889 Absolutely. 375 00:12:44,889 --> 00:12:46,682 - I would hire her for that in a moment. 376 00:12:46,682 --> 00:12:48,893 (talking over each other) 377 00:12:48,893 --> 00:12:51,479 - And they wouldn't complain. No. 378 00:12:51,479 --> 00:12:52,313 (laughing) 379 00:12:52,313 --> 00:12:53,355 So that's one of the misconceptions, 380 00:12:53,355 --> 00:12:56,901 that you need techie cybersecurity people to fix everything. 381 00:12:57,359 --> 00:12:59,445 - And if you need people to write reports for you, 382 00:12:59,445 --> 00:13:00,905 you should be hiring journalists, 383 00:13:00,905 --> 00:13:02,907 'cause they're really good at consuming data 384 00:13:02,907 --> 00:13:04,492 and writing reports about them 385 00:13:04,492 --> 00:13:06,076 that other people want to read. 386 00:13:06,076 --> 00:13:08,078 I can teach anybody security. 387 00:13:08,078 --> 00:13:10,748 If they've got some skill that's relevant, 388 00:13:10,748 --> 00:13:12,458 I need people who can tell stories. 389 00:13:12,458 --> 00:13:13,584 - We need people to talk 390 00:13:13,584 --> 00:13:15,836 and tell stories and be entertaining. 391 00:13:15,836 --> 00:13:18,422 - Dang. Thank you for the new job. 392 00:13:18,422 --> 00:13:19,632 I appreciate you both.
393 00:13:19,632 --> 00:13:22,468 We're gonna take a break from the interview mode 394 00:13:22,468 --> 00:13:25,012 that we've been in and we're gonna play a fun and weird, 395 00:13:25,012 --> 00:13:26,680 slightly awkward little game. 396 00:13:26,680 --> 00:13:27,515 Are you game? 397 00:13:27,515 --> 00:13:28,516 - I'm game. - Absolutely. 398 00:13:29,391 --> 00:13:32,061 - Now IT experts aren't always the best 399 00:13:32,061 --> 00:13:34,688 at explaining their work in layperson's terms. 400 00:13:34,688 --> 00:13:37,149 So we created a segment that challenges our guests 401 00:13:37,149 --> 00:13:38,943 to describe what they do for a living 402 00:13:38,943 --> 00:13:41,695 in language that anyone can understand. 403 00:13:41,695 --> 00:13:44,949 You are going to explain your jobs to each other 404 00:13:44,949 --> 00:13:46,992 as if you're on a first date 405 00:13:46,992 --> 00:13:48,577 and you'll each have about 20 seconds 406 00:13:48,577 --> 00:13:49,745 to win over your partner. 407 00:13:49,745 --> 00:13:52,873 And with any luck, you'll graduate to date number two. 408 00:13:52,873 --> 00:13:55,626 That's right. It's time to play Date Night I.T. 409 00:13:56,710 --> 00:13:59,505 (R&B music) 410 00:13:59,505 --> 00:14:01,298 - So did you ever hear of SolarWinds? 411 00:14:01,298 --> 00:14:02,758 - I have. 412 00:14:02,758 --> 00:14:04,093 - Oh, well I ran security for SolarWinds. 413 00:14:04,093 --> 00:14:05,553 - I'm sorry. 414 00:14:05,553 --> 00:14:06,387 (laughs) 415 00:14:06,387 --> 00:14:07,930 - And you know that breach that occurred, 416 00:14:07,930 --> 00:14:09,265 that thing that affected the world 417 00:14:09,265 --> 00:14:10,516 that was on 60 Minutes? 418 00:14:10,516 --> 00:14:12,601 - [Andy] Even my mother heard of SolarWinds. 419 00:14:12,601 --> 00:14:16,689 - Yeah. So I ran security, or I run security, for the company. 420 00:14:16,689 --> 00:14:18,691 I manage everything associated with that 421 00:14:19,608 --> 00:14:20,442 and yep. 422 00:14:20,442 --> 00:14:23,195 It's an extremely interesting job. 423 00:14:23,195 --> 00:14:25,614 - That sounds like a really hard job. 424 00:14:25,614 --> 00:14:27,533 - I'm gonna pause this right here. 425 00:14:27,533 --> 00:14:30,077 So Tim, just look, 426 00:14:30,077 --> 00:14:31,537 I haven't been on a first date in a really long time. 427 00:14:31,537 --> 00:14:32,705 - That was a bad first date. 428 00:14:32,705 --> 00:14:35,040 - But I will say starting off 429 00:14:35,040 --> 00:14:38,669 with your most infamous failure perchance and- 430 00:14:38,669 --> 00:14:40,045 - [Tim] Not a good idea. 431 00:14:40,045 --> 00:14:40,880 - I just... 432 00:14:40,880 --> 00:14:41,714 - [Tim] Just didn't feel right. 433 00:14:41,714 --> 00:14:44,508 - Anybody who can get past that will stick around. 434 00:14:44,508 --> 00:14:46,719 - [Baratunde] Oh, is that how you received it? 435 00:14:46,719 --> 00:14:49,763 - At least, at least he faced up to it, 436 00:14:49,763 --> 00:14:51,849 but I still don't know what he does. 437 00:14:51,849 --> 00:14:52,850 - Okay. So. 438 00:14:53,726 --> 00:14:56,020 (talking over each other) 439 00:14:56,020 --> 00:14:57,062 - Should I run away at this point? 440 00:14:57,062 --> 00:14:58,314 - So why don't you give it a shot 441 00:14:58,314 --> 00:14:59,148 and let's see how- 442 00:14:59,148 --> 00:15:00,649 - [Andy] Am I giving it a shot on his or on mine? 443 00:15:00,649 --> 00:15:02,526 - On yours, it's your turn. - Okay.
444 00:15:02,526 --> 00:15:07,031 So I am like a landscaping architect for computers. 445 00:15:07,031 --> 00:15:09,575 My job is to help tell other people 446 00:15:09,575 --> 00:15:11,493 who are building big computer networks, 447 00:15:11,493 --> 00:15:13,329 what's the right way to do it 448 00:15:13,329 --> 00:15:15,289 so that they can deal with weeds 449 00:15:15,289 --> 00:15:16,999 in a more sustainable fashion. 450 00:15:16,999 --> 00:15:20,044 Weeds being the bad things that would happen to computers. 451 00:15:21,128 --> 00:15:22,504 - Sounds pretty boring. 452 00:15:22,504 --> 00:15:24,173 (laughing) 453 00:15:24,173 --> 00:15:26,091 - Oh, I did not see that coming. 454 00:15:26,091 --> 00:15:28,260 I was like, you had me at landscaper. 455 00:15:28,260 --> 00:15:29,511 - Landscaping? 456 00:15:29,511 --> 00:15:31,263 How are you gonna make any money landscaping? 457 00:15:31,263 --> 00:15:34,350 - Harsh. Harsh crowd. Man. 458 00:15:35,225 --> 00:15:37,061 Well, at least you know what he does. 459 00:15:38,437 --> 00:15:40,940 All he knows about you is that you missed. 460 00:15:40,940 --> 00:15:42,608 (laughing) 461 00:15:42,608 --> 00:15:45,569 Second date or no, it's up to you. What do you think? 462 00:15:45,569 --> 00:15:46,487 - Let's do it again. 463 00:15:46,487 --> 00:15:47,988 - Okay. We'll give this a try 464 00:15:47,988 --> 00:15:50,866 'cause I don't know if anybody else is gonna take me. So. 465 00:15:50,866 --> 00:15:51,700 - There you go. 466 00:15:51,700 --> 00:15:53,744 - Thank you both for playing our weird, awkward, 467 00:15:53,744 --> 00:15:56,497 and sometimes fun game, Date Night I.T. 468 00:15:57,790 --> 00:16:01,126 Here's an incident that a lot of us are experiencing 469 00:16:01,126 --> 00:16:04,254 directly or seeing in the news or both. 470 00:16:04,254 --> 00:16:05,881 Ransomware. - Yep. 471 00:16:05,881 --> 00:16:10,135 - And I've heard you describe it as a self-inflicted wound. 472 00:16:10,135 --> 00:16:11,595 I'd love for you to expand on 473 00:16:11,595 --> 00:16:12,972 and explain what you mean by that. 474 00:16:12,972 --> 00:16:14,556 - So many of our enterprises 475 00:16:14,556 --> 00:16:17,893 have this sort of flat monolithic administration model. 476 00:16:17,893 --> 00:16:21,605 So you have IT admins who have root access to every machine. 477 00:16:21,605 --> 00:16:23,315 So all it takes is for the adversary 478 00:16:23,315 --> 00:16:26,527 to compromise that root access once. 479 00:16:26,527 --> 00:16:28,737 So that's what happened with NotPetya. 480 00:16:28,737 --> 00:16:30,948 A number of places get compromised 481 00:16:30,948 --> 00:16:33,409 because there's this accounting software 482 00:16:33,409 --> 00:16:34,535 that they outsourced to somebody, 483 00:16:34,535 --> 00:16:36,787 it downloads an update, it's infected. 484 00:16:36,787 --> 00:16:39,206 An admin happens to be logged into that machine, 485 00:16:39,206 --> 00:16:40,541 doing something, 486 00:16:40,541 --> 00:16:42,376 their credentials are stolen, 487 00:16:42,376 --> 00:16:44,503 and your entire enterprise just shuts down. 488 00:16:45,462 --> 00:16:48,340 Like that's a failure on our part. 489 00:16:48,340 --> 00:16:50,092 As IT professionals, 490 00:16:50,092 --> 00:16:52,386 we should not be designing systems 491 00:16:52,386 --> 00:16:54,972 where everybody trusts the administrators.
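To make Andy's point about the flat administration model concrete: the failure mode he describes is a single credential whose reach spans every segment of the enterprise, so one stolen credential takes everything down. Below is a minimal, hypothetical blast-radius sketch in that spirit; the inventory, account names, and output format are invented purely for illustration and are not SolarWinds, Akamai, or Orca tooling.

```python
from collections import defaultdict

# Hypothetical inventory: which admin credential works on which machine,
# and which segment ("seed", in Tim's pomegranate terms) each machine belongs to.
ADMIN_GRANTS = {
    "it-admin":  ["hr-01", "fin-01", "dev-01", "dev-02"],  # flat model: root everywhere
    "hr-admin":  ["hr-01"],
    "dev-admin": ["dev-01", "dev-02"],
}
SEGMENT_OF = {"hr-01": "hr", "fin-01": "finance", "dev-01": "dev", "dev-02": "dev"}

def blast_radius(grants, segment_of):
    """Return, for each credential, the set of segments it can administer."""
    reach = defaultdict(set)
    for cred, machines in grants.items():
        for machine in machines:
            reach[cred].add(segment_of[machine])
    return reach

if __name__ == "__main__":
    all_segments = set(SEGMENT_OF.values())
    for cred, segments in blast_radius(ADMIN_GRANTS, SEGMENT_OF).items():
        warning = "  <-- one stolen credential reaches every segment" if segments == all_segments else ""
        print(f"{cred}: {sorted(segments)}{warning}")
```

In this toy run, "it-admin" is flagged because its reach equals the whole environment, which is exactly the design Andy argues against: scoping each admin credential to one segment keeps a single compromise from cascading.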
492 00:16:54,972 --> 00:16:57,349 When I talked about users having their laptop 493 00:16:57,349 --> 00:16:58,934 as part of their ecosystem, 494 00:16:58,934 --> 00:17:01,645 I literally mean we shouldn't have control of that 495 00:17:01,645 --> 00:17:03,480 from a central IT shop. 496 00:17:03,480 --> 00:17:05,149 We should have it isolated. 497 00:17:05,149 --> 00:17:07,109 So if IT goes down, 498 00:17:07,109 --> 00:17:08,736 at least our users are still up. 499 00:17:09,903 --> 00:17:12,656 - So when I look at ransomware, 500 00:17:12,656 --> 00:17:15,993 I look at it as a much more efficient business model. 501 00:17:15,993 --> 00:17:17,161 (laughs) 502 00:17:17,161 --> 00:17:19,121 - The shakedown has evolved. 503 00:17:19,121 --> 00:17:21,206 - Think about it seriously, right? 504 00:17:21,206 --> 00:17:22,041 When you think about it, 505 00:17:22,041 --> 00:17:24,752 what'd you have to do before ransomware, right? 506 00:17:25,127 --> 00:17:28,630 - You had to get in a car, get a gun, go stick a place up. 507 00:17:28,630 --> 00:17:32,009 - You had to get into systems, steal data, 508 00:17:32,009 --> 00:17:35,054 sell data to somebody that was gonna pay for it. 509 00:17:35,054 --> 00:17:37,347 So you had many chains and- 510 00:17:37,347 --> 00:17:40,017 - You're targeting your customers for different- 511 00:17:40,017 --> 00:17:43,520 - So now, now you just go in, 512 00:17:43,520 --> 00:17:44,938 you compromise the system, 513 00:17:44,938 --> 00:17:46,356 you encrypt it, and you get paid. 514 00:17:46,356 --> 00:17:48,442 - So you steal from a company 515 00:17:48,442 --> 00:17:50,486 and then force them to buy it back, 516 00:17:50,486 --> 00:17:52,821 which is how hip-hop works. 517 00:17:52,821 --> 00:17:55,574 (talking over each other) 518 00:17:55,574 --> 00:17:56,408 (laughing) 519 00:17:56,408 --> 00:17:57,951 Good job, music industry. You're ransomware. 520 00:17:57,951 --> 00:18:01,246 - So with that, right now, that model works, right? 521 00:18:01,246 --> 00:18:02,998 That model works because of Bitcoin. 522 00:18:02,998 --> 00:18:05,167 That model works because you can essentially 523 00:18:05,167 --> 00:18:07,127 usually get paid, they're doing a little bit better 524 00:18:07,127 --> 00:18:08,420 at getting money back. 525 00:18:08,420 --> 00:18:10,339 - Are happier to pay than deal with- 526 00:18:10,339 --> 00:18:11,173 - Absolutely. 527 00:18:11,173 --> 00:18:12,800 So you get a good payment. 528 00:18:12,800 --> 00:18:15,094 Now, the thing we're seeing though, 529 00:18:15,094 --> 00:18:17,638 is ransomware is getting more sophisticated. 530 00:18:17,638 --> 00:18:21,433 It's getting to the level often of sophisticated attacks. 531 00:18:21,433 --> 00:18:22,935 It's not simple attacks anymore, 532 00:18:22,935 --> 00:18:25,395 because if I get a $5 million paycheck, 533 00:18:25,395 --> 00:18:27,439 I can afford to spend a few hundred thousand 534 00:18:27,439 --> 00:18:29,733 to execute that. 535 00:18:29,733 --> 00:18:34,321 And so that's our worry for the future is that the, 536 00:18:34,321 --> 00:18:37,741 you know, the attacks become more sophisticated, 537 00:18:37,741 --> 00:18:40,577 bigger paydays, and more time spent. 538 00:18:40,577 --> 00:18:42,454 - Because these hackers have a growth mindset. 539 00:18:42,454 --> 00:18:43,288 - Absolutely. 540 00:18:43,288 --> 00:18:45,624 So they're all about business, all about growth, 541 00:18:45,624 --> 00:18:48,627 all about meeting their fiscal plans.
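Tim's "more efficient business model" point is, at bottom, simple arithmetic: when a single payout dwarfs the cost of mounting the attack, attackers can fund longer, more sophisticated campaigns. A back-of-the-envelope sketch using only the figures mentioned above (a $5 million payout against "a few hundred thousand" in costs); the payment probability is a made-up placeholder, not a real statistic.

```python
# Back-of-the-envelope ransomware economics, using the figures from the
# conversation: a $5M payout versus "a few hundred thousand" to execute.
# The payment probability is a made-up placeholder, not a real statistic.
payout = 5_000_000        # ransom paid by the victim
attack_cost = 300_000     # rough cost to mount the attack
p_paid = 0.5              # hypothetical chance the victim actually pays

expected_profit = p_paid * payout - attack_cost
roi = expected_profit / attack_cost
print(f"Expected profit: ${expected_profit:,.0f} (ROI ~{roi:.1f}x)")
# Even at a 50% payment rate this clears about $2.2M per attempt, which is
# why attackers can afford longer, more sophisticated campaigns.
```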
542 00:18:48,627 --> 00:18:50,337 - Now what do their boards have to say? 543 00:18:50,337 --> 00:18:51,922 - Absolutely, what do their boards say? 544 00:18:51,922 --> 00:18:55,092 Their boards say they expect to see 40% growth. 545 00:18:55,092 --> 00:18:57,845 So, and I'm not kidding. 546 00:18:57,845 --> 00:19:00,722 - How does the changing nature of these threats 547 00:19:00,722 --> 00:19:02,933 affect how we keep up, Andy, 548 00:19:02,933 --> 00:19:04,059 and how we shift? 549 00:19:04,059 --> 00:19:06,979 - So I think we keep up by hiring people 550 00:19:06,979 --> 00:19:09,398 who have different experiences, 551 00:19:09,398 --> 00:19:10,732 because they'll understand that model. 552 00:19:10,732 --> 00:19:12,276 Like your reference about hip-hop, 553 00:19:12,276 --> 00:19:13,527 like to you, that was instant. 554 00:19:13,527 --> 00:19:15,696 To me, like I get it once you said it, 555 00:19:15,696 --> 00:19:17,197 but I never would have come up with that and said, 556 00:19:17,197 --> 00:19:19,575 oh, hey, that's a similar business model, 557 00:19:19,575 --> 00:19:22,327 but maybe that's an insight that's helpful in the boardroom. 558 00:19:22,327 --> 00:19:23,412 - Yeah. 559 00:19:23,412 --> 00:19:27,457 One of the things that we also just need to consider, right, 560 00:19:27,457 --> 00:19:32,212 is our new model is work real hard, real fast, 561 00:19:32,212 --> 00:19:33,589 all the time. 562 00:19:33,589 --> 00:19:35,465 Right? All the time. 563 00:19:35,465 --> 00:19:36,800 Yeah. One story somebody told me was they went 564 00:19:36,800 --> 00:19:39,344 into IBM, took over IBM. 565 00:19:39,344 --> 00:19:40,929 And there was a guy there 566 00:19:40,929 --> 00:19:43,390 who just had his feet up on his desk. 567 00:19:43,390 --> 00:19:45,726 And the CEO walked through again, 568 00:19:45,726 --> 00:19:47,311 he had his feet up on his desk, 569 00:19:47,311 --> 00:19:50,647 and it's like, what's up with that guy? 570 00:19:50,647 --> 00:19:53,567 Said, well, last time he won the Nobel Prize, 571 00:19:53,567 --> 00:19:56,236 he had his feet up on the desk for six months. 572 00:19:56,236 --> 00:19:58,530 So he said, don't interrupt him. 573 00:19:58,530 --> 00:20:03,535 But the bottom line is thinking right now is underrated. 574 00:20:03,702 --> 00:20:06,496 The time to think, the time to discover. 575 00:20:06,496 --> 00:20:08,874 You know, John Adams also, you know, if you look at it, 576 00:20:08,874 --> 00:20:11,001 he kept a diary of every day of his life 577 00:20:11,001 --> 00:20:14,463 and in probably 40% of the entries, 578 00:20:14,463 --> 00:20:15,589 one word: thinking. 579 00:20:16,715 --> 00:20:19,468 So when we think, 580 00:20:19,468 --> 00:20:21,261 we often come up with new ideas, 581 00:20:21,261 --> 00:20:25,140 we think with groups of people, we talk about things. 582 00:20:25,140 --> 00:20:26,558 And in order to combat, 583 00:20:26,558 --> 00:20:28,727 one of the things people ask me also very often 584 00:20:28,727 --> 00:20:32,105 is so what's, how do you describe the adversary? 585 00:20:32,105 --> 00:20:34,483 Right? How do you describe the adversary? 586 00:20:34,483 --> 00:20:36,526 And the word I come up with is: thoughtful. 587 00:20:36,526 --> 00:20:39,571 Very, very, very, very thoughtful. 588 00:20:39,571 --> 00:20:43,533 Not one thing was done that didn't need to get done.
589 00:20:43,533 --> 00:20:47,079 Not one piece of noise was made that didn't need to be made. 590 00:20:47,079 --> 00:20:51,375 The code that they dropped waited 14 days before it ran. 591 00:20:51,375 --> 00:20:53,293 It would not run inside of our environment. 592 00:20:53,293 --> 00:20:54,586 Again, thoughtful. 593 00:20:54,586 --> 00:20:58,298 They attacked a virtual machine that goes away. 594 00:20:58,298 --> 00:21:00,425 It's not there all the time. Thoughtful. 595 00:21:00,425 --> 00:21:03,470 They didn't attack the source code control system 596 00:21:03,470 --> 00:21:05,222 because they knew we would see it. 597 00:21:05,222 --> 00:21:07,557 So they thought and thought and thought 598 00:21:07,557 --> 00:21:09,351 and thought and thought, 599 00:21:09,351 --> 00:21:11,019 and we have to out-think them. 600 00:21:11,770 --> 00:21:13,188 - You need time to do that. 601 00:21:13,188 --> 00:21:15,065 - Thinking needs to be part of your program. 602 00:21:15,065 --> 00:21:16,275 It needs to be something 603 00:21:16,275 --> 00:21:20,153 that you are dedicating time to. 604 00:21:20,153 --> 00:21:24,741 It's not just about doing it, it's about thinking too. 605 00:21:24,741 --> 00:21:26,368 - Can't think of a better way to end. 606 00:21:26,368 --> 00:21:29,871 Thanks for joining us in our garage for another episode 607 00:21:29,871 --> 00:21:31,915 of Lenovo Late Night I.T., 608 00:21:31,915 --> 00:21:34,293 where you'll always get a fresh, unfiltered look 609 00:21:34,293 --> 00:21:36,461 at what's going on in the tech industry. 610 00:21:36,461 --> 00:21:39,381 And thanks to our guests, Andy Ellis and Tim Brown. 611 00:21:39,381 --> 00:21:42,092 I'm Baratunde Thurston. And I'll see you next time. 612 00:21:42,092 --> 00:21:44,970 (upbeat music)