Welcome to Nasdaq TradeTalks, where we meet with the top thought leaders and strategists in emerging technologies, digital assets, the regulatory landscape, and capital markets. I'm your host, Jill Malandrino, and joining me on the desk at the Nasdaq MarketSite to wrap our in-studio coverage for Cybersecurity Awareness Month, which spotlighted the evolving landscape of digital threats and the strategic responses shaping financial services, we have Eunice Nunez, global cybersecurity executive; Colonel Giorgio Xavier Kitara, CIO and CISO of iMerit; as well as Dr. Alexander Yampolsky, co-founder and CEO of SecurityScorecard. We're here to discuss how the role of the chief information security officer, known as the CISO, has evolved from a technical risk manager to a business enabler. It's great to have all of you with us. Welcome to TradeTalks. Let's kick it off with you. When we think about the role of the CISO, it's no longer just to manage technical risk. It really is to enable the business to grow and be competitive.

Jill, thanks for having me and the rest of the team here today. I would say that over the years, the role of the CISO, or the senior security leader in an organization, has evolved into being more of a security enabler and a collaborator. The CISO, as the role evolves, has to be a communicator and really understand how the organization makes its money, how its culture operates, and what strengths or weaknesses it has, and be able to tell that story effectively so that we can help prioritize the work we need to do. Oftentimes we're challenged by not having enough resources, by not having the right people in place. But it's good to understand at a high level where we're going as an organization and where the market is going. And I would say that over the last year, the two most popular letters in the alphabet are AI.
And I think that as cybersecurity executives, our role is to make sure that we're leaning in, understanding our environment, and then able to move forward with prioritization, conviction, and building a strong culture of cybersecurity awareness.

And George, a lot of it comes down to, is this a delicate balance between innovation and trust within the org?

It is. And we have to be able to enable innovation. So at iMerit we do what's necessary for the client. That's the difference between checking a box and building trust. So you have to move forward, right? Let the company innovate and, at the same time, put the controls in place. But they've got to move forward with innovation, making money. And that's the challenge for the CISO today and going forward.

Yeah. And it can't be overstated just how important this role is, and to be able to communicate that, because one outage can cascade not only through your enterprise but across clients, partners, and vendors.

The role of a CISO became so much more complex. You have to be a translator between the technical teams and the board of directors, but also, we're no longer protecting just our own companies. We're all interconnected to each other. Your data is in a cloud. 65% of data breaches today happen due to negligence of third parties. So the CISO needs to enable innovation with AI while keeping the company secure; act as a translator between the board and the technical team; make sure you don't do it on increased budgets, because everybody's keeping their budgets in check; and, on top of it, worry not just about their own company but also about the supply chain ecosystem, making sure that your third and fourth parties are secure. So it's a complicated job.
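As a minimal illustration of the third- and fourth-party exposure described above, the sketch below walks a hypothetical supplier graph, where your vendors' vendors are your fourth parties. The company names and graph structure are invented for illustration.

```python
from collections import deque

# Hypothetical supplier graph: company -> its direct vendors (third parties).
# A vendor's own vendors are your fourth parties.
VENDORS = {
    "acme-bank": ["cloud-host", "payroll-saas", "data-analytics"],
    "cloud-host": ["dns-provider", "cdn-provider"],
    "payroll-saas": ["cloud-host"],
    "data-analytics": ["ml-platform"],
}

def parties_within(company: str, max_hops: int = 2) -> dict[int, set[str]]:
    """Breadth-first walk of the supplier graph, grouping vendors by hop count
    (hop 1 = third parties, hop 2 = fourth parties)."""
    seen = {company}
    tiers: dict[int, set[str]] = {}
    queue = deque([(company, 0)])
    while queue:
        node, hops = queue.popleft()
        if hops == max_hops:
            continue
        for vendor in VENDORS.get(node, []):
            if vendor not in seen:
                seen.add(vendor)
                tiers.setdefault(hops + 1, set()).add(vendor)
                queue.append((vendor, hops + 1))
    return tiers

# Tier 1 holds the third parties, tier 2 the fourth parties.
print(parties_within("acme-bank"))
```

Even in this toy example, the fourth-party tier is larger than most teams track by hand, which is the point the panel is making about supply chain risk.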
Yeah, it certainly is. And I would also argue that it has to be a cybersecurity culture within the org, from the top down, because security by design, I think, is going to be a competitive advantage moving forward.

Absolutely. And, you know, when people ask me for advice, when boards of directors ask me how you keep a company secure, the one simple piece of advice I give is: make sure your CISO gets to present at your audit and risk committee. Because if the CEO sees that the CISO is at the board meeting, the CEO is going to provide the budget, and you're going to provide enablement. It all starts with a culture. As Peter Drucker used to say, culture eats strategy for breakfast. So it starts at the top. If the board and the CEO are talking about cyber, and if they're technologically literate, the change happens. You know, if in the middle of a board meeting you raise a hand and say, "What's EBITDA?", people are going to give you a strange look; you're supposed to know what EBITDA is. But if you're a board member and, in the middle of a board meeting, you raise a hand and say, "I'm not sure what a denial-of-service attack is," it's still totally normal. So the onus is also on the board and the CEO to make sure that we don't just speak the language of numbers, but we become AI and cybersecurity literate in our conversations.

And I think that's one of the key things you mentioned, Jill, earlier today, which is that as the role continues to evolve, it's not just about technology anymore. Cybersecurity cuts across multiple components of an organization. It's reputational, it's financial, it's HR; it's all risk.
And what I'm seeing, being one of many directors right now on public boards and private boards, and coming from a CFO background, is that we need to not only have a seat at the table, but we need to be part of the planning, the communication, and the go-to-market strategies as well.

Yeah. And, you know, part of that too, George, is that you have to be able to collaborate across those functions, to your point, because I think companies almost get in their own way because they operate in silos.

That's right. And I don't operate in silos. So one thing we do at iMerit is actually meet with everybody on the executive leadership team and make sure we have good relationships. When a security incident happens, it's not going to be just security; it's going to be IT, legal, finance, engineering, all working together to figure out how to do containment, right? We need to practice what we preach, not just at the executive leadership level but all the way down. So I talk to the people on my team, talk to the folks across the company, and make sure they know what's happening, right? If you see concerns, talk to them about it. But we have to work together on this. So it's very, very important that we reach out of our silos. If somebody wants to work on a new technology or new innovation, don't hold them back. Figure out: how do we protect that? How do we protect their flank as they move forward?

Yeah. Well, I mean, it always comes back to people. They're the front line, and that's where the vulnerabilities are going to exist, which was my point before: it has to be a cyber-first, privacy-first mentality as the culture of the company.

Absolutely. And so it starts with people, and training people, and making sure that people know what not to fall for.
However, it's becoming more and more complicated right now with the proliferation of deepfakes. Yes, somebody can take a video of Eunice or myself or yourself, and it looks just like us, sounds just like us, places a phone call and says, hey, can you please initiate the wire transfer? So we need to evolve our techniques. We need to train people, but we also need to train them in different ways, just like we teach our children: if you walk on a dark street, maybe look around, or don't walk home from school alone if it's after hours. We need to start embedding this education into everyday life, at school and in companies as well.

I agree, and I think that the role of awareness and training is not just per discipline; it should be holistic across the board. One of the things that I would like to see more of, as you do awareness, training, and education, is looking at it across the organization rather than as a compliance assessment or a one-off cyber awareness training. And we have a lean-in moment today where we're seeing, as you mentioned, advanced attacks on integrity. I'm dubbing it agentic security process enablement: we need to lean in to make sure that we are enabling the processes we run today, whether they're manual or human, and leveraging AI to help us deal with deepfakes and with the advanced attacks that are coming through our organizations today. Alex, I agree with you.

Yeah. I mean, you know, George, you can see it; it's not too uncommon, it's happening these days: somebody in a high-level position at a company makes a statement, and it's a deepfake. It can move markets, it can create geopolitical incidents, and so forth.
So I think when you look at incident response plans and crisis management, it's not a set-it-and-forget-it exercise. It's something that continuously needs updating; just like you're adapting to AI, you have to adapt your response and proactive mitigation.

So we definitely have a security incident response plan, and we definitely practice it. But we also need to be looking at the frontier of how our Scholars Program works. We look for fraud. We proactively make sure, in screening, that we're blocking anything from nation states trying to conduct warfare, and that people coming in are who they say they are, to make sure our clients are protected. So we take security very seriously, but more importantly, we make sure we understand the changing nature of what the technology looks like. I myself am doing a doctorate in cybersecurity data science as well as agentic AI, and I'm trying to encourage my own team members to look at what I call the Golden Triangle: practical experience, certifications, and ongoing education, with AI fluency in the middle of all that, right? We need to move forward like that. I set the example, they follow, and we talk together and figure out: hey, what don't we know?

Yeah. Well, part of the challenge, too, is that AI has erased the gap between the good actors and the bad actors, and it really does come down to who is faster, who's more successful, who's well funded. And sometimes that can skew to the bad actors' side, because they don't necessarily have all the layers of compliance and approvals, just like a nation state wouldn't.

That's exactly it. For example, there are now LLM tools for hackers where you can just type, "Can you give me an exploit against Citrix devices?" and all of a sudden you don't need to run sophisticated scripts; it just gives you an exploit.
So the bar for being a sophisticated hacker just dropped. Also, as companies adopt AI, people are rushing to adopt it; it's a boardroom mandate. But security is an afterthought. There's a proliferation of shadow AI, just like we had shadow IT, where an employee might be sitting in a bank and upload sensitive financial ledger documents just to get help, maybe to ChatGPT or, worse, off to DeepSeek in China. And all of a sudden your sensitive data is leaked.

That's right. It's not protected.

And I just think that we don't design AI right now with security in mind. Just like when the internet appeared, it wasn't designed with cybersecurity in mind; we just rushed to adopt it.

I agree. The last time I reviewed the numbers on this, I think it was July, the big AI players had spent something like $300 billion advancing their AI capabilities. And my question was, automatically: how much of that was invested in cybersecurity? The number was minuscule, probably about 50 basis points, roughly $1.5 billion. So our lean-in moment today truly is about making sure that as we enable the business with agentic AI and so on, we are bringing security in from the inception, at this inflection point, so that we don't wait until the train has already left the station and we need to catch up. We need to make sure we're thinking secure by design as we do this. Otherwise, I feel that we'll be exactly where we were 15 or 20 years ago with data loss prevention and email protection. We're right back to where we started. Now the avenue is no longer email, and it's not just the browser; the avenues now are these AI companies.

But I mean, there is no choice but to think cyber first and develop and model that way. Think about the rush again.
We were talking about this off camera, but the dividing line was November 2022, when generative AI became part of everyone's vernacular, right? And there was such a rush, whether internal or external, to get product out there. Perhaps we didn't think about the risk, or maybe we didn't even know the unknowns, right? And those gaps and vulnerabilities are starting to be exposed, and hopefully organizations learn from that. You have to model out cyber first.

You have to model out cyber first. But more importantly, you've got to figure it out with your employees, to your point: how do we protect the AI? How do we protect our data? What data is going to go into AI? What's acceptable use of AI? And, to your point about shadow AI, making sure our tools can detect it. There are techniques attackers are using today, like IP and proxy obfuscation and such, that we can quickly detect and say that person is not coming from where they claim, right? So you can use tools to detect that. At the same time, education, education, education doesn't change, right? So you want to make sure you've got a good education program for your organization. But this technology is going to evolve and evolve and evolve, right? It's not once and done. So you do that, and we've got to make sure we understand what's happening. Shadow AI is a real thing. If you don't partner with your divisions and with the people in your company, they're going to go find their own solution, bring it in, and get their own shadow AI up and running. And now you're suddenly finding your data is in the cloud; somebody else's.
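A minimal sketch of the kind of shadow-AI detection described above: scan egress proxy logs for traffic to generative-AI services from users who were never granted access. The domain watchlist, log columns, and CSV format are assumptions for illustration; a real deployment would rely on the organization's proxy or CASB tooling.

```python
import csv

# Hypothetical watchlist of generative-AI endpoints; a real list would come
# from your proxy/CASB vendor and be far larger.
GENAI_DOMAINS = {"chat.openai.com", "api.openai.com", "chat.deepseek.com"}

def flag_shadow_ai(proxy_log_path: str, sanctioned_users: set[str]) -> list[dict]:
    """Scan a proxy log (assumed columns: user, dest_host, bytes_out) and flag
    traffic to AI services from users who were never granted access."""
    findings = []
    with open(proxy_log_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["dest_host"] in GENAI_DOMAINS and row["user"] not in sanctioned_users:
                findings.append(row)  # candidate shadow-AI use; route to review
    return findings

# Example: flag_shadow_ai("egress.csv", sanctioned_users={"alice", "bob"})
```

The point is not the specific check but the partnership George describes: the flagged rows start a conversation with the division, not a punishment.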
But Alex, it also comes back to this: even with all these new tools being introduced, not every single person within the organization needs to have them. They haven't used them before, or they don't relate to their function. Is it necessarily prudent to give them access if it's something that doesn't even make sense for their role?

Well, I think it's just as you said: the old concepts become new again. We've had the least-privilege principle in cybersecurity for many years, which basically means you get access to technology and tools only on a need-to-use basis. So you're absolutely right. However, I do think, as I look into the future, every company is going to be an AI company. Just like we don't go to each other and say, "Oh, I use ML, that's my competitive advantage," because everybody in the company uses ML, I think ten years into the future, or maybe even less, every company is going to be an AI company. That's why so many people are rushing to adopt it, and that's why a lot of guardrails come as an afterthought.

I think that right now, and we've also seen a lot of acquisitions in the AI protection space over the last few months, we obviously don't have all the tools we require today based on the capabilities, so the human becomes the control. So I have a bit of a contrarian view here: you need to provide the training for the tools you're going to use to enable your people. I don't think you just allow access without proper training. So I think you need to do both here, because we are now leveraging humans as our number one control until the technology and the security tools are in place to help us cover our bases. They're not there yet today. But as we do that, I feel that education, training, access, and the limited controls we have in place today can help us get there faster, Jill.
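A minimal sketch tying together the two points above, least privilege and training before access: a user gets an AI tool only if their role is entitled to it and they have completed the matching training. The roles, tools, and entitlement mapping are hypothetical.

```python
# Hypothetical role-to-tool entitlements illustrating least privilege:
# grant an AI tool only if the role needs it AND (per the training-first
# point) the user has completed that tool's training module.
ENTITLEMENTS = {
    "analyst": {"code-assistant"},
    "support": {"chat-summarizer"},
    "finance": set(),  # no generative-AI tools needed for this role
}

def may_use(role: str, tool: str, completed_training: set[str]) -> bool:
    """Allow access only when the role is entitled to the tool and the
    user has finished the corresponding training."""
    return tool in ENTITLEMENTS.get(role, set()) and tool in completed_training

print(may_use("analyst", "code-assistant", {"code-assistant"}))  # True
print(may_use("finance", "code-assistant", {"code-assistant"}))  # False: not entitled
print(may_use("analyst", "code-assistant", set()))               # False: untrained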
Yeah, but again, it has to be communicated. How do you quantify what your cyber risk is financially? How do you express the value that we are bringing to the org, when it's very abstract to say, if we spend this amount of money, we won't lose XYZ should we have a breach? As it relates to reputation, it's very intangible. So how do you present it? Boards and the C-suite want to see this. What are those metrics? How do you quantify success as a CISO?

Dollar-cost cyber quantification of the risk: assigning a dollar amount. If this system goes down, if we lose this data, what is the litigation cost? What is the downtime? You mentioned AWS here earlier, but my point is that you have to talk to the board in terms of numbers: if this system is not available, what does the cost look like? And you have to be able to balance your budget against that, right? At the same time, you brought up a good point about human feedback and reinforcement learning. iMerit does a lot of work to make sure we train AI with reinforcement learning from human feedback. What that means is that we have people looking at the AI, looking at the decisions the AI makes, to make sure they're correct, right? Validating the AI model, validating the data that's coming through, validating its outputs. It's one of the things we excel at; our Scholars Program does quite a bit of work around that. And you've got to be able to show that. Otherwise, what's going to happen is you're going to put this model in place, put data into it, and the next thing you know, you have a disaster. You've got to make sure all those controls are in place, that there's actually a guardrail confirming the AI and the model are working as they should be. But to the board, you've got to be able to show what the dollar-cost quantification is if an incident happens or if data gets breached, what that looks like, right?
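One conventional way to put a number on the "if this system goes down" framing above is annualized loss expectancy: single loss expectancy (asset value times exposure factor) multiplied by the expected yearly frequency of the event. The figures in the sketch below are placeholders, not anyone's actual numbers.

```python
def annualized_loss_expectancy(asset_value: float,
                               exposure_factor: float,
                               annual_rate_of_occurrence: float) -> float:
    """Classic risk formula: SLE = asset value x exposure factor,
    ALE = SLE x expected incidents per year."""
    single_loss_expectancy = asset_value * exposure_factor
    return single_loss_expectancy * annual_rate_of_occurrence

# Placeholder figures: an incident costing 40% of a $5M system's value
# (downtime plus litigation), expected once every two years.
ale = annualized_loss_expectancy(5_000_000, 0.40, 0.5)
print(f"ALE: ${ale:,.0f}")  # ALE: $1,000,000 -> compare against the control budget
```

That final comparison, expected annual loss versus the cost of the controls that would prevent it, is the balancing act the panel describes next.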
So you've got to be able to weigh the cost of a breach against the cost of controls, right? So it's a balancing act.

Yeah. Alex, the same question to you as well. And, you know, I've discussed this in prior conversations, but being able to quantify that technical performance and what it equates to for the value of the business, how do you benchmark that? Okay, so we spent X, Y, Z on cybersecurity this year and we didn't have a breach, and we know we've saved some arbitrary number. How do you justify those costs going forward if you didn't have a breach? How do you make those business cases?

Yeah. So, well, that's actually our business. We're in the business of providing trusted, objective KPIs for companies to measure and quantify risk, and we're used by 70% of the Fortune 100. You have to have data based on evidence. You have to publish your algorithm; the algorithm needs to be transparent. You need to expose accuracy rates: what are the corner cases? I'm a strong believer that you need KPIs, just like if you drive a car you need a speedometer to know how fast you're driving. But whenever you start having just one KPI, you can get blindsided; you can start gaming the KPI. So a KPI is a data point, and I think it's imperative for a board member to get reports of objective, trusted KPIs on how you're doing and how your peers are doing, showing, just as George said, the downside if an event happens: here's how much money you can lose. But having said that, you also need to articulate a strategy. Here's my strategy, here's where we're going, here's what I know, here's what I don't know. A single KPI is very valuable, but it's not always going to show the full picture. It's all about the context.
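A minimal sketch of the multi-KPI idea above: report several security KPIs side by side against a peer benchmark rather than collapsing them into one number that can be gamed. The metric names, values, and peer medians are invented for illustration.

```python
# Invented KPI figures; the point is presenting several measures together
# with a peer benchmark, since any single KPI can be gamed or blindside you.
KPIS = {
    # metric: (our value, peer median, higher_is_better)
    "patching cadence (days to remediate)": (14, 21, False),
    "phishing simulation failure rate (%)": (4.0, 6.5, False),
    "MFA coverage (%)": (97, 92, True),
}

def board_report(kpis: dict) -> None:
    """Print each KPI with its peer comparison for a board-level readout."""
    for name, (ours, peer, higher_better) in kpis.items():
        ahead = ours > peer if higher_better else ours < peer
        print(f"{name}: us {ours} vs peers {peer} -> {'ahead' if ahead else 'behind'}")

board_report(KPIS)
```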
And we talk about having the appropriate skill sets for AI, and upskilling and reskilling and so forth. What about the cyber function? Is the talent available to execute on this? I believe there are, what, half a million cyber roles that need to be filled?

So I think, as I mentioned earlier, and I'm going to keep talking about this, it is our moment to lean in. I know that a lot of cyber professionals become like fanatics. I love the Mets, so I understand being a fan. They become fans of tools, of solutions. I think we have to fall in love with the problem again and then leverage the latest techniques, the latest tools that we have in place. And today the lean-in truly is AI-enabled agentic security process enablement; that is what we need to lean into, because it's not going to be enough to use the same techniques and tools we had in the past against the attackers and what they're using today. We have to lean in. So that's my thesis from now moving forward.

I would also say that it doesn't necessarily have to be, like, a hardware or software role when you go into cyber. Whether it's legal, whether it's policy, risk management, anything; it could be a reporter covering cybersecurity and raising awareness. So it doesn't necessarily have to be a plug-and-play role either, which I think opens up more opportunities.

It's a multi-dimensional role. Cyber risk and cybersecurity today are what operational risk management was 15 years ago. It cuts across everything in the organization. So the capabilities, I would say, for managing a cyber career, and a function that's effective within organizations, cut across all components of a modern organization. Yeah.

I was going to add to that. To me, this is not just my passion; it's also what I enjoy doing, but it's my profession too. To me, it's like being a doctor or a lawyer. I think we need at least five years of experience working in IT.
That means the OSI model, infrastructure, understanding networks and protocols, how data flows across your network in and out, north, south, east, west, and understanding that part, because that's what we're really protecting, right? That data. And then, at the same time, we need to start learning about cybersecurity. I think a lot of people want to rush into cybersecurity, saying, hey, I want to learn cybersecurity in a couple of weeks or a couple of months. I love the ambition, but I think they need to get the fundamentals down. At the same time, as a company, once you have the fundamentals down: how are we going to protect that data? Right? Patching, vulnerability management, who's touching your data, access controls, the principle of least privilege. These things need to be practiced and preached to entire organizations, so they know why we're doing it and how we're doing it, and so we can lead by example.

All right. Appreciate everyone's insight. Thanks for joining us on TradeTalks, and thanks for joining me from MarketSite. I'm Jill Malandrino, global markets reporter at Nasdaq.