Jill Malandrino: Welcome to Nasdaq Trade Talks, where we meet with the top thought leaders and strategists in emerging technologies, digital assets, the regulatory landscape, and capital markets. I'm your host, Jill Malandrino, and joining us this afternoon we have Ranjan, CEO of Sardine AI, as well as Joe Robinson, CEO of Hummingbird. We're here to discuss the automated future of compliance. It is great to have both of you with us. Welcome to Trade Talks. So let's kick it off with you, Ranjan. Let's discuss why we're seeing such a rise in AI fraud incidents.

Ranjan: Yeah, absolutely. One of the fundamental reasons is that when the internet was built, no one actually built sender verification. So you could originate a phone call or a text message and pretend to be a bank by spoofing the phone number, and no one is actually going to test whether that phone number truly belongs to the bank. That's the fundamental reason the entire financial system is broken: it's built on a broken fundamental in the architecture of the internet. However, that doesn't mean we can't stop these deepfakes or these scam attacks. That is where Sardine plays front and center.
We are constantly stopping AI-based deepfake attacks during ID verification, or whenever somebody is making a transaction at a bank under the guise of a social engineering scam. We detect whether the transaction is being made by a bad actor who is controlling the victim's screen, or whether the bad actor is giving instructions to the victim on a phone call while they make the transaction. So yes, we can still stop these AI scams and deepfakes.

Jill Malandrino: I think part of the challenge as well is the fragmentation that's sometimes associated with compliance as it relates to silos. And I think that's a really big challenge that needs to be addressed here. Oh, you're on mute.

Joe Robinson: Thank you. Think about the perspective of the folks trying to fight these crimes inside the financial institutions: the banks, credit unions, payment companies. They can generally see the activity that's occurred within their own institution. But criminals are smart, right? They're going to blend activities and illegal behavior across institutions, across different payments, across different channels. And so that results in the information silo that you're referring to.
It's one of the key challenges in financial crime enforcement: our heroes of the financial industry are always operating with one side of the story, or sometimes even less than that.

Jill Malandrino: Yeah. And part of the challenge, too, is that you have the good actors and the bad actors employing some of the same types of technologies in real time.

Ranjan: Yeah. The same technology that makes our lives easier and gives customers or consumers a good user experience can be exploited by bad actors, right? For example, with online identity verification using video selfie technology, you can now open a bank account online without showing up at a bank. But that also means bad actors have the same technology at their disposal. Using face swapping, they can pass themselves off as me or Joe and try to open a bank account pretending to be us. However, with technologies like Sardine, we're always staying one step ahead of the bad actors, and we can still detect these deepfakes based on the fact that when you're injecting a deepfake video, you're taking control of the webcam, or you're doing things like a video selfie while the phone is face down.
So we go into these deep behavioral insights to tell you what is real versus fake.

Joe Robinson: We actually find that criminals are often some of the most creative technologists in the world. They'll try anything; it's worth it to them, right? If they can get a particular fraud scheme to work, or a money laundering channel or something like that, given the hours and hours, and the thousands of people that might be working on these crimes, it pays off for them to be very early adopters of new technology before controls and protections actually catch up.

Jill Malandrino: That's a good point that you bring up, Joe, because a number of these bad actors are very well funded as well, especially when you're thinking about nation states and so forth. So what would be an example of compliance AI in action?

Joe Robinson: Absolutely. There are different ways to apply AI that are very fruitful here. There are things on customer due diligence: using AI to better understand the counterparties of transactions, the customer that owns an account, things like that. There are superior detection mechanisms, which Ranjan touched on a little bit: how do you detect deepfakes, how do you detect fake selfies and bad behavior, and things like that.
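The behavioral deepfake signals described above, injected video arriving through a hijacked webcam or a "video selfie" while the phone is face down, can be sketched as a toy rule check. Every field name, camera label, and threshold below is a hypothetical illustration, not Sardine's actual product or API:

```python
# Toy sketch of behavior-based liveness checks of the kind described in
# the conversation. All signal names and thresholds are invented examples.
from dataclasses import dataclass


@dataclass
class SelfieSession:
    camera_label: str           # capture device reported by the browser/app
    device_face_down: bool      # accelerometer orientation during the selfie
    frame_rate_variance: float  # injected streams often have unnaturally stable timing


# Hypothetical denylist of virtual-camera driver names used for injection.
VIRTUAL_CAMERA_NAMES = {"obs virtual camera", "manycam", "virtual cam"}


def deepfake_risk_flags(s: SelfieSession) -> list[str]:
    """Return human-readable risk flags for an ID-verification session."""
    flags = []
    # Injected deepfake video usually arrives through a virtual camera driver.
    if s.camera_label.lower() in VIRTUAL_CAMERA_NAMES:
        flags.append("virtual_camera_driver")
    # A "live selfie" captured while the phone is face down cannot be genuine.
    if s.device_face_down:
        flags.append("device_face_down_during_selfie")
    # Real webcams jitter; perfectly steady frame timing suggests injection.
    if s.frame_rate_variance < 0.01:
        flags.append("suspiciously_stable_frame_timing")
    return flags
```

In practice such signals would feed a risk model rather than hard rules, but the sketch shows why injection leaves behavioral traces even when the generated face itself looks flawless.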
And then in the operational procedures of investigations, there are lots of ways to use AI to speed up information gathering, to speed up collaboration with law enforcement so they can actually go after and prosecute the crimes, and just generally to ensure that investigations are leveraging the best data that's out there in the market.

Jill Malandrino: So leveraging a technology like AI is what allows continuous compliance to work, basically.

Ranjan: Yeah, 100%. We've built technologies to automate onboarding reviews, sanctions reviews, negative news reviews, customer due diligence, and so on. If you think about it, many banks spend at least 30% of their budget on compliance operations, and using AI, we are seeing that 95% of onboarding can be automated and 55 to 75% of sanctions reviews can be automated. So these are pieces of technology which can help banks become more efficient in their operations.

Jill Malandrino: Are you seeing silos being broken down because of this type of technology, where you're able to have different business units, as an example, your financial crime team or your anti-money laundering team, your risk and financial operations, and so forth?
Are these types of technologies allowing a bird's-eye view across the enterprise?

Joe Robinson: Yeah, you're highlighting a really important point, which is that within a particular institution there might be multiple teams in different practice areas doing the same investigative work, the same diligence work. So one of our focus areas is essentially to break down those barriers and make sure that every team has the full extent of prior investigative work that might have been done on a particular typology or subject. And increasingly we're seeing cross-institution sharing as well: banks working with the fintech partners they bring on, banks working with each other through sections of regulation that allow them to collaborate on investigations, and the public and private sectors working together on some of the investigative work through these data-sharing and investigative procedures.

Jill Malandrino: Yeah, I think it's a key point that you brought up, and not just in finance in particular. I think with public-private partnerships, you could even include academia in there as well.
These technologies are going to be influential when it comes to thinking about regulation and policymaking as you look toward the future.

Ranjan: Yeah, 100%. If you really think about it, the BSA/AML regulations were written several decades ago, and they probably need to evolve with the changing times, right? For instance, when we file SARs, suspicious activity reports, they go into the ether, and no information comes back to the filing entity about what was actually done with them. These new pieces of technology are enabling us to get more and more efficient, even with transaction monitoring alerts. When you're generating all these transaction monitoring alerts or compliance alerts, you don't have to hire as huge a team to review them, because AI can help you reduce the false positives. And then the next iteration would be to work with the regulators on having some sort of back and forth with them about what happened with the SARs, right?

Jill Malandrino: Right.
So I think that's an excellent point that you bring up, because sometimes the compliance function can feel as if you're just checking the box. But then, to your point, what happens with it? What have we learned? How do we mitigate that risk going forward? How do we detect it? How do we employ agentic AI so that we can predict or recognize behavioral patterns, whether for good or bad actors, to mitigate this risk? I feel that just checking the boxes over the past number of decades hasn't been very useful.

Ranjan: Yeah, that's an accurate statement. And outside of AI, there are other new forms of technology that have come out, like zero-knowledge proofs, which could be used in cases like this for the financial regulators to share information back with the banks without necessarily saying, hey, I made an arrest or what have you. If a bank could at least query FinCEN and have some idea of what happened with the SAR it filed, that would be a step in the right direction.

Jill Malandrino: Yeah, it certainly would be.
And Joe, when you think about early adopters employing or building models, and just a compliance-first, cyber-first mindset across the enterprise, I think from a competitive perspective that certainly is going to be an advantage as it relates to scalability and not falling behind. I think at the end of the day it'll be more cost effective and certainly cement your place in the competitive landscape. What are your thoughts around that?

Joe Robinson: There are direct advantages to keeping your cost of compliance down through efficiency in operations. And then there are indirect advantages as well, where regulatory issues can become a huge problem that sinks the ship later on. So keeping both of those in mind as you're investing in better compliance technology, and better anti-financial-crime technology generally, both of those factors can affect the bottom line.

Jill Malandrino: How are these technologies deployable at this point? Are you starting to see more adoption come online, Joe?

Joe Robinson: Absolutely. We've actually seen a real surge in the use of AI for investigative work over the first two quarters of this year.
There's been a lot of chatter about that since 2023, but financial institutions tend to be more conservative about adopting new technology, for good reasons. Over the last two quarters, though, we've seen a lot of exciting development, with our own products and in the market, in how people are applying AI to increase their investigation throughput: the communication of information to law enforcement, as well as the quality and depth of the work they're able to do.

Jill Malandrino: Yeah, I would imagine you're seeing the same thing as well. And when you think about your outlook: to Joe's point, there was certainly a ramp-up in 2023. And of course, generative AI became part of the consumer vernacular in November 2022 with ChatGPT, and all of these new products were pushed out, whether internal or external, to keep up with the competitive landscape.

Ranjan: Yeah, absolutely. What we are finding is that our agentic AI technologies are now live with close to 20 of our customers, which include fintechs, banks, and so on, and they're using them day in and day out, using the agentic AI
as a copilot to do sanctions reviews, or even in fully autonomous mode to do an onboarding review, or in a transaction flow to screen the counterparties, or to do what we call an OSINT, or open-source intelligence, search, where you look up the counterparty. For example, I'm a restaurant, but I'm making a huge payment to an auto parts supplier. That doesn't make sense, right? So those types of patterns can be explained and inferred much better by using agentic AI.

Jill Malandrino: All right. I appreciate both of your insights. Thanks for joining us on Trade Talks. I'm Jill Malandrino, global markets reporter at Nasdaq.

Thanks for having us, Jill.

Jill Malandrino: Thank you.
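As a closing illustration, the counterparty-mismatch pattern Ranjan describes, a restaurant making a huge payment to an auto parts supplier, could be sketched as a simple screening rule. The business categories, the expected-counterparty table, and the amount threshold below are all invented for illustration; a real agentic system would infer these relationships from data rather than a hard-coded table:

```python
# Toy illustration of counterparty-mismatch screening: flag large payments
# whose counterparty category is implausible for the payer's line of
# business. Categories and threshold are made-up examples.
EXPECTED_COUNTERPARTY_CATEGORIES = {
    "restaurant": {"food_wholesaler", "payroll", "utilities", "rent"},
    "auto_repair_shop": {"auto_parts_supplier", "payroll", "utilities", "rent"},
}


def counterparty_mismatch(
    payer_category: str,
    counterparty_category: str,
    amount: float,
    large_payment_threshold: float = 10_000.0,
) -> bool:
    """True when a large payment goes to a counterparty category that does
    not fit the payer's business (e.g. a restaurant paying an auto parts
    supplier), which would route the transaction to human review."""
    expected = EXPECTED_COUNTERPARTY_CATEGORIES.get(payer_category, set())
    # Unknown payer categories get an empty expected set, so any large
    # payment from them is conservatively flagged for review.
    return amount >= large_payment_threshold and counterparty_category not in expected


# The example from the conversation: a restaurant making a huge payment
# to an auto parts supplier is flagged.
# counterparty_mismatch("restaurant", "auto_parts_supplier", 50_000.0) -> True
```

The value of the agentic approach described in the interview is precisely that the "expected" relationships need not be enumerated by hand; the agent gathers open-source context about the counterparty and judges plausibility case by case.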