Anna Delaney: Hello, thanks for joining us for Proof of Concept. This is the ISMG talk show where we analyze today's and tomorrow's cybersecurity challenges with industry experts and discuss how we can potentially solve them. I'm Anna Delaney, director of productions at ISMG.

Tom Field: I'm Tom Field. I'm senior vice president of editorial with Information Security Media Group. Anna, it's the first time we've had a chance to speak since the regime change in the U.K.

Anna Delaney: Indeed, yes. Big news here.

Tom Field: A new monarch and a new prime minister, all in the same week.

Anna Delaney: So I'm glad you've been following the news. And we get an extra bank holiday, Tom.

Tom Field: Yes, that was noted. I was actually surprised we got away with only one.

Anna Delaney: Yeah, well, next year: coronation. But what else is on your mind, Tom?

Tom Field: Oh, my! Well, let me go from the serious to the serious. I'm thinking an awful lot about - and here we are, months on end - what's still happening in Ukraine, and Russia's activity. Now in the news, we're watching Ukraine having some success in repelling some of the kinetic attacks.
But still, we're seeing and learning more about the repercussions of the cyberattacks. Let me just stop for a second: what are your takeaways from what we've seen over the past six months or so?

Anna Delaney: Well, my worrying takeaway is that this is only the beginning. We're seeing the impact, as you say, in the cybersecurity world, but also on the global supply chain, in cost increases, and in product and food shortages. But then what about the future? What's going to happen to future cooperation on wider, more important issues like arms control, cybersecurity, climate change and nuclear issues, energy security, and political solutions elsewhere in the world? I think the whole diplomatic makeup of the world has changed. And then there's China. Obviously, Russia is forming ties with China now and, in isolation, what will the impact be on cybersecurity there?

Tom Field: You know, it used to be that wiper malware and disinformation attacks and supply chain disruption were the exception. Now they're becoming the rule in this world.
And I think what we're getting, as a result of what Russia is doing in Ukraine, is that these elements of attack are becoming as normalized as the flinging of Molotov cocktails used to be, as localized bombings used to be. The difference being, you can issue wiper malware, run a disinformation campaign, or disrupt a supply chain without any risk of personal harm - maybe even without risk of personal accountability. And so what sort of snuck up on us is that the exception has now become the rule. I think this is something we're going to be learning to deal with for quite some time.

Anna Delaney: Yeah, that's a really good way of putting it. It seems chaotic. And with the U.S. midterms coming up, I wonder if they'll also be interfered and meddled with as a result of these increasing tensions.

Tom Field: We have to expect it. I think that's just become as much a part of the political process as putting paper signs out on the vacant lot. Yes, I think we have to expect that someone is going to try to interfere with someone else's thought process and disrupt the elections one way or another. That's just the landscape now.

Anna Delaney: Well, perhaps our guests today will share some thoughts on what they think will happen there.
Tom Field: They'll solve everything. No high expectations - they'll just solve everything.

Anna Delaney: Yeah. Well, this is Proof of Concept, so absolutely. I'm going to introduce our first guest, Tom: Ari Redbord. Please come to the stage. Welcoming Ari Redbord, head of legal and government affairs at TRM Labs. Always a pleasure, Ari.

Ari Redbord: It's really nice to be here. And I love these opportunities to chat.

Anna Delaney: So let's talk about NFTs. Cybercriminals have stolen over $100 million worth of NFTs over the past year. You conduct many interviews on this topic. So what do you feel is not being addressed? What's not being discussed at the moment that should have more airtime?

Ari Redbord: Yeah, it's interesting. Look, NFTs are a great example of emerging technology that has tremendous promise and opportunity, but also tremendous risks, like anything else. And regulators are really just starting to try to understand, and to craft, thoughtful regulation for the NFT space. To date, really, the use case has been art and collectibles. But really, the promise of this technology is this ability to hash anything to an immutable ledger, right?
Which means that it's unchangeable, which means it's authentic. And I think that's an extraordinary opportunity: to hash your health records, your title to your car or land, document review. So there's so much promise. But at the same time, anything that lets you move value at the speed of the internet, cross-border, comes with risks, and I think that's what we're dealing with right now. At TRM, we trace and track the flow of funds in NFTs - we trace and track NFTs just like cryptocurrency, because they also live and move on blockchains. And what we've seen is a number of different typologies develop around that. We've seen these NFT rug pulls, where you're creating excitement, or FOMO, around an NFT drop; users put money in, and ultimately the rug is pulled and the funds are essentially stolen. We've seen traditional money laundering, like wash trading, in the NFT space. And just a week or so ago, TRM did an investigation on the use of an NFT by an ISIS supporter - to be clear, not ISIS itself, but a supporter - who was creating an NFT for propaganda purposes.
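The property Ari describes - hashing a record so that any later change is detectable against a digest stored on an immutable ledger - can be sketched in a few lines. This is a minimal illustration with hypothetical record fields, not TRM's tooling or any particular NFT standard:

```python
import hashlib
import json

def content_hash(metadata: dict) -> str:
    """Hash a record deterministically (sorted keys, fixed separators)
    so that any change to the record yields a different digest."""
    canonical = json.dumps(metadata, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical record: a land title hashed at mint time.
record = {"owner": "alice", "parcel_id": "LOT-42", "issued": "2022-09-01"}
minted_digest = content_hash(record)

# Later verification: recompute and compare. An unchanged record matches;
# any tampering (here, a swapped owner) breaks the match.
assert content_hash(record) == minted_digest
tampered = dict(record, owner="bob")
assert content_hash(tampered) != minted_digest
```

The immutability comes from the ledger holding the digest, not from the hash itself: because the on-chain value cannot be rewritten, anyone can re-derive the digest from a presented record and detect alteration.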
So we see these emerging risks, as well as these great opportunities. Real quickly, on where regulators are: they're really trying to figure this out. There's not a lot in the NFT regulatory space the way there is in crypto, but you can analogize, and I think that's really helpful. One really interesting place to look is the U.S. Treasury Department, which, about a year ago - maybe a little less - wrote a white paper on the risks of money laundering in the traditional art world, the high-value art world. And what they ultimately said was that it is low risk, because of the friction associated with moving a Van Gogh - to be a little bit glib, it's very hard to move actual, physical art. But interestingly, they spent about three or four pages - it's hard to tell online these days - on the potential risks in the NFT space, because of that ability to transfer value at the speed of the internet. And what that says to me, having spent time at Treasury - specifically FinCEN, which I know is taking a hard look at this - is that nothing is in a vacuum.
So this very first look, in a white paper context, really says to me that the major regulator in this space, at least in the U.S., is now taking a look at NFTs, and we're likely to see more guidance, more regulation. But when you look globally, regulators have so far mostly taken a pass on NFTs. Even MiCA - the Markets in Crypto-Assets regulation, the comprehensive framework out of the EU, where we saw an agreement this summer - still punted on NFTs. We'll see how it all plays out.

Anna Delaney: Any thoughts of your own as to how it might pan out? How this regulation might...?

Ari Redbord: Yeah. To stay specifically in the anti-money-laundering space - there are all kinds of other interesting conversations around NFTs, but keeping to anti-money laundering: oftentimes, FATF (the Financial Action Task Force), which is the global standard-setting body, provides a bit of a preview of where regulators are going with this. And what FATF has said is, "Look, if you are an NFT collectible, then you are likely not a virtual asset," which means you don't have to have compliance controls in place.
But if you are used as an investment mechanism or for payment - or, as I say, to transfer value - then you very much could be a regulated asset. Or if you're a marketplace or an issuer, then you could be a regulated entity. And it's very possible that that becomes the framework. I think the problem with that is - and I actually sat down with the chairs of FATF's virtual assets contact group and asked them just this - the use case today is very much a collectible that you can also use to transfer value or as an investment. Think Bored Apes or CryptoPunks. So regulators are really going to have to figure out what the primary use case is, and how to regulate it. But for whatever it's worth, from my crystal ball, we're going to see regulation in this space. I really hope there's a focus on regulation that doesn't stifle innovation, because there's so much potential here, and the technology is so cool that we have to ensure we're still fostering that continued innovation.

Anna Delaney: Yeah, for sure. I want to revisit a topic now that we've discussed at length recently.
The Tornado Cash sanctions. Now, OFAC this week has issued guidance on how U.S. persons can withdraw their funds from Tornado Cash. Can you just summarize the process? Because I believe applying for a license is involved.

Ari Redbord: Sure, absolutely. First, what is Tornado Cash? No, I'm just kidding. All right. So this is really interesting: the cryptocurrency community, for really the last month - since August 8, when the sanctions came out - has been asking for guidance from OFAC, and this is the response. And I can tell you, having worked really closely with really extraordinary civil servants at OFAC over the years, I imagine they were heads-down for the last three weeks ensuring that they were able to put something like this out. So I think it's a good first step; there's probably more guidance needed. But really quickly: look, this is not different from any other type of sanctions in the traditional world.
If you want to engage with a sanctioned entity - and it's hard to call Tornado Cash an entity under these circumstances - but if you want to engage with, say, someone on the SDN list, the sanctions list, you need a license. You always have. What that means is: if I need to get my funds out of an Iranian bank or a Russian bank, I need a license from OFAC to do that. And what OFAC is saying here is that if you sent funds into Tornado Cash prior to August 8 but didn't pull them out, you need a license from OFAC to do that. And OFAC provided some contact information - an email address, a hotline to call - to talk to them about getting a license to get your funds out. One really important FAQ here is on this idea of dusting. After the Tornado Cash sanctions, we saw a number of individuals who inadvertently received funds from Tornado Cash in very small amounts - we call those dusting attacks: basically, people sending tainted funds to taint a wallet address. And a lot of the question in the crypto community has been: well, how can these people possibly be open to sanctions or enforcement action, really?
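The dusting pattern Ari describes - tiny, unsolicited transfers from a sanctioned source used to taint a wallet - is the kind of thing transaction screening tries to separate from deliberate engagement. A toy sketch, with an invented address label and an arbitrary dust threshold (real screening relies on curated address lists and chain-specific heuristics):

```python
from decimal import Decimal

# Illustrative sanctioned-source set and dust threshold; both are
# hypothetical values chosen only for this sketch.
SANCTIONED_SOURCES = {"0xTornadoCashRouter"}
DUST_THRESHOLD_ETH = Decimal("0.05")

def classify_transfer(sender: str, amount_eth: Decimal) -> str:
    """Flag incoming transfers from a sanctioned source, separately
    marking dust-sized ones, which OFAC said it won't prioritize."""
    if sender not in SANCTIONED_SOURCES:
        return "clean"
    if amount_eth <= DUST_THRESHOLD_ETH:
        return "dusting-suspected"
    return "sanctioned-source"

print(classify_transfer("0xTornadoCashRouter", Decimal("0.01")))  # dusting-suspected
print(classify_transfer("0xTornadoCashRouter", Decimal("10")))    # sanctioned-source
print(classify_transfer("0xSomeExchange", Decimal("0.01")))       # clean
```

The point of the distinction mirrors the FAQ: receipt from a flagged source is recorded either way, but dust-sized unsolicited receipts are triaged differently from substantial, deliberate transfers.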
And the answer, I think, from OFAC is: "Look, any amount, unsolicited or not, is a potential violation" - most importantly because the sanctions regime has strict liability - "but OFAC is not going to prioritize these cases." And what they're really saying, when you hear something like that from a regulator, is a little bit of a wink-wink, nod-nod: "Hey, look, going after regular users, or people who have no actual malicious intent, is not what we're intending to do here." So what they're really saying is, "Look, this is not something that we're focused on; this is not something that we're interested in. We don't think it's good that this is happening. It's a potential violation if it's not reported, but this is not going to be a priority." And then, just really quickly, there's a fourth FAQ, which is particularly interesting too, because there's been a lot of criticism around the idea that engaging with software could sometimes be considered speech. And what OFAC is saying here is, "Look, we're not going after speech; we're going after conduct." And they basically lay out ...
it's actually very different from anything I've ever seen from OFAC: they kind of lay out, "Look, these are the ways you can still engage with Tornado Cash." And they specifically said, "You can view the software contract, you can discuss it, you can teach about it. You can include open-source code in written publications, such as textbooks." So what they're really saying is, "We understand that this is code. What we're doing is sanctioning the conduct that is happening through this software protocol. We are not sanctioning the code itself." It's a complex concept. It's going to be challenged in court - currently, there's a lawsuit against Treasury by some Coinbase employees on this exact issue. We're going to be talking about this again, Anna, over the next few months, for sure. But I leave you with that: OFAC has provided some guidance, it's helpful, and I think we're likely to see more.

Anna Delaney: Complex, but a clear explanation. I really appreciate that, Ari. Always brilliant talking with you. Thank you.

Ari Redbord: Great talking to you too. We'll talk soon.

Anna Delaney: So Tom, it's over to you.

Tom Field: Well, I always learn a lot from Dr.
Crypto - or the Blue Devil, whatever you want to call him today. So I appreciate you bringing Ari onto our stage here. I want to bring back another longtime guest. He's been a figurehead on our stages and on our sites for years now: David Pollino, the former CISO at PNC Bank. David, always a pleasure to see you.

David Pollino: Good morning. It's great to be here.

Tom Field: David, we have the privilege and the responsibility of living in, as they say, interesting times - particularly interesting for cybersecurity leadership. On one hand, we've got the former Twitter head of security appearing before Congress this week, talking about his explosive whistleblower disclosures, which allege that Twitter has serious security and privacy vulnerabilities. At the same time, we see an upcoming landmark trial where Uber's previous security officer is facing criminal charges in regard to a breach at that organization. As you know, CISOs are watching these two cases closely. As a CISO at heart and in DNA, what concerns do you have about these cases?

David Pollino: Well, these cases are fascinating.
You know, just to state something up front: what we know about the whistleblower case is largely one-sided - from the whistleblower, from Mudge - and what we know about the Uber case is largely from the Department of Justice. So we're seeing one side of each story. But it is fascinating to be able to look into these highly respected companies, with these highly respected CISOs, and see what's actually going on by reading the information that's been publicly provided. These cases have a lot in common. But they also show that even these highly funded, highly respectable companies have challenges with cybersecurity. So what we can do as CISOs and security professionals is read through them and ask, "How can we learn and apply - or potentially, in some cases, avoid making - the same mistakes these companies made?" Because what you had were very public security issues that were highlighted, then regulatory action, then several years to work on those issues - and then basically not the anticipated amount of progress, and incidents continued to happen. So how do you highlight those things? How do you use the lessons learned there? There are a couple of things in there that I think rise to the top.
And for both of them, I think security culture is one of those key aspects. A CISO, by himself, is not going to be able to keep a company secure. Absolutely not. You need the full support of the board, you need the executive committee to be on board with you, and you need to make sure that incentives and behavior are aligned to promote good security practices. We see in the whistleblower report that some of the measurements that were used - which were kind of proxies for how secure the network was, or how many bots were on it - were tilted in such a way as to promote revenue and earnings, and maybe not necessarily designed to find all the bots on the network. Because of the culture of the company, that was allowed to be the prevalent attitude and, as a result, a lot of the initiatives that the CISO wrote about in the whistleblower report were not really getting traction. So learning from these reports is very important, and it's up to us as security professionals to take advantage of this public information.

Tom Field: David, you had the privilege this past week to moderate a CyberEdBoard community roundtable. The topic was "The Future of Cybersecurity Leadership: Board, Regulation, Ethics."
Now, I realize this was under Chatham House rules - I'm not going to ask you to disclose anything, and you shouldn't. But talk about the mood in the room. Where do you see your peers standing regarding the ethical implications for CISOs and cybersecurity leaders?

David Pollino: Yeah, that's an excellent question. I think what you're seeing here is that the stakes are higher than ever. In one case, you have someone being called before Congress. And like it or not, Mudge is one of the most respected security people - I've known him for several years and used to work with him. He's probably forgotten more about security than most security people will ever know. Taking this step in his career is really life-changing for him, so he must have thought it through and really understood what he was getting into. And on the flip side, having two felony counts brought against you by the Department of Justice - that's life-changing as well, for another well-respected chief information security officer.
So what we tried to talk about a little bit - and this is one of the things that I'm pondering as well - is where the CISO operates: what level of authority the CISO has, how the CISO is viewed by the board, whether board members are part of, or not part of, that interview process, and what insurance might be appropriate for a CISO. In many cases, it used to be that the CISO was not an executive committee member, and much of the messaging from the CISO was filtered through an executive committee member - a CIO or some other executive. Now CISOs want direct engagement with the board, and want to make sure that the CISO is put in a position where, if real security issues are found, they're able to make progress on them, highlight them, and report on them in a way that makes the CISO feel comfortable. One of the quotes in the whistleblower report is that Mudge said they had 10 years of unpaid security bills. And if you talk to anyone in that room, everybody has some level of unpaid security or tech debt. You have some companies that are really good at encryption but terrible at patching.
Some are good at least privilege and administrative access, and others are bad at education and awareness. So there are always going to be some areas where you're higher or lower than your peers. But I think the key thing coming out of this is: "So what can you do about it now? And where does that take you?" If you've been reporting that you're doing well from a security perspective, and all of a sudden you're not - what's wrong? What's the disconnect? So having an accurate measurement of where you are and where you need to go, and having the authority to be able to push in the right direction, is very important. One of the surprising things that also fell out of the whistleblower report had to do with the focus on production - that sometimes the focus on production outweighed the need for security. Many of the tests were not run in a full test environment; they were tested in production, according to that report. And when the January 6 events were happening, and the CISO said, "We're locking down our production environment," the answer was: we can't - we don't know how to do it.
So you know, I think there's 364 00:22:16.790 --> 00:22:20.780 lots to be thought through for not only what can go wrong, but 365 00:22:20.780 --> 00:22:24.770 also what could go right when faced with foreseeable 366 00:22:24.800 --> 00:22:29.210 situations. And so there was lots of conversation in the 367 00:22:29.210 --> 00:22:32.660 room; we had to cut it short at the hour. The questions were 368 00:22:32.660 --> 00:22:36.620 flying, the opinions were on display, but it was a great 369 00:22:36.620 --> 00:22:40.340 conversation. And if you haven't, dive into the whistleblower 370 00:22:40.340 --> 00:22:46.130 report, watch the C-SPAN coverage of Peiter Zatko's testimony, or read 371 00:22:46.130 --> 00:22:49.220 the Department of Justice report. It's some pretty good 372 00:22:49.220 --> 00:22:51.170 reading for your next airplane flight. 373 00:22:51.560 --> 00:22:53.420 Tom Field: They will change our conversation. And it makes a 374 00:22:53.420 --> 00:22:57.050 strong point that if the CISO is in a position to be called 375 00:22:57.050 --> 00:23:00.440 before Congress and report to Congress, perhaps the CISO is in a 376 00:23:00.440 --> 00:23:03.140 position to be reporting to senior business management and the 377 00:23:03.140 --> 00:23:06.140 board as well. David, as always, appreciate your time, your 378 00:23:06.140 --> 00:23:06.560 insight. 379 00:23:07.070 --> 00:23:08.030 David Pollino: Absolutely. Thank you. 380 00:23:08.360 --> 00:23:11.480 Tom Field: Anna, so we bring Ari back into the room and wrap up 381 00:23:11.480 --> 00:23:12.530 our conversation today. 382 00:23:12.890 --> 00:23:16.820 Anna Delaney: Let's do that. Well, as we mentioned earlier, 383 00:23:16.850 --> 00:23:20.000 the November U.S. midterm elections are just two months 384 00:23:20.000 --> 00:23:23.690 away. What are your concerns about election interference as we 385 00:23:23.690 --> 00:23:24.920 approach the vote? 386 00:23:25.340 --> 00:23:26.900 Ari Redbord: Is that for me?
387 00:23:28.700 --> 00:23:29.810 Anna Delaney: We'll give David a break. 388 00:23:30.660 --> 00:23:33.809 Ari Redbord: Absolutely. This is probably me sort of taking my TRM 389 00:23:33.870 --> 00:23:37.503 crypto hat off. When I was a prosecutor, I mean, 390 00:23:37.564 --> 00:23:41.380 these were issues that we were looking at, you know, for a long 391 00:23:41.440 --> 00:23:44.832 time; it spanned a number of elections at the U.S. Department 392 00:23:44.893 --> 00:23:48.466 of Justice. And I think we've only just sort of seen it get 393 00:23:48.526 --> 00:23:51.918 more serious. And I think what we saw this week in some 394 00:23:51.979 --> 00:23:55.552 reporting is that Russia is using real resources, which are 395 00:23:55.613 --> 00:23:59.247 becoming more valuable, you know, as the war effort goes on 396 00:23:59.307 --> 00:24:02.881 and we're seeing the gains by Ukraine. 397 00:24:02.941 --> 00:24:06.515 But the reality is, this is still a focus, and we're going to 398 00:24:06.575 --> 00:24:10.451 continue to see it. I think we need to remain vigilant 399 00:24:10.512 --> 00:24:14.388 from a social media perspective; we need to remain vigilant when 400 00:24:14.449 --> 00:24:17.901 we're, you know, even talking about these things, and talk 401 00:24:17.962 --> 00:24:21.777 really responsibly about them. So, you know, 402 00:24:21.838 --> 00:24:25.472 I think Tom made a great point at the outset that this is a 403 00:24:25.532 --> 00:24:28.985 reality that we are now living with. And, you know, we've 404 00:24:29.045 --> 00:24:32.437 talked for a long time about how wars have moved to a digital 405 00:24:32.497 --> 00:24:35.768 battlefield - obviously, with exceptions - and 406 00:24:35.829 --> 00:24:39.523 this is a really prime example of that. And, you know, sort 407 00:24:39.584 --> 00:24:43.157 of in the age of the Internet, that was more centralized.
I 408 00:24:43.218 --> 00:24:46.852 think we're seeing it become more and more decentralized, and 409 00:24:46.912 --> 00:24:50.486 it becomes harder and harder to get your hands 410 00:24:50.546 --> 00:24:54.180 around. So I think the key is really remaining vigilant. And 411 00:24:54.241 --> 00:24:57.996 yeah, I think it's going to be a tough question, but something 412 00:24:58.056 --> 00:25:00.540 we're going to live with for a long time. 413 00:25:00.000 --> 00:25:03.360 David Pollino: Yeah, it was a great point, Ari. And I think 414 00:25:03.360 --> 00:25:06.750 it's much like the supply chain issues that we've seen over the 415 00:25:06.750 --> 00:25:10.950 last couple of years. It's amazing how something 416 00:25:10.950 --> 00:25:14.490 that maybe you're not thinking about could impact your 417 00:25:14.700 --> 00:25:17.880 ability to deliver to your customers. So in some cases, it 418 00:25:17.880 --> 00:25:22.830 was, you know, semiconductors or other things that may have been 419 00:25:22.830 --> 00:25:26.550 still stuck on the boat or not readily available. There's no 420 00:25:26.550 --> 00:25:30.600 doubt going to be cyber activity related to the election. And 421 00:25:30.600 --> 00:25:34.440 much like NotPetya was kind of targeted at one particular area 422 00:25:34.440 --> 00:25:38.640 but the blast radius was a lot larger, what are the ways that 423 00:25:38.640 --> 00:25:43.200 either misinformation or cybersecurity activity could 424 00:25:43.200 --> 00:25:45.630 negatively impact your business? I think it's important for 425 00:25:45.630 --> 00:25:48.840 everybody to think about not only the primary areas where they 426 00:25:48.840 --> 00:25:52.410 could be impacted, but also the second and third areas, to be 427 00:25:52.410 --> 00:25:55.080 able to say, what should we be investing in, either from a 428 00:25:55.710 --> 00:25:58.860 cyber defense perspective or a monetary perspective?
That way, 429 00:25:58.860 --> 00:26:03.570 if things get bad, or if activity, you know, ratchets 430 00:26:03.570 --> 00:26:06.720 up, how can we do it in such a way that we can continue to 431 00:26:06.720 --> 00:26:09.720 deliver to our customers? So I think, you know, approaching it 432 00:26:10.920 --> 00:26:14.580 first from a tabletop and from an academic perspective will put 433 00:26:14.580 --> 00:26:18.300 people in the best position to not have their 434 00:26:18.300 --> 00:26:21.210 businesses impacted by what's going to happen at the election. 435 00:26:22.110 --> 00:26:24.480 Anna Delaney: Yeah, very important perspectives. And I 436 00:26:24.480 --> 00:26:27.930 heard that there's a new risk in 2022, and that's physical safety 437 00:26:27.930 --> 00:26:31.320 threats to election officials and their families and their 438 00:26:31.320 --> 00:26:35.310 workplaces. And I don't think this was so much of an issue, or 439 00:26:35.310 --> 00:26:39.690 it wasn't discussed as much, in 2016 and 2018. It seems like yet 440 00:26:39.690 --> 00:26:42.390 another way to weaken election infrastructure. I don't know if 441 00:26:42.390 --> 00:26:43.320 you've heard the same. 442 00:26:43.000 --> 00:26:47.140 Tom Field: I've seen some of that. I would just say that the 443 00:26:47.140 --> 00:26:51.910 concern I have is that we did see what happened in 2016, 2018 and 444 00:26:51.910 --> 00:26:57.280 2020 on a large national scale. My personal concern is that 2022 445 00:26:57.280 --> 00:27:00.430 might be the year of small ball. In other words, we've got so 446 00:27:00.430 --> 00:27:04.120 many regional elections happening that can reshape 447 00:27:04.120 --> 00:27:07.480 Congress in many ways. I wouldn't be surprised to start 448 00:27:07.480 --> 00:27:11.650 seeing some of the tactics we saw on a national scale now come 449 00:27:11.650 --> 00:27:16.660 down to a smaller scale.
And I don't know that states and 450 00:27:16.660 --> 00:27:17.980 regions are prepared for that. 451 00:27:21.740 --> 00:27:23.960 David Pollino: That's a great point, Tom. Another thing that I 452 00:27:23.960 --> 00:27:27.380 like to bring up with, you know, some of my clients is around 453 00:27:27.380 --> 00:27:30.290 thinking about not only yourself, but also your 454 00:27:30.290 --> 00:27:33.140 perimeter, your orbit, when you're thinking about these 455 00:27:33.170 --> 00:27:37.790 security issues. So whether it's doxing or physical attacks, what 456 00:27:37.790 --> 00:27:41.360 information, you know, is out there about you, where 457 00:27:41.360 --> 00:27:44.900 you are, how you operate? And then you go to the next level. 458 00:27:44.900 --> 00:27:49.160 What about my family? What about my kids? What about, you know, 459 00:27:49.190 --> 00:27:52.880 how could that be used either against me, or in such a way 460 00:27:52.880 --> 00:27:56.540 that if somebody was planning some sort of a cyber doxing-type 461 00:27:56.540 --> 00:28:00.050 attack, or physical attack, they'd be able to do that? 462 00:28:00.200 --> 00:28:02.090 I think it's important for everybody to sit down, have a 463 00:28:02.090 --> 00:28:05.300 serious conversation with their family, those people who they 464 00:28:05.300 --> 00:28:08.390 trust, and say, "What are we doing from a security 465 00:28:08.390 --> 00:28:11.360 perspective? Are we limiting information sharing? Are we 466 00:28:11.360 --> 00:28:14.420 turning on good authentication? Are we staying away from reusing 467 00:28:14.420 --> 00:28:17.570 passwords?"
And just getting kind of your family's, you know, 468 00:28:17.570 --> 00:28:21.170 your orbit's OPSEC to the level of - if you think that you're a 469 00:28:21.170 --> 00:28:25.550 target - that way, if, as Tom said, the small 470 00:28:25.550 --> 00:28:28.940 ball comes to town, at least you're making yourself and your 471 00:28:28.940 --> 00:28:32.060 loved ones a much more difficult target for the attackers. 472 00:28:33.980 --> 00:28:38.060 Ari Redbord: Yeah, it's scary. There's absolutely no doubt. I'm 473 00:28:38.060 --> 00:28:40.790 going to have David come in and do some OPSEC help for me, 474 00:28:40.790 --> 00:28:44.390 because I'm terrible at it. And, you know, you just constantly 475 00:28:44.390 --> 00:28:47.540 feel like a victim of these kinds of attacks. You know, 476 00:28:47.540 --> 00:28:51.470 thankfully, not many, but we're all getting these text messages 477 00:28:51.470 --> 00:28:58.340 these days that say, "Hey, Ari, long time, no talk," and 478 00:28:58.340 --> 00:29:01.100 you almost feel like you're being attacked so often. 479 00:29:01.250 --> 00:29:03.530 And this is an entire Proof of Concept we could do another 480 00:29:03.530 --> 00:29:07.820 time. But yeah, these are becoming very real. And they 481 00:29:07.820 --> 00:29:09.950 affect everyone. I thought David said it really well about sort 482 00:29:09.950 --> 00:29:12.860 of ensuring that you have the right OPSEC, not just for you, 483 00:29:12.860 --> 00:29:17.360 but for your family, especially when you are as out there as so 484 00:29:17.630 --> 00:29:18.920 many of us are these days. 485 00:29:19.010 --> 00:29:21.440 Anna Delaney: Yeah. Well, you are public figures now. So on 486 00:29:21.440 --> 00:29:24.110 Twitter all the time, Ari, you've got a whole media presence going 487 00:29:24.110 --> 00:29:26.300 on. Some people might not agree with you. 488 00:29:26.480 --> 00:29:29.510 Ari Redbord: It might not actually be me, Anna.
So don't 489 00:29:29.540 --> 00:29:30.410 worry about it. 490 00:29:31.880 --> 00:29:32.960 Tom Field: A deepfake right now. 491 00:29:35.720 --> 00:29:38.120 Anna Delaney: Well, that is, as you say, the next episode of Proof 492 00:29:38.120 --> 00:29:40.880 of Concept. This has been, as always, educational and 493 00:29:40.880 --> 00:29:43.070 informative. Thank you so much, both of you. 494 00:29:43.370 --> 00:29:43.910 Ari Redbord: Thank you. 495 00:29:44.600 --> 00:29:45.500 David Pollino: Thanks for having us. 496 00:29:46.490 --> 00:29:48.620 Anna Delaney: And it's goodbye from us. Thank you very much for 497 00:29:48.620 --> 00:29:49.130 watching. 498 00:29:49.370 --> 00:29:51.260 Tom Field: Anna, you're out there as much as anyone. Protect 499 00:29:51.260 --> 00:29:51.770 yourself.