Anna Delaney: Hello, and thanks for joining us for the ISMG Editors' Panel. I'm Anna Delaney, and this is a weekly spot where we discuss and analyze the highlights of what's happening in cybersecurity right now. I'm joined by a stellar team today: Tom Field - with his aviators - senior vice president of editorial; Mathew Schwartz, executive editor of DataBreachToday and Europe; and Michael Novinson, managing editor for ISMG Business. Hello, wonderful to see you all. Tom, start us off fresh from Las Vegas.

Tom Field: Viva Las Vegas, as they say. And yes, that is the sky that you see behind me. It's the sunset of Las Vegas last Wednesday. We had a chance, after our grueling day in the video studio, to grab a bite to eat, and after dinner, looked up at the sky and quite liked it.

Anna Delaney: Okay, good. Well, Michael, were there also dinosaurs in Vegas? I know you were there.

Michael Novinson: There were not dinosaurs in Vegas - I hear they're extinct now - but there are some, actually, a couple hours away from Vegas. This is Cabazon, California, quintessentially American, just a roadside attraction.
It's a bunch of dinosaur statues, including some that are 30 feet tall, right off the interstate; they starred in a feature film with Pee-wee Herman from the mid-1980s. We get there - my parents spend the winters in Palm Springs - we get there in the winter, and my daughter loves seeing the dinosaurs. They're so quintessentially tacky that they're actually kind of cool.

Tom Field: As I understand it, they actually dress one of those as Pee-wee.

Michael Novinson: I think they do. We normally get there around Christmas, so the main dinosaur's always dressed as Santa Claus. And then we can climb inside it and take a look out at the parking lot. It's one of a kind.

Anna Delaney: That's a sweet tribute, if that's true. And Mathew, arty as ever?

Mathew Schwartz: I wasn't in Vegas, I was in Glasgow. This is a view out of the Glasgow University Union. It's the Gilmorehill Halls - it used to be a church, and then the university bought it, and now it's some kind of an audiovisual facility. But this was the view out the window. I was at a music festival a few weeks back in Glasgow.

Tom Field: Very good. And your own wall?

Anna Delaney: I stumbled upon this rather magical garden recently near where I live in London.
And it's part of the Vestry House Museum. It was originally constructed in the 1730s as a parish workhouse, but became a museum of local history in the 1930s. And I just think it's lovely to discover these pockets of calm in a busy city like London.

Tom Field: So for you, that's new construction. For us, it's history.

Anna Delaney: Yes, exactly. Very modern. Speaking of busyness and business - Michael, Tom, as we've mentioned, you were both in Vegas for Black Hat, the most popular summer cybersecurity event. From what I gather, many of the conversations revolved around the emergence of AI and how cybersecurity products can evolve with this new technology. How was the event for you, Tom? What are some of the highlights you can share?

Tom Field: It was terrific! Between the two of us, Michael and I conducted 46 interviews over the course of two days. So you could say that was a typical morning at RSA, and yet, it was so distinct from RSA. We look back at the discussions that the four of us had with attendees there and with our speakers - there was a lot of strategy, there were a lot of trends, a lot of "this is the way that the cybersecurity industry is headed." I think at Black Hat, it was a lot more heads down: "This is what we're working on.
This is the latest research. This is what we're finding." A lot more hands-on discussion, as opposed to trends. A topic I really enjoyed and heard a lot about was AI. Michael said earlier that it came up in X percentage of his conversations - that they were specifically focused on it. I would say that generative AI, at least, was mentioned in 20 out of the 24 discussions that I had. And it was getting beyond the discussions we had at RSA, which were, "Is this a good thing? Is this a bad thing? Should you have a policy?" It was more, "What are some of the use cases, particularly automated incident response and remediation? How is this being used? What are the skills you need to have internally now?" So a lot more hands-on, as organizations get a grasp on this. A lot of talk about ransomware, and I think getting into more of the specifics about some of the big ransomware groups that Matt was writing about even two years ago: how they've split up and decentralized, and how, in some ways, they're stronger in the smaller units that they're in now. And then a recurring theme of the summer has been that DDoS is back and bigger than ever. We keep seeing record-setting DDoS attacks every month or so. And what we have is an attack surface larger than it's ever been in DDoS attacks.
They're more "as a service" than they've ever been before, and are often being used not just for disruption of organizations, but as precursors to ransomware attacks - or it's DDoS for the sake of DDoS. I had a great discussion with Michael Smith, who used to be with Akamai and is now with Vercara. Just 10 years ago, he and I were exchanging points on the DDoS trends, when we still had U.S. financial institutions being shut down. And it was fun to sit down after 10 years and see how things have changed - but how they haven't, as well. So some terrific discussions. Some highlights for me were talking with Michael Smith, certainly, and talking with Erik Decker, the VP and CISO of Intermountain Health. He was at Black Hat to make a presentation on renewing cyber insurance and what you need to do to qualify today, and he made some excellent points about what the five critical controls are that you have to have, including multifactor authentication; what you do to make your case to your insurance broker to show the strength of your program; and how you leverage the strength of your program - not to get a discount, because insurance discounts don't necessarily happen, but to get more value from your investment.
And then some terrific discussions as well on quantum computing - on what one speaker was calling the quantum divide: the haves and the have-nots. And, Michael, you take it away as well; between the two of us, I think we had every conversation that could be had - excellent insights that we look forward to sharing with our audience.

Michael Novinson: I can take it from here, and I'm going to double-click on a couple of the items you brought up, Tom. In terms of generative AI, one particularly interesting conversation I had was with Jeff Pollard of Forrester, who was really talking about this idea of shadow AI. If you think about shadow IT, it's those BYOD devices that an employee brings in that the central IT department doesn't have visibility into. What's going on here is that a lot of technology companies are embedding generative AI into existing tools, in hopes of eventually getting customers to pay for it. So they embed it now at maybe a more basic level, and as the technology improves, and as they make more investment, 18 months from now they can start charging these customers for a premium version, once they've gotten used to using generative AI in the product and they like it. The challenge for the security department is that since these are tools the company already has, they have no idea where the generative AI is being embedded. And since it's not a new piece of technology, they have very little visibility into where this is happening, and how to wrap their arms around it and put your organization's generative AI policies in place for these technologies. So that's a big challenge that he sees emerging. Some other interesting conversations were about bringing generative AI earlier into the life cycle - taking it from the security operations center, remediation and incident response, and applying it to preventing and detecting threats - as well as conversations about constructing large language models: the benefits of a private language model, and how to train it with the sensitive data that organizations have - that PII, that IP - while at the same time keeping it secure.

Outside of the generative AI space, a lot of conversations on my end were about ransomware - an interesting divide, almost like that cyber poverty line. In the large enterprise space, the most sophisticated ransomware actors are moving away from traditional encrypting ransomware; many are moving toward encryptionless hacks to evade detection by law enforcement, to stay off the radar, and to help victims also avoid publicity, if that's something they want. And they've found that they're able to get the payoffs that they want without the time and complexity of having to lock down all the systems across the organization. Below that poverty line, small and midsize businesses are getting increasingly targeted; the barriers to entry for small ransomware actors are pretty low. A lot of this stuff is just out there. And given that the cost is low, even hitting up dentist offices or law firms for a couple hundred dollars a pop is worth it, since there's not much upfront expense for these groups.

In the DDoS world, I had a really interesting conversation with Cameron Schroeder, who oversees cyber enforcement in the U.S. Attorney's Office and who's going after DDoS for hire - which is really the services that some financially motivated actors are using, but it's also a lot of teenagers who are just tapping into them, trying them for gaming purposes, trying to stifle their opponents. And she was talking about the challenges of just trying to arrest your way out of this, and ways that they can partner with law enforcement, with the private sector, with academia to try to deter this behavior. Because the fear is: if you have teenagers who are doing DDoS now, what types of cybercrime are they involved in in their 20s or 30s?

And I'm going to call this one a case study: I spoke with a security strategist over at GitHub about rolling out multifactor authentication to their entire developer community. He talked about some of the challenges that they encountered upfront in terms of user interface and user lockout; talked about some of the different types of MFA and how they're trying to steer people away from those text-based notifications toward more secure second factors, like YubiKeys or physical keys, or prompts from the mobile phone itself. And he also talked about the need to verify that second factor: when people are just trying to initiate 2FA because they're being forced to, they'll choose whatever is easiest as a second factor, and they'll forget about it very soon thereafter. What they found is that by circling back 20 days later and saying, "Hey, this is the second factor you chose - is this still what you want? Do you still remember what it is?" - actually, 25% of users switched their second factor at that time. So that he's found to be a very effective way of avoiding having users locked out down the road. Those are some of my favorite conversations from the event. I can't wait to share the videos with all of you.

Tom Field: Exactly. I've got to point out something. Before we arrived there, Mike and I were talking about what we thought the big themes were going to be. Of course, we talked about generative AI and ransomware, and Michael said SEC compliance was going to be big, because everyone's talking about it. So with the first guest we had, I asked about SEC compliance - you'd have thought I was talking about U.S. college athletics. It was a very different discussion.

Anna Delaney: There could be you teaching him something, then.
That's interesting - rich conversations, rich takeaways. I really appreciate that, and we can't wait to see the videos. Back to AI for a moment: did you get a sense from the experts you spoke with that organizations are actually prepared to integrate these new innovations in AI? And what do they need to do to get their house in order before implementation?

Tom Field: I would say that the word of the day is proof of concept. I think a lot of organizations are there right now, trying to figure out exactly what they can be doing with this and what they shouldn't be doing with it. And a lot of this is in the basement still. But Michael, I'll defer to you.

Michael Novinson: I would tend to agree. I think the conversations have leveled out a bit. Some folks are still talking about how hackers can use it to write emails that are more convincing if they're not English speakers, and people are recognizing its shortcomings in terms of producing new and novel malware. But in terms of how to apply it, it seems like it's still maybe some larger enterprises that are exploring those either open-source or proprietary large language models.
But I think it's still really early days, both on the IT side in terms of uses, and then on the security side in terms of how to safeguard it.

Tom Field: That said, this horse is out of the barn. I have never seen a technology take off and go from zero to 60 as quickly as generative AI has over the course of this calendar year. It is not going away. It's something that we are all going to be dealing with in one way or another, and something that we, as an organization, ISMG, are going to address.

Anna Delaney: Sure. Well, as I said, we look forward to those videos - can't wait. Thank you. Matt, moving on to out-of-control hacker teenagers. It seems that the U.S. government is studying the methods of amateur hackers, many of them teenagers with little technical training. But the reason they're doing that is to learn from them, because they've been so adept at breaching large targets. Tell us more.

Mathew Schwartz: Yes, so the goal here is, as you say, to learn. I don't know if we'd say from them so much as to warn others of their impending doom - that sounds a bit grandiose, but just to pull back the lens here for a sec. We have in the United States now a Cyber Safety Review Board that was launched by executive order, signed into existence by President Joe Biden in 2021. It took a little while to get going, but last year it issued its first report, into the Log4j vulnerability Log4Shell. It looked at how that was an endemic vulnerability and made a number of recommendations around getting those sorts of vulnerabilities identified and patched, and around things that need to happen with the software supply chain to improve basic cybersecurity resilience. And that is the theme for these hacker teenagers, as you note, in what is now the second report from the Cyber Safety Review Board - which is modeled, not identically, but modeled on the NTSB, the National Transportation Safety Board, which looks into airplane crashes, aeronautical disasters, that sort of stuff. More on that in a moment.

So what did we learn from these hackers, many of whom were teenagers, loosely affiliated - a group of about eight to 10 of them who came together under the banner of Lapsus$ - with a U.S. dollar sign, just to look extra scary? They operated from late - I don't know - September 2021 until late 2022. Some of them have been arrested - alleged members of the group have been arrested, but they haven't all been - but it looks like the group has probably wrapped up its activities. So what can we learn from how this loosely affiliated hacker group, some of them teenagers, was able to compromise dozens of well-defended companies using low-cost, low-complexity attacks? That is the mission statement as encapsulated by the Cyber Safety Review Board's deputy head, who is Google's security chief, Heather Adkins. And the board's led by an under secretary at the Department of Homeland Security; they've got oversight of all this. And they said something similar, but I thought they really nailed it here: you have a group of attackers who have taken down lots of organizations using very low-level attacks.

For example, just to pick up on a couple of the themes that have been sounded in the course of this episode: multifactor authentication. That's one of the big takeaways from this report. Not everybody that this group went after fell victim, and the ones that didn't fall victim - the ones we don't even know about - are, by and large, the ones that had multifactor authentication. And not just multifactor authentication, but the ones who were using, like Michael said, YubiKeys - so a dedicated piece of hardware - or, what they recommend, smartphone apps, so that you're generating - like with Google Authenticator; other apps are available - a one-time code on the device itself. They criticized - that's a strong word, but they criticized - anyone who's still relying on SMS-based authentication. You might remember that NIST called SMS-based authentication, where you get a text message with a one-time code, deprecated, I think back in 2016. And yet - I don't know how many services you use - certainly, I am using a number of services, including from my banks, which rely on a one-time code that they send me. Lapsus$ was good at SIM swapping, so they were able to clone people's mobile phone numbers, cell phone numbers, and intercept their one-time codes. And they used that to good effect to break into companies.

They also made wide use of initial access brokers, as do other kinds of cybercrime groups. And this report isn't just looking at Lapsus$; it's looking at Lapsus$ and similar kinds of groups. With initial access brokers, for 50 bucks, 500 bucks, maybe $5,000, depending on who you're going after, you can buy access to an organization. And if you want to take them down, extort them, deploy crypto-locking malware if you're a ransomware group, this is a good return on investment, and groups are using it because it works. So those are just a couple of the takeaways from the Cyber Safety Review Board, urging organizations, saying, "Look, these are teenage hackers; this is low-cost, low-complexity; please stop them." Because it's not just teen hackers who use these tactics. Anyone who wants to break into your network - a more audacious cybercrime group, organized crime, Mafia-type people, nation-states like North Korea or Russia - will use as little as possible to get into your network. There are teenagers who can do it; much worse is on your agenda. So please get your house in order. That's the big takeaway here from their latest report.

Anna Delaney: Very clear. Just tell us a bit more about the Cyber Safety Review Board. You mentioned it was set up in 2021, tasked with reviewing major cybersecurity incidents impacting the U.S. - but how much teeth do they actually have, and what happens to their investigations and recommendations?
Mathew Schwartz: They are in the business, I'd say, of gentle nudging. I'd say they have no teeth at all, which is the unfortunate part here. We have a legislative and regulatory environment in the United States where, in certain regulated industries like healthcare, or if you're a publicly traded company, there are enforcement measures that can be brought against you. But in general, there's nothing forcing people to do any of this stuff. It's not like you're an airplane manufacturer and you've done something that's causing planes to crash, and legally you're required to fix it. We're not seeing the same thing here. There have been moves by the Biden administration, with its cybersecurity strategy, to try to create - as part of this overarching strategy - some liability. That has not been warmly received, and the Biden administration acknowledges it could be two decades before they even get this in place. So we are seeing good moves in good directions. The first report looked at software supply chain problems; that report came out a year ago, and certainly the problem hasn't been fixed. Hopefully, it'll give some impetus for CISOs. Hopefully, they can drop this report on somebody's desk and say, "My authenticator-based multifactor authentication program, please." Hopefully, that'll help. And then we have a third report that's going to be coming out, on the heels of the massive hack of Microsoft Exchange Online, and the board's going to be looking at that. And I should emphasize, again, it's a public and private board; it has excellent people on it, excellent insights, really great reports. They're going to be looking at the massive hack of Microsoft Exchange Online, as well as other cloud environments. But as you say, some of the initial reaction from the cybersecurity community has been, "Boy, I wish they had subpoena power," or, "I wish they could hold people to their findings."

Anna Delaney: Matt, thorough insight, as always - thank you so much. Michael, back to you. Security vendor Check Point has snapped up Perimeter 81 for $490 million. Tell us about this deal.

Michael Novinson: Absolutely. It's a fascinating acquisition, from both a technology as well as a financial standpoint. Let's start by talking a little bit about the technology here.
358 00:22:00.437 --> 00:22:03.926 So Perimeter 81 is a relatively recent company, founded in the 359 00:22:03.983 --> 00:22:07.415 past half decade, and they really focused initially on the 360 00:22:07.472 --> 00:22:11.019 zero trust network access piece of the equation, and they were 361 00:22:11.076 --> 00:22:14.737 focused on making it accessible to the masses, since this was seen 362 00:22:14.794 --> 00:22:18.398 as a critical component of SASE. And you have something that, in 363 00:22:18.455 --> 00:22:21.773 terms of time required to implement, in terms of cost and 364 00:22:21.830 --> 00:22:25.148 manpower required to implement, was something that was only 365 00:22:25.205 --> 00:22:28.580 available to large enterprises. They really streamlined the 366 00:22:28.637 --> 00:22:32.126 process of bringing ZTNA to the mid-market, to the SMBs, with 367 00:22:32.184 --> 00:22:35.673 deployment in an hour rather than in weeks, with a deployment 368 00:22:35.730 --> 00:22:39.391 that's largely automated, with a deployment that requires 369 00:22:39.448 --> 00:22:42.938 almost no hardware. So it made ZTNA much more accessible, and 370 00:22:42.995 --> 00:22:46.370 this was really their claim to fame. More recently, they 371 00:22:46.427 --> 00:22:49.573 tried to broaden out their portfolio. They did add some 372 00:22:49.630 --> 00:22:53.120 secure web gateway capabilities over the past year. But where 373 00:22:53.177 --> 00:22:56.838 they're well-regarded is in that ZTNA space; they were actually named 374 00:22:56.895 --> 00:23:00.441 by Forrester in 2021 as a leader in ZTNA, alongside vendors who 375 00:23:00.499 --> 00:23:03.931 are much larger than them. So Check Point here is trying to 376 00:23:03.988 --> 00:23:07.420 figure out how they want to play in this SASE - secure access 377 00:23:07.477 --> 00:23:10.966 service edge - market.
So they announced that they're going to 378 00:23:11.024 --> 00:23:13.998 be building out an SD-WAN capability of their own in 379 00:23:14.055 --> 00:23:17.259 February, which is notably behind Fortinet, Palo Alto 380 00:23:17.316 --> 00:23:20.634 Networks and Cisco, all of which have been doing SD-WAN for a 381 00:23:20.691 --> 00:23:24.066 number of years. And they do have some capabilities on that 382 00:23:24.123 --> 00:23:27.612 security service edge side. But they were not recognized as a 383 00:23:27.669 --> 00:23:30.816 leader in SSE by Gartner earlier this year, unlike Palo 384 00:23:30.873 --> 00:23:34.190 Alto Networks and Cisco, which were toward the top of that 385 00:23:34.248 --> 00:23:37.623 quadrant. So Check Point here got themselves some strong 386 00:23:37.680 --> 00:23:41.283 ZTNA capabilities. But the thing is that while Perimeter 81 was 387 00:23:41.341 --> 00:23:44.830 very good, they are narrow, and at a time when people are talking about 388 00:23:44.887 --> 00:23:48.491 single-vendor SASE, Perimeter 81 couldn't even do single-vendor 389 00:23:48.548 --> 00:23:52.209 security service edge. They were partnering for the CASB piece of 390 00:23:52.266 --> 00:23:55.756 it and relying on partnerships for some of the other ancillary 391 00:23:55.813 --> 00:23:59.416 capabilities, like remote browser isolation. So it gives Check 392 00:23:59.474 --> 00:24:03.020 Point some really good expertise in certain pieces of security 393 00:24:03.077 --> 00:24:06.681 service edge. And then what they're going to need to do is integrate 394 00:24:06.738 --> 00:24:10.170 that with some of their native capabilities to build up that 395 00:24:10.228 --> 00:24:13.889 security service edge suite on their own. Neither Perimeter 81 396 00:24:13.946 --> 00:24:17.492 nor Check Point even made the security service edge Magic 397 00:24:17.549 --> 00:24:21.039 Quadrant that Gartner put out earlier this year.
There are 10 398 00:24:21.096 --> 00:24:24.356 vendors in there; neither of them was there. The question is 399 00:24:24.414 --> 00:24:27.789 going to be: If you bring the two of them together, do they 400 00:24:27.846 --> 00:24:31.221 make an appearance on that leaderboard? And if so, how high 401 00:24:31.278 --> 00:24:31.450 up? 402 00:24:31.900 --> 00:24:34.120 Anna Delaney: You obviously have been covering many of these M&As 403 00:24:34.210 --> 00:24:37.000 for a long time now. So what's particularly unique about this 404 00:24:37.000 --> 00:24:38.950 case from a financial perspective? 405 00:24:39.560 --> 00:24:42.440 Michael Novinson: Absolutely. And that's what caught a lot of 406 00:24:42.440 --> 00:24:46.940 people's attention. People have been wondering - with 407 00:24:47.300 --> 00:24:50.330 the market downturn we've seen - what does it mean for startups? 408 00:24:50.330 --> 00:24:52.790 And there have been a lot of attempts to kick the can down 409 00:24:52.790 --> 00:24:55.580 the road: people have done debt financing, people have extended 410 00:24:55.580 --> 00:24:58.190 the runway through layoffs. But at some point, people are going 411 00:24:58.190 --> 00:25:00.470 to need an exit event, whether that's another equity round, 412 00:25:00.470 --> 00:25:03.080 whether that's going public or whether that's being sold, and 413 00:25:03.080 --> 00:25:06.260 at what value is that going to happen. So what's fascinating 414 00:25:06.260 --> 00:25:09.710 about this Perimeter 81 case is that they're actually a unicorn, 415 00:25:10.010 --> 00:25:14.660 meaning a $1 billion U.S. valuation. Just 14 months ago, 416 00:25:14.960 --> 00:25:17.720 in June of 2022, they revealed their unicorn status at RSA 417 00:25:17.720 --> 00:25:21.770 Conference that year. So now, 14 months later, they're worth 49%.
418 00:25:21.800 --> 00:25:24.740 This transaction was $490 million, so they're worth not 419 00:25:24.740 --> 00:25:27.410 even half of what they were worth 14 months ago. That's not 420 00:25:27.410 --> 00:25:30.200 because their business was shrinking or anything like that. 421 00:25:30.200 --> 00:25:33.140 It's just how the market values them now, and we're going to see 422 00:25:33.140 --> 00:25:36.080 a major trickle-down here. We saw Cybereason raise a new 423 00:25:36.080 --> 00:25:38.840 round of funding; their valuation was down 90% in order 424 00:25:38.840 --> 00:25:43.820 to get that money. And the reckoning is here: people have 425 00:25:43.820 --> 00:25:45.560 extended the runway as much as they can, and 426 00:25:45.560 --> 00:25:47.990 investors and 427 00:25:47.990 --> 00:25:51.140 companies are going to have to accept serious downgrades. 428 00:25:51.140 --> 00:25:54.290 People were valued in a certain way, which was focused 429 00:25:54.290 --> 00:25:57.260 purely on growth and not necessarily on profitability or 430 00:25:57.260 --> 00:26:00.380 on exit strategies. And companies are worth a lot less 431 00:26:00.380 --> 00:26:03.740 now than they were a year ago. And we're not just talking like 432 00:26:04.520 --> 00:26:08.060 15-20%, which we've seen with companies like OneTrust or 433 00:26:08.060 --> 00:26:12.320 companies like Snyk that took a small valuation cut to raise more 434 00:26:12.320 --> 00:26:15.380 money; we're going to see serious, meaningful valuation 435 00:26:15.380 --> 00:26:18.800 cuts. And then the trickle-down effect is: what does 436 00:26:18.800 --> 00:26:23.240 that mean in terms of VCs and private equity firms? The 437 00:26:23.240 --> 00:26:26.720 folks who led that billion-dollar funding round made 49 438 00:26:26.720 --> 00:26:29.990 cents on their dollar; that's not great.
So what does this 439 00:26:29.990 --> 00:26:33.110 mean in terms of willingness to invest in mid-stage and late-stage 440 00:26:33.140 --> 00:26:36.410 startups? Early stage is probably still secure, since the exits 441 00:26:36.410 --> 00:26:38.450 are far enough away. But for those mid-stage and late-stage 442 00:26:38.480 --> 00:26:42.800 funding rounds, how willing are people to go in and do these 443 00:26:42.800 --> 00:26:45.710 right now, given the depressed valuations of the market? And 444 00:26:45.920 --> 00:26:50.240 how willing are companies to say, "I guess we are worth only 445 00:26:50.240 --> 00:26:53.510 a fraction of what we thought we were worth 12, 18, 24 months ago"? 446 00:26:53.510 --> 00:26:56.510 So we are getting some data points here. And I think we're 447 00:26:56.510 --> 00:26:59.840 going to see a lot more exit events, whether it's a sale or 448 00:26:59.840 --> 00:27:02.090 another funding round, in the months to come. 449 00:27:03.170 --> 00:27:04.040 Anna Delaney: Excellent overview. 450 00:27:04.000 --> 00:27:06.490 Tom Field: What I heard last week, Michael, was that tourists 451 00:27:06.490 --> 00:27:11.530 have left the investment market. And that leaves the residents 452 00:27:11.530 --> 00:27:14.590 there to pick up the slack. It's going to be a different market. 453 00:27:14.590 --> 00:27:17.200 Michael Novinson: And I can guess exactly who said that. I 454 00:27:17.200 --> 00:27:19.840 would agree on some of these companies and some of these 455 00:27:19.870 --> 00:27:22.630 folks who gave these really high valuations. They do not have 456 00:27:22.630 --> 00:27:25.270 a deep background in security and have not really made too many 457 00:27:25.270 --> 00:27:27.070 investments since. So, absolutely correct. 458 00:27:28.019 --> 00:27:30.839 Anna Delaney: Thank you. Well, finally, and just for fun, 459 00:27:31.319 --> 00:27:35.459 imagine AI as a pet.
So what tricks or skills would you teach 460 00:27:35.459 --> 00:27:37.829 it to help protect your digital realm? 461 00:27:37.000 --> 00:27:40.330 Tom Field: Well, I would say if you're talking about large 462 00:27:40.330 --> 00:27:44.230 language models, AI already knows how to fetch. So that 463 00:27:44.230 --> 00:27:47.980 trick is off the charts. But I would also say that AI as a 464 00:27:47.980 --> 00:27:52.960 pet is most likely a cat; in that case, it doesn't have masters, 465 00:27:53.080 --> 00:27:54.040 it has attendants. 466 00:27:57.100 --> 00:27:57.700 Anna Delaney: Like it! 467 00:27:59.260 --> 00:28:01.540 Mathew Schwartz: I would have it alert me to strangers or 468 00:28:01.540 --> 00:28:05.980 strange behavior or strange activity - stranger danger. 469 00:28:06.610 --> 00:28:09.580 Anna Delaney: Love that. Michael? 470 00:28:11.280 --> 00:28:13.050 Michael Novinson: This one stumped me. I was thinking about 471 00:28:13.050 --> 00:28:16.290 playing dead, and I know that works for natural predators in 472 00:28:16.290 --> 00:28:19.710 the wild. I wonder if, like honeypotting or deception 473 00:28:19.710 --> 00:28:23.730 technology, there are ways that AI could be played out in the 474 00:28:23.730 --> 00:28:27.540 wild against cyber adversaries and cause them to move on to 475 00:28:27.840 --> 00:28:29.040 more appealing targets. 476 00:28:29.540 --> 00:28:33.200 Anna Delaney: He says that with the dinosaur behind him - a T. rex. 477 00:28:34.370 --> 00:28:39.260 I've got "patch pounce." So upon my command, my AI would pounce 478 00:28:39.290 --> 00:28:42.320 on software vulnerabilities, finding and applying the latest 479 00:28:42.320 --> 00:28:46.940 patches to keep my systems secure. So one can only dream. 480 00:28:48.410 --> 00:28:52.010 Maybe not too long, though. Well, Tom, Michael, Matt, I 481 00:28:52.010 --> 00:28:56.840 really enjoyed this. Thank you so much, and thanks so much for 482 00:28:56.840 --> 00:28:57.920 watching.
Until next time,