Mathew Schwartz: Hi, I'm Mathew Schwartz with Information Security Media Group. It's my pleasure to welcome back to the ISMG studio Wendi Whitmore. Wendi is senior vice president of Unit 42 at Palo Alto Networks. Did I get all that correct?

Wendi Whitmore: 100%, yes.

Mathew Schwartz: Excellent. Thank you so much for returning to our studios to share your insights into threat intelligence, ransomware and other topics with us today.

Wendi Whitmore: I'm excited to be here. Thank you.

Mathew Schwartz: So: ransomware, threat intelligence and what the cybercriminals are up to. I want to start with a session you did at RSA Conference 2023. If I recall correctly, it was on real-world threat intelligence and incident response - lessons learned. What were some of the themes, and especially the takeaways, that you highlighted?

Wendi Whitmore: You know, I have to say, that was one of my favorite sessions I've probably ever participated in, in large part because we gave the audience a lot of homework - in particular, homework related to being more prepared for incident response. We also got tweeted as the calmest panel at RSA, in a very complimentary way, because we were talking about pretty chaotic times, right? We focused a lot on what the first 24 hours of an incident response investigation are like: What are you trying to figure out, and what do you need to avoid doing? So, things like being skeptical of the information you're presented with, and putting equal effort into proving and disproving it - not going in with a bias that you already know what the answer is. Being curious, because curiosity is such a skill in our field - being able to take a piece of data, not only be skeptical of it, but turn it on its head, ask more questions and figure out: Are there more pieces of this puzzle that we need to put it all together, and what are they? And then I also recommended what I think is one of our core strengths at Unit 42, which is being calm, right?
A lot of times you're going into a situation where it's a Friday night and you've got people who have been awake for 24-plus hours, because by the time they decide to engage an outside team, they're often at their wits' end: "Hey, we've been trying to figure out these solutions, and now we can't; we need reinforcements." And it's chaotic. So you come in with essentially a bit of a therapeutic angle: "Hey, we've done this before. Here's what we need to do. Here's the specific game plan we need to follow over the next 12 to 24 to 72 hours." You establish that credibility, but you bring a sense of calm to the situation. So we covered a whole lot during that session.

Mathew Schwartz: It sounds like you covered an awful lot. I'm wondering, in particular, about that need to think critically about what you're being presented with - to actually chase it down to prove or disprove it, and also not to deny it. It's almost like the stages of grief: not denying it from the outset, but saying, "Okay, what if?" and then interrogating that. When you look at incidents, do you often see people saying, "No, this can't be true," and then 24 or 48 hours later, "Oh, it is true"?

Wendi Whitmore: Oh, absolutely. Whether it's a big, multi-nation situation such as Log4j or SolarWinds, for example, or whether it's confined to one organization, there's the challenge I mentioned: It's chaotic. And it's not only because of the emotions running through it; it's because the data is changing. It's so dynamic during that time period. The worst things an organization can do are two: One, what you mentioned, which is to deny that there's a situation occurring and not actually look into it. Conversely, though, what they can do is share too much data too quickly, before they have the facts. I mentioned that this data is dynamic, especially during that first 24 hours. So what you want to do is share the information needed to protect clients and to meet regulatory obligations - to abide by laws, for example.
But what you don't want to do is share information that you're then going to have to backtrack on, because you've found new information and everything has changed.

Mathew Schwartz: So practice, I think, would maybe be a takeaway here. Plan for how you're going to respond so that you can have the calm demeanor that you clearly evinced during the panel.

Wendi Whitmore: Absolutely. And with regard to practice, we gave the audience some specific homework. In particular, Lesley Carhart, who I presented with, gave them a specific action. I had talked about how, when you're in these types of situations, you actually need to think about and plan for people who need to go home at some point, who need to get some sleep, who maybe after a week onsite need to rotate out with another team member so they can go home and spend some time with their family. And Lesley challenged the audience to actually make that part of your incident response playbook - to have it documented: These are the actions we're going to take, here are the shifts people are going to work, here are the escalation paths, and here's how we're going to get through this crisis. Because it's not just the first 24 or 48 hours, right? Oftentimes these can last for weeks, and in big cases even months. So we've got to be able to plan for that.

Mathew Schwartz: It's great to hear takeaways from the incident response engagements you've worked on. I think that dovetails so nicely with threat intelligence, because just because you're getting the intelligence doesn't mean you've got the processes and procedures in place so that your people know what they're supposed to do. And so I want to shift now to ransomware, because obviously it's not the only threat facing organizations, but it's clearly a very innovative one on the part of criminals, and it points to a lot of the challenges they're having to deal with these days. So fill me in a bit on what you're seeing on the ransomware front - in terms of trends, new tools being used or strategies by groups, for example - that the organizations you advise are having to respond to now.

Wendi Whitmore: Yeah, in regard to ransomware, it continues to be super prevalent, right?
It is still everywhere. The good news is that many organizations are being more effective at dealing with it. One of the interesting trends is a move toward extortion combined with threats, and I'll explain what I mean there. It used to be that attackers were encrypting data and asking to be paid to have it restored. And then they might threaten to extort you; they might threaten to wage a DDoS attack against you; they might threaten to share that information with your most sensitive clients. Today, what they're by and large doing is moving just toward that second stream, which is extortion: "I'm going to steal the data, and then I'm going to ask you to pay me so that I don't release it on the internet." What they're not doing as much is encrypting the data, because that takes a lot of time, money and effort. They've figured out that, man, this work is kind of a pain in the butt, right? So they're saying, "Hey, I'm going to steal it, and then ask you to pay me" - and in many cases force you to pay - "money on the back end." In combination with that, though, we're seeing this threatening element. It is not uncommon today for a CEO to be reached out to directly - a CEO, a CISO, a CFO, maybe the head legal counsel for a company - initially via company email and other communication channels. But if those messages go unanswered, then attackers often will go to the CEO's spouse: find their social media account, figure out who they are and message them directly. And the same with children and other family members of executive staff. So it's really become an interesting element. You can imagine going home and having your spouse ask you, "Hey, why haven't you dealt with this situation at work yet?" That's probably not something many executives want to hear. And so attackers are continuing to leverage time as a pressure valve to essentially try to force decisions faster.

Mathew Schwartz: They are such experts, unfortunately, at the psychology of pressure.
I hadn't heard of this before - going after the spouse or the kids, having them come to you and say, "What do I need to ask you about a ransomware attack, Mom, Dad?" And you're going, "Oh, gosh, what's going on here?"

Wendi Whitmore: Right. "You've got to be kidding me. I have to deal with this at home now too?"

Mathew Schwartz: I know - on top of everything else. So there's been a change, I've been hearing, in the tooling as well. Not to get into the weeds on tech stuff, but I know Cobalt Strike, for example, has been a widely used - I mean, legitimate - tool, but one widely abused by ransomware groups. I'm hearing there's been a shift toward some other tools, I think to help complicate their attacks. What have you been seeing on this front? Is it Brute Ratel, I believe, that they're trying to use now?

Wendi Whitmore: Yeah, we absolutely continue to see Cobalt Strike. But Brute Ratel is a newer tool that's been released, with similar capabilities to Cobalt Strike but also more ability to obfuscate its traffic in Slack, Microsoft Teams and other types of social media applications, which makes it pretty challenging to detect. I think what we're also seeing is the reality that it could be a nation-state actor using that tool, it could be a legitimate red team or a pen tester using that tool, and it could also be a cybercriminal actor. So we're seeing a huge convergence of the tool sets, and really this mentality - I talked in my previous comment about attackers saying, "Oh, hey, I'm not going to encrypt the data, because that's just a lot of effort" - well, in the same sense, they're moving toward efficiencies: using whatever tool is available to get the job done, regardless of who created it.

Mathew Schwartz: And it's not the first time we've seen nation-state attackers using common tooling or tactics in order to try to obfuscate their identity, I suppose.

Wendi Whitmore: Absolutely, right? I mean, you think about it: It makes the defender's job more difficult to determine who's responsible, and that's a good thing for the attackers.

Mathew Schwartz: And how concerning or not that sort of incident may have been.
Well, I'd like to shift then, as a final question, to the role of partnerships in advancing cyber diplomacy - another panel on which you participated; you were very busy at RSA this year. Partnerships have been, are being and will be key, I think, to combating such things as ransomware. What were some of the takeaways you might highlight from that panel, or advice that you have based on it?

Wendi Whitmore: You know, I think what I would leave you with is that there's no one tool in the toolkit, right? There's no one source of data that's going to be the silver bullet for solving this problem. What it really takes is technology working in tandem and in heavy alignment with diplomacy and with sharing of information - not only between public and private partnerships, but between private-sector competitors, for example. And within government, sharing that not only through interagency collaboration, but with allies throughout the world. All of that, really, is what it's going to take for us to drive the ball forward.

Mathew Schwartz: Okay. And we've been seeing that, I believe, with a number of initiatives at the government level, and I think increasing private initiatives as well, to share more in the way of the threats that are being seen. It's a long-standing thing that security researchers have done, but I believe it's being done more programmatically now. I mean, cause for hope here?

Wendi Whitmore: I'm optimistic about it. No doubt about it.

Mathew Schwartz: Excellent. Well, Wendi, it's always a pleasure to have you in our studios. Thank you so much for your time and insights today.

Wendi Whitmore: Thank you. Have a great time.

Mathew Schwartz: Thank you. I'm Mathew Schwartz with Information Security Media Group. Thank you for joining us.