WEBVTT 1 00:00:00.240 --> 00:00:03.240 Anna Delaney: Hello and welcome to the ISMG Editors' Panel. I'm 2 00:00:03.240 --> 00:00:06.030 Anna Delaney. And this is our weekly roundup of the top 3 00:00:06.030 --> 00:00:10.200 stories in cybersecurity. And I'm delighted to be joined by 4 00:00:10.200 --> 00:00:13.380 some of my excellent colleagues, Tom Field, senior vice president 5 00:00:13.380 --> 00:00:16.800 of editorial; Marianne Kolbasuk McGee, executive editor of 6 00:00:16.830 --> 00:00:20.160 HealthcareInfoSecurity; and Matthew Schwartz, executive 7 00:00:20.160 --> 00:00:23.460 editor of DataBreachToday and Europe. Wonderful to see you 8 00:00:23.460 --> 00:00:23.760 all. 9 00:00:24.420 --> 00:00:25.740 Tom Field: Wonderful to be seen. Welcome. 10 00:00:26.580 --> 00:00:27.390 Matthew Schwartz: Great to be here. 11 00:00:27.000 --> 00:00:31.410 Anna Delaney: Tom, I think I recognize that view. Are you in 12 00:00:31.410 --> 00:00:32.640 New York, perhaps? 13 00:00:32.930 --> 00:00:36.380 Tom Field: I was in New York. The contrast is, I was on the 14 00:00:36.410 --> 00:00:40.040 15th floor of the hotel that was across from the venue where we 15 00:00:40.040 --> 00:00:44.390 had our New York Summit this week. And opening my window, all 16 00:00:44.390 --> 00:00:47.390 I could see were other windows and other buildings. Now, 17 00:00:47.600 --> 00:00:50.630 contrast that to a colleague who will remain nameless, who was on 18 00:00:50.630 --> 00:00:53.510 the 30th floor above and was looking at the Empire State 19 00:00:53.510 --> 00:00:53.810 Building. 20 00:00:54.530 --> 00:00:55.940 Anna Delaney: I think that nameless colleague was actually 21 00:00:55.940 --> 00:01:01.700 on the 33rd; just upgraded that, but yes, what a view to have 22 00:01:01.910 --> 00:01:08.090 behind me. I'm just looking out still. It's great. Marianne? 23 00:01:08.270 --> 00:01:11.060 Marianne McGee: I'm in Plymouth Harbor, which is like 20-25 24 00:01:11.060 --> 00:01:14.060 miles from where I live. Went there on Memorial Day, I don't 25 00:01:14.060 --> 00:01:16.910 know if you can see the Mayflower II behind me. 26 00:01:19.340 --> 00:01:20.210 Anna Delaney: It's gorgeous. 27 00:01:21.500 --> 00:01:24.020 Marianne McGee: Yeah, you can't see it. But it's a replica of 28 00:01:24.020 --> 00:01:29.300 the original Mayflower from 1620, when the Pilgrims came 29 00:01:29.300 --> 00:01:31.130 over from England. 30 00:01:32.120 --> 00:01:34.370 Tom Field: The original might be behind me, but who'd know? 31 00:01:36.530 --> 00:01:39.380 Anna Delaney: Matt, very funky street art behind you. 32 00:01:40.070 --> 00:01:42.470 Matthew Schwartz: Flashing back to the RSA Conference in San 33 00:01:42.470 --> 00:01:46.280 Francisco, there was just some good street art in the region of 34 00:01:46.280 --> 00:01:48.800 the Moscone. So I couldn't resist. 35 00:01:49.400 --> 00:01:51.050 Anna Delaney: There was quite a bit of street art out there; 36 00:01:51.050 --> 00:01:55.310 urban artists taking advantage of those walls. 37 00:01:55.760 --> 00:01:58.220 Matthew Schwartz: Yeah, sketching scenes amongst the 38 00:01:58.730 --> 00:02:01.190 extreme wealth and dereliction block by block. 39 00:02:01.430 --> 00:02:06.080 Anna Delaney: Yeah. Well, I am in a plane.
I just thought with 40 00:02:06.080 --> 00:02:11.420 all the recent plane travel, this is the window view and it 41 00:02:11.420 --> 00:02:14.750 never fails to impress me — the clouds and the sunrises, the 42 00:02:14.750 --> 00:02:18.170 anticipation of arriving somewhere new or getting home. 43 00:02:19.160 --> 00:02:20.360 Matthew Schwartz: You're up in the air, Anna. 44 00:02:20.930 --> 00:02:21.950 Tom Field: And she's getting used to it. 45 00:02:23.480 --> 00:02:25.280 Anna Delaney: Well, after two years of being deprived, I 46 00:02:25.280 --> 00:02:25.700 think. 47 00:02:26.870 --> 00:02:29.450 Tom Field: I think by the time we do our next one of these, you 48 00:02:29.450 --> 00:02:31.820 might actually be speaking with a New York accent. 49 00:02:32.870 --> 00:02:36.830 Anna Delaney: Give me time, give me time. I'm working on it. So 50 00:02:36.830 --> 00:02:41.870 Tom, we were in New York for a reason, our Northeast Summit. 51 00:02:41.960 --> 00:02:44.210 How was it for you? What were the highlights? 52 00:02:44.810 --> 00:02:47.300 Tom Field: Anna, it was so terrific for so many reasons. 53 00:02:47.300 --> 00:02:52.220 First of all, New York has always been our foundational 54 00:02:52.310 --> 00:02:55.430 summit location. And we were back there in the city for the 55 00:02:55.430 --> 00:02:59.600 first time since the fall of 2019. So here's the opportunity 56 00:02:59.600 --> 00:03:03.020 to bring together a community that we haven't had a chance to 57 00:03:03.020 --> 00:03:07.640 see for a long time; meet new faces, familiar faces, continue 58 00:03:07.640 --> 00:03:11.000 relationships. And think about the topics that we were able to 59 00:03:11.000 --> 00:03:15.800 discuss over the course of a day. We're talking about ... you 60 00:03:16.190 --> 00:03:20.360 held a panel on the latest P2P fraud trends. We were able to 61 00:03:20.360 --> 00:03:23.660 talk about zero trust with the godfather of zero trust, John 62 00:03:23.690 --> 00:03:30.230 Kindervag. We were able to talk about supply chain security with 63 00:03:30.230 --> 00:03:35.810 Chris Wysopal of Veracode. We dug into the impacts of the 64 00:03:35.810 --> 00:03:40.220 Russian war in Ukraine that are being felt everywhere. So we 65 00:03:40.220 --> 00:03:45.710 were able to dig into the real topical areas with significant 66 00:03:45.710 --> 00:03:48.110 thought leaders in the industry. Think about it, we had the 67 00:03:48.110 --> 00:03:51.320 chance to sit down with Ari Redbord and Lisa Sotto for the 68 00:03:51.320 --> 00:03:54.710 first time ever. They had the chance to meet each other. We 69 00:03:54.710 --> 00:03:59.420 really did bring a pretty good who's who together for a day of 70 00:03:59.420 --> 00:04:03.110 just nonstop conversation. Add to that, we had, what, four 71 00:04:03.110 --> 00:04:06.800 different discrete roundtable discussions over the course of 72 00:04:06.800 --> 00:04:11.660 the day. Just an amazing event for me, so those are my 73 00:04:11.660 --> 00:04:13.160 takeaways. How about for yourself? 74 00:04:13.270 --> 00:04:14.920 Anna Delaney: That's an excellent overview! I really 75 00:04:14.920 --> 00:04:18.850 enjoyed watching your one-on-one interviews with Claire Le Gal 76 00:04:18.850 --> 00:04:22.570 and John Kindervag. Always amusing, just because with those 77 00:04:23.200 --> 00:04:26.830 ...
they are quite personal and you get to talk about their 78 00:04:26.950 --> 00:04:29.710 background and their personal journeys, their professional 79 00:04:29.710 --> 00:04:34.060 journeys. So with Claire Le Gal, of course, she leads the cyber 80 00:04:34.180 --> 00:04:38.380 and fraud division at MasterCard, and we talked about what she's doing to tackle 81 00:04:38.380 --> 00:04:42.250 cybercrime with her team, but also, it's all about breaking 82 00:04:42.250 --> 00:04:45.910 down industry silos, in her opinion. And what's on her mind, 83 00:04:45.910 --> 00:04:48.610 what's scaring her in the future, well, it's quantum 84 00:04:49.120 --> 00:04:53.680 cryptography, but also cryptocurrencies. So they give 85 00:04:53.680 --> 00:04:56.740 her hope, but she knows that there are challenges ahead 86 00:04:56.740 --> 00:05:00.190 with those. Also, the Secret Service; how cool is it to be on 87 00:05:00.190 --> 00:05:02.800 a panel with the Secret Service or members of the Secret Service 88 00:05:02.830 --> 00:05:09.040 in New York and how they're investigating crypto-related 89 00:05:09.040 --> 00:05:13.270 crimes. And what's on their mind, well, Web 3 and DeFi. 90 00:05:13.270 --> 00:05:16.030 They're saying watch those. There are going to be challenges 91 00:05:16.030 --> 00:05:19.090 ahead and opportunities for criminals, and how can we have 92 00:05:19.090 --> 00:05:21.970 law enforcement and cybersecurity in the industry 93 00:05:21.970 --> 00:05:25.420 work together better? So those are my highlights. But yes, as you 94 00:05:25.420 --> 00:05:29.410 said, meeting people that we haven't met in person, who we've 95 00:05:29.410 --> 00:05:32.530 been Zooming with for a couple of years; and also the 96 00:05:32.530 --> 00:05:35.980 roundtable attendees as well, with a range of conversations. 97 00:05:35.980 --> 00:05:37.270 Loved every moment of it. 98 00:05:37.630 --> 00:05:39.970 Tom Field: It's such a relief not to have to say, 'excuse me, 99 00:05:39.970 --> 00:05:42.850 you're muted,' or 'oh, could you turn your camera on?' And just 100 00:05:42.850 --> 00:05:46.600 to have these conversations. I think another theme that you 101 00:05:46.600 --> 00:05:49.960 can't help but pay some attention to is just where is 102 00:05:49.960 --> 00:05:53.620 the economy going right now; there are so many warning signs 103 00:05:53.620 --> 00:05:57.010 out there. And I keep coming away from this feeling much like 104 00:05:57.010 --> 00:05:59.410 I did in the conversation you and I had the other day when we 105 00:05:59.410 --> 00:06:03.400 recorded a session, where I think, when you look ahead to the next 106 00:06:03.400 --> 00:06:09.850 six months to a year, the rich may not get richer, but those 107 00:06:09.850 --> 00:06:13.990 who are poor, in terms of cybersecurity, are going to have 108 00:06:13.990 --> 00:06:17.860 to get more secure. So I don't think that the things that we're 109 00:06:17.860 --> 00:06:21.550 talking about and the areas that we focus on are going to become 110 00:06:21.550 --> 00:06:25.840 any less of a priority. Certainly, the adversaries aren't going to 111 00:06:25.840 --> 00:06:28.930 let up because there's an economic downturn. So I think 112 00:06:28.930 --> 00:06:32.230 we're going to be continuing to discuss these issues with these 113 00:06:32.230 --> 00:06:36.850 and other thought leaders as we move forward.
If I may, I would 114 00:06:36.850 --> 00:06:41.620 like to share just one highlight — a visual highlight of the day 115 00:06:41.650 --> 00:06:43.810 — and we're going to be looking at a Delaney photo here I 116 00:06:43.810 --> 00:06:44.260 believe. 117 00:06:46.490 --> 00:06:47.630 Anna Delaney: Spanners in the works. 118 00:06:49.160 --> 00:06:51.980 Tom Field: No spanners here, but it's not a photo of you, it's a photo by 119 00:06:51.980 --> 00:06:52.250 you. 120 00:06:52.730 --> 00:06:53.420 Anna Delaney: Oh, great. 121 00:06:53.000 --> 00:06:54.410 Tom Field: As you get toward the end of the day, and you're going 122 00:06:54.470 --> 00:06:55.670 session to session (and Anna and I didn't take a break; we 123 00:06:55.700 --> 00:06:57.080 went on straight through the day), you get a little bit 124 00:06:57.080 --> 00:07:03.500 punchy. And one thing that I had noticed when we were on stage is, 125 00:07:03.650 --> 00:07:12.680 yes, the staff had put water out there for the speakers. But they 126 00:07:12.680 --> 00:07:17.930 were cardboard cartons of water. Box of water; boxofwater.com, I 127 00:07:17.930 --> 00:07:20.270 believe it was called. And finally, toward the end of the 128 00:07:20.270 --> 00:07:23.240 day, I picked up one of these and started reading the text on 129 00:07:23.240 --> 00:07:26.600 it. And the text was not about missing people or anything like 130 00:07:26.600 --> 00:07:31.280 that. It was saying, if you take a photo of yourself with this 131 00:07:31.280 --> 00:07:35.240 box and post it with the appropriate hashtag, we will 132 00:07:35.240 --> 00:07:40.490 plant two trees in your name somewhere in a forest. And so, 133 00:07:40.940 --> 00:07:43.220 we did take the opportunity, right in the middle of the 134 00:07:43.220 --> 00:07:47.540 session by the way, to take a picture of John Kindervag. This 135 00:07:47.540 --> 00:07:49.550 is, I believe, an Anna Delaney photo. 136 00:07:49.640 --> 00:07:50.240 Anna Delaney: Yes, it is. 137 00:07:50.240 --> 00:07:53.840 Tom Field: John Kindervag holding up his box of water. So 138 00:07:53.840 --> 00:07:58.100 we're going to have two zero trust trees planted somewhere. 139 00:07:59.000 --> 00:08:01.730 Because of John Kindervag and Anna Delaney. That's one of my 140 00:08:01.730 --> 00:08:02.660 highlights of the event. 141 00:08:03.290 --> 00:08:05.690 Anna Delaney: Very good. I still need to put that hashtag on it. 142 00:08:05.690 --> 00:08:09.200 So you've reminded me, very good. Marianne, we have a 143 00:08:09.200 --> 00:08:12.590 healthcare summit soon. So you'll be in New York as well. 144 00:08:13.020 --> 00:08:17.670 Marianne McGee: Yeah, we have a great lineup. We have people 145 00:08:17.670 --> 00:08:21.450 representing different segments of the government, from the 146 00:08:21.480 --> 00:08:26.040 Department of Health and Human Services. We have the leader of 147 00:08:26.040 --> 00:08:29.490 medical device cybersecurity, Suzanne Schwartz, who will be 148 00:08:29.490 --> 00:08:35.970 speaking. We have the top HIPAA enforcement person under the 149 00:08:35.970 --> 00:08:40.740 Biden administration, Lisa Pino, who will be attending and speaking.
150 00:08:41.520 --> 00:08:45.810 We have Josh Corman, who recently completed a stint at 151 00:08:45.810 --> 00:08:49.140 CISA, representing the healthcare sector, who will be 152 00:08:49.140 --> 00:08:52.860 giving a keynote about, you know, some of the observations that he 153 00:08:52.860 --> 00:08:57.000 made during his time at CISA about the healthcare sector and the precarious position that 154 00:08:57.000 --> 00:09:03.300 it's in. We have Errol Weiss, who is the 155 00:09:03.780 --> 00:09:07.380 Chief Security Officer of the Health Information Sharing and 156 00:09:07.380 --> 00:09:14.160 Analysis Center. We have a lineup of other top-notch CISOs 157 00:09:14.160 --> 00:09:17.040 and other security leaders from medical device makers and 158 00:09:17.040 --> 00:09:21.630 healthcare organizations. It runs the gamut, so hopefully 159 00:09:21.630 --> 00:09:24.600 everybody will show up. Everyone that I just plugged here will be 160 00:09:24.630 --> 00:09:27.930 healthy and well and there. But that's what's planned. 161 00:09:27.000 --> 00:09:30.465 Anna Delaney: Marianne, you do attract the top names so that's 162 00:09:27.000 --> 00:09:35.850 Tom Field: If I may, one of your speakers actually showed up at 163 00:09:30.537 --> 00:09:31.620 an awesome job. 164 00:09:35.850 --> 00:09:38.040 our event this week, Marianne. So she was there, what, three or 165 00:09:38.040 --> 00:09:42.630 four weeks early. Anahi Santiago of ChristianaCare from Delaware 166 00:09:42.630 --> 00:09:43.350 and Philadelphia. 167 00:09:43.560 --> 00:09:45.030 Marianne McGee: Yeah, Anahi is great. Yep. 168 00:09:46.350 --> 00:09:48.450 Anna Delaney: So Marianne, Facebook has been in the news 169 00:09:48.450 --> 00:09:50.790 yet again this week. Tell us more. 170 00:09:50.960 --> 00:09:55.400 Marianne McGee: Yeah. As you know, or maybe you don't know, 171 00:09:55.460 --> 00:10:00.500 Facebook is now called Meta and, as you said, it has been in 172 00:10:00.500 --> 00:10:04.370 the news in recent days for a couple of privacy controversies 173 00:10:04.700 --> 00:10:08.270 involving allegations that the company is collecting consumers' 174 00:10:08.300 --> 00:10:13.010 sensitive health data through its Pixel tracking code. Pixel 175 00:10:13.010 --> 00:10:16.700 is a snippet of code used by organizations to track the 176 00:10:16.700 --> 00:10:20.480 website activities of users and to help improve targeted 177 00:10:20.480 --> 00:10:24.680 marketing and advertising. A proposed class action lawsuit 178 00:10:24.680 --> 00:10:28.460 filed in a California federal court alleges that Facebook is 179 00:10:28.460 --> 00:10:32.900 using Pixel to scrape patient data from more than 600 websites 180 00:10:32.900 --> 00:10:37.400 and patient portals of U.S. hospitals and medical providers, 181 00:10:37.670 --> 00:10:41.450 including data concerning patients' website encounters, 182 00:10:41.690 --> 00:10:44.780 ranging from setting up appointments with doctors to 183 00:10:44.780 --> 00:10:49.010 searching for information about various diseases. The lawsuit 184 00:10:49.010 --> 00:10:51.830 alleges that this is all happening without the knowledge 185 00:10:51.830 --> 00:10:56.420 or consent of individuals, in violation of various state and 186 00:10:56.420 --> 00:11:01.460 federal laws, including HIPAA.
Now, under HIPAA, a covered 187 00:11:01.460 --> 00:11:06.410 entity must have an individual's prior written authorization 188 00:11:06.410 --> 00:11:11.180 before the use or disclosure of protected health information can 189 00:11:11.180 --> 00:11:14.930 be made for marketing communications. The lawsuit 190 00:11:14.930 --> 00:11:18.770 alleges that through Pixel, Facebook is obtaining patient 191 00:11:18.770 --> 00:11:23.090 identifiers, including email addresses, IP addresses, the 192 00:11:23.090 --> 00:11:27.890 user's status as being a patient of a certain medical 193 00:11:27.890 --> 00:11:32.660 provider, as well as contents of communications relating to 194 00:11:32.660 --> 00:11:36.500 appointments that the patient has set up. It should be noted, 195 00:11:36.500 --> 00:11:40.310 though, that Facebook has faced similar lawsuits in the past. 196 00:11:40.520 --> 00:11:44.900 For instance, a class action lawsuit that was filed in 2016 197 00:11:44.930 --> 00:11:48.890 also alleged that Facebook violated various federal and 198 00:11:48.890 --> 00:11:52.580 state laws by collecting and using individuals' browsing data 199 00:11:52.820 --> 00:11:56.810 from healthcare-related websites. That federal lawsuit 200 00:11:56.810 --> 00:12:02.960 was dismissed in 2018. The U.S. Court of Appeals upheld a lower 201 00:12:02.960 --> 00:12:06.890 court's decision to dismiss the case, ruling that plaintiffs were 202 00:12:06.890 --> 00:12:10.280 barred from suing Facebook because they had agreed to be 203 00:12:10.280 --> 00:12:14.780 bound by Facebook's contract terms, which prevented the 204 00:12:14.780 --> 00:12:21.080 lawsuit. And in another Facebook controversy this week, the nonprofit 205 00:12:21.080 --> 00:12:24.050 investigative reporting organizations The Markup and 206 00:12:24.050 --> 00:12:27.950 Reveal alleged that Facebook, again through the use of Pixel, 207 00:12:28.160 --> 00:12:32.090 is collecting ultra-sensitive personal data about individuals 208 00:12:32.090 --> 00:12:36.320 considering abortions, enabling anti-abortion organizations to 209 00:12:36.320 --> 00:12:41.930 use that data as a tool to target and influence these 210 00:12:41.930 --> 00:12:46.370 people online, in violation of Facebook's own privacy 211 00:12:46.400 --> 00:12:51.770 policies. Now, privacy experts are warning that, in that case, 212 00:12:52.040 --> 00:12:55.730 individuals' data trails could be used against them if some 213 00:12:55.730 --> 00:12:59.150 states criminalize abortion following the expected decision 214 00:12:59.150 --> 00:13:04.430 by the U.S. Supreme Court to overturn Roe v. Wade. So far, 215 00:13:04.430 --> 00:13:07.790 Facebook has not responded to our requests for comment on the 216 00:13:07.790 --> 00:13:10.700 allegations. But Facebook reportedly says that its 217 00:13:10.700 --> 00:13:14.840 filtering systems detect and remove potentially sensitive 218 00:13:14.840 --> 00:13:17.900 information before it gets stored in its advertising 219 00:13:17.930 --> 00:13:22.910 systems.
So overall, the Facebook privacy controversies 220 00:13:22.910 --> 00:13:26.600 seem to fit into a theme that, in some cases, the fear of Big 221 00:13:26.600 --> 00:13:30.200 Brother government surveillance might actually come down to 222 00:13:30.200 --> 00:13:34.130 social media and other technology firms having access 223 00:13:34.130 --> 00:13:37.880 to too much sensitive data as part of their tech service 224 00:13:37.880 --> 00:13:40.880 offerings, and then the questions surrounding how 225 00:13:40.880 --> 00:13:45.560 that data could get misused. And finally, along those same lines, 226 00:13:45.590 --> 00:13:49.250 Senator Elizabeth Warren, along with several other lawmakers, 227 00:13:49.460 --> 00:13:53.330 last week introduced a bill into Congress, the Health and 228 00:13:53.330 --> 00:13:57.230 Location Data Protection Act, which proposes to ban third 229 00:13:57.230 --> 00:14:01.700 party data brokers from selling or transferring sensitive health 230 00:14:01.700 --> 00:14:06.950 and location data. So, you have a lot of controversies involving 231 00:14:07.070 --> 00:14:08.420 the privacy of health data. 232 00:14:08.610 --> 00:14:11.250 Anna Delaney: A lot of controversy. Marianne, has Meta 233 00:14:11.250 --> 00:14:13.590 ever responded to your requests? 234 00:14:13.860 --> 00:14:17.400 Marianne McGee: No, I'm not surprised. No. But like I said, 235 00:14:17.400 --> 00:14:20.850 they've responded in general, I guess, to some of the reporting 236 00:14:20.850 --> 00:14:23.040 that's been out there about this, that they have filtering 237 00:14:23.040 --> 00:14:27.630 systems that, you know, prevent the sensitive data from being 238 00:14:27.630 --> 00:14:30.570 stored in their systems. And then, you know, as I mentioned, 239 00:14:30.570 --> 00:14:34.230 that other lawsuit that was sort of similar, I guess, got 240 00:14:34.230 --> 00:14:37.440 dismissed, because, you know, when people agree, I guess, to 241 00:14:37.440 --> 00:14:40.950 the policies of Facebook, when they use Facebook, you know, the 242 00:14:40.950 --> 00:14:44.430 social media site, they agree to give up certain rights, like 243 00:14:44.430 --> 00:14:47.970 suing. So I don't know if that pertains to this, or these, you 244 00:14:47.970 --> 00:14:51.540 know, latest controversies, but there's always sort of the fine 245 00:14:51.540 --> 00:14:55.530 print, you know, the tiny print that you don't read. 246 00:14:56.520 --> 00:14:58.410 Matthew Schwartz: If only the U.S. had strong privacy 247 00:14:58.410 --> 00:15:00.900 protections, like you were saying, for health data or for 248 00:15:00.900 --> 00:15:03.660 anything else. I mean, in Europe, that would be illegal. 249 00:15:05.130 --> 00:15:06.570 Which system do you prefer, right? 250 00:15:08.070 --> 00:15:09.810 Tom Field: Once again, legislatively, we're a third 251 00:15:09.810 --> 00:15:10.440 world country. 252 00:15:12.510 --> 00:15:13.800 Marianne McGee: Yeah, well, you know, again, there are always 253 00:15:13.800 --> 00:15:17.670 these national proposals for federal privacy laws that, you 254 00:15:17.670 --> 00:15:19.200 know, don't go anywhere. And then you know ... 255 00:15:19.980 --> 00:15:20.580 Matthew Schwartz: Still waiting!
256 00:15:20.760 --> 00:15:23.460 Marianne McGee: Yeah, when it comes to health data, you know, 257 00:15:23.460 --> 00:15:27.570 privacy and security, there are, like, so many interesting and 258 00:15:28.020 --> 00:15:32.790 promising legislative proposals that are being floated even now, 259 00:15:32.790 --> 00:15:36.210 as I speak, that probably won't gain any steam. So you know, 260 00:15:36.210 --> 00:15:37.710 good ideas that never go anywhere. 261 00:15:39.060 --> 00:15:42.600 Anna Delaney: Let's see what happens next. As always, Matt, 262 00:15:42.660 --> 00:15:46.020 you're going a bit retro on us this week. Talk about 263 00:15:46.000 --> 00:15:49.002 Matthew Schwartz: Old school, Anna. Old school. That's right. 264 00:15:46.020 --> 00:15:46.320 Desjardins. 265 00:15:49.072 --> 00:15:52.983 When you have a site called DataBreachToday, I won't say 266 00:15:53.053 --> 00:15:57.033 that you actively seek data breaches. But definitely when 267 00:15:57.103 --> 00:16:01.503 they come along, it seems like they need to be highlighted. And 268 00:16:01.573 --> 00:16:05.693 there's a really interesting development here: as Marianne was 269 00:16:05.763 --> 00:16:09.813 talking about a class action lawsuit in the United States, 270 00:16:09.883 --> 00:16:14.003 this is a class action lawsuit in Canada against Desjardins 271 00:16:14.073 --> 00:16:18.123 Group, which is a Canadian financial services cooperative. 272 00:16:18.193 --> 00:16:21.545 Now, you may remember Desjardins from such data 273 00:16:21.615 --> 00:16:25.875 breaches as a massive one that came to light in 2019, when it 274 00:16:25.945 --> 00:16:29.925 was revealed that personal details for 4.2 million active 275 00:16:29.995 --> 00:16:34.046 customers of the credit union group had been sold to third 276 00:16:34.115 --> 00:16:38.306 parties. And by third parties, I think we're talking darknet 277 00:16:38.375 --> 00:16:42.775 sites, cybercrime forums, that sort of thing, places where this 278 00:16:42.845 --> 00:16:46.965 information can be monetized. According to court documents, 279 00:16:47.035 --> 00:16:51.365 this information also ended up in the hands of other financial 280 00:16:51.434 --> 00:16:55.764 industry folks who wanted to use it for marketing purposes. So 281 00:16:55.834 --> 00:16:59.745 where did this horrible data breach trace back to? Well, it 282 00:16:59.815 --> 00:17:03.935 traced back to somebody in the marketing department; get in 283 00:17:04.005 --> 00:17:07.985 your marketing jokes now. Nothing says targeted marketing 284 00:17:08.055 --> 00:17:12.455 like this, right? So a guy named Sebastien Vachon-Desjardins of 285 00:17:12.524 --> 00:17:16.645 Quebec was arrested back in 2019, charged with fraud, 286 00:17:16.715 --> 00:17:21.044 identity theft, and trafficking in stolen personally identifiable 287 00:17:21.114 --> 00:17:25.165 information. Now the case remains open; he hasn't appeared 288 00:17:25.234 --> 00:17:29.494 before a judge yet in a trial. He hasn't pleaded not guilty, as far as I 289 00:17:29.564 --> 00:17:33.894 know. It's an open case still. So I don't know how 290 00:17:33.964 --> 00:17:37.874 that's going to resolve. But that didn't stop a bunch of 291 00:17:37.944 --> 00:17:42.134 customers from filing lawsuits. Those were consolidated.
And 292 00:17:42.204 --> 00:17:46.464 earlier this year, Desjardins Group suggested that it settled 293 00:17:46.534 --> 00:17:50.375 the lawsuit with these individuals who'd filed it, for a 294 00:17:50.445 --> 00:17:54.355 total of about $200 million Canadian, so a bit over $150 295 00:17:54.425 --> 00:17:58.266 million U.S. And this has been approved now. So this is 296 00:17:58.336 --> 00:18:02.666 interesting, because data breach lawsuits tend to get settled. 297 00:18:02.736 --> 00:18:06.856 Companies don't want a court or a jury deciding what 298 00:18:06.926 --> 00:18:11.186 sorts of damages should go to the plaintiffs, I think because 299 00:18:11.255 --> 00:18:15.725 they think it could go horribly wrong for them. So typically, if 300 00:18:15.795 --> 00:18:20.054 things proceed, if they can't get thrown out, eventually they 301 00:18:20.124 --> 00:18:24.384 will settle in order to not set a precedent they might regret 302 00:18:24.454 --> 00:18:28.504 later. So that is what has happened here. Interesting that 303 00:18:28.574 --> 00:18:32.485 the breach happened, or ran for 26 months actually, before being 304 00:18:32.555 --> 00:18:36.535 discovered in late 2018, and is only now getting settled. 305 00:18:36.605 --> 00:18:40.795 Another interesting thing is we've had an investigation into 306 00:18:40.865 --> 00:18:44.985 what went wrong, both by the privacy watchdog in Quebec, as 307 00:18:45.055 --> 00:18:49.315 well as Canada's Office of the Privacy Commissioner, and they 308 00:18:49.385 --> 00:18:53.854 decided to join forces; they did a joint investigation. And it's 309 00:18:53.924 --> 00:18:58.324 always fascinating to me, when you have a big bad breach, and a 310 00:18:58.394 --> 00:19:02.863 privacy watchdog comes in and is legally allowed to get anything 311 00:19:02.933 --> 00:19:06.913 they want, and then they publicly release a report about 312 00:19:06.983 --> 00:19:11.173 what went wrong. Fascinating, and highly recommended for anybody 313 00:19:11.243 --> 00:19:15.713 in this industry who's trying to prevent data breaches, to see 314 00:19:15.782 --> 00:19:19.973 what went wrong. And so just to give a couple of highlights: 315 00:19:20.042 --> 00:19:24.232 there was segmentation in place to protect people's personal 316 00:19:24.302 --> 00:19:28.423 information when it was being stored on a banking system, a 317 00:19:28.492 --> 00:19:32.613 banking data warehouse actually. And so if you tried to get 318 00:19:32.682 --> 00:19:36.942 access to the information, you had to have appropriate access 319 00:19:37.012 --> 00:19:41.202 credentials. Great. This is exactly what should be the case. 320 00:19:41.272 --> 00:19:45.392 But this information was also being stored in a credit data 321 00:19:45.462 --> 00:19:49.303 warehouse, and the controls weren't in place there. Also, 322 00:19:49.373 --> 00:19:52.934 somebody in the marketing department was taking the 323 00:19:53.004 --> 00:19:56.077 information, including sensitive, regulated, 324 00:19:56.147 --> 00:19:59.499 confidential customer information. And they were 325 00:19:59.569 --> 00:20:03.759 copying it over. They had a batch job they ran once a month 326 00:20:03.829 --> 00:20:08.228 to copy it over onto a database in the marketing department. So 327 00:20:08.298 --> 00:20:12.418 this malicious insider didn't allegedly hack into anything.
328 00:20:12.488 --> 00:20:16.818 The malicious insider allegedly just accessed information that 329 00:20:16.888 --> 00:20:21.357 was being inappropriately stored in the marketing department. So 330 00:20:21.427 --> 00:20:25.407 a huge, bad thing happened that the organization should 331 00:20:25.477 --> 00:20:29.179 have spotted in advance. And so there were a number of 332 00:20:29.248 --> 00:20:33.089 recommendations that the OPC, the Office of the Privacy 333 00:20:33.159 --> 00:20:37.698 Commissioner of Canada, made, and one of them was access controls; 334 00:20:37.768 --> 00:20:41.679 this is a no-brainer. This employee was able to find the 335 00:20:41.749 --> 00:20:45.660 data, get access to it when he shouldn't have had it, copy it 336 00:20:45.729 --> 00:20:50.199 onto a USB key in circumvention of the confidentiality agreement 337 00:20:50.269 --> 00:20:54.528 that he signed (oh, well, you know, a piece of paper, look out), 338 00:20:54.598 --> 00:20:58.020 and then sell all this information. And it wasn't 339 00:20:58.090 --> 00:21:02.001 detected for more than two years. So there's a huge list of things 340 00:21:02.071 --> 00:21:06.051 that were done wrong, and a class action lawsuit, which was not 341 00:21:06.121 --> 00:21:10.381 going in Desjardins' favor. So they ended up settling it. And 342 00:21:10.451 --> 00:21:14.641 here we have a breach story, which, again, like so many data 343 00:21:14.711 --> 00:21:19.040 breach stories, happened a long time ago and still hasn't been 344 00:21:19.110 --> 00:21:23.650 resolved. So there we go, a little data breach fun for everybody. 345 00:21:24.420 --> 00:21:27.720 Anna Delaney: That's an incredible story. So other than 346 00:21:27.750 --> 00:21:31.350 reading the report, lessons learned for organizations would 347 00:21:31.350 --> 00:21:31.830 be? 348 00:21:33.400 --> 00:21:34.930 Matthew Schwartz: Well, make sure you have the right controls 349 00:21:34.930 --> 00:21:37.300 in place. I mean, this comes down to a really simple problem: 350 00:21:37.300 --> 00:21:40.630 marketing was just bringing in information in a manner that it 351 00:21:40.630 --> 00:21:44.650 shouldn't have done in order to make its life a little bit 352 00:21:44.650 --> 00:21:48.430 easier. And I've certainly been in organizations before where data 353 00:21:48.430 --> 00:21:52.840 has been copied left, right and center. I don't think in 354 00:21:52.840 --> 00:21:55.630 violation of any laws or anything, but people will 355 00:21:55.630 --> 00:21:58.060 try to get their jobs done. And so there needs to be the right 356 00:21:58.060 --> 00:22:01.570 controls, the right tools. They didn't have those in place to 357 00:22:01.570 --> 00:22:05.890 facilitate this while maintaining security. So I would say 358 00:22:05.890 --> 00:22:09.190 just read the findings and see if anything looks 359 00:22:09.190 --> 00:22:12.490 like it could happen in your own organization, and then maybe 360 00:22:12.520 --> 00:22:15.610 investigate to see whether or not it is and whether or not 361 00:22:15.610 --> 00:22:18.040 you'd be able to detect it if it was happening. 362 00:22:18.940 --> 00:22:21.190 Anna Delaney: Incredible that nobody in the organization 363 00:22:21.190 --> 00:22:23.770 actually got a sense of it as well. It's not just about the 364 00:22:23.770 --> 00:22:26.710 controls. It's the activity that's going on. 365 00:22:27.320 --> 00:22:29.480 Matthew Schwartz: Exactly, and that's a great point as well.
366 00:22:29.480 --> 00:22:31.850 Certainly when we were at RSA, we were talking to a lot of 367 00:22:31.850 --> 00:22:35.090 organizations that do threat intelligence, and one of the big 368 00:22:35.090 --> 00:22:37.850 deliverables from that, one of the big products that 369 00:22:37.850 --> 00:22:40.520 organizations, especially in the financial services sector, sign 370 00:22:40.520 --> 00:22:46.100 up for, is any hint or clue that data from their systems 371 00:22:46.130 --> 00:22:49.790 might be appearing where it shouldn't be, especially on 372 00:22:49.790 --> 00:22:53.750 cybercrime markets. Smoke, fire, all that sort of stuff. If you 373 00:22:53.750 --> 00:22:56.870 get a sense that it's showing up, then maybe they could have 374 00:22:56.870 --> 00:23:00.320 caught this breach a month into it, as opposed to 26 months into 375 00:23:00.320 --> 00:23:00.530 it. 376 00:23:02.280 --> 00:23:04.470 Anna Delaney: Always great insights, Matt, thank you, and I 377 00:23:04.470 --> 00:23:07.950 loved your French accent there. Really appreciate it. You've 378 00:23:08.370 --> 00:23:09.120 done very well. 379 00:23:09.990 --> 00:23:12.090 Matthew Schwartz: Thank you very much. All that time spent in 380 00:23:12.090 --> 00:23:13.890 France, not for nothing, I hope. 381 00:23:14.040 --> 00:23:16.920 Anna Delaney: It paid off. So finally, you have been 382 00:23:16.920 --> 00:23:20.760 commissioned, of course, to create the next big blockbuster 383 00:23:20.760 --> 00:23:25.560 film. And surprisingly, it's all about cybersecurity. What would 384 00:23:25.560 --> 00:23:26.490 you call it? 385 00:23:27.390 --> 00:23:29.850 Tom Field: I'm going to do a remake. We're going to update 386 00:23:29.850 --> 00:23:33.090 it. 30 years later, 40 years later, almost. We're going to 387 00:23:33.090 --> 00:23:36.090 call it Revenge of the Nerds: Back to the Breach. 388 00:23:37.620 --> 00:23:40.590 Anna Delaney: Love that! Marianne? 389 00:23:41.220 --> 00:23:45.060 Marianne McGee: Hacker Wars; you can use that for any kind of 390 00:23:45.060 --> 00:23:49.500 hackers you want to think of, you know, nation-states fighting 391 00:23:49.500 --> 00:23:54.030 each other, you know, little kind of low-grade hackers doing 392 00:23:54.060 --> 00:23:57.270 petty crimes. There are a lot of possibilities there. 393 00:23:59.940 --> 00:24:03.900 Anna Delaney: Yeah, big screen. Definitely. I can see that now. 394 00:24:04.350 --> 00:24:04.890 Matt? 395 00:24:05.980 --> 00:24:08.260 Matthew Schwartz: I was going to do Star Wars: Revenge of the 396 00:24:08.260 --> 00:24:11.710 Cybers. I just think that Star Wars is casting around for 397 00:24:11.710 --> 00:24:14.380 something new, right? We have all these flashback shows, all 398 00:24:14.380 --> 00:24:17.260 these little characters. They need to rethink things. So 399 00:24:17.680 --> 00:24:19.930 cybersecurity is hot. Why not? 400 00:24:20.190 --> 00:24:22.620 Anna Delaney: We'll have to call Harrison Ford, if he is free 401 00:24:22.620 --> 00:24:27.510 for that. A Whole New World is what I'm going for. Just because 402 00:24:27.510 --> 00:24:30.390 there's a whole new world, everybody's having to adapt — 403 00:24:30.390 --> 00:24:35.670 law enforcement, my mom, schools. So yes, hopefully, 404 00:24:35.670 --> 00:24:39.600 there's a hint of hopefulness. Hopeful, hopeful. There's a bit 405 00:24:39.600 --> 00:24:41.430 of a positive spin on there, I think. 406 00:24:42.090 --> 00:24:43.830 Tom Field: So back to Hacker Wars: A New Hope.
407 00:24:45.660 --> 00:24:47.700 Anna Delaney: Well, this has been entertaining, informative, 408 00:24:47.850 --> 00:24:51.510 and fun. Thank you very much, Tom, Marianne, Matt. Always a 409 00:24:51.510 --> 00:24:51.900 pleasure. 410 00:24:52.560 --> 00:24:53.190 Tom Field: Until next time! 411 00:24:54.990 --> 00:24:56.910 Anna Delaney: Thanks so much for watching. Until next time!