WEBVTT 1 00:00:07.470 --> 00:00:09.990 Anna Delaney: Hello, thanks for joining us at the ISMG Editors' 2 00:00:09.990 --> 00:00:12.930 Panel. I'm Anna Delaney, and this is a weekly spot where we 3 00:00:12.930 --> 00:00:16.590 tackle the latest cybersecurity news and challenges, as well as 4 00:00:16.590 --> 00:00:19.710 explore the most interesting innovations and technologies. 5 00:00:20.010 --> 00:00:23.340 I'm in fine company this week. The editors joining me are Tom 6 00:00:23.340 --> 00:00:26.610 Field, senior vice president of editorial; Marianne Kolbasuk 7 00:00:26.610 --> 00:00:30.300 McGee, executive editor for HealthcareInfoSecurity; and 8 00:00:30.300 --> 00:00:34.080 Michael Novinson, managing editor, ISMG business. Wonderful 9 00:00:34.080 --> 00:00:34.680 to see you all. 10 00:00:35.550 --> 00:00:36.390 Tom Field: Good to be seen, Anna. 11 00:00:37.050 --> 00:00:37.770 Marianne McGee: Thanks for having us. 12 00:00:37.770 --> 00:00:38.190 Michael Novinson: Thanks for having us. 13 00:00:39.360 --> 00:00:42.930 Anna Delaney: Tom, the Billy Goat Tavern. Now, I've not yet 14 00:00:42.930 --> 00:00:45.150 been, but I think I know where it is. Are you in Chicago? 15 00:00:46.740 --> 00:00:48.420 Tom Field: Not your kind of place, perhaps, Anna, but it's 16 00:00:48.420 --> 00:00:51.600 my kind of place. It's home to me. You know, it's known for two 17 00:00:51.600 --> 00:00:55.620 things. One is being the origin of a very famous Saturday Night Live 18 00:00:55.620 --> 00:00:59.190 skit, starring John Belushi. But, more personal to me, it was 19 00:00:59.190 --> 00:01:02.130 the place where the journalists I grew up idolizing hung out 20 00:01:02.370 --> 00:01:04.380 when they got done at the end of the day at the Chicago 21 00:01:04.380 --> 00:01:07.740 newspapers. So, I've read about this for, God, 40 years. It's 22 00:01:07.740 --> 00:01:11.310 nice to go and visit. Yes, I had my Double Cheeseborger for 23 00:01:11.310 --> 00:01:12.060 dinner last night. 24 00:01:12.630 --> 00:01:17.400 Anna Delaney: Very good. I'll have to see what that is, and taste 25 00:01:17.400 --> 00:01:20.940 what that is, sometime in the future. Marianne, looking very 26 00:01:20.940 --> 00:01:23.370 pretty in pink and surrounded by beautiful flowers. 27 00:01:24.060 --> 00:01:27.540 Marianne McGee: Yeah, thanks! This was taken at a farm stand 28 00:01:27.570 --> 00:01:30.030 out in Westwood, Mass. My husband and I took a trip out 29 00:01:30.030 --> 00:01:33.210 there a couple of weekends ago. New England in the fall is 30 00:01:33.600 --> 00:01:35.610 always very pretty, when it's not raining. 31 00:01:37.230 --> 00:01:39.210 Anna Delaney: I can only imagine, but lovely! And 32 00:01:39.240 --> 00:01:41.970 Michael, another sort of fall picture. 33 00:01:42.630 --> 00:01:44.850 Michael Novinson: Absolutely! Coming to you from Four Town 34 00:01:44.850 --> 00:01:48.990 Farm in my hometown of Seekonk, Massachusetts. My daughter chose 35 00:01:48.990 --> 00:01:52.200 a pumpkin there. And, one notable thing is they do tractor 36 00:01:52.200 --> 00:01:54.030 rides, and it's the only place I've ever been where you can do 37 00:01:54.030 --> 00:01:56.940 a multi-state tractor ride. In just 10 minutes, you go from 38 00:01:56.940 --> 00:01:59.310 Seekonk, Massachusetts, into East Providence, Rhode Island, and 39 00:01:59.310 --> 00:02:01.890 back into Seekonk, with little signs when you're crossing the 40 00:02:01.890 --> 00:02:04.470 state line. So, a one-of-a-kind experience.
41 00:02:05.520 --> 00:02:05.850 Anna Delaney: Truly, a one-of-a-kind... 42 00:02:05.850 --> 00:02:08.310 Tom Field: A very sincere pumpkin patch; I expect to see 43 00:02:08.310 --> 00:02:09.870 the Great Pumpkin rising any moment. 44 00:02:11.580 --> 00:02:13.290 Michael Novinson: Indeed, I'll keep an eye out for it, and my 45 00:02:13.290 --> 00:02:14.310 daughter will be on the lookout. 46 00:02:15.930 --> 00:02:17.730 Anna Delaney: Well, Tom and Michael have seen this view 47 00:02:17.730 --> 00:02:21.090 already, but I just had to show you, Marianne. This was my hotel 48 00:02:21.090 --> 00:02:25.050 room with a view in New York, a few days ago. So, it wasn't 49 00:02:25.050 --> 00:02:29.460 much, was it? And, that's only floor 32; there were 45 floors. So, 50 00:02:29.970 --> 00:02:31.140 one can only imagine. 51 00:02:31.650 --> 00:02:33.060 Tom Field: Well, I got the galley. 52 00:02:35.850 --> 00:02:38.820 Anna Delaney: So, Michael, I think it's fair to say that the 53 00:02:38.850 --> 00:02:41.520 whole world has been watching the current horrific events 54 00:02:41.520 --> 00:02:44.670 unfold in Israel and Gaza. And, you've been taking a close look 55 00:02:44.670 --> 00:02:47.850 at their impact, both when it comes to cyber activity and 56 00:02:47.850 --> 00:02:49.800 the cyber workforce. So, what can you share? 57 00:02:50.800 --> 00:02:52.540 Michael Novinson: Thank you for the opportunity, Anna. I think 58 00:02:52.540 --> 00:02:55.060 I'll take each of those in order. So, in terms of cyber 59 00:02:55.060 --> 00:02:59.200 activity, obviously, it's been nearing two weeks since the 60 00:02:59.200 --> 00:03:02.920 initial attack by Hamas on Israel on the 7th of October. 61 00:03:03.160 --> 00:03:05.830 Cyber activity, at this point, has been relatively limited. 62 00:03:06.010 --> 00:03:08.500 From what we've seen, it's been hacktivism, it's been DDoS 63 00:03:08.500 --> 00:03:12.130 attacks, shutting down websites like the Jerusalem Post. 64 00:03:12.460 --> 00:03:15.550 Anonymous Sudan has claimed to be involved; they do claim to be 65 00:03:15.550 --> 00:03:20.950 involved in a lot. So, it's been, kind of, amateur-type 66 00:03:21.010 --> 00:03:24.760 groups doing, kind of, low-impact, low-level attacks. 67 00:03:24.910 --> 00:03:27.160 What's going to be interesting to see is what happens as the kinetic 68 00:03:27.160 --> 00:03:31.390 activity escalates, in particular, as a ground invasion 69 00:03:31.390 --> 00:03:36.010 of Gaza, and perhaps action into Lebanon, seems imminent. What does that 70 00:03:36.010 --> 00:03:39.340 mean from a cyber dimension? So, I had the opportunity to talk 71 00:03:39.340 --> 00:03:42.730 to Rob T. Lee of the SANS Institute about this. And, he 72 00:03:42.730 --> 00:03:47.110 was saying it would be good to keep an eye out for the 73 00:03:47.110 --> 00:03:49.570 targeting of communications and command-and-control 74 00:03:49.600 --> 00:03:52.960 infrastructure; he feels that that would be an initial step as 75 00:03:52.990 --> 00:03:57.580 the kinetic activity escalates. And also, I think a key question 76 00:03:57.580 --> 00:04:01.180 here is going to be: to what extent is this a local conflict? 77 00:04:01.450 --> 00:04:04.060 To what extent does it become a regional conflict with the 78 00:04:04.060 --> 00:04:07.090 involvement of Iran? And to what extent does it become a global 79 00:04:07.090 --> 00:04:10.120 conflict with the involvement of the United States?
And each of 80 00:04:10.120 --> 00:04:15.280 those dimensions, obviously, brings a cyber component as 81 00:04:15.280 --> 00:04:20.080 well. In terms of Iran, I know Rob Lee of the SANS Institute 82 00:04:20.080 --> 00:04:25.150 was talking about Iran's capabilities to almost use cyber 83 00:04:25.150 --> 00:04:28.300 as a precursor to kinetic action: can they get intelligence on 84 00:04:28.300 --> 00:04:32.080 what Israel has planned, what their strikes are, maybe even 85 00:04:32.080 --> 00:04:36.550 what's going on with the U.S. military presence in the Middle 86 00:04:36.550 --> 00:04:40.840 East and the Mediterranean Sea? So, it's a way of getting 87 00:04:40.840 --> 00:04:45.700 involved and supporting without committing actual troops. Yeah, 88 00:04:45.730 --> 00:04:48.130 Rob Lee was talking about it being focused on intelligence 89 00:04:48.130 --> 00:04:50.950 gathering to understand where future military action is going 90 00:04:50.950 --> 00:04:55.030 to take place, to try to counter it before the kinetic action occurs. 91 00:04:55.450 --> 00:04:58.540 And then, from the U.S. standpoint, that would really be 92 00:04:58.540 --> 00:05:01.600 two things. One would be around intelligence sharing, and 93 00:05:01.600 --> 00:05:05.290 obviously, I know, we had heard last week from leadership at 94 00:05:05.290 --> 00:05:09.100 CISA that there's been very close contact between CISA and 95 00:05:09.100 --> 00:05:12.610 their counterparts in Israel. So, continued intelligence 96 00:05:12.610 --> 00:05:18.160 sharing. And then the big question, which the public probably 97 00:05:18.160 --> 00:05:21.070 wouldn't know about, would be around zero-day exploits: the 98 00:05:21.070 --> 00:05:23.200 U.S. obviously does have some in reserve, and does the U.S. end up 99 00:05:23.200 --> 00:05:25.960 sharing any of those with Israel, if this conflict 100 00:05:26.350 --> 00:05:29.620 escalates? And, if you have entities like Iran with their 101 00:05:29.620 --> 00:05:32.200 own cyber capabilities getting involved, obviously, there's 102 00:05:32.620 --> 00:05:37.300 risk to that: zero days are very resource intensive to develop, 103 00:05:37.300 --> 00:05:40.000 and, as we've seen, there can be some downstream 104 00:05:40.000 --> 00:05:44.050 impacts; zero days that get released out into the wild can 105 00:05:44.050 --> 00:05:46.510 end up in the hands of bad actors and be used against our own 106 00:05:46.510 --> 00:05:49.930 interests. So, certainly from the standpoint of Rob Lee, that 107 00:05:49.930 --> 00:05:53.830 wouldn't be an initial step. But, if this turns into a 108 00:05:53.830 --> 00:05:57.340 more pronounced cyber conflict, that's obviously another way 109 00:05:57.340 --> 00:06:01.270 that the U.S. could assist Israel. From a workforce 110 00:06:01.270 --> 00:06:04.030 perspective, the big thing that's going on here is just 111 00:06:04.030 --> 00:06:09.880 this unprecedented calling up of reserves to serve in the Israeli 112 00:06:09.880 --> 00:06:12.580 military. And, it's something that, coming from a more 113 00:06:12.820 --> 00:06:14.890 population-rich country like the United States, is hard to 114 00:06:14.890 --> 00:06:18.910 comprehend. So, you're talking about 360,000 reservists 115 00:06:18.910 --> 00:06:23.260 called up in a country of 9.7 million people. So, 116 00:06:23.260 --> 00:06:26.500 that's roughly 4% of the overall population.
So, not just the 117 00:06:26.500 --> 00:06:30.370 adult population, but 4% of the overall population of Israel 118 00:06:30.370 --> 00:06:33.250 has been called up in one of the largest mass mobilizations in 119 00:06:33.250 --> 00:06:35.860 history, according to The Washington Post. So, if you 120 00:06:35.860 --> 00:06:38.620 think about who works at cyber companies, it's a lot of folks 121 00:06:38.620 --> 00:06:42.700 who came out of Mossad, came out of Unit 8200, also the infantry, 122 00:06:42.700 --> 00:06:45.880 the IDF. A lot of these startups are started by folks who are only a 123 00:06:45.880 --> 00:06:48.940 few years out of military service, folks in their 20s and 124 00:06:49.030 --> 00:06:51.850 their 30s. So, if you're talking about who's going to get called 125 00:06:51.850 --> 00:06:56.560 up, that's a lot of that population. Now, not 126 00:06:56.560 --> 00:06:58.780 that many companies have been public about it, but for the companies 127 00:06:58.780 --> 00:07:02.260 that have, companies like Armis, Pentera 128 00:07:02.260 --> 00:07:06.850 and Aqua Security, you hear this 10% figure: roughly 129 00:07:06.880 --> 00:07:11.710 10% of their Israeli workforce has been called up. And so, the 130 00:07:11.710 --> 00:07:14.440 question then becomes how well can you absorb that? So, if 131 00:07:14.440 --> 00:07:17.590 you're Armis, maybe you're a bit larger, or you're Palo Alto 132 00:07:17.590 --> 00:07:20.740 Networks and you're global, then it's easier to absorb. If you're 133 00:07:20.740 --> 00:07:24.130 a smaller company, I'm thinking of a company like Cyera, which is 134 00:07:24.130 --> 00:07:26.560 in that data security space, half of their workforce is in 135 00:07:26.560 --> 00:07:31.090 Israel. So, 10% of that half, or 5% of your overall workforce, is out 136 00:07:31.090 --> 00:07:33.550 of commission. What does that mean for you? What type of 137 00:07:33.550 --> 00:07:36.940 redundancy do you have? I think the impact, in particular, is 138 00:07:36.940 --> 00:07:39.250 going to be pronounced on the research and development side. 139 00:07:39.610 --> 00:07:42.640 So, as these startups mature and get to that D and C round phase, 140 00:07:42.790 --> 00:07:45.880 they tend to move corporate headquarters to the U.S., they 141 00:07:45.880 --> 00:07:49.030 build up sales and marketing and channels and other go-to-market 142 00:07:49.030 --> 00:07:53.290 stuff in the larger regions. But, often that R&D core stays 143 00:07:53.290 --> 00:07:56.560 in Israel, just because of the immense cyber talent. I mean, 144 00:07:56.560 --> 00:07:59.560 really, there are about three places globally where R&D takes 145 00:07:59.560 --> 00:08:05.470 place. Israel's huge; India has a fair amount; and then Silicon 146 00:08:05.470 --> 00:08:08.860 Valley, in the U.S., is really the core of R&D. You've seen 147 00:08:08.860 --> 00:08:11.860 some sales and marketing in Texas and in other parts of 148 00:08:11.860 --> 00:08:15.160 the U.S., like the Research Triangle, but the R&D in the U.S. is 149 00:08:15.160 --> 00:08:18.460 really in Silicon Valley. So, the question becomes, for these 150 00:08:18.460 --> 00:08:22.150 Israeli companies, if a lot of your R&D folks have been 151 00:08:22.150 --> 00:08:25.450 deployed, what do you have to fall back on? So, I know 152 00:08:25.450 --> 00:08:28.930 CyberArk had said - CyberArk is larger, and they have a good 153 00:08:28.930 --> 00:08:31.300 Israeli presence.
So they've actually built up an Indian R&D 154 00:08:31.300 --> 00:08:35.050 team too. So, they're going to be continuing to expand R&D in 155 00:08:35.050 --> 00:08:38.230 India and leaning more on the Indian R&D folks, given the 156 00:08:38.230 --> 00:08:41.110 Israeli call-up. The question is, I mean, CyberArk is a 157 00:08:41.110 --> 00:08:44.140 publicly traded company; they've been around for 20 years. If 158 00:08:44.140 --> 00:08:48.010 you're a Series A or B round startup, you don't have those 159 00:08:48.010 --> 00:08:53.020 same options. Then obviously, if you're not in India today, 160 00:08:53.020 --> 00:08:57.550 trying to build that up from stage zero is hard to do. In 161 00:08:57.550 --> 00:09:01.480 Silicon Valley, obviously, the competition for skilled cyber 162 00:09:01.480 --> 00:09:04.210 talent is enormous. The salaries are enormous, which is tough if you're a 163 00:09:04.780 --> 00:09:08.440 cash-strapped startup. So, I mean, I do think this is going 164 00:09:08.440 --> 00:09:11.680 to be a real challenge. And, in particular, given that 165 00:09:11.680 --> 00:09:14.440 all signs from Israel are that this is going to be a prolonged 166 00:09:14.470 --> 00:09:17.680 military operation, that they're looking for regime change in 167 00:09:17.680 --> 00:09:21.040 Gaza, and that this could be months or years of military activity in 168 00:09:21.040 --> 00:09:23.950 Israel, I think this is really going to be an important space 169 00:09:23.950 --> 00:09:24.670 to watch. 170 00:09:26.020 --> 00:09:28.360 Anna Delaney: Very well said, Michael. And, did you get a 171 00:09:28.360 --> 00:09:32.350 sense of what security companies are doing to safeguard their 172 00:09:32.350 --> 00:09:34.990 operations on the ground there in Israel and their data during 173 00:09:34.990 --> 00:09:37.090 this period of heightened tension? Did that come up in 174 00:09:37.090 --> 00:09:37.900 conversations? 175 00:09:38.560 --> 00:09:40.480 Michael Novinson: It has. And, I mean, obviously, they're 176 00:09:40.480 --> 00:09:43.420 really messaging that everything is business as usual, that they're leaning on 177 00:09:43.420 --> 00:09:48.190 other folks, and that there's no disruption to customers. I think there has 178 00:09:48.190 --> 00:09:51.490 been some increased attention around, like, are we going to be 179 00:09:51.490 --> 00:09:54.340 targeted by hacktivists because we're an Israeli-based company? 180 00:09:54.640 --> 00:09:59.890 So, there's been some attention paid to, kind of, watching your 181 00:09:59.890 --> 00:10:03.220 rear flank, so to speak. At this point, 182 00:10:03.220 --> 00:10:07.240 though, it doesn't seem like there's 183 00:10:07.240 --> 00:10:10.240 been a ton of targeting of the Israeli private sector. I mean, 184 00:10:10.240 --> 00:10:13.390 obviously, newspapers, which are a bit more public facing, but not a 185 00:10:13.390 --> 00:10:16.780 ton of the private sector. And, to the extent that there are 186 00:10:16.780 --> 00:10:19.600 capabilities here, are they going to be focused more on the 187 00:10:19.600 --> 00:10:22.120 commercial side or on the military side? I mean, my gut 188 00:10:22.120 --> 00:10:25.300 would tell me, probably, more on the military side. So, I don't 189 00:10:25.300 --> 00:10:27.460 think that's been an enormous concern. Certainly, they're 190 00:10:27.460 --> 00:10:30.970 being careful.
But, I think, overall, in the next 191 00:10:30.970 --> 00:10:32.710 couple of weeks you're going to start to have public 192 00:10:32.710 --> 00:10:35.590 companies here, Check Point, CyberArk, Palo Alto Networks, 193 00:10:35.590 --> 00:10:38.470 companies that have a meaningful presence in Israel, having to 194 00:10:38.470 --> 00:10:40.810 put out their outlooks and 195 00:10:40.810 --> 00:10:43.840 discuss what the impact is with investors. And, in 196 00:10:43.840 --> 00:10:45.880 regulatory filings, I think we're going to get a better 197 00:10:45.880 --> 00:10:49.120 sense at that point of how meaningful the impact of the 198 00:10:49.120 --> 00:10:53.860 mass mobilization, as well as the conflict between 199 00:10:53.860 --> 00:10:55.210 Israel and Hamas, is going to be. 200 00:10:55.800 --> 00:10:58.050 Anna Delaney: Okay. Now, Tom, I know you've been conducting a 201 00:10:58.050 --> 00:11:00.420 fair number of interviews as well. What are you hearing? 202 00:11:01.380 --> 00:11:02.790 Tom Field: Well, a couple of things I want to say. First, I 203 00:11:02.790 --> 00:11:07.380 want to say this is an enormous story for ISMG. Every one of us 204 00:11:07.380 --> 00:11:10.230 on the screen has covered this in one aspect or another. 205 00:11:10.440 --> 00:11:12.900 Michael covered it from the very start: beginning with the horrific 206 00:11:12.900 --> 00:11:16.860 attacks on October 7, he published our first stories and 207 00:11:16.860 --> 00:11:20.190 conducted interviews. And I want to say that I feel that our 208 00:11:20.220 --> 00:11:23.370 coverage has been responsible, it's been thorough, and we've 209 00:11:23.370 --> 00:11:27.090 offered perspectives within and outside Israel on what's 210 00:11:27.090 --> 00:11:32.040 happening - the impact on Israel, on the world, on the industry, 211 00:11:32.190 --> 00:11:35.940 and, certainly, on humanity. But, it's not just a story. For 212 00:11:35.940 --> 00:11:40.800 us, it's personal. As you know, we have an affiliate, Xtra Mile, 213 00:11:41.130 --> 00:11:44.220 based in Israel; we have teammates and we have friends 214 00:11:44.610 --> 00:11:47.070 who are a part of this, and we've spoken to them, and 215 00:11:47.070 --> 00:11:50.610 they've shared with us their insights on what is happening. 216 00:11:51.000 --> 00:11:54.990 Going forward, I was supposed to take a trip to Israel in 217 00:11:55.020 --> 00:11:58.170 mid-November; that's not going to happen now. But instead, I'm 218 00:11:58.170 --> 00:12:01.800 going to conduct a series of virtual interviews with security 219 00:12:01.800 --> 00:12:05.910 and technology leaders within Israel on exactly these topics: 220 00:12:06.150 --> 00:12:08.910 the impacts on their organizations and on their 221 00:12:08.910 --> 00:12:12.480 teams, and their message to the world about what is happening, 222 00:12:12.480 --> 00:12:16.860 what they expect to happen, and how the world cannot just sit by 223 00:12:16.860 --> 00:12:22.080 and watch, but can help and be supportive of the innocents who 224 00:12:22.080 --> 00:12:23.370 are being attacked today. 225 00:12:25.290 --> 00:12:27.870 Anna Delaney: Well said. We'll be looking 226 00:12:27.870 --> 00:12:31.470 forward to watching those. But, Michael and Tom, thank you so 227 00:12:31.470 --> 00:12:34.980 much! We'll definitely be coming back to this, for sure.
So, 228 00:12:35.010 --> 00:12:38.610 Marianne, you attended a House committee hearing this week, 229 00:12:38.700 --> 00:12:41.340 which addressed concerns surrounding AI regulation, 230 00:12:41.340 --> 00:12:45.000 privacy and the role of the U.S. in shaping the future of AI 231 00:12:45.000 --> 00:12:48.300 technology and data-usage standards. So, what were the 232 00:12:48.300 --> 00:12:49.200 main takeaways? 233 00:12:49.800 --> 00:12:53.790 Marianne McGee: So, Congress has been eyeing potential 234 00:12:53.790 --> 00:12:59.370 legislation to regulate AI. And, at that hearing on Wednesday, 235 00:12:59.370 --> 00:13:03.210 held by one of the House Energy and Commerce Committee's 236 00:13:03.210 --> 00:13:07.350 subcommittees, they tried to grapple with some of the issues 237 00:13:07.350 --> 00:13:12.660 involving AI, including privacy, biometrics, data minimization 238 00:13:12.660 --> 00:13:16.050 and a bunch of other issues. But, one of the overarching 239 00:13:16.080 --> 00:13:19.920 messages that some of the expert witnesses offered was the need 240 00:13:19.920 --> 00:13:24.420 for the U.S. to first pass national data privacy 241 00:13:24.420 --> 00:13:28.830 legislation, which has been talked about for years, and 242 00:13:28.830 --> 00:13:32.640 which could set a foundation for some of the similar issues that 243 00:13:32.640 --> 00:13:37.590 also dog AI. And, that includes issues around data collection, 244 00:13:37.590 --> 00:13:41.790 retention and deletion. For instance, regulation around how 245 00:13:41.790 --> 00:13:46.980 consumers' data is collected and used by data brokers also 246 00:13:47.010 --> 00:13:51.570 relates to how data is collected and used for AI development and 247 00:13:51.570 --> 00:13:56.640 deployment. Late last year, the House Energy and Commerce 248 00:13:56.640 --> 00:14:01.590 Committee passed, nearly unanimously, a bipartisan 249 00:14:01.680 --> 00:14:05.910 privacy bill, the American Data Privacy and Protection Act, or 250 00:14:05.940 --> 00:14:11.460 ADPPA. But, that bill has not moved beyond the committee for a 251 00:14:11.460 --> 00:14:17.610 vote by the full House. During the hearing, witnesses said that 252 00:14:17.700 --> 00:14:25.560 the ADPPA could also be sort of a foundation for AI regulations. 253 00:14:25.950 --> 00:14:30.210 For instance, one of the witnesses, former Federal Trade 254 00:14:30.210 --> 00:14:32.850 Commission chair and commissioner Jon Leibowitz, 255 00:14:33.060 --> 00:14:37.680 urged the committee to go back to the ADPPA and make some 256 00:14:37.680 --> 00:14:42.150 changes to that proposed data privacy 257 00:14:42.150 --> 00:14:45.960 legislation so that it could establish some of the rules for AI. 258 00:14:45.960 --> 00:14:50.010 He said that although the ADPPA was criticized by 259 00:14:50.010 --> 00:14:54.780 some privacy advocates and is not perfect, it is 260 00:14:54.780 --> 00:14:58.980 something that perhaps can be improved upon, and it already has 261 00:14:58.980 --> 00:15:03.300 some of the potential foundations for AI regulations. 262 00:15:03.300 --> 00:15:08.490 For instance, the ADPPA already has provisions 263 00:15:08.490 --> 00:15:11.430 around prohibiting discrimination in 264 00:15:11.430 --> 00:15:17.310 algorithms, which Leibowitz said would also be applicable to 265 00:15:17.370 --> 00:15:22.290 prohibiting discrimination with AI.
Witnesses during the hearing 266 00:15:22.290 --> 00:15:26.490 also warned that the U.S. needs to act swiftly if it wants to be 267 00:15:26.490 --> 00:15:30.660 a global leader in setting the rules of the road for AI, before 268 00:15:30.660 --> 00:15:34.920 adversaries, like China, dominate. There was also a 269 00:15:34.950 --> 00:15:38.850 discussion on how AI has been used in various industries, such 270 00:15:38.850 --> 00:15:42.870 as healthcare, to improve clinical decision-making and 271 00:15:42.870 --> 00:15:47.130 drug discovery and other areas of medicine. Now, some 272 00:15:47.130 --> 00:15:50.220 of that discussion touched upon whether the current HIPAA 273 00:15:50.220 --> 00:15:53.940 regulations might serve as a road map for data protections 274 00:15:53.940 --> 00:15:58.470 involving AI use in healthcare. But HIPAA only covers certain 275 00:15:58.500 --> 00:16:04.560 healthcare data, not all of it; it doesn't cover, for instance, consumer device health 276 00:16:04.680 --> 00:16:09.120 data or information that consumers enter on 277 00:16:09.120 --> 00:16:11.640 consumer health-related websites and those sorts of 278 00:16:11.640 --> 00:16:15.810 things, which are also often scraped and collected for AI 279 00:16:15.810 --> 00:16:21.900 development. With that in mind, the big 280 00:16:21.900 --> 00:16:25.980 question is whether the U.S. even has the wherewithal right 281 00:16:25.980 --> 00:16:31.530 now to tackle national AI legislation, considering that it 282 00:16:31.530 --> 00:16:34.650 hasn't gotten very far with national data privacy 283 00:16:34.680 --> 00:16:38.880 legislation after so many years, not to mention the big partisan 284 00:16:38.880 --> 00:16:43.800 divide right now in Congress. So, as we're recording this, the 285 00:16:43.800 --> 00:16:47.460 U.S. House of Representatives still has not elected a new 286 00:16:47.460 --> 00:16:51.660 House speaker. So, much of the work in Congress right now in 287 00:16:51.660 --> 00:16:54.480 the near term is at a standstill, and we don't know 288 00:16:54.480 --> 00:16:57.420 how it will proceed with some of the bigger-picture 289 00:16:57.450 --> 00:17:00.780 issues that the U.S., and the world at 290 00:17:00.780 --> 00:17:04.890 large, are dealing with, as Michael was just discussing with the situation in 291 00:17:05.100 --> 00:17:08.250 Israel. And we also, of course, have the 292 00:17:08.250 --> 00:17:11.340 Ukraine-Russia war still going on, and funding that's needed. 293 00:17:11.730 --> 00:17:14.940 So, we'll have to see what happens next, not only with 294 00:17:14.940 --> 00:17:18.930 national AI legislation but also with the other important work 295 00:17:18.930 --> 00:17:20.190 that Congress needs to do. 296 00:17:21.120 --> 00:17:22.500 Anna Delaney: Excellent overview. And you mentioned the 297 00:17:22.500 --> 00:17:26.520 fear about China potentially setting global AI standards. 298 00:17:26.520 --> 00:17:30.120 Could you talk more about that? And how do censorship requirements 299 00:17:30.150 --> 00:17:33.780 imposed by Beijing factor into China's AI development? 300 00:17:34.560 --> 00:17:38.190 Marianne McGee: Well, the 301 00:17:38.790 --> 00:17:41.700 industry witnesses who were testifying there, 302 00:17:41.700 --> 00:17:44.760 of course, represent large software 303 00:17:44.760 --> 00:17:48.540 companies in the U.S., of all sizes.
And, you know, 304 00:17:48.540 --> 00:17:54.120 part of their worry is the whole idea of competition: 305 00:17:54.120 --> 00:17:58.230 not having the U.S. lag behind in competition, 306 00:17:58.230 --> 00:18:02.310 but then also setting the standards of what is 307 00:18:02.310 --> 00:18:05.070 acceptable and what is not acceptable. And then, 308 00:18:05.100 --> 00:18:07.980 what would China do with a lot of the data that 309 00:18:07.980 --> 00:18:11.280 it might collect about American consumers and 310 00:18:11.280 --> 00:18:16.410 businesses? So, a lot of thorny, 311 00:18:16.410 --> 00:18:22.260 complicated issues that have to be worked out. And it 312 00:18:22.260 --> 00:18:23.130 will take time. 313 00:18:24.510 --> 00:18:26.580 Anna Delaney: For sure. Well, that was a great overview for 314 00:18:26.580 --> 00:18:30.480 now. Thank you, Marianne. Tom, funny to think that we were in 315 00:18:30.480 --> 00:18:33.480 the same city with Michael, just 48 hours ago, and that we were 316 00:18:33.480 --> 00:18:37.080 there for the same reason, to host the ISMG Financial Services 317 00:18:37.080 --> 00:18:39.510 Summit. So, what did you take away from the event? 318 00:18:39.960 --> 00:18:43.020 Tom Field: Well, first of all, let's pay some respects here. It 319 00:18:43.020 --> 00:18:46.110 was not just the Financial Services Summit, it was the 10th 320 00:18:46.110 --> 00:18:50.550 Anniversary Summit for ISMG. We hosted our first such 321 00:18:50.550 --> 00:18:56.010 conference on October 21st, 2013. And, think about it: 10 years 322 00:18:56.010 --> 00:18:59.370 ago, when we hosted that, we didn't even know about the 323 00:18:59.370 --> 00:19:05.880 Target breach. When we hosted that, we didn't have chip-and-PIN 324 00:19:05.880 --> 00:19:09.990 cards in the U.S.; we were still dependent upon the mag stripe to 325 00:19:09.990 --> 00:19:13.860 a very large extent. When we hosted that, people were talking 326 00:19:13.860 --> 00:19:17.550 about the merits of bringing your own device to work. And, now 327 00:19:17.550 --> 00:19:21.150 these same organizations are insisting that their employees 328 00:19:21.180 --> 00:19:24.300 bring and use their own devices. It is a whole different world. 329 00:19:24.810 --> 00:19:28.410 And, it was a whole different summit. I think I told you and 330 00:19:28.410 --> 00:19:31.470 Michael, when we wrapped up there, that to me, the notion of a 331 00:19:31.470 --> 00:19:34.440 summit is you bring diverse people together to have 332 00:19:34.440 --> 00:19:38.490 meaningful dialogue on topics of importance. And, if that's the 333 00:19:38.490 --> 00:19:41.940 definition, then we succeeded with very high marks, because we 334 00:19:41.940 --> 00:19:45.390 brought some wonderful speakers together, we brought hundreds of 335 00:19:45.390 --> 00:19:49.050 high-level practitioners together in the audience. And, 336 00:19:49.050 --> 00:19:52.260 we really had very meaningful discussions about topics such as 337 00:19:52.380 --> 00:19:56.400 the threat landscape, such as response to ransomware. And 338 00:19:56.400 --> 00:19:59.970 Michael, did we or did we not get the topic du jour? 339 00:20:00.630 --> 00:20:05.550 Generative AI. I think it was in every session.
There were some 340 00:20:05.550 --> 00:20:08.550 things, Anna, that particularly pleased me. I had the 341 00:20:08.550 --> 00:20:12.390 opportunity to sit on a couple of panels, including the keynote 342 00:20:12.390 --> 00:20:15.360 panel, which was about navigating the storm: protecting 343 00:20:15.360 --> 00:20:19.050 financial services in an era of cyber turbulence. And, we had 344 00:20:19.050 --> 00:20:23.100 some good speakers - Susan Koski, the CISO of PNC, and our 345 00:20:23.220 --> 00:20:27.330 event committee chair, Matanda Doss from JP Morgan Chase - 346 00:20:27.330 --> 00:20:31.020 joined by William Beer of Accenture and Paul Leonhirth of 347 00:20:31.020 --> 00:20:34.530 Palo Alto Networks. And, among the things that came out is that it's 348 00:20:34.530 --> 00:20:37.350 not just navigating one storm, it's navigating multiple storms 349 00:20:37.350 --> 00:20:42.090 today. And, we're talking about things such as the human 350 00:20:42.090 --> 00:20:44.790 element. And, that was another big topic that we discussed 351 00:20:44.790 --> 00:20:48.510 throughout the day: how the insider risk has expanded much 352 00:20:48.510 --> 00:20:51.240 further beyond the malicious insider and the accidental 353 00:20:51.240 --> 00:20:55.320 insider to now the exploited insider, exploited by external 354 00:20:55.320 --> 00:20:57.960 factors. We talked a lot about that and the impact on 355 00:20:57.960 --> 00:21:02.130 institutions. I think one of the points that resonated 356 00:21:02.130 --> 00:21:06.990 with me was raised by Susan Koski: it's not just about 357 00:21:08.010 --> 00:21:11.040 particular threats or threat actors, it's the force 358 00:21:11.040 --> 00:21:15.210 multiplication of them, and the scale at which they impact 359 00:21:15.210 --> 00:21:19.020 organizations today. We have not seen anything like this, 360 00:21:19.170 --> 00:21:21.330 certainly in the 10 years that we've been bringing people 361 00:21:21.330 --> 00:21:24.690 together for these summits. So, those are among the things that 362 00:21:24.870 --> 00:21:27.960 resonated with me. But, I want to ask you: you hosted one of our 363 00:21:27.960 --> 00:21:32.520 Solution Rooms for the first time, in this case, dedicated to 364 00:21:32.610 --> 00:21:36.960 ransomware response. How did you enjoy participating in that 365 00:21:36.960 --> 00:21:38.160 dialogue? And, what did you see? 366 00:21:39.630 --> 00:21:43.050 Anna Delaney: I loved it! It was great. You know, people love a 367 00:21:43.050 --> 00:21:46.590 good immersive experience. And I think, to bring 368 00:21:46.920 --> 00:21:50.910 members of the Secret Service and members of incident response 369 00:21:50.910 --> 00:21:55.800 teams together at tables with CISOs, and then, to dive into 370 00:21:55.800 --> 00:21:59.340 what at least feels like a real-life scenario of a 371 00:21:59.340 --> 00:22:04.650 global supply chain attack - and they're under time 372 00:22:04.650 --> 00:22:07.710 pressure. So, we had 10 minutes for phase one, and then we 373 00:22:07.710 --> 00:22:11.310 had various phases, and we reduced the time. And, there was 374 00:22:11.310 --> 00:22:14.190 excitement in the room, but it was also brilliant to have those 375 00:22:14.190 --> 00:22:18.570 takeaways afterwards with the Secret Service and people 376 00:22:18.570 --> 00:22:23.970 on incident response teams. It's not easy! People will always 377 00:22:23.970 --> 00:22:26.760 struggle with this, though.
What happens if all your documents 378 00:22:26.790 --> 00:22:30.510 are offline versus online? Who do you call? There are still 379 00:22:30.570 --> 00:22:34.410 questions about whether we really need to bring in law 380 00:22:34.410 --> 00:22:38.100 enforcement. What does that mean? What are the consequences? 381 00:22:38.190 --> 00:22:41.790 How will this impact our reputation? What about civil lawsuits, for 382 00:22:41.790 --> 00:22:45.990 instance? So, there were a few misconceptions put right. But, a 383 00:22:46.140 --> 00:22:49.290 very, very good, very successful event, and very well said, Tom. 384 00:22:49.290 --> 00:22:51.750 Congratulations on your 10 years. That's pretty impressive! 385 00:22:52.440 --> 00:22:54.570 Tom Field: Marianne was there for day one; she remembers that. 386 00:22:55.740 --> 00:22:58.650 And, I want to say, too, I don't think I shared this, Anna, with 387 00:22:58.650 --> 00:23:03.300 you or with you, Michael. I was maybe a little bit 388 00:23:03.300 --> 00:23:06.120 nervous going into this event because of all the coverage we 389 00:23:06.120 --> 00:23:08.880 had on Israel. I was a little concerned about how it might be 390 00:23:08.880 --> 00:23:12.720 received. But, I want to tell you that there were people there 391 00:23:12.720 --> 00:23:15.000 who approached me and sought me out and wanted to talk 392 00:23:15.000 --> 00:23:18.120 specifically about stories we've written, interviews we've 393 00:23:18.120 --> 00:23:22.320 conducted, and how meaningful they were to them. I would 394 00:23:22.320 --> 00:23:24.720 say that the interview I conducted with our colleague 395 00:23:25.080 --> 00:23:28.800 Sharon Israel of Xtra Mile, a week or so ago, was probably the 396 00:23:28.800 --> 00:23:31.710 most emotional interview I've ever recorded in my life, and 397 00:23:31.710 --> 00:23:35.340 I've recorded a few. But, I want to say that one of our 398 00:23:35.340 --> 00:23:38.070 attendees, actually one of our sponsors, came up to me at the 399 00:23:38.070 --> 00:23:41.730 event and actually hugged me, and thanked me for conducting 400 00:23:41.730 --> 00:23:43.920 that interview. So, I wanted to share that with the two of you; 401 00:23:43.920 --> 00:23:46.260 it made me feel very good about what we've done and the 402 00:23:46.260 --> 00:23:48.600 responsible role that we've played in this coverage. 403 00:23:49.320 --> 00:23:52.650 Anna Delaney: That's very moving. Well done! Well done to 404 00:23:52.650 --> 00:23:56.130 everybody. But also, just on generative AI, because I know 405 00:23:56.130 --> 00:23:59.370 it's come up a lot. And, the conversation has had a few months to 406 00:23:59.370 --> 00:24:03.000 evolve, to mature a little bit. We were at RSA, where that was 407 00:24:03.000 --> 00:24:06.420 the theme du jour, and, Michael, I know you've conducted lots of 408 00:24:06.420 --> 00:24:09.900 interviews on generative AI. Was there anything new? How do you 409 00:24:09.900 --> 00:24:12.630 feel the conversation has shifted? 410 00:24:12.000 --> 00:24:14.506 Michael Novinson: I would say it has certainly matured. There have been 411 00:24:14.559 --> 00:24:17.705 kind of three big blocks of interviews I've done.
You know, 412 00:24:17.758 --> 00:24:20.798 we were all together in San Francisco at the end of 413 00:24:20.851 --> 00:24:24.264 April at RSA; Tom and I gathered again in August at Black Hat; 414 00:24:24.317 --> 00:24:27.410 and now it's about two months later that we're speaking to 415 00:24:27.463 --> 00:24:30.823 folks in New York. Obviously, New York was a bit of a different focus, 416 00:24:30.876 --> 00:24:33.915 end-user practitioners in financial services, while Black 417 00:24:33.969 --> 00:24:37.008 Hat was much more threat researchers. I was hearing a lot 418 00:24:37.061 --> 00:24:40.154 more, I will say, a lot more about AI hallucination than I 419 00:24:40.207 --> 00:24:43.460 had heard before. Several people I spoke to brought it up, and 420 00:24:43.513 --> 00:24:46.606 it's just one of the downside risks: the fact that these models will 421 00:24:46.659 --> 00:24:49.912 just kind of reach and guess and conjecture, and what 422 00:24:49.965 --> 00:24:53.165 does that mean if you're using them within your business? It's 423 00:24:53.218 --> 00:24:56.631 something I hadn't heard as much about at RSA or Black Hat. That 424 00:24:56.684 --> 00:24:59.990 seems to be more top-of-mind for folks. And, it does seem that 425 00:25:00.044 --> 00:25:03.510 people are thinking a little bit more specifically about how they 426 00:25:03.563 --> 00:25:06.282 can use it to enhance their operations, not just in 427 00:25:06.336 --> 00:25:09.268 security, but from a compliance standpoint, from a 428 00:25:09.322 --> 00:25:12.201 legal standpoint, from a risk standpoint. And, in 429 00:25:12.254 --> 00:25:15.720 particular, since I was speaking to security leaders at some very 430 00:25:15.774 --> 00:25:18.973 large political and financial institutions, how do you break 431 00:25:19.027 --> 00:25:22.386 through those silos, since all of these functions have normally 432 00:25:22.439 --> 00:25:25.745 been very segmented? If you are using generative AI in some of 433 00:25:25.799 --> 00:25:29.051 these areas, how do you allow for that cross-communication to 434 00:25:29.105 --> 00:25:32.411 make sure that all the various teams that are taking advantage 435 00:25:32.464 --> 00:25:35.823 of it are also collectively aware of the risk as it's being 436 00:25:35.877 --> 00:25:38.010 used in other parts of the organization? 437 00:25:39.390 --> 00:25:41.250 Tom Field: At the end of the day, here's the evolution I've seen: 438 00:25:41.250 --> 00:25:44.100 when we started talking about this earlier in the year, it 439 00:25:44.100 --> 00:25:47.370 was, we need to get a policy around this, we need to stop 440 00:25:47.370 --> 00:25:50.820 this, in some cases, before it takes over. It has matured to 441 00:25:50.820 --> 00:25:53.520 the point now where my discussions are, we have to 442 00:25:53.520 --> 00:25:56.850 enable this, we have to have guardrails, if nothing else. A 443 00:25:56.850 --> 00:26:00.090 comparison was made, I think at a dinner I hosted the other 444 00:26:00.090 --> 00:26:03.180 night, and you've heard this before: it's like 445 00:26:03.180 --> 00:26:06.990 brakes on a car. Brakes aren't there to stop the vehicle; the 446 00:26:07.080 --> 00:26:09.630 brakes are there to give you the security that you can accelerate 447 00:26:09.630 --> 00:26:12.570 faster. And, that's where I'm seeing a lot of security leaders 448 00:26:12.570 --> 00:26:14.760 put their energy now; that's encouraging to see.
449 00:26:15.450 --> 00:26:17.310 Anna Delaney: I noticed on one of your panels, one of the 450 00:26:17.310 --> 00:26:21.060 speakers said, "Oh, we don't talk about zero trust anymore! It's 451 00:26:21.060 --> 00:26:23.790 all about generative AI." We all laughed, but is there any truth 452 00:26:23.790 --> 00:26:24.240 in that? 453 00:26:25.350 --> 00:26:26.640 Tom Field: Total truth in that! 454 00:26:28.380 --> 00:26:30.150 Anna Delaney: Well, I hope people, or organizations, are 455 00:26:30.150 --> 00:26:33.120 not abandoning their zero trust policies, their 456 00:26:33.120 --> 00:26:37.860 strategies. But, moving on for a bit of fun, I think we need it. 457 00:26:38.130 --> 00:26:42.450 If AI could take over one boring, mundane task in your 458 00:26:42.450 --> 00:26:45.960 life, what would it be? And how would you spend your newfound 459 00:26:45.960 --> 00:26:48.480 free time? Dive in! 460 00:26:48.480 --> 00:26:51.390 Tom Field: Oh boy, if generative AI could take over my travel 461 00:26:51.390 --> 00:26:54.510 plans for me, I'd be delighted! If I could just 462 00:26:54.510 --> 00:26:57.540 tell it where I have to go and have those decisions made for 463 00:26:57.540 --> 00:27:01.560 me. And, then I would no longer be booking flights and realizing 464 00:27:01.560 --> 00:27:04.860 later that when I booked it, it said PM and I meant AM. 465 00:27:07.740 --> 00:27:11.280 Anna Delaney: We've all been there, Tom. Marianne, sorry, 466 00:27:11.310 --> 00:27:12.690 Michael, go ahead. 467 00:27:13.560 --> 00:27:15.630 Michael Novinson: Absolutely, with all the names it does get tough. 468 00:27:15.660 --> 00:27:17.970 I was going to say, and this might be a bit of a stretch, but 469 00:27:17.970 --> 00:27:20.910 if AI could somehow help with sorting and folding laundry, 470 00:27:21.090 --> 00:27:24.450 particularly balling up socks and folding fitted sheets, that 471 00:27:24.450 --> 00:27:27.720 would be appreciated. I realize this may be more of a long-term 472 00:27:27.720 --> 00:27:30.150 goal, but I would take it. If it freed the time 473 00:27:30.150 --> 00:27:32.010 up, I'd love to get back to playing tennis. I played as a 474 00:27:32.010 --> 00:27:35.730 kid, and I miss it. It'd be a fun activity to do. So, that's what 475 00:27:35.730 --> 00:27:36.090 I'm looking for. 476 00:27:36.090 --> 00:27:36.270 Anna Delaney: Perfect! 477 00:27:36.750 --> 00:27:37.350 Tom Field: I'm with you there. 478 00:27:38.950 --> 00:27:39.520 Anna Delaney: Marianne? 479 00:27:39.820 --> 00:27:41.650 Marianne McGee: I would say putting the groceries away. 480 00:27:42.670 --> 00:27:46.690 After you go grocery shopping, it's all so... But actually, 481 00:27:47.380 --> 00:27:50.680 when you were just speaking now, 482 00:27:50.860 --> 00:27:53.740 Michael, about the laundry, I was 483 00:27:53.740 --> 00:27:56.650 thinking, you know, when my kids were home and I had a lot of 484 00:27:56.650 --> 00:28:00.880 other sorts of things to do, like schoolwork and that sort of 485 00:28:00.880 --> 00:28:03.880 thing, it would have been great to have some sort of AI tool at 486 00:28:03.880 --> 00:28:06.700 the beginning of the school year.
When you have to fill out 487 00:28:06.700 --> 00:28:10.510 all the paperwork about, you know, contacts and emergency 488 00:28:10.510 --> 00:28:15.070 contacts and backup contacts and doctors and all 489 00:28:15.070 --> 00:28:16.810 those sorts of things, that would have been great. 490 00:28:17.740 --> 00:28:20.320 Anna Delaney: That's spot on. I think we're all feeling that 491 00:28:20.320 --> 00:28:25.210 pain. There's no pain like being a mother and then having to do 492 00:28:25.210 --> 00:28:29.800 all of that. Talking about travel, Tom, I 493 00:28:29.800 --> 00:28:33.880 agree, I would love that. How about an AI avatar to go through 494 00:28:33.910 --> 00:28:37.240 airport security for us, so I could spend more time exploring a 495 00:28:37.240 --> 00:28:42.430 new city and less time queuing? Well, one can only hope that the 496 00:28:42.970 --> 00:28:45.850 technology advances. Marianne, Tom, Michael, this has been a 497 00:28:45.850 --> 00:28:48.490 real pleasure, a very moving episode. So, thank you so much. 498 00:28:49.330 --> 00:28:49.900 Michael Novinson: Thank you, Anna. 499 00:28:51.550 --> 00:28:53.260 Anna Delaney: And, thanks so much for watching! Until next 500 00:28:53.260 --> 00:28:53.470 time.