Anna Delaney: Welcome to the ISMG Editors' Panel. I'm Anna Delaney, and today we delve into the state of identity security in 2024, tackling pivotal questions: How does gen AI influence identity and access management? Can mobile driver's licenses address identity verification challenges? And will Social Security numbers persist, like passwords, in the era of passwordless security? Joining us to provide insights into these topics and lead our conversation today is the excellent Jeremy Grant, managing director of technology business strategy at Venable LLP. Jeremy, so very good to have you back on the ISMG Editors' Panel.

Jeremy Grant: Great to be back. It's been too long.

Anna Delaney: It has. And also with us are ISMG's Tom Field, senior vice president of editorial, and Mathew Schwartz, executive editor of DataBreachToday in Europe. Good to see you all.

Jeremy Grant: Thanks for having us.

Mathew Schwartz: Hello!

Anna Delaney: Hello! So, Jeremy, where are you in your virtual world?
Jeremy Grant: I am in Houston at the college football national championship game that took place last month, where my University of Michigan Wolverines smoked the Huskies of the University of Washington, after previously smiting other awful schools like Ohio State and Alabama, and celebrating the national championship.

Tom Field: A very good year for you.

Anna Delaney: Great stuff. And Tom, are you in the candy store?

Tom Field: Well, this is the local Target, and this stemmed from a conversation you and I had when I found out that you in London did not know anything about Peeps candy. These marshmallow treats are especially popular as you go into Easter. Although they're seasonal, you can find them for anything. So, as I was in the store last night, I took this photo expressly to share with you, so you can see the wall of Peeps that includes the standard little bunnies and chicks, as well as those dipped in chocolate. There are Peeps gummies, Peeps breakfast cereal and, new this year, Dr Pepper-flavored Peeps.

Anna Delaney: My Peeps education continues. This is excellent.

Jeremy Grant: I have a question for Tom. When you eat them, do you bite the head off first? Or do you save the head for last?
Tom Field: Usually it's "Off with their heads!"

Jeremy Grant: I'm with you.

Anna Delaney: Mat, back me up. We don't have Peeps here, do we?

Mathew Schwartz: We do not, no. We're less Peep-tastic than the States, unfortunately. So, I know what your next question is going to be, which is, "Where are you? Are there Peeps?" No, there's no Peeps. This is in Scotland, just off the coast a little bit, a place called Stonehaven, which has this beautiful harbor.

Anna Delaney: All heads intact, that's good. And this is from the Egypt section of the Metropolitan Museum of Art in New York City, which is a wonderful place to time travel, and home to the world's oldest surviving piano. Did you know that?

Tom Field: Did not know that.

Anna Delaney: Well, Jeremy, let's dive into the questions. Tom, do you want to start us off?

Tom Field: I'd be delighted to! Jeremy, you and I have had this conversation for years at the start of the year, and here we are. We've got all the year-end reports for 2023 and the year-beginning reports for 2024. My perennial question: What is the state of secure identity here in the first part of 2024?

Jeremy Grant: Not awesome. I would say, though, in pockets, we're making good progress.
As the numbers have come out over the last couple of years - and we're also seeing new information from different authoritative sources in the U.S. government tracking identity theft and identity-related cybercrime - things continue to get worse each year. The Identity Theft Resource Center recently released its annual data breach report at an event that we put on together with them, the Better Identity Coalition and the FIDO Alliance at a policy forum last month. Hands down the worst year ever in terms of breaches and incidences of identity theft. And we've seen FinCEN, the Treasury Department's Financial Crimes Enforcement Network, release a new study - an analysis of all the suspicious activity reports, essentially the things that banks report to them when they think something improper is happening in the financial system - that revealed over $212 billion in transactions in one year tied to some sort of compromise of identity.
So we continue to see incident after incident. As somebody said recently, you don't necessarily hack in, you log in with a compromised credential - or you spoof somebody's identity, pretending to be them to open an account someplace - and that's where the vast majority of cybercrime is happening. But we're making good progress, I think, at least on the authentication side: How do we finally start to move beyond passwords and some legacy forms of multifactor authentication, and actually get to true passwordless authentication that's both more secure and easier to use? A lot of things are happening in the FIDO ecosystem with passkeys, and I think we're going to see a lot more adoption this year. So there I'm a little bit bullish. It's the identity proofing side where I'm increasingly worried that we're falling behind where the attacks are, and that things are about to get much worse.

Tom Field: Jeremy, I'd be remiss if I didn't bring AI into the conversation, and I know it's still early days for gen AI - we're just getting through a hype cycle now. But how do you already see gen AI being used to, one, attack identity, and two, protect it?
Jeremy Grant: Well, this is a good follow-up to my last point, which is that I think things are about to get worse. The commoditization of tools that can create very convincing deepfakes - be it voice, photo or video - has really been a sea change in the last year in terms of the tools that are available to attackers, and we are already seeing adversaries exploit them. There was a story a few weeks ago: Through what would normally have been a business email compromise attack, somebody actually used video deepfakes to fool an employee into wiring - I think it was - over $25 million to criminals. And they thought it was all direction from their executive team. That's obviously a major crime - very sophisticated, not a scalable attack - but I worry at a more basic level. The tools that we use today to prove that we're a real human online, and that we're a particular human - again, photos, voices, videos - we're already seeing how what would have been a very sophisticated attack a few years ago, if not technically impossible, is now becoming quite easy.
And I'm not really sure that we are fully prepared for, say, the new advanced attacks on biometric systems and other things that we'll be seeing. This is an area where I think liveness detection is going to become much more important. On that note, I think AI can also help protect - whether it's AI-powered systems that can monitor for anomalies, in terms of: Are you seeing something unusual? Can you actually tell if it's really a live person presenting a photo or a fingerprint on the other end of a transaction, or maybe something that's been spoofed? That's going to become much more important, I think, and there are certainly some good tools on the AI side that can power defenses as well. But it's a bit of a new frontier right now, when it comes to attacks.

Tom Field: We all share your concerns. With that, I want to pass this over to my colleague, Mat. Mat, your witness.

Mathew Schwartz: Oh, wow, that sounds hostile and adversarial. But I'm just here for truth and justice. So, I know this won't be news to you, Jeremy, that the Better Identity Coalition recently issued its five-year review. Great report, really interesting reading.
But if we're looking at it from an elementary school teacher's perspective, those are not grades that you'd really want to take home with you. And one of the big challenges that I thought was fascinating is developing next-generation remote ID proofing and verification systems - a bit of a mouthful - but this seems to be something that we're going to need. I wish we already had it, but it's such a challenge for the U.S. Has anybody got this right yet? Are there any signs of potential awesomeness that you see on this front?

Jeremy Grant: There are some signs, sometimes, of potential awesomeness. So, as background: The Better Identity Coalition is an industry group that I run that's focused on what I would call the policy layer of identity - not looking at technology or standards, but more at what government needs to do in terms of advancing policies, regulations and initiatives to try and address, among other things, some of these threats that we've been seeing over the years. Our original policy blueprint was published in 2018, and we released an update of it last month, which, as you pointed out, had a report card.
And on this topic of what the government is actually doing to prioritize better remote identity proofing systems, we gave them a D. It gave me no pleasure to do that. In fact, it's a little upsetting, I think, that we're at this point. But look, we've had two successive administrations - the Trump and Biden administrations - that have declined to act on this. That's not to say there aren't people within both who seem to get the issue, but that doesn't necessarily translate into action or coordinated activity. And there's been legislation pending in Congress that's come close to getting over the finish line that would prompt the executive branch to act. But again, we have a couple of people who have blocked it when you get down to the year-end deals, where big bills tend to pass. So it's a challenge for the U.S. That's not to say there's no activity: There's a handful of states piloting mobile driver's licenses, and you've got a couple of small projects that NIST is leading to try and advance some of the remote identity proofing applications of them. But you're talking two or three people spending a third of their time, in an agency with no resources. And there are bright spots.
Look across the pond at Europe, where they have a major European Commission initiative to create portable digital wallets with identity at the center of them for every European. You look at the U.K.: They're advancing legislation around a digital identity and attributes trust framework. Canada, Australia, New Zealand, Singapore, a lot of Latin America - a lot of countries are taking this issue seriously and making it a priority. They're all approaching it a little bit differently, which I think does make sense, because identity can be very local, and the values that you want to build into a system might vary from country to country. But I think there's a bigger question, which is: The longer the U.S. lags here, what does it do to make us a bigger target for identity-related attacks, because other countries are hardening identity infrastructure and we're kind of ignoring it? And what does it do over time to our economic competitiveness as well, when it comes to digital transactions? So it's not too late for the U.S., but we should get moving soon and take this a little bit more seriously.

Mathew Schwartz: I think, speaking as an American, the driver's license is such a critical part of identity, I suppose.
If you ever need to attest to who you are, you think driver's license. And you mentioned mobile driver's licenses being developed by some states. I know there's an immense backstory here in terms of deadlines, missed deadlines, all that sort of thing. Do you think that mobile driver's licenses will be what gives us this ability to remotely ID proof and verify people? Or do you think it's going to end up being some complementary solution? Or is the jury just really out on how this all unfolds?

Jeremy Grant: So, I'm really bullish on the concept of mobile driver's licenses. I will say the way that they're being implemented to date is, from my perspective, really missing the mark in terms of priorities. I think there's a ton of potential there, but this gets back to why we gave the government collectively a D on this topic. There needs to be a lot more activity to really prioritize the right use cases, and also to take a step back to define what "good" would look like in a system of digital identity in the U.S., and how we get there - setting a high bar for security, privacy and user control, and making sure we don't inadvertently build things into architectures that lead us down to a bit of a darker place.
So I actually testified in front of Congress - there was a hearing in the House Homeland Security Committee in early December on this topic that was really focusing in on the role of our Transportation Security Administration, the guys who run the airport checkpoints, in driving mDLs. And it's very interesting, because TSA has basically been given authority to update the regulations around an old law we have in the U.S. called the Real ID Act - which prescribes standards for driver's licenses in the physical world - to also do things in the digital space. And the main thrust of my testimony was: TSA is doing an admirable job focusing on the use case that it cares about, getting people through a checkpoint, but that's just starting to scrape the surface of this transition from physical identity to digital, and TSA has been left off on an island. And it's kind of absurd. There are really two sets of use cases when it comes to mobile driver's licenses. There are the in-person use cases, like going through a security checkpoint at an airport or getting a beer at a bar. From my perspective, being able to carry my ID in my phone, that'd be cool. It's nice to have.
But then you look at the online world, where, as I mentioned before, FinCEN is documenting hundreds of billions of dollars in suspicious activity; in government benefits, we've seen over $100 billion in documented pandemic fraud, again tied to spoofing of identity proofing; and there are the stories we see week to week in places like ISMG's publications about how exploits of identity proofing are being used for all sorts of nefarious purposes. We have the priorities upside down. We've got a crisis on the online side, and we're focusing first on these flash passes that can be digital. We should really be flipping the priorities - and honestly, we should have flipped them several years ago. So, all that said, work is progressing, slowly, on standards. And for those online use cases, I'm bullish on the mobile driver's license being the solution in the U.S., in that the U.S. is not going to have a national ID anytime soon, if ever, for a whole bunch of reasons. But the driver's license - along with state ID cards for people who don't drive - functions as a de facto national ID in the physical world. It's the one place where most adult Americans go to a state office and prove who they are. And then they get a relatively robust credential.
And so the logical starting point, if you want to think about how to address deficiencies in digital identity infrastructure, is to come up with digital counterparts to credentials like the driver's license and the state ID card. So, I think over time, it'll become quite important. We're just not focused right now, in the way that we should be, on prioritizing that the way other countries are. To your question of whether they'll be the only approach or a complementary one: I do think you're going to continue to see them be one offering in a broader ecosystem, in that not all Americans are going to be comfortable using a government credential in the digital world. And so I think you're going to continue to see a lot of industry solutions as well. Ideally, you'll have a vibrant ecosystem where everybody's got a choice if they want to use one of these mobile driver's licenses, but it's not going to be the only solution in the marketplace. And you'll probably see people using a variety of different tools in the future.

Mathew Schwartz: Fantastic. Thank you for that. I am going to pass you over to Anna now, please.

Anna Delaney: Thank you so much. So, from mobile driver's licenses to Social Security numbers.
So Jeremy, security experts have long warned against using Social Security numbers as identifiers and authenticators, and we've seen some progress, with the Better Identity Coalition noting improvements. Over 20 laws, however, still mandate their use. So what will their future be? Do you see them lingering like passwords, even as passwordless security emerges?

Jeremy Grant: Well, I think this just gets to a core issue we flagged in the coalition's original report five years ago. Keep in mind that the origins of this were a massive breach at a big credit bureau in late 2017 - over 140 million Social Security numbers stolen - and you were seeing proposals from policymakers: We should replace the SSN with something new; we should ban the credit bureaus from using it for any identification purpose. That sounded great in the wake of the headlines, because all these things got breached. But what are we really talking about here?
And a point that we made in our original policy blueprint, which I think has actually helped change the conversation a lot, is that when you're talking about the SSN, it is not one thing, it's two. It's an identifier, to try and figure out which Jeremy Grant or which Mathew Schwartz is who. Going on Google, I think there are about 300 Jeremy Grants in the U.S.; only one has my SSN. That's an identifier, and that's a really essential thing that every society needs. When somebody claims to be somebody and they're applying for credit, or a government service, or something else where you need to actually vet them, you can very quickly resolve which of these persons with this name you're actually talking about. We should always preserve the SSN as an identifier, in that you need an identifier, and it's the least bad solution that's out there. Where we've really gotten silly over the years is pretending that this number is a secret, and that nobody will ever find it out if you just keep it locked up and are very careful who you give it to. And so we started using the SSN as an authenticator. I've been pointing it out for a while: If you call your bank, and they say, "Anna, what are the last four digits of your Social Security number?"
the only logical response these days is to say, "Don't you realize that the Russians have that, and the Chinese have that, and about 87 well-organized criminal gangs have it, and any mediocre 17-year-old hacker can get it on the dark web for 63 cents?" These things stopped being secret a long time ago. But we have this problem in cybersecurity: We're always fighting the last war, rather than looking at where the attacks have shifted. Whether it's with passwords, where we advise people, "Oh, have a strong, unique password, and change it every three months." Yeah, that doesn't really work these days, because even with that strong password, you'll probably fall for a phishing attack and type it in. So getting to passwordless, I think, is where we need to go there. On the Social Security number side: Devalue it. Stop pretending it's a secret. In fact, we actually worked with some members of Congress who put a bill together that said that in 10 years, the SSA will publish the equivalent of a phone book of everybody's name and Social Security number, for the sole purpose of making clear that this is not secret information.
And you should never build or architect a system around the idea that there's any security value to it whatsoever, because, sadly, every American has had their SSN breached too many times already. Identifiers don't have to be secret - in fact, in a lot of countries, they are publicly known. Typically, we're not advocating to actually publish everybody's SSN, but the introduction of the bill made a point, which is that attackers have moved on, and our defenses need to move on. Let's stop building systems that pretend that knowing your SSN means anything from a security standpoint.

Anna Delaney: Very interesting. Well, I want to finish with data breaches. Last year, as you said, was the worst year ever: The U.S. saw a record number of breaches involving stolen identities or weaknesses in identity-focused defenses, yet many affected organizations aren't transparent about what occurred. So, how can we improve our understanding of these incidents? Jeremy, do we need better intelligence on breaches, and how on earth do we make that happen?

Jeremy Grant: Yeah, I mean, it's a real challenge.
And this is 361 00:19:58.770 --> 00:20:00.930 something our friends at the ID Theft Resource Center have 362 00:20:00.930 --> 00:20:03.390 flagged the last couple of years: they're able to document more 363 00:20:03.390 --> 00:20:06.120 breaches, but companies are releasing less and less about 364 00:20:06.120 --> 00:20:08.130 what happened. And look, a lot of that's because they're 365 00:20:08.130 --> 00:20:10.920 worried about enforcement actions and lawsuits. And so 366 00:20:11.160 --> 00:20:14.760 they're just not incentivized to share. But you're right, if we 367 00:20:14.760 --> 00:20:18.450 don't have a collective picture of how things are happening, and 368 00:20:18.540 --> 00:20:20.820 we know the bad things are happening but can't document 369 00:20:20.820 --> 00:20:24.180 why or what the attack methods are, it really limits the intel 370 00:20:24.180 --> 00:20:26.400 we have in terms of trying to understand how attacks are 371 00:20:26.400 --> 00:20:30.630 happening, and what we should be preparing for in the future. I 372 00:20:30.630 --> 00:20:34.980 don't have a great answer there. I mean, you certainly see a lot 373 00:20:34.980 --> 00:20:38.820 of focus on information sharing both with the government as well 374 00:20:38.820 --> 00:20:42.450 as with sectors in some of the ISACs, the Information Sharing 375 00:20:42.450 --> 00:20:45.240 and Analysis Centers, and I think we're still able to glean 376 00:20:45.240 --> 00:20:49.230 a decent amount of information there in terms of where attacks 377 00:20:49.230 --> 00:20:52.620 are happening. We've certainly seen in the U.S. agencies like 378 00:20:52.620 --> 00:20:56.760 the FBI and CISA constantly publishing alerts of, look, we're 379 00:20:56.760 --> 00:21:00.750 seeing signs and detecting this attack vector that's been 380 00:21:00.750 --> 00:21:03.750 exploited.
And it's a little bit of a twist on perhaps what we 381 00:21:03.750 --> 00:21:07.380 saw two years ago, and you should be prepared for that. So, that 382 00:21:07.380 --> 00:21:10.140 information is getting out there. But I would agree that 383 00:21:10.530 --> 00:21:15.720 trying to come up with ways to better incentivize companies that 384 00:21:15.720 --> 00:21:18.210 have been victimized by a breach to share more about what 385 00:21:18.210 --> 00:21:26.790 happened in, I would say, a safe harbor, without the threat 386 00:21:26.790 --> 00:21:29.130 that that's also going to lead to some sort of enforcement 387 00:21:29.130 --> 00:21:33.120 action, would be very beneficial. You know, the flip 388 00:21:33.120 --> 00:21:35.640 side is we're seeing, for example, with publicly traded 389 00:21:35.640 --> 00:21:37.620 companies, new rules from the Securities and Exchange 390 00:21:37.620 --> 00:21:40.680 Commission, the SEC, which has actually been focusing more on holding 391 00:21:40.680 --> 00:21:44.760 companies accountable. So those two things are both worthy 392 00:21:44.760 --> 00:21:47.790 goals, but also very much at odds with each other in terms of 393 00:21:47.790 --> 00:21:49.470 some of the objectives that we're trying to achieve. 394 00:21:51.030 --> 00:21:53.190 Anna Delaney: Well, Jeremy, insightful and educational as 395 00:21:53.220 --> 00:21:55.290 always. We've got one more question, but I think you should 396 00:21:55.290 --> 00:21:59.760 take a break, you've earned a break. So, last question, just 397 00:21:59.760 --> 00:22:03.510 for fun: if you could create a superhero whose sole mission is 398 00:22:03.510 --> 00:22:06.630 to protect people's identities online, what would their name 399 00:22:06.660 --> 00:22:10.350 and superpowers be? Tom, has that got anything to do with 400 00:22:10.350 --> 00:22:10.830 Peeps?
401 00:22:10.830 --> 00:22:12.720 Tom Field: I don't think it has anything to do with Peeps, but 402 00:22:12.720 --> 00:22:14.610 you're right in my wheelhouse here, as you know, because 403 00:22:14.610 --> 00:22:21.360 superheroes are my thing. I am calling my hero MF Agent. No, 404 00:22:21.360 --> 00:22:25.590 that's not what MF means. Multifactor! Multifactor Agent. 405 00:22:25.590 --> 00:22:29.010 He's going to be out there fighting the dark web for us to 406 00:22:29.010 --> 00:22:32.970 ensure that kind and good-hearted people are 407 00:22:32.970 --> 00:22:36.570 protecting themselves with multiple factors when they log 408 00:22:36.570 --> 00:22:38.040 into their various accounts. 409 00:22:38.730 --> 00:22:41.730 Anna Delaney: Wonderful. MF Agent, love it. Mat? 410 00:22:43.050 --> 00:22:46.470 Mathew Schwartz: Yeah, so mine's less cool. It's called The Data 411 00:22:46.500 --> 00:22:52.080 Expunger - devoted to truth, justice and data minimization 412 00:22:52.260 --> 00:22:55.590 principles, including - as we were just discussing - the 413 00:22:55.590 --> 00:22:59.040 ability to eliminate people's personally identifiable 414 00:22:59.040 --> 00:23:02.760 information, or PII, from criminal or intelligence agency 415 00:23:02.760 --> 00:23:05.880 databases. Unfortunately, obviously, this is a fictional 416 00:23:05.880 --> 00:23:07.950 character, because this would totally be a fictional 417 00:23:07.950 --> 00:23:11.070 superpower. Because, like Jeremy said, copies of everything 418 00:23:11.070 --> 00:23:13.260 important are already circulating, at least in 419 00:23:13.260 --> 00:23:17.190 triplicate, everywhere. So, well, we can dream, right? 420 00:23:17.190 --> 00:23:20.280 Anna Delaney: We can dream! I'm going for Cyber Guardian. 421 00:23:20.430 --> 00:23:23.280 Original, of course.
My superpowers would include making 422 00:23:23.310 --> 00:23:26.070 myself and others invisible, like you guys have done. I mean, 423 00:23:26.100 --> 00:23:29.580 making it nearly impossible for these hackers to identify or 424 00:23:29.580 --> 00:23:33.570 track individuals online. But, more importantly, I'd be a cyber 425 00:23:33.600 --> 00:23:37.230 empath. So, I'd be able to empathize with victims of crime, 426 00:23:37.230 --> 00:23:40.380 providing them with emotional support and guidance. 427 00:23:40.800 --> 00:23:41.700 Tom Field: You're going to be busy. 428 00:23:41.940 --> 00:23:45.270 Anna Delaney: Yeah. Jeremy, what have you got for us? 429 00:23:46.020 --> 00:23:49.020 Jeremy Grant: I didn't think about this enough beforehand. I'm 430 00:23:49.020 --> 00:23:51.660 going to go for a reach. I'm going to go for MDL Mothra. A 431 00:23:51.780 --> 00:23:55.770 giant moth. Moth man, moth woman, let's say. How about MDL 432 00:23:55.770 --> 00:24:01.650 Mothra Mama, who with a flap of her moth wings could basically 433 00:24:01.650 --> 00:24:05.490 instantly send a ripple through the space-time continuum that 434 00:24:05.490 --> 00:24:09.240 would undo the last 10 years of ineptitude on advancing portable 435 00:24:09.240 --> 00:24:12.000 digital identities for people, and actually get a foundation in 436 00:24:12.000 --> 00:24:14.730 place that would give everybody something they could carry with 437 00:24:14.730 --> 00:24:17.190 them that they could use to protect and secure their 438 00:24:17.550 --> 00:24:18.690 information when they need it. 439 00:24:19.680 --> 00:24:21.330 Anna Delaney: Impressive creative thinking there on the 440 00:24:21.330 --> 00:24:21.690 spot. 441 00:24:22.350 --> 00:24:24.000 Tom Field: He came up with this unprepared, and he gave us a movie 442 00:24:24.000 --> 00:24:24.450 series. 443 00:24:24.990 --> 00:24:26.190 Mathew Schwartz: Yeah, all hail Mothra 444 00:24:27.570 --> 00:24:28.020 Tom Field: Mama.
445 00:24:28.620 --> 00:24:29.160 Mathew Schwartz: Mama 446 00:24:29.190 --> 00:24:31.620 Jeremy Grant: Mothra Mama. MDL Mothra Mama. I gotta get the 447 00:24:31.620 --> 00:24:32.670 alliteration going in there. 448 00:24:32.820 --> 00:24:35.250 Anna Delaney: Oh, yes. Very good. Well, Jeremy, it's been a 449 00:24:35.250 --> 00:24:38.910 blast as always. Thank you so much for your deep insights and 450 00:24:38.910 --> 00:24:41.100 time on the panel today. I hope you'll join us again soon. 451 00:24:41.580 --> 00:24:43.290 Jeremy Grant: Hope to do so as well. Thanks for the invite. 452 00:24:44.070 --> 00:24:47.280 Anna Delaney: Thanks, Tom and Mat. And thank you so much for 453 00:24:47.280 --> 00:24:48.420 watching. Until next time.