Your Data's Already Gone. Now What?

· with James Lee
“You cannot have data compromised that you do not have.” James Lee has spent two decades watching breach transparency collapse, from near-total disclosure in 2020 to just 30 percent today. The president of the nation’s leading identity crime nonprofit breaks down why your Social Security number is worthless on the black market, why your driver’s license isn’t, and what individuals can actually do when the system designed to protect them has already failed.

That one percent of revenue put a hundred percent of the enterprise at risk.

— James Lee

Timestamps

  • 00:04 The ChoicePoint breach: what a 2005 data disaster taught James Lee about institutional accountability
  • 09:26 Are data brokers a net positive? The uncomfortable tradeoff between utility and risk
  • 12:07 Breach notice fatigue: when data breach letters become a humiliation ritual
  • 15:24 From ChoicePoint to ITRC: how James Lee ended up on the other side of the table
  • 18:42 Breach transparency on the decline: better legal cover, worse consumer protection
  • 23:33 The golden age of identity crime: what ITRC advisors hear from victims today
  • 28:27 Recycled stolen data: old breaches fueling new attacks at scale
  • 30:21 Why your driver's license is now worth more than your Social Security number
  • 33:42 KYC under pressure: app-based identity verification getting duped by stolen credentials
  • 38:13 The case for friction: why instant payments make fraud easier
  • 44:44 Government data sharing: DOGE, SSA, and the fight over federal databases
  • 48:18 Data minimization: should anyone be holding this much data in the first place?
  • 53:19 Credit freezes, passkeys, and individual control over identity
  • 57:33 Redesigning identity from scratch: ITRC's new vision
  • 1:01:06 Three things to do today if you've been breached

About James Lee

James E. Lee is president of the Identity Theft Resource Center (ITRC), the nation's leading nonprofit dedicated to supporting victims of identity crime at no cost. Before joining ITRC's staff in 2020 as COO, Lee served over a decade on its board of directors, including three years as chairman. His career spans senior leadership at ChoicePoint (now LexisNexis), where he navigated the landmark 2005 data breach, and Irish cybersecurity firm Waratek. He chaired two ANSI working groups on identity management and privacy and has testified before the Senate Commerce Committee on identity crime trends. Lee holds credentials from the University of Texas Center for Identity, the Wharton School, and the University of Arkansas.

Transcript

KYC is an interesting topic because it's in that area where data is so valuable. And yet we are so resistant to moving to some of the next kinds of technologies. And this is where we get into the difference between, we'll talk about biometrics, the difference between verification and identification. One's very good, less intrusive. Great when it comes to preventing fraud. Identification, not so much. Very intrusive, not as accurate. And even if it were, not very transparent. Authentication, very transparent because you're doing it. You are initiating a transaction. So the confusion that people have between the two leads to we don't want either one. James Lee, welcome to Trust Revolution. Well, thank you very much. Happy to be here. I am grateful for your time when, as you noted, it is a cold, still icy day in the Beltway.

And so where I'd like to begin, James, is we will certainly spend most of our time on your current work. But to set the backdrop sort of from inside the machine, let's begin here. You were a senior vice president at ChoicePoint when the organization was, as goes the story, selling personal data to identity thieves, which is an incident that triggered breach notification laws across the country. And so to the degree that you care to get into that, what did that experience teach you about how institutions actually think about the data that they hold? How much time do we have? As much as you want. It's actually an interesting story. It's one of those things where the gentleman who was the president of the company at the time, Doug Curling, Doug, I think, made the comment one time that anytime you go through an experience like that, you learn something.

It's like talking to your wife. It may not always be a pleasant conversation, but you're always going to learn something. And that was certainly the case at ChoicePoint. We were very much a leading edge in data at that time. We were big data before anybody used the term big data. And although we thought we had very good policies and procedures, and we were very thorough in our due diligence, it showed at the time that no matter how tight you thought you had the information locked down, how many good processes and procedures you had, they weren't good enough. And this was not a cyber attack. This was, today we think of it as a scam or fraud, where the individuals who got a hold of this data set up shell companies with storefronts, for the sole purpose of being able to access credit reports.

And they had business licenses. By all accounts, it looked like a legitimate business. Now, if you had scratched the surface a little bit more, you would have found out, wait a minute, and actually the technical things that we had in place did catch it, but not in time because they immediately began to order reports in a way that was not intended. The algorithms were tripped immediately. This is an anomalous search. And so they only had like a day, maybe, of exit. Maybe a little bit more than that. It's been nearly 20 years, so I don't remember all the fine details, but it wasn't very long. And we had this raging debate within the organization. Do we follow the California law only? because at the time, California had the only data breach notice law in the world, not just in the country, in the world. So do we just follow that narrow law, which said,

if it's, I believe it was 25,000 people or more who are residents of California, then you had to issue a data breach notice. And at that time, our count was around 50,000, 56,000, I believe, of the potential impact. Now, we're going to come back to the actual impact later, but the potential impact, that's 56,000 California residents. And so we had the debate, do we notify everybody whether they're in California or not? Because there were people who were not residents of California who were in their searches. And the decision was, well, we should, we have to, you know, the law says California, we're going to follow the law. Well, that didn't last long, because once you let that genie out of the bottle, there's no putting it back in. Data does not recognize those little dotted lines that we learned on the map. Indeed. Data goes everywhere.

And so you had literally the entire country going, is my data in their database? Or did they lose my data? Who are these people? Because we were not a household name, even though, if you had homeowners or auto insurance in this country, we probably had information about you. So the vast majority of adults in the United States were in our database, and every one of them wanted to know, what did you do with my data? So we had 100 media calls a day, every day for 30 days, not to mention all the congressional hearings, the state legislative hearings, regulators, all of the things which come around that, because nobody knew who we were. Nobody knew what we had and nobody knew what had happened. And as you say, James, forgive me, this is 20 years ago. 20 years ago. Yeah. It was 2005.

So when we expanded the search, we got up to, I don't remember the exact number, let's say it's 150,000. So we added another 100,000 potential individuals who had been impacted. And meanwhile, we're also going through inside the company, we're going through the whole discussion around, well, we know what happened now. We know what we need to do to make sure that that doesn't happen again. But let's look at this more broadly. What should we be doing? What should the business be? Not what it can be within the legal structure, but what is it we should actually be doing? And what level of transparency, because we thought we had a good level to begin with, but we obviously needed more. What more can we do? And we spent the next year rebuilding a lot of systems, rebuilding processes, reevaluating customers.

We exited entire customer sets, entire lines of business, because we determined it was not in the best interest of the company, our shareholders, and most importantly, the data subjects, the people whose data we had. It wasn't in their best interest that anybody sell that data. So we got out of the business in a lot of ways, particularly around things like private investigators, which is a big business. But ultimately, that area of the business where we sold that data, which we thought we were selling to insurance companies for insurance verification and verification of potential insureds, we exited that investigatory part of the business. That 1% of revenue put 100% of the enterprise at risk. So when you looked at it that way, it wasn't a very difficult decision. But more so, what should we be doing?

So that's when we started saying, well, we have a lot of data about a lot of people, but they don't know that. And they're going to be curious about that, as they should be. How can we make it easy for them to actually see the data? How can we follow the same guidelines and requirements as if it were a credit report? So under the Fair Credit Reporting Act, there are very specific kinds of rights and access. You have free credit reports, the free right to challenge the data, to have the data corrected. Let's apply that to public records data. Let's apply that concept to all of the data we have, which is what we set about doing over the balance of the year. So ultimately, for us, the learning experience was there are things you can do and there are things you should do. And the reverse of that is true as well. So we very much focused on stopping doing things which were not helpful and doing more things which were helpful to individuals, with a respect for their privacy.

But one thing that, and it's still in the debate today about the use of data, the collection and use of data, and that is sometimes data is actually very useful and people like it. It makes things convenient and people like convenience. And you shouldn't have convenience at the sacrifice of security or privacy. You can't have all three of those things. You just have to tune the dials to make sure you've got the right balance there. But it is certainly possible to do. And that's what we really set about doing after that: making sure we are being respectful of privacy, also respectful of making things secure, and offering products that help people get what they needed and what they wanted in a way that was both privacy- and security-centric. So it sounds like, here 20 years later, we're going to, in short order, get into, pardon me, your work now.

It sounds like you maintain a perspective that data brokers are a net positive, or how would you now reflect on their role in at least American society and business? Well, there's a lot of things we need. Maybe we don't always like who it is that gives it to us, but we need it nonetheless. So there are good players and there are bad players. And that's true of most any kind of industry, most any kind of business. There are some people, there are some businesses that you can fully embrace and endorse what they do and how they do it. And then there are others you kind of go, they're a company. Right, right, they exist. Yeah, and they're doing, they're obviously following the law and somebody likes what they do and whatever. So on the whole, we have built an entire global economy based on data. The concept that we can overly restrict or overly release data, both poles of that are not realistic and don't lead to good outcomes. We need data, but what we need is data at the right time, in the right manner, and with all of the protections. We need to have regulation. In the United States, we need to have more regulation. Now, it doesn't have to be regulation that chokes things off or makes things particularly difficult, but we do need more regulation, if for no other reason than to drive out the illegitimate actors, of which there are plenty, and to make the legitimate actors more secure so they don't have to make those decisions about, well, do I take this to the maximum level I can to protect somebody, or do I dial it back a little bit because my competition is out doing things that are harmful to people,

but they're impacting my financial health, so I can't afford to do the better thing. Yeah, it's fascinating. I didn't actually even put this here as a prop, but I will make sure my fingers are covering my PII. But, you know, here's another notice, this one from Conduent. There we go. Well, I mean, and so you too, James, got free credit reporting, right? And so I think what is, and this is a bit of a digression, but I think it's still on point, is it has become, I am personally desensitized to it. I kept it because I'm still debating, within reason, you know, within the realm of what's possible, what I want to do with it. Because, in my view, and I'd love to get your take on this, it has become a humiliation ritual of sorts: we, you know, and I love terms like threat actor, all these are just absolute nonsense. And so, you know, we messed up, we lost your data,

we got breached. We're telling you six months later, here's some free credit reporting. So, so I guess with that, what is your read, James, on the state of affairs broadly? And we'll unpack this over time. Well, let me ask it this way. When you, in particular, as you did, received this letter from Conduent, what's your take? What are your thoughts and feelings as you open these notices? You know, I'm burdened by knowledge. And the first part is right. The curse of insight. Yeah, I don't know anything about this company. I know nothing about the company, but I know a lot about data breaches. And this comes at the same time, right about the time we released the ITRC's 20th edition of our annual data breach report. I think within a matter of days. Now, this is dated December 31st.

Mine is. I'm not sure what year it's dated. It is as well. I'm looking at it right now. Yeah. Dated December 31st. I received it at the end of January, first part of February. I think I got mine early Feb. Yeah. And so I immediately look at that, and I knew from our research into last year's data breaches that this had occurred a long time ago. And they're just now getting around to issuing notice. But more importantly, I look at this and I go, I don't know who you people are. I don't know how you have my data. I don't know what data it is. You say you've got my social security number and address. That was in the file, and they have no evidence of actual or attempted misuse. Well, that's great. How do you know that? What forensic tools did you use? How do you know that? What else is in that file? And how did you get it? Because you're a third party.

So who was your customer? Whose supply chain are you a part of? So I can go and poke them in the chest and go, what are you doing? Right. And please do. What are you doing to protect my data? Right. Well, and that's probably a great, and I've buried the lede here, but let's then take that as an opportunity, James, to move on from ChoicePoint. You went to an organization that is now the nation's leading identity crime nonprofit, the Identity Theft Resource Center. Was there a specific moment when you decided you needed to switch sides? That's where you needed to be. Well, so part of the whole ChoicePoint experience was when we recognized, look, we've got to do more and we have to lead by example. We donated a million dollars to the ITRC. And after that, after a period of time, I was invited onto the board, and I served on the board, well, a total of 14 years.

In two different stints. And I actually served three years as chairman. The person who's now the CEO of the organization, I was the chairman when she was hired. And after I left ChoicePoint, because we were ultimately acquired by the parent company of LexisNexis, and so that was in 2008, I left and then I did consulting for a number of years and then ultimately ended my corporate career in cybersecurity, particularly application security, in Ireland. But the entire time, I stayed in touch with this concept of data breaches and was involved with a number of efforts around data privacy and how to improve data privacy. I worked with ANSI, the American National Standards Institute, on a couple of projects to improve data privacy and identity management around the whole concept of verification, but doing it in a way that is privacy-centric. So I had kept in touch with this. And then in my cybersecurity work, it was all around keeping data secure, particularly keeping systems where data is held secure

and protected from software glitches and bad code, which is one of the leading causes of data breaches historically. Certainly. Today, the leading cause of data breaches is actually other data breaches, but that's a different, we'll probably get to that. But I stayed in touch with that. And then when I left the company in Ireland, when it was a startup, so another round of funders, a new team comes in. Sure. I was having dinner with the CEO of the ITRC, and she said, you know what? All those years, you were badgering me to do some things and to grow some things and to expand into more policy work, expand into some products that can be helpful for individuals and businesses. Take that mountain of data that we've been collecting since 2005 and actually do something with it. You need to come to San Diego and do that. So that's exactly what I did. And then six weeks later, we had a global pandemic.

And we all know how that went. Well, so now, with that, ChoicePoint 2005, as you noted, now almost 21 years later, ITRC's own data shows breach transparency is on the decline, if not collapsing. And so the data brokerage, pardon me, industry sector, have they learned anything from the era that you worked through and built through? Or are they repeating the same mistakes with better legal cover? I think what we have seen is, your last statement about better legal cover, I would apply that broadly across business, period. Fair. Not just data brokers. I do think the data industry, writ large, so that's any company that has data and winds up selling data, whether that is their sole product or not. But let's just think of organizations that collect it, where that may not be what they do for a business.

MasterCard comes to mind. Yeah. And any of the credit card companies, any of the retailers, they're going to sell those lists. So they're in the data business along with selling widgets. What all of those organizations have learned is if you do not have proper cybersecurity, you are going to have a data breach. If you are going to have a data breach, you are going to have an extraordinary level, depending on your size, an extraordinary level of unbudgeted expense and reputational expense, customer churn. You're going to have all those bad things happen. You are probably going to survive, but you're never going to be the same. And if I may then, James, which I find really interesting, perhaps it's the cynic in me, but I tend to think it is a cost of doing business. But what I hear you say is it is painful enough to motivate a change in behavior or perhaps it depends on the size of company and size of the breach.

And I think we're both right, because I think there are people who do treat it as a cost of doing business, and it's cheaper to pay the fine. There are clearly organizations, and I don't like to engage in name and shame. No, no, of course. But if we go back a few years, we look at a company called Blackbaud, a publicly traded company. Massive amount of data, never issued a data breach notice. Left it all to their customers to do it. Who in turn are nonprofits largely, if I recall. Which are nonprofits and who, in many cases, didn't know their data had even been compromised. And it wasn't just their data. It was their donors, their customers, all of the people who, maybe their patrons, their arts organizations. That was the data. Those were the individuals who were lost. Most of those individuals were never told. At the time, at the ITRC, we tracked

about 600 data breach notices directly related to Blackbaud. And that was after they were basically forced by the Securities and Exchange Commission to fess up. And they did it in their SEC filings, not in a data breach notice, which, I don't know about you, but I don't spend a lot of time reading SEC filings. Indeed, a great place to bury that information. So they were sued by the states' attorneys general. I think it was 49 of the states that sued. And in the course of that lawsuit, we find out it wasn't 600 organizations. It was 12,000, nearly 13,000 organizations. So the customers and the patrons of those organizations never knew their information had been compromised. Because they thought it was cheaper to pay the fine. And that exists in every kind of business, but not every kind of business has the impact that data businesses do.

When you hold the information of an individual, you hold the keys to their life in many instances. And when that's compromised, you have a deep and abiding impact on those individuals. And organizations who have gone through that generally come out with a much deeper respect for their responsibilities. Now, cybersecurity is the biggest risk. That's where most breaches begin today. And you can't stop all of them, but you can stop a lot of them. And if you have the right policies, practices, and procedures in place, you can get that down to a very, very small number. So you minimize the impact to the maximum degree that you can. They're always going to happen. We're never going to get to zero, but we can always minimize that impact. We can do more. So the companies that we want to encourage to do more, that we want to recognize and pat on the back when they do it, are those companies that take that seriously.

And the rest of them, we need to slap around. So I'm with you there. I'm with you there. So now, with over 20 years, I won't put a number on you, James, but with 20-plus years of work in this field, let's, you know, I think I saw someone use the phrase the golden age of identity crime. But you told the Senate, you know, that very thing, right? That guy being you. So what does that actually look like? What are the ITRC's advisors hearing from victims that is different from, say, five years ago? Yeah, great question. That was in front of the Senate Commerce Committee, and that was right at the tail end of the pandemic. So that was '21, I believe, 2021. So here we are five years later. And if that was the golden age, I'm not sure what happens after the golden age, but that's the age we're in now. The platinum age.

Yeah. For a frequent flyer analogy, we're headed toward diamond, double diamond or whatever. We are seeing things that have in many respects changed and in other respects not. Most identity crimes today, whether it's identity theft, which is when you take the information. So a data breach is identity theft. But then somebody has to use that. They don't want to steal it just because it's fun. Maybe once upon a time they did that. Now they don't. They want to make money off it. And that's when you get into frauds and scams. And so a lot of what we're seeing today are data-fueled frauds and scams. So whether that's the romance scams that we hear about now, things like, maybe they're AI-fueled, there's some sort of variation of what we call the grandparent scam,

which is a call in the middle of the night saying your grandchild, your child, your spouse, your cousin, a co-worker, they're in the ER, or they've been kidnapped, and they need $10,000 right now to be able to treat them, or something bad is going to happen. And those kinds of things, we see that. We see crypto scams. We see account takeover. Because one of the things that we see as a direct result of all this mountain of data, mountain range of data that's been stolen over the years, particularly the last five, is the ability to impersonate another person. You have enough information between what has been stolen, and then compare that with what's publicly available, probably on your social media account. Somebody can impersonate you well enough to either take over an account you already have or open up a new account in your name, and you are completely unaware that that is happening.

And, by the way, as I'm sure you're tracking, they can now replicate your voice pitch-perfect. Yeah. Yeah. Three or four seconds of audio is all it takes. Somebody can pull this off and pretend to be me. Yes. Good luck to them. If you want to be me, here, I'll give you the key. But that's something that, you know, if you're the person whose data has been stolen, that realization is just now kind of coming to the forefront. This most recent report, the research we did for this report on the 2025 data breaches, was the first time we've actually seen people be able to say, I know, or at least I think I know, what they did with my data. So it's attempts at account takeover. It's actual account takeover. It's attempts at impersonation.

It's increased spam, increased scam texts, fake email. Our phones are virtually unusable at this point, right? Yeah, as a phone. And so people are paying attention, which is absolutely amazing, and it's great. And they are then taking steps after that that they have not taken before, which is also great. So the messaging, to a degree, is getting through. And the frustration with this constant rise of data breach notices is getting to a point where, you know, if people ever get organized around this concept, we'll actually see some good public policy come out of it. Right now, we don't have that groundswell. But boy, it sure looks like the early indicators of that are there and may very well grow over time, because there's nothing that I see on the horizon that is going to reduce the number of data breaches and data breach notices.

You touched on it, which is you flagged so-called recycled information, so stolen credentials from old breaches being repackaged for new attacks. And so, you know, if our data is already out there from multiple breaches, what's the risk model for an individual? So to your prior point, we're becoming more aware. I hope we're becoming more intolerant. But realistically, what should we expect as individuals? Tough question, perhaps, but, you know, how bad is it going to get before it gets better, if it gets better? Yeah, that's the, and I think that's one of the great unknowns. I think what we're going to see is, again, the continued rise in the volume and probably velocity of data breaches, which means we're going to get more data breach notices. We're at a point where if you're an adult in the United States, there's not a lot more about you that isn't readily available already. And that's a staggering statement to say and to absorb.

But your social security number has been available for years. Probably the thing that is coming on now is driver's licenses. They have not been very valuable historically, but since the pandemic and during the pandemic, we have seen the value of a driver's license number go up. A lot easier to get a new driver's license number. And why is that, by the way? Because we're using it in ways we've never used it before. So think about it historically, pre-pandemic. So let's talk about if you're an adult in the United States in 2019, you used your driver's license when you went to the airport and you used your driver's license when you went to a place where you had to show your age, whether you're buying alcohol or whatever reason. And if you're a college kid with your fake ID, right? Those are the only times. And then if you got pulled over, you know, those are the times you needed your driver's license. Now, fast forward. Think about mid-2021. You could open up a bank account online.

You can open up any kind of account online. Anytime you need to verify you are who you say you are, you're setting up your account with the IRS, they're going to make you go through a process. What do you have to do? You've got to show your driver's license. Your insurance company, you're renewing your auto insurance. What do you got to do? You've got to show your driver's license. Your driver's license has become the de facto social security number, because the social security number is so compromised. So bad guys figured this out. So your social security number, in a marketplace where they buy and sell data, is a BOGO. You buy data and they give you the social security number. There is no financial value, in terms of a purchase price, to a social security number. Your driver's license, that number went through the roof. At the height of the pandemic, you're talking about $300, $400 for a driver's license number and the attendant information. Now it's back closer to $150, $200. Social security number, still zero. Driver's license, $150, $200,

depending on what state it's from, and that then is used to impersonate you, to what? Open up accounts. You know, if it's a good enough fake, and some of the fakes are good enough, it gets you through TSA. It gets you through any other kind of situation. We have recorded instances, working with people, where they have encountered law enforcement coming to them and either attempting to arrest them or telling them they have arrest warrants, or getting judicial notices for failure to appear in court, because the driver's license presented to a law enforcement officer was so good that it passed their test. And that person then later found out they were in a different state in an auto accident or in some way encountered law enforcement, and they only found out about it when somebody came to arrest them. Incredible.

And so let me ask you this, James, on that note: given my field of work, certainly over the last four or five years in payments, KYC is an ongoing raging debate. And so the question I have for you there, which, what you mentioned is fascinating, zero value to a Social Security number, and nominal, growing, significant value to a driver's license. And the story you just told about being able to deceive a cop with a driver's license is just staggering. So my question is, to what degree are you tracking, and do you see, these app-based KYC processes being duped by stolen or fake identities? Where's that on the radar for you? It is on the radar. And what we're seeing today, fortunately, is more attempts than successes. Because some of these companies are either using or offering app-based KYC solutions.

They're actually more effective than you think they might be sometimes. It's like everything else. There's a wide variety. There's some that are very intrusive. Yeah. But they are blocking. So we are seeing more attempts, but we also are seeing more successes overall in account takeover, which means whatever we're doing from an account verification standpoint at the start isn't enough. Which is one, it's an area of granularity we've got to get into more, which is the differentiation between when you establish the account, so account initiation, and the takeover occurring later, where there's no secondary verification along the way. So you log in one time, you verify one time, and then they trust you from there on out. Well, that doesn't work anymore. You need to have secondary verification and authentication periodically.

Now, some people would argue that how valuable whatever you're trying to protect behind that account is would indicate, well, I need to do that every time I log in. And others, you know, if it's my IMDb account, before I figure out what movie I'm going to go see this weekend, maybe not so much. But if I'm getting into my bank account, they better authenticate me every single time. And you do see that. But KYC is an interesting topic, because it's in that area where data is so valuable, and yet we are so resistant to moving to some of the next kinds of technologies. And this is where we get into the difference, and we'll talk about biometrics, between verification and identification. One's very good: less intrusive, great when it comes to preventing fraud. Identification, not so much: very intrusive, not as accurate, and even if it were,

not very transparent. Authentication, very transparent, because you're doing it. You are initiating a transaction. So the confusion that people have between the two leads to: we don't want either one. Right. And I would add, pardon me, James, I would add the, well, the intrusiveness, the friction. You know, I mean, we could easily see, and I've had a couple of these conversations with folks in the field, that every time I open an app, it needs to scan my face. You know, I'm sure from their standpoint, they could make these arguments. Perhaps, you know, ITRC could make these arguments. So I guess the gist of my question is, what do you see in terms of where it ends, right? I mean, diminishing returns. One could be like me, skeptical: they'll always breach it. It's always going to be insufficient. So sort of where

do we land on this eventually? I think that's a conversation we've got to have. And I think it's a matter of degrees. It's back to: which is more important to protect? If I'm protecting my financial assets, my real estate assets if I have any, my family, whatever it is that I need to protect, I may be willing to share and go a little bit further on that than I would be on something of lesser value. I don't need to show my face, my finger, a code, or something for a routine transaction. If it's even a transaction at all. I might just be seeking information. I might be browsing, looking for something before I ever make a decision that I'd better purchase. Where I'm not providing data, I don't need to have a lot of friction. I need a lot of friction as you go up that value chain. And at the ITRC, we're big fans of friction.

We think we have gone too far down the path of convenience over security. But if some people want to go to the maximum degree, it should be their choice; it shouldn't be the default. And that, I think, is the crux, absolutely. Yeah, let's think about where we are with any of the instant payment processes. You know, most people that we talk to who get scammed today, their method of sending money, if they're not actually getting cash out of the bank, is one of those instant transfer products. Well, it's just like cash. Once it's gone, it's gone. Now, maybe there's a way to claw it back, depending upon the time of the day, the day of the week,

other things, other circumstances. But for the most part, that money's gone. And you, as the individual who finds out later that you're the victim of a scam, you think the financial institution should make you whole. Maybe they should, maybe they shouldn't. That's a different debate. But what we also know from working with victims is, when you talk to them, they'll tell you: if I had only taken a minute or two more to think about it, if I hadn't made that decision in the heat of the moment, I wouldn't have done it. Well, what if we didn't make it quite so easy to send large sums of money? Which is relative: $500 is large to somebody and not so large to somebody else. But whatever the amount is, what if you had the option to say: don't let me make an instant transfer unless I talk to somebody, or I get a second verification.
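That "don't let me send it instantly" option can be sketched as a simple routing rule, with a per-user threshold standing in for whatever amount counts as large to that person. All the names here are hypothetical, a sketch of the idea rather than any real payment API:

```python
# A transfer at or below the user's own threshold goes straight through;
# anything above it is held until a second verification happens (a call,
# a delay, another factor). The user, not the institution, picks the
# threshold.
def route_transfer(amount, user_threshold, second_factor_ok=False):
    if amount <= user_threshold or second_factor_ok:
        return "send_now"
    return "hold_for_verification"

print(route_transfer(50, user_threshold=500))                           # send_now
print(route_transfer(5000, user_threshold=500))                         # hold_for_verification
print(route_transfer(5000, user_threshold=500, second_factor_ok=True))  # send_now
```

The design choice mirrors the point in the conversation: the friction is opt-in and proportional, so routine small payments stay instant while the heat-of-the-moment large transfer gets a forced pause.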

There's got to be something to insert some friction. Milliseconds to complete a transaction is not going to make somebody go someplace else. Right. Well, and let me ask you this, James. In fact, today we'll publish an episode with, pardon me, a gentleman, Jesse Poster, who's the co-founder of a company called Vora. And so this gets us into the Bitcoin realm and self-custodied money and all these things. And so what Jesse and team are building at Vora, and what is available in different degrees, are devices, so-called signers or, you know, cold wallets, all these terms that float around for those not familiar, that are true self-custody. And with that, I am able to fully control, in this case, Bitcoin and the way it moves. So I have extreme ownership and extreme responsibility.

If I screw up, it's on me. There's nobody to bail me out or claw it back. So I say all that to ask: in your circles, be it DC or otherwise, to what degree do these options appear in conversation? To what degree are they palatable to those making these regulatory or other choices? Or is it, as you say, that the financial institutions need to add additional scrutiny, additional verification? So maybe on that spectrum, from "it's mine, I control it, I own it, I'll do what I want, I'll make my own mistakes and be responsible," to "the institutions need to keep me from hitting my thumb with the hammer." Yeah, great question. I think we're in the very, very early days of these discussions. And it's because of the circumstances that we've now seen emerge, where you have people who are losing large sums of money through fraud,

as well as people who have large sums of money, on a relative basis, large to them, that they want to have more control over. Both are equally valid. But you're trying to wedge both of them under a system that doesn't really recognize the value of either one. So I think we're early days in this discussion, but there is this growing recognition that this maximalist, friction-free road we've been going down is not serving everyone well. And we have to maybe come up with a process and a schema that recognizes the spectrum, as opposed to shoving everybody into "everybody gets the maximum amount that we can technologically do today at scale." Right. Fair. Absolutely. Dial it, not necessarily dial it back, but dial it differently.

Let's divide things up differently. Those discussions, I think, are beginning to happen, because everybody's realized that it's very problematic. And as you might imagine, most large institutions are trying to avoid overregulation. So if they can avoid overregulation by changing processes to give that kind of flexibility, to let people decide how much friction they want, then that's probably where we'll see more action over the next year or so. But as you might imagine, these are not easy conversations, they're not going to be fast conversations, and there's going to be a lot of bumps along the road. Well, speaking of the fox and the henhouse. Yeah. The government. And so I don't know, and perhaps you do, the degree to which it is still an active conversation. But before Elon left the building, there was obviously a very lively debate about DOGE and accessing Social Security Administration data, you know, combining that with other federal databases.

As someone who tracks this, what's your read on the government itself, at least at the federal level? Where are they on that spectrum from helpful to hindrance? Yeah. And I know it's not one thing, so. Yeah, yeah, because there are a lot of moving parts. Yes. And we're involved in, there's a lot of litigation over this. We just had a decision late last week over the Social Security Administration sharing data with DHS. And you've had the litigation around IRS data being shared. And so there are both good and bad things within government regulations and laws about data sharing. Let's set aside the last year, because one of the things that frustrates

victims of identity crimes, and those of us who work in these areas, where you're talking about identity theft, fraud, and scams, is the lack of data sharing within the government. And there are a lot of barriers. There are statutory barriers that say thou shalt not share this with anybody else. In many cases, like the IRS data and Social Security data, there are very good and valid reasons; for some of the other agencies, less so. If you're sharing with another agency just something as basic as how many reports of identity theft there were, what the particulars were, not necessarily who, or any of the personal information, but just the fact that it existed, that kind of data doesn't get shared. We don't have good data around the number of individuals who are actually impacted by fraud, scams, and identity theft.

What data we do have is all self-reported. And I presume, Lee, James, excuse me, that that goes to incentives? Or is it incompetence? Is it complexity? What's your read on that? More than anything else, there is incentive, and it is nothing more complicated than turf battles within government. There's always a level of turfiness, and that's a big part of it. In the business world and in all of our personal lives, we can't manage something we don't know about. I can't fix something I don't know about. And I particularly can't manage risk I'm unaware of. Right. And we don't have good data about the real scope, not just of the

volume and velocity of the occurrences, but of what the real impact is on real individuals. We don't have good data about that. It doesn't have to be in a central repository, but it does need to be collected and shared. You can scatter it across as many agencies as you want, but it does need to be collected and shared so we can then make good and valid decisions about how to address whatever the underlying issue is. We don't have that. I think, if I've got this right, ITRC's data shows over 25,000 breaches over 20 years and 79 billion exposed records? Yeah, that's right. And so, you know, staggering. And I guess, as we talk about multiple agencies, the federal government, data brokers, it certainly occurs to me that at some point the question stops being how do we secure these databases better, and becomes: should anyone be holding this much data in the first place?

Where are you on that question? We're a big fan of data minimization. Again, there are good and valid reasons why data should be collected and used, but that's different from whether it should be stored. So when we talk about data minimization, it's a multi-step process. First question is: do I need the data? And let's separate need from want. I might want the data; the marketing department always wants the data. But do they need the data? No. So if you don't need it, don't collect it. You know what that does? It reduces your risk profile, because you cannot have data compromised that you do not have. So data breaches go down right there. Data breaches go down, victims go down. Data breaches go down, cybersecurity attacks go down. Do you need the data? Yes or

no? Okay, I need the data. Okay, why do you need the data? And how long do you need the data? So once that purpose, that valid purpose, is fulfilled, are you statutorily or regulatorily required to keep it? If not, get rid of it. Don't store it. And this gets to that convenience, that friction part again, where they ask you, well, do you want to store the credit card? Don't ask that question. Just don't do it. If you don't need it, you've completed the transaction, you've provided the receipt, you don't need it anymore, get rid of it. Having to fill in the credit card number again when they come back? My browser will do it quickly for me. Yeah, the transaction is 15 seconds longer. If they want the sweater, they're going to buy the sweater. Don't worry about it. And then the third thing is, if you need the data and you have to keep the data, you've got to store it in a secure fashion.
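The three questions, do I need it, why and for how long, and must I keep it, can be sketched as a single decision gate. The field names here are hypothetical, just a sketch of the logic being described:

```python
# Data-minimization gate: each data element is either never collected,
# deleted once its purpose is served, or stored securely if retention
# is legally required.
def retention_decision(needed, purpose_fulfilled, legally_required):
    if not needed:
        return "do_not_collect"        # want is not need: don't collect it
    if purpose_fulfilled and not legally_required:
        return "delete"                # purpose served: get rid of it
    return "store_encrypted"           # must keep it: secure it

# The stored-credit-card example: needed for the sale, purpose fulfilled
# once the receipt is issued, and no legal duty to retain the number.
print(retention_decision(needed=True, purpose_fulfilled=True, legally_required=False))   # delete
print(retention_decision(needed=False, purpose_fulfilled=False, legally_required=False)) # do_not_collect
print(retention_decision(needed=True, purpose_fulfilled=False, legally_required=True))   # store_encrypted
```

Note the outcome for the card number is "delete," not "store for convenience": the risk-reduction argument is that every field routed away from storage is a field no breach can expose.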

It's got to be encrypted. And now, increasingly, we've got to prepare for quantum. So it's got to be encrypted in a way that will still be safe and secure once quantum becomes mainstream. Of course, there are a lot of people who believe that we're having preemptive attacks to cache data that is encrypted now. They just hold it so that when quantum becomes available, they can go in and decrypt it using quantum. But the base part of minimization is this: we over-collect data, and it creates risk, not reward, not financial value in the long term, for organizations. Changing that mindset is very difficult and is not going to happen absent regulation. We've been having these conversations for 20 years. The computing power we have, the storage capacity we have today,

has led to the massive amounts of data that's been stolen, because we've collected massive amounts of data that we previously couldn't collect. Was it Edward Snowden, during those revelations, who called the NSA's approach "collect it all"? Not to conflate the NSA with your average data broker, but I think to some degree we can. Well, you certainly can. It is a mentality. And I wouldn't limit it to the NSA or to data brokers. It is any organization. My point precisely. Yeah. Retail is probably the absolute worst when it comes to that, because they're always profiling their customers. They're always profiling the purchase. They're always trying to figure out what it will take to get you to buy more. Obviously, the social media platforms, which are just giant advertising platforms. Always remember: you are the product. You are not the customer. You are the product. And so they're always wanting to collect more, so they can analyze more, so they can be more refined in it. So it's a global obsession.

So we're not going to change that, certainly not overnight. And it's only going to change if we have a regulatory structure, whether that's self-regulation or government regulation, that gives people the incentive to collect less, use less, and store less, because it reduces their risk. Absolutely. And in combination with that, you, I believe, advocate for credit freezes and the use of passkeys, which are promising, you know, I think still somewhat emerging technology. Both of these, to differing degrees, put control back in the individual's hands instead of trusting an institution to protect you. Is that the direction identity needs to go? Less custodianship, more individual control? And what's your over/under on the average individual's ability to do that? I do think we are moving to a time of more balance,

where people are beginning to realize: look, nobody's going to take care of me. I'm going to have to take care of myself. So that's why you're seeing the increase in passkey usage. You're seeing an increase in credit freezes. You're seeing an increase in people closing old accounts, clearing out old files and things which have value to an identity criminal but that they don't need anymore. And they're asking organizations to get rid of it too: you don't need all this data on me. I don't want you to have this on me. And so in the places where they allow you to delete it, you're seeing people actually do that. Not in big numbers, but that will grow over time. So we're seeing this balance emerge, where historically it has been: that's somebody else's job. The company that has my data, it's their job to protect me, not my job to protect myself. That was never true. It always had to be a joint effort.

And the visible part of the failure was always on the part of the organization. And the highly visible part is still organizational failure. But because we weren't, as individuals, taking the steps beforehand to make that data less useful, it exacerbated the impact. That's what we can do as individuals: make our data less useful. It's hard to make it less available, but easy for us to take steps to make it less useful. And so long as the bad guys continue to be basically lazy, if they can't do it at scale and they can't do it automatically, if you throw up roadblocks to their usage, they're going to go away. They're going to move on to somebody else, because you're making them work. They don't like to work, so they'll move on to somebody else who makes it easy. So we're reaching that balance, but I do think that continuum has to do just that: continue. Businesses have to continue to improve.

Individuals have to continue to take more responsibility for their own data protection. And ultimately, government has to provide the framework that incentivizes both to do that. And that incentive may be a hammer. It doesn't have to be a financial incentive. It doesn't have to be a get-out-of-jail-free card. It can be: there's a hammer. But there should be a level of, if you're doing all the right things all the right way and you still get attacked, maybe you do get, not a get-out-of-jail-free card, but a pass on the X multiplier on punitive damages. Or, you know, maybe you get out of some other forms of liability. Maybe the regulators won't make you sign a 20-year deal that you'd otherwise have to. Right. Yeah.

But we have to have all three of those elements working together: individuals, organizations, and government. And that's what's missing today. And that's one of the things we brought home in this most recent report, and we're going to talk about it increasingly over time: no one group can do it alone. Consumers can't do it by themselves. Businesses can't do it by themselves. Government shouldn't do it by itself. So we've all got to work together, or this problem is not going to get better. It's only going to continue to get worse. Appreciating the naive nature of this question, I am still interested, James, to ask: if you could redesign how identity works in the U.S. from scratch, if you had that magic wand, what would it look like? Well, you know, we just changed our mission statement and our vision to be a very simple one: a world where nobody can use my identity but me. And so I think we would need to design processes

that start at birth, that give you full and complete control over your identity, with all the protections that come with that and would be required of anybody that you're sharing it with, but also all the responsibilities that you would incur. Now, from a practical standpoint, a credential that follows you throughout your life and is completely static? Not sure that's the way to go. Don't know. Because how would you secure that? You'd still have the same problem we have today. The Social Security number was designed to be a lifetime number. Now we've got driver's licenses that are designed to last your adult life. If we could get people over their resistance to biometrics, a secure biometric would undoubtedly be a part of that.

But we've got to rethink the whole concept of how identity data is captured, used, and stored, and then we have to reevaluate when and how we authenticate that identity. What we're doing today works okay, but obviously we see the flaws. And when the flaws occur, there's significant impact. Well, the good part about it is, as you well know, there are trillions of transactions a day that involve identity. Not your individual identity, but individuals' collectively. Trillions a day. The vast majority of those go through without a hitch, and they're secure and they're not at risk. But that doesn't matter. It's the slice that don't, and the fact that there are people in the world who are hell-bent on making sure

that that number grows. The number of transactions that don't make it through, they want that number to grow. So we have to design an identity system that is built for that world, whereas the identity system we have today is not. And to me, that's a valuable framing: the Social Security number, initiated, what, in the '30s as an accounting identifier, assumes a benevolent environment. And I think, to your point, we have to assume a hostile environment and build accordingly. Well, on that note, let's wrap it here, James. So for someone who's listening or watching, who's just gotten their third, you know, breach notice this year and hasn't done anything, what are two or three things they should do today? Whether you've got a data breach notice or not, go freeze your credit.

This is not 20 years ago, when credit freezes were first introduced, where you had to pay for them and you had to physically mail a letter to the credit bureau and wait for a letter to come back. Look, it's as easy as any online transaction. It has no impact on your credit score, which some people think it does. It does not. So freeze your credit, because that's the only thing that is going to stop something bad from happening. Everything else is basically a trailing indicator. So freeze your credit. Adopt passkeys if you haven't already. I would assume most of the people listening to this are probably well down the path of passkeys, realizing the value, because there's nothing stored at the account side; it's a token on your side. You never see it. You cannot self-compromise, which is how a lot of data breaches occur today: people self-compromising credentials that get used later. So passkeys are a vast improvement over what we have today for account opening and account verification.
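The property being described, nothing secret stored on the account side, is what public-key authentication gives you. Here is a toy sketch of the idea using a miniature Schnorr-style signature. The group parameters are deliberately tiny demonstration values, nothing like the standards-grade curves real passkeys use via WebAuthn, and the whole thing is illustration only:

```python
import hashlib
import secrets

# Toy Schnorr-style signatures over a tiny group (p = 23, q = 11, g = 4,
# where g generates a subgroup of order q). Illustration only.
P, Q, G = 23, 11, 4

def keygen():
    x = secrets.randbelow(Q - 1) + 1   # private key: stays on the device
    y = pow(G, x, P)                   # public key: all the server stores
    return x, y

def h(*parts):
    data = "|".join(str(p) for p in parts).encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % Q

def sign(x, challenge):
    k = secrets.randbelow(Q - 1) + 1
    r = pow(G, k, P)
    e = h(r, challenge)
    s = (k + x * e) % Q
    return e, s

def verify(y, challenge, sig):
    e, s = sig
    r = (pow(G, s, P) * pow(y, -e, P)) % P   # g^s * y^(-e) recovers g^k
    return h(r, challenge) == e

# The server issues a fresh challenge; only the key-holding device can
# answer it, and the server's stored value y is never a usable credential.
x, y = keygen()
challenge = secrets.randbelow(10**9)
print(verify(y, challenge, sign(x, challenge)))   # True
```

The design point matches the advice: a breach of the server's credential table yields only public keys, which cannot be replayed the way stolen passwords can.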

If people are still using passwords, obviously, make sure you're not reusing them, because we still see a very high number of people using the same password on every account. There's that convenience factor again. So don't do that. Use a password manager. Also not perfect, but better than nothing. MFA: not perfect, better than nothing. So, taken in its totality, if you take those steps, from a technical perspective that's about as far as you can go as an individual. Now, there are some other things you can do and should consider doing. Do you like the privacy policies, the data practice policies, the data collection? Do you have the right to look at your data? Do you have a right to correct your data, delete your data, at all the organizations where you do business? And if you don't, find someplace else to give them your hard-earned money. You don't always have to go to the same organization

if you don't like what they're doing. If they breach your data and you don't think they're doing enough to protect it, go someplace else. But that's a personal choice. And that's the kind of thing we have to get people to do more of today: think critically about it. In cybersecurity, it's the zero-trust model. I only want to do business with people I trust, who are going to protect my interests. And if you don't, I'm going to go someplace else. So we have to do that. We have to have a critical eye toward the things that are coming at us. There's a scam going around now with the Social Security annual benefit notice. Some of your folks watching or listening may have received one of those. They are letter-perfect. Letter-perfect, but they're fake. And when you click on it, you go to

an info stealer. When you're trying to verify your Social Security Administration login information, it's just collecting it. It's a fake website. So, you know, we have to have this critical eye to look for those telltale signs, which is harder to do with AI making things so realistic. So we have to be that zero trust. We have to have a critical eye toward what we're receiving. And don't be afraid to ask questions. Don't be afraid to say no. Maybe your mother told you it's rude to ask a question, or it's rude to say no. It's neither. And it's something we have to get far more comfortable with than we are historically. Most Americans, we're friendly people. We're gregarious. We're outgoing. We like interacting with people. And the bad guys use that against us. We want to be helpful. We want to be helpful to other people. We certainly want to be helpful to our friends and family. So if we think we're dealing with somebody that we know, and it turns out we're

wrong, those kinds of scams occur every day. So we have to learn the tools of asking questions. If somebody asks about money, and they've never asked you about money in the 25 years you've known them, or the 25 minutes you've known them, that's a red flag. Follow up on that. Don't just assume that it's okay. So we've got to change a lot of the ways we think and do business. And we can sit here and say, gosh, I wish we didn't have to do that. But unfortunately, my friends, we do. So those are the kinds of things, the technical things and then the sort of lifestyle things, that will make us more secure. Great, great advice, James. Thank you for that. And so, paired with that, what should a small business owner who relies on third-party software for everything they do be doing differently after reading ITRC's 2025 report?

And it's so hard to tell the difference between a small business and an individual anymore. In a lot of cases it's gig workers, single-entity LLCs. Hopefully, they've got all that software, SaaS, software as a service, all of that, with the automatic updates set. That is the bare minimum you need to do. Your antivirus and everything is kind of built into your software now. It's built into your operating system. You can always do belt and suspenders and go to a higher level of antivirus. That's a resource question for most people. But even before you do that, if you want to spend more of your hard-earned resources, spend it on a managed security service provider, an MSSP. Have that third party who's going to monitor your equipment, who's going to monitor your traffic in and out of your network, look for those attacks, block those attacks, and keep those tools up and running.

And that is sort of the first step, beyond just doing everything yourself, that you should really contemplate. And that's very important, because small businesses are big targets. People may think that just because I'm small, I'm a one-person shop, I'm this, I'm that, I don't have anything that anybody wants. That's not true. The bad guys will find a way to make money off of anything and anybody. So you are a target, and you need to do things to defend yourself. And so as soon as you get a little extra cash, when your business is growing, get that managed service provider. And then ultimately, you know, the goal would be to have somebody on your own staff who's looking after that as your business grows. That is very important. Whether you have employees or not, train yourself to look for all of those indicators of fraud, because fraud now leads to cyberattacks.

So they're going to try to trick you into maybe paying an invoice you don't owe. I get two or three of those a day, it seems. Yeah, so things like that. Learn those indicators. And if you're in an industry that has some sort of association in your state or in your community, those groups probably have resources, too, for what specifically is being targeted in your particular business sector. So take advantage of those things. But train your staff to look for that too, because the bad guys know that the weakest link in even the best security is human beings. Maybe well-intentioned, but somebody's going to click on a link. They're going to answer a phone call and provide information they shouldn't. They're going to respond to a text. It is so hard to make sure that doesn't happen. But the way you do that is through good training.

And it's not just training the day they show up for work. It's periodic training, because the techniques change. The technology changes. So you need to do that on a periodic basis. Those things are sort of the baseline for small businesses. Excellent. Well, James, thank you so much for your time today. It's been great. I appreciate it. I wish you all the success in your Sisyphean but incredibly important tasks. And I say that with full sincerity, particularly in terms of policy. I know you guys are doing great work within the spheres you can control. But I know on the policy and government front, it is a tough slog. So thanks for the work you do, and thanks for your time today. Well, thank you very much. Appreciate it. Thank you.