- March 1, 2023
Bill Tolson: Welcome to Archive360's Information Management 360 Podcast. This week's episode is titled The Current and Future State of Information Management in an Evolving Data Privacy Environment. My name is Bill Tolson. I'm the Vice President of Compliance & eDiscovery at Archive360. Joining me today is Jesse Wilkins, president and principal consultant of Athero Consulting based in Denver, Colorado. Basically the same state I'm in. So we're relatively close today. Jesse, thanks again for taking time to join me for our podcast today to discuss data privacy challenges for information management professionals and all the stuff associated with that.
Jesse Wilkins: Thanks, Bill. I'm really happy to be here today.
Bill Tolson: Yeah, that's fantastic. I've actually been looking forward to this. This is going to be a lot of fun, and I know you've been in the industry a long time, as have I, so I think we'll get into some really interesting stories and opinions. So with that, I'll go ahead and kick us off. Based on the subject matter we're going to be talking about today, I wanted to start off by saying that new and emerging data privacy laws in the US, Canada, and the EU are quickly becoming an inflection point for information management professionals.
In fact, January is when most US state legislatures typically start a new session, and so far there has been a pretty big rush to introduce new data privacy bills in the state legislatures this month. On top of the five relatively new data privacy bills that became law over the last couple of years, in California, Connecticut, Utah, Colorado, and Virginia, eight states introduced new privacy bills in January of this year, 2023, just in the first couple of weeks. They include Indiana, Iowa, Kentucky, Mississippi, New York, Oklahoma, Oregon, and Tennessee. In fact, the state senator in New York who reintroduces a privacy bill each session, Senator Kevin Thomas, was on this podcast early last year, and we are scheduling him to be on again in the near future. Really interesting guy.
Seven of those eight data privacy bills were filed last session but are being carried over into this session, and three New Jersey bills rolled over from the last legislative session as well. And just like last year, there will probably be many more introduced in the next several months. Last year, many of those never made it into law, but several did, and I think we're going to see an acceleration this year of states finally adopting new privacy laws. As I've talked about in several podcasts and blogs, these new data privacy laws are similar but also slightly different, so organizations will not be able to pick a high watermark law with the intention that if they meet that particular law, they will automatically be in compliance with all the others.
Usually, and I don't know about you, Jesse, but usually I hear people say, well, we believe we're going to meet the CCPA and CPRA, therefore the other ones are a slam dunk. And for everybody listening, that is not the case. Each law is slightly different, so you can't say, gee, if I meet California's, then I meet all of them. I just want to make sure people are very aware that they need to look at each one of these laws very specifically, because their definitions are slightly different, their exemptions are different, and their timeframes to implement things are slightly different. You can't just say, well, I meet GDPR, therefore I meet all the rest of the world's privacy laws. It just doesn't work that way.
So Jesse, I'm sure our listeners are excited to hear your views on information management and data privacy laws. As I mentioned, all of these data privacy laws are similar but different, so companies really cannot rely on a single high watermark law and assume that following it ensures all the others are met. This really will raise the complexity and cost of compliance for organizations across the board, especially for the information managers in those companies who will be looking at a much larger data load. Because, and I think we'll get into it in some questions later on, I think information managers will potentially need to take on a lot more data to manage than just the five or ten percent that happens to be regulated.
Jesse, drawing on your decades of experience in the information management profession, what do you think this new data privacy landscape means for companies as well as for the information management professionals?
Jesse Wilkins: Well, I think you're right. I think we often fall into this trap of, well, we already did GDPR, we already did CCPA and the CPRA, so we're good. And I know that here in our mutual home state of Colorado, one of the interesting differences is that nonprofits are covered, whereas in most other privacy regulations, at least so far, they tend to be exempt, and the focus really is on private sector organizations. But you're right, it does make it more challenging. And I think one of the things that's really interesting to me, having these discussions with clients today, is that for so long, so many of us have been focused on the unstructured stuff, the emails, the invoices, all those things where we're collecting that personal information, but it's in a document, some sort of a document package, a standalone thing. You can look at it.
I know that one of the concerns playing out right now is trying to figure out all of the stuff that's in point of sale systems, HR information systems, marketing systems, your accounting software. How do you deal with the information that's stored in them, and what are you going to have to do, when many of these systems weren't designed to be able to export a particular consumer's information or to remove a particular consumer's information? So it's a different mindset for the organization. It's also a different skillset for the information professionals who are now being tasked with doing this. It's not just going to your repository, finding stuff, and pulling it out. Now you've got to pull it out from all these other places too.
And I think there's going to be a fairly steep learning curve on the part of those professionals and of their organizations as we all try to figure out what this all means. I think it's interesting, too, that in the list of states you rattled off at the start, some of these are new and some are reintroduced, but we're already starting to see the second round for some of these states, right? We've already done that with California, with CCPA and now CPRA coming into effect as of January 1st. But several other states are also starting to revise and revamp, and they've introduced legislation revising and revamping what just got passed, in some cases last year or the year before. I know I saw that Virginia has introduced something this session to change their data protection act. And I think I saw that, it's not Colorado, it's Utah that has already introduced something changing theirs.
And so I think part of the challenge that we're going to be struggling with is it's not just this patchwork of different laws and regulations with slightly different terms, slightly different penalties as you outlined, but it's also kind of a never-ending moving target, right? In that who's to say that California's not going to propose something else? At some point I think we're going to get to federal privacy legislation, but even if it does pass, it's not going to be a static target. And that whole preemption issue of whether or not it's going to preempt the state patchwork, I think is still a big thing that's keeping that from moving forward. And I don't know how they're going to resolve that.
Bill Tolson: That was a bone of contention in the last Congress, with Speaker Pelosi slowing or stopping the forward progress of the ADPPA because it potentially would preempt state law, especially California law, and they wanted to keep their own above-and-beyond capabilities. So now, with a new speaker, people are starting to think that maybe it will move forward, but we're talking about the federal Congress and their ability to come to an agreement. So yes, I think it's going to be very up in the air for a while. Both you and I have mentioned the state laws, but by the way, we're not just US centric, it's worldwide: there's the EU's GDPR, as well as Canada with their Bill C-27, which I think is in its second reading in Parliament, and most people think it's going to make it through this year. And in North America, we're all moving data around and collecting data.
But there are additional complexities, and people wonder what I mean by that. With all these laws that are slightly different from state to state, retention and disposition expectations differ based on data subject consent, the purpose for which the PII was collected, and the state the data subject lives in. So for example, if somebody collects my data and I'm sitting in Colorado with our Colorado privacy law, and then they're collecting data from others in the EU and in Virginia and Utah and so forth, each of those is slightly, if not more than slightly, different. What kind of consent was given? What was the consent for? Was it for a specific white paper or was it for something else? And each one of those states potentially has differing requirements around how long the data that was collected can be kept.
And they're not all the same. One state might say, well, gee, you can keep Bill's data for just the original purpose for which you received consent. That means you have to track that. You have to track when that consent was given, and based on that specific state law, my very specific PII is governed by it. But if you're doing that with other individuals from other states or other countries, those rules are all different. So what I try to explain to people is, it's not just, well, gee, you had to get consent, either opt-in or opt-out, and then you get to keep it for two years or five years or whatever it happens to be. No, it's very specific to the state that the data subject was in. And I'm just wondering, Jesse, in your practice, have you yet run across any companies that are trying to figure out how to map and track the complexities of these emerging laws?
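The per-subject, per-jurisdiction tracking Bill describes could be sketched roughly as follows. To be clear, everything here is illustrative: the jurisdiction codes, the retention windows, and the field names are placeholders, not values taken from any actual statute.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Illustrative retention windows in days -- NOT real statutory values.
# Each jurisdiction's actual limits depend on the law and the stated purpose.
RETENTION_DAYS = {"CO": 730, "CA": 365, "EU": 180}

@dataclass
class ConsentRecord:
    subject_id: str      # who the PII belongs to
    jurisdiction: str    # where the data subject resides
    purpose: str         # what the consent actually covered
    consented_on: date   # when consent was captured

def past_retention(record: ConsentRecord, today: date) -> bool:
    """True when the jurisdiction's (placeholder) retention window has lapsed."""
    limit = RETENTION_DAYS.get(record.jurisdiction)
    if limit is None:
        return False  # unknown jurisdiction: route to legal review, don't guess
    return today > record.consented_on + timedelta(days=limit)

rec = ConsentRecord("bill@example.com", "CO", "white paper download", date(2021, 1, 15))
print(past_retention(rec, date(2023, 3, 1)))  # True -- past the 2-year placeholder
```

The point of the sketch is the shape of the problem, not the numbers: the retention decision cannot be made without knowing who the subject is, where they live, what they consented to, and when.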
Jesse Wilkins: The company I'm working with right now is a large, exclusively US-based retail organization. They have operations in, I want to say, about 40 states, and certainly in all of the states that have already enacted comprehensive privacy laws. And of course, California now extends coverage to employees' as well as consumers' personal data. So really it's the five jurisdictions, plus they have to deal with employees who are California-based. I think, like everybody, they're struggling right now, because they know they need to get consent and they believe they've gotten consent. But a lot of this depends on the nature of the purpose for which consent was given.
And I thought up an example of this: a car parts store, not to mention any names, but someplace you would go to buy aftermarket or replacement parts for your car. If you go to one of these places because your check engine light is on and they're troubleshooting that for you, oftentimes it's a free service. They hook you up to the machine, it outputs the code, they tell you what it means, and off you go. There's really no personal information collected, or at least not necessarily. If you go in there and buy a pair of jumper cables, again, especially if you pay with cash, there's not necessarily going to be any personal information there either.
But now you want them to test your battery because you think something's wrong with it. Now they need a little more information, a little more specificity around the nature of your car, the nature of the battery, how old it is, and we're starting to get some things that could be personal. Well, now you buy a new battery for a 2016 Ford Fusion SE, you pay for it with a credit card, and you get the warranty. And now the expectation you have with regards to your privacy and how that personal information is being used is different in each of those examples. Cash versus credit can complicate that. A fleet purchase versus a personal purchase can complicate that. What if you're doing it through the website? Obviously they have to collect information through the website, because you can't pay cash through the website.
But what about joining the rewards program? What about downloading the app? All of these sorts of things pile more and more expectations of consent and expectations of how things are being used because they're collecting more and more things. I think that organizations haven't really even begun to scratch the surface of what that means from a retention perspective, from a findability of information perspective. Where would you find that sort of information in a typical organization? Because it's not in a content repository. It might be spread across six or seven or eight different systems. And that's before we get into things like aggregations and data warehouses, data lakes, those sorts of things.
And I think the organizations that are trying to tackle this are in an odd place where, because we're still waiting on regulations, the Colorado regulations are in draft, the California regulations are in draft. I don't know if Virginia has their act any better together than those two states, but we're in this weird place where we have to comply with these requirements without necessarily knowing what the regulations will be. And we have our stuff all over the place and we don't know how to deal with that. And so you're almost in this position where organizations are going to have to throw themselves at the mercy of the regulators and say, hey, we're trying, meet us halfway here.
Bill Tolson: Exactly. That's a great example, by the way. We made our best attempt, we documented what we were doing and why we thought it was the right thing to do, and then in midstream you amend the law or fine-tune a part of it in a way that makes things more questionable for us. How were we supposed to know about this stuff? And politicians in general, but state legislators and state governments especially, are not necessarily on top of things all the time. I think that's a great point: early on in the whole process, companies are spending a great deal of money trying their best to meet this stuff. Does that mean they're going to be perceived that way?
But that's a great point, especially early on with all these laws. Definitions are going to change, expectations are going to change, and keeping track of that is going to be a complex process for companies and their outside advisors. And by the way, I think the privacy laws that are coming up are great steps; I'm not advocating against privacy laws. But in this new environment, I think a lot of companies do not yet understand the kinds of resources they're going to have to invest to meet these new laws.
Jesse Wilkins: I saw a study released about this time last year that estimated that for current and contemplated state laws, the compliance cost for US industry as a whole is going to exceed $100 billion a year, so a trillion dollars over the course of a decade. And they estimated that if all 50 states pass their own just-slightly-different-from-the-next-guy legislation, it could be upwards of a quarter trillion dollars per year. And that's just the hard cost associated with new technology and hiring the equivalent of data protection officers. There's also the soft cost of complying with data subject access requests and opt-outs, and retooling everything from policies to retention schedules. It's a lot.
Frankly, I'm a little gobsmacked, I guess, because I remember there used to be a movement, at least in the United States, toward model legislation, the uniform codes, I believe. There was one in our industry for accepting photographic reproductions as evidence, and that gave way to faxes and ultimately digital signatures and a whole host of other technologies that we've all embedded into our work processes. But I'm surprised, frankly, that there isn't something like that out there, maybe with California taking the lead, or whoever maintains those uniform codes today, saying, look, GDPR has this, CCPA has this, let's mash all this together and homogenize it, and here's the model.
And you don't have to implement the model exactly as written, but everybody's law is going to cover opt-in versus opt-out, and everybody's going to have a consistent definitional approach. I'm just surprised we haven't seen something like that yet.
Bill Tolson: Well, it's interesting, because on one of our podcasts last year I had on a professor from the Uniform Law Commission who was one of the co-authors of the model privacy law they were trying to get all the states to adopt. It was a really, really interesting discussion. But so far, at that point in time, and I haven't checked in the last couple of months, no states had adopted the Uniform Law Commission's model law. I asked her why she thought that was, and she said, I'm not really sure, because with the model law they tried to make it non-controversial, not leaning too far one way or the other. I've read through it, and it lacked some things, but I thought it was a good first attempt. I don't believe any states have adopted it yet.
She also talked a little bit about how all this started outside of California. California did its own thing with the CCPA. But all of the other states, and the state senators and representatives I've had on other podcasts all acknowledged this, basically looked at Washington State for some reason. Washington State, besides California, was one of the first states to try to construct a data privacy law, and it had all kinds of stuff in it; maybe we'll get into that a little later. The Uniform Law Commission's model didn't have some of that stuff in it. I think the process was that state senators and representatives heard about the Washington bill and said, well, let's look at that. That looks fine. Let's take it and then start changing some things, changing definitions, and so on.
And as far as I can tell, all of the new state laws except California's started by looking at the Washington State bill, which, interestingly, has so far not actually been passed into law. But yeah, like you said, there was that model law out there that the states had access to, and for some reason they did not adopt it. Have you ever run into the Uniform Law Commission?
Jesse Wilkins: I have not. I'm aware of them, but I have not.
Bill Tolson: Really interesting. And by the way, we also had on one of the lead lawyers from the U.S. Chamber of Commerce who works on the Hill with the legislators and helped on the ADPPA. They have very specific things they're trying to get into laws, and other very specific things they're trying to make sure don't get into the laws. There's a lot of back and forth, and I'm glad I'm not a politician, based on some of the side stories I heard from these people, but really, really interesting. And I like following this stuff. I mentioned in my opening that information managers are going to see their duties potentially expand beyond records management, for example.
And one of the reasons for that, and I know I've mentioned it to you before, is this idea of an inflection point based on the data privacy laws. Because of the rights the new laws give data subjects, the right to query a company and say, what kind of data do you have on me? How is it being used? Have you sold it to anybody? And oh, by the way, I want you to delete it. This basically implies, and they don't say this in the laws, but they imply it: if a data subject has the ability to ask what data you hold on them, and a company has to respond, that's not a best-guess requirement. That's an absolute requirement. So if a company says, well, gee, Bill Tolson just asked us what kind of data we have on him, what do we do? Well, check Salesforce and check this and check that, and if there's nothing there, just tell him we don't have anything.
And based on the lawyers I've talked to, that's not good enough, and the same is true in the EU under GDPR. That right extends to potentially every repository in the organization, meaning your and my laptops, computers, removable media, personal cloud accounts, all of that. For the last year or so, I've referred to this as an inflection point, in that organizations are going to have to come to the realization that besides just managing records, they're going to have to manage all data in the company. Because I could have on my laptop an email or a work document, an Excel spreadsheet, that has 2,000 data subjects listed with all of their PII: their email addresses, their phone numbers, their home addresses, all kinds of neat stuff.
If IT or legal, responding to a data subject access request, doesn't have central access to my laptop, then how do they know what I do or don't have? And again, these laws don't say make your best guess; you need to go out and report on this information. So either you need to keep information from being disseminated across the company the way it usually is in our corporate cultures, or all of that data is going to need to be synced centrally so that IT can search it. What do you think of my hypothesis there?
Jesse Wilkins: No, I think you're exactly right. And I think realistically the only way you're going to be able to do it is the second way. It's got to be automated. There's got to be something basically sitting on the network, inventorying the organization's systems and its assets in real time, or as close to real time as possible. Responding to a data subject access request is not an immediate thing, but there is a timeline, and as we've discussed, the timelines differ depending on whether it's California, Colorado, or wherever. It doesn't have to be instant, but as you said, you can't go fumbling around saying, well, gee, I wonder what other systems we're not thinking of. Because, and I mentioned this a little earlier, this is where it comes full circle: in California, these rights extend to employees.
Employees know those systems. They know about their HR system, they know about the lengthy lists of spreadsheets that are out in Box or whatever repository they're using, and they're going to go after them. That's the thing I hear from privacy professionals: they're not really worried about consumers as much as they're worried about employees and former employees. In some cases it's not even employees, it's job applicants, it's contractors. I don't know that we're quite at the supplier point yet, but at some point somebody's going to say, yeah, I'm a third party and I have a business relationship with you, which means now I have those same rights for my people who have interacted with your people. So again, I think we're going to have to have technologies that are, I don't want to say analogous to file analysis tools, though I think there are some similar characteristics there.
But you're going to have to have something sitting on the network that has a positive connection to everything the organization touches. How's that going to work for people working from home on their laptops? I don't know. How's that going to work for me as a private contractor who has consulting clients and I'm using my own laptop? I don't know how that's going to work either. But I think we're going to have to go through a little bit of wandering in the desert for a bit to get to some case law that helps spell some of this stuff out. The regulations will help, but I think it's going to take some case law and somebody's going to have to very visibly fall on their face for many organizations to really wake up and take notice.
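At its core, the automated inventory Jesse describes reduces to a registry of connectors, one per system the organization touches, that can each be queried for a single subject's identifier. A toy sketch follows; the system names, data, and function names are entirely made up, and real connectors would call a CRM, an HR database, file-share indexes, and so on.

```python
# Toy in-memory "systems" standing in for real data sources.
CRM = {"bill@example.com": [{"system": "crm", "record": "contact profile"}]}
HR = {}  # this subject never appeared in HR

def search_crm(subject_id):
    return CRM.get(subject_id, [])

def search_hr(subject_id):
    return HR.get(subject_id, [])

# The registry is the crucial piece: a DSAR response is only as complete
# as the list of systems somebody remembered to register here.
CONNECTORS = [search_crm, search_hr]

def fulfill_dsar(subject_id: str) -> list:
    """Aggregate every record held on one data subject across all systems."""
    hits = []
    for connector in CONNECTORS:
        hits.extend(connector(subject_id))
    return hits

print(fulfill_dsar("bill@example.com"))  # one hit, from the CRM
```

The design choice worth noticing is that completeness lives in `CONNECTORS`, not in any one query: the laptop, the data lake, or the point-of-sale system that never gets registered is exactly the gap both speakers are worried about.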
Bill Tolson: Good points. When I was consulting, I spent a lot of time at a bank, and also at an alcohol distribution company. At the bank, anything done on an employee's computer was automatically synced to a central repository that IT controlled. In certain cases, depending on the employee, the data was not even resident on the computer. But for those with laptops who needed to travel, there was syncing software, so I could do what I needed on my laptop, and the next time I connected to the enterprise, all that data would be synced. Everything I had locally would have a mirror on the centralized file share or wherever it happened to be.
That seemed to work. In fact, I published a blog this week about how nowadays you can do automatic syncing with OneDrive, obviously, everybody knows that, but you can do it with SharePoint as well, using SharePoint as the central file share, and that automatically syncs everything. Either you need to force everybody to save everything into one folder that is basically a link or pointer to the central file share, or you need that syncing software. Either way, I think the downside, the potential liability of these privacy laws, is going to force companies to eventually acknowledge that all of that other data we've never tried to manage before, we're going to have to start managing.
And I think you're going to see, number one, storage costs, even in the cloud, go way up, but also the requirements on records managers and information managers. It's going to be a culture shock. I wrote about this four or five years ago: the culture in the United States is, I work for a company, but all of that data I store locally on my laptop is mine. And we all know in the United States that's not true. You might think it's yours, but the company has the right to it and the right to manage it. I think we're going to have to move further down the line of managing all the data, just to lower the company's liability for not being able to respond to these DSARs.
Jesse Wilkins: I think there's maybe another angle to this too, and that is that in privacy we've been talking about data minimization for a while now, and certainly as a records person that means keep it as long as you need to and then get rid of it. And we know that organizations that do that have a lot of benefits compared to organizations that either keep everything forever or just handle it on an ad hoc basis. But I think a lot of this privacy stuff, and again, once a couple of organizations really get hammered hard, is going to spark some more interest in data minimization. And I think you're going to have another huge culture clash in many organizations between the sales and marketing team, who want all the data forever so they can do trend analysis, figure out campaigns, do whatever.
Versus the rest of the organization, especially the legal and compliance folks, who are saying, we don't want to keep this one second longer than we have to, because if you don't have it, it can't be DSARd; if you don't have it, it can't be breached; and you don't have to pay to store it. I think you're going to see that tension spark pretty strongly over the next couple of years. I don't know which side is going to win. I suspect that as the laws proliferate, sales and marketing are going to have to figure out a new way to do this, and I think you're going to see some tools come out that make it easier to anonymize or pseudonymize. But ultimately, I think where we end up is that California, and then somebody else, is going to say: if you collect things for a marketing purpose, you're going to have a mandatory maximum retention of X.
I don't know what X is, and I think you're still going to see some playing around with that, because of the example I gave you: if you buy a new battery with a six-year warranty, as a company I can't be getting rid of your information a year or two down the road. There's a lot of nuance to this that has yet to play out.
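Jesse's battery example is essentially a `max()` over competing obligations: a marketing retention cap versus an outstanding warranty. A minimal sketch, where the two-year marketing cap is a purely hypothetical value of the "X" he mentions:

```python
from datetime import date, timedelta

def retention_end(purchased: date, marketing_cap_days: int, warranty_days: int = 0) -> date:
    """Keep the record until the LONGER of the (hypothetical) marketing
    retention cap or an outstanding warranty obligation."""
    return purchased + timedelta(days=max(marketing_cap_days, warranty_days))

purchase = date(2016, 5, 1)
# A six-year battery warranty outlives a hypothetical two-year marketing cap:
print(retention_end(purchase, 730, 6 * 365) > retention_end(purchase, 730))  # True
```

The sketch only shows the shape of the conflict; in practice each obligation (warranty, tax, litigation hold, statutory cap) would feed into that comparison, which is why a single flat retention number rarely survives contact with real records.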
Bill Tolson: Good points. And one side benefit of what I mentioned, managing all of the data and syncing it centrally, is that at least eDiscovery will be more straightforward. I was in eDiscovery for a long time, so you won't have to be going to each individual laptop, taking an image of it, bringing it back, and searching it; you'll just be able to run a search against the central repository. So that's one good thing. But I think, and I've done several podcasts on the subject as well, data minimization is going to be a huge driver, and corporate legal is going to play a big part in driving it: do you really need this data? You get inquiries now from something you looked at eight years ago. How is that useful to the company doing the marketing? That's ridiculous.
Targeted, focused marketing only makes sense based on data from the last six months or so, because human beings change. Why are you trying to sell me something I might have looked at when I was 20, versus now? It doesn't make sense. So like you say, I think sales and marketing departments especially are going to have to get a lot better at data hygiene, getting rid of stuff that probably has no value to them anymore, much less meeting the various retention requirements in the data privacy laws.
Jesse Wilkins: I think that's exactly right. It's going to be a sea change, and I think we're already starting to see that in some of the stuff that we're seeing primarily out of the European Union with respect to cookies and tracker pixels and stuff like that. But I don't envy the folks who are in sales and marketing roles these days, because I think they're going to have a lot of challenges over the next two, three, four years.
Bill Tolson: Well, I'll briefly recount a story here that really surprised me. I saw on the news, this was a year or so ago, that a large, well-known telecommunications company had been breached: something like 140 to 150 million records of personal information. And I thought, well, gee, that sucks for those people. Then I got a notice from that company saying, we were breached and your data was potentially breached as well, and good luck. Nothing else. It was just, good luck. And I thought to myself, I haven't done business with this specific company in 22 years. Why was my data still there? How could it have been useful to this telecommunications company? But they kept it, and they raised my risk and their liability by being dumb enough to keep it, because it was of no business use to them whatsoever.
But I think that's the kind of thing you've referenced, and I think it's a big deal that companies need to understand the value of the data and get rid of the valueless data as soon as possible, because even valueless data carries risk. But you mentioned, and I've mentioned, DSARs as well, data subject access requests, and they're basically the forms that you can find on the websites of different companies to query a company and say, what kind of data do you have on me? And one of the things that I've been looking at and considering is the whole idea of DSAR weaponization. Could the whole right of an individual like me to fill out a DSAR and put a company through some rigamarole of trying to figure out whether they have any data on me, could that be used in a malicious sense, or even in a cyber attack?
Because, and for people who haven't run across DSARs yet, they're not as common in the United States, but in Europe, since the GDPR came into being, they have been an issue for companies. I ran across some data attributed to both Gartner and IDC, and these were 2021 numbers, but basically they were saying that the average cost for a company to respond to a DSAR was $1,400 and 88 man-hours for each one. So you can see the $1,400 is probably more than that now. But in Europe, the average number of DSARs was 147 per month, meaning that the average company in Europe was spending about $200,000 a month trying to respond to DSARs.
And I thought, well, gee, what if somebody wanted to start playing games? They could just start hitting companies with DSARs, because it's not that easy to determine if a request is valid or just trying to cause you a problem, and you still have to put time in to determine, do I need to respond to this? So you're still consuming hours. But what if ransomware or extortionware operators went in and stole, and they do this, they go in and copy all your sensitive data before they encrypt everything, what if they used that data to start hitting you in an automated fashion with DSARs? So all of a sudden, instead of getting five per month, you start getting 1,500 per month, automated, but the company still has to respond to them to decide, are these real or not?
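For context, the cost figures cited here work out roughly as follows. This is a back-of-the-envelope sketch; the per-request cost and request volumes are the Gartner/IDC averages quoted above, not exact measurements:

```python
# Rough DSAR response-cost math using the Gartner/IDC figures cited above.
COST_PER_DSAR = 1_400   # average cost in USD to respond to one DSAR
HOURS_PER_DSAR = 88     # average staff hours per request

def monthly_dsar_cost(requests_per_month: int) -> int:
    """Estimated monthly spend on DSAR responses, in USD."""
    return requests_per_month * COST_PER_DSAR

# Typical European volume cited above: ~147 requests per month.
print(monthly_dsar_cost(147))    # 205800 -- roughly the $200K/month figure

# A weaponized, automated flood of 1,500 requests per month:
print(monthly_dsar_cost(1_500))  # 2100000 -- over $2M/month
```

The scaling is linear, which is exactly why an automated flood of requests is so damaging: every request consumes triage hours even if it is ultimately rejected as invalid.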
And I don't believe there are any real software applications out there that can determine if they're real or not. So this is just another thing. But like you said, with employees, disgruntled employees can start using the whole DSAR workflow to cost the company money. And like you said in one of our other discussions, the ex-employees know where the bodies are buried, so they'll know if you were actually able to respond or not. So these are the kinds of things that I think companies need to be made aware of, but that also need to be worked out, both technologically and in the law.
Jesse Wilkins: As you were talking, I wrote down three use cases. You talked about the disgruntled employee, and I wrote down one called DDoS, distributed denial-of-service, which you came at from a different angle than I was thinking of. So I'll share my example too. I don't think it would be that hard to write a script that just started alphabet attacking, only with DSAR filings instead. We already have that with alphabet attacks in email. It wouldn't be any more difficult, except you'd have to point it at a URL rather than an email address. But you're right. And for the organizations that are doing this today, you clearly can't give somebody a copy of their personal data until you've confirmed that it's actually them, or you're basically creating a breach every time you respond.
I think on one hand, the law is not keeping up with this and is not going to be able to keep up with this meaningfully. Law can't keep up with technology, I don't think. The third example that I had is actually from my time working as an employee of and as a consultant to government. And just about every jurisdiction, in the same vein as privacy, has freedom of information or open records or a Sunshine Act or whatever they choose to call it. It's again this idea that you have a right to know what the government knows about you and what the government is doing, and there are generally legal penalties if they don't respond within a certain period of time. Now, what I think is interesting and slightly different with many of the regulatory regimes that are out there now is that most of them don't yet have a, I know you're going to say the term because now I'm stumbling over it. Personal right of action, I think, right?
Bill Tolson: Private right of action. Yes.
Jesse Wilkins: That's it. Yeah. So if a law gets passed that has a private right of action, now not only do the data breach process and the DSAR response process in the aggregate have teeth and have penalties, but I can be that gadfly and I can call every single one of my internet, cable, and utility companies, and I can just start DSARing them to death, and I can get my friends to do that. And again, whether you do it in an automated way or not, I'll bet that the college I graduated from 21 years ago probably still has more information on me than they need, and I want to know what they have, and I'm going to sue them for their failure to respond appropriately to my DSAR requests.
It's not a lot of people doing that, but I know people who have made comfortable mid six figure incomes annually by doing that in the context of Open Records Act requests with government entities. And now you're talking about everybody with whom you do business. Again, I think these next couple of years are going to be very, very interesting as we work through the ramifications.
Bill Tolson: Exactly. And like you were mentioning, some of the laws, some of the privacy laws, do include a private right of action, which means the individual data subject can sue under a bunch of circumstances. Some of the state and country laws do not have that, so it's usually the state attorney general who has to decide, are we going to fine this company and do various other things? And oh, by the way, the data subject never gets that money if it's the AG; it goes into the state treasury. So those states with a private right of action mean those individuals can go after it as well. But even so, if the DSAR requests are legal and they're from real people, what if over the next two years the American public gets much more educated on this stuff, and all of a sudden instead of five states, you have 30 states with privacy laws and the same rights, or rights that are generally around the same stuff?
That workload grows because of all those educated people now saying, well, I want to see what company X, Y, Z has on me. I've never heard of them, but I want to see if they've collected public information on me and are using it, because that comes into it too: was the data publicly gathered or was it gathered with consent? In some states it doesn't matter. But even so, with that education, the number of DSARs for US-based companies could go from three per month up to 100 or 200 per month, and it's real. It's not a malicious attack, but that means they could be spending hundreds of thousands of dollars per month just trying to cover that requirement for DSAR response. And are companies planning for that? Are they putting the automation software in? Are they putting the customer service reps in? Are they consolidating their data within the company as much as they can so they can speed up the response?
Those are all really interesting things. And I know, Jesse, the time is flying by very quickly, so I wanted to get to some other points as well. Not that this wasn't interesting; this is really interesting to me. But we've talked a lot about the new data privacy laws, and they all have some form of data security requirement in them. In fact, almost all of them use exactly the same terminology within the data privacy law on how PII needs to be protected. Many of them, almost all of them, use the phrase "must use reasonable security practices to ensure PII is adequately protected." And my first response when I first read that in one of the laws, I don't know, a year or two ago, was, what's reasonable? Reasonable to one guy might mean I'll put it in a shoebox under my bed, and to the next it's an NSA-level data center.
And I've asked the various state senators I've had on the podcast, what's reasonable security? And they're politicians; they don't necessarily know. But I've brought up the whole idea that, again, reasonable security is a gigantically wide definition, and wouldn't it make more sense to be just slightly more prescriptive about what kind of base security requirements you should put in there? Like, all PII must be encrypted while in transit and at rest. Encryption has been around for decades; modern encryption has been around since the 1970s, actually. And none of them have had a very good answer. One of the politicians basically said, well, we don't want to lock a particular vendor into the law. And I said, I'm sorry, senator, but encryption is not owned by a single company. It's out there. Everybody uses it.
So why wouldn't you say it must be encrypted? Or, nowadays, with the federal government and President Biden's executive order on cybersecurity, they're now saying all applications must be built on zero trust designs. I'm not advocating going that far, but maybe multifactor authentication, encryption, and role-based access controls at the lowest level, to say, we want you to apply at least these kinds of data security capabilities to PII, to maybe close it up a little bit. What do you think about that?
Jesse Wilkins: Well, I think encryption is notably absent. As you say, it's been around for decades. Everybody uses it every single day, on the internet and on their smartphones. I guess maybe the concern is, and again, we're talking about politicians who often either don't understand the technology or, perhaps worse, misunderstand the technology, understand it wrongly. And so if you say encryption, well, that may lead to questions. Okay, are we talking this kind of encryption, that kind of encryption? Does it have to be quantum encryption? Does it have to be a certain bit length? I think there's a little bit of a risk of it getting too prescriptive, written by people who have no idea what they're talking about. On the other hand, I actually have an article up that I'm going to finish reading after this call, and it talks about this and some of the things that you could implement from a reasonable security perspective.
Encryption being at the top of the list, of course. Multifactor authentication can be, in a lot of circumstances, though probably not all of them. But then there's this whole laundry list of other things that only apply in certain contexts. And again, I worry that politicians who don't understand this are going to try to mandate things that either are irrelevant or actually make things worse. The first time that somebody has a system that they've encrypted and they somehow lose the ability to access it because something happened to the software, they're going to be livid and they're going to have no recourse. I think what's interesting about this idea of reasonable security is, what is reasonable? I also think there's a concern that different states, different jurisdictions, are going to have different ideas as to what is reasonable.
And so I think we run into this concern from an organizational perspective. California says I have to encrypt my stuff at rest and in motion, in transit. Texas says it has to be encrypted at rest, but not in motion. And Illinois says in motion, but not at rest. So how do you have a system that can meet all of those requirements across the 40 states in which you do business? I don't know the answer to that one. I'm sure that some smart people are probably working on it, both from a policy and process perspective and also from a software capabilities perspective. But encryption has to be at the top of the list. For some of these other things around good cybersecurity and cyber hygiene practices, I think it's more a case of educating and making sure that people are aware that reasonable security can include the following 20 things, rather than saying, and thou shalt do all of them.
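One practical answer to the multi-state question raised here is to implement the union of every jurisdiction's requirements, i.e., meet the strictest combined set everywhere. The sketch below illustrates the idea; the state-to-requirement mappings are hypothetical examples echoing the conversation, not actual statutory language:

```python
# Hypothetical per-jurisdiction security requirements, for illustration only.
# These mappings mirror the examples in the conversation; they are not
# statements of what these states' laws actually require.
STATE_REQUIREMENTS = {
    "California": {"encrypt_at_rest", "encrypt_in_transit"},
    "Texas":      {"encrypt_at_rest"},
    "Illinois":   {"encrypt_in_transit"},
}

def superset_policy(requirements: dict) -> set:
    """Controls needed to comply everywhere: the union across jurisdictions."""
    combined = set()
    for controls in requirements.values():
        combined |= controls
    return combined

print(sorted(superset_policy(STATE_REQUIREMENTS)))
# ['encrypt_at_rest', 'encrypt_in_transit']
```

The design trade-off: a single superset policy over-complies in some states, but it avoids maintaining 40 divergent configurations of the same system.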
Bill Tolson: Right. Two points, and I agree with all that. One of the issues, I don't know if it's an issue, but one of the things that I've been told is that the state governments especially, the senators and representatives, completely or almost completely rely on outside parties writing the bills for them. And in these cases, those outside parties are a select number of technology companies that are pushing a very specific set of uses, so they're not adding the additional stuff because maybe it would take more compute within the cloud or whatever it has to be. And I've talked to those companies; I've had some of them on my podcast. But that's at least one point. The other one, and I know you've heard of the Sedona Conference, correct?
Jesse Wilkins: Of course.
Bill Tolson: Sedona, and I've had some Sedona people on as well. The Sedona Conference actually published a guide, I think it was in 2021, maybe 2022, specifically on defining reasonable security in technology. And I very much respect the Sedona Conference, but in defining what reasonable security is when it comes to data security, for the Sedona Conference it came down to an algorithm, which I thought was the wrong approach, because the algorithm relies on variables that you basically pick out of the air, much less measure. I think the whole idea of reasonable security based on an algebraic formula was somewhat off target. But like I say, even the federal government, and the Feds are usually behind in all this stuff; the IRS is still using DOS in a lot of cases.
So is the Veterans Administration in certain circumstances. But even they've gotten to the point of saying, this is the base level of data security you're going to have in all agencies. And it would be nice if our federal government and Congress came out with the ADPPA and basically gave a base level of prescriptive security. Now, I've read the ADPPA, and the various versions I've seen do not; they all use reasonable security. That's an ongoing problem that I don't think is going to change, and that I'm not going to have a whole lot of effect on. But you also, and I know we're coming up against the clock here, but you also mentioned data minimization, which I've done a couple of webinars on lately and also some podcasts, and I think that's extremely important.
And some data privacy laws are now starting to include, or at least imply, data minimization. There was a case that was settled just in the last month, the New York Department of Financial Services versus a company called EyeMed. Basically, there was a breach, and a bunch of six-year-old data was taken and abused. And the New York Department of Financial Services basically came back and said, it's your fault because you did not employ data minimization. That was one of the reasons; there were various other reasons, but one of them was, you're keeping data too long. Why did you need this data? You should have gotten rid of it. Therefore, you've lost this case.
And I thought your discussion on data minimization was right on too, because like you say, I think that's going to be a major topic going forward: if data doesn't exist, it can't be breached and it can't be leaked. And on the discovery side, it can't be discovered, because it doesn't exist. So companies and their cultures, especially corporate legal as well as records management and everybody else, have to adopt a newer culture of not keeping data forever.
Jesse Wilkins: I think that's it. And again, I think that's going to be a sea change in how organizations, especially sales and marketing groups within organizations, view the world and how they do business. I don't think anything's going to change until some state or the Feds actually incorporate that, where it's not just reasonable data minimization but there's a hard date of some sort. But some state is going to do that, and that's going to open the floodgates.
Bill Tolson: I'll close with a story I've told once or twice before on the podcast; you may not have heard it. Years ago I was working for an archiving company, and we visited a company that was very big in North America, in Canada and the United States and Mexico, and they did large construction projects and stuff like that. We visited them, and the GC, the general counsel, had all of his lawyers in there, probably 18 or 20 of them. Very nice guy, older guy with a vest and gray hair, very dignified. We were talking, and I asked him, what are your current retention policies on various types of data? And he very quickly looked at me and said, well, we've decided that for Canadian data, data generated in Canada for Canadian projects, we're going to keep it for, I think he said 15 or 20 years, probably 15 years.
And he said, the data generated in the US we're going to keep for 34 years. That surprised me, because I'd done some homework before we visited, and I had a general idea of what he should have said. And I said, well, how did you come up with those numbers? And he proceeded to explain, in a very nice way, how they came up with those numbers, and went on for about five minutes or so. But in the meantime I was calculating what their storage requirements would be over 15 and 34 years. And when he got done, I looked at him and I said, you understand that you're talking about needing storage resources in the hundreds of petabytes, maybe even exabytes, after 34 years. And I said, number one, that's going to cost a fortune, and you don't have the square footage to put that much spinning disk in, and so forth.
And he looked at all of his lawyers around the room, and he looked at me, and he smiled and said, I don't care. I'm retiring in two years. So they obviously hadn't gone far enough out to understand, number one, what their real requirements were, but also what that would mean. That was, I think, a true data hoarder, basically just keeping way too much information when there was no legal reason for it. But Jesse, I think we've run out of time. I think we could probably talk for another hour or so, but we'll wrap up this edition of the Archive360 podcast. And again, with your experience and background and stories, we could have gone a lot longer.
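The storage math behind this story can be sketched as a simple steady-state retention estimate. The annual data-generation rate below is a made-up assumption for illustration; only the retention periods come from the story:

```python
# Back-of-the-envelope retention storage estimate. The annual generation
# rate is a hypothetical assumption; only the 15- and 34-year retention
# periods come from the story above.
PB = 10**15  # bytes in a petabyte (decimal)

def retained_bytes(annual_bytes: float, retention_years: int) -> float:
    """Peak storage once retention reaches steady state: every year's
    data is kept for the full retention period."""
    return annual_bytes * retention_years

annual = 10 * PB  # assume the firm generates 10 PB of project data per year
print(retained_bytes(annual, 34) / PB)  # 340.0 -- hundreds of PB under a 34-year policy
print(retained_bytes(annual, 15) / PB)  # 150.0 -- the Canadian 15-year policy
```

Even with a modest assumed growth rate, a multi-decade retention policy multiplies storage linearly, which is the point Bill was making about cost and floor space.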
I want to thank you for a really interesting discussion today on this very timely subject of data privacy laws and information management and all the stuff around it, because this is just going to keep getting more and more headlines and everywhere else. If anyone has questions on this topic or would like to talk to a subject matter expert at Archive360, please send an email mentioning this podcast to email@example.com, or basically email me directly at Bill.firstname.lastname@example.org, and we'll get back to you just as soon as possible. You can also contact Jesse at Jesse.email@example.com.
Also check back at the Archive360 resource page for new podcasts with leading industry experts like Jesse on diverse subjects, including data security, data privacy, information management and archiving, records management, eDiscovery, and compliance. We touch on all kinds of subjects. In fact, our next podcast will be with the Association of Corporate Counsel, a group of in-house corporate counsel known as the ACC, a great group, and they have a new data steward program, which we'll talk more about. But with that, I really thank everybody for downloading and tuning in, and especially you, Jesse, for a really fantastic conversation.
Jesse Wilkins: My pleasure, Bill. Thank you for having me.
Bill Tolson: Thanks.