Episode 34: Privacy as Culture: Understanding Privacy Across Global Markets
Hosted by Aaron Burnett with Special Guest K Royal
In this episode of the Digital Clinic, we talk with K Royal, Global Chief Privacy Officer & Deputy General Counsel at Crawford & Company, about how different cultures approach data protection and why it matters for your marketing strategy. From Europe’s human rights framework to Asia’s honor-based systems and America’s commercial model, we uncover how these fundamental differences shape the way you need to think about privacy compliance.
The key insight? Privacy doesn’t come down to just checking regulatory boxes. When you prioritize transparency and genuine consent, you build trust that drives better marketing performance.
Whether you’re navigating HIPAA requirements, managing third-party tracking across your tech stack, or addressing the growing patchwork of state privacy laws, this conversation offers practical strategies for turning compliance challenges into competitive advantages, without sacrificing the performance your business needs.
Listen & Subscribe:
K Royal’s Unconventional Path to Privacy Expertise
Aaron Burnett: You already are the most whimsical person with whom I’ve spoken on the podcast, which I love. I love whimsy. I also am going to say you’re probably the most accomplished person with whom I’ve spoken as well. You have a litany of credentials, any one of which would be impressive. You have a PhD. You have a JD. You have an undergraduate degree in psychology. And another in nursing. And you just won a pageant as well. I am very curious if you’re willing to share about the arc of your life and all of these things that you’ve done. You worked in nursing, you practice law, you became a privacy expert, you’re now winning pageants. You are fascinating right out of the gate. So, anything you want to share about that would be great.
K Royal: Thank you. So, can we say ADD for career? Maybe. Although, you know, everything seems to have built towards my career in privacy. When the Powerball recently was like 1.8 billion, people were like, what would you do? Would you seriously still work in privacy? Yeah, I’d start a privacy think tank. Seriously, over my career when I worked in psychology, I created a privacy program for an inpatient mental health hospital for patients coming in. Someone, I mean, we were in Mississippi, someone would call up and say, “Hey, is K Royal on the alcoholic ward?” I can’t give out patient information. Then they’d call up, and they’d say, “Hey, can you connect me to K Royal on the alcoholic ward?” And they’d be like, “Sure, hold on.” I’m like, really? How? How is that protecting privacy? They’re like, well, they already knew it. There’s two inpatient hospitals in Mississippi. I’m pretty sure they were taking their guesses. But I created a way of admitting people with code numbers and passwords. Trained everybody and wrote the procedures. I was like 24 years old. And it became a best practice for mental health hospitals around the nation. I think they still use it now and they use it in other hospitals. So, something should have told me a long time ago that I was meant to go into privacy.
As a nurse, after that experience as a nurse, I was a registered nurse when HIPAA was passed. And you bet your bottom dollar I was pulling, okay, before the internet, I was pulling all of the literature on HIPAA and reading it and teaching the people at the hospital what to do to understand HIPAA. So again, a little slow. I should have known I was meant to go into privacy a long time before I did. Being the first-gen college educated person in my family, I just never stopped. If you’re going to learn something, you might as well have a degree or a certification or something to show for it, especially with my name. You need more letters tacked on the end.
Aaron Burnett: K, just the letter K as the first name.
K Royal: Just the letter, and I don't have a middle name. But don't tell my mom. I've never admitted to her that I dropped my middle name. You know, the stories change depending on how much caffeine and chocolate I have. But probably the closest to the truth is I was adopted by my mother's second husband, so I had a second birth certificate, and we think there was a typo in it. So, through a series of very deliberate and constructed events, I got my name down to what I like.
Aaron Burnett: That’s great.
K Royal: And I'd drop the last name if I were rich, but I'm not that rich yet.
Aaron Burnett: If you can go by just a single initial, you are icon status.
K Royal: There you go. Right, there you go.
Understanding Privacy Through a Cultural Lens
Aaron Burnett: Let’s talk about privacy, then. You told us a little bit about your history. Would you describe the history of privacy regulations then in the US and the EU? And I know people can go through the moments of legislation. I think you have a very interesting perspective on the origins of privacy. And I’ve heard you say that privacy is cultural. And so, if you could talk about privacy in the US, privacy in Europe, their origins and differences in philosophy, and where they overlap.
K Royal: Absolutely. So, as you already know with that question, they are diametrically opposed to each other. Europe comes at it from a human rights perspective. I don't know how the US comes at it because, and I know someone's going to fire me on this one, but, by the way, I should give the disclaimer: I speak for myself, not my company, blah, blah, blah. I think everybody knows that by now. Here in the US, we are very much driven by commercial enterprises, by startup companies, by tech innovators, by driving and going and doing. We look at data as: this is information that I took in; I should be able to use it without ever giving a thought as to whether there might be someone out there who would oppose the particular ways you're going to use it. But who thought about that when things were coming up?
So, it's interesting that I've traced this. In my little GDPR handbook I give a good layout of it, and in my dissertation I give a good layout of it: how we got to privacy regulation. The first privacy regulation recognized to ever have been drafted was in the state of Hesse in Germany, which is not actually a country, so it's a local law. The first country-level law, I believe, was Sweden's. That came later. But all of these were passing at the same time that the US was actually passing legislation too. It's just that our legislation wasn't considered privacy legislation. It was credit card legislation and the Privacy Act for the government and things like that. So, it came from two different directions.
And what most people find fascinating is the word privacy is not in the US Constitution. All of that comes from this penumbra of privacy that the Supreme Court finds with the privacy of your home and the privacy of your belongings and the privacy of your thoughts. And they say, along with these penumbras, that gives you the right to privacy. But companies didn’t extrapolate that to being, well, all of this personal data I’m collecting, what do I do with it? I can use it and lose it and combine it and sell it and lease it. Most people have no idea what data brokers and companies do with their data. None. They have no idea the type of science that goes into these unconnected little pieces of data to say, I have a profile on you. And what I’m going to do is I’m going to show K stories on election day that all of the election places are backed up. People are fainting from the heat and police are chasing vagrants away and there’s been a fight. And K’s not going to actually go down there and see if that’s true. K’s going to pay attention to the news stories I give her on her computer. And that’s not paranoia. It might be, but it’s true still. That’s what happens during elections or through significant historical events. We pay attention and we believe the news stories that we’re seeing on our computer, not realizing there are 50 million news stories out there being targeted to direct people.
And so, there's always the question: do you want targeted advertising, or do you want general advertising? Well, most people would say, I want targeted. I only want to be shown ads that apply to me. I've bought two houses that way. Others say, I don't want to see ads that apply to me. I want to see general ads. I want to see the billboards I drive down the highway and see. I don't want to see something that goes, "Hi, K, your Mercedes still needs service. Please stop by the nearest center." So, all of these things come from a very creative use of data that we've seen. All kinds of scandals. I mean, the Facebook scandal was one of them. But this is how they get the data.
Aaron Burnett: Well, I think in any given context, if you’re looking for the most egregious actor, it’s a safe place to start. You could start with Meta.
K Royal: Yeah. It's a really good place to start. And the second one is going to be Google. They have the brains and the money to come up with all of these things they can do with data. They're not being allowed to do it in Europe. So, if we go back over to the European perspective, Europe is stopping them from doing a lot of things that we just take as part and parcel here in the United States because of how we look at it. And people like to say, well, Europe doesn't have a right to privacy; they have a right to data protection. No. The fundamental rights that include a right to data protection actually include a right to privacy too. It's just two different things. Here, we think of it as the same thing because we don't know what we're talking about. But over there, they know what they're doing, and they're actually paying attention, and they're holding companies accountable.
Then you take it to Asia, and the Asia Pacific countries operate more on honor. So, it’s not about what’s written in the law; it’s what’s honorable as a person. And think about that. You do what’s honorable, what’s respectful to other individuals. And then you go to Latin America, and they’re like, hey, we want to be in the privacy world too, but we really don’t care a whole lot yet. I’m not saying they’re not getting there. They’re getting there. And so, when you look at all of that put together, you’re like, what is the status of the world with privacy nowadays? Everybody defines it differently. Everybody treats it differently. Everyone has a different perspective. The only one that’s really going to honor what to do with your privacy is yourself.
Charting a Course Through Complex Privacy Regulations
Aaron Burnett: So, if you are in the US and you focus on healthcare and medical device manufacturers, privacy-first industries with some of the more restrictive US laws around privacy, how is a digital marketer to make sense of the various levels of privacy regulations? Federal regulations, those that apply only to healthcare and med tech via HIPAA, FTC enforcement actions? I think we're up to 20 states with their own privacy laws now.
K Royal: At least 20. Now if they count Florida as an omnibus privacy law, which I do.
Aaron Burnett: Right. And I think another 17 where legislation is in deliberation. And so, there's quite a patchwork. Some of those state-level privacy regulations are very restrictive and conservative, and some are a little more lax. Is there a philosophical approach that you advise people, digital marketers, and clients take that keeps them safe? And, to the extent it's any different, is there a practical approach you suggest they take as well?
The Power of Transparency: Building Trust Through Consent
K Royal: Well, one of the approaches is we'd do well to take Europe's approach, because this is where Europe was under Directive 95. They had 28 member states, countries that had different laws that all met a basic baseline. And you didn't know what to do to comply with European law because you had 28 different countries to comply with. We are now entering that here in the United States with these state laws. And "omnibus privacy law" merely means that the law applies to all types of data. It's not that you have to be a customer of healthcare under HIPAA, or a customer of education under FERPA, or finance under GLBA. You're a person who has data; therefore, the state law applies to what it covers. That's the approach every country outside the US takes. The US is the only one that takes the approach of being very sector-specific.
So, with this, not only do we have the states with the omnibus privacy laws, but now we have states with AI laws. We have states with kids' codes, how do you keep kids safe online, and more of that is coming. And you still have federal laws being amended and changed, as well as modified through enforcement. They're not officially modified, but we're getting the idea of what the enforcers are looking for when they hold a company accountable.
So how do you approach all this? You do the right thing. I know that sounds very simple, but almost everything you want to do marketing-wise, you can do with consent and with notice. And if you tell people what you’re doing and you do what you say you’re doing and nothing else, and then you tell them you did it, and you show them what it is and you give them rights for it, most people are not going to opt out. You have to be someone crazy like me that opts out of every website with all the cookies and trackers. Everybody else is just click, click, click. Just get me through here. I don’t want to read this stuff. My teenage children asked me years ago, they’re no longer teenagers, we’re not going to talk about how long ago that was, why do I write privacy notices when nobody reads them? Because you have to tell people what you’re doing. How are you going to tell them without telling them? They’re not going to watch videos. They’re not going to read a little shorthand. They’re barely going to read the little icons you give them. How do you tell people what you’re doing if they’re literally not going to read it? And that’s where it comes in.
So, you have to make your best efforts to tell people: hey, we're basically going to take everything we can find on you, which, by the way, includes things you have no idea we can find. We're going to buy data from other companies. We're going to rent data from Meta because they can't sell it, but they can rent it. We're going to scrape everything else up, even though the courts say we probably shouldn't, because just because it's publicly available doesn't mean it's publicly usable. We're going to combine all that with a whole bunch of data brokers that do all kinds of wild things with your data. And then we're going to build profiles of you, and we're going to sell it, and we're going to sell you stuff that you don't need, that you don't even know you're buying, because you're using it and that's the product. How do you tell people that? People are going to be like, you're going to do what? I just want to download this game. Just let me download this game. I don't need anything else.
And so, the philosophical approach is that you might have to wind up breaking some of your revenue streams. No doubt about it. But that doesn’t mean there aren’t other revenue streams available to you. Most people actually do like being marketed to. They like it. They like to get ads for stuff. They like getting tailored ads for stuff. They like being shown stuff that they might buy or dream about or win the Powerball on. So, I really don’t see where it’s going to hurt a whole lot of marketing, although I will say, I think email marketing’s kind of going out the door, isn’t it? People don’t like emails. They like to put everything into a spam filter, which Google does for you.
Aaron Burnett: Yeah.
K Royal: I will say this goes into the Telephone Consumer Protection Act, which I absolutely, totally despise. It should never have been extended to all the things it's being applied to. If someone reaches out to do business with your company and they use their cell phone number, you should have the ability to reach out to them via text messaging. And yes, they can opt out at any time, include that, but you should have the ability to do it because they initiated contact with your company. If they didn't want to be contacted on their cell phone, they shouldn't give their cell phone. And I know a lot of privacy people would definitely say, okay, that's not the right direction to go there. Why not? Why not? I get 50,000 text messages on my phone a day and I can't read the ones that are useful. So, it's becoming almost like email now, but it's still a revenue stream. It's still digital marketing.
Discovering What Your Tracking Tools Actually Do
Aaron Burnett: We are having an interesting experience because we focus on these highly regulated industries. We have needed to put in place HIPAA-compliant data collection solutions. So, we constrain data at the moment of collection. We also filter and govern data that is stored, and we control, down to a single data attribute, what is shared with a third party. By definition, you're working in a data-constrained environment. And the fear among clients and other digital marketers, before putting this kind of solution in place, is that in the absence of all of the "easy" data, big air quotes for easy data, and the easy tracking that the platforms provide you, you'd lose fidelity, lose efficiency, and end up spending a ton more money without being able to drive performance. Our experience, though, is that when you actually work with more discipline and focus on the right data, not all of the data that the platforms provide you, we can drive much better performance.
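The attribute-level control Aaron describes, deciding per partner which data attributes may leave the first party, can be sketched as a simple per-partner allowlist. This is an illustrative sketch only, not the actual implementation discussed in the episode; the partner names, field names, and allowlist contents are hypothetical:

```python
# Hypothetical attribute-level governance before third-party sharing.
# Only fields explicitly approved for a given partner are ever forwarded.

THIRD_PARTY_ALLOWLIST = {
    "ad_platform": {"event_name", "timestamp", "campaign_id"},   # no PII/PHI
    "analytics":   {"event_name", "timestamp", "page_type"},
}

def filter_for_partner(event: dict, partner: str) -> dict:
    """Keep only the attributes explicitly approved for this partner."""
    allowed = THIRD_PARTY_ALLOWLIST.get(partner, set())  # unknown partner -> nothing
    return {k: v for k, v in event.items() if k in allowed}

event = {
    "event_name": "form_submit",
    "timestamp": "2024-05-01T12:00:00Z",
    "campaign_id": "cmp-42",
    "email": "patient@example.com",      # never leaves the first party
    "condition_page": "/treatment/xyz",  # potential health signal, withheld
}

print(filter_for_partner(event, "ad_platform"))
# {'event_name': 'form_submit', 'timestamp': '2024-05-01T12:00:00Z', 'campaign_id': 'cmp-42'}
```

The design choice here mirrors the "constrain at collection" philosophy: the default for any unlisted partner is to share nothing, so a misconfiguration fails closed rather than leaking data.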
K Royal: Yeah, I don’t doubt that.
Aaron Burnett: We felt pretty sure that was the case. We’re really glad that’s the case.
K Royal: Yes. I mean, this comes from the philosophy as we discussed that, you know, the old days, five years ago, used to be collect all the data, figure out what you’re going to do with it. Taking a targeted approach and saying, we want to do this, that is something you can tell people you’re doing. You can say, we’re collecting these pieces of data for these purposes and want to use it in these ways. That is much more transparent and much more visible to people to be able to, if someone did go read the notice, they probably wouldn’t object to it because you’re being very target specific in what you’re looking for.
I think we started seeing a lot of the digital impact really with the Meta pixel. Companies had it loaded on their websites. They didn't even know it was loaded on their websites, and it was sharing video-viewing history once someone was logged in, and that became really bad. Then there's the stretch right now of these class action lawsuits targeting old laws, like the Video Privacy Protection Act and wiretapping laws applied to chatbots, things like that. I think that's throwing a lot of unnecessary noise into the mix, because now companies are scared to offer chatbots to their legitimate customers as a customer service need, because they think some class action lawyer in California is going to come after them with a class action lawsuit. Same for the video privacy. Do you disclose that you track who watches what video? Are you using that data for something? If you're not using it for something specific, why? Why do you care whether or not someone watched a video? Now, again, I'm not a digital marketer. If you're putting digital content out there, you probably do want to know who's watching it, where they're coming from, where the lead is, where they may be geographically. And did they stop 10 seconds in, or whatever? There's probably a whole lot of stuff you can get from that. The average person is like, I don't care about data on videos. So, if you know that you're looking for those specific metrics and pieces of information, why not disclose what you're doing and continue to do it in a legitimate manner?
Aaron Burnett: I suspect that video is a target because of the analytics payload that is loaded with video. If you've embedded a video, it's often a YouTube video. And the analytics are a part of the video, and it's Google Analytics.
K Royal: Yeah. And that goes to Google. And they just had another settlement, I think from YouTube again.
Aaron Burnett: I have a hard time keeping up with all of the fines and settlements against Google right now. The multitude of antitrust actions and judgments, and then the fines that are being levied in the EU as well.
K Royal: And you're sitting there. I actually sat there and calculated, back when Amazon had the largest fine ever proposed under GDPR, which was what, about three years ago. I actually sat there and calculated, and I think it was less than what they made in an hour. Yeah, no wonder they don't care about the fines or the penalties. It's the oversight, and that's what a lot of people don't pay attention to. It's not just the fines and the penalties, but also the fact that it comes with government oversight. And a HIPAA violation in particular tends to come with 20 years of oversight. Why do you want HIPAA regulators all up in your business telling you what you can and can't do? That's no one's idea of fun, especially when it comes to marketing and innovating digital content, all of that. HIPAA doesn't understand it.
And I lost one of the first unencrypted laptops after HIPAA got the breach notification rule. I was working at Concentra, and it turns out, I've gone back and looked, I was not the first, but I think I was in the same month as the first. So, I was like the third. The regulator there, when we turned in our security policies, the policies referenced specific technology, like Windows 2000. And you should never have that in a policy. That should be in a procedure or a supporting document or whatever. You shouldn't actually have it in the policy. So, because of that, our policies got escalated to their cyber group for oversight. Now, we wound up with no findings or anything, but it took two to three years to settle. And the investigator became one of my best friends once she exited being a regulator and found out what businesses actually have to do under regulatory oversight. She's like, how does anybody function? You can't, with regulatory oversight. And the FTC has started doing this, especially with COPPA enforcement: making companies dump the data that they determine has been collected illegally or non-compliantly, whichever way you'd like to put it. You don't have the right permissions to underpin it, so all of that data store you have is now wiped off and gone, and you can't use it. Or, if you're looking at doing business in another country, Europe can stop you from doing business in Europe. The US can stop you from doing business in the US. That's way beyond the fines.
Exploring the Path Forward for Privacy Regulations
Aaron Burnett: Where do you think privacy regulations are headed?
K Royal: You know, a lot of people think it's going to go towards the person making the calls and having their permission, deliberately choosing where to share their data and what pieces of data to share. I don't believe it's going to go there. If it does, I think it's going to fail, mainly because people don't know what companies do with their data. They don't understand what bits and pieces of data come along with that data. You may say, okay, I am fine sharing my name and my address. Well, but your address brings, you know, potential socioeconomic signals and this and that. And they combine that with the type of car you drive and how many credit cards you have, and they may target you for predatory lending or something. So, I don't think that individuals are the ones that have the knowledge to control the data sharing. It's kind of like going into surgery, and you come out and the wrong leg was cut off. You have no idea what happened.
So I believe it's going to come down to corporations, whether it's self-regulatory, and I kind of hope it's self-regulatory and it works, coming up with the rules for their industry and for their needs, rules the individuals can live with, and that the regulators or whoever can live with as well. Because nobody knows what they need better than the industry that's making the call. Now, you've got to have some that want to stand up and do the right thing, and they can't all be out for the bottom dollar. I think that's what it's going to have to be, because I don't think there's enough knowledge on the side of the consumers to make it happen. Now, watchdog groups have a role to play as well, and I think that would fold into industry self-regulation. I'd like to see it go towards industry self-regulation. And if it doesn't, I'd like to see penalties for the really, really bad actors, penalties that matter.
Aaron Burnett: Our thesis is that the very restrictive privacy regulations that right now pertain principally to healthcare, but also to medical device manufacturers of a certain type, are going to come for everyone. Privacy regulations will become more and more restrictive across the digital ecosystem. And that's one of many reasons why we've chosen to focus on this space. I've also wondered, though, if we're headed for a very near future where being highly attuned to, valuing, and protecting consumer privacy becomes an important element of brand strategy.
K Royal: Oh, I think it needs to, because people think Apple is very privacy-forward. They haven’t read the stories behind the scenes. Apple markets that it’s very privacy-forward. And so, people believe it. And in some ways, it’s really good, but in some ways it’s not. So, I believe you’re right there. And you’re right. If we look at the industries that are already heavily regulated, can they become self-regulated? No. That ship has sailed. You’re right. But I think it’s the bits and pieces within that industry that still have the ability to be self-regulated, like advertising under HIPAA rules. We have some regulations in place for it. By and large, it’s pretty unregulated, especially when you look at FDA software as a medical device kind of a thing. I was working in an FDA software field back when that came live, and I loved it. So, I think the overall industry is going to be even more regulated, and I think you’re right. I think they’ll come for everybody. They’ll come for everybody that they can possibly tack onto. But I believe the subpopulations within those industries still have an opportunity to be heard and to be known and to effectuate a difference. You know, you’re taking a look at this. You’re going down to targeted bits of information. You’re focusing on this. I think those are the areas that have the opportunity maybe to lead towards some self-regulation. But you’re right, it’s in an already heavily regulated field, and that’s not going to go away.
Now every state has classified health information as sensitive data. The US never had a definition of sensitive data before. You could intuit that if data fell under a regulated law, FERPA, HIPAA, GLBA, the Privacy Act, that data was sensitive merely because we had a federal law on it. You could also extrapolate that the data covered under the state data breach notification laws was of higher sensitivity. But until the state omnibus laws, the US did not have a definition of sensitive data. And that definition varies from country to country. It is not across the board what people consider sensitive data. But health data is in every single state's definition of sensitive data.
Navigating the New Terrain of Healthcare Marketing Compliance
Aaron Burnett: I'm curious to get your perspective on the December 2022 OCR guidance that expanded the definition of PHI around individually identifiable health information, which kind of overnight made all the third-party tracking for advertising a de facto violation of HIPAA. And then it was followed by the American Hospital Association versus HHS lawsuit. I read that judgment, and the judgment said, okay, there is this proscribed combination: the combination of an IP address or some identifier and the content an individual is reviewing, which may or may not contain information on a health condition or treatment or that sort of thing. This proscribed combination was one of the biggest things the OCR guidance went after. And the judge didn't actually say that this is bad guidance. The judge said, you violated the Administrative Procedure Act. You did it wrong.
K Royal: Right.
Aaron Burnett: But very much left open the door to do it again. And so, my hypothesis has been okay, that was deliberately, narrowly tailored so that HHS could do it again. There was absolutely no reason in that judgment for anyone to relax.
K Royal: No, I agree with you. I was surprised when they passed the guidance on it. I thought that was a bit far-reaching myself. I think most people thought it was a bit far-reaching. So, I think the court came to the right conclusion when they made them back down. But you're right, they did not say, leave this area alone; you're not qualified to speak on it. I think they did leave the door open. But also, what I find interesting is that OCR does not have the expertise to speak to that technological type of information. And now we no longer even have the rule that says you can rely on the experts to speak on something. I do think that if OCR structures it right, and they are, they continuously look at updating HIPAA, and they should, because HIPAA is vastly outdated. Although it still scares people with what it has on paper. I mean, how many people out there? I consulted one time for a company that had no idea there was HIPAA. It was a doctor's office. I'm like…
Aaron Burnett: Oh my gosh.
K Royal: Have you ever heard of HIPAA? Let me educate you. But when you look at this and you look at how they classified the data, I had this argument with my security guy all the time. He'd say, if you write down a set of medical record numbers, just arbitrarily pick a random set of medical record numbers, and you throw them out the window, is that a breach? Yes. He's like, how is that a breach? You don't have a name. You don't even have the name of the entity on it. You have no one. And I'm like, it doesn't matter. HIPAA doesn't say that. HIPAA says that if you don't identify and remove these 18 identifiers, then it's a breach. Whether losing just a medical record number is a reportable breach, that's the key. So you can lose or misuse bits and pieces of what HIPAA would qualify under its de-identification standard, which, by the way, I think they now need to move away from. They need to go straight to the expert determination, the data scientist who says the data can't be re-identified, not just the absence of those 18 identifiers.
But they took what they know and applied it to these 18 identifiers. That's what they did: they took individually identifiable health information and retrofit it into what they wanted to say was bad, rather than coming out and saying, this is bad, you shouldn't track people's videos. Instead it was, we're going to rely on this authority and these actionable little pieces of data we have to say you can't do this and this, without thinking it through. And so, I do think they could come back. I would like to think they're not going to unless they actually get it right this time. But it did scare a lot of people. A lot of people straightened up. They shouldn't have had those trackers on logged-in, behind-the-paywall pages. We know what you're doing, kind of thing. Because to have a tracker on a general website... I was working in privacy when this happened, and I looked at it from a different perspective as well, because COPPA is the children's law. If you can't do it for healthcare, then you can't do it for COPPA, and you can't do it for FERPA, and you can't do it for these others. So, at what point can you do it such that it's not taking data illegally and scraping it? Maybe you need to target it to the pieces of information you want. If it was only capturing information that couldn't be related back to the person, just capturing metrics, why couldn't they use it?
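For reference, the de-identification standard K mentions, HIPAA's Safe Harbor method, enumerates 18 categories of identifiers that must all be removed. A field-level audit can be sketched roughly like this; note the category list below is abbreviated, and the field names are hypothetical, not a legal checklist:

```python
# Abbreviated subset of HIPAA Safe Harbor identifier categories
# (the full rule lists 18; this is illustrative only).
SAFE_HARBOR_IDENTIFIERS = {
    "name", "street_address", "phone", "email", "ssn",
    "medical_record_number", "account_number", "ip_address", "photo",
}

def is_safe_harbor_deidentified(record: dict) -> bool:
    """True only if none of the listed identifier fields are present."""
    return not (set(record) & SAFE_HARBOR_IDENTIFIERS)

# A bare medical record number still fails the check, which is K's point:
# under Safe Harbor it doesn't matter that no name is attached.
assert not is_safe_harbor_deidentified({"medical_record_number": "MRN-123"})
assert is_safe_harbor_deidentified({"age_bracket": "40-49", "state": "MS"})
```

This also illustrates why K prefers expert determination: a record can pass this mechanical field check and still be re-identifiable from the combination of remaining attributes, which only a statistical analysis would catch.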
Uncovering Hidden Data Collection on Your Website
Aaron Burnett: I’m going to say a majority, maybe even a vast majority of digital marketers had no idea what was on their websites or how the pixels worked.
K Royal: None. Did not even understand the mechanics of a third-party pixel.
Aaron Burnett: The easiest way to do this is to just put it on all the pages, without understanding what that pixel collects. And then, in combination with the pixel, not understanding what things from an advertiser's perspective need to be configured carefully so that you're not accidentally collecting even more. And then Meta, because they're data greedy, they're going to get everything you let them get. There also were some settings, prior to that OCR guidance, that sometimes were auto-on. There were a lot of things that I surmise were stealthily implemented. That particular setting is quite deep in the administrative settings for advertiser management, and that checkbox meant that the pixel would collect all the data entered into a form on a website.
K Royal: Got it. And that's the part that makes it a problem; you're collecting data you shouldn't be collecting. Because I know years ago, when GDPR first came out, I was working with our website people and our marketing people and saying, hey, we can't have cookies. There's a cookie law. You can't have cookies. My guy was like, oh my God, I can embed this into the URL. They won't see it on the URL, but I can embed it into the backside of the URL. That way we can still do it. I'm like, no, no, no. I think you're missing the point. We're not supposed to be tracking people.
Aaron Burnett: Right.
K Royal: Not that it’s not cookies. You’re not supposed to be tracking people.
How Platforms Connect the Dots Across Digital Experiences
Aaron Burnett: You started to reference Cambridge Analytica and some of the election stuff. Meta, I mean, even if you're not giving them enough to create a full profile and identify someone, because they have that Meta pixel on so many sites, what they're doing is knitting together your behavior on this site…
K Royal: Oh, of course they are.
Aaron Burnett: …with lots of other sites. And so now, okay, you didn’t tell them your name. There’s nothing identifiable on this healthcare website, but they already know it.
K Royal: They already know. Even your device fingerprint. I try to teach law schools that. I said, “Do you know that the way you use and have your device set up gives you a device fingerprint that can be unique to you?” And if that is tracked from website to website, then they can figure out who you are. They’re like, no one’s going to be able to pull these profiles, K, and say, I have this. And I’m like, you think a person’s pulling these and comparing these?
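The device-fingerprint mechanism K describes can be illustrated with a toy hash. All attribute names and values here are invented, and real fingerprinting uses many more signals, but the mechanism is the same: an identical device configuration hashes to the same identifier every time, so visits to unrelated sites can be linked without a name, a login, or a cookie.

```python
# Toy illustration of a device fingerprint: hashing a handful of
# browser/device attributes yields a stable pseudo-identifier.
# Attribute names and values are made up for the example.
import hashlib

def fingerprint(attrs: dict) -> str:
    # Canonicalize attributes so the same configuration always hashes identically.
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

visit_site_a = {"user_agent": "Mozilla/5.0 ...", "screen": "2560x1440",
                "timezone": "America/Phoenix", "fonts": "Arial,Calibri",
                "language": "en-US"}
visit_site_b = dict(visit_site_a)  # same device visiting a different site

# Same device configuration -> same fingerprint, so the visits link up:
print(fingerprint(visit_site_a) == fingerprint(visit_site_b))  # True
```

No person is "pulling and comparing these" by hand, which is K's point: matching hashes across billions of page views is trivial to automate.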
Aaron Burnett: Right.
K Royal: Yeah. AI’s been around a long time, people, a long time.
Aaron Burnett: To say nothing of the actual hardware identifier.
K Royal: Yes. That also is transmitted too. But technical people are like, oh, we can find a creative way of doing something. And marketing people are like, I just want this outcome. And privacy people are like, just don't do this. The rest of it falls in between all the lines, because this field has very few hard, dark lines that can't be crossed. You have to be very comfortable living in ambiguity. HIPAA has some really good, hard, dark lines you can't cross, but not everything necessarily falls under that. And so, I think of it this way: you can generally do what you want to do as long as, one, you're transparent about it, with notice, and you get people's consent, and you honor that consent, and all that good stuff. You can generally find a way to do what it is you want, because people just have to agree to let you do it. It's just that most people have no idea what the thing they're implementing actually does, like the Meta pixel. They have no idea what it actually does, so how can they be transparent about it? Have you ever studied the Google logins and the Facebook logins?
Aaron Burnett: And what transpires, what takes place mechanically? I don’t think I have.
K Royal: I have actually pulled the developer's code on these and everything on them. Because, you know, just like always, you bring in privacy. Privacy's embedded in the project. We know what you're doing. Then you go live and, wait a minute, where'd that Facebook login come from? They're like, well, you said people could share the pictures they got from our thing to Facebook. They could choose to share them. I'm like, yeah, that has nothing to do with the Facebook login. And they're like, well, yeah, it's just a login, K. And I'm like, no, it's never just a login. So I pulled all of the developer manuals and everything, which is a lot, and shared it back to the technical people: this is why you should not be doing it, because they get this information. Well, we're not going to do anything with it. I don't care. I don't want the information shared. Don't use that login. Just don't do it. And now people do it as a matter of course. You log in with Google, you log in with Facebook, log in with whatever. One pass to get you through everything, without thinking about what data is going to companies that your privacy people may never want those companies to have.
Aaron Burnett: We are in an era right now where it seems like many forms of regulation are headed the other way. Tenants' rights, consumer protections, and airline passenger rights are all being rolled back. Do you see any indication of a similar softening around privacy regulations?
K Royal: No, I really don’t. I think this is the one area of regulation that we’ve been so woefully behind on that I think they have lots of room to make up before they start rolling back on it. Because we just have countries that are just now passing laws. And like I said, even if it’s not an omnibus privacy law, they’re passing other affiliated laws. Kids code laws, identification laws, biometric laws, AI laws. So even if it’s not really a privacy law per se, or a data protection law per se, it’s something that actually deals with personal data. And I don’t see that going away anytime soon. Yeah, maybe it will though. Maybe I’m wrong. You know, nowadays people are like, ooh, okay, job security. I’m like, I have plenty of job security. I do not need more job security. Trust me. But no, I don’t think that’s anything that’s going to go. We still are waiting on too many areas for the courts to actually speak to, to give us clarification on what can be cut back. Because, like you said, everything they’re ruling on is so very narrowly tailored to that specific question. And again, they’re not actually giving guidance on whether something can or can’t be done. They’re actually just saying, this one specific thing you did is illegal, so we’re throwing the whole thing out. But they’re leaving the doors open to come back and do more. And it seems to be deliberate. And I think it’s because, you know, unfortunately, judges aren’t experts at all of the cases and the technology that they have to pass decisions on. So, they’re doing the best they can. They just know that they are woefully ill-prepared to speak to a technology of that magnitude.
Aaron Burnett: That seems to be specifically, or particularly the case around the wiretapping lawsuits in California.
K Royal: I waited on pins and needles for some of these decisions to come, and they came, and it was like, really? You gave me nothing? You told me I can’t use this guy’s wireless pager. Okay, fine. I’m not going to use his wireless pager.
Aaron Burnett: It seems problematic from the start. An old law, not at all applicable to this technical environment being applied in kind of a tortured way.
K Royal: A tortured way. I like that. Because it is, it is a tortured way the way that they have to do gymnastics around it. It is torture. I agree. And it’s torture for those of us that are hoping that it’s going to give us some clarity as to what we can expect in the future. And it’s not.
Aaron Burnett: You pay attention to privacy laws globally, and in particular in the United States and Europe. And I'm curious about another judgment that we've been watching, but I haven't seen anything written on. There was a ruling from the Administrative Court of Hanover that upheld the Lower Saxony Data Protection Commissioner's conclusion. It applied to Google Tag Manager, and the ruling was that you must have consent before you even load Google Tag Manager, which is problematic technically because a lot of people fire consent through Google Tag Manager. So if you can't do that, your only option is to hardcode consent management again. It's not good. It feels like a ruling that has broad implications and a lot of pain for a lot of website owners and digital marketers, and that is based on a lack of understanding. I wonder if you're familiar with it, and whether we should read it and think, well, that's going to broadly apply across the EU.
K Royal: I don't know, frankly, what the appeals process is in Europe. I know the appeals process in the US well; I don't know the one in Europe as much. This is one that we actually talked about on the podcast before, because you're right; typically, consent runs through Google Tag Manager. A lot of companies use GTM as their consent mechanism for what to load afterwards under GDPR. We were expecting a lot of them to go to zero load, where nothing at all loads until you consent. Well, that consent was quite often collected through Google Tag Manager.
Aaron Burnett: Through GTM. Yeah.
K Royal: Right. So now you have to say, woo, do you consent for us to load the thing that collects your consent, and then we'll load everything you consented to? It seems very difficult to operationalize. I don't know how sustainable or how enforceable that's going to be. Because, I mean, it really is just in Germany right now. And the IAB decision came down at pretty much the same time, if I'm looking at the dates right. That's also one where my co-host, Paul Breitbarth, despises the IAB consent mechanism. He hates it. He was ready years ago for it to go away. That came up between us. I don't think it can be operationalized. I don't know how it can be, even under zero-load requirements. How is that enforceable? So, I expect this one to be appealed up. Now, as you know, here in the US, if you appeal from a court of appeals to the Supreme Court and the Supreme Court doesn't take it, then that's your final decision. You have nowhere else to go; the ruling stands. I don't know if that's the same in Europe. That would be an interesting one to follow up on, because consent stinks.
Strategies for Building Privacy-First Marketing Systems
Aaron Burnett: Switching gears just a little bit, I saw that you had given a talk recently, the title of which was Get Your Data in Shape.
K Royal: Yes.
Aaron Burnett: And I am curious from your perspective, what it means to get your data in shape, because we talk about something similar with our clients.
K Royal: Oh, nice. Mine, if I'm remembering the exact talk, was more about getting your data handling controls in place: knowing what data you have, where it comes from, what rules apply to it, and what you do with it. What you said earlier about what you want to use data for, and whether you have the right data, collected in the right way and used in the right way. But it was more about putting operational controls around the data and making sure that you get what you need. Because a lot of times, when you start cutting back on data collection processes, you wind up losing the data you need. And if you have a legitimate need for that data, or you have consent for the data, you should have the data. You just have to put the right processes in place. And here in the US, we're really, really bad at the legitimate interest test under GDPR. That's probably the biggest one, and companies tend to violate it a lot. The test is that your need as a company for processing this data must not outweigh the person's fundamental right to data protection or privacy. Companies chase the almighty dollar and assume their interest couldn't possibly hurt anyone. They don't step back and look at that specific person. Would it hurt that person? Would it hurt an individual? Would an individual in the European Union, who has quite a different perspective on data than we do here in the US or Canada or Latin America or Asia, be harmed at all by what we want to do? You have to take it from that individual's perspective, and we tend to be very, very bad at that. To be fair, European companies are very, very bad at that too. A lot of European companies have been held accountable over legitimate interest.
Aaron Burnett: We just have our own flavor of bad at it.
K Royal: We do have our own flavor of bad. But, I mean, look at it. We are also the most innovative country in the world. Arizona State University, where my law degree is from, has been named the number one innovative university something like five years running. We have the moxie to innovate and to do really, really cool things. It's just that we tend to run faster than our legal sides can keep up with. And I do talk to a lot of startup companies, saying, you've got to get this right, because you are devaluing yourselves by 50 to 75% of your value if your data was collected illegally, or non-compliantly, whichever way you'd like to put it, and you don't have the right permissions underpinning it. So, get your data in shape, get your controls in shape, and do it right. Do it right the first time. But I know it costs a lot to put all of this in place, and startup companies just don't have that type of money and funding and resources to spend on putting all the legal parameters in place. So, identify what's most important and focus on those.
Aaron Burnett: It’s a matter of when do you want to pay and for what. Do you want to pay at the beginning to get it right, or do you want to pay when you get in trouble, and you have to do the work anyway, and you have reputational damage?
K Royal: Well, the good thing is the European regulators take a very practical approach. If they think you’re trying to do the right thing, they’ll give you credit for that.
Why Data Discipline Unlocks Better Performance
Aaron Burnett: When we talk about getting your data in shape, we really talk with our clients or prospective clients about taking control of your own destiny.
K Royal: Yeah, I like that.
Aaron Burnett: Moving away from an environment in which you’re relying on third-party data and third-party platforms, and you are sharing with them in a way that’s risky, and you’re also using their data for audience targeting and for performance analysis and analytics. And instead, we kind of try to drive home that the act of collection in a consented environment is not problematic. It’s the act of sharing that’s problematic. And so, you need to have your data in order so that you actually have a data warehouse with all of your performance data in it. And it’s in an environment that is covered by privacy regulations, either it’s your own infrastructure or under BAA, and that you’re building a first-party data strategy as well.
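As a rough sketch of the consent-gated first-party collection described above (the class and field names are hypothetical, not any vendor's API): events land only in your own store, and only after the visitor has opted in; unconsented events are dropped rather than forwarded to a third party.

```python
# Minimal sketch of consent-gated first-party event capture.
# Names are illustrative; a real setup would write to your own warehouse
# (covered by your infrastructure or a BAA), never to a third-party pixel.
from datetime import datetime, timezone

class FirstPartyCollector:
    def __init__(self):
        self.events = []        # stand-in for your own warehouse table
        self.consented = set()  # visitor ids that have opted in

    def record_consent(self, visitor_id: str) -> None:
        self.consented.add(visitor_id)

    def track(self, visitor_id: str, event: str, props: dict) -> bool:
        if visitor_id not in self.consented:
            return False        # no consent -> drop the event entirely
        self.events.append({
            "visitor": visitor_id,
            "event": event,
            "props": props,
            "ts": datetime.now(timezone.utc).isoformat(),
        })
        return True

collector = FirstPartyCollector()
collector.track("v1", "page_view", {"path": "/services"})  # dropped, no consent yet
collector.record_consent("v1")
collector.track("v1", "page_view", {"path": "/services"})  # stored
print(len(collector.events))  # 1
```

The design choice this mirrors: collection in a consented environment is fine; it is the sharing path that is removed entirely.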
K Royal: Absolutely. I hate the term zero-party data.
Aaron Burnett: Yeah.
K Royal: That doesn’t resonate with me at all.
Aaron Burnett: To me, it’s always been kind of needlessly confusing.
K Royal: But I like what you're saying about how it's the sharing that often gets companies in trouble. It really is. Using the data that you've collected yourself, for your purposes, under the parameters that you have, and de-identifying it: we know that anonymization is a very, very high standard, but here in the US we tend to rely on de-identification. What most people don't realize is that if you're going to de-identify data in order to use it, there is a piece of the law that says you have to publicly commit to not using resources to try to re-identify the data. And if you don't have that commitment, it doesn't meet the definition of de-identification.
Aaron Burnett: Well, once more, job security for you.
K Royal: Right? Job security. More job security. I love it. You’re just stacking up job security. This field is ultimately just fascinating as it goes. And I actually wanted to invite a digital marketer to our podcast. I reached out to several, so maybe this is a good connection here to say, “What does marketing in your world look like with all of these regulations? Is there still money to be had? Can you still make a living doing digital marketing?”
Aaron Burnett: I was absolutely sincere when I said our results are much better in a data-constrained environment. The quality of our results is better, and there are a couple of reasons for that. The process of becoming HIPAA-compliant and implementing privacy-first data collection and data-sharing methods is not easy. There's a rigor involved that requires looking at the data that you have, and asking what data actually is valuable and what data is actually directionally accurate. And there's something else that has been even more valuable, and that is if you're then going to integrate your systems in a compliant MarTech stack…
K Royal: Okay?
Aaron Burnett: …when you do so, you should also integrate in a way that allows identification of a moment of conversion deeper in the systems, one that usually isn't available when you're using third-party tracking. By that I mean, let's say you're in a healthcare system or a medical device manufacturer. If you're using third-party or conventional tracking, the thing you'd be driving for is lead generation, or someone setting an appointment. The reason you would do that is that, prior to the OCR guidance, it was fine to track that. It was a moment of conversion that you could implement with Google Analytics or Meta, or that sort of thing. But if you can't do that, and you have to integrate your systems more deeply, then it makes sense to look at the moment the lead became a customer and generated revenue, or when the patient actually came to their appointment and generated revenue, and the lifetime value of that patient. You can do that in a consented and compliant environment in a way that you can't with third parties. And if you optimize for that, for actual business value delivery, then because you can't rely on the big red easy-button audience targeting anymore, you have to get much better at what you do. And by definition, you end up spending more efficiently to drive an outcome that actually has business value.
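The optimization shift Aaron describes, from counting leads to weighting them by downstream revenue, can be sketched with invented numbers. Nothing here reflects real client data; it just shows why a channel with fewer raw "conversions" can be the better one to optimize toward once leads are joined to realized revenue and lifetime value.

```python
# Hedged sketch: compare channels on realized business value per lead,
# joined from your own systems, rather than on raw lead counts.
# All lead ids, channels, and revenue figures are invented.

leads = [  # (lead_id, acquisition channel)
    ("a", "search"), ("b", "search"), ("c", "social"), ("d", "social"),
]
outcomes = {  # lead_id -> realized revenue (0.0 if the lead never converted)
    "a": 0.0, "b": 4200.0, "c": 150.0, "d": 0.0,
}

def value_per_channel(leads, outcomes):
    """Average realized revenue per lead, by channel."""
    totals, counts = {}, {}
    for lead_id, channel in leads:
        totals[channel] = totals.get(channel, 0.0) + outcomes.get(lead_id, 0.0)
        counts[channel] = counts.get(channel, 0) + 1
    return {ch: totals[ch] / counts[ch] for ch in totals}

print(value_per_channel(leads, outcomes))  # {'search': 2100.0, 'social': 75.0}
```

Here "social" produces a converting lead too, but "search" delivers far more revenue per lead; that revenue-per-lead figure, not the form-fill count, becomes the signal you optimize toward in a compliant first-party stack.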
K Royal: I like this, because this is much, much better than a shotgun approach. This is looking at actual data that you can rely on. So rather than taking, I don't know what a conversion rate would be, but say a 99% conversion rate that's only 50% accurate, you take a 65% conversion rate that's 99% accurate.
Aaron Burnett: So, I can tell you that our clients, and we’re all healthcare and med tech, are all having record years.
K Royal: Wow.
Aaron Burnett: After implementing HIPAA-compliant data solutions.
K Royal: That is the story to tell, that regulations aren’t going to kill your business.
Aaron Burnett: No.
K Royal: You can still innovate in using data the right way. And I like your perspective that it depends on getting down to identifying what data really matters and being more accurate about what you’re doing.
The Value of Diverse Perspectives in Leadership
Aaron Burnett: I’ve heard you talk about women in leadership.
K Royal: Yes.
Aaron Burnett: And the importance of women in leadership. We have the same belief here. As a company, we're 50-50 male and female. We've been around 15 years, and I have been in this company during a time when it was majority male.
K Royal: Oh, wow.
Aaron Burnett: We made a decision to shift, and I have seen how the company works when we have women in leadership and how it works when we don't. So, I believe in that as well. I'm curious to know why it's important to you, and what value you have seen when women are in leadership.
K Royal: Yeah. For me it comes from, again, being a poor rural girl with no education, no family, you know, scrapping to get herself up to the top, and finding myself often in leadership positions, not just by virtue of the education or the credentials, but also through the boards of several nonprofit and volunteer organizations I've served on. I worked with the Association of Corporate Counsel to develop a global program to support women in-house counsel, because they didn't have one. That was a program I helped them launch, and their foundation took it a step further. They have the Global Women in Law and Leadership program, held annually at the United Nations building in New York. We actually just did the 10-year program, which was really cool.
So why am I drawn to it? I guess because I'm one of these people who's going to drive for the top regardless. So, if you're going to find yourself in a leadership role, you should know how to do it right. And that's not something they teach attorneys. Attorneys graduate law school, and they may be leaders by virtue of their role or their education, but they don't know how to actually be a leader. While I was at Arizona State, before I got into privacy, I ran pro bono and student life, so everything the students did outside the classroom I was in charge of. Being a leader of leaders is even more challenging. So, I had to figure out what it meant to be a leader and how to educate young professionals in this. Because think about it: most lawyers are graduating at 25 years old, and if they're then put in a leadership role, on a nonprofit board, a regular board, whatever, how do they become a leader?
And then I looked around and realized there weren’t a lot of people that looked like me. I’m not pure Caucasian. I am part Native American, Mississippi Choctaw. And I have a little something wrong with me. I don’t know if you want to call it, I don’t have a brain-mouth filter, or I’m just a little too authentic for my own self. I believe that people ought to be able to be their authentic selves at work. And believe it or not, that is something that should be modeled by leadership. So, in order to see more people like me, I had to learn what that meant and then figure out why.
So, I started my studies looking at women leaders in law, and found out that all the studies say women are not leaders in law, because they're not the managing partners of the law firms, they're not the deans of the law schools, they're not the general counsel of the big companies. What that told me right off the bat is: really? Are you sure they're not leaders? Because I'm pretty sure women define success differently, and it turns out they do. I actually launched a study through the Association of Corporate Counsel: what do women define as success? And the metrics there are not what has historically been defined as success for a leader, whether you're the managing partner, the dean of a law school, or the general counsel at a big company. What defined their success as a leader was whether they got what they wanted out of their career. And it was typically not the title, not the salary, not the office location. It was typically being paid enough money to get done what they needed, and getting out of the office in enough time to go home and spend it with their families. That's what they called success. Even if they didn't have children. A lot of people are like, oh yeah, they had to get home to the kids. No: their families, their sisters, their brothers, their spouses, whatever you want to call it. They wanted time outside the office. You know, this work-life balance, as we call it.
Then I started looking around and realizing that all of these studies on boards of directors show that if you have three or more directors who are women, you tend to do better as a business. Why? Because of the conversations and the diversity of perspectives going on behind the scenes. You don't want to just welcome those perspectives; you want to make them a default part of your process, because if you don't bring other perspectives to the table, then you are missing the perspectives that are out there. And that goes for cultural diversity, gender diversity, the area of the country you come from, the area of the world you come from. Because even within the US we have a lot of different cultures. How can you truly know that you are addressing a problem holistically if you don't have holistic views being brought to it?
Thank you so much for having me. This has been absolutely delightful.