Episode 39: Beyond Automation – Amplifying Healthcare Marketing Through AI Content Strategy
Hosted by Aaron Burnett with Special Guest David Patton
In this episode of The Digital Clinic, we explore the evolution of AI in healthcare content strategy with David Patton, Senior Director of Content Strategy at Fred Hutchinson Cancer Center.
David shares his dramatic transformation from viewing AI as an existential threat to content professionals to embracing it as a capacity multiplier that enables expert teams to accomplish more without sacrificing quality. We discuss the “dynamic storytelling evolution”: how content teams must adapt to audiences who stay in their native platforms rather than crossing back to institutional websites, and why this shift requires rethinking content creation from text-first to multi-format approaches.
The conversation reveals why expert oversight remains essential in healthcare content where accuracy is sacrosanct, how David operationalizes storytelling with world-class researchers who aren’t natural self-promoters, and why Fred Hutch takes the provocative stance that AI should “steal” their content to better serve their educational mission. We also explore David’s experience implementing HIPAA-compliant analytics, a journey that challenged assumptions about measurement and ultimately produced not less data, but different and better data with greater organizational visibility.
From building trust in an era of institutional skepticism to combating misinformation without becoming a campaign organization, David demonstrates how removing the “crutches” of easy tactics creates better strategists who focus on messaging quality and authentic storytelling. This episode delivers actionable insights for healthcare marketers navigating AI adoption, privacy compliance, and the transformation from traditional content approaches to platform-native strategies that amplify expert capacity at scale.
AI Transformed from Threat to Capacity Multiplier
Aaron Burnett: So, let’s jump right in with what’s still in the zeitgeist. You and I talked a year ago.
David Patton: Just this summer, was it? Or no, maybe it was last summer. Yeah.
Aaron Burnett: Yeah. And we talked a lot about AI.
David Patton: Yes.
Aaron Burnett: I know that your thinking has evolved. You’ve been thinking a lot about content strategy. Sure. You were already thinking about the implications of AI for content and media. How has your thinking evolved around AI since we last talked?
David Patton: I would say the big way my thinking has evolved: I had drinks with another professional content person, someone who runs a shop that does content for a lot of healthcare organizations. We were commiserating like a year ago, last spring. We were like, wow, this is the end, right? For people who are doing content in that kind of format. Yeah. Very sort of block-and-tackle content. Like, this is it. But we’ve completely turned it around now, where we were both saying how it has completely changed how we think about things and vastly expanded our capacity to do things in a way that’s super positive. We used to have to prioritize things, and so there was a whole swath of things you just never got time to do. You were never going to use a professional editor or a professional writer or a content strategist to do that thing. Now they can do it, because you can use AI to basically do it very quickly and then use your expertise to make it really good. And so now we’re doing more things for more people because we have more capacity. And there’s still so much to do. Yeah. So, it’s okay that it’s going more quickly, and maybe you’re doing it a different way. So, in that way, our point of view has changed. As a practitioner, within the discipline, I’m probably adopting a lot more quickly than my peers in the office, for sure. Yeah. Because I’m also the one who’s responsible for people’s resourcing. And so, sure, I was like, anything that’s going to make it go faster, yes, I will try doing this. So that’s the big place where I think it’s changed our habits and what we’re doing, and it’s also taken a little bit of pressure off of the things that we produce. I’m more and more thinking about it, certainly from a search standpoint, but also from a generative AI standpoint: whatever we’re going to create for our audiences, many of whom are cancer patients or potential cancer patients, they’re going to consume it in a different way than they might have in the past. They’re not going to have a one-to-one, like go read it on the website. They may consume it in a generative AI output. More likely, and we’re just hoping to be able to inform that in a way that’s useful. A lot of organizations are freaking out because they’ve invested a lot in search, because search was a way to get people directly to your website and then sell, right? Whatever you’re going to do, or you may have the transaction directly on the platform. For us, you’re always going to have to go through some process to make an appointment to get care. So, it won’t impact us in the same way as it probably will impact some of the retail channels. Yeah, because you always have to come and talk to us at some point. Yeah.
Aaron Burnett: You don’t need someone to transact. You just need a multitude of ways to create awareness and draw people to you.
David Patton: Correct.
Why Expert Oversight Remains Essential in Healthcare
Aaron Burnett: So as long as you do that well, yes, it works well. We are very similar. We look at AI as a means of leverage and scale for an expert. For us it was always human in the loop. When people talk about AI, “human in the loop” is fine, but an average human in the loop is maybe not sufficient. I think an expert human in the loop is what’s required to ensure quality, particularly in healthcare. Yes. Where the accuracy of the information is sacrosanct.
David Patton: Yes. Which, you know, is one of the things. I was reading your piece that came in my inbox today, right? About healthcare and AI. And that seems to be the place where platforms are actually pulling more and more stuff into the general AI. Yeah. Part of that is because there’s so much information out there, and much of it is created by credible organizations, right? There’s a couple of big players in the private sector, and then all of the government websites and all of the advocacy organizations. There’s so much really high-quality content out there that the AI companies can feel comfortable that whatever gets pulled in, once they narrow their field, is going to be pretty good. And again, going to the earlier point, you’re not necessarily upsetting a direct revenue stream for the folks who are creating it. Like, it’s okay. You’re still going to have to go someplace to get the goods or services that you’re researching. Yeah. So, it’s less problematic for them, since most of them are making money selling ads or whatever.
Aaron Burnett: I think that latter issue is the reason why AI Overviews, for example, are so prevalent in Google. I think our last data showed an AI Overview on 99% of the queries, but the accuracy rate is much lower than 99%. Sure. The inaccuracy is tremendous, but you are right. They’re not disintermediating a transaction or an ad click or that sort of thing.
David Patton: I mean, I think one of the problems we have now is there’s so much misinformation out there, right? That’s actually polluting the results in the generative AI. So yes, that’s a problem, and they need to do better, but it’s also just a problem for anybody who’s using search or any social media or anything doing research about sensitive health topics. There’s a lot of people out there who are putting content out for a purpose that is not necessarily a good purpose. Yeah. And so therefore, their information is bad.
Aaron Burnett: Yeah. For an incredibly credible leader in the space like Fred Hutch, how do you think about combating misinformation and positioning Fred Hutch so that you are a source of trusted information?
Why Fred Hutch Won’t Become a Campaign Organization
David Patton: Yeah, it’s a good question. And I’ve had multiple conversations with the clinicians and researchers recently, because this is a delicate time, and many of them feel like our job should be to get out there and combat the misinformation. And I don’t disagree. What I’m very clear about when we talk to them is to say: yes, we will do our best to create the content that leverages your expertise and shows what is accurate based on the best research and the best clinical knowledge. But we are not a campaign organization that’s going to go out there and push back against the bad actors who have a perspective, who are pumping out this bad information for whatever reason. And it’s a challenging conversation because, sadly, we’re not an advocacy organization. Even if we have a perspective, what we want is to bring in patients and grow, to make sure people are well-informed, and to serve the patients of the Pacific Northwest related to cancer. But we are not the people who are going to be out there duking it out against the misinformation people. And so I often have to dial folks back and say, look, the marketing communications team at Fred Hutch cannot be weaponized against misinformation, unfortunately, just because it’s not really our primary goal. But we’ll do our best. We have great tools, we have great content creators, you have great folks. We will certainly help you do that thing. To your question, it is interesting: it’s a very delicate time for the folks in our space, right? Whether you’re looking at cancer information or just infectious diseases. But I would rather us be good at what we’re doing in terms of creating the content for the patients and our core audiences versus spending a lot of time trying to fight against those other folks. Yeah. Because they’re more focused on what they’re doing.
Aaron Burnett: Yeah. And they’re more mercenary. Yeah. I know you’ve been very focused on storytelling.
David Patton: Yes.
Aaron Burnett: I heard you talk about dynamic storytelling evolution.
David Patton: Yes.
Aaron Burnett: Tell me a little bit about that.
The Dynamic Storytelling Evolution
David Patton: One of the reasons I came to Fred Hutch is because they already had this fantastic content team that was created primarily to put content on the website as a news wire around cancer and research. And that was fantastic. I came in, and I was able to see the value of what they were doing. There were some questions as to whether it was doing the thing it needed to be doing for the brand at the time, but you could very quickly look at the data and be like, yep, this is working. That was now six years ago. The world has changed. No longer can your website be the center of everything. And this even goes to the changes in social media, where the model used to be: let’s create a piece of content on our website, whether you’re a media organization or us, put a post on Twitter that has a link back, and we’ll draw people back to the website. Folks don’t cross over as much as they used to, right? They stay in a channel. They only look at Instagram. They only read Facebook. They only watch YouTube Shorts or TikTok or whatever. So, with that realization, what I had to do, and it’s really a resourcing question, is to say: our content folks can no longer be focused primarily on text. How do we get them to think dynamically? Tell the stories so that we can be on TikTok, on Instagram, draw the best kind of journalism practices from these folks, and make sure that they’re out there on these different platforms. Because we don’t believe that audiences are going to cross over. And it’s really an efficiency question, which is to say: how do we tell that story one time and put it on all those channels, without the assumption that people are going to start on Instagram or Twitter and come back to our website? And that’s okay. Yeah. No longer is that kind of conversion to the website the primary goal. It’s challenging from a measurement standpoint, in that when you were doing content creation aligned with your social media and bringing people back to the website, you could see where you were converting people. Now you really just need to see how things are performing, particularly on all the social channels, and then at the website, and you want to combine those numbers. So, it’s an inference measurement.
Aaron Burnett: Yes.
David Patton: So that’s the purpose of this. It’s a little bit of a resourcing question: taking people who have been primarily text-based. Yeah. And getting them to think about the different formats, video, audio, or whatever. And then also recognizing that the audience is going to stay in their channels and their platforms, and it’s okay that we don’t get them to cross over, because they’re still getting the impression that we want them to get. Yeah. They’re still getting the experience with the brand.
Aaron Burnett: Sure. Historically, I would assume that as you thought about content strategy, you published something and you wanted folks to engage, come back, and read it. That’s your measurement. And in that instance, you’re thinking in terms of single transactional outcomes, not a transaction, but engagement on this content. I think the implication of what you just described is that if people are going to stay in their channels, then you’re thinking more in terms of an ongoing conversation with them in that channel, not bringing them back. Is that right?
David Patton: Yes. And actually, it’s one of the things we’ve been discussing, which is: there’s a universe of people who are going to be exposed to our brand in one way, impressions or whatever, through advertising generally, or other places. Then there’s the kind of nucleus, right? The patients, the donors, the people who work there, the hires, and things like that, who are your core audience. But there’s this in-between group, right? People who have experienced your brand and have taken an action to get closer to you: given you an email, followed you on a social media channel. And one of the things I’ve gotten passionate about is, how do I grow that universe? I have some brand names for it that I can’t share here. Yeah. Because the boss hates them. It’s very much like normal marketing, because you’re thinking about the funnel. But from a content standpoint it’s interesting, because we are not getting people to pay for anything. And so how do we focus on growing that universe of people who we have a relationship with, which is not a core relationship, right? They’re not working here, they don’t donate to us, but they’re now aware of us, and they care. When I think about goals for the content, it’s that they’re going to have a conversation with us, they’re probably going to have multiple experiences with our content in some way, and then take that step to say, I’m going to subscribe. I’ll give you an email so that I get your newsletter, whatever that is. It’s very much like normal advertising; it’s just that we’re not paying to be in those places.
Aaron Burnett: How do you think about building trust in these channels? I would assume the focus on storytelling is partly about engagement. It’s also because if a real person is telling you about a thing, you are more likely to trust that person than you are some prose.
David Patton: That has been an interesting part of the journey as well. Most of the folks who are text-based writers, journalists, reporters, however you want to describe them, are very much accustomed to, yes, there’s a byline, and they’re very proud of the byline, right? But they’re very uncomfortable being visible as the active storytellers. And so part of the dynamic storytelling evolution is also turning those people into faces that can be trusted. The byline has traditionally been the thing that drives that trust, and now it’s going to be: hey, I’m an expert in content and storytelling, talking to another expert who’s an expert in research, or DNA, or immunotherapy. Together, we’re going to draw that story out, and you’re going to trust me because you see who I am, because you see me or you’re hearing me, as is the case here, right? Yeah. It’s both of those things, and that’s how I think about the trust piece, because you need to have that face, or at least a distinctive personality, much more than you ever did, right? Yep. Because institutions are not trusted. Fred Hutch is a very well-trusted institution, but it’s still an institution. And we’re all in this moment where it’s all about people and where institutions are increasingly suspect, regardless of the kind of institution. It’s a really interesting challenge.
Aaron Burnett: I’ve also heard you reference the concept of operationalizing storytelling. Fred Hutch is packed with some strong and deeply expert personalities. And those personalities, researchers in particular, I assume, are some of the most important people who you need to be involved in storytelling. How do you operationalize storytelling with these kinds of constituents?
Operationalizing Storytelling with Researchers
David Patton: Our opportunity and challenge is that most of the folks, and it’s a little bit part of the personality of the place, the folks who come here, are not necessarily self-promoters, right? You choose to come to Fred Hutch as a researcher, or to a lesser extent as a clinician or a provider, because you like that kind of culture. It’s more collaborative than other places. Part of this comes from research we’ve done about the folks who come here and a little bit about our brand, and it’s also just the experience that you have here. That means they are generally not really good at the promotion piece. So, when I talk about operationalizing storytelling, it’s: how do I draw out the stories from those experts? My folks have to be the people who are pulling the stories out of people.
Aaron Burnett: Yeah.
David Patton: Because even though in the rest of the world everybody’s holding up their phone, right, telling their story and talking about their expertise and trying to get followers and likes or whatever, our folks are very much not into that, and we have to be the ones doing the drawing out, which is good, because the folks who work for me are really good at pulling out those pieces. But that’s where the operationalizing comes in. It’s the traditional mechanisms of media and journalism, put to work inside a framework that isn’t a content creation place. Yeah. It’s ancillary to the goal, but we’re taking the best practices and applying them in that way. Operationalizing storytelling is a little bit of my own personal brand piece. Dynamic storytelling is the next evolution, which is: in the old days, you took some of the best media practices. Now, you have to take some of the best influencer and social media practices and fuse them together to do your brand building and your content creation.
Aaron Burnett: Yeah. You have had a long and storied career. You started with the Wall Street Journal. You were there for about 15 years, and you started in digital when digital was barely a thing.
David Patton: Yeah. I started at Dow Jones in ’93. Yeah. I used to read the paper and code it. Not the actual paper, but the digital paper, and code it so that you could find things. It was a great job.
Aaron Burnett: What, if anything, about the current time, with how disruptive AI has been and the fragmentation of media, is similar? Can you pattern match against the very early days of digital, when the same thing was happening and digital was disrupting traditional media?
David Patton: Yeah. Having been in the industry for so long, and then as an observer, you watched it. The media business used to be one of the most profitable, successful businesses in the world. Yeah. Newspapers used to be crazy profitable, and the internet came along, and they patted it on the head and said, nah, nothing’s going to change. They just ignored it until, obviously, it was too late, and now the industry is a shell of what it used to be. So, Google announced, or somebody did, that they were going to start putting ads around the AI Overviews. And I was like, now they are media companies. Yes. Instead of a human going out and pulling in the information and writing it down and putting ads around it, now an AI is going out and putting the information out there. It’s exactly the same business. Yeah. In some ways, it was a little depressing, because I was like, really, nothing has changed. We’re doing the same thing that we did 30, 50, a hundred years ago. It’s just that the engine behind the creation is now, I don’t know how we describe AI, is it a bot or is it a process? It’s an algorithm. An algorithm is doing the work, not a human, but we’re still just putting ads next to it. Those who are being disrupted now are much more watchful, though. They’re recognizing that this is going to have a big impact on them, more than I think the media business did for the first 10 years.
Aaron Burnett: Although I still get the sense that there are two disparate camps, those who are hypervigilant and maybe overestimating the impact, and those who are unaware and completely underestimating the implications.
David Patton: I would agree with that, because it mirrors personal experiences, too. Sure. You have some people in your life who are like, this is amazing, it’s changing my life, I’m figuring out how to get AI to do all of these things and make my life easier. And then there are some people like the person my wife was working with recently. She asked them about it, and they were like, what? What’s Copilot? And she said, that’s a thing you could have used to do the thing that you just spent a lot of time doing. So I do think that there are those two camps organizationally and industry-wide, as well as on an individual human level.
Aaron Burnett: Yeah. One of the opportunities that you have, one of the embarrassments of riches at Fred Hutch, is that there are lots and lots of research papers published. But research papers are impenetrable, and in the main nobody but other researchers probably reads them, and I would assume that you want others to know about the amazing work that’s done there. How are you going about that? And then, of course, I inevitably have an AI question related to that as well.
Making Research Papers Accessible
David Patton: When I started at Fred Hutch, the currency of good research was the paper, to which I now privately say to myself, you’ve devalued your currency. Because there are a lot of papers that come out, right? There are a lot of papers that come out. But the ones that are, however you want to describe them, big enough, important enough, worldview-changing enough, are few. We often have almost the reverse problem, in that we have to really hunt for a paper that has enough interesting things in it that we can actually build a story around it. Yeah. Because a lot of the papers are: we looked at this very small thing, and it went from this to that. It’s part of the incremental understanding of how these medicines work or how this part of the body works. But we have beats, so our content people are reporters. They stay engaged with all of the folks who are doing things, and so they’re always on the lookout. We used to be very cognizant of timing, and we’d always try to time it around a paper’s embargo. So, you can’t talk about it publicly until it’s published. Now, we don’t worry about embargoes anymore, because no one’s reading them. And so if we write about it two weeks later, it doesn’t matter. It’s more about drawing the story out to make it comprehensible to, as I say, normal humans, versus a science piece. Yeah. I get a weekly email from PubMed, which is where all the papers go, right? The papers themselves aren’t in there, but they’re at least classified so you can find them. And it’s amazing: there will generally be somewhere between 20 and 30 papers a week from Fred Hutch, and maybe two or three of them are ones that I can even understand. And maybe one of them is going to have broad reach or implications in terms of storytelling.
Aaron Burnett: Yeah. We started working with Fred Hutch in 2017, and the content team was mostly PhDs at that point, as I recall. Yeah. It sounds like you have maybe either a differently trained cohort or some very different people.
David Patton: So, I have four on-staff writers, two video folks, and then other content folks. One of my writers has a PhD. I have other folks who have scientific training. Sure. But a lot of the PhDs have either moved on or done other things. Most of the folks now are primarily content people. And even my PhD pivoted after she completed it and moved into science communications and journalism. Yeah. Do we need really strong science people? I think you need people who can at least talk to other scientists. Our job is to make it translatable and comprehensible by normal humans. And so, what you need is someone who’s going to ask the dumb questions. That’s the key to this: explain it. And this is where AI becomes super useful.
Aaron Burnett: Yeah.
David Patton: And actually, one of my colleagues was saying he had to do so much research just to understand a particular research paper. It was great, because he was able to use AI to ask all the dumb questions and get at least to a level where he could ask the researchers questions that were a little bit more intelligent.
Aaron Burnett: Yeah. Here is the AI question. It’s an ethical question. From our perspective, we use AI in all sorts of ways. I think we’ve got 80 custom GPTs and Claude projects that we use internally, but because we work in healthcare and med tech, we stop short of producing content with AI. Authorship matters, accuracy matters, all that sort of thing. In your situation, you have researchers who are not just experts; in a lot of instances they’re the one expert in the world, and so it’s without question that their highest and best use is research, focusing on that research. But you also need to tell their stories, which means you need some of their time, which may or may not be the highest and best use of it. So, my question is, this is an instance where I could see an argument for using AI to tease out variations on a theme, to create other content, a more accessible version of a research paper, still with an expert in the loop for final review. I could see an argument that that’s a good use, because it saves the researcher so much of the time that they otherwise might have to spend answering the dumb questions and figuring out how to produce the content that makes it more accessible.
David Patton: I’m using AI a lot. We’re continuing to develop new kinds of prompts that we can use, but I would never have thought to take a research paper and spit out a high-level storytelling version of it, because we’re looking at something that’s very leading edge, and I’m a little uncertain that the AI is going to be able to translate it in a way where I would feel comfortable even knowing whether it’s accurate or not.
Aaron Burnett: Oh, sure. Yeah.
David Patton: You’d certainly have the researcher do it. I would certainly use that if I were going to talk to them, so I can understand the research better. But in terms of like directly posting something out into the world using AI, oh, I wouldn’t do that. No.
Aaron Burnett: I’m not suggesting that. Part of the reason I raise it is that there was an article in the New Yorker, I think a couple of months ago, that juxtaposed highly specialized LLMs that are deployed specifically for medical use cases, radiology, diagnostics, differential diagnostics, in particular in academic settings. Because they were really focused and trained on deep information, their accuracy was very high, around that 99% level, and the AIs would often come up with a diagnosis more efficiently and more quickly than a doctor would, versus a generalized LLM, a version of whatever model ChatGPT is running right now, where the accuracy is like 60 or 70%. So I would never do it with an open LLM, but I could see that you could train an LLM on the body of research and get a really high degree of accuracy and efficiency.
David Patton: That’s definitely going to be the future, even in clinical practice. We’re starting to look at it for things like brand guidelines, right? Where you can say, only look at this and tell me whether this blurb, ad, whatever I wrote, is accurate. In the same way, you could train it on our own content. One of the things we’re looking at: we’re part of the Comprehensive Cancer Center network. Yeah. They have a very comprehensive set of information, tons and tons of PDFs on all the best practices and care that the researchers and clinicians get together and agree on. That is a goldmine for training LLMs, so that any clinician could access it and basically get good information back, drawing it out of the 57 PDFs that are in there. Exactly. I think that’s the future, and we’ve actually started exploring it, particularly because we have a lot of patient education. Yep. It’s really about how you manage your care, especially for cancer care, which often runs months and months, with a lot of things you need to do and maintain. That should be something that we train an LLM on, where instead of having 60 pages of guidelines, you just ask it questions whenever you need to, and it will tell you, at that moment, this is the thing you need to be doing. That would be fantastic.
Aaron Burnett: Yeah. We have both Claude Projects and a custom GPT that are trained on our brand guidelines, brand voice, pillars, and all that sort of thing. It was one of the first use cases that we employed. It works really well.
David Patton: We’re hopefully about to do a little bit of that rollout. And it’s been great, because I’ve been getting to mess around with it myself to understand how it works, so that I can provide this and get greater adherence, which, when you talk about it in that way, is exactly what we want in the medical profession. Greater adherence, more consistency in the experiences, and less work for them to pull out things that they may or may not remember.
Aaron Burnett: Yeah.
David Patton: That’s going to be fantastic.
Aaron Burnett: We have something called a strategy group. Every week, we will review a client strategy, company strategy, tech strategy, what have you. It’s a litmus test. It’s a winnowing process for everything that we do strategically. It ensures quality, and the goal is not that it’s trial by fire. The goal is we’re going to get the smartest, most expert people together, and they’re going to give the best advice and show you things that you haven’t seen, and we’ll make sure we’ve got very strong, compelling, and defensible strategic frameworks. And we trained a GPT and a Claude project on a particular method of developing strategies. And there’s a book behind it. So we trained it on the methodology in the book, and that sort of thing, and people can submit their strategies to the custom project or GPT and get an assessment and recommendations as to how to bolster it, where weaknesses are, that sort of thing. And that’s another use case that has worked very well.
David Patton: Yeah, my use of AI has ramped up so significantly in the last two or three months, because once you train yourself on how to use it, it changes. Traditionally, most of us who’ve been fiddling around with it used it the way we used a Google search bar. You’d ask very generic questions. But when you realize that it’s all about training yourself and training the thing, and giving it the parameters so it knows who you are and what you need. Yeah. It’s remarkably more efficient and practice-changing, I guess I would say, where you start to trust it so much more, because it’s really doing the thing that you would expect a human to do. Yeah. Which is put some parameters around it.
Aaron Burnett: So, we’re three years on from OCR guidance that required changing everything in terms of analytics and third-party tracking and that sort of thing. And you have a HIPAA-compliant data solution in place now.
David Patton: Let me just clarify. Yes. My lawyers would say no one really knows whether a “HIPAA-compliant” solution exists, but, yeah, that’s exactly right: it is a compliant tracking mechanism that we feel confident would pass muster against OCR and, yes, our additional legal ramifications or parameters.
Aaron Burnett: Sure, I should have said that. Yes. When that guidance came out, what were the fears and anxieties within Fred Hutch? And then how has it turned out?
The Journey to HIPAA-Compliant Analytics
David Patton: Not long after it came out, one of our legal folks and one of our compliance folks came to my colleague and me. She runs user experience and the infrastructure of the website, and I come at it from a content standpoint. They came to us and said, we’re very concerned about this. We don’t know what we’re doing at Fred Hutch. What should we be doing? Of course, they immediately started with: turn everything off. No measurement, no anything. And so that put us on the journey that we’re referencing now, and we’ve been with you and Wheelhouse for most of that time. I always say it was the best case study I’ve ever heard, because literally two weeks later, I was talking to one of your colleagues, who just said, here’s the problem we had. And I was like, yes, that is the exact problem we have. So, let’s go on a journey together to see if we can figure this out and get it, if not HIPAA-compliant, as our lawyers would say, at least compliant. And in a way, it’s taken a long time to get there, in part because our lawyers and outside counsel wanted a level of scrutiny that was even higher than most, even higher than the OCR piece. So it took even more time to tune it. But it set us up, and I think a lot of organizations who are doing this are finding the same thing: you realize how much habit, how many assumptions, and how much off-the-shelf Google stuff everybody was using that they just really didn’t pay attention to. It was free. It gave you some insights into how people were using your website, and you just used it, and some of the folks in your marketing communications understood it, but not a lot of people.
Aaron Burnett: Right?
David Patton: So now, not only have we implemented it and we’re very happy with it, we have all kinds of new data that we didn’t have before. And we’re on this path to have a really good understanding of how folks are using our website and, more broadly, how our digital advertising is performing. That, I think, is going to allow us to both improve the experience of our core audiences, patients and potential patients being one of them, and also just be more efficient in how we spend our energies to attract more of them. Yeah. And certainly I’ve talked about that with you and your folks, but anytime I talk to anybody else, it’s a little bit of a revelation: wow, I was doing it in a way that I didn’t really understand, and now so many more people understand it, and we’re building dashboards internally that give us visibility we’ve never had. It’s moved beyond just the web people and the content people, right? The marketing people are involved, internal comms is involved. Everybody’s looking at these dashboards just to see what’s working and not working.
Aaron Burnett: That’s been our consistent experience. People were afraid they were going to lose data and lose fidelity, and you actually end up with different and better data.
David Patton: Me personally, I understand how this works so much better. Even if I’m not the technical person, I can at least articulate it. And one of the things we did this year for our goals and KPIs was literally to explain everything that we’re doing. Yes, here’s the goal. Here’s the thing we’re tracking, and here are literally the tags that it’s all going through.
Aaron Burnett: Right?
David Patton: So, in case anybody ever asks, you can say, there’s this document that explains what it is. And even if you’re not technical, you can at least get a sense of what it is. There are so many habits and so many ways that people have been doing things that they haven’t necessarily had to question or pay attention to. And that was the benefit of this: we had to question and pay attention and get smart about this thing. And now we’re set for the moment, and hopefully set for the future.
Aaron Burnett: We have had very much the same experience. It’s interesting: our clients, almost without exception, are now on a HIPAA-compliant data solution. All had the same fear that they would lose fidelity, but they were all, and we sometimes on their behalf, using the things that felt easy, the things the platforms provide. You identify and target audiences in the way that’s available to you through a platform, and you measure through that platform. Our experience, having gone through the same process with the rest of our clients that we went through with you, is that every single client this year has had a record year. Performance is up, not a little but a lot, sometimes up 150%, with less data but the right data, and with targeting that doesn’t live in the third-party platform but is more deliberate, optimizing for the true moment of value creation that wasn’t self-evident in analytics before. When you have to develop a whole new system, you develop the right system.
David Patton: Yeah. From our point of view, we kind of don’t want to have a banner year, but we do want to have a banner year, right?
Aaron Burnett: Right. Yeah.
David Patton: If we have a banner year, it means a lot of people have cancer. But I do think we’re going to be able to much more clearly show the impact of the things that marketing and communications is doing for the organization, in a way that we’ve never been able to before. And tie it back: certainly in the short term, we’re not really set up to follow journeys all the way through to coming in for care. That’s extremely problematic. But at least directionally: is this stuff we’re doing working? Yeah. Does it map to our goals? When I moved from journalism to, first, PR and comms, I was staggered at the number of assumptions built into the whole system. PR itself is just filled with assumptions. What is the value to your business of getting your name in the New York Times? Sure. Unmeasurable. Yeah. Both not useful and very distant from your actual business impacts. And coming from financial journalism, and this is a little bit of it, plus when you’re in journalism, yes, the business side is, pooh, we don’t want to talk to those folks, but you do get a lot of vanity in there. Do people read things? You feel like you’re having an impact. So now, all these years later, getting to really, I don’t want to say tactical, but very tactical measurements of the efforts against business impact is super exciting, and I’m learning a whole bunch of things that I didn’t think I would learn this late in my career. Sure. Even if I hate the technical aspects of it.
Aaron Burnett: Yeah. It’s really gratifying to be able to see the value of content in particular, where we have clients that are more focused on achieving a transactional outcome, new customer starts, patient appointments, that sort of thing, and we can see that. It’s really gratifying to see, oh, the content strategy that we implemented now converts at 6%, whereas before we did this work it converted at 1.5%, and we understand the value of the conversion as well in a way that we didn’t before.
David Patton: It’s part of the overall trend where everything is much more measurable. And that matches my experience, and probably yours: 15 or 20 or 30 years ago, almost nothing was very measurable, right? There were a lot of assumptions built in. Then we got overwhelmed, particularly with social media, and went the other direction, where every dollar had to be accounted for with this level of precision, because you could see it in your search engine marketing and you could see it in your social marketing. That was very constraining as well. And now it feels like we’re getting to a better balance, where we understand things and feel better directionally about how they’re being measured, even if there’s not perfect precision like in search, where you pay for the click and you’re through. Yeah. That’s where we’re getting to, and that feels much more reasonable than either a bunch of assumptions or very precise performance marketing. It’s got to be a mix of the two.
Aaron Burnett: Yeah, I agree. Just a little bit more common sense. Yeah. So, you mentioned the New York Times. The New York Times has sued OpenAI.
David Patton: Yes.
Aaron Burnett: For training on their content, for stealing their content. Yeah. Yeah. They should sue. Yeah. Alright, so I was going to ask for your perspective on that, but I think you just gave it to me.
Why Healthcare Should Let AI “Steal” Content
David Patton: But the New York Times, their business is creating content that then has advertising adjacent to it, right? Yeah. Ours, in some ways, and we were just talking about this today in our editorial meeting, is none of those things. We’re creating content to hopefully educate and inform people about the amazing science that’s happening here, get them excited, and ease the burden of people who may or may not get a cancer diagnosis or who are worried about cancer. Yep. So, steal my content. I want you to get it. And I would say, that’s certainly my opinion. Sometimes we have conversations about the value, because there’s some worry that somewhere we’re losing control, or we don’t get it. But from my point of view, take it. I feel for the Times, and certainly for a lot of my former colleagues, but this is the challenge for that industry. Yeah. As we said, with Google now running ads against their generative AI, the cost is the cost of running an algorithm. There’s some cost, but it’s nothing compared to a bunch of humans going out into the world and pulling in stories and writing them up or publishing them as videos. Yeah. The cost piece is really problematic.
Aaron Burnett: Yeah. It’ll be interesting to see what happens with the antitrust remedies. I saw last week that the judge announced that her opinion will be published, I think, early this coming year with regard to whether Google has to divest of their advertising business or can simply make some structural changes. In the current climate, I don’t know that divestiture is going to be the thing.
David Patton: I don’t think so either. Yeah, even to our conversation around the OCR stuff, like we went through all this exercise, and we’re not even sure if anybody’s going to pay attention to whether we’re following guidelines or not, because there just seems to be much less administrative oversight on almost everything. Yeah. It’s really hard to argue that these are monopolies when you can just not use the thing, you can use something else. The advertising is certainly more problematic, but I’m not sure that the solution matches the problem.
Aaron Burnett: I’ve interviewed a number of attorneys focused on compliance, and I can tell you I think you’re right that there is less oversight federally, but the attorneys are still very active.
David Patton: Because you’re more concerned about individuals taking action, because there is at least a climate of accountability that maybe wasn’t there before. Yeah. And a better understanding of how the data is being used, essentially monetized: you’re losing your own data and someone else is monetizing it, and so you feel like you’re missing out.
Aaron Burnett: Also, when it comes back to the compliance issues around data privacy, attorneys are very active. And my understanding from having spoken with several of these attorneys is that there’s an awful lot that happens that is not publicized. There are demand letters, there are quick settlements that never see a court because the entity involved just doesn’t want to go through that pain, but it’s very active.
David Patton: That does not surprise me. Yeah. I think there’s good reason to be compliant and protected. Yes.
Aaron Burnett: And the flip side, right, is that we were all a bit cavalier. And once you understand what was happening, you’re like, yeah, I don’t want that to happen.
David Patton: Obviously, I’ve been in the industry, 15 years with this company, but managing digital advertising and digital marketing for much longer than that. And there are certainly aspects of, let’s say, the Meta pixel, what was being collected and shared by the Meta pixel, or what might have been impacted by a configuration, that I did not know about until all of this happened.
Aaron Burnett: I know, because it was just too much work to figure it out, and it was working so well. Why would you question it, right?
David Patton: Yeah, exactly. Yeah. Yeah. This is great. I really appreciate it.
Aaron Burnett: Oh, I appreciate it. It’s lovely. Thank you.
David Patton: Yeah.






