“I think it really opened my eyes to how tightly coupled our system is and how vulnerable we are to these cascading failures.” - Natalie Sullivan
Our host Bryson Bort welcomes Dr. Natalie Sullivan, Medical Director of the Emergency Response Medical Group and an emergency medicine physician at a D.C. area hospital. Trained in EMS and disaster and operational medicine, Natalie turned her attention to the critical intersection of clinical medicine, patient safety, and cybersecurity resilience after experiencing a prolonged ransomware attack on a major hospital. Dr. Sullivan lays out the disaster preparedness cycle and the many vectors of risk for hospitals.
How does a cyberattack on one hospital lead to increased cardiac arrest mortality at the hospital three blocks away? Why is a generation of "digital native" doctors a hidden vulnerability in an analog emergency? And what happens when a hospital's reliance on these "tightly coupled" systems—like water, power, and the Medical IoT—collapses during a ransomware event?
“We are critical infrastructure, but we're deeply, deeply dependent on the surrounding critical infrastructure,” Dr. Sullivan said.
Join us for this and more on this episode of Hack the Plan[e]t.
The views and opinions expressed in this podcast represent those of the speaker, and do not necessarily represent the views and opinions of their employers.
Hack the Plant is brought to you by ICS Village and the Institute for Security and Technology.
Bryson: I’m Bryson Bort, and this is Hack the Plant, season 5, brought to you by ICS Village and the Institute for Security and Technology. Electricity. Healthcare. The food we eat. Our water supply. We take these critical infrastructure systems for granted, but they're all becoming increasingly dependent on computers to function.
In Season 5, it’s more important than ever to ensure that our essential services are resilient to disruptions. This season, we’ll bring you insights on four of our most vital lifeline sectors - electricity, healthcare, food, and water. We know that our interconnectivity makes us vulnerable to our enemies – but what can we do about it?
We walk you through the world of hackers working on the front lines of cybersecurity and public safety to protect the systems you rely upon every day. From the threat posed by Volt Typhoon to the aftershocks of the Change Healthcare data breach, it is clear: the time for action is now.
In my day job, I'm the CEO and founder of Scythe, a start-up building a next-generation threat emulation platform, and GRIMM, a cybersecurity consultancy and co-founder with Tom Van Norman of ICS Village, a non-profit advancing awareness of industrial control system security.
I'm also an adjunct Senior Advisor at the Institute for Security and Technology, a 501(c)(3) think tank dedicated to tackling technology-driven emerging security threats.
Subscribe wherever you find podcasts to get each episode when it drops.
In today's episode, I’m joined by Dr. Natalie Sullivan, an emergency medicine physician and disaster management expert. With a background in Emergency Medical Services and a fellowship in disaster and operational medicine, Natalie shifted her focus to the critical intersection of clinical medicine, patient safety, and cybersecurity resilience after living through a prolonged ransomware attack at a major hospital.
“It really opened my eyes to how tightly coupled our system is and how vulnerable we are to these cascading failures…it does make you think about, you know, what if my medical devices weren't reliable? What if my radiology wasn't reliable? What if I, you know, get a lab result back four hours late, and now I'm four hours behind the ball of a heart attack?”
We discuss the reality of the Disaster Cycle—Preparedness, Mitigation, Response, and Recovery—and why cybersecurity in a hospital is far more than a data privacy issue; it is a life-and-death systems failure.
How does a cyberattack on one hospital lead to increased cardiac arrest mortality at the hospital three blocks away? Why is a generation of "digital native" doctors a hidden vulnerability in an analog emergency? And what happens when a hospital's reliance on these "tightly coupled" systems—like water, power, and the Medical IoT—collapses during a ransomware event?
Join us for this and more on this episode of Hack the Plant.
Natalie Sullivan: My name is Natalie Sullivan. I'm an emergency medicine physician. Early in life, I knew I wanted to be a physician, was very interested in emergencies and EMS. And over the course of my career, I did a fellowship in Disaster and Operational Medicine and became very interested in Emergency Management and hospital systems. And then, incidentally, several years ago, I was working in a hospital that experienced a ransomware attack for a prolonged period of time, and that really sparked my interest, specifically in cybersecurity disasters and healthcare systems.
Bryson Bort: So for our listeners, what is emergency management? What is it about that discipline that stands out in the healthcare profession? So why did that attract you? And then what is it about it that's different? How does that work?
Natalie: Emergency Management is systems engineering. So when you go to medical school, you're learning about how to take care of patients, pathophysiology. You're focused on the individual. In emergency management, we are thinking about, how can we optimize systems to mitigate and prepare for whatever kind of threat you could have to the system. How do we use systems to create better outcomes?
Bryson: Okay. And then where does—how does that tie into disaster preparedness, right? You talked about systems engineering. This is designing a system for operation, but of course, we have to design it for resilience. So how do you plan for that kind of disaster preparedness, the back side of it: the response, the recovery…
Natalie: Sure, absolutely. So the disaster cycle is preparedness, mitigation, response and recovery, right? And so we talk about, in that cycle, what are the different inflection points where we can make a difference in outcomes for our patients. And so if we're talking about cybersecurity, we're not just talking about the technical implementation that we can put into place to better protect our systems. That's, you know, something that is in a different wheelhouse than my area. We're also talking about, how can we implement better training, better exercises, better downtime planning, with the understanding that we're not going to be able to prevent cyber attacks, right? That is going to happen. But what can we do to protect patients and to try to prevent bad outcomes, when there is some kind of bad actor?
Bryson: What kind of training is there for that today, that meets that need or supposed to be preparing these professionals for that?
Natalie: So there are all kinds. When we're talking about disaster preparedness generally, you know, we have Hospital Incident Command Systems, right? We have our emergency operation plans. Every hospital does a Hazard Vulnerability Analysis to identify the kinds of threats that are most likely and to prioritize their planning and preparedness. Once you've identified those major threats, you're going to work down to, okay, what kind of plans can we put into place in order to mitigate risk? And then how are we going to drill those? And so, that can happen at the hospital level. You have your hospital emergency preparedness team that is going to conduct drills, that is going to create your EOPs [Emergency Operations Plans], that is going to teach hospital staff about how to prevent whatever the threat may be. But then you also have systems level involvement.
So for example, in Washington, DC, we have the Regional Healthcare Coalition. And so they not only put into place protocols, they exercise the region on the different types of response, depending on what the threat is.
Bryson: What is an EOP?
Natalie: Oh, I'm sorry. Emergency Operation Plan.
Bryson: Okay. And then, what kinds of exercises do you all do to validate that training? And then, of course, stress test the whole system and processes all together?
Natalie: Yeah. So it's really interesting. And I think part of the problem here is it's different across the board, right? So what I'm doing at my hospital is not necessarily what one hospital over is doing to prepare, and that's where I think the regional piece comes in and the benefit of something like the Healthcare Coalition, because if you're all regionally practicing what your response is going to be for cyber that is going to make you a much more resilient system as a whole, right? And we've seen that when one hospital goes down, there are costs across the system.
And so, sorry, to answer your question. Within a hospital, we have our emergency operation plans, and then we will come together with representatives from different departments, representatives from not only, you know, the clinical side, but also administrative. And we'll run through a scenario. And so depending on what that is going to be, it could be a tabletop scenario. It could be an actual physical exercise. I recently pushed a hospital bed two blocks down the street to mimic an evacuation.
In the case of cyber, what you want to do is practice your downtime procedures, right? And so in the event of a real cyber attack, obviously you're going to want representatives from the clinical side and your technical side in your incident command who are going to help make larger level decisions. So talking about the practice of clinical medicine: we have to have a method for how we're going to do everything that we normally do with the Medical Internet of Things on paper. I know I have residents who have never written a paper prescription. That's just not something that came up in their training, because they had lived a life of computer access. And so you'd be surprised, when you get down to the nitty gritty, at the small things you struggle with, right? So, for example, originally in our plan, lab results were going to be faxed to the emergency department. Like, when's the last time you used a fax machine? At least, I haven't for a very long time. So we were having a lot of issues with that. And ultimately, we decided, hey, we have all these non-medical people who want to help out. They're going to be runners, so they're going to physically walk back and forth from the lab and bring us the lab results. And that was way more efficient and fast, but we never would have gotten to that point if we hadn't had the opportunity to practice it.
Bryson: And it sounds like some of those are tabletop exercises too. When you're coming together, like you have anything from just talking to the paper all the way to… Can you talk through an example of that, and some of the lessons learned? And then, it sounds like you also do things that involve more physical parts of an exercise for simulation too. So can you kind of talk through some examples of those?
Natalie: Yeah, absolutely. So in my mind, when you're thinking about fleshing out a plan and trying to address where your failure points are, tabletop is probably the first step, right? So it's a really low threshold. You can get a bunch of people in a room, and the way that it works is you have your scenario, and as the scenario progresses through various injects, you have all of your stakeholders in the room participating in their anticipated roles. And so say, for example, your scenario was a ransomware attack, right? So you're going to want to make sure that you've gathered all the key stakeholders. So that's going to be your hospital administrators, someone representing your clinicians. You're going to want the lab. You're going to want radiology. You're going to want, obviously, your IT/OT specialists. You're also going to want your building engineers in the room. You're going to want everyone who could possibly be affected, so that they can play the game, right? And so then you're going to go through your scenario and various injects, and they're going to be able to say, okay, you know, I, as a clinician, do not really know how we're going to get this network back online. I don't really know. Oh, are we going to have to shut down the CT scan if the EMR is affected? Are they totally separate systems? What am I going to be able to access? What am I not going to be able to access? So they're going to be able to say, like, “Hey, these are the steps we're going to do to protect what we can.” “You're going to have this available.” “You're not going to have this available.” And maybe I can offer, “hey, can you please prioritize getting this system up first?” Or, “can you please prioritize XYZ process, because that is most going to directly impact patient care.”
Bryson: How do you measure the impacts of that to patient care? Right? I mean, it's not binary.
Natalie: No, totally, and it's a really interesting question, and it's one that's been historically really difficult for us to study, right? Because no hospital is going to stand up and raise their hand in the current climate and say, oh, yeah, we had two patients die because of the cyber attack. Right? We can extrapolate, knowing that a lot of our clinical care is based on metrics. And say, hey, if the way that CMS is judging my ability to care for a heart attack is whether they get to the cath lab in 90 minutes, and I know that my care is being massively delayed by the fact that my entire workflow has changed, I think I could extrapolate there's probably going to be some clinical impact there.
There's a study out of UC San Diego, UCSD, showing that when Scripps went down, the attack taking down one hospital system affected cardiac outcomes at other hospitals, because they had to divert patients who were in cardiac arrest. There's also some data, I think, from one of the NHS hacks where they said that, because of a delay in lab results, they believe it led to a patient death. But even then, that didn't really come straight from the NHS. I think that came from the news.
So, long story short, we don't have a lot of great data on this, because there's not a lot of protection to study it, and there's a lot of liability associated with acknowledging those kinds of failures. But if you think about healthcare, if you've ever taken your loved one to the hospital, and you think about the importance of even just time in getting a diagnosis, getting treatment, you can infer that there's certainly going to be clinical impact to these kinds of disruptions.
Bryson: Do you think that's something that's well understood on the healthcare side of the industry? And I recognize that's a broad question, right? Because it's gonna be different. But considering that there is an overall ecosystem threat, do you think that at different levels, and there are even different perspectives within those levels, from hospital leadership to hospital professionals to even the cybersecurity folks, is that understood?
Natalie: I think it's becoming… it's gotten a lot better. So I think there's still a lot of misunderstanding, and I think there's also just a lot of maybe not thinking about it. You know, among clinicians, we're very focused on our patients. We're very focused on taking care of the person in front of us. If you're not necessarily very interested in this kind of thing, if you're not very interested in, you know, systems, if you've never experienced it, it might not be something that's high on your list of potential threats to your patients. But I think more and more, as healthcare is becoming an enormous target for cybersecurity threats, people are becoming more and more aware of it. I talked earlier about the Hazard Vulnerability Analysis and kind of ranking your threats. Years ago, we did a survey that showed that it really wasn't at the top of most emergency managers' hazard vulnerability analyses. And I think if you redid it today, it would be much higher, and there's much more of a focus.
Bryson: So of course, the bad thing did happen. As you mentioned earlier, there was a ransomware incident that you went through. With as much detail as you are comfortable sharing, can you please walk us through? Like we're there, Murder Mystery. It's a dark and stormy night. What happened? How did it feel? What did you go through?
Natalie: So as someone interested in disaster response, I always say, you know, I don't want bad things to happen, but I'm kind of excited when I'm there for it. So in that way, it was kind of a lucky chance. I came on at 7am, and at around 4am the computers had all just shut off. And really, I'd never experienced anything like that. In medicine, we'd certainly had downtime, and thankfully, we do have a very robust Emergency Management System at the hospital I was working at, and so we were familiar with the downtime paperwork, and we started getting the ball rolling. And it was funny, because, you know, our physicians who had been practicing longer were like, oh, this is old hat. We'll get out the whiteboard. We've done this many times. For most of their careers, they didn't have the Medical Internet of Things. It was a different form of practice. So in that way, we had that kind of historical knowledge that we were relying on. But it was really interesting, and I think it really opened my eyes to how tightly coupled our system is and how vulnerable we are to these cascading failures. Because it's not just, unfortunately we don't have the EMR (Electronic Medical Record) to type in. It's, how am I getting my lab results? How am I going to find out about my patient’s radiology results? How are we going to make sure that our monitoring systems are working correctly and that we're able to do our normal monitoring? And thankfully, ours was largely the EMR, but it does make you think about, you know, what if my medical devices weren't reliable? What if my radiology wasn't reliable? What if I, you know, get lab results back four hours late, and now I'm four hours behind the ball of a heart attack? Thankfully, those things didn't happen. But those are the kinds of things that you're thinking about when you're practicing in a totally different system.
Bryson: What do you think we can do to get around that? I mean, trusting a medical device is not what a doctor needs to be thinking about when they're trying to use the medical device.
Natalie: No, it's terrifying, right?
Bryson: Right. So what do we do?
Natalie: Yeah, I mean, the FDA (U.S. Food and Drug Administration) has made huge progress in the last few years around regulating medical devices, if we're just talking about regulating medical devices. So I think that at least seems to be progressing in an exciting way. In terms of, as a clinician, what can we do, as someone who's interested in emergency management? I think that we can create more robust ways of responding and mitigating risk when you know that you're not going to have access to what the normal standard framework is for practicing medicine. I think we can still deliver a high standard of care, but you have to think about how you're going to do that. And so in some ways, having this experience really helped us to better prepare, because we know what worked and what didn't during that time, and now we can generate better practices. But I do think there are a lot of hospitals who think they're never going to get hit, they're never going to be affected, or maybe they don't have the funding to do some of this, and they just haven't even practiced what their procedures are going to be. And then in terms of, you know, trusting devices, trusting your information, I think that's going to get even tougher as AI gets integrated into practicing clinical medicine, so you really have to fall back on being a doctor and trusting your clinical gestalt and getting back to the basics of medicine. But certainly, that's not what we plan for, right? We're not going to plan for, like, oh, just be a good doctor. We're going to plan for, how do we not put you in that position?
Bryson: Exactly. We're planning for resilience. Prevention and resilience in this case. Have you heard, I mean, through the grapevine, right, anonymized. Have you heard of incidents where healthcare providers, physicians have been in that kind of scenario, or where they've been at risk?
Natalie: I've certainly… I have not experienced that personally. But if you read between the lines of a lot of the literature that has come out about patient outcomes: there was a CISA article from during the pandemic that indicated that ICU patients had a higher risk of mortality in hospitals that were under cyber attack. So I think the information is out there, and these are physicians who have been put in a position where they're experiencing poor patient outcomes because of a failure of the system that they work in.
Bryson: It's not even just the hospital that's under attack, or under impact. I hate saying attack, because it can mean different things. But it's not just the hospital that's directly impacted; the entire region also experiences a cascading effect. Because if you were going to be sent to the hospital that's closest to you, and that's the one that's impacted, you're going to be rerouted to another hospital, which has now not planned for that capacity, and with the distance it took from here to there, you're now adding time to proper treatment. And we know about the impact of time on certain ailments and what those patient outcomes can mean. So there's also an indirect, or regional, impact from something being in that situation.
Natalie: What some people who maybe don't work directly in hospitals may not be aware of, although, if you've been to an emergency department anytime in the last 10 years, I think you probably know: we are functioning at or near capacity almost every day in every hospital across the country. And so any added burden is going to affect the entire system. So if you have to put one hospital on diversion, meaning ambulances are not going to go to that hospital, for whatever the reason is, that burden doesn't disappear. It's a zero sum game. It's going to the hospital next door. And so, like I said, there's actual data that you're going to do worse in cardiac arrest in a city where one hospital is affected by a cyber attack. And then when you start thinking about that regionality as well, one thing I also think about is these utilities that we're all using, right? So I love the way Josh Corman speaks about this. We are critical infrastructure, but we're deeply, deeply dependent on the surrounding critical infrastructure. So if one hospital goes without water, that hospital cannot function, and that is going to dramatically affect all of the other hospitals in the region, even if they had functional utilities.
Bryson: Is there anything more around that that you want to discuss?
Natalie: To me, we've kind of sounded the alarm, right? And maybe I still experience some situations where there's a lack of insight into the clinical implications or the possible impact on clinical outcomes, just because, you know, people are a little siloed in their expertise, right? We have our technical experts, our clinical experts, our systems experts. But what I'm more interested in now is, okay, we've sounded the alarm. There's obviously a problem. We know that we're heavily dependent on these tightly coupled, complex systems. We know from, you know, events like Change Healthcare that these third party vendors can have enormous impacts on healthcare in a way that you maybe don't even know, because there are these invisible ties. So how can we approach that? Like, what are we going to do to address that? And I think, you know, it's more than just training. To me, exercising and running through drills is absolutely training, but it's also relationship building and communication. Because part of this is we're not speaking the same language. I don't know the technical specifics of what we're going to do, beyond the superficial, in a cyber attack at a hospital or at a water plant, but I do know that my patients need dialysis. I know which patients we need to prioritize. I know which clinical diagnostics and management tools we need to prioritize. And so having these drills, having an emergency operation plan, coming together and creating a protocol: we're not only generating a plan and practicing it, but we're also building those relationships and making sure that we're continuing to communicate and not operating in parallel.
Bryson: Alright, so an open question: anything at the high level that Natalie, getting on her podium, wants to discuss. We've sounded the alarm. Any more detail around that that you want to pull out before we go into the lightning round?
Natalie: I think, you know, my big takeaway point is just that cyber is a lot more than losing data. You know, privacy is so important, and I would never argue that it's not, but that was really the focus for a long time. And there are so many other consequences of this that we need to focus on to prevent true patient safety events. I think, you know, just continuing to emphasize that hospitals are highly reliant on IT and OT, and we just need to focus on optimizing that resilience. And assume that this is going to happen at some point to you, and you need to be ready for it.
Bryson: All right. You ready for the lightning round?
Natalie: I'm ready for the lightning round.
Bryson: This is where the pressure comes.
Natalie: It's a little scary, yeah.
Bryson: If you could wave a magic, non-internet-connected wand, what's one thing you would change, Natalie?
Natalie: So even though I've had time to think about this, I think it's really tough. If I could, I want to talk about my friends at UCSD who are building a really important, I think a really cool, program. So rather than just having our paper plan, which we obviously need to have, two of my colleagues, Christian Dameff and Jeff Tully at UCSD, are building kind of a cyber alternative. So instead of just, you know, throwing up our hands and saying, let's go to paper, which I think we absolutely need to have a plan for, what if we were able to stand up a different system that mimics our normal workflow, so people aren't operating outside of their comfort zone, but still allows us to provide safe patient care? So I think in an ideal world, rather than having people reinvent their whole workflow when something like this happens, if we could have a better backup system that more closely mimics the way that we normally operate, that would be a magic wand, to me. Much easier said than done, I think.
Bryson: Yeah, I mean, that's kind of the range challenge: if I could just replicate everything effectively, so it's 100%, to be able to test in, then I would be able to understand everything, you know, from a systems perspective before it happens. Right? Nobody's ever going to get one of those, which is why it is absolutely a magic wand request. You've waved your magic wand; now, looking into your crystal ball, which looks suspiciously like a cardiac monitor, what is one good thing and one bad thing that you think is going to happen in the future?
Natalie: One good thing, I think, is that the alarm has been raised and that this is starting to become more and more recognized as a national security threat. And for better or worse, in our country and in the world in general, as things continue to be framed in that way, I find that they get more attention, because it is a national security threat, possibly one of the largest national security threats. Because of that, I see a lot of movement towards protecting critical infrastructure like power, water, hospitals, and I'm hoping that that is going to continue, especially since this is really a bipartisan thing that everyone can get behind. One bad thing is that I think we're going to continue to see that our weakest links are our greatest vulnerabilities, and that we're going to continue to have cyber attacks on hospital systems, and unfortunately, on hospital systems that are least prepared. It's usually not because they don't want to protect themselves or they don't want to protect their patients. It's because they're rural hospitals, they're underfunded, they're overcrowded, they're already, you know, practicing at the margins of their business. And so until we address our most vulnerable systems, I think we're going to be at risk as a whole.
Bryson: Sad trombone.
Natalie: Yeah, sorry. Maybe I should have said the bad one first.
Bryson: That's okay. Anyway, probably anything when you talk about security inevitably leads to the wet blanket of: nobody says everything's fine. Everybody's always like, well, this is what we need to do, and if we don't, the bad thing. So you're not alone. You stand in the pantheon of others, including myself.
Natalie: I mean, I think it's tough. I don't think anyone has the solution or the magic wand --
Bryson: Cybersecurity is unsolved; that's why we have job security?
Natalie: Yeah, it's unsolved and it's constantly evolving, right?
Bryson: Is cybersecurity truly bipartisan?
Natalie: It should be bipartisan.
Bryson: Yeah.
Natalie: Because there's nothing partisan about protecting our most vulnerable, and there's nothing partisan about your basic utilities and critical infrastructure, right? I think we should all be able to get behind power, water and hospitals.
Bryson: This is Hack the Plant, a podcast from the ICS Village. Catch us at an event near you. Subscribe wherever you find podcasts to get episodes as soon as they're released. Thanks for listening.