Wendy Chapman

What could go wrong?

Updated: Sep 25, 2023

Because I’ve been more on the methodological side of the translational research continuum for most of my career, perhaps it isn’t surprising that I’m only now learning firsthand the complexity of implementation of digital tools in healthcare settings. In particular, I’ve been asking this question: How much evidence do you need before you should feel comfortable putting a digital intervention into a clinical setting?


One of the most compelling examples of failing to answer that question comes from a 2005 publication in which a Pittsburgh children’s hospital rapidly implemented a commercially sold platform for patient transfers and showed a doubling of deaths after implementation:


Mortality rate significantly increased from 2.80% (39 of 1394) before CPOE implementation to 6.57% (36 of 548) after CPOE implementation.


The authors chalked it up to unintended consequences of health IT, but several letters to the editor, including one by my colleagues partly titled “Common Sense Health Information Technology”, pointed out that the way you implement health IT makes a huge difference (email me if you want the full text version).


The authors described a 6-day, out-of-the-box implementation plan for off-the-shelf software without customization (eg, order sets, or a module specifically developed for a pediatric ICU). They described dramatic workflow changes that were necessary to use CPOE functions for ordering, verification, and dispensing medication. They also reported altered interactions within the care team that reduced beneficial, synchronous communication… There is a critical need to understand existing culture and processes so that organizational efficiencies may be preserved or effectively transformed.


We are latecomers to a project where a solution was co-designed with patients and clinicians, and the tool has been built in an agile way with feedback from real users. Is it ready for prime time? They have developed a large language model add-on as well. Luckily, we have a multidisciplinary Validitron team with expertise in digital health app development, qualitative research, and implementation science, and they are mapping out the evidence needed to progress to a real-world pilot, including aspects such as:

- Usability
- Compatibility with work practices
- Perceived usefulness and multi-stakeholder endorsement
- Indicative evidence of potential for benefit
- Technology performance
- Compliance and standards adherence


I wish I had known this 10 years ago when, as a new chair at the University of Utah, I launched an exciting project to build and implement an easy-to-use electronic medical record in a student-led clinic. I got buy-in from stakeholders, we did user-centred design, we mapped workflows, we involved students with a variety of skills, and I secured foundation funding to work with a colleague whose system, which they had used in Africa, we could customize. After a year and a half, the project ended with a whimper. I really had no idea what I was getting into. Enthusiasm, creativity, and goodwill were nowhere near enough to make a project like that succeed. We needed a plan led by the same questions we are asking in our current collaboration: what evidence is needed to progress to a real-world pilot, and how do we prepare for a successful implementation? I had avoided thinking about this failure for a while, but when I listened to Debbie’s recent seminar, it struck me that the answers for our clinic project were there the whole time.


----------

This Cautionary Tales Episode on The Tragedy of the Sydney Opera House is a fun example of the same principles outside of health IT.

Two of the frameworks we are using for our project:



