First name
Evan
Middle name
W
Last name
Orenstein

Title

Clinical Decision Support Stewardship: Best Practices and Techniques to Monitor and Improve Interruptive Alerts.

Year of Publication

2022

Number of Pages

560-568

Date Published

05/2022

ISSN Number

1869-0327

Abstract

Interruptive clinical decision support systems, both within and outside of electronic health records, are a resource that should be used sparingly and monitored closely. Excessive interruptive alerting quickly leads to alert fatigue, in which alerts are increasingly ignored and lose their effectiveness. In this review, we discuss the evidence for effective alert stewardship as well as practices and methods we have found useful to assess interruptive alert burden, reduce excessive firings, optimize alert effectiveness, and establish quality governance at our institutions. We also discuss the importance of a holistic view of the alerting ecosystem beyond the electronic health record.

DOI

10.1055/s-0042-1748856

Alternate Title

Appl Clin Inform

PMID

35613913

Title

Alert burden in pediatric hospitals: a cross-sectional analysis of six academic pediatric health systems using novel metrics.

Year of Publication

2021

Number of Pages

Date Published

2021 Oct 19

ISSN Number

1527-974X

Abstract

<p><strong>BACKGROUND: </strong>Excessive electronic health record (EHR) alerts reduce the salience of actionable alerts. Little is known about the frequency of interruptive alerts across health systems and how the choice of metric affects which users appear to have the highest alert burden.</p>

<p><strong>OBJECTIVE: </strong>(1) Analyze alert burden by alert type, care setting, provider type, and individual provider across 6 pediatric health systems. (2) Compare alert burden using different metrics.</p>

<p><strong>MATERIALS AND METHODS: </strong>We analyzed interruptive alert firings logged in EHR databases at 6 pediatric health systems from 2016-2019 using 4 metrics: (1) alerts per patient encounter, (2) alerts per inpatient-day, (3) alerts per 100 orders, and (4) alerts per unique clinician days (calendar days with at least 1 EHR log in the system). We assessed intra- and interinstitutional variation and how alert burden rankings differed based on the chosen metric.</p>
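
The four burden metrics in the Methods are simple normalizations of the same firing count by different denominators. A minimal sketch of how they relate (the counts and field names below are illustrative, not data from the study):

```python
# Sketch of the four interruptive-alert burden metrics described in the Methods.
# All inputs are institution-level totals over the study window; the numbers
# used below are made up for illustration only.
def alert_burden_metrics(firings, encounters, inpatient_days, orders, clinician_days):
    """Return the four burden metrics for one institution."""
    return {
        "alerts_per_encounter": firings / encounters,
        "alerts_per_inpatient_day": firings / inpatient_days,
        "alerts_per_100_orders": 100 * firings / orders,
        # clinician_days = calendar days with at least one EHR login
        "alerts_per_clinician_day": firings / clinician_days,
    }

metrics = alert_burden_metrics(
    firings=12_000,
    encounters=40_000,
    inpatient_days=25_000,
    orders=300_000,
    clinician_days=15_000,
)
```

Because each metric divides the same numerator by a different denominator, institution-level rankings tend to be stable across metrics, while comparisons within an institution (e.g., by provider type or care setting) can shift with the denominator chosen.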

<p><strong>RESULTS: </strong>Alert burden varied widely across institutions, ranging from 0.06 to 0.76 firings per encounter, 0.22 to 1.06 firings per inpatient-day, 0.98 to 17.42 per 100 orders, and 0.08 to 3.34 firings per clinician day logged in the EHR. Custom alerts accounted for the greatest burden at all 6 sites. The rank order of institutions by alert burden was similar regardless of which alert burden metric was chosen. Within institutions, the alert burden metric choice substantially affected which provider types and care settings appeared to experience the highest alert burden.</p>

<p><strong>CONCLUSION: </strong>Estimates of the clinical areas with highest alert burden varied substantially by institution and based on the metric used.</p>

DOI

10.1093/jamia/ocab179

Alternate Title

J Am Med Inform Assoc

PMID

34664664

Title

Hidden health IT hazards: a qualitative analysis of clinically meaningful documentation discrepancies at transfer out of the pediatric intensive care unit.

Year of Publication

2019

Number of Pages

392-398

Date Published

2019 Oct

ISSN Number

2574-2531

Abstract

<p><strong>Objective: </strong>The risk of medical errors increases upon transfer out of the intensive care unit (ICU). Discrepancies in the documented care plan between notes at the time of transfer may contribute to communication errors. We sought to determine the frequency of clinically meaningful discrepancies in the documented care plan for patients transferred from the pediatric ICU to the medical wards and to identify associated risk factors.</p>

<p><strong>Materials and Methods: </strong>Two physician reviewers independently compared the transfer note and handoff document of 50 randomly selected transfers. Clinically meaningful discrepancies in the care plan between these two documents were identified using a coding procedure adapted from healthcare failure mode and effects analysis. We assessed the influence of risk factors via multivariable regression.</p>

<p><strong>Results: </strong>We identified 34 clinically meaningful discrepancies in 50 patient transfers. Fourteen transfers (28%) had ≥1 discrepancy, and ≥2 were present in 7 transfers (14%). The most common discrepancy categories were differences in situational awareness notifications and documented current therapy. Transfers with handoff document length in the top quartile had 10.6 (95% CI: 1.2-90.2) times more predicted discrepancies than transfers with handoff length in the bottom quartile. Patients receiving more medications in the 24 hours prior to transfer had higher discrepancy counts, with each additional medication increasing the predicted number of discrepancies by 17% (95% CI: 6%-29%).</p>
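
A 17% increase per additional medication implies a multiplicative (log-linear) effect on the predicted discrepancy count. A minimal sketch of that relationship (the function name and baseline value are illustrative, not the study's fitted model):

```python
# Sketch of a multiplicative per-medication effect on predicted discrepancy
# counts, as implied by the reported 17% increase per additional medication.
# `baseline` is an illustrative predicted count, not a figure from the study.
def predicted_discrepancies(baseline, extra_medications, rate_ratio=1.17):
    """Scale a baseline predicted count by the per-medication rate ratio."""
    return baseline * rate_ratio ** extra_medications
```

For example, two additional medications scale a baseline prediction by 1.17 squared, roughly 1.37, i.e., about 37% more predicted discrepancies.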

<p><strong>Conclusion: </strong>Clinically meaningful discrepancies in the documented care plan pose legitimate safety concerns and are common at the time of transfer out of the ICU among complex patients.</p>

DOI

10.1093/jamiaopen/ooz026

Alternate Title

JAMIA Open

PMID

31984372

Title

Formative Usability Testing Reduces Severe Blood Product Ordering Errors.

Year of Publication

2019

Number of Pages

981-990

Date Published

2019 Oct

ISSN Number

1869-0327

Abstract

<p><strong>BACKGROUND: </strong> Medical errors in blood product orders and administration are common, especially for pediatric patients. A failure modes and effects analysis in our health care system indicated high risk from the electronic blood ordering process.</p>

<p><strong>OBJECTIVES: </strong> This study has two objectives: (1) to describe differences in the design of the original blood product orders and order sets in the system (original design), new orders and order sets designed by an expert committee (DEC), and a third version developed through user-centered design (UCD); and (2) to compare the number and type of ordering errors, task completion rates, time on task, and user preferences between the original design and that developed via UCD.</p>

<p><strong>METHODS: </strong> A multidisciplinary expert committee proposed adjustments to existing blood product order sets, resulting in the DEC order set. When that order set was tested with front-line users, persistent failure modes were detected, so orders and order sets were redesigned again via formative usability testing. Front-line users in their native clinical workspaces were observed ordering blood in realistic simulated scenarios using a think-aloud protocol. Iterative adjustments were made between participants. In summative testing, participants were randomized to use the original design or UCD for five simulated scenarios. We evaluated differences in ordering errors, time on task, and users' design preference with two-sample t-tests.</p>

<p><strong>RESULTS: </strong> Formative usability testing with 27 providers from seven specialties led to 18 changes made to the DEC to produce the UCD. In summative testing, error-free task completion for the original design was 36%, which increased to 66% in UCD (difference: 30%, 95% confidence interval [CI]: 3.9-57%; P = 0.03). Time on task did not vary significantly.</p>

<p><strong>CONCLUSION: </strong> UCD led to substantially different blood product orders and order sets than DEC. Users made fewer errors when ordering blood products for pediatric patients in simulated scenarios when using the UCD orders and order sets compared with the original design.</p>

DOI

10.1055/s-0039-3402714

Alternate Title

Appl Clin Inform

PMID

31875648

Title

Towards a Maturity Model for Clinical Decision Support Operations.

Year of Publication

2019

Number of Pages

810-819

Date Published

2019 Oct

ISSN Number

1869-0327

Abstract

<p>Clinical decision support (CDS) systems delivered through the electronic health record are an important element of quality and safety initiatives within a health care system. However, managing a large CDS knowledge base can be an overwhelming task for informatics teams. Additionally, it can be difficult for these informatics teams to communicate their goals to external operational stakeholders and define concrete steps for improvement. We aimed to develop a maturity model that describes a roadmap toward organizational functions and processes that help health care systems use CDS more effectively to drive better outcomes. We developed a maturity model for CDS operations through discussions with health care leaders at 80 organizations, iterative model development by four clinical informaticists, and subsequent review with 19 health care organizations. We ceased iterations when feedback from three organizations did not result in any changes to the model. The proposed CDS maturity model includes three main "pillars": "Content Creation," "Analytics and Reporting," and "Governance and Management." Each pillar contains five levels; advancing along each pillar provides CDS teams a deeper understanding of the processes CDS systems are intended to improve. A "roof" represents the CDS functions that become attainable after advancing along each of the pillars. Organizations are not required to advance in order and can develop in one pillar separately from another. However, we hypothesize that optimal deployment of preceding levels and advancing in tandem along the pillars increase the value of organizational investment in higher levels of CDS maturity. In addition to describing the maturity model and its development, we also provide three case studies of health care organizations using the model for self-assessment and determination of next steps in CDS development.</p>

DOI

10.1055/s-0039-1697905

Alternate Title

Appl Clin Inform

PMID

31667818

Title

Development and dissemination of clinical decision support across institutions: standardization and sharing of refugee health screening modules.

Year of Publication

2019

Number of Pages

Date Published

2019 Aug 02

ISSN Number

1527-974X

Abstract

<p><strong>OBJECTIVES: </strong>We developed and piloted a process for sharing guideline-based clinical decision support (CDS) across institutions, using health screening of newly arrived refugees as a case example.</p>

<p><strong>MATERIALS AND METHODS: </strong>We developed CDS to support care of newly arrived refugees through a systematic process including a needs assessment, a 2-phase cognitive task analysis, structured preimplementation testing, local implementation, and staged dissemination. We sought consensus from prospective users on CDS scope, applicable content, basic supported workflows, and final structure. We documented processes and developed sharable artifacts from each phase of development. We publicly shared CDS artifacts through online dissemination platforms. We collected feedback and implementation data from implementation sites.</p>

<p><strong>RESULTS: </strong>Responses from 19 organizations demonstrated a need for improved CDS for newly arrived refugee patients. A guided multicenter workflow analysis identified 2 main workflows used by organizations that would need to be supported by shared CDS. We developed CDS through an iterative design process, which was successfully disseminated to other sites using online dissemination repositories. Implementation sites had a small-to-modest analyst time commitment but reported a good match between CDS and workflow.</p>

<p><strong>CONCLUSION: </strong>Sharing of CDS requires overcoming technical and workflow barriers. We used a guided multicenter workflow analysis and online dissemination repositories to create flexible CDS that has been adapted at 3 sites. Organizations looking to develop sharable CDS should consider evaluating the workflows of multiple institutions and collecting feedback on scope, design, and content in order to make a more generalizable product.</p>

DOI

10.1093/jamia/ocz124

Alternate Title

J Am Med Inform Assoc

PMID

31373356

Title

Accuracy of Pulse Oximetry-Based Home Baby Monitors.

Year of Publication

2018

Number of Pages

717-719

Date Published

2018 Aug 21

ISSN Number

1538-3598

Abstract

<p>Smartphone-integrated consumer baby monitors that measure vital signs are popular among parents but are not regulated by the US Food and Drug Administration (FDA). This study measured the accuracy of pulse oximetry-based consumer baby monitors using an FDA-cleared oximeter as a reference.</p>

DOI

10.1001/jama.2018.9018

Alternate Title

JAMA

PMID

30140866

Title

Influence of simulation on electronic health record use patterns among pediatric residents.

Year of Publication

2018

Number of Pages

Date Published

2018 Aug 21

ISSN Number

1527-974X

Abstract

<p><strong>Objective: </strong>Electronic health record (EHR) simulation with realistic test patients has improved recognition of safety concerns in test environments. We assessed if simulation affects EHR use patterns in real clinical settings.</p>

<p><strong>Materials and Methods: </strong>We created a 1-hour educational intervention of a simulated admission for pediatric interns. Data visualization and information retrieval tools were introduced to facilitate recognition of the patient's clinical status. Using EHR audit logs, we assessed the frequency with which these tools were accessed by residents prior to simulation exposure (intervention group, pre-simulation), after simulation exposure (intervention group, post-simulation), and among residents who never participated in simulation (control group).</p>

<p><strong>Results: </strong>From July 2015 to February 2017, 57 pediatric residents participated in a simulation and 82 did not. Residents were more likely to use the data visualization tool after simulation (73% in post-simulation weeks vs 47% of combined pre-simulation and control weeks, P &lt; .0001) as well as the information retrieval tool (85% vs 36%, P &lt; .0001). After adjusting for residents' experiences measured in previously completed inpatient weeks of service, simulation remained a significant predictor of using the data visualization (OR 2.8, CI: 2.1-3.9) and information retrieval tools (OR 3.0, CI: 2.0-4.5). Tool use did not decrease in interrupted time-series analysis over a median of 19 (IQR: 8-32) weeks of post-simulation follow-up.</p>
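
The adjusted odds ratios above come from multivariable models; as a point of reference, an unadjusted odds ratio can be computed directly from two reported proportions. A minimal sketch (this function is illustrative, not the study's analysis):

```python
# Sketch: unadjusted odds ratio from two proportions, e.g. data-visualization
# tool use in 73% of post-simulation weeks vs 47% of pre-simulation/control weeks.
def odds_ratio(p_exposed, p_unexposed):
    """Odds ratio comparing two proportions (each strictly between 0 and 1)."""
    odds_exposed = p_exposed / (1 - p_exposed)
    odds_unexposed = p_unexposed / (1 - p_unexposed)
    return odds_exposed / odds_unexposed
```

Here odds_ratio(0.73, 0.47) is roughly 3.0, in the same range as the adjusted OR of 2.8 reported for the data visualization tool.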

<p><strong>Discussion: </strong>Simulation was associated with persistent changes to EHR use patterns among pediatric residents.</p>

<p><strong>Conclusion: </strong>EHR simulation is an effective educational method that can change participants' use patterns in real clinical settings.</p>

DOI

10.1093/jamia/ocy105

Alternate Title

J Am Med Inform Assoc

PMID

30137348