Strengthen the Evidence for Maternal and Child Health Programs

Established Evidence Results

Below are articles that support specific interventions to advance MCH National Performance Measures (NPMs) and Standardized Measures (SMs). Most interventions contain multiple components as part of a coordinated strategy/approach.

You can filter by intervention component below and sort to refine your search.


Displaying records 1 through 11 (11 total).

Barry S, Paul K, Aakre K, Drake-Buhr S, Willis R. Final Report: Developmental and Autism Screening in Primary Care. Burlington, VT: Vermont Child Health Improvement Program; 2012.

Evidence Rating: Emerging Evidence

Intervention Components: PROVIDER/PRACTICE, Provider Training/Education, Educational Material (Provider), Participation Incentives, Quality Improvement/Practice-Wide Intervention, Expert Support (Provider), Modified Billing Practices, Data Collection Training for Staff, Screening Tool Implementation Training, Office Systems Assessments and Implementation Training, Expert Feedback Using the Plan-Do-Study-Act-Tool, Collaboration with Local Agencies (State), Collaboration with Local Agencies (Health Care Provider/Practice), Engagement with Payers, STATE, POPULATION-BASED SYSTEMS, Audit/Attestation, HEALTH_CARE_PROVIDER_PRACTICE, Audit/Attestation (Provider)

Intervention Description: The Vermont Child Health Improvement Program (VCHIP) at the University of Vermont collaborated with state agencies and professional societies to conduct a survey of Vermont pediatric and family medicine practices regarding their developmental screening and autism screening processes, referral patterns, and barriers. The survey was administered in 2009 to 103 primary care practices, with a 65% response rate (89% for pediatric practices, 53% for family medicine practices).

Intervention Results: The survey results revealed that while 88% of practices have a specific approach to developmental surveillance and 87% perform developmental screening, only 1 in 4 use structured tools with good psychometric properties. Autism screening was performed by 59% of practices, with most using the M-CHAT or CHAT tool and screening most commonly at the 18-month visit. When concerns were identified, 72% referred to a developmental pediatrician and over 50% to early intervention. Key barriers to both developmental and autism screening were lack of time, staff, and training. Over 80% of practices used a note in the patient chart to track at-risk children, and most commonly referred to child development clinics, audiology, early intervention, and pediatric specialists.

Conclusion: The survey conducted by VCHIP revealed wide variation in developmental and autism screening practices among Vermont pediatric and family medicine practices. While most practices conduct some form of screening, there is room for improvement in the use of validated tools, adherence to recommended screening ages, and implementation of office systems for tracking at-risk children. The survey identified knowledge gaps and barriers that can be addressed through quality improvement initiatives, which most respondents expressed interest in participating in.

Study Design: QE: pretest-posttest

Setting: Pediatric and family medicine practices in Vermont

Population of Focus: Children up to age 3

Data Source: Child medical record; ProPHDS Survey

Sample Size: Chart audits at 37 baseline and 35 follow-up sites (n=30 per site). Baseline charts (n=1,381): children 19-23 months (n=697), children 31-35 months (n=684). Follow-up charts (n=1,301): children 19-23 months (n=646), children 31-35 months (n=655).

Age Range: Not specified

Access Abstract

Conroy, K., Rea, C., Kovacikova, G. I., Sprecher, E., Reisinger, E., Durant, H., Starmer, A., Cox, J., & Toomey, S. L. (2018). Ensuring Timely Connection to Early Intervention for Young Children With Developmental Delays. Pediatrics, 142(1), e20174017. https://doi.org/10.1542/peds.2017-4017

Evidence Rating: Moderate

Intervention Components: Multicomponent Approach, Office Systems Assessments and Implementation Training, Data Collection Training for Staff

Intervention Description: The intervention aimed to improve the process of referring patients to early intervention (EI) services and included three components: (1) patient and provider activation — the improvement team met with local EI staff to review eligibility criteria and best practices in motivating families to connect with EI, and developed an EI brochure to educate families on EI's services and evaluation process; (2) centralizing and tracking referrals through an EI registry — referral routes were streamlined by encouraging use of an electronic order form within the electronic medical record (EMR) that directed the referral into the database after an intake visit had been scheduled, and the registry was used to track referrals and facilitate follow-up; and (3) Plan-Do-Study-Act (PDSA) cycles — the team conducted a series of PDSA cycles on communication with EI sites to refine the intervention and address identified barriers. This multifaceted approach targeted the identified drivers of successful EI referral and streamlined the referral process so that patients were connected with EI in a timely manner.

Intervention Results: The percentage of patients evaluated by EI within 120 days increased from a baseline median of 50% to a median of 72% after implementation of the systems (N = 309). After implementation, the centralized referral system was used a median of 90% of the time. Tracking of referral outcomes revealed decreases in families refusing evaluations and improvements in exchange of information with EI.

Conclusion: The percentage of patients evaluated by EI within 120 days of referral increased from a baseline median of 50% to a median of 72% after implementation of the new referral process. The study also identified demographic and clinical predictors of successful evaluation, with insurance status and specific diagnoses significantly associated with timely evaluation (assessed using t tests, χ2 testing, and multivariate logistic regression). These results demonstrate the effectiveness of the intervention in improving timely connection of patients to EI services.

Study Design: Quality improvement (QI) initiative. The authors redesigned the early intervention (EI) referral process with the goal of ensuring that 70% of patients referred to EI were evaluated by the program. The redesign included a centralized electronic referral system used by providers, patient navigators responsible for processing all EI referrals, and a post-referral tracking system to identify patients failing to connect with EI.

Setting: An academic hospital-based primary care clinic caring for approximately 16,000 patients, 17% of whom are under 3 years of age and potentially eligible for early intervention services. Families primarily reside in urban neighborhoods, 68% are Medicaid insured, and 20% of well-child visits are billed as having a developmental-behavioral concern. The pediatric provider team consists of attending physicians, nurse practitioners, and resident physicians.

Population of Focus: Children under 3 years of age with developmental delays referred from the clinic to early intervention services.

Sample Size: 309 patients referred to early intervention services from the academic primary care clinic, of whom 219 were evaluated within 120 days of referral.

Age Range: Children under 3 years of age who are experiencing or at risk for developmental delays.

Access Abstract

Earls MF, Hay SS. Setting the stage for success: implementation of developmental and behavioral screening and surveillance in primary care practice--the North Carolina Assuring Better Child Health and Development (ABCD) Project. Pediatrics. 2006;118(1):e183-188.

Evidence Rating: Emerging Evidence

Intervention Components: PROVIDER/PRACTICE, Provider Training/Education, Educational Material (Provider), Expert Support (Provider), Participation Incentives, Modified Billing Practices, Data Collection Training for Staff, Screening Tool Implementation Training, Office Systems Assessments and Implementation Training, Expert Feedback Using the Plan-Do-Study-Act-Tool, Collaboration with Local Agencies (State), Collaboration with Local Agencies (Health Care Provider/Practice), Engagement with Payers, STATE, POPULATION-BASED SYSTEMS, Audit/Attestation, HEALTH_CARE_PROVIDER_PRACTICE, Audit/Attestation (Provider)

Intervention Description: Early identification of children with developmental and behavioral delays is important in primary care practice, and well-child visits provide an ideal opportunity to engage parents and perform periodic screening. Integration of this activity into office process and flow is necessary for making screening a routine and consistent part of primary care practice.

Intervention Results: In the North Carolina Assuring Better Child Health and Development Project, careful attention to and training for office process has resulted in a significant increase in screening rates to >70% of the designated well-child visits. The data from the project prompted a change in Medicaid policy, and screening is now statewide in primary practices that perform Early Periodic Screening, Diagnosis, and Treatment examinations.

Conclusion: Although there are features of the project that are unique to North Carolina, there are also elements that are transferable to any practice or state interested in integrating child development services into the medical home.

Study Design: QE: pretest-posttest

Setting: Partnership for Health Management, a network within Community Care of North Carolina

Population of Focus: Children ages 6 to 60 months receiving Early Periodic Screening, Diagnosis, and Treatment services

Data Source: Child medical record

Sample Size: Unknown number of charts – screening rates tracked in 2 counties (>20,000 screens by 2004)

Age Range: Not specified

Access Abstract

Flower, K. B., Massie, S., Janies, K., Bassewitz, J. B., Coker, T. R., Gillespie, R. J., ... & Earls, M. F. (2020). Increasing early childhood screening in primary care through a quality improvement collaborative. Pediatrics, 146(3).

Evidence Rating: Moderate

Intervention Components: Quality Improvement/Practice-Wide Intervention, Office Systems Assessments and Implementation Training, HEALTH_CARE_PROVIDER_PRACTICE, Audit/Attestation (Provider), Data Collection Training for Staff, Provider Training/Education

Intervention Description: This 1-year national quality improvement collaborative involved 19 pediatric primary care practices. Supported by virtual and in-person learning opportunities, practice teams implemented changes to early childhood screening. Monthly chart reviews were used to assess screening, discussion, referral, and follow-up for development, ASD, maternal depression, and SDoH. Parent surveys were used to assess parent-reported screening and referral and/or resource provision. Practice self-ratings and team surveys were used to assess practice-level changes.

Intervention Results: Participating practices included independent, academic, hospital-affiliated, and multispecialty group practices and community health centers in 12 states. The collaborative met development and ASD screening goals of >90%. Largest increases in screening occurred for maternal depression (27% to 87%; +222%; P < .001) and SDoH (26% to 76%; +231%; P < .001). Statistically significant increases in discussion of results occurred for all screening areas. For referral, significant increases were seen for development (53% to 86%; P < .001) and maternal depression (23% to 100%; P = .008). Parents also reported increased screening and referral and/or resource provision. Practice-level changes included improved systems to support screening.

Conclusion: Practices successfully implemented multiple screenings and demonstrated improvement in subsequent discussion, referral, and follow-up steps. Continued advocacy for adequate resources to support referral and follow-up is needed to translate increased screening into improved health outcomes.

Setting: Pediatric primary care practices

Population of Focus: Practice teams consisting of a physician leader, staff, and a parent partner

Access Abstract

Gray C, Fox K,Williamson ME. Improving Health Outcomes for Children (IHOC): First STEPS II Initiative: Improving Developmental, Autism, and Lead Screening for Children: Final Evaluation. Portland, ME: University of Southern Maine Muskie School of Public Service; 2013.

Evidence Rating: Emerging Evidence

Intervention Components: PROVIDER/PRACTICE, Provider Training/Education, Expert Support (Provider), Modified Billing Practices, Screening Tool Implementation Training, Office Systems Assessments and Implementation Training, Expert Feedback Using the Plan-Do-Study-Act-Tool, Engagement with Payers, STATE, POPULATION-BASED SYSTEMS, Collaboration with Local Agencies (State), Collaboration with Local Agencies (Health Care Provider/Practice), HEALTH_CARE_PROVIDER_PRACTICE, Audit/Attestation (Provider)

Intervention Description: This report evaluates the impact of Phase II of Maine's First STEPS initiative.

Intervention Results: Average percentage of documented use of a developmental screening tool increased substantially from baseline to followup for all three age groups (46% to 97% for children under one; 22% to 71% for children 18-23 months; and 22% to 58% for children 24-35 months). Rate of developmental screening based on MaineCare claims increased from the year prior to intervention implementation to the year after implementation for all three age groups (5.3% to 17.1% for children age one; 1.5% to 13.3% for children age two; and 1.2% to 3.3% for children age 3).

Conclusion: The authors summarize lessons learned in implementing changes in practices and challenges in using CHIPRA and IHOC developmental, autism, and lead screening measures at the practice-level to inform quality improvement.

Study Design: QE: pretest-posttest

Setting: Pediatric and family practices serving children with MaineCare coverage

Population of Focus: Children ages 6 to 35 months

Data Source: Child medical record; MaineCare paid claims

Sample Size: Unknown number of chart reviews from 9 practice sites completing follow-up

Age Range: Not specified

Access Abstract

Lannon CM, Flower K, Duncan P, Moore KS, Stuart J, Bassewitz J. The Bright Futures Training Intervention Project: implementing systems to support preventive and developmental services in practice. Pediatrics. 2008;122(1):e163-171.

Evidence Rating: Emerging Evidence

Intervention Components: PROVIDER/PRACTICE, Provider Training/Education, Educational Material (Provider), Expert Support (Provider), Quality Improvement/Practice-Wide Intervention, Data Collection Training for Staff, Office Systems Assessments and Implementation Training, Expert Feedback Using the Plan-Do-Study-Act-Tool, POPULATION-BASED SYSTEMS, STATE, Collaboration with Local Agencies (State), Collaboration with Local Agencies (Health Care Provider/Practice), Audit/Attestation, HEALTH_CARE_PROVIDER_PRACTICE, Audit/Attestation (Provider)

Intervention Description: The objectives of this study were to assess the feasibility of implementing a bundle of strategies to facilitate the use of Bright Futures recommendations and to evaluate the effectiveness of a modified learning collaborative in improving preventive and developmental care.

Intervention Results: Office system changes most frequently adopted were use of recall/reminder systems (87%), a checklist to link to community resources (80%), and systematic identification of children with special health care needs (80%). From baseline to follow-up, increases were observed in the use of recall/reminder systems, the proportion of children's charts that had a preventive services prompting system, and the families who were asked about special health care needs. Of 21 possible office system components, the median number used increased from 10 to 15. Comparing scores between baseline and follow-up for each practice site, the change was significant. Teams reported that the implementation of office systems was facilitated by the perception that a component could be applied quickly and/or easily. Barriers to implementation included costs, the time required, and lack of agreement with the recommendations.

Conclusion: This project demonstrated the feasibility of implementing specific strategies for improving preventive and developmental care for young children in a wide variety of practices. It also confirmed the usefulness of a modified learning collaborative in achieving these results. This model may be useful for disseminating office system improvements to other settings that provide care for young children.

Study Design: QE: pretest-posttest

Setting: Primary care practices (15 at baseline, 8 at follow-up) throughout the US (9 states total), with most in the Midwest

Population of Focus: Children from birth through 21 years of age

Data Source: Child medical record

Sample Size: Unknown number of chart audits from 8 practice sites completing follow-up

Age Range: Not specified

Access Abstract

Margolis PA, McLearn KT, Earls MF, et al. Assisting primary care practices in using office systems to promote early childhood development. Ambul Pediatr. 2008;8(6):383-387.

Evidence Rating: Emerging Evidence

Intervention Components: PROVIDER/PRACTICE, Provider Training/Education, Expert Support (Provider), Quality Improvement/Practice-Wide Intervention, Data Collection Training for Staff, Office Systems Assessments and Implementation Training

Intervention Description: The aim of this study was to use family-centered measures to estimate the effect of a collaborative quality improvement program designed to help practices implement systems to promote early childhood development services.

Intervention Results: The number of care delivery systems increased from a mean of 12.9 to 19.4 of 27 in collaborative practices and remained the same in comparison practices (P=.0002). The proportion of children with documented developmental and psychosocial screening among intervention practices increased from 78% to 88% (P<.001) and from 22% to 29% (P=.002), respectively. Compared with control practices, there was a trend toward improvement in the proportion of parents who reported receiving at least 3 of 4 areas of care.

Conclusion: The learning collaborative was associated with an increase in the number of practice-based systems and tools designed to elicit and address parents' concerns about their child's behavior and development and a modest improvement in parent-reported measures of the quality of care.

Study Design: QE: pretest-posttest nonequivalent control group

Setting: Pediatric and family primary care practices (17 collaborative education, 18 comparison practices) in Vermont and North Carolina

Population of Focus: Children ages 0-48 months receiving well-child visits

Data Source: Child medical record

Sample Size: Unknown number of chart audits

Age Range: Not specified

Access Abstract

Minkovitz CS, Hughart N, Strobino D, et al. A practice-based intervention to enhance quality of care in the first 3 years of life: the Healthy Steps for Young Children Program. JAMA. 2003;290(23):3081-3091.

Evidence Rating: Moderate Evidence

Intervention Components: PATIENT/CONSUMER, Home Visits, PROVIDER/PRACTICE, Provider Training/Education, Educational Material (Provider), Expert Support (Provider), Screening Tool Implementation Training, Office Systems Assessments and Implementation Training, Data Collection Training for Staff

Intervention Description: To determine the impact of the Healthy Steps for Young Children Program on quality of early childhood health care and parenting practices.

Intervention Results: Percentage of children with developmental assessments was 83.1% for intervention and 41.4% for control group (OR=8.00; 95% CI=6.69, 9.56; P<.001)

Conclusion: Universal, practice-based interventions can enhance quality of care for families of young children and can improve selected parenting practices.

Study Design: RCT and QE: nonequivalent control group

Setting: Pediatric practices in 14 states (6 randomization sites: San Diego, CA; Iowa City, IA; Allentown, PA; Pittsburgh, PA; Florence, SC; Amarillo, TX. 9 QE sites: Birmingham, AL/Chapel Hill, NC; Grand Junction, CO/Montrose, CO; Chicago, IL; Kansas City, KS; Boston, MA; Detroit, MI; Kansas City, MO; New York, NY; Houston, TX/Richmond, TX)

Population of Focus: Children ages 0-36 months

Data Source: Child medical record

Sample Size: Randomization sites: intervention (n=832), control (n=761), total (n=1,593). Quasi-experimental sites: intervention (n=1,189), control (n=955), total (n=2,144). Overall: all families (n=3,737), intervention (n=2,021), control (n=1,716).

Age Range: Not specified

Access Abstract

Ray, K. N., Drnach, M., Mehrotra, A., Suresh, S., & Docimo, S. G. (2018). Impact of Implementation of Electronically Transmitted Referrals on Pediatric Subspecialty Visit Attendance. Academic pediatrics, 18(4), 409–417. https://doi.org/10.1016/j.acap.2017.12.008

Evidence Rating: Emerging Evidence

Intervention Components: Office Systems Assessments and Implementation Training, Referrals

Intervention Description: The intervention implemented electronically transmitted referrals for pediatric subspecialty care, with three main changes: (1) the EMR referral order was redesigned to transmit electronically to CHP subspecialty schedulers through a shared electronic health platform for referrals to CHP specialists; (2) for referrals electronically transmitted to CHP, schedulers called families up to 3 times — together, these two steps bypassed many steps and decisions the family would otherwise need to navigate (understanding the need for referral, deciding to schedule the referral, calling the scheduler, and navigating the phone tree); and (3) to improve PCPs' ability to track referrals, subspecialty schedulers sent electronic notifications to PCPs with the final scheduling outcome: appointment scheduled, family not reached, or family declined.

Intervention Results: From April 2015 through September 2016 there were 33,485 referral orders across all practices (7770 before the pilot, 11,776 during the pilot, 13,939 after full implementation). At pilot practices, there was a significant and sustained improvement in subspecialty visits attended within 4 weeks of referral (10.9% to 20.0%; P < .001). Relative to control practices, pilot practices experienced an 8.6% improvement (P = .001). After implementation at control practices, rates of visits attended also improved but to a smaller degree: 11.8% to 14.7% (P < .001). In survey responses, referring pediatricians noted improved scheduling processes but had continued concerns with appointment availability and referral tracking.

Conclusion: The percentage of referrals with a visit attended within 4 weeks increased significantly from 11.8% before the pilot to 21.1% after the pilot (P < .001). The percentage of referrals from control practices with a visit attended within 4 weeks also increased significantly, but more modestly, from 11.8% to 14.7% (P < .001). An interrupted time-series analysis confirmed a statistically significant change in the percentage of visits attended within 4 weeks of referral (P < .001).

Study Design: Quality improvement evaluation using an interrupted time-series analysis of administrative data from referring practices and subspecialty services on appointment scheduling and attendance, supplemented by a survey of referring pediatricians on their perceptions of care processes before and after the intervention.

Setting: Children's Hospital of Pittsburgh (CHP) of University of Pittsburgh Medical Center (UPMC) in southwestern Pennsylvania, a freestanding academic 315-bed children's hospital with over 240,000 outpatient subspecialty visits in 2015 across the main hospital and 9 satellite sites.

Population of Focus: Pediatric patients referred for subspecialty care from community general pediatric practices affiliated with CHP.

Sample Size: 39 community general pediatric practices affiliated with the Children's Hospital of Pittsburgh (CHP).

Age Range: The abstract does not specify the age of patients referred for subspecialty care; however, Table 2 shows the distribution of referrals by patient age, with categories of 0-2 years, 3-5 years, 6-11 years, 12-17 years, and 18+ years.

Access Abstract

Sanderson, D., Braganza, S., Philips, K., Chodon, T., Whiskey, R., Bernard, P., Rich, A., & Fiori, K. (2021). "Increasing Warm Handoffs: Optimizing Community Based Referrals in Primary Care Using QI Methodology". Journal of primary care & community health, 12, 21501327211023883. https://doi.org/10.1177/21501327211023883

Evidence Rating: Moderate

Intervention Components: Communication Tools, Office Systems Assessments and Implementation Training, Expert Feedback Using the Plan-Do-Study-Act-Tool

Intervention Description: The intervention comprised several components aimed at improving the warm handoff process and referral workflow: (1) dedicating space for Community Health Workers (CHWs) near providers and creating electronic CHW schedules and warm handoff blocks; (2) improving communication with providers via email and huddle reminders and posting informative signs in exam rooms to facilitate warm handoffs; (3) workflow enhancements, including warm handoff blocks in the electronic medical record (EMR) and the CHW's schedule, and co-location of CHWs with pediatricians for a specified period each week; (4) monthly update emails to the entire clinic staff with program data, workflow reminders, and success stories of patients referred to community resources, along with workflow reminders in exam rooms to prompt providers to conduct warm handoffs; and (5) leadership engagement, with leadership buy-in to the workflow changes and success stories of patients who connected with a referral resource shared to positively reinforce referral behavior. These components were tested through Plan-Do-Study-Act (PDSA) cycles aimed at optimizing the warm handoff process and increasing effective referrals for patients with unmet social needs.

Intervention Results: Using quality improvement (QI) methods, the pediatric clinic worked to increase the warm handoff rate between Community Health Workers (CHWs) and patients with unmet social needs. CHW warm handoff rates increased two-fold over the intervention period, demonstrating that QI methods can be used to optimize workflows to increase warm handoffs with CHWs.

Conclusion: The study reported statistically significant improvements: the CHW referral rate was significantly higher in the intervention period than in the baseline period (P = .03), and the warm handoff rate between families requesting assistance with unmet social needs and CHWs increased significantly over the intervention period (P < .001). These findings indicate that the QI interventions significantly increased warm handoffs and improved the referral process for patients with unmet social needs.

Study Design: Quality improvement (QI) project using a pre-post design (not a randomized controlled trial). The authors used Plan-Do-Study-Act (PDSA) cycles to test and implement interventions aimed at increasing the warm handoff rate between patients with unmet social needs requesting assistance and Community Health Workers (CHWs), comparing the intervention period with the baseline period.

Setting: The setting was a pediatric clinic affiliated with the Albert Einstein College of Medicine and Montefiore Medical Group in Bronx, NY, USA. The study took place at an academic-affiliated Federally Qualified Health Center (FQHC) where providers and residents are accustomed to participating in quality improvement (QI) and research projects. The clinic serves underserved communities and aimed to optimize community-based referrals in primary care using QI methodology.

Population of Focus: The target audience includes healthcare professionals working in pediatric primary care, professionals in community health and social services, and individuals and organizations involved in quality improvement initiatives within healthcare settings. The findings and recommendations are most relevant to practitioners, researchers, and policymakers seeking to improve social needs screening and referral programs, especially in underserved communities.

Sample Size: An explicit sample size was not stated. The study reported that 3100 patients were screened for social needs in the baseline period and 6278 in the intervention period, and that 527 patients (8.4%) were referred to a Community Health Worker (CHW) during the intervention period. The intervention-period findings are based on the outcomes observed among these referred patients.

Age Range: The study "Increasing Warm Handoffs: Optimizing Community Based Referrals in Primary Care Using QI Methodology" does not state a specific age range. Given the pediatric primary care setting and the focus on families' social needs, participants were likely children and their caregivers or family members.

Access Abstract

Wissel, B. D., Greiner, H. M., Glauser, T. A., Mangano, F. T., Holland-Bouley, K. D., Zhang, N., Szczesniak, R. D., Santel, D., Pestian, J. P., & Dexheimer, J. W. (2023). Automated, machine learning-based alerts increase epilepsy surgery referrals: A randomized controlled trial. Epilepsia, 64(7), 1791–1799. https://doi.org/10.1111/epi.17629

Evidence Rating: Emerging

Intervention Components (click on component to see a list of all articles that use that intervention): Screening Tool Implementation, Office Systems Assessments and Implementation Training

Intervention Description: The intervention in the study involved the use of a natural language processing (NLP)-based clinical decision support system embedded in the electronic health record (EHR) to identify potential surgical candidates among children with epilepsy. Patients identified as potential surgical candidates by the NLP were then randomized for their provider to receive an alert or no reminder prior to the patient's visit. The alerts were delivered through two modalities: half of the alerts were sent via email, and the other half were in-basket messages that appeared in the EHR. The primary aim of the intervention was to assess whether these automated alerts increased referrals for epilepsy surgery evaluations.

Intervention Results: Between April 2017 and April 2019, a total of 4858 children were screened by the system, and 284 (5.8%) were identified as potential surgical candidates. Two hundred four patients received an alert, and 96 patients received standard care. Median follow-up time was 24 months (range: 12-36 months). Patients whose provider received an alert were more likely to be referred for a presurgical evaluation than those in the control group (9.8% vs 3.1%; adjusted hazard ratio [HR] = 3.21, 95% confidence interval [CI]: 0.95-10.8; one-sided p = .03). Nine patients (4.4%) in the alert group underwent epilepsy surgery, compared to none (0%) in the control group (one-sided p = .03).
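As an illustrative check (not the authors' analysis, which used a Cox proportional hazards model for the referral outcome), the surgery comparison above (9 of 204 alert-group patients vs 0 of 96 standard-care patients) can be reproduced with a one-sided Fisher exact test in pure Python:

```python
from math import comb

def fisher_one_sided_greater(a, b, c, d):
    """One-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]:
    probability of observing >= a events in the first group under the
    hypergeometric null of no group difference."""
    n = a + b + c + d          # total patients (300)
    events = a + c             # total surgeries (9)
    group1 = a + b             # alert-group size (204)
    p = 0.0
    for k in range(a, min(events, group1) + 1):
        p += comb(events, k) * comb(n - events, group1 - k) / comb(n, group1)
    return p

# Alert group: 9 surgeries among 204; control group: 0 among 96
p = fisher_one_sided_greater(9, 195, 0, 96)
print(round(p, 3))  # ≈ 0.029, consistent with the reported one-sided p = .03
```

This back-of-the-envelope test agrees with the reported one-sided p = .03 for the surgery outcome; the adjusted hazard ratio for referrals cannot be recovered this way, since it comes from a time-to-event model.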

Conclusion: The study reported statistically significant effects of automated alerts on referral patterns for epilepsy surgery evaluations. Providers were more likely to refer patients with epilepsy for a presurgical evaluation after receiving an automated alert. The likelihood of referral after an alert was estimated with a Cox proportional hazards model (hazard ratio [HR], with the p-value from Wald's test), and the alert group had significantly higher proportions of patients referred for presurgical evaluations and undergoing surgery than the control group that received no alerts.

Study Design: The study was a prospective, randomized controlled trial evaluating whether a natural language processing (NLP)-based clinical decision support system embedded in the electronic health record (EHR) increases referrals for epilepsy surgery evaluations. Potential surgical candidates were randomly assigned for their provider to receive either an automated alert or standard of care (no alert) prior to the scheduled visit. The primary outcome was referral for a neurosurgical evaluation, with the likelihood of referral estimated using a Cox proportional hazards regression model. The study was conducted over a 2-year period, from April 16, 2017, to April 15, 2019.

Setting: The study was conducted at a large pediatric epilepsy center in Cincinnati, OH, USA, specifically at the Cincinnati Children's Hospital Medical Center (CCHMC). The providers involved in the study were attending neurologists and nurse practitioners from this center. The research was carried out at 14 pediatric neurology outpatient clinic sites affiliated with the hospital.

Population of Focus: The target audience includes healthcare providers, particularly neurologists and nurse practitioners caring for children with epilepsy. The study is also relevant to researchers in medical informatics, natural language processing, and clinical decision support systems, and to healthcare administrators and policymakers interested in increasing referrals for epilepsy surgery evaluations and in implementing technology-based interventions in clinical practice.

Sample Size: The study included 284 children with epilepsy who were identified as potential surgical candidates by the natural language processing (NLP) algorithm and randomized 2:1 for their provider to receive an alert or standard of care (no alert): 96 patients in the control group, 93 whose treating provider received an email alert, and 95 whose treating provider received an EHR alert. The study was conducted over a 2-year period, from April 16, 2017, to April 15, 2019.

Age Range: A specific age range is not reported in the article. The patients were children with epilepsy being treated at Cincinnati Children's Hospital Medical Center.

Access Abstract

The MCH Digital Library is one of six special collections at Georgetown University, the nation's oldest Jesuit institution of higher education. It is supported in part by the Health Resources and Services Administration (HRSA) of the U.S. Department of Health and Human Services (HHS) under award number U02MC31613, MCH Advanced Education Policy, with an award of $700,000/year. The library is also supported through foundation and university funding. This information or content and conclusions are those of the author and should not be construed as the official position or policy of, nor should any endorsements be inferred by, HRSA, HHS, or the U.S. Government.