Prof. Suzanne Kerns discusses her new book on selecting and implementing evidence-based practice
In December 2019, Graduate School of Social Work (GSSW) Research Associate Professor Suzanne Kerns will conclude a two-year term as co-chair of the 5th Biennial Society for Implementation Research Collaboration (SIRC) conference. Kerns and co-chair Bob Franks, president and CEO of Harvard University’s Judge Baker Children’s Center, planned the September 2019 conference, which facilitated communication and collaboration among implementation research teams, researchers and community providers. Abstract submissions more than doubled for 2019, and more than 425 people attended from around the globe.
In the following Q&A interview, Kerns discusses additional accomplishments from the past year, including publication of a new book and highlights from her work as executive director of the Center for Effective Interventions.
You have a new book out: Selecting and Implementing Evidence-Based Practice: A Practical Program Guide (Springer, 2019), co-authored with Rosalyn Bertram at the University of Missouri-Kansas City School of Social Work. What does the book cover?
The book touches on a number of important topics related to implementation of evidence-based interventions and can be combined with other resources and articles in a course. We explore definitional issues related to evidence-based practice, including directly addressing some pervasive myths in the field — for example, that evidence-based practices devalue client choice and voice. We also explore how academic curricula may or may not provide needed competencies to deliver evidence-based practice and then explain how implementation science can be leveraged to expand the reach of such effective practices.
Implementing new practices is hard. It requires a number of diverse competencies, including strategically selecting practices to meet specified needs; training and supporting the workforce; developing agency and organizational strategies to embed practices; monitoring treatment fidelity and outcomes; and building meaningful feedback loops to enhance practice and evaluate any needed adaptations. This book starts to unpack these competencies.
What are some recent examples of training and workforce development work the Center for Effective Interventions (CEI) is doing?
CEI trains therapists from across the country in Multisystemic Therapy (MST) but works predominantly in the Rocky Mountain West. We support 11 different agencies and 22 different MST teams in Colorado, New Mexico, Texas, Arizona and Washington, working with agencies on how to deliver MST with fidelity to the model, with the aim of achieving positive outcomes for the youth who participate. CEI staff provide weekly clinical consultation to therapists and supervisors for each team and quarterly on-site booster sessions, and we track data and outcomes for all of our teams. In a typical year, we support delivery of MST to between 525 and 625 families.
Multisystemic Therapy helps youth involved in or at risk of involvement with the juvenile justice system.
In the past year, we partnered with the governor’s office to expand Multisystemic Therapy in underserved regions of Colorado using a Pay for Success financing mechanism. To my knowledge, this was the first Pay for Success initiative to be hosted at a university. To date, we have launched two out of three cohorts of new MST teams; each cohort has two different sites. Due to this project, we now have expanded capacity in Weld, Adams, Pueblo and Mesa counties, and we plan to add two additional sites in July 2020.
During 2019, CEI staff conducted five training events for approximately 70 new MST therapists and supervisors in Colorado and Arizona, with one more planned for December.
What role do community partnerships play in CEI’s work?
Community partnerships are the essence of CEI’s work. As an intermediary organization for MST, CEI provides localized, tailored support for implementation. We’ve been actively developing relationships with the new communities that are part of the Pay for Success initiative, and all of the new teams are doing an incredible job serving their communities.

For example, our first cohort of two new teams included one in Pueblo, where we started with a series of community stakeholder meetings in which we established that there was sufficient need to justify building a new team in the area. We met with many community members to ensure that MST would be a good fit with the area’s needs. When Health Solutions was suggested as the provider agency to house the new program, we had several meetings directly with their staff to ensure that MST implementation would be feasible within the agency and sustainable long term. Once these assurances were in place, we collaborated with Health Solutions in their hiring process, including providing perspectives on applicants to ensure good fit with the program.

Because MST was mostly new to the Pueblo area, we worked with the stakeholder group and agency to identify referral sources and referral pathways. As the broader community became aware of the program, Health Solutions staff — with support from CEI — provided education about appropriate referrals and conducted outreach efforts to potential referral agencies. After about four months, the Health Solutions team was fully up and running! They are now functioning at full capacity, serving families and delivering highly adherent MST.
Are there any other new projects you’re particularly excited about?
This past year, I contracted with Abt Associates to be the principal investigator for the Prevention Services Clearinghouse. I’m also really excited about a collaborative effort being undertaken by the Child and Family Evidence-Based Practices Consortium in collaboration with the California Evidence-Based Clearinghouse for Child Welfare. With colleagues Jennifer Rolls-Reutz, Cricket Mitchel and GSSW PhD student Jennifer Sedivy, I am documenting the “implementation gap” by looking at existing evidence-based interventions and the extent to which program developers incorporate best practices for implementation. Our first study has been accepted for publication and should be in press in the next month or two. That study looked at whether programs had fidelity support measures, and if they did, how they approached fidelity measurement. A second study examines the existence of pre-implementation supports: the types of resources program developers have that help agencies to plan for adoption of a new program.