Topics: Evaluation, Intervention development, Measures and outcomes, Study design
Recorded at ISHCA 21, this session explored how to balance patient and clinical outcomes with implementation outcomes in implementation science protocols. Presenters reflected on the factors that influence outcome choice, including team input, funding opportunities, stakeholder receptivity, and measurement availability.
Developing implementation research capacity: longitudinal evaluation of the King’s College London Implementation Science Masterclass, 2014–2019
This paper describes the Implementation Science Masterclass developed and delivered by King's College London and an international faculty of implementation experts. It presents delegates' quantitative and qualitative evaluations (gathered through a survey at the end of the Masterclass) and faculty reflections over the period the Masterclass has been running (2014–2019).
Evaluation of Systems-Oriented Public Health Interventions: Alternative Research Designs
This article explores the key features and complexities of a range of research designs that can serve as alternatives to the individual randomized controlled trial (RCT). These designs include the cluster RCT, stepped wedge design, interrupted time series, multiple baseline, and controlled pre–post designs.
This guide provides a set of prompts to help identify the steps you need to take to a) demonstrate that a health service, program or policy innovation works, b) understand the conditions under which it was successful (or unsuccessful), and c) where appropriate, identify how to scale up an innovation for greatest impact.