This best practices resource provides school and district leaders with guidance on what data to collect, as well as how to collect, analyze, and interpret those data to gain insight into the effectiveness of their summer programs.
Academic summer learning programs can lead to improved student achievement, but effectiveness is not guaranteed. Certain program elements, such as duration, attendance, use of time, and quality of instruction, appear to be key factors in programs that show stronger academic benefits. Yet as helpful as evidence-based design principles are, using research to build programs that maximize academic growth and social-emotional outcomes is a complicated process. Even once researchers have identified likely drivers of effectiveness across programs, these design principles often cannot be easily replicated across districts with varying contexts and goals. This series of program profiles offers a glimpse into how different districts in Rhode Island and Tennessee created programs that align with design principles in some respects and diverge in others. These are not meant to be overviews of perfect programs, but rather real-world examples of what developing a research-aligned program looks like in practice. The intent is to highlight the choices and trade-offs involved, giving readers a clearer sense of how districts balanced research recommendations, program goals, and local priorities.
District staff face difficult trade-offs as they balance local needs against the eight design principles described in the research brief. Limits on district capacity and personnel, as well as families' demands for summer flexibility, can directly conflict with calls for greater academic rigor or longer program length. While research should always guide decision-making, it must be weighed against local values. Brighouse et al. argue that using evidence to make effective decisions requires value judgments, both in evaluating the evidence and in determining which evidence matters most. Further, implementing processes for collecting data is critical to understanding program effectiveness and informing subsequent decisions about program improvement. This case study details how one district – Woonsocket, Rhode Island – chose to navigate these trade-offs and how those choices played out in program design. Three foundational values emerge from Woonsocket's programmatic choices: the district empowered site-based leadership, prioritized program personnel, and designed for student personalization. These values guided decision-making and helped the district build a strong program that aims to balance research recommendations, program goals, and local priorities.