Design high-impact professional learning programs with results-based evaluations

With increasing accountability pressure for evidence-based strategies and ever-tightening budgets, you want to make sure that the time, effort, and resources you are investing in your professional learning programs are truly making an impact on educator effectiveness and student achievement. In this third edition of Assessing Impact, Joellen Killion guides administrators, professional learning leaders, school improvement teams, and evaluators step by step through the rigors of producing an effective, in-depth, results-based analysis of your professional learning programs. A recognized expert in professional learning, Killion emphasizes the critical role of evaluation in bolstering effectiveness and retaining stakeholder support for ongoing educator development.

The methods outlined here help you:

• Adhere to changes in federal and state policy relating to professional learning and educator development
• Facilitate the use of extensive datasets crucial for measuring the feasibility, equity, sustainability, and impact of professional learning
• Produce more powerful, data-driven professional learning programs that benefit both students and educators
• Evaluate the effectiveness and impact of professional learning to make data-informed decisions and increase quality and results

Assessing Impact is a vital resource for staff developers and educational leaders seeking to improve the effectiveness and sustainability of professional learning while retaining the support of parents and the public alike.

Praise for the Second Edition:

“Anyone who reads this book has to feel obligated to ‘set their world on fire.’ The text not only forces the reader to see how we are failing our children and their teachers, it provides the means for each of us to do better.”

—Michael J. Ford, Superintendent, Phelps-Clifton Springs CSD, Clifton Springs, NY
Chapter 7: Collect Data
In this stage, the evaluator gathers data as planned while remaining alert to adjustments that may be needed to ensure the data required to answer the evaluation questions are collected. Adjustments might include collecting more or less data, changing the data-collection method, or shifting when data are collected. If the evaluation framework was thoughtfully planned, such changes will be limited; yet even with the best plans, evaluators may find reasons to make them. For example, if interviews with program participants prove too difficult to schedule, or too few participants agree to be interviewed, the evaluator may opt to use focus groups or a survey, either to supplement the data collected from the completed interviews or to replace the interviews entirely. ...