How to Use Value-Added Analysis to Improve Student Learning: A Field Guide for School and District Leaders

Kate Kennedy, Mary Peters & Mike Thomas

    Foreword

    This helpful book comes at an important time for teachers and school administrators. While value-added analysis has been around for more than two decades, value-added assessment is only now exploding into common use. Moreover, there are many different forms of analysis using the moniker value-added that vary widely in the assessments they use, the analytical tools applied, and the manner in which they are used—and not used—for the evaluation of teachers and educational administrators. Even the best assessment and evaluation system is useless if teachers and school leaders do not use it to improve practice, and a commitment to practical utility at the classroom and school level is at the heart of this book.

    Breaking Through the Maze of Complexity

    The development of student assessments that are reliable and valid requires the professional energies of test designers and psychometricians. Conducting value-added analysis can include mind-numbingly complex mathematics, such as variance-covariance matrices. But teachers and administrators need not become psychometricians or statisticians in order to apply the results of value-added analysis to the classroom. Just as we do not need to design and build the computers we use on a daily basis in order to apply them to our professional responsibilities, we do not need to design the analytical framework of assessments in order to use them.

    From Blame to Analysis

    While value-added analysis has been widely reported to demonstrate the profound impact that teaching quality has on student results, it has often been stuck in a tautology: The way to have higher scores is to be a better teacher; the way to be a better teacher is to have higher scores. But little has been established within the value-added inquiry to identify specifically what practices differentiate the top teachers from the others. Indeed, when the research is stuck in percentile analysis, there will mathematically always be only 20% that are in the top quintile, and it is statistically impossible for any more than a fifth of teachers to achieve that distinction. It is an unremarkable statement to contend that 49.9% of all teachers are below average in any place other than Lake Wobegon. Unfortunately, this sort of analysis, while perhaps interesting from a research standpoint, is unlikely to motivate any group of teachers and administrators I know. That is why Kennedy, Peters, and Thomas's work is so important. By placing the lens not merely on the data but on specific professional practices associated with gains in student performance, the entire conversation shifts from blame to analysis, from defensiveness to professional learning. The extensive quotations from teachers and administrators, including those with very strong unions, provide testimony to the fact that it is possible to use value-added analysis in a way that does not undermine respect for teachers or threaten a barrage of grievances and lawsuits if this tool is used correctly—to improve teaching and leadership. Importantly, the analysis includes all elements of the system, including factors at the classroom, building, and central office level that contribute to adding value for students. While teaching quality remains exceptionally significant, factors throughout the system can either nurture or undermine effective teaching, and this book wisely considers the entire system and its impact on teaching and learning.

    From Micromanagement to Inquiry

    Chapters 7 and 8 are particularly worthy of deep study and reflection. The strongest ideas this book contains are in the blueprint for a deliberate process of inquiry about the causes of learning and the nature of effective instruction. This is not an instructional system in which an external expert says, “I'm the doctor—take two of these and call me in the morning.” Rather, the agendas, tools, and action steps guide staff members through a process of inquiry and discovery. That may strike some readers as tedious and frustrating. “Just tell us what to do!” they will shout in exasperation. But billions of dollars and decades of learning opportunities have been squandered by policymakers who insisted on giving teachers various recipes without giving them the time to understand what they were doing and why they were doing it. That is why implementation levels for even promising instructional interventions are low and the shelf life of most instructional innovations is typically only a few years. Years of initiative fatigue have led to a frantic game of Whack-A-Mole, in which policymakers simultaneously strike at every problem that raises its head, without ever dealing with any of them successfully. As Kennedy, Peters, and Thomas write, “Unfortunately, the result of trying to improve everything is that nothing much improves” (see page 48). When we shift the strategy from frantic to focused, teachers are able to focus on specific student needs and identifiable instructional practices, monitoring results and interventions in a clear and understandable manner.

    Accountability as a Learning System: Five Warnings for Educational Leaders

    As good as Kennedy, Peters, and Thomas's work is, I would not be doing my duty if I did not express some of the same concerns that the authors candidly acknowledge. Specifically, I offer five warnings for educational leaders and policymakers who intend to use value-added analysis. First, systems without professional learning are worse than valueless. Value-added systems will be and have been undermined when they have been used inappropriately. When teachers and administrators are drowning in data they do not understand, they will not use it to make better decisions. If teachers neither trust nor understand the assessment data, then a sense of despair and impotence creeps into a school that can take years to repair. In cash-strapped school systems and states, there is a temptation to let the federal government fund value-added assessment systems but deny teachers the time and resources to use those systems wisely. Then, inevitably, when the money runs out, critics will say, "See, we told you that spending more money on education was a waste." Buying a space shuttle without training the astronauts is an ill-advised strategy. Moreover, training must be continuous. A single orientation seminar in value-added techniques is not enough. Rather, data analysis must become "the way we do business," a habit, not just another new initiative.

    Second, accountability can be either a learning system or an instrument of brute force, but not both. Kennedy, Peters, and Thomas wisely suggest a focus on root causes and instructional quality. I might suggest that policymakers change the threat of "We're going to hold teachers accountable for their data!" to a more nuanced "We're going to hold teachers, administrators, and policymakers accountable for their response to the data." I've worked throughout the world with teachers and administrators and have never met one who wanted to be unsuccessful, but I have met many who did not know how to become successful. They will not acquire the knowledge needed for success through threats.

    Third, the best value-added analysis will be undermined by tests that are not adequate for the task. For example, the vast majority of state assessments have items associated only with the standards for a single grade level, but the very nature of value-added analysis is that it helps teachers understand how students show growth when they are below, at, and above grade level. In order to fully realize the potential of value-added analysis, therefore, tests must be more frequent and include a broader range of items than is presently the case. Until states and districts fix this serious problem, schools and districts will need other measures of students, particularly at high levels of performance, to avoid the patently ridiculous situation in which a teacher whose students all meet state standards and score very well on state tests is labeled as ineffective because there was no more room on the scale for those students to improve.

    Fourth, the essence of value-added analysis relies upon making same-student comparisons. That is a far more reasonable assumption in stable suburban communities than in highly transient urban schools. Moreover, high poverty schools are also more likely to have higher rates of transiency as families move due to a range of factors including rent incentives, family disruption, and unemployment. It would be a horrible unintended consequence of value-added analysis for teachers to have an economic incentive to give more instructional attention to stable students than transient students. While some value-added systems address this challenge by using reasonable estimates for missing test scores, the process makes a coherent root cause analysis very difficult for teachers.

    Fifth and finally, leaders must take care not to have value-added analysis supplant rational judgment. The daily work of teaching, assessing, and providing feedback to students remains at the heart of effective instruction. This daily work—planning and delivering effective instruction, intervening to meet the learning needs of students, and providing appropriate challenges and remediation where necessary—must also be the focus of educational leaders at every level. Value-added analysis can, as the authors suggest, inform the daily work in a powerful way. But the most sophisticated analysis will never replace the necessity for implementing and monitoring effective instructional practices.

    Douglas B. Reeves
    Nahant, Massachusetts
    Note: Dr. Reeves is the Founder of The Leadership and Learning Center (http://www.LeadandLearn.com). He is the author of more than thirty books on leadership and organizational effectiveness.

    Preface

    In the United States, we are in the midst of a painful transition from a manufacturing-based economy to a knowledge-based economy. Most of us know full well that the knowledge, skills, and dispositions that were required for the industrial era are different from those that are required by the new economy, but few of us seem ready to actually learn and do something different. The institution of education sits squarely in the middle of the storm. Our citizenry needs to be retooled, but the people who are largely responsible for this—educators—are also trying to figure out how to retool themselves. Given this conundrum, the most pressing questions for educators today may be, How do we get started?, How do we focus attention on improvement?, and How do we change the face of education so that the face of our economy can also change? These are complicated questions that require thoughtful, practical answers.

    One potential answer comes in the form of a relatively recent innovation in education: value-added analysis. By conducting a robust statistical analysis of longitudinal test data, it is now possible to reliably estimate the contribution of a district, a school, or even individual teachers to the academic gains of students. For the first time, many educators have a reasonably reliable measure of productivity. This sounds like a small thing, but it is not. Because productivity measures connect processes to outcomes, they are essential for any kind of deliberate, systematic improvement. This is why value-added analysis is one of the best levers available to help educators move from where they are to where they want and need to be. But what is value-added analysis? How can it be used to influence educational improvement? How can it help educators to build on their strengths and address their areas of weakness? How can value-added analysis be used to help more teachers help more students?
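    To make the idea concrete, here is a deliberately simplified sketch of the logic behind a value-added estimate: compare each student's observed score with an expected score predicted from prior achievement, then average those differences for each teacher. This is an illustration only, not the statistical model used by any particular value-added system; the teacher names, scores, and flat two-point growth expectation are all hypothetical.

    ```python
    # A toy illustration of the core idea behind value-added analysis:
    # observed score minus expected score, averaged by teacher.
    # Real value-added models are far more sophisticated (mixed models,
    # shrinkage, standard errors); this sketch only makes the concept concrete.
    from statistics import mean

    # Hypothetical longitudinal records: (teacher, prior-year score, current-year score)
    records = [
        ("Teacher A", 48, 55), ("Teacher A", 60, 63), ("Teacher A", 35, 45),
        ("Teacher B", 52, 50), ("Teacher B", 70, 71), ("Teacher B", 41, 40),
    ]

    def expected_score(prior):
        """Toy growth expectation: every student is predicted to gain 2 points."""
        return prior + 2

    gains_by_teacher = {}
    for teacher, prior, current in records:
        # Each student's contribution: observed growth relative to expectation
        gains_by_teacher.setdefault(teacher, []).append(current - expected_score(prior))

    for teacher, gains in gains_by_teacher.items():
        print(f"{teacher}: average gain above expectation = {mean(gains):+.1f}")
    ```

    Run as written, the sketch would report the hypothetical Teacher A well above the assumed expectation and Teacher B slightly below it, which is the kind of productivity signal the chapters that follow teach you to interpret and act on.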

    This book has been written to answer these and other essential questions about value-added information. This is not a book about testing or statistical modeling. It is also not a book about whether value-added information should be used for teacher evaluation or for differentiated compensation. Instead this book is about improving student learning. It is a book written for educators and for those who work with educators to improve the quality of education that K–12 students so desperately need and deserve.

    How This Book Is Unique

    We wrote this book because, at this time, there is nothing like it available to educators. Much of the current discourse about value-added analysis is about the policy implications, the methodological concerns, or the evaluative uses of this tool. What has remained largely unexamined is the power of value-added information to inform and shape educational improvement. We believe in educators and in their capacity to transform themselves. We also believe that value-added information is the right tool to start the crucial conversations to make this happen.

    This book gives us the opportunity to address the all too common gap between having value-added information and using value-added information (McCaffrey & Hamilton, 2007). Perhaps we can diminish the knowing-doing gap by first bringing clarity to what value-added analysis is and then by providing concrete guidance for using it, grounded in the real-life stories of educators who are actually doing this work. We used this approach because stories tend to be powerful educational tools (Rossiter, 2002).

    Further, our intention is to help educators think about what they can do to improve the focus and the quality of all the learning that goes on in their building. In our work with educators across the country, we have experienced firsthand how value-added information can be used to improve both the breadth and the depth of student learning. We have seen educators transform their practice when they have had the opportunity to access, comprehend, and respond to their value-added information. We have seen schools turn their results upside down when leaders have instituted the mindset and the structures necessary to carefully examine and act on data. We have seen school districts boost their performance to unprecedented levels when value-added information has been thoughtfully and systematically inserted into the school improvement process. And in many places, student engagement increases, office referrals go down, and teacher self-efficacy skyrockets.

    These stories are worth telling. They provide the impetus that most of us need to try something new and different. In this book you will meet Katie and Heather, teachers who used their value-added information to enrich the learning of all their students. You will be introduced to Kimi, a teacher leader whose school dramatically improved its results by carefully analyzing and responding to its value-added information. You will also hear about Tina and Bobby, principals who led their schools to remarkable levels of improvement.

    Finally, and perhaps most important, we wrote this book to bring a fresh perspective to educational improvement. Improvement should not be an activity reserved solely for those who have poor performance evaluations. It should also not be an activity owned and led by outside experts. Instead, educational improvement must become a routine activity that defines what it means to be an educator. The good news is that ongoing educational improvement is not any more difficult than improvement in any other area of your life. It is about straightforward processes and continuous, disciplined conversation and experimentation. It is about liberating and sharing the knowledge and the know-how that already exist in most schools. The value of data, and especially value-added data, is that it can help you identify and take advantage of the heretofore-undiscovered islands of excellence.

    Whether value-added analysis is new to you or you are interested in putting your value-added information to better use, this book will help you think about and put in place the things you need to improve.

    Overview of the Book

    To provide a clear pathway for readers, we have organized this book around a common-sense, five-step improvement cycle shown in Figure P.1. Based on years of working with teachers and principals, we designed this cycle to be simple enough to provide meaningful support, yet complex enough to capture the real issues associated with improvement at the classroom, the building, and the district levels. The iterative process depicted below is not unlike other improvement cycles that have been employed in the education and business fields for years. Here we elaborate on the classic Plan-Do-Study-Act cycle that W. Edwards Deming put forth beginning in the 1950s. The Value-Added Improvement Cycle consists of the following:

    Step I: Jump Into Value-Added

    Step II: Assess Results to Determine Strengths and Challenges

    Step III: Identify Root Causes

    Step IV: Produce an Improvement Plan

    Step V: Take Action, Monitor, and Adjust

    Figure P.1 Value-Added Improvement Cycle
    Step I: Jump Into Value-Added

    The first step of the cycle, Jump Into Value-Added, is covered in the first three chapters of the book. There is no magic formula for how to get started other than making a commitment to fully delve into it with purpose and zeal.

    In Chapter 1, we begin by defining value-added analysis and differentiating it from achievement data. Making this distinction is critical to your understanding of how to interpret value-added results. It is also important that you come to appreciate the summative assessment role that value-added analysis plays in a balanced assessment system. This chapter also addresses some of the implications of value-added analysis and points to some significant research findings that have emerged from this metric.

    Chapter 2 focuses on how to develop the conditions for success with value-added analysis, including how to create a data-driven culture, provide effective professional development, and gain access to value-added reports. In this chapter you'll also learn how some of our school partners have jumped into their value-added reports. This chapter includes practical tools to help you and your colleagues hit the ground running with value-added analysis.

    In Chapter 3, the focus is on systemic educational improvement and how value-added analysis provides a unique starting point for improvement at the district, school, and classroom levels. We introduce you to our educational improvement framework that we call BFK·Focus. This framework takes the form of a multilevel nested funnel designed to take educators through a data-based goal-setting process. Through the course of this book, you will systematically advance through each of the three stages of the funnel at each level of your organization.

    Step II: Assess Results to Determine Strengths and Challenges

    Chapters 4, 5, and 6 engage you in the second step of the cycle—Assess Results to Determine Strengths and Challenges. In these chapters you will learn how to read district-, school-, and classroom-level value-added reports; produce a matrix to assess the results of those reports; and analyze disaggregated data to determine the strengths and challenges of your district, schools, and classrooms.

    Steps III and IV: Identify Root Causes and Produce an Improvement Plan

    The third and fourth steps of the value-added improvement cycle—Identify Root Causes and Produce an Improvement Plan—are presented in Chapter 7. The primary purpose of this chapter is to describe a process for probing the root causes of your system's core strengths and most pressing challenges. Our intention is to lead you through a guided root cause analysis that results in a useful improvement plan.

    Step V: Take Action, Monitor, and Adjust

    Chapter 8 brings the value-added improvement cycle full circle. In this step, Take Action, Monitor, and Adjust, you will learn how to implement your improvement plan by acting on value-added information; monitor your implementation and make adjustments where needed; evaluate the success of your improvement plan; and then begin anew with fresh data. Along the way, we will share stories of teachers, schools, and districts that have moved from data to analysis to planning to action. These stories make clear that improvement is, in fact, a consequence of thoughtful leadership, systematic action, and continuous monitoring and adjustment.

    Special Features

    This book offers a unique, five-step, implementable approach to value-added analysis that will ensure a solid, robust plan based on your own specific strengths and challenges and inspired by the root causes of your own singular set of issues. Embedded in this approach is guidance on how to produce an improvement plan and then implement it, monitor, and adjust it as needed over time. Each chapter describes how to use the steps in the spirit of school improvement.

    To support you in your implementation of the five-step process, we have included useful features in each chapter: case studies, real-life examples, end-of-chapter action steps, and reflective questions. The action steps and reflection questions will lead educators and professional learning communities in discussions about how to incorporate successful data analysis practices into their schools and classrooms.

    A hands-on resource guide at the end of most chapters includes samples, protocols, and other tools to accompany the action steps for using value-added analysis. These are short, one-page pieces that can be reproduced and used by teachers and leaders.

    Suggestions for Using This Book

    The learning-doing gap is a formidable obstacle that has stood in the way of many well-intentioned improvement initiatives. Perhaps this is because we have historically spent more time on learning about and not as much time on learning how to do. Learning how to do is best when we can learn from and with others. Children do not learn language without others who encourage their speech and shape their verbalizations through modeling. Likewise, it is easier to commit to weight loss by engaging in a program designed to teach us new behaviors and hold us accountable for measurable results. In that vein, we encourage you to engage a professional learning community to read and discuss this book with you. Together, the members of your community can not only learn about value-added information but also make plans to put their learning into action. We recommend that the group read one chapter each month. Then, as a team, work through the discussion and reflection questions and action steps at the end of the chapters.

    Putting Value-Added Reporting into Context

    By now you may be wondering where value-added analysis is available and whether you have access to this information. There are many places around the United States where value-added analyses are being produced, but districts that currently use value-added analysis to link student progress to classroom teachers are still ahead of the curve.

    At this writing, sixteen states have value-added information available, and as a result of the Race to the Top (RttT) competition, many other districts are interested in the possibilities that value-added analysis offers. States such as North Carolina, Ohio, Tennessee, and Pennsylvania provide value-added reports statewide. The System for Teacher and Student Advancement (TAP) partners with schools in fourteen states and Washington, D.C., and uses value-added analysis to support an approach to school improvement that involves multiple career paths, ongoing professional development, instructionally focused accountability, and performance-based compensation.

    Several large cities also produce value-added reports. San Diego Unified School District and Eagle County Schools, Colorado, produce annual measures of student growth. The New York City Department of Education issues teacher data reports to show teachers how effective they are compared to other similar teachers across the district. Recently, the Los Angeles Unified School District began issuing value-added measures of teacher effects. Texas school districts—including the Houston Independent School District, Fort Worth Independent School District, and Longview School District—use value-added analysis to determine which teachers are eligible for additional compensation. Several recipients of federal Teacher Incentive Fund (TIF) awards also rely on value-added estimates to inform teacher award models.

    Reform Initiatives

    In a 2009 address to the National Conference of State Legislatures, Bill Gates, chairman of the Microsoft Corporation, shared his foundation's interest in identifying highly effective teachers. He observed that "when you see the power of the top quartile teachers, you naturally think: We should identify those teachers. We should reward them. We should retain them. We should make sure other teachers learn from them."

    Gates's statement signals the high priority that the Bill and Melinda Gates Foundation has given to identifying, rewarding, retaining, and sharing the lessons of the most effective teachers. Its interests run parallel to reforms encouraged by the Obama administration. In 2009, President Barack Obama and Secretary of Education Arne Duncan announced the $4.3 billion RttT educational innovation fund. In order to compete for the funding, states needed to accelerate educational innovation and embrace bold improvement efforts. The RttT winners of 2010 include Delaware, the District of Columbia, Florida, Georgia, Hawaii, Maryland, Massachusetts, New York, North Carolina, Ohio, Rhode Island, and Tennessee. A key component involves teacher effectiveness and evaluation reform and, as such, connects directly to value-added analysis. RttT states are adopting student growth measures as one component of a multiple-measure evaluation design. Value-added analysis is a growth measure that links teacher practice to student growth and can potentially be used to identify and reward effective teachers and schools, as well as to inform teachers and principals on how they can improve their practices.

    To give some idea of how value-added analysis has been put into action, we can look to the Benwood schools of Chattanooga, Tennessee. Once considered the worst in the state, Benwood, a collection of elementary schools, has achieved well-documented success by using professional development and strategic teacher placement and retention strategies to turn itself around. "Benwood schools went from 53 percent of their 3rd graders scoring at the advanced or proficient level in reading on the Tennessee Comprehensive Assessment Program to 80% scoring at that level in 2007" (Haycock & Crawford, 2008).

    What did they do to improve? Benwood principals and teachers began to routinely review value-added reports to determine areas of strength and challenge. Teachers observed and sought guidance from teachers who were strong in particular areas based on value-added results. The highest-performing teachers were recruited to teach the lowest-performing reading students in a privately funded after-school program. Those teachers and principals whose students grew more than expected received a monetary bonus. Throughout this book we provide other examples of how value-added information has been a centerpiece of school improvement. Your path to increased student achievement and overall school improvement can start right now as you begin to tailor these five steps to meet the needs of your own school setting.

    Acknowledgments

    First and foremost, we wish to thank the thousands of educators we've worked with over the years who have informed our work. There are many teachers and leaders we have talked with during the course of writing this book; a few we name, but many we do not. Thank you for everything you do to ensure students are learning and growing.

    Jim Mahoney, Battelle for Kids' executive director, has been instrumental in setting forth a vision for using value-added analysis for school improvement purposes, and without his leadership, this book could not have been written. Every day we are privileged to work with many smart, mission-driven people who have been unwaveringly supportive of this endeavor. Special thanks to Mary Schultz, Joyce Ellis, Leanne Siegenthaler, Ania Striker, Rick Studer, Barb Leeper, Diane Stultz, Sandy Shedenhelm, Julianne Nichols, Leslie Damron, and Todd Hellman. Debbie Stollenwork, Desiree Bartlett, and Kim Greenberg at Corwin have been a delight to work with; thank you for making a momentous task as easy as it could be! We thank Ernie Morgan at the Value-Added Research Center for his insights and help connecting us to New York City and Wisconsin value-added information.

    This book relies on the hardworking teachers and leaders who took time out of their busy schedules to meet with us and share their stories. They are the real stars of this book. In no particular order, we thank Kimi Dodds, Tina Thomas-Manning, Bobby Moore, Heather Dzikiy, Susanne Lintz, Matthew Lutz, Melissa Krempansky, Mark Abrahamson, Amanda Garner, Renee Faenza, Lauren Collier, Alesha Quick, Revonda Johnson, Francis Rogers, Terri Stahl, Ned Kerstetter, Susanne King, Elisa Luna, Anne Lefler, Adam Withycombe, Maureen Tiller, Robert Tosh Corley, Nancy Shealy, Brenda Romines, Elizabeth Shindledecker, Jessica Cynkar, and Susie Bailey.

    Publisher's Acknowledgments

    Corwin would like to thank the following individuals for taking the time to provide their editorial insight and guidance:

    • Sherry L. Annee, Biotechnology Instructor, Brebeuf Jesuit Preparatory School, Indianapolis, IN
    • Dalane E. Bouillion, Associate Superintendent for Curriculum and Instructional Services, Spring ISD, Houston, TX
    • Barbara Smith Chalou, Professor, University of Maine at Presque Isle, Presque Isle, ME
    • Catherine Duffy, English Department Chairperson, Three Village Central School District, Stony Brook, NY
    • Kathy J. Grover, Assistant Superintendent, Clever Public Schools, Clever, MO
    • Martin J. Hudacs, Superintendent, Solanco School District, Quarryville, PA
    • Glen Ishiwata, Superintendent, Moreland School District, San Jose, CA
    • Dee Martindale, K–8 STEM Coordinator, Reynoldsburg City Schools, Reynoldsburg, OH
    • Patti Palmer, Sixth-Grade Teacher, Wynford Elementary School, Bucyrus, OH
    • Joy Rose, Retired High School Principal, Westerville South High School, Westerville, OH
    • Jill Shackelford, Superintendent, Kansas City Kansas Public Schools, Kansas City, KS
    • Janet Slowman-Chee, Special Education Director, Central Consolidated Schools, Shiprock, NM
    • Lyne Ssebikindu, Assistant Principal, Crump Elementary School, Memphis, TN

    About the Authors

    We are former teachers, education leaders, and researchers who work at Battelle for Kids—a national, not-for-profit organization that provides strategic counsel and innovative solutions for today's complex educational-improvement challenges.

    Kate Kennedy helps teachers and leaders to better use and understand their value-added information. She also designs and leads professional learning experiences focused on formative instructional practices. Kate earned a master of education leadership from Teachers College, Columbia University, a master of arts in elementary education from Loyola Marymount University, and a bachelor of arts in women's studies from The Ohio State University. She is a former teacher and Teach For America corps member. Kate lives in Columbus, Ohio, with her husband, Matt, and young sons, Nathan and Owen.

    Mary Peters has been a lifelong advocate for all children to have equitable access to a high-quality education. Mary has worked at the classroom, district, college, and state levels. She is an expert on special education, data, and value-added analysis and currently leads a statewide rollout of value-added analysis in Ohio. Mary holds a PhD in education from The Ohio State University, a master of arts from the University of Connecticut, and a bachelor of science from the State University of New York at Geneseo. She has developed and led several grant projects that pertain to teacher effectiveness. Mary and her husband, David, collectively have four children and live in Westerville, Ohio.

    Mike Thomas has worked throughout his career to help educators improve their practice. As a part of this work, he has created tools, resources, and professional development experiences to help educators understand and use value-added analysis for improvement at the district, school, and individual teacher levels. Mike has also studied and presented all over the country on the topic of highly effective teachers. Mike holds a PhD in educational leadership from The Ohio State University, a master of science in future studies from the University of Houston, and a bachelor of science in physics from Otterbein College. Mike is happily married to his wife, Lu Anne, a proud father of three children, Lindsay, Christopher, and Emily, and a doting grandfather to Kameron.

    Afterword

    Nearly 300 years ago, on October 22, 1707, just off the southwestern tip of England, four homebound British ships ran aground, and 2,000 men lost their lives. There was no battle. The admiral had miscalculated his position in the Atlantic Ocean, and his flagship smashed into the rocks. The rest of the fleet, following blindly behind, also ran aground and piled up.

    The concept of latitude and longitude had been around for a long time, but even as late as 1700, mariners had not managed to devise an accurate way to measure longitude. Nobody ever knew for sure how far east or west they had traveled. And on that day, when the fog was dense, the results were tragic.

    In response, the British government created a competition to be judged by the Board of Longitude, offering a prize of $1 million (in today's dollars) to solve this measurement problem. Most expected the answer to come from the astronomers or scientists of the day. However, an English clockmaker, John Harrison, who pioneered a portable, precision timekeeping device (the chronometer), solved the problem. Yet, for the longest time, the Board of Longitude just wouldn't welcome a mechanical answer to what they saw as an astronomical question.

    And likewise today, many educators still don't see the value of statistical answers to education challenges. Value-added analysis has shown itself to be, in part, a statistical answer to our quest to improve student achievement. Just as seafarers needed accurate readings of latitude and longitude, today's educators need readings of both progress and achievement to determine the direction and destination of their students. It's not the exclusiveness of "or" but the brilliance of "and." We need both.

    Today's modern computing power, coupled with data from student achievement testing, provides us a wealth of information that, when properly mined, can assist us in making wise choices with respect to instructional practices and programs. Taking a page from arguments made about guns: data doesn't change practice, people do. You can see from the numerous examples provided herein that value-added analysis offers a lens that yields the evidence needed to leverage changes that benefit students. It affirms empirically what all students have known forever: great teachers matter.

    Purpose of Value-Added Analysis

    Our work with value-added reporting has never been simply to prove but rather to improve. In this book, we've highlighted a multi-step process with value-added data at the center that can help principals and teachers share practices, look for relationships, and develop strategies to improve student achievement. We've also seen how a child's zip code may define where they are but not where they are going. Why would a teacher who has a disproportionate number of poor or low-performing children want to continue teaching there if awards and recognition are tied solely to achievement levels? Value-added analysis levels the playing field by answering "What did you do with the children you taught?" rather than "Who did you get?" This is a twenty-first-century tool that enables analysis of productivity.

    Future of Value-Added Analysis

    Is value-added analysis a trend that we are likely to see come and go, as modern mathematics, site-based management, and the overhead projector did? That's not likely, given the value of the information it provides for making key decisions. The danger, of course, lies in overreaching its capacity to discern, almost single-handedly, good teaching. The danger will come in the form of leadership that uses this data as a single weapon of judgment as opposed to a complementary instrument of learning.

    Here's a practical example. A speaker meeting with a large group of reformers recently asked, “How many of you believe we can take something as complex as the development of human intellectual capital in children, assess it with an annual test, parse out the appropriate teacher attribution, and then pay, hire, and fire teachers on that basis?” The answer from the group was overwhelmingly affirmative because of our need to quickly determine accountability or assign blame. It's so much easier if you can distill something down to a simple number from 1 to 10 that everyone can understand. However, it is a disservice to the complexity of teaching to ever believe it can really be that simple—because it isn't.

    Can value-added analysis contribute answers to the questions of accountability, compensation, and evaluation? Absolutely! But it is a mistake to make it the sole piece of evidence for any of those or, frankly, for any improvement. Just as pictures are improved through the use of more pixel points, so is our assessment of instruction enhanced by multiple data points.

    At this writing, Race to the Top, with one of its four assurances focused squarely on effective teachers and leaders, would appear to make value-added analysis a critical piece of evidence on the efficacy of our work. Merit pay of the 1980s, and even career ladders, often went by the wayside because they appeared to rest on subjective data alone. The Bill and Melinda Gates Foundation is using its Measures of Effective Teaching project to help discern those factors that contribute most to student learning. The foundation is doing it empirically, and value-added calculations are serving as a rich source of evidence and independent data. Its researchers are asking questions such as, Is there a relationship between how students view teachers and academic gain? They are using digital panoramic cameras to capture classroom performance to answer the question, What are teachers doing to produce gains?

    We believe that the continuous improvement approach emphasized in this book will enable value-added analysis to be used by educators to improve effectiveness. At its best, the goal of value-added analysis ought to be to assist in uncovering, discovering, and recovering, not simply naming, blaming, and shaming. That, of course, leads to a critical piece of all reform efforts and tools, including this one—leadership.

    Leadership

    Just as great teaching matters for students, so does great leading. It becomes impossible to create the improvement culture we have discussed without effective leadership. Does it have to be the principal? After all, principals are the ones who decide which data will be used, assign time for collaboration or not, develop teams or not, and assign inquiry or blame. The school culture is largely influenced by the principal. It becomes too hard and unsustainable for a group of well-intentioned, reform-minded teachers to work around the principal. What works best, of course, is collaboration among the leader, teachers, and other stakeholders to improve student achievement.

    At the heart of the leader's responsibility is the need to inspire trust, especially around the use of data. We have found leaders who can lead and inspire crucial conversations about improvement. They create an atmosphere of trust and inquiry and use data to support actions that create extraordinary student results. These are leaders who focus on strengths first but also facilitate discussions or implement actions that cause things to happen with people and for children. It is hard to overstate the importance of leadership in creating and instilling a culture that empowers, offers hope, and both discusses and acts on behalf of children.

    Improving Effective Teaching

    Our work with districts has enabled us to identify teachers who consistently make huge student gains each year with different groups of students. Repeated conversations and interviews with these teachers using appreciative inquiry strategies have yielded several revelations. One is that great teaching is not made up of a half dozen discrete actions that can be written down, easily described, and, once implemented, counted on to improve student achievement. Nor is student learning the enemy of enthusiasm and passion, or merely the product of ensuring that students properly answer every last multiple-choice question. Indeed, what we have discovered is that effective teachers are able to build relationships with students, expect productivity, and have classrooms with management structures to maintain discipline, while knowing when to be flexible enough for teachable moments. Great teaching appears to be a series of moves over time that make the movie as opposed to individual still shots. It's a process.

    Do students work harder for someone they perceive actually cares for them enough to build relationships? Of course they do. The challenge for teachers arises when this caring is not balanced with the other pieces necessary for good teaching. For example, not expecting productivity at all is akin to really caring for someone except, of course, for their learning. On the other hand, focusing rigidly on productivity can create a sweatshop where students give up. Highly effective teachers take explicit, replicable steps to create environments that produce great learning results for students.

    Can other teachers learn from them? Of course they can. One of the most powerful applications of value-added analysis is to use it along with other evidence to identify excellent teachers and learn from them. In fact, there is greater variance of teacher effectiveness across buildings than across districts. There are great teachers everywhere, and the key is identifying them, providing opportunities to learn from them, and leveraging their talent to help others. Their work fits perfectly into the five-step systemic framework for improvement described throughout this book. These are the individuals who can help others think critically using the fishbone to clarify specific root causes and practices for improvement.

    We'd like to end with one last story about a particular professor known for his high expectations of student work. The story goes that when you turn in a first draft of your paper, you get it back with instructions to simplify and clarify. A second iteration elicits the same instructions. It is only when you turn the paper in for the fourth time that the professor finally agrees to read it for the first time!

    The point here is that improvement is a process. Our step-by-step framework enables educators to systematically uncover evidence, ask questions, make assumptions, construct theories, design actions, and do it repeatedly to help students. The room for improvement is an ever-expanding one that becomes better when educators act upon the best evidence available.

    Glossary

    • Above expected gain: Classification of performance assigned to a teacher, a school, or a district that is producing growth significantly above expected levels.
    • Aggregate data: Data that are brought together to examine larger-scale results. Building data may be aggregated to produce district data.
    • Below expected gain: Classification of performance assigned to a teacher, a school, or a district that is producing growth significantly below expected levels.
    • Cohort: A group or division of people or items. An example of a cohort of students is a group of all students in a building at a particular grade level.
    • Disaggregate data: Data that are broken into smaller pieces to examine results associated with smaller subgroups. Grade level data may be disaggregated into teacher-level subgroups.
    • Growth standard: The standard for the amount of growth a student is expected to achieve in a given year.
    • Longitudinal data: Data collected and linked over time.
    • Normal curve equivalent (NCE): The NCE scale is an equal-interval scale that ranges from 1 to 99. This equal-interval property makes it a useful metric for averaging and comparing group achievement across years.
    • Observed scores: Actual scores that students earn on a test.
    • Progress: Amount of growth students experience during one academic year.
    • School diagnostic report: Provides achievement subgroup comparisons of student progress organized by grade level and subject area. Report displays progress by prior-achievement. Reports also can be generated for user-selected adequate yearly progress (AYP) subgroups.
    • School effect or mean NCE gain: Average impact the school has on students' progress in a specific grade level and subject area.
    • School value-added report: Provides aggregate growth rates for students across the tested subject areas in the school. Report displays student progress by grade level and subject area.
    • Standard error: A statistic that establishes a level of certainty associated with the estimated mean gain. Generally speaking, the smaller the standard error, the more precise the estimate of the effect. Whenever student test data are used to produce a value-added effect, a standard error accompanies the effect (see the brief sketch following this glossary).
    • Value-added analysis: Statistical methodology used to measure student progress.
    • Value-added summary report: Provides grade level by grade level comparisons of student progress rates in all schools in the district.
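    To make two of these terms concrete, here is a minimal, hypothetical sketch of how a mean NCE gain and its standard error relate. The scores and the simple formulas are illustrative only; operational value-added systems estimate these quantities with far more elaborate statistical models.

    ```python
    # Illustrative only: how "mean NCE gain" and "standard error" fit together.
    # Scores are hypothetical; real value-added systems use more elaborate models.
    from math import sqrt
    from statistics import mean, stdev

    prior_nce = [42, 55, 61, 38, 70, 49]     # last year's NCE scores for a cohort
    current_nce = [47, 58, 60, 45, 74, 53]   # this year's NCE scores, same students

    gains = [c - p for c, p in zip(current_nce, prior_nce)]
    mean_gain = mean(gains)                      # the mean NCE gain ("school effect")
    std_error = stdev(gains) / sqrt(len(gains))  # smaller = more precise estimate

    print(f"Mean NCE gain: {mean_gain:+.2f} (standard error {std_error:.2f})")
    ```

    Read against the glossary, a mean gain that sits well above zero relative to its standard error would be classified as above expected gain, and one well below zero as below expected gain.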

    References

    Aaronson, D., Barrow, L., & Sander, L. (2007). Teachers and student achievement in the Chicago public high schools. Journal of Labor Economics, 25(1), 95–135. http://dx.doi.org/10.1086/508733
    Bill & Melinda Gates Foundation. (2009, July). Speech to the National Conference of State Legislatures. Retrieved from http://www.gatesfoundation.org/speeches-commentary/Pages/bill-gates-2009-conference-state-legislatures.aspx
    Boston Public Schools. (1998, March 9). High school restructuring. Boston, MA: Author.
    Coleman, J. (1966). Equality of educational opportunity study. Ann Arbor, MI: Inter-university Consortium for Political and Social Research.
    Doran, G.T. (1981). There's a SMART way to write management's goals and objectives. Management Review, 70(11), 33–36.
    Downey, C. J., Steffy, B. E., English, F. W., Frase, L. E., & Poston, W. K. (2004). The three-minute classroom walk-through: Changing school supervisory practice one teacher at a time. Thousand Oaks, CA: Corwin.
    Duncan, A. (2009, July 24). Education reform's moon shot. The Washington Post. Retrieved from http://www.washingtonpost.com/wp-dyn/content/article/2009/07/23/AR2009072302634.html
    Felch, J., Song, J., & Smith, D. (2010, August 14). Who's teaching L.A.'s kids? Los Angeles Times. Retrieved from http://articles.latimes.com/2010/aug/14/local/la-me-teachers-value-20100815
    Harris, D. N. (2011). Value-added measures in education: What every educator needs to know. Cambridge, MA: Harvard Education Press.
    Hattie, J. (2009). Visible learning. London, UK: Routledge.
    Haycock, K., & Crawford, C. (2008, April). Closing the teacher quality gap. Educational Leadership, 65(7), 14–19. Retrieved from http://www.ascd.org/publications/educational-leadership/apr08/vol65/num07/Closing-the-Teacher-Quality-Gap.aspx
    Hershberg, T. (2004). Value-added assessment: Powerful diagnostics to improve instruction and promote student achievement. Monograph presented at American Association of School Administrators Women Administrators Conference, Arlington, VA.
    Hoff, D. J. (1999). Echoes of the Coleman report. Education Week, 18(28), 33.
    Jacob, B. A., Lefgren, L., & Sims, D. (2008). The persistence of teacher-induced learning gains (NBER Working Paper No. 14065). Cambridge, MA: National Bureau of Economic Research. http://dx.doi.org/10.3386/w14065
    Jordan, H., Mendro, R., & Weerasinghe, D. (1997). Teacher effects on longitudinal student achievement. Dallas, TX: Dallas Independent School District.
    Kane, T. J., & Staiger, D. O. (2008). Estimating teacher impacts on student achievement: An experimental evaluation (Preliminary draft PDF, NBER Working Paper No. 14607). Cambridge, MA: National Bureau of Economic Research. http://dx.doi.org/10.3386/w14607
    Konstantopoulos, S. (2007). How long do teacher effects persist? (Discussion Paper No. 2893). Boston, MA: Boston College, Institute for the Study of Labor.
    Marzano, R. J. (2000). A new era in school reform: Going where the research takes us. Aurora, CO: Mid-Continent Regional Education Lab.
    McCaffrey, D. F., & Hamilton, L. S. (2007). Value-added assessment in practice: Lessons from the Pennsylvania Value-Added Assessment System pilot project. Santa Monica, CA: RAND.
    National Commission on Excellence in Education. (1983). A nation at risk: The imperative for educational reform. Washington, DC: U.S. Government Printing Office.
    Reeves, D. (2002). [Diagram]. The leadership for learning framework. Retrieved from http://www.ascd.org/publications/books/105151/chapters/Introduction@-What-The-Learning-Leader-Will-Do-for-You.aspx
    Rivers, J. C. (2000). The impact of teacher effect on student math competency achievement (Doctoral dissertation, University Microfilms International, 9959317). Ann Arbor, MI: University of Tennessee.
    Rivkin, S. G., Hanushek, E. A., & Kain, J. F. (2001). Teachers, schools, and academic achievement (Working Paper No. 6691, revised). Cambridge, MA: National Bureau of Economic Research.
    Rivkin, S. G., Hanushek, E. A., & Kain, J. F. (2005). Teachers, schools, and academic achievement. Econometrica, 73(2), 417–458. http://dx.doi.org/10.1111/j.1468-0262.2005.00584.x
    Rockoff, J. E. (2004). The impact of individual teachers on student achievement: Evidence from panel data. American Economic Review, 94(2), 247–252. http://dx.doi.org/10.1257/0002828041302244
    Rossiter, M. (2002). Narrative and stories in adult teaching and learning. (ERIC Document Reproduction Service No. ED473147). Retrieved from ERIC database: http://www.eric.ed.gov/ERICWebPortal/custom/portlets/recordDetails/detailmini.jsp?_nfpb=true&_&ERICExtSearch_SearchValue_0=ED473147&ERICExtSearch_SearchType_0=no&accno=ED473147
    Sanders, W. L. (2004, June). A summary of conclusions drawn from longitudinal analysis of student achievement data over the past 22 years. Paper presented to Governors Education Symposium, Asheville, NC.
    Sanders, W. L., & Rivers, J. C. (2009). Choosing a value-added model. In T. Hershberg & C. Robertson-Kraft (Eds.), A grand bargain for education reform: New rewards and supports for accountability (pp. 43–58). Cambridge, MA: Harvard Education Press.
    U.S. Department of Education. (2009). Race to the top program executive summary (CFDA number: 84.395). Retrieved from http://www.ed.gov/programs/racetothetop/index.html
    University of Tennessee Value-Added Research and Assessment Center. (1997). Graphical summary of educational findings from the Tennessee Value-Added Assessment System. Knoxville: University of Tennessee.

    CORWIN: A SAGE Company

    The Corwin logo—a raven striding across an open book—represents the union of courage and learning. Corwin is committed to improving education for all learners by publishing books and other professional development resources for those serving the field of PreK–12 education. By providing practical, hands-on materials, Corwin continues to carry out the promise of its motto: “Helping Educators Do Their Work Better.”

