Practical Program Evaluations: Getting from Ideas to Outcomes
- Front Matter
- Back Matter
- Subject Index
- Chapter 1: Program Evaluations that Matter
- The Myth of Ceteris Paribus
- Practicing Program Evaluation in a Complex World
- Why Should You Read This Book?
- The Paradox of Applied Program Evaluation
- The Core Principles of the C4 Approach
- An X-Ray of the Book
- Our Purpose is Program Evaluations that Matter
- Chapter 2: The Landscape of Program Evaluation
- The Public Interest
- Pursuing the Public Interest through Rationality
- Delving into Program Evaluation
- The Four C's and the Landscape of Program Evaluation
- Chapter 3: Understand Your Client
- Know What Your Client's Interests Are
- Know What Success Is
- Have a Principal Who Can Do Something
- Put the Evaluation in a Management Context
- Do the Right Thing
- Exercises and Discussion Questions
- Chapter 4: Know the Content
- Build the Analysis on Facts
- Align Your Evidence and Your Conclusions
- Simplicity Always Trumps Elegance
- Don't Let the Illusion of the Perfect Drive Out the Reality of the Good
- Never Underestimate the Power of Accurate Description
- Exercises and Discussion Questions
- Chapter 5: Control the Work
- Have a Real Work Plan
- Meet the Deadline
- Getting People with the Right Skills and Temperament
- Expect Something to Go Wrong
- Use More Than One Set of Eyes
- Know your Core Values
- Exercises and Discussion Questions
- Chapter 6: Communicate with Clarity
- Lead with Communicating Ideas Not Details
- Write and Speak in Short Words with Short Sentences
- Preview, Provide, and Review the Messages
- Visuals That Support, Not Supplant, the Briefing
- Exercises and Discussion Questions
- Chapter 7: In the Long Run
- Personal Success
- Professional Success
1255 22nd Street, NW, Suite 400
Washington, DC 20037
Phone: 202-729-1900; toll-free, 1-866-4CQ-PRESS (1-866-427-7737)
Copyright © 2007 by CQ Press, a division of Congressional Quarterly Inc.
All rights reserved. No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher.
Cover design: Jeffrey Everett/El Jefe Design
∞ The paper used in this publication exceeds the requirements of the American National Standard for Information Sciences—Permanence of Paper for Printed Library Materials, ANSI Z39.48-1992.
Printed and bound in the United States of America
10 09 08 07 06 1 2 3 4 5
Library of Congress Cataloging-in-Publication Data
Emison, Gerald A.
Practical program evaluations: getting from ideas to outcomes / Gerald Andrews Emison.
Includes bibliographical references and index.
ISBN-13: 978-0-87289-302-3 (alk. paper)
1. Public administration—Evaluation. 2. Administrative agencies—Evaluation. I. Title.
To Robert and Carol Emison and Grace and Beau, whose memories guide me still
Tables, Figures, and Boxes
- 2-1 The Rational Decision-Making Model
- 4-1 Program Characteristics Table Sample
- 5-1 Example Work Plans
- 6-1 A Sample Concept Paper on Strategic Direction
- 6-2 Oral Presentation Checklist
- 3-1 The Client Practices
- 3-2 Know What Your Client's Interests Are
- 3-3 Know What Success Is
- 3-4 Have a Principal Who Can Do Something
- 3-5 Put the Evaluation in a Management Context
- 3-6 Do the Right Thing
- 4-1 The Content Practices
- 4-2 Build the Analysis on Facts
- 4-3 Align Your Evidence and Your Conclusions
- 4-4 Simplicity Always Trumps Elegance
- 4-5 Don't Let the Illusion of the Perfect Drive Out the Reality of the Good
- 4-6 Never Underestimate the Power of Accurate Description
- 5-1 The Control Practices
- 5-2 Have a Real Work Plan
- 5-3 Meet the Deadline
- 5-4 Getting People with the Right Skills and Temperament
- 5-5 Expect Something to Go Wrong
- 5-6 Use More Than One Set of Eyes
- 5-7 Know Your Core Values
- 6-1 The Communication Practices
- 6-2 Lead with Communicating Ideas Not Details
- 6-3 Write and Speak in Short Words with Short Sentences
- 6-4 Preview, Provide, and Review the Messages
- 6-5 Visuals That Support, Not Supplant, the Briefing
Program evaluation is an important way to advance the public interest. It opens windows to improving the performance of public organizations. Performance in the public sector has always been a major concern, and the past decade has seen an increasing emphasis on it. Whether termed reinventing government, new public management, or results-based management, this new emphasis on results reflects the fact that achievement requires reflection, and program evaluation is institutionalized reflection. It enables the intellectual underbrush to be cleared away and performance improvements to be identified. This identification, however, is not enough for improvements to be realized; implementation is also necessary. This book concerns the practices that heighten the likelihood that a program evaluation will lead to implemented recommendations and subsequent improvement.
As a career member of the federal senior executive service for more than twenty years, I saw that two components typically made up successful program improvements. The first was rigorous, unbiased, and thorough analysis. The second was a series of practices that enabled decision makers to translate the analysis into change. The first component is the focus of most program evaluation texts and courses, whereas the second usually is left to “education by osmosis” when an evaluator begins work in the “real world.” The focus on the former at the expense of the latter is easy to understand. Learning the tools and methodology of program evaluation is not easy, so teaching competency in these skills is crucial. Seemingly, the latter is really just good common sense and can be more easily taught on the fly. But what seems so obvious is often not so for students first entering the workforce.
It is not necessary to leave a critical aspect of successful evaluation to happenstance. This book identifies those practices that savvy evaluators follow so that their evaluations get implemented. It adds another dimension to the preparation, reflection, and practice that compose the essentials of program evaluation—a handy way to offer concrete advice and reinforce the practical.
The foundations of this book are my own experiences in the public sector and in the classroom. I initially conducted program evaluations as an analyst for the U.S. Environmental Protection Agency (EPA). As a manager and, later, as director of the program evaluation division, I saw the effective, the ineffective, and the neglected as this group dealt with highly controversial issues. During this period I had many conversations with colleagues about what composed a truly worthwhile evaluation. Almost every practitioner named successful change, the product of rigor joined with practical action, as the true measure.
My work on evaluations led to my crossing over from evaluator to director of a large EPA regulatory program. As the director of air quality planning and standards, I found myself a customer of program evaluations and policy analyses. In this role I was able to observe, during my interactions with political executives and senior career appointee colleagues, what worked and what did not. This experience validated my belief that a combination of rigor with the practical is essential. When I moved to a regional office to become its senior career executive, my observations were reconfirmed in yet another venue. A good program evaluation needs conceptual rigor and practical application in order to be implemented.
Shortly after I left the regional office, I found myself teaching policy analysis in a university setting. As a practitioner I often had wondered why the practical skills essential for successful evaluations were so randomly distributed among newly graduated evaluators. I soon realized it was because most academic training in program evaluation emphasized conceptual preparation without much stress on practical pathways to success. When I retired from the senior executive service and became a full-time academic, I could not find a satisfactory text that exposed my students to this complementary aspect. So I wrote this book.
For those teaching introductory program evaluation courses, this book supplements the many fine core texts available. It introduces the practices essential to effectiveness in applied settings. It supplements, rather than replaces, the conceptual emphasis that is the staple of traditional program evaluation courses. The intention is to round out graduate students’ education and preparation. Its most useful place is early in a graduate program evaluation course, when students can employ this guidance on the content of the course throughout a semester. The book also can serve as an accessible reference to remind practicing evaluators in the rush of day-to-day work what is important for effectiveness.
The book is organized to promote accessibility. Chapter 1 explains the reasoning behind the text and its relevance to today's program evaluator. Chapter 2 places the book in the terrain of the overall enterprise of program evaluation. The text's core lies in the next four chapters. Each examines a key attribute of successful practical program evaluations. The 4Cs—client, content, control, and communication—are used to bundle the essential practices and to examine a series of related practices in a framework that students can return to easily.
Since this book is practice based, it is impossible to thank adequately everyone who played a part; it is my exposure to many dedicated public officials that enabled me to write it. Nevertheless, there are a number of people who contributed mightily to my ideas. Ron Brand, first as my boss, then as my mentor and friend, contributed extensively to most of the ideas found within. Stan Meiburg, as a staff assistant and then a colleague, has never failed to shed new light upon public evaluation. John Thillmann, David Ziegele, and Tom Kelly were always able to bring me back to earth and remind me that in the long run if a practice does not improve program performance, it is not worthy of extensive effort. And the staffs of the EPA's Office of Air Quality Planning and Standards in Washington, D.C.; Research Triangle Park, North Carolina; and the Seattle, Washington, regional office consistently demonstrated that long-run improvement of the public's interest was why we were in the game.
My colleagues at Mississippi State University deserve special thanks, since they provided both the models and the encouragement to pursue this project. Similarly, colleagues at Duke University gave me the opportunity to structure these ideas into a coherent package for the first time.
This work had its start in a conversation with Michael Dunaway of CQ Press. I am indebted to him for recognizing that there might be a book somewhere in my ramblings. Charisse Kiino has served as my guide and encourager at CQ Press, and this book simply would not exist without her counsel and encouragement. I am quite grateful for her persistence and patience. The production of the book was aided immeasurably by the thoughtful review of Abigail Harrison Emison. I also would like to thank the reviewers, whose supportive yet unvarnished critiques led to improvements that would not have occurred without their input: Daniel Baracskay, Valdosta State University; Steve Daniels, California State University at Bakersfield; Heidi Koenig, Northern Illinois University; Laura Langbein, American University; Tom Liou, University of Central Florida; Peter Mameli, City University of New York–John Jay College of Criminal Justice; Elizabethann O'Sullivan, North Carolina State University; and Dan Williams, City University of New York–Baruch College.
Last, I must thank the one person who has been a model of good humor and support, not only in the writing of this book but also in the experiences upon which this book draws. My wife, Donna Kay Harrison, has been for more than thirty years the litmus test for ideas on the practicality, humanity, and wisdom of major choices I have made in my career. Without her, not only would this book be impossible, the career that underlies it would have been impossible as well. I owe her a debt that is simply unpayable. But it is balanced, I hope, by a bottomless well of gratitude.
Such a wide range of contributors has allowed me to bring a number of experiences to this book. Nevertheless, while many helped, the work is mine alone, and I take responsibility for it while gratefully thanking everyone who helped directly or indirectly.
1. William James, “What Pragmatism Means,” in Pragmatism, ed. Louis Menand (New York: Random House, 1907).
2. Peter Drucker, The Effective Executive (New York: Harper & Row, 1966).
Chapter Two
1. Peter H. Rossi, Howard E. Freeman, and Mark W. Lipsey, Evaluation: A Systematic Approach, 6th ed. (Thousand Oaks, Calif.: Sage, 1999), 93–111.
2. Leonard Merewitz and Stephen H. Sosnick, The Budget's New Clothes: A Critique of Planning-Programming-Budgeting and Benefit-Cost Analysis (Chicago: Markham, 1971).
3. Deborah Stone, Policy Paradox: The Art of Political Decision Making (New York: Norton, 2002).
4. Henry Weinstein, “1st Suit in State to Attack ‘Intelligent Design’ Filed,” Los Angeles Times, January 11, 2006, http://www.latimes.com.
5. Tim W. Clark, The Policy Process: A Practical Guide for Natural Resource Professionals (New Haven, Conn.: Yale University Press, 2002).
6. Martin Meyerson and Edward C. Banfield, Politics, Planning, and the Public Interest (Toronto, Canada: Free Press, 1955).
7. Kenneth Arrow, Social Choice and Individual Values, 2nd ed. (New York: Wiley, 1963).
8. Brian Barry, Political Argument (Berkeley, Calif.: University of California Press, 1990).
9. Herbert Simon, “From Substantive to Procedural Rationality,” in Philosophy and Economic Theory, ed. Frank Hahn and Martin Hollis (New York: Oxford University Press, 1979), 65–86; and Thomas McCarthy, The Critical Theory of Jurgen Habermas (Cambridge, Mass.: MIT Press, 1981).
10. Herbert Simon, Administrative Behavior, 4th ed. (New York: Free Press, 1997).
11. John Dewey, Liberalism and Social Action (New York: G. P. Putnam, 1935).
12. Arnold Love, “Implementation Evaluation,” in Handbook of Practical Program Evaluation, 2nd ed., ed. Joseph Wholey, Harry P. Hatry, and Kathryn E. Newcomer (San Francisco: Jossey-Bass, 2004), 63–98.
13. Rossi, Freeman, and Lipsey, Evaluation.
14. James C. McDavid and Laura R. L. Hawthorn, Program Evaluation and Performance Measurement: An Introduction to Practice (Thousand Oaks, Calif.: Sage, 2006).
15. Winston S. Churchill, “Address to Parliament,” November 11, 1947, http://www.enterstageright.com/archive/articles/0105/0105churchilldem.htm.
16. David Braybrooke and Charles E. Lindblom, A Strategy of Decision (New York: Free Press, 1963); and Simon, “From Substantive to Procedural Rationality.”
17. Simon, “From Substantive to Procedural Rationality”; Robert A. Dahl, A Preface to Democratic Theory (Chicago: University of Chicago Press, 1956); and Braybrooke and Lindblom, A Strategy of Decision.
18. Simon, “From Substantive to Procedural Rationality.”
19. Braybrooke and Lindblom, A Strategy of Decision.
20. Rossi, Freeman, and Lipsey, Evaluation.
21. Richard D. Bingham and Claire L. Felbinger, Evaluation in Practice: A Methodological Approach, 2nd ed. (New York: Chatham, 2002).
22. Jody L. Fitzpatrick, James R. Sanders, and Blaine R. Worthen, Program Evaluation: Alternative Approaches and Practical Guidelines, 3rd ed. (Boston: Pearson, 2004).
23. Richard Berk and Peter H. Rossi, Thinking about Program Evaluation (Newbury Park, Calif.: Sage, 1990).
24. Earl Babbie, The Practice of Social Research (Belmont, Calif.: Wadsworth, 1992); and Martin Bulmer, The Uses of Social Research: Social Investigation in Public Policy-Making (London: George Allen and Unwin, 1982).
25. William Trochim, Research Design for Program Evaluation (Beverly Hills, Calif.: Sage, 1984).
26. Fitzpatrick, Sanders, and Worthen, Program Evaluation.
27. John A. McLaughlin and Gretchen B. Jordan, “Using Logic Models,” in Handbook of Practical Program Evaluation, 2nd ed., ed. Joseph S. Wholey, Harry P. Hatry, and Kathryn E. Newcomer (San Francisco: Jossey-Bass, 2004), 7–32.
28. Rossi, Freeman, and Lipsey, Evaluation.
29. Hubert M. Blalock, An Introduction to Social Research (Englewood Cliffs, N.J.: Prentice-Hall, 1970); and William L. Hays, Statistics (Ft. Worth, Texas: Harcourt Brace, 1994).
30. Rossi, Freeman, and Lipsey, Evaluation; and Love, “Implementation Evaluation.”
31. Donald Campbell and Julian Stanley, Experimental and Quasi-Experimental Designs for Research (Chicago: Rand McNally, 1966); and Thomas Cook and Donald Campbell, Quasi-Experimentation: Design and Analysis Issues for Field Settings (Chicago: Rand McNally, 1979).
32. Charles S. Reichardt and Melvin M. Mark, “Quasi-Experimentation,” in Handbook of Practical Program Evaluation, 2nd ed., ed. Joseph S. Wholey, Harry P. Hatry, and Kathryn E. Newcomer (San Francisco: Jossey-Bass, 2004), 126–149.
33. Blalock, An Introduction to Social Research; and William H. Greene, Econometric Analysis, 4th ed. (Upper Saddle River, N.J.: Prentice Hall, 2000).
34. Sharon L. Caudle, “Qualitative Data Analysis,” in Handbook of Practical Program Evaluation, 2nd ed., ed. Joseph S. Wholey, Harry P. Hatry, and Kathryn E. Newcomer (San Francisco: Jossey-Bass, 2004), 417–438.
35. Anselm Strauss and Juliet Corbin, Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory, 2nd ed. (Thousand Oaks, Calif.: Sage, 1998); Matthew B. Miles and A. Michael Huberman, Qualitative Data Analysis: A Sourcebook of New Methods (Newbury Park, Calif.: Sage, 1984); and Joseph Maxwell, Qualitative Research Design (Thousand Oaks, Calif.: Sage, 1996).
36. Robert K. Yin, Case Study Research: Design and Methods, 2nd ed. (Newbury Park, Calif.: Sage, 1994).
37. Joe R. Feagin, Anthony M. Orum, and Gideon Sjoberg, eds., A Case for the Case Study (Chapel Hill, N.C.: University of North Carolina Press, 1991).
38. Fitzpatrick, Sanders, and Worthen, Program Evaluation; and Melvin M. Mark and R. Lance Shotland, Multiple Methods in Program Evaluation (San Francisco: Jossey-Bass, 1987).
39. Caudle, “Qualitative Data Analysis”; and Michael Patton, How to Use Qualitative Methods in Evaluation (Newbury Park, Calif.: Sage, 1987).
40. Caudle, “Qualitative Data Analysis.”
41. Patton, How to Use Qualitative Methods in Evaluation.
42. John Van Maanen, “Introduction,” in Varieties of Qualitative Research, ed. John Van Maanen, James M. Dabbs Jr., and Robert R. Faulkner (Newbury Park, Calif.: Sage, 1982), 11–30.
43. Caudle, “Qualitative Data Analysis.”
44. Caudle, “Qualitative Data Analysis.”
45. Yin, Case Study Research.
46. Caudle, “Qualitative Data Analysis.”
47. Joseph S. Wholey, Harry P. Hatry, and Kathryn E. Newcomer, eds., Handbook of Practical Program Evaluation (San Francisco: Jossey-Bass, 1994); and Joseph S. Wholey, Harry P. Hatry, and Kathryn E. Newcomer, eds., Handbook of Practical Program Evaluation, 2nd ed. (San Francisco: Jossey-Bass, 2004).
48. Yvonna Lincoln and Egon Guba, Naturalistic Inquiry (Newbury Park, Calif.: Sage, 1985); Miles and Huberman, Qualitative Data Analysis; and Babbie, The Practice of Social Research.
49. George Geis, “Formative Evaluation: Developmental Testing and Expert Review,” Performance and Instruction (1987), 26; Harvey Averch, “Megaproject Selection: Criteria and Rules for Evaluating Competing R&D Megaprojects,” Science and Public Policy 20 (1993); and Cynthia Weston, “The Importance of Involving Experts and Learners in Formative Evaluation,” Canadian Journal of Educational Communication 16 (1987), 45–58.
50. Margery Austin Turner and Wendy Zimmermann, “Role Playing,” in Handbook of Practical Program Evaluation, 2nd ed., ed. Joseph S. Wholey, Harry P. Hatry, and Kathryn E. Newcomer (San Francisco: Jossey-Bass, 2004), 320–339; and Debra L. Dean, “How to Use Focus Groups,” in Handbook of Practical Program Evaluation ed. Joseph S. Wholey, Harry P. Hatry, and Kathryn E. Newcomer (San Francisco: Jossey-Bass, 1994), 338–348.
51. Kathryn E. Newcomer and Philip W. Wirtz, “Using Statistics in Evaluation,” in Handbook of Practical Program Evaluation, 2nd ed., ed. Joseph S. Wholey, Harry P. Hatry, and Kathryn E. Newcomer (San Francisco: Jossey-Bass, 2004), 439–478.
52. Blalock, An Introduction to Social Research; Hays, Statistics; and Greene, Econometric Analysis.
53. Newcomer and Wirtz, “Using Statistics in Evaluation.”
54. Rossi, Freeman, and Lipsey, Evaluation.
55. Leigh Burstein, Howard Freeman, and Peter H. Rossi, eds., Collecting Evaluation Data (Newbury Park, Calif.: Sage, 1985).
56. Newcomer and Wirtz, “Using Statistics in Evaluation.”
57. Floyd J. Fowler, Survey Research Methods: Applied Social Research Methods (Thousand Oaks, Calif.: Sage, 1993); Arlene Fink and Jacqueline Kosecoff, How to Conduct Surveys: A Step-by-Step Guide (Newbury Park, Calif.: Sage, 1985); and Seymour Sudman and Norman M. Bradburn, Asking Questions: A Practical Guide to Questionnaire Design (San Francisco: Jossey-Bass, 1986).
58. Fitzpatrick, Sanders, and Worthen, Program Evaluation.
59. Caudle, “Qualitative Data Analysis.”
60. Berk and Rossi, Thinking about Program Evaluation.
61. Edward R. Tufte, The Visual Display of Quantitative Information (Cheshire, Conn.: Graphics Press, 1983); Edward R. Tufte, Envisioning Information (Cheshire, Conn.: Graphics Press, 1990); and Edward R. Tufte, Visual Explanations (Cheshire, Conn.: Graphics Press, 1997).
62. Gerald E. Jones, How to Lie with Charts (San Jose, Calif.: iUniverse, 2000).
63. Blalock, An Introduction to Social Research; Hays, Statistics; and Greene, Econometric Analysis.
64. Darrell Huff, How to Lie with Statistics (New York: W.W. Norton, 1982).
65. John L. Phillips Jr., How to Think about Statistics (New York: W.H. Freeman, 1996).
66. Wholey, Hatry, and Newcomer, Handbook of Practical Program Evaluation; Wholey, Hatry, and Newcomer, Handbook of Practical Program Evaluation, 2nd ed.; Rossi, Freeman, and Lipsey, Evaluation; Fitzpatrick, Sanders, and Worthen, Program Evaluation; and McDavid and Hawthorn, Program Evaluation and Performance Measurement.
67. Newcomer and Wirtz, “Using Statistics in Evaluation.”
68. James Edwin Kee, “Cost-Effectiveness and Cost-Benefit Analysis,” in Handbook of Practical Program Evaluation, 2nd ed., ed. Joseph Wholey, Harry P. Hatry, and Kathryn E. Newcomer (San Francisco: Jossey-Bass, 2004), 506–541; Henry M. Levin, Cost Effectiveness Primer (Newbury Park, Calif.: Sage, 1983); Dale E. Berger, “Using Regression Analysis,” in Handbook of Practical Program Evaluation, 2nd ed., ed. Joseph Wholey, Harry P. Hatry, and Kathryn E. Newcomer (San Francisco: Jossey-Bass, 2004), 479–505; and Lawrence B. Mohr, Impact Analysis for Program Evaluation, 2nd ed. (Thousand Oaks, Calif.: Sage, 1995).
69. McLaughlin and Jordan, “Using Logic Models”; and Office of Management and Budget, “Program Assessment Rating Tool,” http://www.whitehouse.gov/omb/part/index.html.
Chapter Four
1. Rudyard Kipling, Just So Stories (London, England: Random House, 1902).
2. Peter J. Haas and J. Fred Springer, Applied Policy Research: Concepts and Cases (New York: Garland, 1998), 161–184.
3. Joseph S. Wholey, Harry P. Hatry, and Kathryn E. Newcomer, eds., Handbook of Practical Program Evaluation (San Francisco: Jossey-Bass, 2004).
4. Steven Cohen and Ronald Brand, Total Quality Management in Government (San Francisco: Jossey-Bass, 1993).
5. Wholey, Hatry, and Newcomer, Handbook of Practical Program Evaluation.
6. Peter Drucker, The Effective Executive (New York: Harper & Row, 1966).
Chapter Five
1. Arthur Bloch, Murphy's Law: The 26th Anniversary Edition (New York: Penguin, 2003).
2. Klaus Mainzer, Thinking in Complexity (Berlin, Germany: Springer, 1996).
3. Stephen E. Ambrose, Eisenhower: Soldier, General of the Army, President-Elect, 1890–1952 (New York: Simon and Schuster, 1983).
4. Roger Fisher and William Ury, Getting to Yes (New York: Houghton Mifflin, 1981).
Chapter Six
1. Rudolf Flesch, The Art of Readable Writing (New York: Harper & Row, 1974).
2. Eugene J. McCarthy, “An Indefensible War,” in Great American Speeches, ed. Gregory R. Suriano (New York: Gramercy Books, 1993), 244–248.
3. Barbara Jordan, “On the Impeachment of the President,” in Great American Speeches, ed. Gregory R. Suriano (New York: Gramercy Books, 1993), 281–286.