Summary
The Human Resources Program-Evaluation Handbook is the first book to present state-of-the-art procedures for evaluating and improving human resources programs. Editors Jack E. Edwards, John C. Scott, and Nambury S. Raju provide a user-friendly yet scientifically rigorous "how-to" guide to organizational program evaluation. Drawing on perspectives from a variety of human resources and organizational behavior programs, the contributing professors, consultants, and government personnel link scientific findings to practical application. Designed for academics and graduate students in industrial-organizational psychology, human resources management, and business, the handbook is also an essential resource for human resources professionals, consultants, and policy makers.
Contents
Front Matter
Chapters
Part I: Framework for Human Resources Program Evaluation
- Chapter 1: Overview of Program Evaluation
- Program Evaluation in Human Resources
- Evaluation Myths
- Myth 1: It is Impossible to Measure …
- Myth 2: There are too many Variables to do a Good Study
- Myth 3: No One is Asking for Evaluation, So why Bother?
- Myth 4: Negative Results will Hurt my Program
- Key Distinctions
- Process versus Outcome Evaluation
- Program Improvement versus Program Selection
- Program Evaluation versus Utility Analysis
- Who does Program Evaluation?
- Choosing Criteria for Success
- Practical Design Considerations
- Standards of Proof
- Designing an Adequate Evaluation
- Measurement Issues
- Reliability and Validity
- Quantitative and Qualitative Data
- Costs and Benefits
- Identifying Human Resource Needs
- Considering Cost-Benefit Trade-Offs
- Concluding Comments on Costs and Benefits
- Utilization
- Evaluation Readiness
- Communicating Results
- Applying Findings
- Conclusion
- Chapter 2: Job Analysis—The Basis for Developing Criteria for all Human Resources Programs
- Uses of a Proactive Job Analysis Program
- Assessing the Need for a Job Analysis Program and Preparing for it
- Conducting a Job Analysis Program
- Competence of Job Analysts
- Sources and Number of SMEs
- Methods of Collecting Information
- Questionnaires
- O*NET Database
- Steps to Collect Job Information
- Task-Generation Interviews and Survey
- KSAO-Identification Interviews and Survey
- Applications for Job Analysis Results
- Application to Personnel Selection
- Application to Training
- Application to Performance Evaluation and Competency Modeling
- Application to Employee Physical and Psychological Well-Being
- Other Applications
- Conclusion
- Chapter 3: Criteria for Human Resources Program Evaluation
- Common Approaches and Pitfalls
- One Measure to Serve All Masters
- Getting Past the Obvious
- Ramifications of Selecting Poor Criteria
- Characteristics of Good Criteria
- Reliability, Validity, and Other Measurement Factors
- Reliability and Validity
- Measures Based on Clearly Observable Events
- Measurable
- Freedom from Bias
- Relevance
- Meaningfulness to Stakeholders
- Focus on Value as Opposed to Return on Investment as a Proxy Measure
- Actionable Results
- Practicality
- Practicality and Costs
- Realistic and Credible Goals
- Organization Politics
- Practical Steps in Criterion Development and Implementation
- Involve a Broad Project Team
- Clarify Program Goals and Expected Impacts
- Review All Available Data
- Involve Stakeholders other than the HR Program Evaluation Team
- Develop Data Collection Strategy and Tools
- Implement Data Collection
- Analyze Criterion Measurements
- Communicate Results
- Final Comments
Part II: Staffing
- Chapter 4: Recruitment
- Understanding the Recruitment Process
- Recruitment Sources
- Traditional Sources
- Employee Referrals
- Print Ads
- Search Firms
- College Campus Recruitment
- Radio Ads
- Internet-Based Approaches
- Job Boards
- E-Recruiting
- Relationship Recruiting
- Evaluating the Recruitment Function
- Using Recruitment Outcomes for Evaluation
- Assessing Costs
- Advantages of Estimating Costs
- Potential Concerns in Estimating Costs
- Using Applicant Predictors and Criteria for Evaluation
- Assessing Predictor and Criterion Results
- Advantages of Evaluating Predictors and Criteria
- Potential Concerns when Evaluating Predictors and Criteria
- Using Applicant Perceptions of the Recruitment Process for Evaluation
- Assessing Applicant Reactions
- Advantages of Assessing Applicant Perceptions
- Potential Concerns in Assessing Applicant Perceptions
- Using Organizational Reputation for Evaluation
- Assessing Organizational Reputation
- Advantages of Assessing Organizational Reputation
- Potential Concerns when Assessing Organizational Reputation
- Conclusions
- Chapter 5: Setting Standards
- Setting Standards for Program Evaluation
- Criterion-Referenced versus Norm-Referenced Approaches
- Nonmeasurement Aspects of Standard Setting
- Evaluation of the Standard-Setting Programs
- Subject Matter Expert (SME) Selection
- Job Experience and Competency
- Geographic Representation
- Number of SMEs
- Job Analysis
- Examination Specifications
- Standard-Setting Procedures
- Angoff Method
- Nedelsky Method
- Bookmark Method
- Summary
- Chapter 6: Evaluating Personnel Selection Systems
- Program Evaluation Process
- Forming the Evaluation Team
- Evaluation Criteria
- Reliability
- Types of Reliability Estimates
- Test-Retest Method
- Alternate Forms Method
- Internal Consistency Methods
- Interrater Reliability
- Generalizability Theory
- Interpreting Reliability
- Reliability Coefficient
- Standard Error of Measurement (SEM)
- Effect of Study Design
- Validity
- Criterion-Related Validity
- Content-Oriented Validity
- Construct Validity
- Validity Generalization (VG)
- Selection Decisions
- Cutoff Scores, Ranking, and Banding
- Combining Scores from Multiple Employment Tests
- Test Administration Practices
- Fairness, Bias, and Discrimination
- Bias
- Item Bias
- Test Bias
- Illegal Discrimination
- Perceived Fairness
- Utility Analysis
- Conclusion
- Chapter 7: Selecting Managers and Executives: The Challenge of Measuring Success
- Selection Context
- Context of Management and Executive Roles
- Role Complexity and Change
- Management versus Executive Positions
- Impact of the Individual
- Selection Considerations
- Multiple Stakeholders
- Sequential Selections and Candidates
- Levels of Fit
- Evaluating Selection Design
- Evaluating Target Competencies
- Evaluating Assessment Tools
- Evaluating Selection Administration
- Design of Selection Administration
- Records and Documents
- Implementation of the Selection Process
- Evaluating Selection Decisions
- Data Interpretation
- Behavioral Indicators
- Actual Behavior
- Data Integration
- Evaluating Selection Outcomes
- Conclusions
Part III: Evaluating and Rewarding Employees
- Chapter 8: Performance Appraisal and Feedback Programs
- Goals of Appraisal and Feedback Systems
- Organizational Perspectives and Goals
- Appraisers' Perspectives and Goals
- Workers' Perspectives and Goals
- Functions of Performance Appraisal
- Evaluating Performance Appraisal Measurement Functions
- What should be Measured?
- Who should Measure?
- How to Measure?
- Evaluating the Communication Function of Performance Appraisal
- Corporate Communication Function
- Individual Performance Expectations and Feedback
- Role and Preparation of the Appraiser
- Timing and Frequency of the Performance Appraisal Communication
- Communicating what is Expected
- Communicating how the Individual Performed
- Summary and Conclusions
- Chapter 9: The Evaluation of 360-Degree Feedback Programs
- An Overview of 360-Degree Feedback
- Administering the Program and Using the Resulting Information
- Frequency and Method of Delivery
- Underlying Assumptions about the Benefits of 360-Degree Feedback
- Criteria for Evaluating 360-Degree Feedback Systems
- Survey Design
- Process Components
- Survey Results
- Interrater Agreement and Self–Other Discrepancy Scores
- Relationships between 360-Degree Feedback and Other Performance Measures
- Isolating the Unique Contribution of 360-Degree Feedback as Part of a Comprehensive Development Program
- Methods for Evaluating the Quality of the 360-Degree Program
- Reviewing Archival Records
- Stakeholder Assessments
- Benchmarking Analyses
- Evaluators of the Survey Program
- Organizational Leaders
- Internal HR Staff
- External 360-Degree Feedback Assessment Experts
- Evaluating the Quality and Long-Term Effects of 360-Degree Feedback
- Attitudes about the Process
- Awareness of Performance Dimensions and Performance Management
- Creating a Feedback Culture
- Tracking Change in 360-Degree Feedback Ratings
- Examining Summary Data and Tracking Change across the Organization
- Assessing Sensitivity to Others' Ratings
- Longitudinal Study
- Recommendations and Conclusion
- Chapter 10: Compensation Analysis
- Who should be Involved in the Preparation of Compensation Analyses?
- Pay Elements Included in a Compensation Study
- Methods of Analyzing Compensation
- Simple Pay Equity Analyses
- Organizationwide “Raw” Average (Median) Salary Comparisons
- What Factors Influence Pay?
- Fair Labor Standards Act (FLSA) Average Pay Comparison
- Average Pay Comparisons by Grade
- Job Title Cohort Analysis
- Criticisms of Simple Pay Analyses
- Applying Inferential Statistical Tests to Simple Pay Models
- Complex Pay Equity Techniques—Multiple Regression Analysis
- Explanatory Factors
- How are Regression Analyses Structured?
- Dangers of using an Overall Regression Method to Assess Pay Equity
- Consider Practical as Well as Statistical Significance
- How Well does the Regression Model Fit the Data?
- Tainted Variables
- Common Root Causes of Compensation Disparities
- Artificial Pay Differences
- Employment Policies and Practices
- Summary
Part IV: Employee Effectiveness
- Chapter 11: Conducting Training Evaluation
- Overview of Training Evaluation
- A Five-Step Model of Training Evaluation
- Step 1: Identify Training Objectives
- What are Training Objectives and why do we Need them?
- Three Components of Training Objectives
- Writing Training Objectives
- Step 2: Develop Evaluation Criteria
- Importance of the Criteria
- Kirkpatrick's Levels
- Additional Evaluation Criteria
- Matching Criteria to Training Objectives
- Step 3: Select an Evaluation Design
- Classical Experimental Designs
- Alternative Designs
- Selecting an Optimum Design
- Step 4: Assess Change Due to Training
- An Illustrative Example
- Choosing an Analytic Strategy
- Step 5: Perform a Utility Analysis
- Calculating Training Program Costs
- Calculating Program Benefits
- Calculating the Utility of a Training Program
- Summary and Conclusions
- Chapter 12: Succession Management
- What is Succession Management?
- Methods for Evaluating Competencies
- Multisource (360-Degree) Feedback Surveys
- Acceleration Centers℠
- Providing Feedback to Pool Members
- Determining Appropriate Developmental Activities
- Role of the CEO
- Line Manager Involvement
- Identifying the Organizational Level to be the Target of the Succession Management Process and the Current and Future Requirements
- Selection Decisions
- Additional Considerations
- Evaluating Succession Management
- A Case Example
- Conclusion
- Chapter 13: A Practical Guide to Evaluating Coaching: Translating State-of-the-Art Techniques to the Real World
- Research on Coaching
- Challenges and Issues in Evaluating Coaching
- Purpose
- Design
- Return on Investment and the Impact of Coaching
- A Practical Guide to Evaluating Coaching
- Step 1: Lay the Foundation
- Step 2: Design the Process
- Recommendations
- Step 3: Implement the Process
- Step 4: Analyze the Data
- Step 5: Present the Findings
- Final Comments
Part V: Team and Organizational Effectiveness
- Chapter 14: Team Performance
- Framework for the Chapter
- Team Designs are Not Panaceas
- Performance Evaluation as a General Process
- Measurement Framework for Understanding Team Performance
- Getting Started: How to Develop Team Performance Measures
- Five-Step Process for Developing Team Performance Measures
- Step 1: Review Existing Organizational Measures
- Step 2: Define Team Measurement Factors
- Step 3: Identify and Weight Team Member Activities
- Step 4: Develop Team Performance Measures and Standards
- Step 5: Create a Feedback System
- Sources of Measurement in Teams
- The Future of Team Performance Evaluation
- Conclusions
- Chapter 15: The Evaluation of Job Redesign Processes
- Five Principles of Job Redesign Evaluation
- Principle 1: Job Redesign and its Evaluation must be Understood from a Systems Perspective
- The Work Organization as a Systems Component
- The Worker as a Systems Component
- The Work as a Systems Component
- Principle 2: The Worker is the Most Significant Factor in Effective Job Redesign
- Principle 3: Job Redesign and its Evaluation are Continuous Processes
- Principle 4: A Realistic and Practical Understanding of the Work System is Needed to Effectively Use Evaluation Results
- Principle 5: Conditions before and during the Job Redesign must be Considered in Evaluation
- Worker Criteria for the Evaluation of Job Redesign Programs
- Adequate Discretion in Decision Making
- Opportunity to Learn on the Job and Keep on Learning
- Job Variety
- Mutual Support and Respect
- Experienced Meaningfulness of the Work
- A Desirable Future for the Worker
- Management Criteria for the Evaluation of Job Redesign Programs
- Reduction of Bottlenecks and Production Problems
- Improvement of Work Team Functioning
- Compliance with Government Laws and Regulations
- The Summative Evaluation of Job Redesign
- Bringing together Worker and Management Criteria in Successful Job Redesign
- Conclusions
- Chapter 16: Organization Development
- Overview of Organization Development
- A Process for Evaluating OD Interventions
- Scoping
- Purpose of the Evaluation
- The Role of Evaluator and Key Stakeholders
- Timing of the Evaluation
- Designing
- Determining the Level of Impact to Evaluate
- Identifying the Evaluation Methods
- Deciding on Data Source and Level of Detail
- Collecting and Analyzing Data
- Working with International Populations
- Ensuring Collection of the Right Amount of Data
- Communicating
- Telling a Compelling Story
- Maintaining Balance and Integrity
- Understanding Reactions to Feedback
- Several Case Examples
- Case 1: Formative Evaluation Feedback Saves the Day
- Case 2: A Case of Poor Scoping
- Case 3: Showing that Survey Action Planning Really Works
- Conclusion
- Chapter 17: Evaluating Diversity Programs
- Evaluating Diversity Programs: Barriers and Benefits
- Barriers: Reasons Diversity Programs might not be Evaluated
- Superficial Commitment to Diversity
- Ignorance is Bliss: Fear of what might be Learned
- Impact, Cost, and Time Involved in Evaluation
- Benefits: Reasons Diversity Programs should be Evaluated
- Determines Impact, Detects Deficiencies, and Identifies Areas for Improvement
- Signals Commitment
- Fends off the Critics
- Evaluating Diversity Programs: A Six-Step Plan
- Step 1. Form the Diversity Evaluation Team
- Internal versus External Evaluators
- Step 2. Develop the Evaluation Plan and Measures of Success
- Developing the Evaluation Plan
- Identifying Measures of Success
- Step 3. Obtain Commitment from Organizational Leaders
- Step 4. Gather Data
- Policy and Procedure Documents
- Demographic Breakouts Showing Trends over Time
- Survey Findings
- Individual and Focus Group Interviews
- Naturalistic Observations
- Best Practices from Organizations that are Recognized as Leaders in Diversity
- Step 5. Analyze Evaluation Data
- Step 6. Prepare an Evaluation Report with Action Plan
- Develop Presentation and Evaluation Report
- Develop and Implement Action Plan
- Summary and Conclusions
Part VI: Organizational Communications
- Chapter 18: Evaluating Organizational Survey Programs
- Methods for Gathering Evaluation Data
- Reviewing Archival Records
- Interviewing Stakeholders
- Evaluators of the Survey Program
- Internal Survey Staff
- Consortia
- External Experts
- Organization Leaders
- Rank-and-File Employees
- Criteria for Judging Survey Program Quality
- Qualifications of the Survey Staff
- Questionnaire Quality
- Bad Items
- Inadvertent Mistakes
- Respondent Inquiries and Concerns
- International Concerns
- Generalizability of Survey Findings
- Response Rates
- Precision of Findings
- Data Analysis and Presentation of Findings
- Analyses and Statistics
- Presenting Findings
- Benchmarking and Best Practices
- Decisions and Changes Linked to Survey Findings
- Timeliness and Cost
- Rapidity with which a Survey can be Conducted
- Cost
- Summary and Conclusions
- Chapter 19: A Practical Guide to Evaluating Computer-Enabled Communications
- Dimensions of Communication Technologies
- Dimensions
- Groupware
- Evaluating Corporate Needs
- Strategies for Selecting among a Set of Alternatives
- Compensatory Model
- Noncompensatory Model
- Application of the Models
- Results of the Compensatory Model
- Results of the Noncompensatory Model
- Which Model to Use?
- Prevalent Communication Technologies
- Videoconferencing
- Discussion Groups
- Technology-Based Training
- Instant Messaging
- Electronic Mail (E-mail)
- Corporate Web Sites
- Computer-Enabled Communication: Impact and Policies
- Evaluating Corporate Communications Policies
- Acceptable Use Policies
- Netiquette
- Policies regarding the Monitoring of Communications
- Conclusion
- Chapter 20: Customer Service Programs
- The Role of Human Resources in Customer Service
- Identifying Stakeholders (Who)
- Evaluators of the Customer Service Program
- Staff Departments
- Line Departments
- External Groups
- Working with the Stakeholders
- Selecting the Evaluation Criteria (What)
- Internal Customer Measures
- Performance Management and Performance Discipline Data
- Employee Attitudes
- Medical Incidents
- External Customer Measures
- Wallet Expansion
- Customer Retention
- Referrals
- Requests for Rework and Complaints
- Customer Attitudes
- Linking HR Programs with Customer Service Outcomes (Why)
- Summary
Part VII: Health and Work/Life Balance
- Chapter 21: Health and Safety Training Programs
- A Systems Approach to Health and Safety Training
- Assessing Training Needs and the Regulatory Nature of Health and Safety Training
- Developing Instructional Objectives
- Selecting and Designing Training Course Content and Delivering Training
- Enhanced Work Planning and Continuous Improvement through Training Program Evaluation
- Measures of Health and Safety Training Program Effectiveness
- Guidelines for Assessing On-the-Job Behavior (STEP-3) Associated with Health and Safety Training
- Planning an Evaluation of On-the-Job Behavior
- Developing and Administering New Training Program Evaluation Forms
- Analyzing Data, following up with Participants, and Reporting Results
- Issues Concerning the Transfer of Health and Safety Training
- Conclusion
- Chapter 22: Work/Life Balance Policies and Programs
- Why Evaluate Work/Life Policies and Programs?
- Historical Overview
- Work/Family Focus on Child Care (1970s–1980s)
- Broad Work/Life Focus (1980s–1990s)
- Work/Life Business Imperative (Late 1990s to the Present)
- Evaluating Work/Life Policies and Programs
- Step 1: Identify Objectives
- Step 2: Determine Methods
- Step 3: Gather and Analyze the Data
- Step 4: Link Analysis to Bottom Line Measures
- Step 5: Make Recommendations Based on the Work/Life Evaluation
- Summary
Part VIII: Issues Spanning Human Resources Programs
- Chapter 23: Evaluation of Human Resource Information Systems
- Brief Historical Overview of HRISs
- Primary Research Strategies for Evaluating an HRIS
- Assessors who can Conduct HRIS Evaluations
- Internal Groups
- External HRIS Experts
- Consortium Participation
- Criteria for Judging HRIS Quality
- Financial Criteria
- Human Infrastructure
- Technical Quality of System Functioning
- Reactions
- Value-Added Functions
- Benchmarking and Best Practices Criteria
- Integrating Criteria and Reporting Evaluation Results
- Chapter 24: Global Human Resource Metrics
- Talentship: A Decision Science for HR
- A Strategic Approach to the Measurement of Global HR
- A Model for Global HR Metrics
- External Factors Influencing Global HR Metrics
- Organizational Factors Influencing HR in MNEs
- Linking Elements
- Impact
- Effectiveness
- Efficiency
- Outcomes: MNE Concerns and Goals
- Summary and Conclusions
- Chapter 25: Strategic Planning for Human Resources
- Key Strategic Planning Issues for HR
- The Strategic Planning–HR Interface
- Strategic Planning Orientations
- Business Paramount
- Corporate Command
- Corporate Strategies
- Strategic Networks
- The Strategic Management Process
- HR Roles in the Strategic Management Process
- Role 1: Implementation of Corporate and Business Strategies and HR Program Development
- Implementation Roles in HR
- Testing the Implementation Models
- Program Development Models
- Role 2: HR Strategies
- Corporate Development Stages Model
- Role 3: HR Participation in Change Management
- Role 4: HR Participation in Acquisitions and Mergers
- Evaluation of HR Strategy
- Self-Audit Questionnaire for HR Strategy
- HR Strategy Benchmarking
- Achievement of Expected Values of Performance
- Conclusions
Back Matter