Evaluation in Organizations: A Systematic Approach to Enhancing Learning, Performance, and Change

Book Description

From new product launches to large-scale training initiatives, organizations need the tools to measure the effectiveness of their programs, processes, and systems. In Evaluation in Organizations, learning theory experts Darlene Russ-Eft and Hallie Preskill integrate the most current research with practical applications to provide a fully revised new edition of this essential resource for managers, human resource professionals, students, and teachers.

Table of Contents

  1. Title Page
  2. Preface
  3. Acknowledgements
  4. Chapter 1 - Defining Evaluation
    1. What Evaluation Is and Isn’t
    2. The Relationship Between Evaluation and Research
    3. What Can Be Evaluated?
    4. Why Evaluate?
    5. Top Ten Reasons Evaluation Is Neglected
    6. Kinds of Evaluations
    7. The Logic of Evaluation
    8. Evaluation Use
    9. Internal and External Evaluation
    10. Keep in Mind . . .
  5. Chapter 2 - The Evolution of Evaluation
    1. The Role of Evaluation in Today’s Organizations
    2. The (Really) Early Days of Evaluation
    3. Types of Evaluation Models and Approaches
    4. Conducting Cross-Culturally Competent Evaluation
    5. Keep in Mind . . .
  6. Chapter 3 - Evaluating Learning, Performance, and Change Initiatives
    1. The Role of the Learning, Performance, and Change Professional
    2. General Models, Approaches, and Taxonomies for Evaluating Training and Performance
    3. Evaluation Models Focused on Training Transfer
    4. Research on Training Program Evaluation Models
    5. Future Directions for Evaluating Learning, Performance, and Change
    6. Keep in Mind . . .
  7. Chapter 4 - The Politics and Ethics of Evaluation Practice
    1. Vignette 1: Evaluation as a Political Act in Worldwide Learning, Inc.
    2. Ways in Which Evaluation Becomes Political
    3. Political Influences During an Evaluation
    4. The Politics and Ethics of Evaluation Practice
    5. Examples of Evaluation Politics
    6. Strategies for Managing the Politics of Evaluation
    7. The Ethics of Evaluation Practice
    8. Scenario 4.1
    9. Scenario 4.2
    10. Keep in Mind . . .
  8. Chapter 5 - Focusing the Evaluation
    1. Developing an Evaluation Plan
    2. Developing the Evaluation’s Rationale and Purpose
    3. Developing a Program Logic Model
    4. Developing the Evaluation’s Purpose Statement
    5. Identifying the Evaluation’s Stakeholders
    6. Developing Key Evaluation Questions
    7. Keep in Mind . . .
  9. Chapter 6 - Selecting an Evaluation Design
    1. Basic Design Issues
    2. Commonly Used Evaluation Designs
    3. One-Shot Design
    4. Keep in Mind . . .
  10. Chapter 7 - Choosing Data Collection Methods
    1. Menu of Data Collection Methods
    2. Considerations for Choosing Data Collection Methods
    3. Using Multiple Data Collection Methods
    4. Keep in Mind . . .
  11. Chapter 8 - Archival Data
    1. Records
    2. Documents
    3. Existing Databases
    4. Using or Developing an Evaluation Database
    5. Some Guidelines for Using Records, Documents, and Databases
    6. Examples
    7. Advantages and Disadvantages of Collecting Archival Data
    8. Keep in Mind . . .
  12. Chapter 9 - Observation
    1. Using Methods of Observation
    2. Observer Roles
    3. Collecting and Recording Observation Data
    4. Conducting Observations
    5. Advantages and Disadvantages of Collecting Observation Data
    6. Keep in Mind . . .
  13. Chapter 10 - Surveys and Questionnaires
    1. .
    2. Types of Surveys
    3. Guidelines for Constructing Surveys
    4. Format Considerations
    5. Online and Web-Based Surveys
    6. Pilot Testing the Items and Surveys
    7. Summary of Steps in Survey Construction
    8. Logistics
    9. Handling Nonresponse Bias
    10. Managing the Data Collection Process
    11. Keep in Mind . . .
  14. Chapter 11 - Individual and Focus Group Interviews
    1. Types of Interviews
    2. Determining Which Interview Approach to Use
    3. Advantages of Individual and Focus Group Interviews
    4. Disadvantages of Individual and Focus Group Interviews
    5. Guidelines for Constructing Individual and Focus Group Interview Guides
    6. Guidelines for Conducting Individual and Focus Group Interviews
    7. The Interviewer’s Role
    8. Selecting and Training Interviewers
    9. Managing the Interview Process
    10. Computer-Aided Interviewing
    11. Keep in Mind . . .
  15. Chapter 12 - Sampling
    1. Why Sample?
    2. Approaches to Sampling
    3. Sampling Procedures
    4. Issues to Consider When Selecting a Sample
    5. Keep in Mind . . .
  16. Chapter 13 - Analyzing Evaluation Data
    1. Basic Considerations for Analyzing Data
    2. Approaches to Qualitative Data Analysis
    3. Approaches to Quantitative Data Analysis
    4. Economic Analyses of Learning, Performance, and Change Interventions
    5. Potential Problems in Analyzing Data
    6. Keep in Mind . . .
  17. Chapter 14 - Communicating and Reporting Evaluation Activities and Findings
    1. Purposes of Communicating and Reporting
    2. Audiences for Communicating and Reporting
    3. Timing of Communicating and Reporting
    4. Contents of Communications and Reports
    5. Formats for Communicating and Reporting
    6. Returning to the Case of MyFuture Unlimited
    7. Keep in Mind . . .
  18. Chapter 15 - Planning, Managing, and Budgeting the Evaluation
    1. Planning the Evaluation
    2. Managing the Evaluation
    3. Managing the Risks
    4. Developing an Evaluation Budget
    5. .
  19. Chapter 16 - Evaluating the Evaluation
    1. Focusing the Evaluation
    2. Implementing the Evaluation
    3. Managing the Evaluation
    4. Communicating and Reporting the Evaluation’s Activities and Findings
    5. Keep in Mind . . .
  20. Chapter 17 - Strategies for Implementing Evaluation in Organizations
    1. Gaining Commitment and Support for Evaluation Work
    2. Involving Stakeholders
    3. Understanding the Evaluation Context
    4. Engaging in Evaluation Professional Development
    5. Choosing an Evaluator Role
    6. Building Evaluation Capacity in Organizations
    7. Concluding Thoughts
  21. Appendix A - The Readiness for Organizational Learning and Evaluation ...
  22. References
  23. Index
  24. About the Authors
  25. Copyright Page