Performance Evaluation: Proven Approaches for Improving Program and Organizational Performance

Book Description

Performance Evaluation is a hands-on text that shows practitioners, researchers, educators, and students how to conduct scientifically based evaluations that are both rigorous and flexible. Author Ingrid Guerra-López, an internationally known evaluation expert, introduces the foundations of evaluation and presents the models most applicable to the performance improvement field. Her book offers a wide variety of proven tools and techniques and is organized to illustrate evaluation in the context of continual performance improvement.

Table of Contents

  1. Cover Page
  2. Title Page
  3. Copyright
  4. CONTENTS
  5. ACKNOWLEDGMENTS
  6. PREFACE
  7. THE AUTHOR
  8. PART 1: INTRODUCTION TO EVALUATION
  9. CHAPTER 1: FOUNDATIONS OF EVALUATION
    1. A BRIEF OVERVIEW OF EVALUATION HISTORY
    2. EVALUATION: PURPOSE AND DEFINITION
    3. PERFORMANCE IMPROVEMENT: A CONCEPTUAL FRAMEWORK
    4. MAKING EVALUATION HAPPEN: ENSURING STAKEHOLDERS' BUY-IN
    5. THE EVALUATOR: A JOB OR A ROLE?
    6. THE RELATIONSHIP TO OTHER INVESTIGATIVE PROCESSES
    7. WHEN DOES EVALUATION OCCUR?
    8. GENERAL EVALUATION ORIENTATIONS
    9. CHALLENGES THAT EVALUATORS FACE
    10. ENSURING COMMITMENT
    11. BENEFITS OF EVALUATION
    12. BASIC DEFINITIONS
    13. KEY POINTS
    14. REFLECTION QUESTIONS
  10. CHAPTER 2: PRINCIPLES OF PERFORMANCE-BASED EVALUATION
    1. PRINCIPLE 1: EVALUATION IS BASED ON ASKING THE RIGHT QUESTIONS
    2. PRINCIPLE 2: EVALUATION OF PROCESS IS A FUNCTION OF OBTAINED RESULTS
    3. PRINCIPLE 3: GOALS AND OBJECTIVES OF ORGANIZATIONS SHOULD BE BASED ON VALID NEEDS
    4. PRINCIPLE 4: DERIVE VALID NEEDS USING A TOP-DOWN APPROACH
    5. PRINCIPLE 5: EVERY ORGANIZATION SHOULD AIM FOR THE BEST THAT SOCIETY CAN ATTAIN
    6. PRINCIPLE 6: THE SET OF EVALUATION QUESTIONS DRIVES THE EVALUATION STUDY
    7. KEY POINTS
    8. REFLECTION QUESTIONS
  11. PART 2: MODELS OF EVALUATION
  12. CHAPTER 3: OVERVIEW OF EXISTING EVALUATION MODELS
    1. OVERVIEW OF CLASSIC EVALUATION MODELS
    2. SELECTED EVALUATION MODELS
    3. SELECTING A MODEL
    4. CONCEPTUALIZING A USEFUL EVALUATION THAT FITS THE SITUATION
    5. KEY POINTS
    6. REFLECTION QUESTIONS
  13. CHAPTER 4: KIRKPATRICK'S FOUR LEVELS OF EVALUATION
    1. KIRKPATRICK'S LEVELS
    2. COMMENTS ON THE MODEL
    3. STRENGTHS AND LIMITATIONS
    4. APPLICATION EXAMPLE: WAGNER (1995)
    5. KEY POINTS
    6. REFLECTION QUESTIONS
  14. CHAPTER 5: PHILLIPS'S RETURN-ON-INVESTMENT METHODOLOGY
    1. PHILLIPS'S ROI PROCESS
    2. COMMENTS ON THE MODEL
    3. STRENGTHS AND LIMITATIONS
    4. APPLICATION EXAMPLE: BLAKE (1999)
    5. KEY POINTS
    6. REFLECTION QUESTIONS
  15. CHAPTER 6: BRINKERHOFF'S SUCCESS CASE METHOD
    1. THE SCM PROCESS
    2. STRENGTHS AND WEAKNESSES
    3. APPLICATION EXAMPLE: BRINKERHOFF (2005)
    4. KEY POINTS
    5. REFLECTION QUESTIONS
  16. CHAPTER 7: THE IMPACT EVALUATION PROCESS
    1. THE ELEMENTS OF THE PROCESS
    2. COMMENTS ON THE MODEL
    3. STRENGTHS AND LIMITATIONS
    4. APPLICATION EXAMPLE
    5. KEY POINTS
    6. REFLECTION QUESTIONS
  17. CHAPTER 8: THE CIPP MODEL
    1. STUFFLEBEAM'S FOUR TYPES OF EVALUATION
    2. ARTICULATING CORE VALUES OF PROGRAMS AND SOLUTIONS
    3. METHODS USED IN CIPP EVALUATIONS
    4. STRENGTHS AND LIMITATIONS
    5. APPLICATION EXAMPLE: FILELLA-GUIU AND BLANCH-PANA (2002)
    6. KEY POINTS
    7. REFLECTION QUESTIONS
  18. CHAPTER 9: EVALUATING EVALUATIONS
    1. EVALUATION STANDARDS
    2. THE AMERICAN EVALUATION ASSOCIATION PRINCIPLES FOR EVALUATORS
    3. APPLICATION EXAMPLE: LYNCH ET AL. (2003)
    4. KEY POINTS
    5. REFLECTION QUESTIONS
  19. PART 3: TOOLS AND TECHNIQUES OF EVALUATION
  20. CHAPTER 10: DATA
    1. CHARACTERISTICS OF DATA
    2. SCALES OF MEASUREMENT
    3. DEFINING REQUIRED DATA FROM PERFORMANCE OBJECTIVES
    4. DERIVING MEASURABLE INDICATORS
    5. Results
    6. Indicators
    7. A Glance at Evaluation Findings
    8. FINDING DATA SOURCES
    9. FOLLOW-UP QUESTIONS AND DATA
    10. KEY POINTS
    11. REFLECTION QUESTIONS
  21. CHAPTER 11: DATA COLLECTION
    1. OBSERVATION METHODOLOGY AND THE PURPOSE OF MEASUREMENT
    2. DESIGNING THE EXPERIMENT
    3. PROBLEMS WITH CLASSIC EXPERIMENTAL STUDIES IN APPLIED SETTINGS
    4. TIME-SERIES STUDIES
    5. SIMULATIONS AND GAMES
    6. DOCUMENT-CENTERED METHODS
    7. CONCLUSION
    8. KEY POINTS
    9. REFLECTION QUESTIONS
  22. CHAPTER 12: ANALYSIS OF EVALUATION DATA
    1. ANALYSIS OF MODELS AND PATTERNS
    2. ANALYSIS USING STRUCTURED DISCUSSION
    3. METHODS OF QUANTITATIVE ANALYSIS
    4. STATISTICS
    5. GRAPHICAL REPRESENTATIONS OF DATA
    6. MEASURES OF RELATIONSHIP
    7. INFERENTIAL STATISTICS: PARAMETRIC AND NONPARAMETRIC
    8. INTERPRETATION
    9. KEY POINTS
    10. REFLECTION QUESTIONS
  23. CHAPTER 13: COMMUNICATING THE FINDINGS
    1. RECOMMENDATIONS
    2. CONSIDERATIONS FOR IMPLEMENTING RECOMMENDATIONS
    3. DEVELOPING THE REPORT
    4. THE EVALUATOR'S ROLE AFTER THE REPORT
    5. KEY POINTS
    6. REFLECTION QUESTIONS
  24. PART 4: CONTINUAL IMPROVEMENT
  25. CHAPTER 14: COMMON ERRORS IN EVALUATION
    1. ERRORS OF SYSTEM MAPPING
    2. ERRORS OF LOGIC
    3. ERRORS OF PROCEDURE
    4. CONCLUSION
    5. KEY POINTS
    6. REFLECTION QUESTIONS
  26. CHAPTER 15: CONTINUAL IMPROVEMENT
    1. WHAT IS CONTINUAL IMPROVEMENT?
    2. MONITORING PERFORMANCE
    3. ADJUSTING PERFORMANCE
    4. THE ROLE OF LEADERSHIP
    5. KEY POINTS
    6. REFLECTION QUESTIONS
  27. CHAPTER 16: CONTRACTING FOR EVALUATION SERVICES
    1. THE CONTRACT
    2. CONTRACTING CONTROLS
    3. ETHICS AND PROFESSIONALISM
    4. SAMPLE STATEMENT OF WORK
    5. General Information
    6. Contract Award Meeting
    7. General Requirements
    8. Mandatory Tasks and Associated Deliverables
    9. Schedule for Deliverables
    10. Changes to Statement of Work
    11. Reporting Requirements
    12. Travel and Site Visits
    13. Sellalot Corporation Responsibilities
    14. Contractor Experience Requirements
    15. Confidentiality and Nondisclosure
    16. KEY POINTS
    17. REFLECTION QUESTIONS
  28. CHAPTER 17: INTELLIGENCE GATHERING FOR DECISION MAKING
    1. PERFORMANCE MEASUREMENT SYSTEMS
    2. ISSUES IN PERFORMANCE MEASUREMENT SYSTEMS
    3. CONCLUSION
    4. KEY POINTS
    5. REFLECTION QUESTIONS
  29. CHAPTER 18: THE FUTURE OF EVALUATION IN PERFORMANCE IMPROVEMENT
    1. EVALUATION AND MEASUREMENT IN PERFORMANCE IMPROVEMENT TODAY
    2. WHAT DOES THE FUTURE HOLD?
    3. CONCLUSION
    4. KEY POINTS
    5. REFLECTION QUESTIONS
  30. REFERENCES AND RELATED READINGS
  31. INDEX