Advanced Software Testing - Vol. 3, 2nd Edition

Book Description

This book is for the technical test analyst who wants to achieve advanced skills in test analysis, design, and execution. With an exercise-rich approach, it teaches you how to define and carry out the tasks required to implement a test strategy. You will learn to analyze, design, and execute tests, using risk considerations to determine the appropriate effort and priority for each test.

The ISTQB certification program is the leading software tester certification program in the world, and this book will prepare you for the ISTQB Advanced Technical Test Analyst exam. Sample exam questions are included for the learning objectives covered by the latest (2012) ISTQB Advanced Level syllabus.

With over thirty years of software and systems engineering experience, author Rex Black is President of RBCS, a leader in software, hardware, and systems testing, and the most prolific author practicing in the field of software testing today. Previously, he served as President of both the International and American Software Testing Qualifications Boards (ISTQB and ASTQB).

Jamie Mitchell is a consultant who has worked in software testing, test automation, and development for over 20 years. He was a member of the Technical Advisory Group for ASTQB and one of the primary authors of the ISTQB Advanced Technical Test Analyst 2012 syllabus.

Table of Contents

  1. Cover Page
  2. Title Page
  3. Copyright Page
  4. Table of Contents
  5. Jamie Mitchell’s Acknowledgements
  6. Rex Black’s Acknowledgements
  7. Introduction
  8. 1 The Technical Test Analyst’s Tasks in Risk-Based Testing
    1. 1.1 Introduction
    2. 1.2 Risk Identification
    3. 1.3 Risk Assessment
    4. 1.4 Risk Mitigation or Risk Control
    5. 1.5 An Example of Risk Identification and Assessment Results
    6. 1.6 Risk-Aware Testing Standard
    7. 1.7 Sample Exam Questions
  9. 2 Structure-Based Testing
    1. 2.1 Introduction
      1. 2.1.1 Control Flow Testing Theory
      2. 2.1.2 Building Control Flow Graphs
      3. 2.1.3 Statement Coverage
      4. 2.1.4 Decision Coverage
      5. 2.1.5 Loop Coverage
      6. 2.1.6 Hexadecimal Converter Exercise
      7. 2.1.7 Hexadecimal Converter Exercise Debrief
    2. 2.2 Condition Coverage
    3. 2.3 Decision Condition Coverage
    4. 2.4 Modified Condition/Decision Coverage (MC/DC)
      1. 2.4.1 Complicating Issues: Short-Circuiting
      2. 2.4.2 Complicating Issues: Coupling
    5. 2.5 Multiple Condition Coverage
      1. 2.5.1 Control Flow Exercise
      2. 2.5.2 Control Flow Exercise Debrief
        1. 1. Determine the tests required for decision condition coverage.
        2. 2. Determine the total number of tests required to achieve MC/DC coverage (assuming no short-circuiting) and define them.
        3. 3. Determine the total number of tests needed for multiple condition coverage.
        4. 4. If the compiler is set to short-circuit, which of those tests are actually needed?
    6. 2.6 Path Testing
      1. 2.6.1 Path Testing via Flow Graphs
      2. 2.6.2 Basis Path Testing
      3. 2.6.3 Cyclomatic Complexity Exercise
      4. 2.6.4 Cyclomatic Complexity Exercise Debrief
    7. 2.7 API Testing
    8. 2.8 Selecting a Structure-Based Technique
      1. Structure-Based Testing Exercise
      2. 2.8.1 Structure-Based Testing Exercise Debrief
        1. 1. How many test cases are needed for basis path coverage?
        2. 2. If we wanted to test this module to the level of multiple condition coverage (ignoring the possibility of short-circuiting), how many test cases would we need?
        3. 3. If this code were in a system that was subject to FAA/DO178C and was rated at Level A criticality, how many test cases would be needed for the first if() statement alone?
        4. 4. To achieve only statement coverage, how many test cases would be needed?
    9. 2.9 A Final Word on Structural Testing
    10. 2.10 Sample Exam Questions
  10. 3 Analytical Techniques
    1. 3.1 Introduction
    2. 3.2 Static Analysis
      1. 3.2.1 Control Flow Analysis
      2. 3.2.2 Data Flow Analysis
        1. 3.2.2.1 Define-Use Pairs
        2. 3.2.2.2 Define-Use Pair Example
        3. 3.2.2.3 Data Flow Exercise
        4. 3.2.2.4 Data Flow Exercise Debrief
        5. 3.2.2.5 A Data Flow Strategy
      3. 3.2.3 Static Analysis to Improve Maintainability
        1. 3.2.3.1 Code Parsing Tools
        2. 3.2.3.2 Standards and Guidelines
      4. 3.2.4 Call Graphs
        1. 3.2.4.1 Call-Graph-Based Integration Testing
        2. 3.2.4.2 McCabe’s Design Predicate Approach to Integration
        3. 3.2.4.3 Hex Converter Example
        4. 3.2.4.4 McCabe Design Predicate Exercise
        5. 3.2.4.5 McCabe Design Predicate Exercise Debrief
    3. 3.3 Dynamic Analysis
      1. 3.3.1 Memory Leak Detection
      2. 3.3.2 Wild Pointer Detection
      3. 3.3.3 Dynamic Analysis Exercise
      4. 3.3.4 Dynamic Analysis Exercise Debrief
    4. 3.4 Sample Exam Questions
  11. 4 Quality Characteristics for Technical Testing
    1. 4.1 Introduction
    2. 4.2 Security Testing
      1. 4.2.1 Security Issues
        1. 4.2.1.1 Piracy
        2. 4.2.1.2 Buffer Overflow
        3. 4.2.1.3 Denial of Service
        4. 4.2.1.4 Data Transfer Interception
        5. 4.2.1.5 Breaking Encryption
        6. 4.2.1.6 Logic Bombs/Viruses/Worms
        7. 4.2.1.7 Cross-Site Scripting
        8. 4.2.1.8 Timely Information
        9. 4.2.1.9 Internal Security Metrics
        10. 4.2.1.10 External Security Metrics
        11. 4.2.1.11 Exercise: Security
        12. 4.2.1.12 Exercise: Security Debrief
    3. 4.3 Reliability Testing
      1. 4.3.1 Maturity
        1. 4.3.1.1 Internal Maturity Metrics
        2. 4.3.1.2 External Maturity Metrics
      2. 4.3.2 Fault Tolerance
        1. 4.3.2.1 Internal Fault Tolerance Metrics
        2. 4.3.2.2 External Fault Tolerance Metrics
      3. 4.3.3 Recoverability
        1. 4.3.3.1 Internal Recoverability Metrics
        2. 4.3.3.2 External Recoverability Metrics
      4. 4.3.4 Compliance
        1. 4.3.4.1 Internal Compliance Metrics
        2. 4.3.4.2 External Compliance Metrics
      5. 4.3.5 An Example of Good Reliability Testing
      6. 4.3.6 Exercise: Reliability Testing
      7. 4.3.7 Exercise: Reliability Testing Debrief
    4. 4.4 Efficiency Testing
      1. 4.4.1 Multiple Flavors of Efficiency Testing
      2. 4.4.2 Modeling the System
        1. 4.4.2.1 Identify the Test Environment
        2. 4.4.2.2 Identify the Performance Acceptance Criteria
        3. 4.4.2.3 Plan and Design Tests
        4. 4.4.2.4 Configure the Test Environment
        5. 4.4.2.5 Implement the Test Design
        6. 4.4.2.6 Execute the Test
        7. 4.4.2.7 Analyze the Results, Tune and Retest
      3. 4.4.3 Time Behavior
        1. 4.4.3.1 Internal Time Behavior Metrics
        2. 4.4.3.2 External Time Behavior Metrics
      4. 4.4.4 Resource Utilization
        1. 4.4.4.1 Internal Resource Utilization Metrics
        2. 4.4.4.2 External Resource Utilization Metrics
      5. 4.4.5 Compliance
        1. 4.4.5.1 Internal Compliance Metric
        2. 4.4.5.2 External Compliance Metric
      6. 4.4.6 Exercise: Efficiency Testing
      7. 4.4.7 Exercise: Efficiency Testing Debrief
    5. 4.5 Maintainability Testing
      1. 4.5.1 Analyzability
        1. 4.5.1.1 Internal Analyzability Metrics
        2. 4.5.1.2 External Analyzability Metrics
      2. 4.5.2 Changeability
        1. 4.5.2.1 Internal Changeability Metrics
        2. 4.5.2.2 External Changeability Metrics
      3. 4.5.3 Stability
        1. 4.5.3.1 Internal Stability Metrics
        2. 4.5.3.2 External Stability Metrics
      4. 4.5.4 Testability
        1. 4.5.4.1 Internal Testability Metrics
        2. 4.5.4.2 External Testability Metrics
      5. 4.5.5 Compliance
        1. 4.5.5.1 Internal Compliance Metric
        2. 4.5.5.2 External Compliance Metric
      6. 4.5.6 Exercise: Maintainability Testing
      7. 4.5.7 Exercise: Maintainability Testing Debrief
    6. 4.6 Portability Testing
      1. 4.6.1 Adaptability
        1. 4.6.1.1 Internal Adaptability Metrics
        2. 4.6.1.2 External Adaptability Metrics
      2. 4.6.2 Replaceability
        1. 4.6.2.1 Internal Replaceability Metrics
        2. 4.6.2.2 External Replaceability Metrics
      3. 4.6.3 Installability
        1. 4.6.3.1 Internal Installability Metrics
        2. 4.6.3.2 External Installability Metrics
      4. 4.6.4 Coexistence
        1. 4.6.4.1 Internal Coexistence Metrics
        2. 4.6.4.2 External Coexistence Metrics
      5. 4.6.5 Compliance
        1. 4.6.5.1 Internal Compliance Metrics
        2. 4.6.5.2 External Compliance Metrics
      6. 4.6.6 Exercise: Portability Testing
      7. 4.6.7 Exercise: Portability Testing Debrief
    7. 4.7 General Planning Issues
    8. 4.8 Sample Exam Questions
  12. 5 Reviews
    1. 5.1 Introduction
    2. 5.2 Using Checklists in Reviews
      1. 5.2.1 Some General Checklist Items for Design and Architecture Reviews
      2. 5.2.2 Deutsch’s Design Review Checklist
      3. 5.2.3 Some General Checklist Items for Code Reviews
      4. 5.2.4 Marick’s Code Review Checklist
      5. 5.2.5 The Open Laszlo Code Review Checklist
    3. 5.3 Deutsch Checklist Review Exercise
    4. 5.4 Deutsch Checklist Review Exercise Debrief
    5. 5.5 Code Review Exercise
    6. 5.6 Code Review Exercise Debrief
    7. 5.7 Sample Exam Questions
  13. 6 Test Tools and Automation
    1. 6.1 Integration and Information Interchange between Tools
    2. 6.2 Defining the Test Automation Project
      1. 6.2.1 Preparing for a Test Automation Project
      2. 6.2.2 Why Automation Projects Fail
      3. 6.2.3 Automation Architectures (Data Driven vs. Keyword Driven)
        1. 6.2.3.1 The Capture/Replay Architecture
        2. 6.2.3.2 Evolving from Capture/Replay
        3. 6.2.3.3 The Simple Framework Architecture
        4. 6.2.3.4 The Data-Driven Architecture
        5. 6.2.3.5 The Keyword-Driven Architecture
      4. 6.2.4 Creating Keyword-Driven Tables
        1. 6.2.4.1 Building an Intelligent Front End
        2. 6.2.4.2 Keywords and the Virtual Machine
        3. 6.2.4.3 Benefits of the Keyword Architecture
        4. 6.2.4.4 Creating Keyword-Driven Tables Exercise
        5. 6.2.4.5 Keyword-Driven Exercise Debrief
    3. 6.3 Specific Test Tools
      1. 6.3.1 Fault Seeding and Fault Injection Tools
      2. 6.3.2 Performance Testing and Monitoring Tools
        1. 6.3.2.1 Data for Performance Testing
        2. 6.3.2.2 Building Scripts
        3. 6.3.2.3 Measurement Tools
        4. 6.3.2.4 Performing the Testing
        5. 6.3.2.5 Performance Testing Exercise
        6. 6.3.2.6 Performance Testing Exercise Debrief
      3. 6.3.3 Tools for Web Testing
      4. 6.3.4 Model-Based Testing Tools
      5. 6.3.5 Tools to Support Component Testing and Build Process
    4. 6.4 Sample Exam Questions
  14. 7 Preparing for the Exam
    1. Learning Objectives
  15. Appendix
    1. Bibliography
      1. Advanced Syllabus Referenced Standards
      2. Advanced Syllabus Referenced Books
      3. Advanced Syllabus Other References
      4. Other Referenced Books
      5. Other References
      6. Referenced Standards
    2. HELLOCARMS The Next Generation of Home Equity Lending
      1. I Table of Contents
      2. II Versioning
      3. III Glossary
      4. 000 Introduction
      5. 001 Informal Use Case
      6. 003 Scope
      7. 004 System Business Benefits
      8. 010 Functional System Requirements
      9. 020 Reliability System Requirements
      10. 030 Usability System Requirements
      11. 040 Efficiency System Requirements
      12. 050 Maintainability System Requirements
      13. 060 Portability System Requirements
      14. A Acknowledgement
  16. Answers to Sample Questions
  17. Footnotes
    1. 1 The Technical Test Analyst’s Tasks in Risk-Based Testing
    2. 2 Structure-Based Testing
    3. 3 Analytical Techniques
    4. 4 Quality Characteristics for Technical Testing
    5. 5 Reviews
    6. 6 Test Tools and Automation