Software Design for Six Sigma: A Roadmap for Excellence

Book Description

This book presents a design algorithm that applies Design for Six Sigma thinking, tools, and philosophy to software design. The algorithm also includes conceptual design frameworks and mathematical derivations of Six Sigma capability up front, enabling design teams to discard concepts that are not capable early, thereby shortening the software development cycle and saving development costs.

The uniqueness of this book lies in bringing all of these methodologies under the umbrella of design and providing a detailed description of how these methods (QFD, DOE, robust design, FMEA, Design for X, Axiomatic Design, and TRIZ) can be used to improve quality in software development, what roles these methods play in the various stages of design, and how to combine them into a comprehensive strategy, a design algorithm, to tackle any quality issue in the design stage.

Table of Contents

  1. Copyright
  2. PREFACE
  3. ACKNOWLEDGMENTS
  4. 1. SOFTWARE QUALITY CONCEPTS
    1. 1.1. WHAT IS QUALITY
    2. 1.2. QUALITY, CUSTOMER NEEDS, AND FUNCTIONS
    3. 1.3. QUALITY, TIME TO MARKET, AND PRODUCTIVITY
    4. 1.4. QUALITY STANDARDS
    5. 1.5. SOFTWARE QUALITY ASSURANCE AND STRATEGIES
    6. 1.6. SOFTWARE QUALITY COST
    7. 1.7. SOFTWARE QUALITY MEASUREMENT
      1. 1.7.1. Understandability
      2. 1.7.2. Completeness
      3. 1.7.3. Conciseness
      4. 1.7.4. Portability
      5. 1.7.5. Consistency
      6. 1.7.6. Maintainability
      7. 1.7.7. Testability
      8. 1.7.8. Usability
      9. 1.7.9. Reliability
      10. 1.7.10. Structuredness
      11. 1.7.11. Efficiency
      12. 1.7.12. Security
    8. 1.8. SUMMARY
    9. 1.9. REFERENCES
  5. 2. TRADITIONAL SOFTWARE DEVELOPMENT PROCESSES
    1. 2.1. INTRODUCTION
    2. 2.2. WHY SOFTWARE DEVELOPMENTAL PROCESSES?
      1. 2.2.1. Categories of Software Developmental Process
    3. 2.3. SOFTWARE DEVELOPMENT PROCESSES
      1. 2.3.1. Different Software Process Methods in Practice
        1. 2.3.1.1. PSP and TSP.
        2. 2.3.1.2. Waterfall Process
          1. 2.3.1.2.1. Advantage.
          2. 2.3.1.2.2. Disadvantage.
          3. 2.3.1.2.3. Suitability.
        3. 2.3.1.3. Sashimi Model.
          1. 2.3.1.3.1. Advantage.
          2. 2.3.1.3.2. Disadvantage.
          3. 2.3.1.3.3. Suitability.
        4. 2.3.1.4. V-Model.
          1. 2.3.1.4.1. Advantages
          2. 2.3.1.4.2. Disadvantages
          3. 2.3.1.4.3. Suitability.
        5. 2.3.1.5. V-Model XT.
          1. 2.3.1.5.1. Advantages
          2. 2.3.1.5.2. Disadvantages.
          3. 2.3.1.5.3. Suitability.
        6. 2.3.1.6. Spiral Model.
          1. 2.3.1.6.1. Advantages
          2. 2.3.1.6.2. Disadvantages
          3. 2.3.1.6.3. Suitability.
        7. 2.3.1.7. Chaos Model.
          1. 2.3.1.7.1. Advantages
          2. 2.3.1.7.2. Disadvantages
          3. 2.3.1.7.3. Suitability
        8. 2.3.1.8. Top-Down and Bottom-Up.
          1. 2.3.1.8.1. Advantages
          2. 2.3.1.8.2. Disadvantages
          3. 2.3.1.8.3. Suitability.
        9. 2.3.1.9. Joint Application Development (JAD).
          1. 2.3.1.9.1. Advantages
          2. 2.3.1.9.2. Disadvantages
          3. 2.3.1.9.3. Suitability.
        10. 2.3.1.10. Rapid Application Development (RAD).
          1. 2.3.1.10.1. Advantages
          2. 2.3.1.10.2. Disadvantages
          3. 2.3.1.10.3. Suitability.
          4. 2.3.1.10.4. Model-Driven Engineering (MDE).
          5. 2.3.1.10.5. Advantages
          6. 2.3.1.10.6. Disadvantages
          7. 2.3.1.10.7. Suitability
        11. 2.3.1.11. Iterative Development Processes.
          1. 2.3.1.11.1. Advantages
          2. 2.3.1.11.2. Disadvantages
          3. 2.3.1.11.3. Suitability
      2. 2.3.2. Agile Software Development
        1. 2.3.2.1.
          1. 2.3.2.1.1. Advantages (Stevens et al., 2007)
          2. 2.3.2.1.2. Disadvantages (Stevens et al., 2007)
          3. 2.3.2.1.3. Suitability
        2. 2.3.2.2. Unified Process.
          1. 2.3.2.2.1. Advantages
          2. 2.3.2.2.2. Disadvantages
          3. 2.3.2.2.3. Suitability
        3. 2.3.2.3. eXtreme Programming.
          1. 2.3.2.3.1. Advantages
          2. 2.3.2.3.2. Disadvantages
          3. 2.3.2.3.3. Suitability
        4. 2.3.2.4. Wheel and Spoke Model.
          1. 2.3.2.4.1. Advantages
          2. 2.3.2.4.2. Disadvantages
          3. 2.3.2.4.3. Suitability
        5. 2.3.2.5. Constructionist Design Methodology.
          1. 2.3.2.5.1. Advantages
          2. 2.3.2.5.2. Disadvantages
          3. 2.3.2.5.3. Suitability
    4. 2.4. SOFTWARE DEVELOPMENT PROCESSES CLASSIFICATION
    5. 2.5. SUMMARY
    6. 2.6. REFERENCES
  6. 3. DESIGN PROCESS OF REAL-TIME OPERATING SYSTEMS (RTOS)
    1. 3.1. INTRODUCTION
    2. 3.2. RTOS HARD VERSUS SOFT REAL-TIME SYSTEMS
      1. 3.2.1. Real Time versus General Purpose
    3. 3.3. RTOS DESIGN FEATURES
      1. 3.3.1. Memory Management
      2. 3.3.2. Peripheral Communication (Input / Output)
      3. 3.3.3. Task Management
    4. 3.4. TASK SCHEDULING: SCHEDULING ALGORITHMS
      1. 3.4.1. Interrupt-Driven Systems
      2. 3.4.2. Periodic versus Aperiodic Tasks
      3. 3.4.3. Preemption
      4. 3.4.4. Static Scheduling
      5. 3.4.5. Dynamic Scheduling
    5. 3.5. INTERTASK COMMUNICATION AND RESOURCE SHARING
      1. 3.5.1. Semaphores
    6. 3.6. TIMERS
      1. 3.6.1. Watchdog Timer
      2. 3.6.2. System Timer
    7. 3.7. CONCLUSION
    8. 3.8. REFERENCES
  7. 4. SOFTWARE DESIGN METHODS AND REPRESENTATIONS
    1. 4.1. INTRODUCTION
    2. 4.2. HISTORY OF SOFTWARE DESIGN METHODS
    3. 4.3. SOFTWARE DESIGN METHODS
      1. 4.3.1. Object-Oriented Design
      2. 4.3.2. Level-Oriented Design
      3. 4.3.3. Data Flow or Structured Design
      4. 4.3.4. Data-Structure-Oriented Design
    4. 4.4. ANALYSIS
      1. 4.4.1. Future Trends
    5. 4.5. SYSTEM-LEVEL DESIGN APPROACHES
      1. 4.5.1. Hardware/Software Codesign
      2. 4.5.2. Specification and Modeling
      3. 4.5.3. Design and Refinement
      4. 4.5.4. Validation
      5. 4.5.5. Specification and Modeling
      6. 4.5.6. Models of Computation
        1. 4.5.6.1. Finite State Machines (FSM).
        2. 4.5.6.2. Discrete-Event Systems.
        3. 4.5.6.3. Petri Nets.
        4. 4.5.6.4. Data Flow Graphs.
        5. 4.5.6.5. Synchronous/Reactive Models.
        6. 4.5.6.6. Heterogeneous Models.
      7. 4.5.7. Comparison of Models of Computation
    6. 4.6. PLATFORM-BASED DESIGN
      1. 4.6.1. Platform-based Design Advantages
      2. 4.6.2. Platform-based Design Principles
    7. 4.7. COMPONENT-BASED DESIGN
    8. 4.8. CONCLUSIONS
    9. 4.9. REFERENCES
  8. 5. DESIGN FOR SIX SIGMA (DFSS) SOFTWARE MEASUREMENT AND METRICS
    1. 5.1. INTRODUCTION
    2. 5.2. SOFTWARE MEASUREMENT PROCESS
    3. 5.3. SOFTWARE PRODUCT METRICS
      1. 5.3.1. McCabe's Cyclomatic Number
      2. 5.3.2. Henry–Kafura (1981) Information Flow
      3. 5.3.3. Halstead's (1977) Software Science
    4. 5.4. GQM (GOAL–QUESTION–METRIC) APPROACH
    5. 5.5. SOFTWARE QUALITY METRICS
    6. 5.6. SOFTWARE DEVELOPMENT PROCESS METRICS
    7. 5.7. SOFTWARE RESOURCE METRICS
    8. 5.8. SOFTWARE METRIC PLAN
    9. 5.9. REFERENCES
  9. 6. STATISTICAL TECHNIQUES IN SOFTWARE SIX SIGMA AND DESIGN FOR SIX SIGMA (DFSS)
    1. 6.1. INTRODUCTION
    2. 6.2. COMMON PROBABILITY DISTRIBUTIONS
    3. 6.3. SOFTWARE STATISTICAL METHODS
      1. 6.3.1. Descriptive Statistics
        1. 6.3.1.1. Measures of Central Tendency.
        2. 6.3.1.2. Measures of Dispersion.
    4. 6.4. INFERENTIAL STATISTICS
      1. 6.4.1. Parameter Estimation
        1. 6.4.1.1. Hypothesis Testing.
      2. 6.4.2. Experimental Design
    5. 6.5. A NOTE ON NORMAL DISTRIBUTION AND NORMALITY ASSUMPTION
      1. 6.5.1. Violating the Normality Assumption
    6. 6.6. SUMMARY
    7. 6.7. REFERENCES
  10. 7. SIX SIGMA FUNDAMENTALS
    1. 7.1. INTRODUCTION
    2. 7.2. WHY SIX SIGMA?
    3. 7.3. WHAT IS SIX SIGMA?
    4. 7.4. INTRODUCTION TO SIX SIGMA PROCESS MODELING
      1. 7.4.1. Process Mapping
      2. 7.4.2. Value Stream Mapping
    5. 7.5. INTRODUCTION TO BUSINESS PROCESS MANAGEMENT
    6. 7.6. SIX SIGMA MEASUREMENT SYSTEMS ANALYSIS
    7. 7.7. PROCESS CAPABILITY AND SIX SIGMA PROCESS PERFORMANCE
      1. 7.7.1. Motorola's Six Sigma Quality
    8. 7.8. OVERVIEW OF SIX SIGMA IMPROVEMENT (DMAIC)
      1. 7.8.1. Phase 1: Define
      2. 7.8.2. Phase 2: Measure
      3. 7.8.3. Phase 3: Analyze
      4. 7.8.4. Phase 4: Improve
      5. 7.8.5. Phase 5: Control
    9. 7.9. DMAIC SIX SIGMA TOOLS
    10. 7.10. SOFTWARE SIX SIGMA
      1. 7.10.1. Six Sigma Usage in Software Industry
    11. 7.11. SIX SIGMA GOES UPSTREAM—DESIGN FOR SIX SIGMA
    12. 7.12. SUMMARY
    13. 7.13. REFERENCES
  11. 8. INTRODUCTION TO SOFTWARE DESIGN FOR SIX SIGMA (DFSS)
    1. 8.1. INTRODUCTION
    2. 8.2. WHY SOFTWARE DESIGN FOR SIX SIGMA?
    3. 8.3. WHAT IS SOFTWARE DESIGN FOR SIX SIGMA?
    4. 8.4. SOFTWARE DFSS: THE ICOV PROCESS
    5. 8.5. SOFTWARE DFSS: THE ICOV PROCESS IN SOFTWARE DEVELOPMENT
    6. 8.6. DFSS VERSUS DMAIC
    7. 8.7. A REVIEW OF SAMPLE DFSS TOOLS BY ICOV PHASE
      1. 8.7.1. Sample Identify Phase DFSS Tools
      2. 8.7.2. Sample Conceptualize Phase DFSS Tools
      3. 8.7.3. Sample Optimize Phase DFSS Tools
      4. 8.7.4. Sample Verify and Validate Phase DFSS Tools
    8. 8.8. OTHER DFSS APPROACHES
    9. 8.9. SUMMARY
    10. 8.10. APPENDIX 8.A (Shenvi, 2008)
      1. 8.10.1. Design of DivX DVD Player Using DIDOVM Process
    11. 8.11. DIDOVM PHASE: DEFINE
    12. 8.12. DIDOVM PHASE: IDENTIFY
    13. 8.13. DIDOVM PHASE: DESIGN
    14. 8.14. DIDOVM PHASE: OPTIMIZE
    15. 8.15. DIDOVM PHASE: VERIFY
    16. 8.16. DIDOVM PHASE: MONITOR
    17. 8.17. REFERENCES
  12. 9. SOFTWARE DESIGN FOR SIX SIGMA (DFSS): A PRACTICAL GUIDE FOR SUCCESSFUL DEPLOYMENT
    1. 9.1. INTRODUCTION
    2. 9.2. SOFTWARE SIX SIGMA DEPLOYMENT
    3. 9.3. SOFTWARE DFSS DEPLOYMENT PHASES
      1. 9.3.1. Predeployment
      2. 9.3.2. Predeployment Considerations
        1. 9.3.2.1. Deployment Structure Established (Yang and El-Haik, 2008).
        2. 9.3.2.2. Other Deployment Operatives.
          1. 9.3.2.2.1. Deployment Champions.
          2. 9.3.2.2.2. Project Champions.
          3. 9.3.2.2.3. Design Owner.
          4. 9.3.2.2.4. Master Black Belt (MBB).
          5. 9.3.2.2.5. Black Belt (BB).
          6. 9.3.2.2.6. Green Belt.
        3. 9.3.2.3. Communication Plan.
        4. 9.3.2.4. Software DFSS Project Sources.
        5. 9.3.2.5. Proactive DFSS Project Sources: MultiGeneration Planning.
        6. 9.3.2.6. Training.
        7. 9.3.2.7. Existence of a Software Program Development Management System.
      3. 9.3.3. Deployment
        1. 9.3.3.1. Training.
          1. 9.3.3.1.1. Senior Leadership.
          2. 9.3.3.1.2. Deployment Champions.
          3. 9.3.3.1.3. Master Black Belts.
          4. 9.3.3.1.4. Black Belts.
          5. 9.3.3.1.5. Green Belts.
        2. 9.3.3.2. Six Sigma Project Financial.
      4. 9.3.4. Postdeployment Phase
        1. 9.3.4.1. DFSS Sustainability Factors.
    4. 9.4. BLACK BELT AND DFSS TEAM: CULTURAL CHANGE
    5. 9.5. REFERENCES
  13. 10. DESIGN FOR SIX SIGMA (DFSS) TEAM AND TEAM SOFTWARE PROCESS (TSP)
    1. 10.1. INTRODUCTION
    2. 10.2. THE PERSONAL SOFTWARE PROCESS (PSP)
    3. 10.3. THE TEAM SOFTWARE PROCESS (TSP)
      1. 10.3.1. Evolving the Process
    4. 10.4. PSP AND TSP DEPLOYMENT EXAMPLE
      1. 10.4.1. Simple and Small-Size Project
        1. 10.4.1.1. Deployment Example: Start–Stop Module for a Hybrid Engine Controls Subsystem.
      2. 10.4.2. Moderate and Medium-Size Project
        1. 10.4.2.1. Deployment Example: Electrical Power Steering Subsystem (Chhaya, 2008).
      3. 10.4.3. Complex and Large Project
        1. 10.4.3.1. Deployment Example: Alternative Energy Controls and Torque Arbitration Controls.
    5. 10.5. THE RELATION OF SIX SIGMA TO CMMI/PSP/TSP FOR SOFTWARE
    6. 10.6. APPENDIX 10.A
      1. 10.6.1. Software Support
    7. 10.7. APPENDIX 10.A1
      1. 10.7.1. PSP1 Plan Summary
    8. 10.8. APPENDIX 10.A2
      1. 10.8.1. PROBE Estimating Script
    9. 10.9. APPENDIX 10.A3
      1. 10.9.1. PSP Defect Recording
    10. 10.10. APPENDIX 10.A4
      1. 10.10.1. PSP2
    11. 10.11. REFERENCES
  14. 11. SOFTWARE DESIGN FOR SIX SIGMA (DFSS) PROJECT ROAD MAP
    1. 11.1. INTRODUCTION
    2. 11.2. SOFTWARE DESIGN FOR SIX SIGMA TEAM
    3. 11.3. SOFTWARE DESIGN FOR SIX SIGMA ROAD MAP
      1. 11.3.1. Software DFSS Phase I: Identify Requirements
        1. 11.3.1.1. Identify Phase Road Map.
        2. 11.3.1.2. Software Company Growth and Innovation Strategy: Multigeneration Planning (MGP).
        3. 11.3.1.3. Research Customer Activities.
      2. 11.3.2. Software DFSS Phase 2: Conceptualize Design
    4. 11.4. SUMMARY
  15. 12. SOFTWARE QUALITY FUNCTION DEPLOYMENT
    1. 12.1. INTRODUCTION
    2. 12.2. HISTORY OF QFD
    3. 12.3. QFD OVERVIEW
    4. 12.4. QFD METHODOLOGY
    5. 12.5. HOQ EVALUATION
    6. 12.6. HOQ 1: THE CUSTOMER'S HOUSE
    7. 12.7. KANO MODEL
    8. 12.8. QFD HOQ 2: TRANSLATION HOUSE
    9. 12.9. QFD HOQ3—DESIGN HOUSE
    10. 12.10. QFD HOQ4—PROCESS HOUSE
    11. 12.11. SUMMARY
    12. 12.12. REFERENCES
  16. 13. AXIOMATIC DESIGN IN SOFTWARE DESIGN FOR SIX SIGMA (DFSS)
    1. 13.1. INTRODUCTION
    2. 13.2. AXIOMATIC DESIGN IN PRODUCT DFSS: AN INTRODUCTION
    3. 13.3. AXIOM 1 IN SOFTWARE DFSS
      1. 13.3.1. Example: Simple Drawing Program
    4. 13.4. COUPLING MEASURES
    5. 13.5. AXIOM 2 IN SOFTWARE DFSS
      1. 13.5.1. Axiom 2: The Information Axiom
        1. 13.5.1.1. Minimize the Information Content in a Design.
      2. 13.5.2. Axiom 2 in Hardware DFSS: Measures of Complexity
    6. 13.6. REFERENCES
    7. 13.7. BIBLIOGRAPHY
  17. 14. SOFTWARE DESIGN FOR X
    1. 14.1. INTRODUCTION
    2. 14.2. SOFTWARE RELIABILITY AND DESIGN FOR RELIABILITY
      1. 14.2.1. Basic Software Reliability Concepts
      2. 14.2.2. Software Reliability Modeling Techniques
      3. 14.2.3. Software Reliability Measurement and Metrics
      4. 14.2.4. DFR in Software DFSS
        1. 14.2.4.1. DFSS Identify Phase DFR Practices.
        2. 14.2.4.2. DFSS Conceptualize Phase DFR Practices.
        3. 14.2.4.3. DFSS Optimize Phase DFR Practices.
        4. 14.2.4.4. DFSS Verify and Validate Phase DFR Practices.
    3. 14.3. SOFTWARE AVAILABILITY
    4. 14.4. SOFTWARE DESIGN FOR TESTABILITY
    5. 14.5. DESIGN FOR REUSABILITY
    6. 14.6. DESIGN FOR MAINTAINABILITY
    7. 14.7. APPENDIX 14.A
    8. 14.8. REFERENCES
    9. 14.9. APPENDIX REFERENCE
    10. 14.10. BIBLIOGRAPHY
  18. 15. SOFTWARE DESIGN FOR SIX SIGMA (DFSS) RISK MANAGEMENT PROCESS
    1. 15.1. INTRODUCTION
    2. 15.2. PLANNING FOR RISK MANAGEMENT ACTIVITIES IN DESIGN AND DEVELOPMENT
    3. 15.3. SOFTWARE RISK ASSESSMENT TECHNIQUES
      1. 15.3.1. Preliminary Hazard Analysis (PHA)
      2. 15.3.2. Hazard and Operability Study (HAZOP)
      3. 15.3.3. Software Failure Mode and Effects Analysis (SFMEA)
      4. 15.3.4. Fault Tree Analysis (FTA)
    4. 15.4. RISK EVALUATION
    5. 15.5. RISK CONTROL
    6. 15.6. POSTRELEASE CONTROL
    7. 15.7. SOFTWARE RISK MANAGEMENT ROLES AND RESPONSIBILITIES
    8. 15.8. CONCLUSION
    9. 15.9. APPENDIX 15.A
      1. 15.9.1. Risk Management Terminology
    10. 15.10. REFERENCES
  19. 16. SOFTWARE FAILURE MODE AND EFFECT ANALYSIS (SFMEA)
    1. 16.1. INTRODUCTION
    2. 16.2. FMEA: A HISTORICAL SKETCH
    3. 16.3. SFMEA FUNDAMENTALS
      1. 16.3.1. SFMEA Hierarchy
      2. 16.3.2. SFMEA Input
      3. 16.3.3. SFMEA Steps
    4. 16.4. SOFTWARE QUALITY CONTROL AND QUALITY ASSURANCE
      1. 16.4.1. Software Quality Control Methods
    5. 16.5. SUMMARY
    6. 16.6. REFERENCES
  20. 17. SOFTWARE OPTIMIZATION TECHNIQUES
    1. 17.1. INTRODUCTION
    2. 17.2. OPTIMIZATION METRICS
      1. 17.2.1. Lines of Source Code
      2. 17.2.2. Function Point Metrics
      3. 17.2.3. Conditional Complexity
      4. 17.2.4. Halstead's Metric
      5. 17.2.5. Cohesion
      6. 17.2.6. Coupling
    3. 17.3. COMPARING SOFTWARE OPTIMIZATION METRICS
      1. 17.3.1. Response Time Techniques
      2. 17.3.2. Interrupt Latency
      3. 17.3.3. Time Loading
      4. 17.3.4. Memory Requirements
      5. 17.3.5. Queuing Theory
    4. 17.4. PERFORMANCE ANALYSIS
    5. 17.5. SYNCHRONIZATION AND DEADLOCK HANDLING
    6. 17.6. PERFORMANCE OPTIMIZATION
      1. 17.6.1. Look-Up Tables
      2. 17.6.2. Scaled Numbers
    7. 17.7. COMPILER OPTIMIZATION TOOLS
      1. 17.7.1. Reduction in Strength
      2. 17.7.2. Common Subexpression Elimination
      3. 17.7.3. Constant Folding
      4. 17.7.4. Loop Invariant Optimization
      5. 17.7.5. Loop Induction Elimination
      6. 17.7.6. Removal of Dead Code
      7. 17.7.7. Flow of Control Optimization
      8. 17.7.8. Loop Unrolling
      9. 17.7.9. Loop Jamming
      10. 17.7.10. Other Techniques
    8. 17.8. CONCLUSION
    9. 17.9. REFERENCES
  21. 18. ROBUST DESIGN FOR SOFTWARE DEVELOPMENT
    1. 18.1. INTRODUCTION
    2. 18.2. ROBUST DESIGN OVERVIEW
      1. 18.2.1. The Relationship of Robust Design to DFSS
    3. 18.3. ROBUST DESIGN CONCEPT #1: OUTPUT CLASSIFICATION
    4. 18.4. ROBUST DESIGN CONCEPT #2: QUALITY LOSS FUNCTION
      1. 18.4.1. Larger-the-Better Loss Function
      2. 18.4.2. Smaller-the-Better Loss Function
    5. 18.5. ROBUST DESIGN CONCEPT #3: SIGNAL, NOISE, AND CONTROL FACTORS
      1. 18.5.1. Usage Profile: The Major Source of Noise
      2. 18.5.2. Software Environment: A Major Source of Noise
    6. 18.6. ROBUSTNESS CONCEPT #4: SIGNAL-TO-NOISE RATIOS
    7. 18.7. ROBUSTNESS CONCEPT #5: ORTHOGONAL ARRAYS
    8. 18.8. ROBUSTNESS CONCEPT #6: PARAMETER DESIGN ANALYSIS
    9. 18.9. ROBUST DESIGN CASE STUDY NO. 1: STREAMLINING OF DEBUGGING SOFTWARE USING AN ORTHOGONAL ARRAY
    10. 18.10. SUMMARY
    11. 18.11. APPENDIX 18.A
      1. 18.11.1. Analysis of Variance (ANOVA)
    12. 18.12. ANOVA STEPS FOR TWO FACTORS COMPLETELY RANDOMIZED EXPERIMENT
    13. 18.13. REFERENCES
  22. 19. SOFTWARE DESIGN VERIFICATION AND VALIDATION
    1. 19.1. INTRODUCTION
    2. 19.2. THE STATE OF V&V TOOLS FOR SOFTWARE DFSS PROCESS
    3. 19.3. INTEGRATING DESIGN PROCESS WITH VALIDATION/VERIFICATION PROCESS
    4. 19.4. VALIDATION AND VERIFICATION METHODS
      1. 19.4.1. The Successive Processes of Model in the Loop (MIL), Software in the Loop (SIL), Processor in the Loop (PIL), and Hardware in the Loop (HIL) Testing Approaches
        1. 19.4.1.1. MIL.
        2. 19.4.1.2. SIL.
        3. 19.4.1.3. PIL.
        4. 19.4.1.4. HIL.
      2. 19.4.2. Design Verification Process Validation (DVPV) Testing
      3. 19.4.3. Process Knowledge Verification Method Based on Petri Net
        1. 19.4.3.1. Process Model Verification Based on Petri Net.
          1. 19.4.3.1.1. Process Knowledge Verification Technologies Based on Petri Net.
          2. 19.4.3.1.2. Deadlock Issue.
          3. 19.4.3.1.3. Verification Using Petri Net Tool: Petri Net Analyzer Version 1.0.
          4. 19.4.3.1.4. Evaluating Verification Approach Using Petri Net.
      4. 19.4.4. A Hybrid Verification Approach
        1. 19.4.4.1. Hybrid Method Workflow.
        2. 19.4.4.2. A Hybrid Verification Tool.
        3. 19.4.4.3. Evaluating MIST: A Hybrid Verification Approach.
    5. 19.5. BASIC FUNCTIONAL VERIFICATION STRATEGY
      1. 19.5.1. Coverage Analysis
      2. 19.5.2. Evaluating Functional Approach
      3. 19.5.3. Verification Methods Summary Table
    6. 19.6. COMPARISON OF COMMERCIALLY AVAILABLE VERIFICATION AND VALIDATION TOOLS
      1. 19.6.1. MathWorks HDL Coder
      2. 19.6.2. dSPACE TargetLink Production Code Generator
      3. 19.6.3. MxVDEV—Unit/System Test Tool Solution
      4. 19.6.4. Hewlett-Packard Quality Center Solution
    7. 19.7. SOFTWARE TESTING STRATEGIES
      1. 19.7.1. Test Data-Generation Testing
      2. 19.7.2. Traditional Manual V&V Testing
      3. 19.7.3. Proof of Correctness Testing Strategy
      4. 19.7.4. Simulation Testing Strategy
      5. 19.7.5. Software Testing Summary Table
    8. 19.8. SOFTWARE DESIGN STANDARDS
    9. 19.9. CONCLUSION
    10. 19.10. REFERENCES