How to Think About Algorithms

Book Description

This textbook, for second- or third-year students of computer science, presents insights, notations, and analogies to help them describe and think about algorithms like experts, without grinding through lots of formal proof. Solutions to many problems are provided to let students check their progress, while class-tested PowerPoint slides are available on the web for anyone teaching the course. By looking at both the big picture and easy step-by-step methods for developing algorithms, the author guides students around the common pitfalls. He stresses paradigms such as loop invariants and recursion to unify a huge range of algorithms into a few meta-algorithms. The book fosters a deeper understanding of how and why each algorithm works. These insights are presented in a careful and clear way, helping students to think abstractly and preparing them to create their own innovative solutions to problems.
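
To give a flavor of the loop-invariant paradigm the book stresses, here is a minimal sketch (in Python; an illustration, not code from the book) of binary search, the topic of Chapter 4, annotated with the loop invariant and the measure of progress that Chapter 1 asks you to state for every loop:

    def binary_search(a, key):
        """Return an index i with a[i] == key, or -1 if key is not in the sorted list a."""
        lo, hi = 0, len(a) - 1
        # Loop invariant: if key appears anywhere in a, then it appears in a[lo..hi].
        while lo <= hi:
            mid = (lo + hi) // 2
            if a[mid] == key:
                return mid
            elif a[mid] < key:
                lo = mid + 1   # key, if present, lies to the right of mid
            else:
                hi = mid - 1   # key, if present, lies to the left of mid
            # Invariant re-established; the measure of progress hi - lo has strictly
            # decreased, so the loop makes progress and terminates.
        return -1              # invariant + empty range: key is not in a

    binary_search([2, 3, 5, 7, 11], 7)   # returns 3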

Table of Contents

  1. Cover
  2. Title Page
  3. Copyright Page
  4. Contents
  5. Preface
  6. Introduction
  7. Part One: Iterative Algorithms and Loop Invariants
    1. 1 Iterative Algorithms: Measures of Progress and Loop Invariants
      1. 1.1 A Paradigm Shift: A Sequence of Actions vs. a Sequence of Assertions
      2. 1.2 The Steps to Develop an Iterative Algorithm
      3. 1.3 More about the Steps
      4. 1.4 Different Types of Iterative Algorithms
      5. 1.5 Typical Errors
      6. 1.6 Exercises
    2. 2 Examples Using More-of-the-Input Loop Invariants
      1. 2.1 Coloring the Plane
      2. 2.2 Deterministic Finite Automaton
      3. 2.3 More of the Input vs. More of the Output
    3. 3 Abstract Data Types
      1. 3.1 Specifications and Hints at Implementations
      2. 3.2 Linked List Implementation
      3. 3.3 Merging with a Queue
      4. 3.4 Parsing with a Stack
    4. 4 Narrowing the Search Space: Binary Search
      1. 4.1 Binary Search Trees
      2. 4.2 Magic Sevens
      3. 4.3 VLSI Chip Testing
      4. 4.4 Exercises
    5. 5 Iterative Sorting Algorithms
      1. 5.1 Bucket Sort by Hand
      2. 5.2 Counting Sort (a Stable Sort)
      3. 5.3 Radix Sort
      4. 5.4 Radix Counting Sort
    6. 6 Euclid’s GCD Algorithm
    7. 7 The Loop Invariant for Lower Bounds
  8. Part Two: Recursion
    1. 8 Abstractions, Techniques, and Theory
      1. 8.1 Thinking about Recursion
      2. 8.2 Looking Forward vs. Backward
      3. 8.3 With a Little Help from Your Friends
      4. 8.4 The Towers of Hanoi
      5. 8.5 Checklist for Recursive Algorithms
      6. 8.6 The Stack Frame
      7. 8.7 Proving Correctness with Strong Induction
    2. 9 Some Simple Examples of Recursive Algorithms
      1. 9.1 Sorting and Selecting Algorithms
      2. 9.2 Operations on Integers
      3. 9.3 Ackermann’s Function
      4. 9.4 Exercises
    3. 10 Recursion on Trees
      1. 10.1 Tree Traversals
      2. 10.2 Simple Examples
      3. 10.3 Generalizing the Problem Solved
      4. 10.4 Heap Sort and Priority Queues
      5. 10.5 Representing Expressions with Trees
    4. 11 Recursive Images
      1. 11.1 Drawing a Recursive Image from a Fixed Recursive and a Base Case Image
      2. 11.2 Randomly Generating a Maze
    5. 12 Parsing with Context-Free Grammars
  9. Part Three: Optimization Problems
    1. 13 Definition of Optimization Problems
    2. 14 Graph Search Algorithms
      1. 14.1 A Generic Search Algorithm
      2. 14.2 Breadth-First Search for Shortest Paths
      3. 14.3 Dijkstra’s Shortest-Weighted-Path Algorithm
      4. 14.4 Depth-First Search
      5. 14.5 Recursive Depth-First Search
      6. 14.6 Linear Ordering of a Partial Order
      7. 14.7 Exercise
    3. 15 Network Flows and Linear Programming
      1. 15.1 A Hill-Climbing Algorithm with a Small Local Maximum
      2. 15.2 The Primal–Dual Hill-Climbing Method
      3. 15.3 The Steepest-Ascent Hill-Climbing Algorithm
      4. 15.4 Linear Programming
      5. 15.5 Exercises
    4. 16 Greedy Algorithms
      1. 16.1 Abstractions, Techniques, and Theory
      2. 16.2 Examples of Greedy Algorithms
        1. 16.2.1 Example: The Job/Event Scheduling Problem
        2. 16.2.2 Example: The Interval Cover Problem
        3. 16.2.3 Example: The Minimum-Spanning-Tree Problem
      3. 16.3 Exercises
    5. 17 Recursive Backtracking
      1. 17.1 Recursive Backtracking Algorithms
      2. 17.2 The Steps in Developing a Recursive Backtracking
      3. 17.3 Pruning Branches
      4. 17.4 Satisfiability
      5. 17.5 Exercises
    6. 18 Dynamic Programming Algorithms
      1. 18.1 Start by Developing a Recursive Backtracking
      2. 18.2 The Steps in Developing a Dynamic Programming Algorithm
      3. 18.3 Subtle Points
        1. 18.3.1 The Question for the Little Bird
        2. 18.3.2 Subinstances and Subsolutions
        3. 18.3.3 The Set of Subinstances
        4. 18.3.4 Decreasing Time and Space
        5. 18.3.5 Counting the Number of Solutions
        6. 18.3.6 The New Code
    7. 19 Examples of Dynamic Programs
      1. 19.1 The Longest-Common-Subsequence Problem
      2. 19.2 Dynamic Programs as More-of-the-Input Iterative Loop Invariant Algorithms
      3. 19.3 A Greedy Dynamic Program: The Weighted Job/Event Scheduling Problem
      4. 19.4 The Solution Viewed as a Tree: Chains of Matrix Multiplications
      5. 19.5 Generalizing the Problem Solved: Best AVL Tree
      6. 19.6 All Pairs Using Matrix Multiplication
      7. 19.7 Parsing with Context-Free Grammars
      8. 19.8 Designing Dynamic Programming Algorithms via Reductions
    8. 20 Reductions and NP-Completeness
      1. 20.1 Satisfiability Is at Least as Hard as Any Optimization Problem
      2. 20.2 Steps to Prove NP-Completeness
      3. 20.3 Example: 3-Coloring Is NP-Complete
      4. 20.4 An Algorithm for Bipartite Matching Using the Network Flow Algorithm
    9. 21 Randomized Algorithms
      1. 21.1 Using Randomness to Hide the Worst Cases
      2. 21.2 Solutions of Optimization Problems with a Random Structure
  10. Part Four: Appendix
    1. 22 Existential and Universal Quantifiers
    2. 23 Time Complexity
      1. 23.1 The Time (and Space) Complexity of an Algorithm
      2. 23.2 The Time Complexity of a Computational Problem
    3. 24 Logarithms and Exponentials
    4. 25 Asymptotic Growth
      1. 25.1 Steps to Classify a Function
      2. 25.2 More about Asymptotic Notation
    5. 26 Adding-Made-Easy Approximations
      1. 26.1 The Technique
      2. 26.2 Some Proofs for the Adding-Made-Easy Technique
    6. 27 Recurrence Relations
      1. 27.1 The Technique
      2. 27.2 Some Proofs
    7. 28 A Formal Proof of Correctness
  11. Part Five: Exercise Solutions
  12. Conclusion
  13. Index