
PThreads Programming

Book Description

POSIX threads, or pthreads, allow multiple tasks to run concurrently within the same program. This book discusses when to use threads and how to make them efficient. It features realistic examples, a look behind the scenes at the implementation and performance issues, and special topics such as DCE and real-time extensions.

Table of Contents

  1. Pthreads Programming
  2. A Note Regarding Supplemental Files
  3. Examples
  4. Preface
    1. Organization
    2. Example Programs
      1. FTP
    3. Typographical Conventions
    4. Acknowledgments
  5. 1. Why Threads?
    1. What Are Pthreads?
    2. Potential Parallelism
    3. Specifying Potential Parallelism in a Concurrent Programming Environment
      1. UNIX Concurrent Programming: Multiple Processes
        1. Creating a new process: fork
      2. Pthreads Concurrent Programming: Multiple Threads
        1. Creating a new thread: pthread_create
        2. Threads are peers
    4. Parallel vs. Concurrent Programming
    5. Synchronization
      1. Sharing Process Resources
      2. Communication
      3. Scheduling
    6. Who Am I? Who Are You?
    7. Terminating Thread Execution
      1. Exit Status and Return Values
      2. Pthreads Library Calls and Errors
    8. Why Use Threads Over Processes?
    9. A Structured Programming Environment
    10. Choosing Which Applications to Thread
  6. 2. Designing Threaded Programs
    1. Suitable Tasks for Threading
    2. Models
      1. Boss/Worker Model
      2. Peer Model
      3. Pipeline Model
    3. Buffering Data Between Threads
    4. Some Common Problems
    5. Performance
    6. Example: An ATM Server
      1. The Serial ATM Server
        1. Handling asynchronous events: blocking with select
        2. Handling file I/O: blocking with read/write
      2. The Multithreaded ATM Server
        1. Model: boss/worker model
        2. The boss thread
        3. Dynamically detaching a thread
        4. A worker thread
        5. Synchronization: what’s needed
        6. Future enhancements
    7. Example: A Matrix Multiplication Program
      1. The Serial Matrix-Multiply Program
      2. The Multithreaded Matrix-Multiply Program
        1. Passing data to a new thread
        2. Synchronization in the matrix-multiply program
  7. 3. Synchronizing Pthreads
    1. Selecting the Right Synchronization Tool
    2. Mutex Variables
      1. Using Mutexes
      2. Error Detection and Return Values
      3. Using pthread_mutex_trylock
      4. When Other Tools Are Better
      5. Some Shortcomings of Mutexes
      6. Contention for a Mutex
      7. Example: Using Mutexes in a Linked List
      8. Complex Data Structures and Lock Granularity
      9. Requirements and Goals for Synchronization
      10. Access Patterns and Granularity
      11. Locking Hierarchies
      12. Sharing a Mutex Among Processes
    3. Condition Variables
      1. Using a Mutex with a Condition Variable
      2. When Many Threads Are Waiting
      3. Checking the Condition on Wake Up: Spurious Wake Ups
      4. Condition Variable Attributes
      5. Condition Variables and UNIX Signals
      6. Condition Variables and Cancellation
    4. Reader/Writer Locks
    5. Synchronization in the ATM Server
      1. Synchronizing Access to Account Data
      2. Limiting the Number of Worker Threads
      3. Synchronizing a Server Shutdown
    6. Thread Pools
      1. An ATM Server Example That Uses a Thread Pool
        1. Initializing a thread pool
        2. Checking for work
        3. Adding work
        4. Deleting a thread pool
        5. Adapting the atm_server_init and main routines
  8. 4. Managing Pthreads
    1. Setting Thread Attributes
      1. Setting a Thread’s Stack Size
      2. Specifying the Location of a Thread’s Stack
      3. Setting a Thread’s Detached State
      4. Setting Multiple Attributes
      5. Destroying a Thread Attribute Object
    2. The pthread_once Mechanism
      1. Example: The ATM Server’s Communication Module
        1. Using a statically initialized mutex
        2. Using the pthread_once mechanism
    3. Keys: Using Thread-Specific Data
      1. Initializing a Key: pthread_key_create
      2. Associating Data with a Key
      3. Retrieving Data from a Key
      4. Destructors
    4. Cancellation
      1. The Complication with Cancellation
      2. Cancelability Types and States
      3. Cancellation Points: More on Deferred Cancellation
      4. A Simple Cancellation Example
        1. The bullet_proof thread: no effect
        2. The ask_for_it thread: deferred cancellation
        3. The sitting_duck thread: asynchronous cancellation
      5. Cleanup Stacks
      6. Cancellation in the ATM Server
        1. Aborting a deposit
    5. Scheduling Pthreads
      1. Scheduling Priority and Policy
      2. Scheduling Scope and Allocation Domains
      3. Runnable and Blocked Threads
      4. Scheduling Priority
      5. Scheduling Policy
      6. Using Priorities and Policies
      7. Setting Scheduling Policy and Priority
      8. Inheritance
      9. Scheduling in the ATM Server
    6. Mutex Scheduling Attributes
      1. Priority Ceiling
      2. Priority Inheritance
      3. The ATM Example and Priority Inversion
  9. 5. Pthreads and UNIX
    1. Threads and Signals
      1. Traditional Signal Processing
        1. Sending signals and waiting for signals
        2. Using a signal mask to block signals
      2. Signal Processing in a Multithreaded World
        1. Synchronously generated signals
        2. Asynchronously generated signals
        3. Per-thread signal masks
        4. Per-process signal actions
        5. Putting it all together
      3. Threads in Signal Handlers
      4. A Simple Example
      5. Some Signal Issues
      6. Handling Signals in the ATM Example
    2. Threadsafe Library Functions and System Calls
      1. Threadsafe and Reentrant Functions
      2. Example of Thread-Unsafe and Threadsafe Versions of the Same Function
      3. Functions That Return Pointers to Static Data
      4. Library Use of errno
      5. The Pthreads Standard Specifies Which Functions Must Be Threadsafe
        1. Alternative interfaces for functions that return static data
        2. Additional routines for performance considerations
        3. File-locking functions for threads
        4. Where are the threadsafe functions?
      6. Using Thread-Unsafe Functions in a Multithreaded Program
    3. Cancellation-Safe Library Functions and System Calls
      1. Asynchronous Cancellation-Safe Functions
      2. Cancellation Points in System and Library Calls
    4. Thread-Blocking Library Functions and System Calls
    5. Threads and Process Management
      1. Calling fork from a Thread
        1. Fork-handling stacks
      2. Calling exec from a Thread
      3. Process Exit and Threads
    6. Multiprocessor Memory Synchronization
  10. 6. Practical Considerations
    1. Understanding Pthreads Implementation
      1. Two Worlds
      2. Two Kinds of Threads
      3. Who’s Providing the Thread?
        1. User-space Pthreads implementations
        2. Kernel thread–based Pthreads implementations
        3. Two-level scheduler Pthreads implementations: the best of both worlds
    2. Debugging
      1. Deadlock
      2. Race Conditions
      3. Event Ordering
      4. Less Is Better
      5. Trace Statements
      6. Debugger Support for Threads
      7. Example: Debugging the ATM Server
        1. Debugging a deadlock caused by a missing unlock
        2. Debugging a race condition caused by a missing lock
    3. Performance
      1. The Costs of Sharing Too Much—Locking
      2. Thread Overhead
        1. Thread context switches
      3. Synchronization Overhead
      4. How Do Your Threads Spend Their Time?
      5. Performance in the ATM Server Example
        1. Performance depends on input workload: increasing clients and contention
        2. Performance depends on a good locking strategy
        3. Performance depends on the type of work threads do
        4. Key performance issues between using threads and using processes
    4. Conclusion
  11. A. Pthreads and DCE
    1. The Structure of a DCE Server
    2. What Does the DCE Programmer Have to Do?
    3. Example: The ATM as a DCE Server
  12. B. Pthreads Draft 4 vs. the Final Standard
    1. Detaching a Thread
    2. Mutex Variables
    3. Condition Variables
    4. Thread Attributes
    5. The pthread_once Function
    6. Keys
    7. Cancellation
    8. Scheduling
    9. Signals
    10. Threadsafe System Interfaces
    11. Error Reporting
    12. System Interfaces and Cancellation-Safety
    13. Process-Blocking Calls
    14. Process Management
  13. C. Pthreads Quick Reference
  14. D. About the Authors
  15. Index
  16. About the Authors
  17. Colophon
  18. Copyright