Threads

References:

  1. Abraham Silberschatz, Peter Baer Galvin, and Greg Gagne, "Operating System Concepts, Ninth Edition", Chapter 4

4.1 Overview


Figure 4.1 - Single-threaded and multithreaded processes

4.1.1 Motivation


Figure 4.2 - Multithreaded server architecture

4.1.2 Benefits

4.2 Multicore Programming


Figure 4.3 - Concurrent execution on a single-core system


Figure 4.4 - Parallel execution on a multicore system

4.2.1 Programming Challenges ( New section, same content ? )

4.2.2 Types of Parallelism ( new )

In theory there are two different ways to parallelize the workload:

  1. Data parallelism divides the data up among multiple cores ( threads ), and performs the same task on each subset of the data. For example, a large image can be divided into pieces, with the same digital image processing performed on each piece on a different core.
  2. Task parallelism divides the different tasks to be performed among the cores, so that each core performs a different operation simultaneously.

In practice few programs are parallelized purely one way or the other; most use some hybrid combination of the two. A small data-parallel sketch is given below.
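
As a rough illustration ( not from the text ), the sketch below uses Pthreads to sum a large array in a data-parallel fashion: every thread runs the same code, each on its own slice of the data. The names N, NTHREADS, and sum_slice are made up for the example, and it assumes a compile line such as gcc -pthread sum.c.

      /* Data parallelism with Pthreads: each thread sums one slice of data[]. */
      #include <pthread.h>
      #include <stdio.h>

      #define N        1000000
      #define NTHREADS 4

      static double data[N];
      static double partial[NTHREADS];   /* one result slot per thread */

      static void *sum_slice(void *arg) {
          long id    = (long) arg;
          long chunk = N / NTHREADS;
          long lo    = id * chunk;
          long hi    = (id == NTHREADS - 1) ? N : lo + chunk;  /* last thread takes the remainder */

          double s = 0.0;
          for (long i = lo; i < hi; i++)
              s += data[i];
          partial[id] = s;               /* each thread writes only its own slot */
          return NULL;
      }

      int main(void) {
          pthread_t tid[NTHREADS];

          for (long i = 0; i < N; i++)
              data[i] = 1.0;             /* fill the array so the answer is easy to check */

          for (long t = 0; t < NTHREADS; t++)
              pthread_create(&tid[t], NULL, sum_slice, (void *) t);

          double total = 0.0;
          for (long t = 0; t < NTHREADS; t++) {
              pthread_join(tid[t], NULL);
              total += partial[t];
          }
          printf("total = %.0f\n", total);   /* expect 1000000 */
          return 0;
      }

Task parallelism, by contrast, would give each thread a different routine to run, e.g. one thread computing the sum while another computes the maximum.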

4.3 Multithreading Models

4.3.1 Many-To-One Model


Figure 4.5 - Many-to-one model

4.3.2 One-To-One Model


Figure 4.6 - One-to-one model

4.3.3 Many-To-Many Model


Figure 4.7 - Many-to-many model


Figure 4.8 - Two-level model

4.4 Thread Libraries

4.4.1 Pthreads


Figure 4.9


Figure 4.10 ( new )

4.4.2 Windows Threads


Figure 4.11

4.4.3 Java Threads


Figure 4.12

4.5 Implicit Threading ( Optional )

Implicit threading shifts the burden of addressing the programming challenges outlined in section 4.2.1 above from the application programmer to the compiler and run-time libraries.

4.5.1 Thread Pools

4.5.2 OpenMP

      #pragma omp parallel
      {
          /* some parallel code here */
      }

would cause the compiler to create as many threads as the machine has cores available ( e.g. 4 on a quad-core machine ), and to run the block of code, known as a parallel region, on each of those threads.
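
For reference, a minimal self-contained version of that fragment might look like the following ( a sketch, not from the text ); it assumes an OpenMP-capable compiler invoked with something like gcc -fopenmp.

      /* Minimal OpenMP example: each thread in the team runs the parallel region once. */
      #include <omp.h>
      #include <stdio.h>

      int main(void) {
          #pragma omp parallel
          {
              /* omp_get_thread_num() / omp_get_num_threads() identify this thread. */
              printf("Hello from thread %d of %d\n",
                     omp_get_thread_num(), omp_get_num_threads());
          }
          return 0;
      }

On a quad-core machine this would normally print four lines, one per thread, in an unpredictable order.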

4.5.3 Grand Central Dispatch, GCD

4.5.4 Other Approaches

There are several other approaches available, including Intel's Threading Building Blocks ( TBB ) and Java's java.util.concurrent package.

4.6 Threading Issues

4.6.1 The fork( ) and exec( ) System Calls

4.6.2 Signal Handling

4.6.3 Thread Cancellation

4.6.4 Thread-Local Storage ( was 4.4.5 Thread-Specific Data )

4.6.5 Scheduler Activations


Figure 4.13 - Lightweight process ( LWP )

4.7 Operating-System Examples ( Optional )

4.7.1 Windows XP Threads


Figure 4.14 - Data structures of a Windows thread

4.7.2 Linux Threads

 
The Linux clone( ) system call accepts a set of flags that determine how much the parent task and the new child task will share:

      Flag             Meaning
      CLONE_FS         File-system information is shared
      CLONE_VM         The same memory space is shared
      CLONE_SIGHAND    Signal handlers are shared
      CLONE_FILES      The set of open files is shared
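
As a rough sketch ( not from the text ), the program below creates a thread-like child task with the Linux clone( ) system call, passing the sharing flags from the table above. The stack size and the helper name child_fn are made up for the example, and it assumes compilation with _GNU_SOURCE defined.

      /* Creating a thread-like task with clone(): the child shares the parent's
         address space, file-system information, open files, and signal handlers. */
      #define _GNU_SOURCE
      #include <sched.h>
      #include <signal.h>
      #include <stdio.h>
      #include <stdlib.h>
      #include <sys/wait.h>

      #define STACK_SIZE (1024 * 1024)

      static int child_fn(void *arg) {
          printf("child task running, arg = %s\n", (char *) arg);
          return 0;
      }

      int main(void) {
          char *stack = malloc(STACK_SIZE);
          if (stack == NULL) { perror("malloc"); exit(1); }

          /* The flags from the table above, plus SIGCHLD so the parent can wait. */
          int flags = CLONE_VM | CLONE_FS | CLONE_FILES | CLONE_SIGHAND | SIGCHLD;

          /* The stack grows downward on most architectures, so pass its top. */
          int pid = clone(child_fn, stack + STACK_SIZE, flags, "hello");
          if (pid == -1) { perror("clone"); exit(1); }

          waitpid(pid, NULL, 0);   /* wait for the child task to finish */
          free(stack);
          return 0;
      }

Because CLONE_VM is set, the child runs in the same address space as the parent; fork( ), by contrast, passes none of these flags and gives the child its own copy of everything.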

4.8 Summary