
3.3.1 Shared Memory

Most threads do not operate independently. They cooperate to accomplish a task, and cooperation requires communication. There are many ways that threads can communicate, and which method is most appropriate depends on the task. Threads that cooperate only rarely (for example, a boss thread that only sends off a request for workers to do long tasks) may be satisfied with a relatively slow form of communication. Threads that must cooperate more closely (for example, a set of threads performing a parallelized matrix operation) need fast communication, perhaps even to the extent of using machine-specific atomic hardware operations.

Most mechanisms for thread communication involve shared memory, taking advantage of the fact that all threads within a process share their full address space. Although all addresses are shared, there are three types of memory that are characteristically used for communication. The following sections describe the scope (the areas of the program where code can access the memory) and lifetime (the length of time the memory exists) of each of the three types of memory.
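Before turning to those sections, the following minimal sketch (assuming POSIX threads; the worker routine and counter variable are names invented for this example) illustrates shared-memory communication itself. Both threads see the same counter because they run in one address space; the mutex serializes the increments so that no update is lost.

#include <pthread.h>
#include <stdio.h>

static pthread_mutex_t counter_lock = PTHREAD_MUTEX_INITIALIZER;
static long counter = 0;            /* shared: visible to every thread */

static void *worker(void *arg)
{
    (void)arg;                      /* no per-thread argument needed */
    for (int i = 0; i < 100000; i++) {
        pthread_mutex_lock(&counter_lock);
        counter++;                  /* both threads update the same memory */
        pthread_mutex_unlock(&counter_lock);
    }
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;

    pthread_create(&t1, NULL, worker, NULL);
    pthread_create(&t2, NULL, worker, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);

    printf("counter = %ld\n", counter);   /* 200000: increments from both threads */
    return 0;
}

With the mutex in place the final value is 200000. The point here is only that the threads exchange data by reading and writing the same location; the synchronization is added to keep that exchange coherent, a topic taken up elsewhere in this chapter.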