Most modern operating systems are said to be multithreaded because they allow the creation of additional threads within a process beyond the one that begins the execution of the process. In a multithreaded operating system, a process may consist of more than one thread, all of them executing simultaneously from the user's point of view.
Of course, unless there is more than one processor, the threads are not really executing simultaneously. Instead, the operating system gives the impression of simultaneity by multiplexing among the threads, determining which thread gets control of the processor at any particular time. There are two models for operating a multithreaded process: in the nonpreemptive (or cooperative) model, a thread retains control of the processor until it explicitly yields; in the preemptive model, the operating system may suspend the running thread at any point and switch to another.
In the rest of this section we will assume a preemptive model of multithreading.
We use the term thread to refer to the smallest amount of processor context state necessary to encapsulate a computation. Practically speaking, a thread typically consists of a register set, a stack, a reference to the executable code's address space, and a reference to the data address space. Some parts of the data space are private to the thread, while other parts may be shared with other threads in the same process. Variables of storage class auto that are instantiated by the thread are private to the thread and cannot be accessed by other threads. Variables of storage class static declared outside of functions may be accessed by any thread in the same process.
All threads also have access to the standard files, stdin, stdout, and stderr. In addition, a multithreading implementation may provide for data with the same kind of lifetime as data in the static storage class but where access is restricted to the thread that owns it. Such data uses the thread-local storage class.
A preemptive thread switch may occur between any two machine instructions, which need not coincide with a boundary between two source code statements; a switch can even occur partway through the evaluation of a single expression. One important consequence is that, because switches happen at unpredictable points, when more than one thread is changing the value of a shared variable the results of an execution are likely to differ from one run to the next. This lack of repeatability, called a race condition, makes debugging and validation difficult.
Multithreading requires mechanisms to protect against race conditions. Various methods exist for protecting segments of code from being executed by two or more threads at the same time. A program that is suitably protected against errors in the presence of multithreading is said to be thread-safe.