

Multi-threading and the Operating System

On most modern operating systems, many different users can run programs (processes) at what appears to be the same time. If the operating system is hosted by a hardware system with a single Central Processing Unit (CPU) then these processes are not running at exactly the same time - it just appears that they are. The operating system allocates the CPU and other system resources to the processes that need them and (preemptively) switches these resources from one running program to another. This switching of resources gives the appearance that the programs are running at the same time.

In a very similar way, many modern operating systems allow multiple threads to exist within each process. Again, the operating system handles the allocation of the CPU and various resources to individual threads and switches these allocations from one running thread to another, giving the appearance that the threads are running at the same time.
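
As a minimal sketch, assuming a UNIX-like system where POSIX threads are available (an illustrative choice, since this topic does not prescribe any particular threading interface), the following C program creates three threads inside one process; the operating system switches the CPU between them so that they appear to run at the same time:

/* Sketch only: assumes a UNIX-like system with POSIX threads.
   Compile with the -pthread option on most compilers. */
#include <pthread.h>
#include <stdio.h>

/* Each thread runs this function; the operating system switches the CPU
   between the threads so that they appear to run simultaneously. */
static void *worker(void *arg)
{
    int id = *(int *)arg;
    printf("thread %d running\n", id);
    return NULL;
}

int main(void)
{
    pthread_t threads[3];
    int ids[3] = { 0, 1, 2 };

    for (int i = 0; i < 3; i++)
        pthread_create(&threads[i], NULL, worker, &ids[i]);

    for (int i = 0; i < 3; i++)
        pthread_join(threads[i], NULL);   /* wait for each thread to finish */

    return 0;
}

All three threads belong to the same process and are scheduled preemptively by the operating system, in much the same way that processes are.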

If processes and threads are handled in a similar way by the operating system, what distinguishes them from each other?

All processes have distinct address spaces, open files and other system resources; this enables each process to run with minimal or no effect on other processes running at the same time. However, this separation of resources has a run-time cost: the operating system must set up a separate address space and resource set when it creates a process, and must save and restore that state whenever it switches execution from one process to another.
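
As a minimal sketch of this separation, assuming a UNIX-like system with fork() available (an illustrative choice, not something this topic requires), the child process below receives its own copy of the address space, so a change it makes to a variable is not visible to the parent:

/* Sketch only: assumes a UNIX-like system providing fork() and wait(). */
#include <stdio.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    int value = 10;
    pid_t pid = fork();                     /* create a second process */

    if (pid == 0) {                         /* child process */
        value = 99;                         /* changes the child's private copy only */
        printf("child:  value = %d\n", value);
        return 0;
    }

    wait(NULL);                             /* parent waits for the child */
    printf("parent: value = %d\n", value);  /* still prints 10 */
    return 0;
}

Creating and switching between such fully separated processes is the run-time cost referred to above.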

Because threads run within a single process, all of the threads associated with a process share the same data and code address space, the same open files and most other resources. Each thread also has its own register set and stack space. Creation of a thread, therefore, does not require extensive system memory. When the operating system finds it has to switch execution context from one thread to another within the same process, about the only thing the operating system needs to do is switch CPU ownership and register sets. This is the reason that threads are often called "light-weight processes".
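
By way of contrast with the process example, the sketch below (again assuming a UNIX-like system with POSIX threads) shows two threads in the same process updating one shared variable directly; only the loop counter and other locals live on each thread's private stack, and a mutex serializes access to the shared data:

/* Sketch only: assumes a UNIX-like system with POSIX threads.
   Both threads see the same 'counter', because they share the
   process's data address space. */
#include <pthread.h>
#include <stdio.h>

static long counter = 0;                       /* shared by all threads */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *add_many(void *arg)
{
    for (int i = 0; i < 100000; i++) {         /* 'i' is on this thread's own stack */
        pthread_mutex_lock(&lock);
        counter++;                             /* shared data, so protect it */
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;
    pthread_create(&t1, NULL, add_many, NULL);
    pthread_create(&t2, NULL, add_many, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    printf("counter = %ld\n", counter);        /* prints 200000 */
    return 0;
}

Because no new address space has to be created or switched, creating these threads and switching between them is much cheaper than the equivalent operations on processes.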
