Synchronizing Execution and Resolving Contention
Multi-threading can be described as follows:
On most modern operating systems, many different users can run programs (processes) at what appears to be the same time. If the operating system is hosted by a hardware system with a single Central Processing Unit (CPU) then these processes are not running at exactly the same time - it just appears that they are. The operating system allocates the CPU and other system resources to the processes that need them and (preemptively) switches these resources from one running program to another. This switching of resources gives the appearance that the programs are running at the same time.
In a very similar way, many modern operating systems allow multiple threads to exist within each process. Again, the operating system handles the allocation of the CPU and various resources to individual threads and switches these allocations from one running thread to another. Again, this gives the appearance that the threads are running at the same time.
If processes and threads are handled in a similar way by the operating system, what distinguishes them from each other?
All processes have distinct address spaces, open files and other system resources; this enables each process to run with minimal or no effect on other processes running at the same time. However, this separation of resources has a run-time cost: creating a process, and switching execution context from one process to another, are relatively expensive operations.
Because threads run within a single process, all of the threads associated with a process share the same data and code address space, the same open files and most other resources. Each thread also has its own register set and stack space. Creation of a thread, therefore, does not require extensive system memory. When the operating system finds it has to switch execution context from one thread to another within the same process, about the only thing the operating system needs to do is switch CPU ownership and register sets. This is the reason that threads are often called "light-weight processes".
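Although this chapter's examples are written in COBOL, the key point above can be sketched in any threaded language. The following illustrative Python fragment (a hypothetical example, not part of the product) shows that all threads in a process see the same data items, while each thread's local variables live on its own stack:

```python
import threading

# Module-level data is part of the process's shared address space:
# every thread reads and writes the same object.
shared = []

def worker(name):
    # Local variables are allocated on this thread's own stack;
    # each thread gets an independent copy of 'local'.
    local = "stack-of-thread-%d" % name
    shared.append(name)   # visible to every other thread in the process

threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(shared))   # all four threads updated the one shared list
```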
A multi-threaded application is an application whose architecture takes advantage of the multi-threading provided by the operating system. Usually, these applications assign specific jobs to individual threads within the process and the threads communicate, through various means, to synchronize their actions. For example, a data processing application might be designed so that the graphical user interface is completely handled by a single thread, while the actual work of the application is performed by another thread. This architecture would allow the complete separation of the business logic from the user-interface logic within the application.
Another interesting use for multi-threading within an application is in the server side of a client/server application. The server could be designed so that for each service request submitted to a main controller/communications thread, a separate service thread is created to process that request and communicate the results back to the controller. The controller would then communicate these results back to the client. The main benefit of such a design is that the server is not tied up acting on one client's request to the extent that it cannot respond to other clients' requests quickly.
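The controller/service-thread design described above can be sketched as follows. This is an illustrative Python example (the request handling here is a stand-in that simply upper-cases each request string, and the names `controller` and `service_thread` are invented for illustration):

```python
import threading
import queue

# Service threads post their results here for the controller to collect.
results = queue.Queue()

def service_thread(request):
    # Stand-in for real request processing: each request is handled
    # independently, so one slow request does not block the others.
    reply = request.upper()
    results.put((request, reply))

def controller(requests):
    # The main controller creates one service thread per request...
    threads = [threading.Thread(target=service_thread, args=(r,))
               for r in requests]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # ...then gathers the results to send back to the clients.
    return dict(results.get() for _ in requests)

replies = controller(["ping", "status"])
print(replies)
```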
Of course, client-server applications that have immediate (or very fast) server response have been written without multi-threading for years. This is usually accomplished by a main controller/communications process creating separate service processes for each client request. So why does an application like this need multi-threading? The main answer is: efficient use of system resources.
The creation of a thread requires very little operating system resource (memory, context swap overhead and so on) and the sharing of files and other resources is simplified. The creation of a process requires more system resources and it tends to be much more difficult to communicate and share files between processes. Multi-threading allows you to make the best use of your existing hardware resources and also allows simple resource sharing.
There are, of course, disadvantages in using multi-threaded applications. Each thread must be aware of the resources that it might be sharing with other threads in the same process. The programmer must remember that threads execute at the same time, and that if two threads write to the same data item without any form of execution synchronization, the data item is likely to be corrupted.
For example, consider the following code fragments:
Thread 1:

    Working-Storage Section.
    01  group-item.
        05  field-1    pic x.
        05  field-2    pic x.
    procedure division.
        move 'A' to field-1
        move 'B' to field-2
        display group-item
Thread 2:

    Working-Storage Section.
    01  group-item.
        05  field-1    pic x.
        05  field-2    pic x.
    procedure division.
        move 'C' to field-2
        move 'D' to field-1
        display group-item
Consider the following possible order of thread execution:
    Processing step | Thread 1 execution   | Thread 2 execution
    ----------------|----------------------|---------------------
    1               | move 'A' to field-1  |
    2               |                      | move 'C' to field-2
    3               | move 'B' to field-2  |
    4               |                      | move 'D' to field-1
    5               | display group-item   |
    6               |                      | display group-item
In this example, neither thread will display its expected result. The intended result of Thread 1 was to display 'AB', while the intended result of Thread 2 was to display 'DC'. However, the actual result is that both threads display 'DB'. If you are creating multi-threaded applications, you must synchronize execution between threads, and avoid and resolve data contention between threads, as described in the chapter Synchronizing Execution and Resolving Contention.
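One common way to resolve this kind of contention is a mutual-exclusion lock, so that each thread's update-and-display sequence runs as an indivisible unit. The following illustrative Python sketch (names such as `group_item` and `group_lock` are invented to mirror the COBOL fragments above, not taken from the product) shows the same two threads, now serialized by a lock:

```python
import threading

# Shared data corresponding to group-item in the COBOL fragments.
group_item = {"field-1": " ", "field-2": " "}
group_lock = threading.Lock()
outputs = []

def thread_1():
    # Holding the lock makes "move, move, display" indivisible,
    # so no other thread can interleave its own moves.
    with group_lock:
        group_item["field-1"] = "A"
        group_item["field-2"] = "B"
        outputs.append(group_item["field-1"] + group_item["field-2"])

def thread_2():
    with group_lock:
        group_item["field-2"] = "C"
        group_item["field-1"] = "D"
        outputs.append(group_item["field-1"] + group_item["field-2"])

t1 = threading.Thread(target=thread_1)
t2 = threading.Thread(target=thread_2)
t1.start()
t2.start()
t1.join()
t2.join()

# Each thread now displays its intended result ('AB' and 'DC'),
# although which thread runs first is still not determined.
print(sorted(outputs))
```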
Copyright © 2000 MERANT International Limited. All rights reserved.
This document and the proprietary marks and names used herein are protected by international law.