Java Concurrency in Practice - Chapter 1 - Introduction
Benefits of threads:
- Exploiting multiple processors.
  - When properly designed, multithreaded programs can improve throughput by utilizing available processor resources more effectively.
  - Even on single-processor systems, while one thread is blocked waiting for a synchronous event such as an I/O operation to complete, another thread can still run, allowing the application to make progress.
- Simplicity of modeling.
  - A program that processes one type of task sequentially is simpler to write, less error-prone, and easier to test than one managing multiple different types of tasks at once. With multiple threads, a complicated, asynchronous workflow can be decomposed into a number of simpler, synchronous workflows, each running in a separate thread.
- Simplified handling of asynchronous events.
  - In a single-threaded application, blocking on I/O can stall all processing. To avoid this problem, single-threaded applications are forced to use non-blocking I/O, which is far more complicated and error-prone than synchronous I/O.
- More responsive user interfaces.
  - If code called from the main event loop takes too long to execute, the user interface appears to "freeze" until that code finishes, because subsequent user-interface events cannot be processed until control returns to the main event thread. By assigning the long-running task to a separate thread, the main event thread remains free to process UI events, keeping the UI responsive.
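As a minimal sketch of this idea (the class and method names below are my own, not from the book): a long-running task is handed to a worker thread via an `ExecutorService`, so the thread that submitted it stays free to keep handling events until it actually needs the result.

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class BackgroundTaskDemo {
    // Run a potentially long task on a separate worker thread.
    public static String runInBackground(Callable<String> longTask) throws Exception {
        ExecutorService worker = Executors.newSingleThreadExecutor();
        try {
            Future<String> result = worker.submit(longTask);
            // Between submit() and get(), the submitting thread is free to
            // keep processing events; it only blocks when it needs the result.
            return result.get();
        } finally {
            worker.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(runInBackground(() -> "done"));
    }
}
```

In a real GUI the result would typically be delivered back to the event thread via a callback rather than a blocking `get()`, but the division of labor is the same.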
Risks of threads:
- Safety = nothing bad ever happens.
  - Example: race condition.
  - In the absence of sufficient synchronization, the ordering of operations in multiple threads is unpredictable and sometimes surprising.
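A minimal sketch of such a race (modeled on the book's unsynchronized-counter idea; the class name here is my own): two threads increment a shared field without synchronization. Because `value++` is a read-modify-write sequence rather than an atomic operation, increments from the two threads can interleave and overwrite each other, so the final count is often less than expected.

```java
public class UnsafeCounterDemo {
    static int value = 0;                  // shared, unsynchronized

    static void increment() { value++; }   // read-modify-write: NOT atomic

    // Run two threads that each perform 100,000 unsynchronized increments.
    static int run() throws InterruptedException {
        value = 0;
        Runnable task = () -> { for (int i = 0; i < 100_000; i++) increment(); };
        Thread t1 = new Thread(task), t2 = new Thread(task);
        t1.start(); t2.start();
        t1.join(); t2.join();
        return value;   // frequently less than 200,000 due to lost updates
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(run());
    }
}
```

Declaring `increment()` as `synchronized` (or using `java.util.concurrent.atomic.AtomicInteger`) would make the result deterministic.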
- Liveness = something good eventually happens.
  - Examples: deadlock, starvation, livelock.
  - A liveness failure occurs when an activity gets into a state such that it is permanently unable to make forward progress.
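A sketch of the classic lock-ordering deadlock hazard and its standard fix (names are my own): if one thread locks A then B while another locks B then A, each can acquire its first lock and then wait forever for the other's. The usual remedy, shown runnable below, is to have every thread acquire the locks in the same global order.

```java
public class LockOrderingDemo {
    static final Object LOCK_A = new Object();
    static final Object LOCK_B = new Object();
    static int counter = 0;

    // Deadlock-prone pattern (deliberately NOT run here): thread 1 does
    // synchronized(LOCK_A){ synchronized(LOCK_B){...} } while thread 2 does
    // synchronized(LOCK_B){ synchronized(LOCK_A){...} }. Each can hold its
    // first lock and block forever on the second: a liveness failure.

    // Fix: all threads acquire the locks in one fixed order, A before B.
    static void safeIncrement() {
        synchronized (LOCK_A) {
            synchronized (LOCK_B) {
                counter++;
            }
        }
    }

    static int run() throws InterruptedException {
        counter = 0;
        Runnable task = () -> { for (int i = 0; i < 10_000; i++) safeIncrement(); };
        Thread t1 = new Thread(task), t2 = new Thread(task);
        t1.start(); t2.start();
        t1.join(); t2.join();
        return counter;   // exactly 20,000: consistent ordering avoids deadlock
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(run()); // prints 20000
    }
}
```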
- Performance = good things happen quickly.
  - Examples of performance problems: poor service time, responsiveness, throughput, resource consumption, or scalability.
  - Context switches occur when the scheduler suspends the active thread temporarily so another thread can run. The more threads there are, the more significant the costs of saving and restoring execution context, loss of locality, and CPU time spent scheduling threads instead of running them.
  - Synchronization of shared data can inhibit compiler optimizations, flush or invalidate memory caches, and create synchronization traffic on the shared memory bus.
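A rough sketch of where these costs come from (not a rigorous benchmark; the class and method names are my own): when many threads contend for a single lock, threads repeatedly block and get rescheduled, incurring context switches and synchronization traffic that a single-threaded loop over the same work would avoid entirely.

```java
public class ContentionSketch {
    static final Object LOCK = new Object();
    static long shared = 0;

    // Every increment takes the same lock, so with several threads most
    // of the work is contended: threads block, get descheduled, and pay
    // for context switches and cache-coherence traffic.
    static long contendedSum(int threads, int perThread) throws InterruptedException {
        shared = 0;
        Thread[] ts = new Thread[threads];
        for (int i = 0; i < threads; i++) {
            ts[i] = new Thread(() -> {
                for (int j = 0; j < perThread; j++) {
                    synchronized (LOCK) { shared++; }
                }
            });
        }
        for (Thread t : ts) t.start();
        for (Thread t : ts) t.join();
        return shared;
    }

    public static void main(String[] args) throws InterruptedException {
        long start = System.nanoTime();
        long sum = contendedSum(8, 100_000);
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println(sum + " increments in ~" + elapsedMs + " ms");
    }
}
```

The result is always exact (the lock makes each increment atomic); what varies with thread count is the time spent, which is the point: correctness costs were paid in throughput.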