Concurrency and Parallelism
Concurrency and parallelism are often confused but have distinct meanings.
What is Concurrency?
Concurrency is about dealing with many tasks at once. These tasks don't necessarily run at the same time; instead, they make progress in overlapping periods. It's like a single chef preparing multiple dishes: the chef switches between tasks, chopping vegetables while the water boils, and that switching creates the appearance of everything happening simultaneously.
Example of Concurrency
In JavaScript, concurrency is built on the event loop and expressed with callbacks, promises, and async/await.
console.log('Start');
setTimeout(() => {
  console.log('Timeout');
}, 1000);
console.log('End');
In this example, "Start" and "End" print immediately, while "Timeout" prints about one second later: the timer callback is queued by the event loop and runs only after the rest of the code has finished.
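The same overlapping behavior can also be written with async/await. Below is a minimal sketch using Python's asyncio for comparison, keeping the chef analogy; the task names and one-second delays are made up for illustration.

import asyncio

async def boil_water():
    print("Boiling: start")
    await asyncio.sleep(1)  # simulated wait; control returns to the event loop
    print("Boiling: done")

async def chop_vegetables():
    print("Chopping: start")
    await asyncio.sleep(1)
    print("Chopping: done")

async def main():
    # Both coroutines make progress during each other's waits
    await asyncio.gather(boil_water(), chop_vegetables())

asyncio.run(main())

Both tasks finish in roughly one second of wall-clock time because they wait concurrently rather than one after the other.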
What is Parallelism?
Parallelism is about running multiple tasks at the same time, which requires multiple processors or cores. It's like a restaurant with many chefs, each preparing a different dish simultaneously, so more work gets done in the same amount of time.
Example of Parallelism
Parallelism is often used in data processing. For example, in Python with the multiprocessing module:
from multiprocessing import Pool

def square_number(n):
    return n * n

if __name__ == "__main__":
    with Pool(4) as p:
        print(p.map(square_number, [1, 2, 3, 4]))
Here, Pool(4) starts four worker processes, and the four numbers are squared in parallel as long as enough CPU cores are available.
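The standard library's concurrent.futures module offers a higher-level interface over the same idea. The sketch below is one equivalent way to write the pool, not a replacement for the example above; max_workers=4 simply mirrors Pool(4).

from concurrent.futures import ProcessPoolExecutor

def square_number(n):
    return n * n

if __name__ == "__main__":
    # Up to four worker processes; results come back in input order
    with ProcessPoolExecutor(max_workers=4) as executor:
        print(list(executor.map(square_number, [1, 2, 3, 4])))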
Concurrency vs. Parallelism
- Concurrency: Managing multiple tasks, not necessarily simultaneously.
- Parallelism: Executing multiple tasks at the exact same time.
Why They Matter
- Concurrency improves responsiveness. It allows programs to handle multiple operations, like user input and data fetching.
- Parallelism enhances performance. It speeds up computations by spreading work across multiple processor cores, as the timing sketch below illustrates.
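As a rough illustration of that performance point, the sketch below times a CPU-bound function run serially and then through a process pool. The function, the input sizes, and any speedup you observe are illustrative and depend on your machine's core count.

import time
from multiprocessing import Pool

def busy_sum(n):
    # CPU-bound placeholder work
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [5_000_000] * 4

    start = time.perf_counter()
    serial = [busy_sum(n) for n in inputs]
    print(f"serial:   {time.perf_counter() - start:.2f}s")

    start = time.perf_counter()
    with Pool(4) as p:
        parallel = p.map(busy_sum, inputs)
    print(f"parallel: {time.perf_counter() - start:.2f}s")

    assert serial == parallel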
Practical Uses
- Concurrency: Used in web servers to handle multiple requests (a minimal server sketch follows this list).
- Parallelism: Used in scientific computing for heavy calculations.
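For the web-server case, Python's standard library includes a thread-per-request HTTP server. The sketch below is a minimal illustration, not production guidance; the handler body and port 8000 are arbitrary choices.

from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class EchoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Each request runs on its own thread, so one slow client
        # does not block the others
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"hello\n")

if __name__ == "__main__":
    ThreadingHTTPServer(("localhost", 8000), EchoHandler).serve_forever()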