Concurrency is the ability of a system to handle multiple tasks at once. It’s about designing a program so that it can manage and make progress on several tasks in overlapping time periods, even if they aren’t all executing at the exact same instant. The core idea is that a program can have multiple independent processes or threads running, and it can switch between them rapidly to give the illusion that they are all running simultaneously. This is what allows you to browse the web, listen to music, and have a word processor open on your single-core computer without one task completely freezing the others.

A classic analogy for concurrency is a chef in a kitchen. The chef can be mixing a bowl of dough (Task A), waiting for water to boil (Task B), and chopping vegetables (Task C). The chef doesn’t do all these things at the same exact moment. Instead, they might chop some vegetables, then go and stir the dough, then check the water, and so on. The chef is “juggling” these tasks, making sure all of them are making progress without getting stalled. This is concurrency: dealing with many things at once.

Concurrency vs. Parallelism

This brings us to a key distinction: concurrency is not the same as parallelism.

  • Concurrency is about managing multiple tasks. It’s a structural property of a program that allows it to handle multiple tasks at once. A concurrent program can run on a single processor core by rapidly switching between tasks, a process known as context switching. The tasks are interleaved, and it appears to the user that they are happening at the same time.
  • Parallelism, on the other hand, is about executing multiple tasks at the same time. This requires hardware with multiple processors or processor cores. With parallelism, two or more tasks can literally be running at the very same instant, each on its own core.
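The distinction can be made concrete with a minimal sketch using Python's asyncio, which runs everything on a single thread: the two tasks interleave (concurrency) but never execute at the same instant (no parallelism). The task names and delays here are arbitrary illustration values.

```python
import asyncio

# Two tasks interleave on a single thread: while one is "waiting",
# the other makes progress. Concurrency without parallelism.
order = []

async def task(name, delay):
    order.append(f"{name} started")
    await asyncio.sleep(delay)   # yields control to the event loop while waiting
    order.append(f"{name} finished")

async def main():
    # Start both tasks; the event loop switches between them.
    await asyncio.gather(task("A", 0.2), task("B", 0.1))

asyncio.run(main())
print(order)  # ['A started', 'B started', 'B finished', 'A finished']
```

Note that B finishes before A even though A started first: the single thread switched to B while A was waiting, which is exactly the chef checking the water while the dough rests.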

To extend the chef analogy, parallelism would be having multiple chefs in the kitchen, each one working on a separate dish at the same time. The first chef is chopping vegetables, while the second chef is simultaneously stirring the dough. You need more than one chef (or in computing terms, more than one CPU core) to achieve true parallelism.

Concurrency and parallelism often work together. A concurrent program can be run on a multi-core machine to achieve true parallelism, where different parts of the program are executed on different cores at the same time.

Why is Concurrency Important?

Concurrency is essential in modern computing for a number of reasons:

  • Responsiveness: In a graphical user interface (GUI) application, concurrency allows the application to remain responsive to user input while it performs a long-running, “heavy” task in the background. Without it, the application would freeze and become unresponsive until the task is complete.
  • Efficient Resource Use: Many tasks, especially in web and network applications, involve waiting for external events, such as a file to be read from a disk or a response to come back from a server. Concurrency allows the program to put a waiting task on hold and work on other tasks, preventing the CPU from sitting idle.
  • Scalability: A program structured as independent concurrent tasks can take advantage of additional processor cores or machines, so its throughput can grow as more hardware becomes available.
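The "efficient resource use" point can be sketched with threads: three simulated downloads each wait 0.2 seconds, but because the waits overlap, the total time is roughly 0.2 seconds rather than 0.6. The `fake_download` function and its timings are illustrative stand-ins for real network calls.

```python
import threading
import time

def fake_download(name, results):
    time.sleep(0.2)              # stands in for waiting on the network
    results[name] = "done"

results = {}
start = time.perf_counter()
threads = [threading.Thread(target=fake_download, args=(n, results))
           for n in ("a", "b", "c")]
for t in threads:
    t.start()
for t in threads:
    t.join()                     # wait for all downloads to finish
elapsed = time.perf_counter() - start

# The three 0.2 s waits overlap, so elapsed is ~0.2 s, not 0.6 s.
print(f"{elapsed:.2f}s", results)
```

While one thread is blocked waiting, the CPU is free to run the others, which is exactly the idle-avoidance described above.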

Concurrency introduces complexity, such as race conditions (where the outcome depends on the unpredictable timing of multiple tasks accessing a shared resource) and deadlocks (where two or more tasks are stuck waiting for each other). However, with proper management using techniques like locks and semaphores, these challenges can be overcome to build robust and efficient software.
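A race condition and its fix can be shown in a few lines: `counter += 1` is a read-modify-write sequence, and without a lock, four threads incrementing a shared counter can lose updates. This is a minimal sketch using Python's `threading.Lock`.

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        # Without the lock, the read-modify-write of `counter`
        # from several threads can interleave and lose updates.
        with lock:
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000 -- the lock makes each update atomic
```

Removing the `with lock:` line can make the final count come out below 400,000, and differently on each run, which is the "unpredictable timing" the text describes.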
