Concurrency is not Parallelism - A deeper look at their key difference

Today, in this article, we'll analyze two concepts that come into the picture whenever an operating system (OS) has more than one process waiting to be executed: concurrency and parallelism. Before we jump into the explanation of each, let's have a quick look at how the CPU works under the hood.

Generally, a processor core executes one instruction at a time. So let's say you're playing a video on a single-core machine. Does that mean you won't get any notifications from the mail app running in the background? No, your background processes keep doing their job nicely. But why is that? Didn't I just say a single core can run only one instruction at a time? How is that possible? Ah well, that's the main job of an operating system, and the mechanism behind it is known as context switching. These processes don't actually run at the same time; they run in a scheduled manner, with the operating system scheduling each process for execution. For simplicity's sake, let's say we have 3 processes.

Round-Robin scheduling

If the scheduler schedules the processes as in the figure above, the CPU will execute Process 1, then after a certain amount of time move to Process 2, then 3, then back to 1. You get the idea. The act of the OS switching from one process to another is called context switching.
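The round-robin idea above can be sketched in a few lines of Python. This is a minimal illustration, not how an OS is actually implemented: each "process" is a generator that does one step of work per turn, and the `round_robin` helper (a name made up for this sketch) cycles through them the way the scheduler in the figure does.

```python
# A toy round-robin scheduler. Each generator yields after one "time
# slice" of work, and the scheduler moves on to the next task in line.
from collections import deque

def process(name, steps):
    # One yield = one time slice; the generator's saved position plays
    # the role of the process's saved context.
    for i in range(1, steps + 1):
        yield f"{name} step {i}"

def round_robin(tasks):
    queue = deque(tasks)
    order = []
    while queue:
        task = queue.popleft()        # pick the next scheduled task
        try:
            order.append(next(task))  # run it for one time slice
            queue.append(task)        # send it to the back of the line
        except StopIteration:
            pass                      # task finished; drop it
    return order

trace = round_robin([process("P1", 2), process("P2", 2), process("P3", 2)])
print(trace)
# ['P1 step 1', 'P2 step 1', 'P3 step 1',
#  'P1 step 2', 'P2 step 2', 'P3 step 2']
```

Notice that no task runs twice in a row: each one gets a slice, then waits for its next turn, exactly as in the figure.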

OS context switching

In the figure above, two processes, A and B, are scheduled. The CPU is processing Process A. Before it finishes, the OS stops that task and makes the CPU process B. Whether or not Process B is finished, the CPU later moves back to Process A. That's the operating system doing its main job. All of this happens so fast that, from the user's perspective, it feels simultaneous. The user has the impression of parallelism: it's the illusion of parallel execution even though it isn't.


So first things first: when we talk about concurrency, concurrent execution does not necessarily mean executing tasks at the same time, as the single-core discussion above shows. When we have 2 tasks scheduled, at any given point in time either task 1 or task 2 is being executed, not both. The OS scheduler and context switcher decide how the CPU finishes both tasks.

At a given point in time, only one task gets processed.

From the figure above, we can see that Task 2 is being processed while Task 1 is not yet finished. That's the key difference from parallel execution, which we'll talk about in a sec. Yes, Task 2 runs while Task 1 is unfinished, but at that moment Task 1 is stopped. The state of Task 1 (program counter, memory, registers, etc.) is frozen and saved somewhere by the context switcher, so that execution can resume from the same point when the CPU goes back to Task 1.

Concurrency means dealing with multiple tasks in the same period of time, but not necessarily executing them simultaneously.
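We can see this kind of interleaving with Python threads. This is a minimal sketch under one assumption worth naming: on CPython, the global interpreter lock (GIL) means only one thread executes Python bytecode at any instant, so two threads here behave much like two tasks on a single core, concurrent but not parallel.

```python
# Two threads appending to a shared list. The interpreter switches
# between them, so both make progress, but never truly at once (GIL).
import threading

events = []
lock = threading.Lock()

def task(name, steps):
    for _ in range(steps):
        with lock:             # protect the shared list
            events.append(name)

t1 = threading.Thread(target=task, args=("Task 1", 3))
t2 = threading.Thread(target=task, args=("Task 2", 3))
t1.start(); t2.start()
t1.join(); t2.join()

print(events)  # six entries; the exact interleaving depends on the
               # scheduler, just like the OS context switching above
```

The order of entries may differ from run to run, which is exactly the point: the scheduler decides when each task gets its turn.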


The key difference between parallel and concurrent execution is this: for parallelism, we need separate hardware. That can be a different core or an entirely different machine. If we bring back the same example for the parallel case, execution looks like the figure below:

At a given point in time, multiple tasks get processed.

While Task 1 is being processed, at a certain point Task 2 starts without Task 1 being stopped. They are both running at the same time. Parallelism doesn't even require two separate tasks: parts of a single task, or multiple tasks, can literally run at the same physical time on a multi-core CPU, with one core assigned to each task or sub-task.

Parallelism is a specific kind of concurrency where tasks are really executed simultaneously.
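Here's a minimal sketch of real parallelism in Python using the standard-library `multiprocessing` module. Each worker is a separate OS process, so on a multi-core machine the workers can run on different cores at the same time (the function name `parallel_squares` is just made up for this example).

```python
# Splitting CPU-bound work across worker processes. Unlike threads
# under the GIL, separate processes can truly run simultaneously
# on different cores.
from multiprocessing import Pool

def square(n):
    return n * n

def parallel_squares(nums):
    # Two worker processes; pool.map divides the inputs between them
    # and returns results in the original input order.
    with Pool(processes=2) as pool:
        return pool.map(square, nums)

if __name__ == "__main__":
    print(parallel_squares([1, 2, 3, 4]))  # [1, 4, 9, 16]
```

The `if __name__ == "__main__"` guard matters here: on platforms that spawn workers by re-importing the module, it prevents each worker from recursively launching its own pool.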


Even though both concurrent and parallel execution will finish a set of tasks significantly faster than a purely sequential approach, parallel execution will usually finish a bit faster than concurrent (depending on the number and types of tasks).

But now you might wonder: why bother with concurrency if it doesn't run multiple tasks at the same time? Wouldn't it be better to run tasks sequentially and forget about concurrency? Well, concurrency has some real benefits. I will talk about those some other time.

So for today, that's all about concurrency and parallelism, a very important pair of concepts. I hope they'll be easier to tell apart the next time either one comes up in conversation.

Happy Learning !!!

Top comments (3)

Karan Pratap Singh

Great article, here's a cool quote I remember that helped me understand this topic initially.

Concurrency is dealing with lots of things at the same time. Parallelism is doing lots of things at the same time.

Bernd Wechner

I don't disagree, but I would be careful with the conclusion that the CPU is thus "run[ning] multiple tasks in parallel". The fact that a given process is waiting on something and hence happy to release other resources (like CPU) while it waits simply makes it a voluntary participant in the concurrency game and does not magically produce parallelism from a single resource (like a given CPU core).

The typical web scenario does support parallelism of course, even with single-threaded JavaScript engines running, because clients and servers can run code at the same time with no conflict over CPU access.

Bernd Wechner

It's worth observing that concurrency here is just a special case of the more general resource conflict management that an OS and/or firmware handles: more than one call for a given resource (in this case a processor) at a given time.

Get in line...