Mutual Exclusion in Synchronization

Last Updated : 22 Jul, 2024

During the concurrent execution of processes, a process sometimes needs to enter its critical section - the part of the program that accesses resources shared across processes. Because multiple processes execute at once, the values stored in that shared data can become inconsistent; in other words, the result depends on the order in which instructions execute. This is known as a race condition. The primary task of process synchronization is to eliminate race conditions while executing critical sections.

What is Mutual Exclusion?

Mutual exclusion is a property of process synchronization which states that no two processes can be in their critical sections at the same time. The term was first coined by Dijkstra. Any process synchronization technique must satisfy mutual exclusion; without it, race conditions cannot be eliminated.

The need for mutual exclusion comes with concurrency. There are several kinds of concurrent execution:

- Interrupt handlers
- Interleaved, preemptively scheduled processes/threads
- Multiprocessor systems with shared memory
- Distributed systems

Mutual exclusion methods are used in concurrent programming to prevent the simultaneous use of a common resource, such as a global variable, by the pieces of code called critical sections. The requirement is that while process P1 is accessing a shared resource R1, no other process may access R1 until P1 has finished its operation on R1. Examples of such resources include files, I/O devices such as printers, and shared data structures.

Conditions Required for Mutual Exclusion

Mutual exclusion must satisfy the following four criteria:

- When shared resources are used, mutual exclusion between processes must be enforced: no two processes may be in their critical sections at the same time.
- No assumptions should be made about the relative speeds of the processes.
- A process that is outside its critical section must not prevent another process from entering the critical section.
- A process must be able to enter its critical section within a finite amount of time; processes must never be kept waiting indefinitely.

Approaches To Implementing Mutual Exclusion

- Software Method: Leave the responsibility to the processes themselves. These methods are usually error-prone and carry high overhead.
- Hardware Method: Special-purpose machine instructions are used for accessing shared resources. This method is faster, but it does not provide a complete solution: hardware solutions alone cannot guarantee the absence of deadlock and starvation.
- Programming Language Method: Provide support through the operating system or through the programming language.

Requirements of Mutual Exclusion

- At any time, only one process is allowed to be inside its critical section.
- The solution is implemented purely in software on a machine.
- A process remains inside its critical section for a bounded time only.
- No assumption can be made about the relative speeds of asynchronous concurrent processes.
- A process cannot prevent any other process from entering a critical section.
- A process must not be indefinitely postponed from entering its critical section.

To understand mutual exclusion in practice, let's look at some examples.
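To make these requirements concrete, below is a minimal sketch (assuming a POSIX threads environment; the counter, thread count, and iteration count are chosen purely for illustration) in which two threads repeatedly increment a shared counter. The increment counter++ is a read-modify-write sequence, so without a lock the two threads can interleave and lose updates; wrapping it in pthread_mutex_lock / pthread_mutex_unlock turns it into a critical section that only one thread executes at a time.

```c
#include <pthread.h>
#include <stdio.h>

#define ITERATIONS 1000000              /* arbitrary, for illustration */

long counter = 0;                       /* shared resource */
pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

void *worker(void *arg)
{
    for (int i = 0; i < ITERATIONS; i++) {
        pthread_mutex_lock(&lock);      /* entry section: acquire exclusive access */
        counter++;                      /* critical section: at most one thread here */
        pthread_mutex_unlock(&lock);    /* exit section: release for others */
    }
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;
    pthread_create(&t1, NULL, worker, NULL);
    pthread_create(&t2, NULL, worker, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);

    /* With the mutex this always prints 2000000. */
    printf("counter = %ld\n", counter);
    return 0;
}
```

Compiled with gcc -pthread, the protected version always prints 2000000, whereas removing the lock typically produces a smaller total that varies from run to run - exactly the race condition described above.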
What is the Need for Mutual Exclusion?

An easy way to see the significance of mutual exclusion is to imagine a linked list from which the fourth and fifth items need to be removed. A node is deleted by changing the next reference of the node before it to point to the node after it: to remove node "i", the next reference of node "i - 1" is changed to point to node "i + 1".

When such a linked list is shared by many threads, two threads may try to remove two adjacent nodes at the same time. Suppose the first thread removes node "i" by changing the next reference of node "i - 1" to point to node "i + 1", while at the same moment the second thread removes node "i + 1" by changing the next reference of node "i" to point to node "i + 2". Both threads believe their node has been removed, but the list is not in the intended state: node "i - 1" still points to node "i + 1", so node "i + 1" remains in the list, and the second thread's update is lost because it modified node "i", which is no longer reachable.

This situation is called a race condition. Mutual exclusion prevents it by ensuring that simultaneous updates cannot be made to the same part of the list. (A code sketch of this scenario is given at the end of the article.)

Example:

In the clothes section of a supermarket, two people are shopping for clothes. Boy A decides on some clothes to buy and heads to the changing room to try them on. While boy A is inside, an "occupied" sign hangs on the door, indicating that no one else can come in. Boy B also has to use the changing room, so he must wait until boy A is done with it. Once boy A comes out, the sign changes from "occupied" to "vacant", indicating that another person can use it. Boy B then uses the changing room, and the sign displays "occupied" again.

The changing room is the critical section, boy A and boy B are two different processes, and the sign outside the changing room is the process synchronization mechanism being used.

Conclusion

In conclusion, mutual exclusion is a key concept in synchronization that ensures only one process accesses a shared resource at a time. This prevents conflicts and data corruption, making sure that processes run smoothly and correctly. By using mutual exclusion mechanisms, we can create stable and reliable systems that handle multiple processes efficiently.
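As promised above, here is a minimal sketch of the linked-list scenario (assuming POSIX threads and a singly linked list with a dummy head node; the node layout and function names are hypothetical, chosen only for this sketch). Because the entire search-and-unlink runs while the lock is held, two threads deleting adjacent nodes are serialized, and the lost-update interleaving described earlier cannot occur.

```c
#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>

/* Illustrative singly linked list node. */
struct node {
    int value;
    struct node *next;
};

static struct node head = { 0, NULL };   /* dummy head node */
static pthread_mutex_t list_lock = PTHREAD_MUTEX_INITIALIZER;

/* Delete the first node whose value equals 'key'. The whole search-and-unlink
 * is one critical section, so two threads deleting adjacent nodes (e.g. the
 * fourth and fifth items) are serialized and neither deletion is lost. */
static void list_delete(int key)
{
    pthread_mutex_lock(&list_lock);
    struct node *prev = &head;
    while (prev->next != NULL && prev->next->value != key)
        prev = prev->next;
    if (prev->next != NULL) {
        struct node *victim = prev->next;
        prev->next = victim->next;       /* node i-1 now points to node i+1 */
        free(victim);
    }
    pthread_mutex_unlock(&list_lock);
}

static void *deleter(void *arg)
{
    list_delete((int)(long)arg);
    return NULL;
}

int main(void)
{
    /* Build the list 1 -> 2 -> 3 -> 4 -> 5. */
    struct node *tail = &head;
    for (int v = 1; v <= 5; v++) {
        struct node *n = malloc(sizeof *n);
        n->value = v;
        n->next = NULL;
        tail->next = n;
        tail = n;
    }

    /* Two threads remove the fourth and fifth items concurrently. */
    pthread_t t1, t2;
    pthread_create(&t1, NULL, deleter, (void *)4L);
    pthread_create(&t2, NULL, deleter, (void *)5L);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);

    /* Both deletions take effect: prints 1 2 3. */
    for (struct node *p = head.next; p != NULL; p = p->next)
        printf("%d ", p->value);
    printf("\n");
    return 0;
}
```

Without list_lock, the two unlink operations could interleave as described above, leaving the fifth item still reachable from the list even though both threads believe their deletion succeeded.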