Overview of synchronization issues

  • Assume that our shared count variable has a starting value of 0.
  • We wish to use this variable for two simultaneous activities.

Using Thread 1, add 1 to this variable 100 times.
Using Thread 2, subtract 1 from this variable 100 times.

  • Print the variable's value at the end. Ideally it should print 0, but it often won't, because many threads working on the same variable simultaneously can produce unexpected outcomes (see the sketch below).
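
A minimal sketch of this experiment (the iteration count of 100 comes from the description above; the class name and thread setup are illustrative):

using System;
using System.Threading;

class CounterRace
{
    // Shared count variable with a starting value of 0.
    private static int count = 0;

    static void Main()
    {
        // Thread 1: add 1 to the variable, 100 times.
        var t1 = new Thread(() =>
        {
            for (int i = 0; i < 100; i++)
                count++;   // not atomic: read, add 1, write back
        });

        // Thread 2: subtract 1 from the variable, 100 times.
        var t2 = new Thread(() =>
        {
            for (int i = 0; i < 100; i++)
                count--;   // not atomic: read, subtract 1, write back
        });

        t1.Start();
        t2.Start();
        t1.Join();
        t2.Join();

        // Ideally prints 0, but interleaved updates can lose increments
        // or decrements, so other values are possible.
        Console.WriteLine(count);
    }
}

With only 100 iterations the lost updates may be rare; raising the iteration count makes the wrong result much easier to observe.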

When does the synchronization problem happen?
Critical Section
A code segment that accesses a shared resource and can be executed by more than one thread is known as a critical section.
When more than one thread is inside the critical section at the same time, the result can be unexpected; this is the synchronization problem.

Race Condition
A race condition occurs when more than one thread enters the critical section at the same time, so the final result depends on the order and timing in which the threads happen to run.

Preemption
Preemption is the ability of the operating system to preempt (that is, stop or pause) the currently scheduled task in favor of a higher-priority task.
If a thread is preempted while it is inside the critical section and another thread then enters the same section, this can again lead to the synchronization problem.

Solutions to the synchronization problem

In C#, there are several ways to synchronize access to shared resources to ensure thread safety and prevent race conditions.

Using lock keyword
The lock keyword provides a convenient way to create a synchronized block of code. It internally uses the Monitor class to achieve synchronization. The lock keyword ensures that only one thread can execute the locked code block at a time.

Example
private static object syncObject = new object();

private void Increment()
{
    lock (syncObject)
    {
        // Critical section: Access shared resource
    }
}


Using Monitor Class
Instead of using the lock keyword, we can directly use methods of the Monitor class for synchronization.

Example
private static object syncObject = new object();

private void Increment()
{
    Monitor.Enter(syncObject);
    try
    {
        // Critical section: Access shared resource
    }
    finally
    {
        Monitor.Exit(syncObject);
    }
}
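
Monitor also provides TryEnter, which tries to acquire the lock within a timeout instead of blocking indefinitely. A sketch (the 500 ms timeout is an arbitrary illustrative value):

private static object syncObject = new object();

private void TryIncrement()
{
    // Wait at most 500 ms for the lock instead of blocking forever.
    if (Monitor.TryEnter(syncObject, 500))
    {
        try
        {
            // Critical section: Access shared resource
        }
        finally
        {
            Monitor.Exit(syncObject);
        }
    }
    else
    {
        // Could not acquire the lock in time; handle or retry.
    }
}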


Using Mutex
A mutex is a synchronization primitive that allows only one thread to acquire it at a time. It can be used to synchronize threads within the same process and, unlike lock or Monitor, it can also be used for inter-process synchronization.

Example
private static Mutex mutex = new Mutex();

private void Increment()
{
    mutex.WaitOne();
    try
    {
        // Critical section: Access shared resource
    }
    finally
    {
        mutex.ReleaseMutex();
    }
}
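
Because a Mutex is an operating-system object, it can also be given a name and shared between processes. A sketch of this usage (the mutex name below is made up for illustration):

// A named mutex is visible to any process that opens the same name.
private static Mutex namedMutex = new Mutex(false, @"Global\MyAppMutex");

private void IncrementAcrossProcesses()
{
    namedMutex.WaitOne();
    try
    {
        // Critical section shared with other processes using this name
    }
    finally
    {
        namedMutex.ReleaseMutex();
    }
}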

Using Semaphore
A semaphore is a synchronization primitive that allows a specified number of threads to enter a critical section simultaneously. It's useful when we want to limit the number of threads accessing a resource.

Example
private static Semaphore semaphore = new Semaphore(1, 1); // Limits access to one thread

private void Increment()
{
    semaphore.WaitOne();
    try
    {
        // Critical section: Access shared resource
    }
    finally
    {
        semaphore.Release();
    }
}
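
To see the counting behaviour, the same pattern can be used with a larger count. The sketch below (the limit of 3 is arbitrary) uses SemaphoreSlim, a lighter-weight alternative suitable when the semaphore is only needed within a single process:

// Up to 3 threads may hold the semaphore at the same time.
private static SemaphoreSlim slim = new SemaphoreSlim(3, 3);

private void AccessLimitedResource()
{
    slim.Wait();
    try
    {
        // At most three threads execute this section concurrently.
    }
    finally
    {
        slim.Release();
    }
}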


Using Interlocked Class
The Interlocked class provides atomic operations for variables that are shared between threads. It's useful for performing simple operations like incrementing a counter without the need for locking.

Example
private int counter = 0;

public void Increment()
{
    Interlocked.Increment(ref counter);
}
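
Applied to the counter experiment from the overview, the race can be removed by replacing count++ and count-- with their Interlocked equivalents. A sketch (class and variable names are illustrative):

using System;
using System.Threading;

class CounterFixed
{
    private static int count = 0;

    static void Main()
    {
        var t1 = new Thread(() =>
        {
            for (int i = 0; i < 100; i++)
                Interlocked.Increment(ref count);   // atomic +1
        });

        var t2 = new Thread(() =>
        {
            for (int i = 0; i < 100; i++)
                Interlocked.Decrement(ref count);   // atomic -1
        });

        t1.Start();
        t2.Start();
        t1.Join();
        t2.Join();

        Console.WriteLine(count); // always prints 0
    }
}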

Properties of a good solution to the synchronization problem

Mutual Exclusion: Only one thread should be allowed inside the critical section at any point in time.
Progress: The overall system should keep making progress; there should not be a deadlock.
Bounded waiting: No thread should wait outside the critical section indefinitely; there should be some bound on the waiting time.
No Busy Waiting: A thread should not have to continuously check whether it is allowed to enter the critical section. That continuous checking is busy waiting:
    while (!allowedToEnterCriticalSection)
    {
        checking(); // <---- This is the busy waiting.
    }


Busy waiting should be avoided because it has several consequences, such as:

  • Inefficient use of CPU resources and wasted energy.
  • Reduced performance.
  • Increased power consumption.
  • Potential deadlocks.
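
One common way to avoid busy waiting (a sketch, not the only option) is to block on a synchronization primitive such as ManualResetEventSlim, so the waiting thread sleeps until it is signalled instead of spinning:

using System.Threading;

class NoBusyWait
{
    private static ManualResetEventSlim allowed = new ManualResetEventSlim(false);

    private static void Worker()
    {
        // Blocks until another thread calls allowed.Set();
        // no CPU time is burned while waiting.
        allowed.Wait();

        // Critical section: enter only after being signalled.
    }

    private static void Controller()
    {
        // Signal that the critical section may now be entered.
        allowed.Set();
    }
}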