Semaphores & Barriers
Semaphores give us the ability to control access to a shared resource by multiple threads. For an easy start, let's consider the following real-life scenario: a few kids want to play with one iPad, but only one kid can use it at a time.
A Bit of Theory
Multiple threads try to access the same resource at the same time, and nothing is preventing it. Such behavior can lead to race conditions and crashes, and our code obviously won't be thread safe.

Thread safe: code that can be safely called from multiple threads without causing any issues.

A semaphore consists of a thread queue and a counter value (an Int).

The thread queue is used by the semaphore to keep track of waiting threads in FIFO order (the first thread to enter the queue will be the first to get access to the shared resource once it becomes available).

The counter value is used by the semaphore to decide whether a thread should get access to the shared resource or not. The counter value changes when we call the signal() or wait() functions.

So, when should we call wait() and signal() functions?

  • Call wait() each time before using the shared resource. We are basically asking the semaphore if the shared resource is available or not. If not, we will wait.
  • Call signal() each time after using the shared resource. We are basically signaling the semaphore that we are done interacting with the shared resource.
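In code, the pattern around any shared resource looks roughly like this (a minimal sketch; useSharedResource() is just a hypothetical stand-in for whatever work touches the protected resource):

semaphore.wait()      // ask for access; blocks this thread if the resource is taken
useSharedResource()   // hypothetical work with the protected resource
semaphore.signal()    // done, let the next waiting thread in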
Calling wait() will do the following:

  • Decrement semaphore counter by 1.
  • If the resulting value is less than zero, the thread is blocked (frozen) and added to the waiting queue.
  • If the resulting value is equal to or greater than zero, the code gets executed without waiting.
Calling signal() will do the following:

  • Increment semaphore counter by 1.
  • If the previous value was less than zero, this function wakes the oldest thread currently waiting in the thread queue.
  • If the previous value was equal to or greater than zero, it means the thread queue is empty, i.e., no one is waiting.

    let semaphore = DispatchSemaphore(value: 1)

    DispatchSemaphore
     The init function has one parameter called "value". This is the counter value, which represents the number of threads we want to allow to access the shared resource at a given moment. In this case, we want to allow only one thread (kid) to access the shared resource (iPad), so let's set it to 1.

    Tips
  • NEVER call the semaphore's wait() function on the main thread, as it can freeze your app.
  • The wait() function lets us specify a timeout. Once the timeout is reached, wait will finish regardless of the semaphore's counter value.
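As an illustration of that second tip (the 5-second deadline below is just an arbitrary example value), wait(timeout:) returns a result telling us whether we actually got access or gave up:

let result = semaphore.wait(timeout: .now() + 5) // give up after ~5 seconds
if result == .success {
   // We got access to the shared resource in time.
   // ... use the resource ...
   semaphore.signal()
} else {
   // .timedOut: the resource never became available before the deadline.
   print("Timed out waiting for the shared resource")
}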
DispatchQueue.global().async {
   print("Kid 1 - wait")
   semaphore.wait()
   print("Kid 1 - wait finished")
   sleep(1) // Kid 1 playing with iPad
   semaphore.signal()
   print("Kid 1 - done with iPad")
}
DispatchQueue.global().async {
   print("Kid 2 - wait")
   semaphore.wait()
   print("Kid 2 - wait finished")
   sleep(1) // Kid 2 playing with iPad
   semaphore.signal()
   print("Kid 2 - done with iPad")
}
DispatchQueue.global().async {
   print("Kid 3 - wait")
   semaphore.wait()
   print("Kid 3 - wait finished")
   sleep(1) // Kid 3 playing with iPad
   semaphore.signal()
   print("Kid 3 - done with iPad")
}
Example
First, we create a concurrent queue that will be used for executing our song-downloading blocks of code.

Second, we create a semaphore with an initial counter value of 3. Can you guess why? Well, we decided to download only 3 songs at a time so we don't take up too much CPU time at once.

Third, we iterate 15 times using a for loop. On each iteration we do the following: wait() → download song → signal()
let queue = DispatchQueue(label: "com.gcd.myQueue", attributes: .concurrent)
let semaphore = DispatchSemaphore(value: 3)
for i in 0 ..< 15 {
   queue.async {
      let songNumber = i + 1
      semaphore.wait()
      print("Downloading song", songNumber)
      sleep(2) // Each download takes ~2 sec
      print("Downloaded song", songNumber)
      semaphore.signal()
   }
}
Console
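The exact order changes from run to run, but a typical interleaving looks roughly like this, with at most three downloads in flight at any moment (illustrative, not a captured log):

Downloading song 1
Downloading song 2
Downloading song 3
Downloaded song 1
Downloading song 4
Downloaded song 2
Downloading song 5
Downloaded song 3
Downloading song 6
...
Downloaded song 15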
DispatchBarrier
- A synchronization point for tasks executing in a concurrent dispatch queue.

A dispatch barrier allows us to create a synchronization point within a concurrent dispatch queue. In normal operation, the queue acts just like a normal concurrent queue. But when the barrier is executing, it acts as a serial queue. After the barrier finishes, the queue goes back to being a normal concurrent queue.

A dispatch barrier allows you to turn a thread-unsafe object into a thread-safe one. It creates a synchronization point for a block of code executing in a concurrent dispatch queue: when working with concurrent queues, barrier functions behave like a serial-queue-style bottleneck. Using GCD's barrier API ensures that the submitted block is the only item executing on the specified queue at that moment, meaning all items submitted to the queue prior to the dispatch barrier must complete before the barrier block executes. When the block's turn arrives, the barrier executes it and makes sure the queue does not execute any other blocks during that time. Once it finishes, the queue returns to its default behavior. GCD provides both synchronous and asynchronous barrier functions. The example below illustrates the effect of barrier blocks alongside ordinary asynchronous and synchronous blocks:
private let concurrentQueue = DispatchQueue(label: "com.gcd.dispatchBarrier", attributes: .concurrent)
private var array = [Int]() // shared property of the enclosing class; not thread-safe on its own

for value in 1...5 {
    concurrentQueue.async {
        print("async \(value)")
        self.array.append(value) // Undefined behavior `malloc: double free for ptr 0x7ff96601cc00`
    }
}
for value in 6...10 {
    concurrentQueue.async(flags: .barrier) {
        print("barrier \(value)")
        self.array.append(value) // Success
    }
}
for value in 11...15 {
    concurrentQueue.sync {
        print("sync \(value)")
        self.array.append(value) // Success
    }
}

//Output
async 1
async 2
async 5
async 3
async 4
barrier 6
barrier 7
barrier 8
barrier 9
barrier 10
sync 11
sync 12
sync 13
sync 14
sync 15
Execution of Dispatch Barrier
accessQueue.async(flags: .barrier, execute: { })

will dispatch the block to the isolation queue. The async part means it returns before actually executing the block (which performs the write), so we can continue processing. The .barrier flag means that it will wait until every block currently running in the queue has finished executing before it executes. Other blocks will queue up behind it and be executed once the barrier dispatch is done.

When the barrier is executing, it essentially acts as a serial queue. That is, the barrier is the only thing executing. After the barrier finishes, the queue goes back to being a normal concurrent queue. Here's when you would and wouldn't use barrier functions (a sketch of the full read/write pattern follows the list):

  • Custom Serial Queue: A bad choice here; barriers won't do anything helpful since a serial queue executes one operation at a time anyway.
  • Global Concurrent Queue: Use caution here; this probably isn't the best idea since other systems might be using the queues and you don't want to monopolize them for your own purposes.
  • Custom Concurrent Queue: This is a great choice for atomic or critical areas of code. Anything you're setting or instantiating that needs to be thread-safe is a great candidate for a barrier.
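Putting the custom concurrent queue advice together with the isolation-queue pattern above, here is a rough sketch (SafeArray and its queue label are made-up names, not part of GCD) of wrapping a thread-unsafe array so that reads run concurrently while writes go through a barrier:

import Foundation

final class SafeArray<Element> {
    private var storage: [Element] = []
    private let isolationQueue = DispatchQueue(label: "com.gcd.safeArray", attributes: .concurrent)

    // Reads: sync on the concurrent queue, so many reads can run at once.
    var elements: [Element] {
        isolationQueue.sync { storage }
    }

    // Writes: the barrier waits for in-flight reads, mutates alone,
    // then the queue goes back to being concurrent.
    func append(_ element: Element) {
        isolationQueue.async(flags: .barrier) {
            self.storage.append(element)
        }
    }
}

let numbers = SafeArray<Int>()
DispatchQueue.concurrentPerform(iterations: 100) { numbers.append($0) }
print(numbers.elements.count) // 100; all barrier writes finish before this sync read runs

Reads use sync so the caller gets the value back immediately, while writes use async(flags: .barrier) so the mutation runs exclusively on the queue without blocking the caller.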