Multithreading - a deep dive in Swift

What is concurrency?

Wikipedia defines concurrency as “the decomposability property of a program, algorithm, or problem into order-independent or partially-ordered components or units.” In practice, this means examining the logic of your app to determine which pieces can run at the same time, possibly in an arbitrary order, while still producing a correct result.

Why use concurrency?

It’s critical to ensure that your app runs as smoothly as possible and that the end user is never forced to wait for something to happen. A second is a minuscule amount of time in most contexts outside of computing, but if a human has to wait a second to see a response after taking an action on a device like an iPhone, it feels like an eternity. “It’s too slow” is one of the main reasons apps get uninstalled.

Scrolling through a table of images is one of the more common situations wherein the end user will be impacted by the lack of concurrency. If you need to download an image from the network, or perform some type of image processing before displaying it, the scrolling will stutter and you’ll be forced to display multiple “busy” indicators instead of the expected image.

GCD vs Operations

There are two APIs that you’ll use when making your app concurrent: Grand Central Dispatch, commonly referred to as GCD, and Operations. These are neither competing technologies nor something that you have to exclusively pick between. In fact, Operations are built on top of GCD!

Grand Central Dispatch

GCD is Apple’s implementation of C’s libdispatch library. Its purpose is to queue up tasks — either a method or a closure — that can be run in parallel, depending on availability of resources; it then executes the tasks on an available processor core.

While GCD uses threads in its implementation, you, as the developer, do not need to worry about managing them yourself. GCD’s tasks are so lightweight to enqueue that Apple, in its 2009 technical brief on GCD, stated that only 15 instructions are required for implementation, whereas creating traditional threads could require several hundred instructions.

All of the tasks that GCD manages for you are placed into GCD-managed first-in, first-out (FIFO) queues. Each task that you submit to a queue is then executed against a pool of threads fully managed by the system.

More details about GCD are covered in this blog.

// Class level variable
let queue = DispatchQueue(label: "sample")

// Somewhere in your function
queue.async {
  // Call slow non-UI methods here

  DispatchQueue.main.async {
    // Update the UI here
  }
}

Synchronous and asynchronous tasks

Work placed into the queue may either run synchronously or asynchronously. When running a task synchronously, your app will wait and block the current run loop until execution finishes before moving on to the next task. Alternatively, a task that is run asynchronously will start, but return execution to your app immediately. This way, the app is free to run other tasks while the first one is executing.

In general, you’ll want to take any long-running non-UI task that you can find and make it run asynchronously in the background. GCD makes this very simple via closures with a few lines of code.
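
The difference is easy to see in a couple of lines. A minimal sketch, assuming a throwaway serial queue (the label, the sleep call, and the print statements are just placeholders):

import Foundation

let workQueue = DispatchQueue(label: "com.example.work")

// Synchronous: the caller blocks until the closure has finished executing.
workQueue.sync {
    print("sync task running")
}
print("after sync")   // always prints after the sync task

// Asynchronous: the closure is enqueued and control returns immediately.
workQueue.async {
    sleep(1)          // simulate slow, non-UI work
    print("async task running")
}
print("after async")  // usually prints before the async task finishes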

Operations

Operations in Swift are a powerful way to separate responsibilities over several classes while keeping track of progress and dependencies. They’re known as NSOperation in Objective-C and are used in combination with an OperationQueue.

An Operation is typically responsible for a single synchronous task. It’s an abstract class and is never used directly. You can use the system-defined BlockOperation subclass or create your own subclass. You can start an operation by adding it to an OperationQueue or by manually calling its start method. However, it’s highly recommended to give the OperationQueue full responsibility for managing the state.

Making use of the system-defined BlockOperation looks as follows:

let blockOperation = BlockOperation {
    print("Executing!")
}

let queue = OperationQueue()
queue.addOperation(blockOperation)

This can also be done by adding the block directly to the queue:

queue.addOperation {
  print("Executing!")
}

The given task is added to the OperationQueue, which will start executing it as soon as possible.

Creating a custom operation

Custom operations let you create a separation of concerns. You could, for example, create one implementation for importing content and another for uploading content.

The following code example shows a custom subclass for importing content:

final class ContentImportOperation: Operation {

    let itemProvider: NSItemProvider

    init(itemProvider: NSItemProvider) {
        self.itemProvider = itemProvider
        super.init()
    }

    override func main() {
        guard !isCancelled else { return }
        print("Importing content..")

        // .. import the content using the item provider

    }
}

The class takes an item provider and imports the content within the main method. The main() function is the only method you need to override for synchronous operations. Add the operation to the queue and set a completion block to track completion:

let fileURL = URL(fileURLWithPath: "..")
let contentImportOperation = ContentImportOperation(itemProvider: NSItemProvider(contentsOf: fileURL)!)

contentImportOperation.completionBlock = {
    print("Importing completed!")
}

queue.addOperation(contentImportOperation)

// Prints:
// Importing content..
// Importing completed!

This moves all your logic for importing content into a single class whose progress and completion you can track, and which you can easily write tests for!

Different states of an operation

An operation can be in several states, depending on its current execution status.

  • Ready: It’s prepared to start
  • Executing: The task is currently running
  • Finished: Once the process is completed
  • Canceled: The task was canceled

It’s important to know that an operation can only execute once. Whenever it’s in the finished or canceled state, you can no longer restart the same instance.

The OperationQueue automatically removes the task from its queue once it becomes finished, which happens after both normal execution and cancellation.
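
These states are exposed as read-only properties on Operation (isReady, isExecuting, isFinished, isCancelled). A minimal sketch, where the waitUntilAllOperationsAreFinished() call is only there so the final printout happens after execution:

import Foundation

let stateQueue = OperationQueue()
let operation = BlockOperation {
    print("Executing!")
}

print(operation.isReady)      // true: no unfinished dependencies, prepared to start
print(operation.isFinished)   // false

stateQueue.addOperation(operation)
stateQueue.waitUntilAllOperationsAreFinished()

print(operation.isFinished)   // true: this instance can never be started again
print(operation.isCancelled)  // false: it finished normally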

Making use of dependencies

A benefit of using operations is the use of dependencies. You can easily add a dependency between two instances. For example, to start uploading after the content is imported:

let fileURL = URL(fileURLWithPath: "..")
let contentImportOperation = ContentImportOperation(itemProvider: NSItemProvider(contentsOf: fileURL)!)
contentImportOperation.completionBlock = {
    print("Importing completed!")
}

let contentUploadOperation = UploadContentOperation()
contentUploadOperation.addDependency(contentImportOperation)
contentUploadOperation.completionBlock = {
    print("Uploading completed!")
}

queue.addOperations([contentImportOperation, contentUploadOperation], waitUntilFinished: true)

// Prints:
// Importing content..
// Uploading content..
// Importing completed!
// Uploading completed!

The upload will only start after the content import is finished. However, dependencies do not take cancellation into account, which means that if the import operation is canceled, the upload will still start. You have to implement a check to see whether any of the dependencies were canceled:

final class UploadContentOperation: Operation {
    override func main() {
        guard !dependencies.contains(where: { $0.isCancelled }), !isCancelled else {
            return
        }

        print("Uploading content..")
    }
}

GCD advantages over Operation:

i. Implementation
GCD's implementation is very lightweight, whereas Operation is a more complex, heavier-weight object.

Operation advantages over GCD (see the sketch after this list):

i. Control over the operation
You can pause, cancel, and resume an Operation.

ii. Dependencies
You can set up dependencies between two Operations; an operation will not start until all of its dependencies have finished.

iii. State of the operation
You can monitor the state of an operation or operation queue: ready, executing, or finished.

iv. Maximum number of operations
You can specify the maximum number of queued operations that can run simultaneously.
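
A minimal sketch of those controls on a throwaway OperationQueue (the queue, operation, and print statement are placeholders):

import Foundation

let controlledQueue = OperationQueue()

// iv. Limit how many operations may run at the same time.
controlledQueue.maxConcurrentOperationCount = 2

// i. Pausing and resuming happens at the queue level via isSuspended;
// operations added while the queue is suspended will not start yet.
controlledQueue.isSuspended = true

let work = BlockOperation {
    print("Working..")
}
controlledQueue.addOperation(work)

controlledQueue.isSuspended = false   // resume: queued operations may now start

// A single operation, or everything in the queue, can be cancelled.
work.cancel()
controlledQueue.cancelAllOperations()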

Race Condition

A race condition occurs when two or more threads can access shared data and they try to change it at the same time. Because the thread scheduling algorithm can swap between threads at any time, you don't know the order in which the threads will attempt to access the shared data. Therefore, the result of the change in data is dependent on the thread scheduling algorithm, i.e. both threads are "racing" to access/change the data.
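
A minimal sketch of a race, plus one way to remove it by funnelling every mutation through a serial queue (the counter names and queue label are placeholders):

import Foundation

// Race: many threads mutate the same variable with no synchronization,
// so the final value is unpredictable.
var unsafeCounter = 0
DispatchQueue.concurrentPerform(iterations: 1_000) { _ in
    unsafeCounter += 1
}
print(unsafeCounter)   // often less than 1000

// Fix: serialize every mutation through a single serial queue.
let isolationQueue = DispatchQueue(label: "com.example.counter")
var safeCounter = 0
DispatchQueue.concurrentPerform(iterations: 1_000) { _ in
    isolationQueue.sync {
        safeCounter += 1
    }
}
print(safeCounter)     // always 1000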

Dispatch Group

DispatchGroup allows for aggregate synchronization of work. You can use them to submit multiple different work items and track when they all complete, even though they might run on different queues. This behavior can be helpful when progress can’t be made until all of the specified tasks are complete.

A scenario where this could be useful is when you have multiple web service calls that all need to finish before continuing. For example, you need to download several sets of data that must then be processed by some function; you have to wait for all the web service calls to complete before calling the function to process the received data.

func doLongTasksAndWait() {
    print("starting long running tasks")

    // Create a group for the bunch of tasks we are about to do.
    let group = DispatchGroup()

    // Launch a bunch of tasks, e.g. web service calls that all need to finish
    // before proceeding to the next view controller.
    for i in 0...3 {
        group.enter()                    // let the group know that something is being added
        DispatchQueue.global().async {   // run the task on a background thread
            // Do some long task, e.g. a web service or database lookup.
            // Here we just sleep for a random amount of time for demonstration purposes.
            sleep(arc4random() % 4)
            print("long task \(i) done!")
            group.leave()                // let the group know that the task is finished
        }
    }

    // Blocks the current thread until all the tasks above have finished,
    // so don't call this function on the main thread.
    group.wait()
    print("all tasks done!")
}

//starting long running tasks
//long task 0 done!
//long task 3 done!
//long task 1 done!
//long task 2 done!
//all tasks done!
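
If blocking the current thread with wait() is not acceptable, DispatchGroup also provides notify(queue:execute:), which runs a closure once every enter() has been balanced by a leave(). A minimal sketch of the same pattern without blocking:

import Foundation

let notifyGroup = DispatchGroup()

for i in 0...3 {
    notifyGroup.enter()
    DispatchQueue.global().async {
        sleep(arc4random() % 4)       // simulate a long task
        print("long task \(i) done!")
        notifyGroup.leave()
    }
}

// Called on the main queue once all entered work has left the group.
notifyGroup.notify(queue: .main) {
    print("all tasks done!")
}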

Semaphores

Semaphores give us the ability to control access to a shared resource by multiple threads.

A semaphore consists of a thread queue and a counter value (an Int).

The thread queue is used by the semaphore to keep track of waiting threads in FIFO order (the first thread entered into the queue will be the first to get access to the shared resource once it becomes available).

The counter value is used by the semaphore to decide whether a thread should get access to the shared resource. The counter value changes when we call the signal() or wait() functions.

So, when should we call wait() and signal() functions?

  • Call wait() each time before using the shared resource. We are basically asking the semaphore if the shared resource is available or not. If not, we will wait.
  • Call signal() each time after using the shared resource. We are basically signaling the semaphore that we are done interacting with the shared resource.

Calling wait() will do the following:

  • Decrement the semaphore counter by 1.
  • If the resulting value is less than zero, the thread is blocked.
  • If the resulting value is equal to or greater than zero, the code gets executed without waiting.

Calling signal() will do the following:

  • Increment the semaphore counter by 1.
  • If the previous value was less than zero, this function wakes the oldest thread currently waiting in the thread queue.
  • If the previous value was equal to or greater than zero, it means the thread queue is empty, i.e., no one is waiting.

Example

let queue = DispatchQueue.global()
let group = DispatchGroup()
let sem = DispatchSemaphore(value: 1)

var movies: [String] = []

func downloadMovie(name: String) -> String {
    sleep(4)                        // simulate a slow download
    print("Movie has been Downloaded")
    return name
}

func saveMovie() {
    sleep(2)                        // simulate a slow save
    print("Movie has been Saved")
}

queue.async(group: group) {
    sem.wait()                      // ask for access to the shared movies array
    let movie = downloadMovie(name: "Avatar")
    movies.append(movie)
    sem.signal()                    // done with the shared resource
}

queue.async(group: group) {
    sem.wait()                      // blocks until the other task calls signal()
    saveMovie()
    if !movies.isEmpty {            // the semaphore serializes access but does not guarantee order
        movies.remove(at: 0)
    }
    sem.signal()
}

