DispatchQueue.count > thread.count case

I'm trying to reproduce a case when there are more dispatch queues than there are threads serving them. Is that a possible scenario?

Replies

I think you’ll need to clarify your question because, as written, it’s trivial to hit that scenario:

import Dispatch

func main() {
    let queues = (0..<10).map { DispatchQueue(label: "test-\($0)") }
    withExtendedLifetime(queues) {
        dispatchMain()
    }
}

main()

This results in 10 Dispatch queues with no threads servicing them.

Share and Enjoy

Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"

Let me try:

import Dispatch
import Foundation   // for Thread.current, if you print it

// Serial queues here; the same question applies with `attributes: .concurrent`.
let queues = (0 ..< 200).map { i in DispatchQueue(label: "queue-\(i)") }

queues.forEach { queue in
    queue.async {
        while true {
            // doing something lengthy here
            // Is Thread.current unique here? In other words, was a dedicated
            // thread allocated for this queue, or could a single thread serve
            // several queues?
        }
    }
}

In my tests so far a unique thread is allocated per queue.

In my tests so far a unique thread is allocated per queue.

Threads aren’t allocated to queues; threads are allocated to work items, that is, the jobs running on a queue. When a work item comes to the head of a queue, Dispatch will try to find a thread to run it. That means, for example, that if you have two serial queues, A and B, there’s no guarantee that one specific thread will run all the work items on queue A. Imagine this scenario (there’s a small sketch of it in code after the list):

  1. You dispatch work item W1 to queue A.

  2. Dispatch assigns thread X to run W1.

  3. W1 runs to completion.

  4. You dispatch work item W2 to queue B.

  5. Dispatch reuses thread X to run W2.

  6. While that’s running, you dispatch W3 to queue A.

  7. Thread X is busy, so Dispatch will find another thread, Y, to run W3.
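
Here’s a minimal sketch of that sequence (the labels, sleep, and prints are made up for illustration, and the exact thread assignments will vary from run to run); printing Thread.current shows that queue A’s work items need not all land on the same thread:

import Dispatch
import Foundation

let queueA = DispatchQueue(label: "A")
let queueB = DispatchQueue(label: "B")

queueA.async { print("W1 on A:", Thread.current) }   // steps 1–2
queueA.sync { }                                      // step 3: W1 has completed

queueB.async {                                       // steps 4–5
    print("W2 on B:", Thread.current)
    Thread.sleep(forTimeInterval: 1)                 // keep that thread busy
}
queueA.async { print("W3 on A:", Thread.current) }   // steps 6–7: often a different thread

dispatchMain()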

That’s why using thread-local storage in a work item is a bad idea, and why Dispatch supports queue-local storage.
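
For example, queue-local storage is available as queue-specific data. A minimal sketch, with a made-up key and label:

import Dispatch

// Attach a value to the queue rather than to whatever thread happens to
// run the work item.
let nameKey = DispatchSpecificKey<String>()
let queue = DispatchQueue(label: "worker")
queue.setSpecific(key: nameKey, value: "worker queue")

queue.async {
    // Reads the value from the queue we’re currently running on, regardless
    // of which worker thread Dispatch picked for this work item.
    print(DispatchQueue.getSpecific(key: nameKey) ?? "none")
}
queue.sync { }   // wait for the work item above to finish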

For serial queues, a work item can only reach the head of the queue once all preceding work items have completed. So, you tend to see the same thread run a group of successive work items.
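
A small sketch of that (the label and count are made up); items dispatched back to back on a serial queue usually, though not necessarily, report the same thread:

import Dispatch
import Foundation

let serial = DispatchQueue(label: "serial-demo")
for i in 0..<5 {
    serial.async {
        // Each item reaches the head of the queue only after the previous
        // one finishes, so the same worker thread is often reused.
        print("item \(i) on", Thread.current)
    }
}
serial.sync { }   // wait for the items above to complete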

For concurrent queues that’s not the case.

IMPORTANT Concurrent queues are generally a bad idea. I explain this, with links to further resources, in Avoid Dispatch Global Concurrent Queues.

With concurrent queues, a work item can hit the head of the queue while other work items are still running. That breaks the simple model you see for serial queues, and it can result in a lot of worker threads. Dispatch tries to limit the number of worker threads to the core count. However, it has a concept of overcommit where, if a worker thread blocks, Dispatch might spin up a new one to replace it. This can lead to a phenomenon known as thread explosion.
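
A sketch of how that can play out; the queue label and numbers here are illustrative, and the exact behaviour depends on the platform and OS version:

import Dispatch
import Foundation

// A made-up concurrent queue; each work item blocks its worker thread, so
// Dispatch may keep spinning up overcommit threads to service the queue.
let pool = DispatchQueue(label: "pool", attributes: .concurrent)
for i in 0..<100 {
    pool.async {
        Thread.sleep(forTimeInterval: 10)
        print("item \(i) finished")
    }
}
// While this runs, watch the process’s thread count (for example, in Xcode’s
// Debug navigator); it typically climbs well beyond the core count.
dispatchMain()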

The mechanics of this are complex, and the exact behaviour varies by platform. It’s best to avoid these sharp edges by avoiding concurrent queues.

Share and Enjoy

Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"

My fear was that, while one queue’s work item is still in progress, the thread running it could somehow be used to run another queue’s work item.

OK. That can’t happen with Dispatch, because each work item is (under the covers) a C function, and C doesn’t have any sort of coroutine mechanism that would allow the work item to yield so that the thread could work on other stuff.

Notably, this will happen in Swift concurrency, where each await is (again, under the covers) a coroutine yield.
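
For example (a sketch only; the function name is made up, and pthread_mach_thread_np is an Apple-platform call used here just to get a visible thread ID), a task may resume on a different thread after an await:

import Foundation

func work() async {
    let before = pthread_mach_thread_np(pthread_self())
    try? await Task.sleep(nanoseconds: 100_000_000)   // suspension point: the thread is freed up
    let after = pthread_mach_thread_np(pthread_self())
    print("before: \(before), after: \(after)")       // these may differ
}

Task { await work() }
RunLoop.main.run(until: Date().addingTimeInterval(1))   // keep the demo alive briefly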

Share and Enjoy

Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"