Hi @DTS Engineer (Kevin),
Concurrent queues are a relatively late addition to the API and, IMHO, are something that you should actively avoid, as they create exactly the same issues as the global concurrent queues. The ONE exception to that is cases where you're specifically trying to create a limited amount of parallel activity with a specific component ("up to 4 jobs at once"). IF that's the case, then the correct solution would be to use NSOperationQueue to set the width. As a side note here, NSOperationQueue is actually the API I would recommend over dispatch for cases where you want something that works like GCD. It's built as a wrapper around dispatch; however, it also provides things like a common work object base class, cancellation, progress, etc. It also exports the underlying GCD queue, so you can also use it with any API that requires a GCD queue.
As per your suggestion, I tried using OperationQueue, which lets us control the number of concurrent tasks by setting the maxConcurrentOperationCount property on the queue.
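On its own, that limit behaves the way I expect. Here is a minimal standalone sketch (the job count and sleep are just placeholders for real work) in which no more than two closures should run at once:

import Foundation

// Minimal sketch: an OperationQueue whose width is capped at 2.
// At any moment, at most two of these simulated jobs run in parallel.
let workQueue = OperationQueue()
workQueue.maxConcurrentOperationCount = 2

for job in 1...10 {
    workQueue.addOperation {
        print("Job \(job) started")
        Thread.sleep(forTimeInterval: 1) // placeholder for real work
        print("Job \(job) finished")
    }
}

workQueue.waitUntilAllOperationsAreFinished()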
To test this with the Network framework, I created an OperationQueue whose underlyingQueue is a concurrent DispatchQueue with default QoS, so multiple tasks can run in parallel. I then set maxConcurrentOperationCount = 2 and passed that underlying queue to the .start(queue:) call of NWListener.
Next, I opened 10 concurrent connections from another machine to see whether only 2 of them would be processed at a time. However, despite setting maxConcurrentOperationCount, all 10 connections were handled simultaneously; the underlying dispatch queue appears to bypass the concurrency limit. The listener code is below, followed by a sketch of the client side of the test.
import Network
import Foundation

// Operation queue backed by a concurrent dispatch queue, with the width capped at 2.
let operation_queue = OperationQueue()
operation_queue.underlyingQueue = DispatchQueue(label: "test_operation_queue", qos: .default, attributes: .concurrent)
operation_queue.maxConcurrentOperationCount = 2

do {
    let params = NWParameters(tls: NWProtocolTLS.Options(), tcp: NWProtocolTCP.Options())
    let listener = try NWListener(using: params, on: 52000) // Use a specific port

    listener.stateUpdateHandler = { state in
        switch state {
        case .setup:
            print("Setup state")
        case .waiting(let error):
            print("Waiting state with error: \(error)")
        case .ready:
            print("Server is ready on port \(listener.port ?? 0)")
        case .failed(let error):
            print("Failed state with error: \(error)")
        case .cancelled:
            print("Cancelled state")
        @unknown default:
            print("Unknown state")
        }
    }

    listener.newConnectionHandler = { connection in
        print("Connection Received and processing Started")
    }

    listener.start(queue: operation_queue.underlyingQueue!)
} catch {
    print("Failed to create listener: \(error)")
}

dispatchMain() // keep the process alive so the listener can accept connections
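For reference, here is a rough sketch of a client that opens 10 simultaneous connections to the listener; "server.local" is a placeholder for the machine running the listener, and the TLS options simply mirror the listener's parameters:

import Network
import Foundation

// Rough sketch of the client side: open 10 connections to the listener at once.
// "server.local" is a placeholder for the machine running the listener.
let host = NWEndpoint.Host("server.local")
let port = NWEndpoint.Port(rawValue: 52000)!
let clientQueue = DispatchQueue(label: "test_client_queue")

var connections: [NWConnection] = []
for index in 0..<10 {
    let params = NWParameters(tls: NWProtocolTLS.Options(), tcp: NWProtocolTCP.Options())
    let connection = NWConnection(host: host, port: port, using: params)
    connection.stateUpdateHandler = { state in
        print("Connection \(index): \(state)")
    }
    connection.start(queue: clientQueue)
    connections.append(connection) // keep strong references so the connections stay alive
}

dispatchMain()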
Am I missing something here?