Hello,
I’ve encountered a warning while working with UITableViewDiffableDataSource. Here’s the exact message:
Warning: applying updates in a non-thread confined manner is dangerous and can lead to deadlocks. Please always submit updates either always on the main queue or always off the main queue - view=<UITableView: 0x7fd79192e200; frame = (0 0; 375 667); clipsToBounds = YES; autoresize = W+H; gestureRecognizers = <NSArray: 0x600003f3c9f0>; backgroundColor = <UIDynamicProviderColor: 0x60000319bf80; provider = <NSMallocBlock: 0x600003f0ce70>>; layer = <CALayer: 0x6000036e8fa0>; contentOffset: {0, -116}; contentSize: {375, 20}; adjustedContentInset: {116, 0, 49, 0}; dataSource: <TtGC5UIKit29UITableViewDiffableDataSourceOC17ArticleManagement21DiscardItemsViewModel17SectionIdentifierSS: 0x600003228270>>
OS: iOS 17+,
Xcode Version: 16.0,
Frameworks: UIKit, Diffable Data Source,
View: UITableView used with a UITableViewDiffableDataSource.
Steps to Reproduce:
Use a diffable data source with a table view.
Apply snapshot updates to the data source from the main thread.
The warning occurs intermittently during snapshot application.
Expected Behavior:
The snapshot should apply without warnings, provided the updates happen on the main thread.
Actual Behavior:
The warning suggests thread safety issues when applying updates on non-thread-confined queues.
Questions:
Is there a recommended best practice to handle apply calls in diffable data sources with thread safety in mind?
Could this lead to potential deadlocks if not addressed?
Note: I can confirm I am always reloading/reconfiguring the data source on the main thread. Please find the attached screenshots for reference.
Any guidance or clarification would be greatly appreciated!
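For what it's worth, the practice usually recommended is to pick one queue (almost always the main queue) and funnel every apply through it. Below is a platform-independent sketch of that rule; `DataSourceStandIn` and `applyOnMain` are illustrative names of my own, standing in for `UITableViewDiffableDataSource` and its `apply(_:animatingDifferences:)` call:

```swift
import Foundation

// Stand-in for UITableViewDiffableDataSource; only the queue rule is modeled.
final class DataSourceStandIn {
    private(set) var appliedSnapshots: [[String]] = []

    func apply(_ snapshot: [String]) {
        // The contract the warning enforces: every apply arrives on the
        // same queue (here: the main queue), never a mix of both.
        precondition(Thread.isMainThread, "apply must always run on the main queue")
        appliedSnapshots.append(snapshot)
    }
}

// Funnel all updates through the main queue, no matter where they originate.
func applyOnMain(_ dataSource: DataSourceStandIn, snapshot: [String]) {
    if Thread.isMainThread {
        dataSource.apply(snapshot)
    } else {
        DispatchQueue.main.async { dataSource.apply(snapshot) }
    }
}
```

The point of the single entry point is that background work can still prepare the snapshot; only the final apply hops to the main queue.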
                    
                  
Concurrency
Concurrency is the notion of multiple things happening at the same time.
Posts under the Concurrency tag: 133 posts
I am trying to port SceneKit projects to Swift 6, and I just can't figure out how that's possible. I'm even starting to think SceneKit and Swift 6 concurrency just don't fit together, and that SceneKit projects should, hopefully only for the time being, stick to Swift 5.
The SCNSceneRendererDelegate methods are called in the SceneKit Thread.
If the delegate is a ViewController:
class GameViewController: UIViewController {
    let aNode = SCNNode()
    func renderer(_ renderer: any SCNSceneRenderer, updateAtTime time: TimeInterval) {
        aNode.position.x = 10
    }
}
Then the compiler generates the error "Main actor-isolated instance method 'renderer(_:updateAtTime:)' cannot be used to satisfy nonisolated protocol requirement"
Which is fully understandable.
The compiler even tells you those methods can't be used for protocol conformance, unless:
The conformance is declared as @preconcurrency SCNSceneRendererDelegate, like this:
class GameViewController: UIViewController, @preconcurrency SCNSceneRendererDelegate {
But that just delays the check to runtime, and therefore a crash happens at runtime on the SceneKit thread...
Again, fully understandable.
Or the delegate method is declared nonisolated, like this:
    nonisolated func renderer(_ renderer: any SCNSceneRenderer, updateAtTime time: TimeInterval) {
        aNode.position.x = 10
    }
Which generates the compiler error: "Main actor-isolated property 'position' can not be mutated from a nonisolated context".
Again fully understandable.
If the delegate is not a ViewController but a nonisolated class, we also have the problem that SCNNode can't be used.
Nearly 100% of the SCNSceneRendererDelegate I've seen do use SCNNode or similar MainActor bound types, because they are meant for that.
So, where am I wrong? What is the solution for using SceneKit's SCNSceneRendererDelegate methods with full Swift 6 checking? Is that even possible for now?
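One workaround I've seen for this shape of problem (a sketch, not an official pattern — `NodeStandIn` stands in for SCNNode, since the pattern itself is independent of SceneKit): keep the delegate method nonisolated so it satisfies the protocol, and hop back to the main actor for the mutation. The obvious cost is that the update no longer happens synchronously inside the render callback:

```swift
import Foundation

// Stand-in for SCNNode: a main-actor-bound class, as SCNNode is in the Swift 6 SDK.
@MainActor
final class NodeStandIn {
    var x: Float = 0
}

@MainActor
final class RenderDelegate {
    let node = NodeStandIn()

    // Satisfies a nonisolated protocol requirement; called off the main
    // thread in the real app (SceneKit's render thread).
    nonisolated func renderer(updateAtTime time: TimeInterval) {
        // Hop back to the main actor for the mutation. Trade-off: the
        // change lands one hop later, not inside the render callback.
        Task { @MainActor in
            self.node.x = 10
        }
    }
}
```

Whether that latency is acceptable for per-frame updates is exactly the open question; for some scenes it isn't, which is why many projects stay on @preconcurrency conformance for now.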
                    
                  
                
                    
                      Hi, I'm trying to modify the ScreenCaptureKit Sample code by implementing an actor for Metal rendering, but I'm experiencing issues with frame rendering sequence.
My app workflow is:
ScreenCapture -> createFrame -> setRenderData
Metal draw callback -> renderAsync (getData from renderData)
I've added timestamps to verify frame ordering, and I insert each frame at the position found by a binary search on its timestamp. While the timestamps appear to be in sequence, the actual rendering output seems out of order.
// ScreenCaptureKit sample
func createFrame(for sampleBuffer: CMSampleBuffer) async {
    if let surface: IOSurface = getIOSurface(for: sampleBuffer) {
        await renderer.setRenderData(surface, timeStamp: sampleBuffer.presentationTimeStamp.seconds)
    }
}
class Renderer {
    ...
    func setRenderData(_ surface: IOSurface, timeStamp: Double) async {
        _ = await renderSemaphore.getSetBuffers(
            isGet: false,
            surface: surface,
            timeStamp: timeStamp
        )
    }
    func draw(in view: MTKView) {
        Task {
            await renderAsync(view)
        }
    }
    func renderAsync(_ view: MTKView) async {
        guard await renderSemaphore.beginRender() else { return }
        guard let frame = await renderSemaphore.getSetBuffers(
            isGet: true, surface: nil, timeStamp: nil
        ) else {
            await renderSemaphore.endRender()
            return
        }
        
        guard let (texture, _) = await renderSemaphore.getRenderData(
            device: self.device,
            surface: frame.surface,
            depthData: []) else {
            await renderSemaphore.endRender()
            return
        }
        
        guard let commandBuffer = _commandQueue.makeCommandBuffer(),
                let renderPassDescriptor = await view.currentRenderPassDescriptor,
                let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor) else {
            await renderSemaphore.endRender()
            return
        }
        // Shaders ..
        renderEncoder.endEncoding()
        
        commandBuffer.addCompletedHandler { @Sendable _ in
            updateFPS()
        }
        
        // commit frame in actor
        let success = await renderSemaphore.commitFrame(
            timeStamp: frame.timeStamp,
            commandBuffer: commandBuffer,
            drawable: view.currentDrawable!
        )
        
        if !success {
            print("Frame dropped due to out-of-order timestamp")
        }
        
        await renderSemaphore.endRender()
    }
}
actor RenderSemaphore {
    private var frameBuffers: [FrameData] = []
    private var lastReadTimeStamp: Double = 0.0
    private var lastCommittedTimeStamp: Double = 0
    
    private var activeTaskCount = 0
    private var activeRenderCount = 0
    private let maxTasks = 3
    
    private var textureCache: CVMetalTextureCache?
    private var isTextureLoaded = false
    
    init() {
    }
    
    func initTextureCache(device: MTLDevice) {
        CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &self.textureCache)
    }
    
    func beginRender() -> Bool {
        guard activeRenderCount < maxTasks else { return false }
        activeRenderCount += 1
        return true
    }
    
    func endRender() {
        if activeRenderCount > 0 {
            activeRenderCount -= 1
        }
    }
    
    func setTextureLoaded(_ loaded: Bool) {
        isTextureLoaded = loaded
    }
    
    func getSetBuffers(isGet: Bool, surface: IOSurface?, timeStamp: Double?) -> FrameData? {
        if isGet {
            if !frameBuffers.isEmpty {
                let frame = frameBuffers.removeFirst()
                if frame.timeStamp > lastReadTimeStamp {
                    lastReadTimeStamp = frame.timeStamp
                    print(frame.timeStamp)
                    return frame
                }
            }
            
            return nil
        } else {
            // Set
            let frameData = FrameData(
                surface: surface!,
                timeStamp: timeStamp!
            )
            
            // insert to the right position
            let insertIndex = binarySearch(for: timeStamp!)
            frameBuffers.insert(frameData, at: insertIndex)
            return frameData
        }
    }
    
    private func binarySearch(for timeStamp: Double) -> Int {
        var left = 0
        var right = frameBuffers.count
        
        while left < right {
            let mid = (left + right) / 2
            if frameBuffers[mid].timeStamp > timeStamp {
                right = mid
            } else {
                left = mid + 1
            }
        }
        
        return left
    }
    
    // for setRenderDataNormalized
    func tryEnterTask() -> Bool {
        guard activeTaskCount < maxTasks else { return false }
        activeTaskCount += 1
        return true
    }
    
    func exitTask() {
        activeTaskCount -= 1
    }
    
    func commitFrame(timeStamp: Double,
                     commandBuffer: MTLCommandBuffer,
                     drawable: MTLDrawable) async -> Bool {
        
        guard timeStamp > lastCommittedTimeStamp else {
            print("Drop frame at commit: \(timeStamp) <= \(lastCommittedTimeStamp)")
            return false
        }
        
        commandBuffer.present(drawable)
        commandBuffer.commit()
        lastCommittedTimeStamp = timeStamp
        
        return true
    }
    
    func getRenderData(
        device: MTLDevice,
        surface: IOSurface,
        depthData: [Float]
    ) -> (MTLTexture, MTLBuffer)? {
        let _textureName = "RenderData"
        
        var px: Unmanaged<CVPixelBuffer>?
        let status = CVPixelBufferCreateWithIOSurface(kCFAllocatorDefault, surface, nil, &px)
        guard status == kCVReturnSuccess, let screenImage = px?.takeRetainedValue() else {
            return nil
        }
        
        CVMetalTextureCacheFlush(textureCache!, 0)
        
        var texture: CVMetalTexture? = nil
        let width = CVPixelBufferGetWidthOfPlane(screenImage, 0)
        let height = CVPixelBufferGetHeightOfPlane(screenImage, 0)
        
        let result2 = CVMetalTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault,
            self.textureCache!,
            screenImage,
            nil,
            MTLPixelFormat.bgra8Unorm,
            width,
            height,
            0, &texture)
        
        guard result2 == kCVReturnSuccess,
              let cvTexture = texture,
              let mtlTexture = CVMetalTextureGetTexture(cvTexture) else {
            return nil
        }
        mtlTexture.label = _textureName
        
        let depthBuffer = device.makeBuffer(bytes: depthData, length: depthData.count * MemoryLayout<Float>.stride)!
        return (mtlTexture, depthBuffer)
    }
}
That's my code above. Could someone point out what might be wrong?
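I can't run the Metal pipeline here, but one structural suspect: `draw(in:)` spawns a free-standing `Task` per frame, and unstructured tasks are not guaranteed to start (or finish) in creation order, so the frames can be rendered out of order even though the buffer is sorted. A platform-independent sketch of one fix, with names of my own invention: push frames through a single AsyncStream and drain it with one long-lived task, so processing order matches submission order by construction:

```swift
struct Frame: Sendable {
    let timeStamp: Double
}

// One producer-side entry point; a single consumer task drains the
// stream, so frames are processed strictly in submission order.
final class FramePipeline: Sendable {
    let frames: AsyncStream<Frame>
    private let continuation: AsyncStream<Frame>.Continuation

    init() {
        var c: AsyncStream<Frame>.Continuation!
        frames = AsyncStream { c = $0 }
        continuation = c
    }

    func submit(_ frame: Frame) { continuation.yield(frame) }
    func finish() { continuation.finish() }
}
```

In `draw(in:)` you would then call `pipeline.submit(...)` instead of `Task { await renderAsync(view) }`, and start one consumer elsewhere: `Task { for await frame in pipeline.frames { await render(frame) } }`. With a single consumer, the `beginRender`/`endRender` counting and the out-of-order commit check largely become unnecessary.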
                    
                  
                
                    
I ran into a memory issue that I don't understand. To me, it seems like ARC doesn't guarantee thread safety.
Let's look at the code below:
@propertyWrapper
public struct AtomicCollection<T> {
    private var value: [T]
    private var lock = NSLock()
    
    public var wrappedValue: [T] {
        set {
            lock.lock()
            defer { lock.unlock() }
            value = newValue
        }
        get {
            lock.lock()
            defer { lock.unlock() }
            return value
        }
    }
    
    public init(wrappedValue: [T]) {
        self.value = wrappedValue
    }
}
final class CollectionTest: XCTestCase {
    func testExample() throws {
        let rounds = 10000
        let exp = expectation(description: "test")
        exp.expectedFulfillmentCount = rounds
        
        @AtomicCollection var array: [Int] = []
        for i in 0..<rounds {
            DispatchQueue.global().async {
                array.append(i)
                exp.fulfill()
            }
        }
        
        wait(for: [exp])
    }
}
It will crash for various reasons (see screenshots below)
I know that the test doesn't reflect typical application usage. My app is quite different from a traditional app, so the code above is just the simplest reproduction of the issue.
One more thing to mention here: array.count won't be equal to 10,000 as expected (probably because of copy-on-write snapshots).
So my questions are:
Is this a bug, undefined behavior, or expected behavior of Swift/Objective-C ARC?
Why can this happen?
Any suggested solutions?
How do you usually deal with thread-safe collections (array, dictionary, set)?
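The usual explanation for crashes like this: the lock makes the get and the set individually atomic, but `array.append(i)` is a get, a mutation of a temporary copy, then a set — two threads can interleave between those steps (which is also why the final count falls short of 10,000), and concurrently mutating the struct-backed wrapper itself violates Swift's exclusivity rules. A sketch of one common fix (type and method names are mine): a reference type that puts the whole read-modify-write inside one critical section.

```swift
import Foundation

// The entire mutation happens under the lock, so concurrent appends
// cannot interleave mid-mutation the way a get/set pair can.
final class AtomicArray<T>: @unchecked Sendable {
    private var storage: [T] = []
    private let lock = NSLock()

    func append(_ element: T) {
        lock.lock()
        defer { lock.unlock() }
        storage.append(element)
    }

    var snapshot: [T] {
        lock.lock()
        defer { lock.unlock() }
        return storage
    }
}
```

With this shape, N concurrent appends reliably produce a count of N; the same idea extends to dictionaries and sets, or you can reach for an actor if you can afford `await` at the call sites.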
                    
                  
                
                    
                      I'm currently migrating a midsize (20k LOC) project to Swift structured concurrency. With complete checking turned on, it currently builds with only two warnings, both of which are related to the QLPreviewControllerDelegate protocol:
"Main actor-isolated instance method 'previewControllerDidDismiss' cannot be used to satisfy nonisolated protocol requirement; this is an error in the Swift 6 language mode" as well as the same warning but substituting 'previewController(_:transitionViewFor:)' for the method name.
I'm confused as to how to make these nonisolated, as they use UIKit classes/subclasses as arguments and/or return types.
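One pattern that fits this situation, assuming the framework documents that the callback is delivered on the main thread: keep the method nonisolated to satisfy the protocol, then use `MainActor.assumeIsolated` to get back onto the main actor without a hop. A generic sketch with a hypothetical stand-in protocol, since the shape is the same for any such delegate:

```swift
// Hypothetical stand-in for a delegate protocol with a nonisolated requirement.
protocol PreviewDelegateStandIn {
    func previewDidDismiss()
}

@MainActor
final class Coordinator: PreviewDelegateStandIn {
    var dismissed = false

    // nonisolated satisfies the protocol requirement; assumeIsolated is
    // sound only because the framework promises this callback runs on the
    // main thread (it traps at runtime if that promise is broken).
    nonisolated func previewDidDismiss() {
        MainActor.assumeIsolated {
            dismissed = true
        }
    }
}
```

The UIKit arguments and return types are unaffected by this: they pass straight through the nonisolated signature into the assumeIsolated body, where main-actor state is accessible again.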
                    
                  
                
                    
I made a small program to make dots. Many of them.
I have a Generator which generates dots in a loop:
        // repeat until all dots are in the frame
        while !newDots.isEmpty {
            virginDots = []
            for newDot in newDots {
                autoreleasepool{
                    virginDots.append(
                        contentsOf: newDot.addDots(in: size,  allDots: &result, inSomeWay))
                }
                newDots = virginDots
            }
            counter += 1
            print("\(result.count) dots in \(counter) generations")
        }
Sometimes this loop needs hours/days to finish (depending on the inSomeWay settings), so it would be very nice to send partial results to a View, and/or, if the result is not satisfying, break this loop and start over.
My understanding of Tasks and concurrency gets worse each time I try to understand them; maybe it's my age, maybe a language barrier. For now, a Button with a {Task {...}} action doesn't remove the rainbow wheel from my screen. Killing the app is wrong, because killing is wrong.
How do I deal with this?
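A sketch of the shape that usually answers both requests, partial results and cancellation (all names are mine, and `nextGeneration` is a trivial stand-in for `addDots`): run the loop in a Task, check `Task.isCancelled` every generation, and hand each generation's partial result to a callback. In SwiftUI, that callback would update `@State` on the main actor, and the Button's action would keep the Task around so it can call `task.cancel()`:

```swift
// Stand-in for one generation of dot work.
func nextGeneration(_ dots: [Int]) -> [Int] {
    dots.map { $0 + 1 }
}

// Runs until done or cancelled, reporting each generation's partial result.
func generateDots(generations: Int,
                  onPartialResult: @escaping @Sendable ([Int]) -> Void) -> Task<[Int], Never> {
    Task {
        var dots = [0]
        for _ in 0..<generations {
            if Task.isCancelled { break }   // "break this loop and start over"
            dots = nextGeneration(dots)
            onPartialResult(dots)           // partial result for the View
            await Task.yield()              // keep the UI responsive
        }
        return dots
    }
}
```

Because the loop runs in its own task and yields each generation, the main thread (and the rainbow wheel) is left alone; cancelling simply makes the next `isCancelled` check exit the loop.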
                    
                  
                
                    
                      Hi everyone,
I’m planning to develop a cross-platform PDF viewer app for iOS and macOS that will read PDFs from local storage and cloud services (Google Drive, iCloud, WorkDrive, etc.). The app should be read-only and display both the PDF content and its metadata (author, title, creation date, etc.).
Key Features:
View PDFs: Local and remote (cloud storage integration).
Display metadata: Title, author, page count, etc.
Cloud integration: Google Drive, iCloud, Zoho WorkDrive, etc.
Read-only mode: No editing features, just viewing.
Questions:
Xcode Template: Should I use the Document App or Generic App template for this?
PDF Metadata: Any built-in libraries for extracting PDF metadata in a read-only app?
Performance: Any advice or documentation on handling large PDFs or cloud fetching efficiently?
Thanks in advance for any advice or resources!
                    
                  
                
                    
                      Hi,
I have a complex structure of classes, and I'm trying to migrate to Swift 6.
For these classes I have a facade that creates them for me without disclosing their internals; they only conform to a known protocol.
I think I've hit a hard wall in my knowledge of how actors can exchange data between themselves. I've created a small piece of code that triggers the error I've hit:
import SwiftUI
import Observation
@globalActor
actor MyActor {
    static let shared: some Actor = MyActor()
    init() {
    }
}
@MyActor
protocol ProtocolMyActor {
    var value: String { get }
    func set(value: String)
}
@MyActor
func make(value: String) -> ProtocolMyActor {
    return ImplementationMyActor(value: value)
}
class ImplementationMyActor: ProtocolMyActor {
    private(set) var value: String
    init(value: String) {
        self.value = value
    }
    func set(value: String) {
        self.value = value
    }
}
@MainActor
@Observable
class ViewObserver {
    let implementation: ProtocolMyActor
    var value: String
    init() async {
        let implementation = await make(value: "Ciao")
        self.implementation = implementation
        self.value = await implementation.value
    }
    func set(value: String) {
        Task {
            await implementation.set(value: value)
            self.value = value
        }
    }
}
struct MyObservedView: View {
    @State var model: ViewObserver?
    
    var body: some View {
        if let model {
            Button("Loaded \(model.value)") {
                model.set(value: ["A", "B", "C"].randomElement()!)
            }
        } else {
            Text("Loading")
                .task {
                    self.model = await ViewObserver()
                }
        }
    }
}
The error
Non-sendable type 'any ProtocolMyActor' passed in implicitly asynchronous call to global actor 'MyActor'-isolated property 'value' cannot cross actor boundary
Occurs in the init on the line "self.value = await implementation.value"
I don't know which concurrency error is happening... Yes, the init is on the MainActor, but the ProtocolMyActor data can only be accessed on the MyActor queue, so no data races can happen... and each access to my ImplementationMyActor uses await, so I'm not reading or writing the object from a different actor; I just pass Sendable values as parameters to a function of the object.
Can anybody help me understand this concurrency problem better?
Thanks
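For what it's worth, my reading is that the diagnostic is about the existential, not your accesses: `any ProtocolMyActor` is not Sendable, so the compiler won't let the value cross from MyActor to MainActor at all, no matter how carefully each access awaits. A sketch of one fix, under the assumption that every conformer stays MyActor-isolated: make the protocol refine Sendable (sound here because global-actor-isolated classes are implicitly Sendable), and declare `shared` with its concrete type, the conventional @globalActor form:

```swift
@globalActor
actor MyActor {
    static let shared = MyActor()
}

// Refining Sendable lets `any ProtocolMyActor` cross actor boundaries;
// that's sound because conformers are isolated to MyActor anyway.
@MyActor
protocol ProtocolMyActor: Sendable {
    var value: String { get }
    func set(value: String)
}

@MyActor
final class ImplementationMyActor: ProtocolMyActor {
    private(set) var value: String
    init(value: String) { self.value = value }
    func set(value: String) { self.value = value }
}

@MyActor
func make(value: String) -> any ProtocolMyActor {
    ImplementationMyActor(value: value)
}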
                    
                  
                
                    
                      Hi everyone,
I'm working on integrating object recognition from live video feeds into my existing app by following Apple's sample code. My original project captures video and records it successfully. However, after integrating the Vision-based object detection components (VNCoreMLRequest), no detections occur, and the callback for the request is never triggered.
To debug this issue, I’ve added the following functionality:
Set up AVCaptureVideoDataOutput for processing video frames.
Created a VNCoreMLRequest using my Core ML model.
The video recording functionality works as expected, but no object detection happens. I’d like to know:
How can I debug this further? Which key debug points or logs could help identify where the issue lies?
Have I missed any key configurations? Below is a diff of the modifications I’ve made to my project for the new feature.
Diff of Changes:
(Attach the diff provided above)
Specific Observations:
The captureOutput method is invoked correctly, but there is no output or error from the Vision request callback.
Print statements in my setup function setForVideoClassify() show that the setup executes without errors.
Questions:
Could this be due to issues with my Core ML model compatibility or configuration?
Is the VNCoreMLRequest setup incorrect, or do I need to ensure specific image formats for processing?
Platform:
Xcode 16.1, iOS 18.1, Swift 5, SwiftUI, iPhone 11,
Darwin MacBook-Pro.local 24.1.0 Darwin Kernel Version 24.1.0: Thu Oct 10 21:02:27 PDT 2024; root:xnu-11215.41.3~2/RELEASE_X86_64 x86_64
Any guidance or advice is appreciated! Thanks in advance.
                    
                  
                
                    
                      Hi! I'm running into a warning from a SwiftUI.DynamicProperty on a 6.0 development build (swift-6.0-DEVELOPMENT-SNAPSHOT-2024-03-26-a).
I am attempting to build a type (conforming to DynamicProperty) that should also be MainActor. This type will also need a custom update function. Here is a simple custom wrapper (handwaving over the orthogonal missing pieces) that shows the warning:
import SwiftUI
@MainActor struct MainProperty: DynamicProperty {
  //  Main actor-isolated instance method 'update()' cannot be used to satisfy nonisolated protocol requirement; this is an error in the Swift 6 language mode
  @MainActor func update() {
    
  }
}
Is there anything I can do about that warning? Does the warning correctly imply that this will be a legit compiler error when 6.0 ships?
I can find (at least) two examples of types adopting DynamicProperty from Apple that are also MainActor: FetchRequest and SectionedFetchRequest. What is confusing is that both FetchRequest^1 and SectionedFetchRequest^2 explicitly declare their update method to be MainActor. Is there anything missing from my Wrapper declaration that can get me what I'm looking for? Any more advice about that? Thanks!
                    
                  
                
                    
I've been using SwiftUI lately in my iOS mobile app. The mobile app already has a pipeline that detects any experimental features and throws an error.
I am using Swift 5, and as you all know, SwiftUI uses some of the OpaqueTypeErasure utility types, like "some".
I heard that in Swift 6 OpaqueTypeErasure is not experimental anymore.
But upgrading the app's Swift version would be a very long process.
Also, changing the pipeline would be a very long and tiring process.
So I want to know if there is a way to remove OpaqueTypeErasure from SwiftUI, and what the alternatives are for bypassing the error thrown by the pipeline.
                    
                  
                
                    
I am attempting to do batch transcription of audio files exported from Voice Memos, and I am running into an interesting issue. If I only transcribe a single file it works every time, but if I try to batch them, only the last one works, and the others fail with "No speech detected". I assumed it must be something about concurrency, so I implemented what I think should remove any chance of transcriptions running in parallel. With a mocked-up unit of work, everything looked good. So I added the transcription back in, and:
1: It still fails on all but the last file. This happens if I am processing 10 files or just 2.
2: It no longer processes in order, any file can be the last one that succeeds. And it seems to not be related to file size. I have had paragraph sized notes finish last, but also a single short sentence that finishes last.
I left the mocked processFile() in place (commented out) for reference.
Any insights would be greatly appreciated.
import Speech
import SwiftUI
struct ContentView: View {
    @State private var processing: Bool = false
    @State private var fileNumber: String?
    @State private var fileName: String?
    @State private var files: [URL] = []
    
    let locale = Locale(identifier: "en-US")
    let recognizer: SFSpeechRecognizer?
    
    init() {
        self.recognizer = SFSpeechRecognizer(locale: self.locale)
    }
    
    var body: some View {
        VStack {
            if files.count > 0 {
                ZStack {
                    ProgressView()
                    Text(fileNumber ?? "-")
                        .bold()
                }
                Text(fileName ?? "-")
            } else {
                Image(systemName: "folder.badge.minus")
                Text("No audio files found")
            }
        }
        .onAppear {
            files = getFiles()
            Task {
                await processFiles()
            }
        }
    }
    private func getFiles() -> [URL] {
        do {
            let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
            let path = documentsURL.appendingPathComponent("Voice Memos").absoluteURL
            
            let contents = try FileManager.default.contentsOfDirectory(at: path, includingPropertiesForKeys: nil, options: [])
            
            let files = (contents.filter {$0.pathExtension == "m4a"}).sorted { url1, url2 in
                url1.path < url2.path
            }
            
            return files
        }
        catch {
            print(error.localizedDescription)
            return []
        }
    }
    
    private func processFiles() async {
        var fileCount = files.count
        for file in files {
            fileNumber = String(fileCount)
            fileName = file.lastPathComponent
            await processFile(file)
            fileCount -= 1
        }
    }
    
//    private func processFile(_ url: URL) async {
//        let seconds = Double.random(in: 2.0...10.0)
//        await withCheckedContinuation { continuation in
//            DispatchQueue.main.asyncAfter(deadline: .now() + seconds) {
//                continuation.resume()
//                print("\(url.lastPathComponent) \(seconds)")
//            }
//        }
//    }
    private func processFile(_ url: URL) async {
        let recognitionRequest = SFSpeechURLRecognitionRequest(url: url)
        recognitionRequest.requiresOnDeviceRecognition = false
        recognitionRequest.shouldReportPartialResults = false
        
        await withCheckedContinuation { continuation in
            recognizer?.recognitionTask(with: recognitionRequest) { (transcriptionResult, error) in
                guard transcriptionResult != nil else {
                    print("\(url.lastPathComponent.uppercased())")
                    print(error?.localizedDescription ?? "")
                    return
                }
                if ((transcriptionResult?.isFinal) == true) {
                    if let finalText: String = transcriptionResult?.bestTranscription.formattedString {
                        print("\(url.lastPathComponent.uppercased())")
                        print(finalText)
                    }
                }
            }
            continuation.resume()
        }
    }
}
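I can't run SFSpeechRecognizer here, but one thing stands out in processFile: `continuation.resume()` is called immediately after starting the recognition task, not inside its result handler, so the function returns (and the next file starts) before transcription finishes; the guard's failure path also never resumes. A sketch of the intended continuation shape, with `mockRecognize` standing in for `recognitionTask(with:resultHandler:)`:

```swift
import Dispatch

// Stand-in for a callback-based API such as recognitionTask(with:resultHandler:).
func mockRecognize(_ fileName: String,
                   resultHandler: @escaping @Sendable (String?, Error?) -> Void) {
    DispatchQueue.global().asyncAfter(deadline: .now() + 0.05) {
        resultHandler("transcript of \(fileName)", nil)
    }
}

// Resume exactly once, from inside the callback, on every path --
// including the error path -- so the awaiting caller isn't left hanging
// and doesn't race ahead to the next file.
func transcribe(_ fileName: String) async -> String? {
    await withCheckedContinuation { (continuation: CheckedContinuation<String?, Never>) in
        mockRecognize(fileName) { result, error in
            guard error == nil else {
                continuation.resume(returning: nil)
                return
            }
            continuation.resume(returning: result)
        }
    }
}
```

With the real API, guard the resume on `isFinal` or on an error, since the handler can fire more than once and resuming a checked continuation twice traps at runtime.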
                    
                  
                
                    
                      I've just tried to update a project that uses SwiftData to Swift 6 using Xcode 16 beta 1, and it's not working due to missing Sendable conformance on a couple of types (MigrationStage and Schema.Version):
struct LocationsMigrationPlan: SchemaMigrationPlan {
    static let schemas: [VersionedSchema.Type] = [LocationsVersionedSchema.self]
    static let stages: [MigrationStage] = []
}
struct LocationsVersionedSchema: VersionedSchema {
    static let models: [any PersistentModel.Type] = [
        Location.self
    ]
    static let versionIdentifier = Schema.Version(1, 0, 0)
}
This code results in the following errors:
error: static property 'stages' is not concurrency-safe because non-'Sendable' type '[MigrationStage]' may have shared mutable state
    static let stages: [MigrationStage] = []
               ^
error: static property 'versionIdentifier' is not concurrency-safe because non-'Sendable' type 'Schema.Version' may have shared mutable state
    static let versionIdentifier = Schema.Version(1, 0, 0)
               ^
Am I missing something, or is this a bug in the current seed? I've filed this as FB13862584.
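Until those SwiftData types gain Sendable, a workaround I've seen suggested is to turn the stored `static let` into a computed `static var`, since a computed property has no shared storage for strict checking to flag. Sketched here with a stand-in non-Sendable type, because SwiftData itself can't be compiled in isolation:

```swift
// Stand-in for a non-Sendable framework type like MigrationStage.
final class StageStandIn {
    let name: String
    init(name: String) { self.name = name }
}

struct MigrationPlanStandIn {
    // A stored `static let` of a non-Sendable type is diagnosed as shared
    // state that may be mutable; a computed property is rebuilt on each
    // access and has no storage, so strict checking accepts it.
    static var stages: [StageStandIn] {
        []
    }
}
```

Applied to the original code, that would mean `static var stages: [MigrationStage] { [] }` and `static var versionIdentifier: Schema.Version { Schema.Version(1, 0, 0) }`, at the cost of rebuilding the value on each access.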