
How does a UIScene know whether it has the main key window, and when does that change in an iPad multi-scene app?
Hi. I want to know which window currently receives hardware keyboard events (such as keyboard shortcuts) on iPad. Until iPadOS 15.0 I used UIApplication.shared.keyWindow (deprecated since iPadOS 13.0) together with didBecomeKeyNotification/didResignKeyNotification. Since iPadOS 15.0, however, the key window is managed by UIScene, not by UIApplication. Each scene of my app always has exactly one window. For my purpose, checking the deprecated UIApplication.shared.keyWindow is still effective, but didBecomeKeyNotification and didResignKeyNotification don't work, because they fire only when the change happens inside a single scene. So my questions are:
1. What is the new alternative to UIApplication.shared.keyWindow? I know a hack like UIApplication.shared.connectedScenes.compactMap { $0 as? UIWindowScene }.first?.windows.filter { $0.isKeyWindow }.first does not work, since the order of connectedScenes is unrelated to which scene receives hardware keyboard events.
2. What are the new alternatives to didBecomeKeyNotification/didResignKeyNotification that work across scenes?
The second question is the more crucial one, because for the first I can still use the deprecated UIApplication.shared.keyWindow. Thanks.
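For reference, here is a minimal sketch of the kind of per-scene lookup I mean (my own helper, not an official replacement): it returns the key window of a foreground-active scene via UIWindowScene.keyWindow (iOS 15+). Whether this actually matches hardware-keyboard focus across scenes is exactly what I am unsure about.

import UIKit

extension UIApplication {
    // Sketch: pick the per-scene key window of a foreground-active scene.
    // This mirrors what the deprecated UIApplication.shared.keyWindow returns
    // in the single-key-window case, but is not a confirmed equivalent.
    var activeKeyWindow: UIWindow? {
        connectedScenes
            .compactMap { $0 as? UIWindowScene }
            .filter { $0.activationState == .foregroundActive }
            .compactMap { $0.keyWindow }   // UIWindowScene.keyWindow, iOS 15+
            .first
    }
}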
Replies: 0 · Boosts: 0 · Views: 174 · Activity: 4w
How do we use the computational power of A17 Pro Neural Engine?
Hi. The A17 Pro Neural Engine is rated at 35 TOPS of computational power, but many third-party benchmarks and articles suggest that it is only a little faster than the A16 Bionic. Some references: Geekbench ML; Core ML performance benchmark, 2023 edition. How do we use the maximum power of the A17 Pro Neural Engine? For example, I guess that the ANE on the A17 Pro may expose two logical devices rather than one, so we may need to instantiate two Core ML models simultaneously to reach that figure. Please let me know any technical hints.
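To make the guess concrete, this is roughly the experiment I have in mind. It is purely hypothetical: "MyModel" and "MyModelInput" are placeholders for a compiled Core ML model class and its generated input type, and I am not aware of any public API that exposes individual ANE cores.

import CoreML
import Foundation

// Hypothetical experiment: load two instances of the same model with the
// Neural Engine preferred, run predictions concurrently, and compare the
// throughput against a single instance.
func measureTwoInstances(input: MyModelInput) throws {
    let config = MLModelConfiguration()
    config.computeUnits = .cpuAndNeuralEngine   // prefer the ANE over the GPU

    let modelA = try MyModel(configuration: config)
    let modelB = try MyModel(configuration: config)

    DispatchQueue.concurrentPerform(iterations: 2) { index in
        let model = index == 0 ? modelA : modelB
        _ = try? model.prediction(input: input)
    }
}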
Replies: 1 · Boosts: 0 · Views: 2.3k · Activity: Oct ’23
How can I implement template selection when creating a new document with DocumentGroup?
Hi. I want to implement a template selection like the one in Pages and Numbers. Currently I am using a DocumentGroup scene in SwiftUI. How can I implement it?

init(
    newDocument: @autoclosure @escaping () -> Document,
    @ViewBuilder editor: @escaping (FileDocumentConfiguration<Document>) -> Content
)

The initializer of DocumentGroup may suggest that the newDocument argument should open the template selector and return the chosen template when the selector is closed, but I think that would become a complicated implementation. What is the right way to implement the template selector?
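The only workaround I can think of is to pick the template before the document is created, by keeping the selection in shared state that the newDocument autoclosure reads. A minimal sketch of that idea follows; the Template, TemplateStore, and TextTemplateDocument types are my own invention, not part of the DocumentGroup API.

import SwiftUI
import UniformTypeIdentifiers

// Hypothetical: the chosen template lives in shared state that the
// newDocument autoclosure reads whenever a new document is created.
enum Template: String, CaseIterable { case blank, letter, invoice }

final class TemplateStore: ObservableObject {
    static let shared = TemplateStore()
    @Published var selected: Template = .blank
}

struct TextTemplateDocument: FileDocument {
    static var readableContentTypes: [UTType] { [.plainText] }
    var text: String

    init(template: Template) {
        // Seed the new document's content from the chosen template.
        switch template {
        case .blank:   text = ""
        case .letter:  text = "Dear …,\n\n"
        case .invoice: text = "Invoice #\n\n"
        }
    }

    init(configuration: ReadConfiguration) throws {
        guard let data = configuration.file.regularFileContents,
              let string = String(data: data, encoding: .utf8) else {
            throw CocoaError(.fileReadCorruptFile)
        }
        text = string
    }

    func fileWrapper(configuration: WriteConfiguration) throws -> FileWrapper {
        FileWrapper(regularFileWithContents: Data(text.utf8))
    }
}

@main
struct TemplateApp: App {
    var body: some Scene {
        DocumentGroup(newDocument: TextTemplateDocument(template: TemplateStore.shared.selected)) { file in
            TextEditor(text: file.$document.text)
        }
        // The template picker itself would have to live elsewhere (a separate
        // WindowGroup or sheet) and write TemplateStore.shared.selected
        // before "New Document" is invoked, which still feels indirect.
    }
}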
Replies: 1 · Boosts: 0 · Views: 520 · Activity: Aug ’23
The banner of a local notification from a broadcast upload extension may not appear.
Hi. I implemented a broadcast upload extension and it requests local notifications. The local notification works normally from broadcastStarted(withSetupInfo:), but the banner of a notification requested from processSampleBuffer(_:with:) does not appear, even though it shows up in Notification Center as expected. What am I missing? Here are my code snippets.

Container app:

class AppDelegate: NSObject, UIApplicationDelegate {
    func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Override point for customization after application launch.
        requestAuthorization()
        ...
    }

    private func requestAuthorization() {
        let center = UNUserNotificationCenter.current()
        center.requestAuthorization(options: [.alert]) { granted, error in
            if let error = error {
                // Handle the error here.
                print(error)
            }
            if granted == true {
                center.delegate = self
                center.getNotificationSettings(completionHandler: { setting in
                    print(setting)
                })
            } else {
                print("not permitted")
            }
        }
    }
}

Upload extension:

class SampleHandler: RPBroadcastSampleHandler {
    override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
        super.broadcastStarted(withSetupInfo: setupInfo)
        notification(title: "Upload Extension", body: "broadcastStarted")
        ...
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
        super.processSampleBuffer(sampleBuffer, with: sampleBufferType)
        ...
        if some condition {
            notification(title: "Upload Extension", body: "processSampleBuffer")
        }
    }

    private func notification(title: String, body: String) {
        let content = UNMutableNotificationContent()
        content.title = title
        content.body = body
        let request = UNNotificationRequest(identifier: UUID().uuidString, content: content, trigger: nil)
        let notificationCenter = UNUserNotificationCenter.current()
        notificationCenter.add(request) { error in
            if let error = error {
                print(error)
            }
        }
    }
}
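One piece I have not shown is the UNUserNotificationCenterDelegate implementation in the container app. This is a minimal sketch of what I would add there, assuming (and I am not sure this is the actual cause) that the missing banner is a foreground-presentation issue rather than something specific to the extension:

extension AppDelegate: UNUserNotificationCenterDelegate {
    // Without opting in here, notifications that arrive while the container
    // app is in the foreground are delivered without a banner.
    func userNotificationCenter(_ center: UNUserNotificationCenter,
                                willPresent notification: UNNotification,
                                withCompletionHandler completionHandler: @escaping (UNNotificationPresentationOptions) -> Void) {
        completionHandler([.banner, .list])   // .banner and .list require iOS 14+
    }
}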
Replies: 3 · Boosts: 0 · Views: 2.4k · Activity: Feb ’22
Is cleaning up StoreObserver in the applicationWillTerminate method enough?
Hi. In the sample code "Offering, Completing, and Restoring In-App Purchases" (https://developer.apple.com/documentation/storekit/in-app_purchase/offering_completing_and_restoring_in-app_purchases), there is cleanup code in the applicationWillTerminate method:

func applicationWillTerminate(_ application: UIApplication) {
    // Remove the observer.
    SKPaymentQueue.default().remove(StoreObserver.shared)
}

But applicationWillTerminate is not called when the app is suspended, according to the documentation (https://developer.apple.com/documentation/uikit/uiapplicationdelegate/1623111-applicationwillterminate). What would happen to StoreObserver.shared if the suspended app is terminated? Thanks.
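For context, this is how I pair the observer registration with the cleanup, following the sample. My assumption (which is exactly what I would like confirmed) is that when a suspended process is killed, the in-memory observer simply disappears with the process, and any unfinished transactions remain in the payment queue until the observer is added again on the next launch.

import UIKit
import StoreKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Add the observer at launch, as the sample does.
        SKPaymentQueue.default().add(StoreObserver.shared)
        return true
    }

    func applicationWillTerminate(_ application: UIApplication) {
        // Remove the observer; not called when a suspended app is killed.
        SKPaymentQueue.default().remove(StoreObserver.shared)
    }
}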
Replies: 0 · Boosts: 0 · Views: 655 · Activity: Jan ’21
Could someone summarize the relationship between Catalyst and M1 Macs?
Hi. When Catalyst was released, I understood that it consisted of two parts: compiling for the Intel CPU and UIKit on macOS, so you need to add the "Mac" target and Mac.entitlements in Xcode to run your app via Catalyst. Now the M1 Mac has appeared, and it runs iPhone/iPad apps without that extra compilation step, as long as they are distributed. In this case:
1. Do iPhone/iPad apps still run on Catalyst? (I mean: do macOS-specific features such as UIHoverGestureRecognizer work on an M1 Mac if you implement something using them?)
2. Why don't you need Mac.entitlements?
These two questions are just what I came up with; a detailed summary is welcome. Thanks.
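One concrete check I know of is to distinguish the two cases at runtime with ProcessInfo (iOS 14 / macOS 11 API). Whether the first case also gets Catalyst behaviors like UIHoverGestureRecognizer is part of my question, so this sketch only shows how to tell the configurations apart, not what each one supports.

import Foundation

// Note: isMacCatalystApp is also true for an unmodified iOS app running on
// Apple silicon, so the iOS-on-Mac check has to come first.
let info = ProcessInfo.processInfo
if info.isiOSAppOnMac {
    print("Unmodified iPhone/iPad app running on an Apple silicon Mac")
} else if info.isMacCatalystApp {
    print("Built with the Mac (Catalyst) target")
} else {
    print("Running on iOS/iPadOS")
}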
Replies: 0 · Boosts: 0 · Views: 662 · Activity: Jan ’21