Search results for

“Visual Studio Maui IOS”

109,093 results found

Post · Replies · Boosts · Views · Activity

The audio of FairPlay protected content can be captured - Safari on iOS
Hi, Has anyone been able to protect the audio part of FairPlay protected content from being captured as part of screen recording on Safari/iOS (PWA and/or online web app)? We have tried many things but could not prevent the audio from being recorded. Same app and content on Safari/Mac does not allow audio to be recorded. Any tips?
Replies: 1 · Boosts: 0 · Views: 183 · Activity: 1w
[Xcode 26 beta 4] Cannot receive device token from APNS using iOS 26 simulator
Since upgrading to Xcode 26 beta 4 and using the iOS 26 simulator for testing our app, we've stopped being able to receive device tokens for the simulator from the development APNS environment. The APNS environment is able to return meta device information (e.g. model, type, manufacturer) but there are no device tokens present. When running the same app using the iOS 18.5 simulator, we are able to register the device with the same APNS environment and receive a valid device token.
Replies: 16 · Boosts: 0 · Views: 3.4k · Activity: 1w
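For context, the registration path being tested in the post above follows the standard UIKit APNS flow. The sketch below is generic, not the poster's actual code; the `AppDelegate` shown is a hypothetical minimal delegate whose only job is to surface whether the token callback fires.

```swift
import UIKit
import UserNotifications

// Minimal sketch of the APNS registration path: if the simulator can reach the
// development APNS environment, didRegisterForRemoteNotificationsWithDeviceToken
// should fire with a token shortly after registerForRemoteNotifications().
class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        UNUserNotificationCenter.current().requestAuthorization(options: [.alert, .sound, .badge]) { granted, _ in
            guard granted else { return }
            DispatchQueue.main.async {
                application.registerForRemoteNotifications()
            }
        }
        return true
    }

    func application(_ application: UIApplication,
                     didRegisterForRemoteNotificationsWithDeviceToken deviceToken: Data) {
        // Hex-encode the token for logging against the APNS environment.
        let token = deviceToken.map { String(format: "%02x", $0) }.joined()
        print("APNS device token: \(token)")
    }

    func application(_ application: UIApplication,
                     didFailToRegisterForRemoteNotificationsWithError error: Error) {
        print("APNS registration failed: \(error)")
    }
}
```

On the iOS 26 simulator described above, neither delegate callback reportedly fires, which is what distinguishes this from an ordinary entitlement or capability misconfiguration.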
Reply to EASession(accessory:forProtocol:) always returns nil — MFI accessory iAP2
Hi, Thank you for the suggestions. I have tested the communication using the EADemo sample app as suggested, but unfortunately, I had no luck. The app is unable to establish a stable session with the accessory, which mirrors the issues we are seeing with our own application. To move this forward, I have filed an official bug report via Feedback Assistant and attached a full sysdiagnose captured during a failed communication attempt: Feedback ID: FB22116486 Current Project Status: Certification: The accessory is currently in an early development state, so we have not opted for MFi certification yet. We are using development identifiers/chips for testing the iAP2 protocol. Prior Success: We have not yet been able to successfully maintain a functional data session between the ExternalAccessory framework and this specific hardware revision. Technical Observations from Firmware Logs: Based on our local logs, the hardware side appears to be performing correctly: Link Synchronization: The iAP2 link completes the han
Topic: App & System Services · SubTopic: Hardware
Activity: 1w
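The failing call from the thread title can be exercised with a minimal sketch like the one below. The protocol string `com.example.iap2` is a placeholder; a real app must declare its protocol under `UISupportedExternalAccessoryProtocols` in Info.plist.

```swift
import ExternalAccessory

// Sketch: locate a connected accessory by protocol string and open an
// EASession — the failable initializer the thread reports always returning nil.
func openSession(forProtocol protocolString: String) -> EASession? {
    let accessories = EAAccessoryManager.shared().connectedAccessories
    guard let accessory = accessories.first(where: { $0.protocolStrings.contains(protocolString) }) else {
        return nil  // Accessory not connected, or protocol not advertised by the hardware
    }
    guard let session = EASession(accessory: accessory, forProtocol: protocolString) else {
        return nil  // Session creation failed (the symptom reported in this thread)
    }
    // Streams must be scheduled on a run loop and opened before data can flow.
    session.inputStream?.schedule(in: .current, forMode: .default)
    session.inputStream?.open()
    session.outputStream?.schedule(in: .current, forMode: .default)
    session.outputStream?.open()
    return session
}
```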
Reply to iCloud Sync not working with iPhone, works fine for Mac.
I have the same issue and filed a bug report already: FB22324179. Since updating to iOS 26.4, iCloud remote notifications (CloudKit push notifications) are no longer delivered. The same setup worked reliably on iOS 26.3 without any changes to the app or server configuration, so this is a new regression from Apple. didReceiveRemoteNotification: in the AppDelegate is no longer called.
Activity: 1w
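The broken delivery path described above corresponds to a silent-push CKQuerySubscription. A minimal sketch of such a subscription, with placeholder record type and subscription ID, looks like this:

```swift
import CloudKit

// Sketch of the silent-push subscription whose notifications reportedly
// stopped arriving on iOS 26.4. "Note" and "note-changes" are placeholders.
func registerSilentPushSubscription() {
    let subscription = CKQuerySubscription(
        recordType: "Note",
        predicate: NSPredicate(value: true),
        subscriptionID: "note-changes",
        options: [.firesOnRecordCreation, .firesOnRecordUpdate, .firesOnRecordDeletion]
    )
    let info = CKSubscription.NotificationInfo()
    // Content-available (silent) pushes are what arrive via
    // didReceiveRemoteNotification — the callback reported as no longer firing.
    info.shouldSendContentAvailable = true
    subscription.notificationInfo = info

    CKContainer.default().privateCloudDatabase.save(subscription) { _, error in
        if let error { print("Subscription save failed: \(error)") }
    }
}
```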
Reply to Advanced App Clip Experiences stuck in "Received" — never transition to "Published"
Update with new findings: We discovered that our /g/{gameId} App Clip experience works perfectly on fresh devices via QR code scan — every time, first try. But /next-game never works on first scan for fresh devices. Both experiences show Received status in ASC. The /g experience was created earlier than /next-game. Our workaround: we deactivated the /next-game ASC experience. Now QR scans fall through to Safari, where — as you explained — the Smart App Banner reads the meta tag and cross-references the AASA locally, loading the App Clip successfully. Interesting behavior: once a device loads the App Clip binary through the Safari/meta tag path, subsequent QR scans for /next-game invoke the App Clip directly. iOS appears to cache the binary, and the direct invocation works from that point forward. So the binary, AASA, and meta tags are all correct. Something specific to the /next-game experience is preventing first-time invocation via QR scan on fresh devices, while the older /g experience works fine
Activity: 1w
Reply to App Clips not working
Update: Even though App Store Connect says my domains are valid after correcting the team ID, and the CDN cache has what appear to be proper values, App Clips are still not working for me, meaning: every time I scan the QR code with the camera when the app isn't installed, it doesn't go to the App Clip. If I have the App Store create an app code for me, it says no content to show (though this may be because App Store Connect shows Received for the advanced App Clip experiences I created). So I checked the CDN here: https://app-site-association.cdn-apple.com/a/v1/akin-server-side-staging.onrender.com, and to my understanding my AASA in the CDN cache is proper. I updated the entitlements as recommended: appclips:akin-server-side-staging.onrender.com I also reviewed several Apple documentation pages, downloaded the Fruta sample, and examined its configuration part by part, specifically the entitlements for both the App Clip and parent targets. On my iOS device, I went to Settings > Dev
Topic: App & System Services · SubTopic: General
Activity: 1w
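For reference, the CDN-cached AASA the poster is checking needs an `appclips` section along these lines; the app identifier below is a placeholder (team ID prefix plus the App Clip's bundle ID), not the poster's actual value.

```json
{
  "appclips": {
    "apps": ["ABCDE12345.com.example.app.Clip"]
  }
}
```

This must be served from `/.well-known/apple-app-site-association` on the associated domain, and the team ID prefix must match the one in the App Clip's `appclips:` associated-domains entitlement.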
Reply to SpeechTranscriber/SpeechAnalyzer being relatively slow compared to FoundationModel and TTS
I've been optimizing a similar STT-to-action pipeline on macOS 26 and found a few additional tricks beyond prepareToAnalyze that helped bring the finalization latency down: Use volatileResults aggressively for UI feedback, but trigger your downstream action (FoundationModel call) on the volatile transcript as soon as it stabilizes — don't wait for the finalized event. In my testing, the volatile transcript matches the final one ~95% of the time for short utterances. You can always correct if the final differs. Audio format matters more than you'd expect. If your input is coming through at 48kHz (common from ScreenCaptureKit or external mics), the internal resample to 16kHz adds measurable overhead. Setting up your AVAudioEngine tap at 16kHz mono from the start shaves ~200ms off the pipeline. The large variance Bersaelor observed with prepareToAnalyze (0.05s to 3s) likely correlates with whether the ANE was already warm. If other CoreML workloads are running concurrently (even system ones like Visual
Topic: Media Technologies · SubTopic: Audio
Activity: 1w
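The 16 kHz tip above can be sketched as follows. Rather than installing the tap directly at 16 kHz (which can throw if the format disagrees with the input node's hardware rate), this sketch taps at the native format and converts with AVAudioConverter before handing buffers downstream; `analyzerInput` is a placeholder for whatever feeds the transcriber.

```swift
import AVFoundation

// Sketch: capture at the input node's native format (often 48 kHz) and convert
// to 16 kHz mono up front, so the analyzer doesn't resample internally.
func startCapture(engine: AVAudioEngine,
                  analyzerInput: @escaping (AVAudioPCMBuffer) -> Void) throws {
    let input = engine.inputNode
    let nativeFormat = input.outputFormat(forBus: 0)
    guard let targetFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                           sampleRate: 16_000,
                                           channels: 1,
                                           interleaved: false),
          let converter = AVAudioConverter(from: nativeFormat, to: targetFormat) else {
        throw NSError(domain: "CaptureSetup", code: -1)
    }

    input.installTap(onBus: 0, bufferSize: 4096, format: nativeFormat) { buffer, _ in
        let ratio = targetFormat.sampleRate / nativeFormat.sampleRate
        let capacity = AVAudioFrameCount(Double(buffer.frameLength) * ratio)
        guard let converted = AVAudioPCMBuffer(pcmFormat: targetFormat,
                                               frameCapacity: capacity) else { return }
        // Feed the tap buffer to the converter exactly once per callback.
        var consumed = false
        converter.convert(to: converted, error: nil) { _, outStatus in
            if consumed { outStatus.pointee = .noDataNow; return nil }
            consumed = true
            outStatus.pointee = .haveData
            return buffer
        }
        analyzerInput(converted)
    }
    try engine.start()
}
```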
CoreML regression between macOS 26.0.1 and macOS 26.1 Beta causing scrambled tensor outputs
We’ve encountered what appears to be a CoreML regression between macOS 26.0.1 and macOS 26.1 Beta. In macOS 26.0.1, CoreML models run and produce correct results. However, in macOS 26.1 Beta, the same models produce scrambled or corrupted outputs, suggesting that tensor memory is being read or written incorrectly. The behavior is consistent with a low-level stride or pointer arithmetic issue — for example, using 16-bit strides on 32-bit data or other mismatches in tensor layout handling. Reproduction Install ON1 Photo RAW 2026 or ON1 Resize 2026 on macOS 26.0.1. Use the newest Highest Quality resize model, which is Stable Diffusion–based and runs through CoreML. Observe correct, high-quality results. Upgrade to macOS 26.1 Beta and run the same operation again. The output becomes visually scrambled or corrupted. We are also seeing similar issues with another Stable Diffusion UNet model that previously worked correctly on macOS 26.0.1. This suggests the regression may affect multiple diffusion-style ar
Replies: 8 · Boosts: 0 · Views: 2.2k · Activity: 1w
Reply to iOS 18 new RecognizedTextRequest DEADLOCKS if more than 2 are run in parallel
I've been working with the new Swift Vision API's RecognizeTextRequest on iOS 18 and hit this exact deadlock. After profiling with Instruments, I found that the Vision framework internally uses a limited thread pool for its neural engine requests — on most devices this caps at 2 concurrent ANE inference sessions. The workaround I'm using is a semaphore-based concurrency limiter that queues requests: actor OCRPipeline { private let maxConcurrent = 2 private var running = 0 private var pending: [CheckedContinuation] = [] func recognizeText(in image: CGImage) async throws -> [String] { await acquireSlot() defer { Task { await releaseSlot() } } let request = RecognizeTextRequest() let handler = ImageRequestHandler(image) let observations = try await handler.perform(request) return observations.compactMap { $0.topCandidates(1).first?.string } } } This keeps throughput high while never exceeding the 2-concurrent-request limit. In my testing across iPhone 15 Pro and iPad Air M2, this processes ~40 images
Topic: Machine Learning & AI · SubTopic: General
Activity: 1w
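The snippet above is truncated by the results page. A self-contained sketch of the same actor-based limiter follows; note the slot is handed off directly to the next waiter in `releaseSlot`, because a naive decrement-then-resume could briefly let a third request through. The helper names are reconstructed, not taken verbatim from the poster's code.

```swift
import Vision
import CoreGraphics

// Sketch: cap concurrent RecognizeTextRequest calls at 2; further callers
// suspend on a continuation queue until a slot frees up.
actor OCRPipeline {
    private let maxConcurrent = 2
    private var running = 0
    private var pending: [CheckedContinuation<Void, Never>] = []

    func recognizeText(in image: CGImage) async throws -> [String] {
        await acquireSlot()
        defer { releaseSlot() }
        let request = RecognizeTextRequest()
        let handler = ImageRequestHandler(image)
        let observations = try await handler.perform(request)
        return observations.compactMap { $0.topCandidates(1).first?.string }
    }

    private func acquireSlot() async {
        if running < maxConcurrent { running += 1; return }
        await withCheckedContinuation { pending.append($0) }
        // Slot was transferred by releaseSlot(); `running` already counts us.
    }

    private func releaseSlot() {
        if !pending.isEmpty {
            // Hand the slot straight to the next waiter; `running` is unchanged.
            pending.removeFirst().resume()
        } else {
            running -= 1
        }
    }
}
```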
iOS 18 new RecognizedTextRequest DEADLOCKS if more than 2 are run in parallel
Following the WWDC24 video Discover Swift enhancements in the Vision framework (see the video at 10:41), I used the following code to perform multiple of the new iOS 18 `RecognizeTextRequest` in parallel. Problem: if more than 2 requests are run in parallel, the requests hang, leaving the app in a state where no more requests can be started. -> deadlock I tried other ways to run the requests, but no matter the method employed, or what device I use: no more than 2 requests can ever be run in parallel. func triggerDeadlock() {} try await withThrowingTaskGroup(of: Void.self) { group in // See: WWDC 2024 Discover Swift enhancements in the Vision framework at 10:41 // ############## THIS IS KEY let maxOCRTasks = 5 // On a real device, if more than 2 RecognizeTextRequest are launched in parallel using tasks, the request hangs // ############## THIS IS KEY for idx in 0.. [RecognizedText] { // Create request var request = RecognizeTextRequest() // Single request: no need for ImageRequestHandler // Co
Replies: 8 · Boosts: 0 · Views: 505 · Activity: 1w
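An alternative to an actor-based limiter is to bound the task group itself: seed it with at most two tasks and add the next one only when a task completes. This sketch assumes the same iOS 18 Vision API as the post; the `ocr` helper is a reconstructed placeholder for the poster's truncated per-image function.

```swift
import Vision
import CoreGraphics

// Placeholder per-image OCR step using the iOS 18 Vision API.
func ocr(_ image: CGImage) async throws -> [String] {
    let handler = ImageRequestHandler(image)
    let observations = try await handler.perform(RecognizeTextRequest())
    return observations.compactMap { $0.topCandidates(1).first?.string }
}

// Sketch: bounded task-group fan-out — never more than `maxConcurrent`
// requests in flight, sidestepping the 2-concurrent-request hang.
func processAll(_ images: [CGImage], maxConcurrent: Int = 2) async throws -> [[String]] {
    try await withThrowingTaskGroup(of: [String].self) { group in
        var results: [[String]] = []
        var iterator = images.makeIterator()
        // Seed the group with the first `maxConcurrent` tasks.
        for _ in 0..<maxConcurrent {
            guard let image = iterator.next() else { break }
            group.addTask { try await ocr(image) }
        }
        // Each completion frees a slot for the next image.
        while let result = try await group.next() {
            results.append(result)
            if let image = iterator.next() {
                group.addTask { try await ocr(image) }
            }
        }
        return results
    }
}
```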
Reply to The audio of FairPlay protected content can be captured - Safari on iOS
Hi, I would appreciate any feedback, either confirming or contradicting our observation above that audio cannot be protected from being captured on Safari/iOS. Thanks.
Topic: Media Technologies · SubTopic: Streaming
Activity: 1w
Reply to [Xcode 26 beta 4] Cannot receive device token from APNS using iOS 26 simulator
It seems that the issue has been resolved with iOS 26.4 and Xcode 26.4. At least my didRegisterForRemoteNotificationsWithDeviceToken gets called.
Activity: 1w
Reply to iCloud Sync not working with iPhone, works fine for Mac.
I don't know if it's related, but my sample iOS app (iOS 18.5) is not receiving the remote notification for its CKQuerySubscription today, which it did receive yesterday.
Activity: 1w
Reply to iCloud Sync not working with iPhone, works fine for Mac.
I also noticed the same issue. I have one device still running iOS 26.3 and another on 26.4; when I make a change, the 26.3 device gets the background notification and updates the data, but on the device with 26.4 it is not working. I also tested watchOS 26.4, where it works perfectly and the notification is processed.
Activity: 1w
Reply to Clipboard issues with simulators
This issue happens in the latest iOS 17, 18, and 26 simulators. I have not tested older simulators like iOS 15 or 16. I am using Xcode 26.4 (17E192), Simulator 16.0 (1063.2) (SimulatorKit 955.7, CoreSimulator 1051.49) and macOS Tahoe 26.4.
Activity: 1w