Search results for "file uri scheme"

79,827 results found

Post · Replies · Boosts · Views · Activity

Reply to Accessing Built-In iOS Alarm Sounds When Using AlarmKit
Thanks for the post, and sorry for the delay. Very interesting and great question. For alarm-like functionality, local notifications can be paired with custom sounds. The default alarm sounds available on iOS devices are proprietary; this restriction ensures that these sounds remain exclusive to the system and maintain their intended user experience. As of now, there is no public API that allows developers to access or directly play the built-in iOS alarm sounds in their apps using AlarmKit or any other framework within the iOS SDK. Developers looking to implement alarm functionality in their apps can provide their own audio files: you can bundle custom sound files with your app or, I believe, allow users to select from their own music library. In summary, as far as I know, you are currently limited to using custom audio files for alarm sounds in apps, as there is no supported way to access the default iOS alarm tones programmatically. Resources: https://developer.apple.com/docu
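A minimal sketch of that approach, pairing a local notification with a custom sound bundled in the app; the file name "alarm-tone.caf" and the trigger interval are placeholder assumptions, and note that custom notification sounds longer than 30 seconds fall back to the default sound.

import UserNotifications

// Schedules a one-off, alarm-style local notification that plays a sound
// file shipped in the app bundle ("alarm-tone.caf" is a placeholder name).
func scheduleAlarmStyleNotification(after seconds: TimeInterval) {
    let content = UNMutableNotificationContent()
    content.title = "Alarm"
    // Custom notification sounds must be bundled with the app (or placed in
    // Library/Sounds) and be 30 seconds or shorter, or iOS plays the default.
    content.sound = UNNotificationSound(named: UNNotificationSoundName("alarm-tone.caf"))

    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: seconds, repeats: false)
    let request = UNNotificationRequest(identifier: UUID().uuidString,
                                        content: content,
                                        trigger: trigger)
    UNUserNotificationCenter.current().add(request)
}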
1w
CoreBluetooth multi-peripheral high-frequency BLE streaming shows uneven packet distribution and lag on some A16/A17 iPads
We are observing a reproducible issue on some (not all) iPad models equipped with A16, where BLE streaming from multiple peripherals at ≥33–40 Hz results in uneven packet distribution, burst delivery, and application-level lag. The same application, peripherals, firmware, iOS version, and physical environment do not exhibit this behaviour on A14-based iPads (iPad 10).
Affected Hardware:
• iPad 11 with A16
• iOS versions: identical across tested devices
• Issue affects some devices of the same model, not all
Internal field data:
• ~25 affected
• ~5 unaffected
• Customers actively prefer iPad 10 (A14) due to stability
When two or more BLE peripherals stream data concurrently at frequencies ≥33–40 Hz, affected iPads exhibit:
• Uneven packet arrival timing
• Burst delivery instead of uniform intervals
• Increasing latency over time
• Observable application-level lag
This does not present as simple packet loss. Instead, packets arrive in clusters, breaking real-time assumptions. At ≤30–33 Hz, the issue does not rep
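A minimal sketch of one way to quantify the burst delivery described above, by logging inter-arrival gaps in the notification callback; the class and delegate wiring are illustrative assumptions, not the app's actual code.

import CoreBluetooth
import Foundation

// Logs the gap between notifications per peripheral. At 33–40 Hz the expected
// gap is roughly 25–30 ms; burst delivery shows up as clusters of very small
// gaps followed by one long gap rather than a steady interval.
final class PacketTimingLogger: NSObject, CBPeripheralDelegate {
    private var lastArrival: [UUID: Date] = [:]

    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic,
                    error: Error?) {
        let now = Date()
        if let previous = lastArrival[peripheral.identifier] {
            let gapMs = now.timeIntervalSince(previous) * 1000
            print("\(peripheral.identifier) gap: \(String(format: "%.1f", gapMs)) ms")
        }
        lastArrival[peripheral.identifier] = now
    }
}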
1
0
109
1w
Reply to My app doesn't respond on iPhone Air iOS 26.1.
As ray_cai outlined, the issue was related to the LaunchImage. @howard_kang this fixed the issue for me:
1. Remove LaunchImage from the Asset catalog.
2. Remove references to LaunchImage from your build settings.
3. Add a Launch Screen storyboard to the project.
4. Add the "Launch screen interface file base name" key to your Info.plist, with the name of your storyboard (Launch Screen.storyboard) as its value.
Topic: UI Frameworks SubTopic: UIKit Tags:
1w
Reply to Liquid Glass TabBar animations causes Hangs, bug with UIKitCore?
@DTS Engineer I was using iOS 26.1; the issue @ray_cai outlined above was related to the LaunchImage. The fix is:
1. Remove LaunchImage from the Asset catalog.
2. Remove references to LaunchImage from your build settings.
3. Add a Launch Screen storyboard to the project.
4. Add the "Launch screen interface file base name" key to your Info.plist, with the name of your storyboard (Launch Screen.storyboard) as its value.
I am not sure why this was causing a hang with the tab bar. Maybe the LaunchImage is not compatible with iOS 26.1 and the iPhone Air; the issue was not occurring on other iPhone models.
Topic: UI Frameworks SubTopic: SwiftUI Tags:
1w
Reply to NSScreen's maximumExtendedDynamicRangeColorComponentValue does not seem to provide the proper value after sleep/wake on third party HDR displays even when there is EDR content on screen in macOS Tahoe
Our engineering teams need to investigate this issue, as resolution may involve changes to Apple's software. Please file a bug report, include a small Xcode project and directions that can be used to reproduce the problem, and post the Feedback number here once you do; I'll check its status the next time I do a sweep of forums posts where I've suggested bug reports. Bug Reporting: How and Why? has tips on creating your bug report.
Topic: Graphics & Games SubTopic: Metal Tags:
1w
Reply to localStorage data loss when WKWebView loads local files
I am tracking some other issues related to localStorage, so please do file a bug report. Our engineering teams need to investigate this issue, as resolution may involve changes to Apple's software. Include a small Xcode project and directions that can be used to reproduce the problem, and post the Feedback number here once you do; I'll check its status the next time I do a sweep of forums posts where I've suggested bug reports. Bug Reporting: How and Why? has tips on creating your bug report.
Topic: Safari & Web SubTopic: General Tags:
1w
Reply to For receiving audio in PushtoTalk, channelManager(_:didActivate:) not called when app receives first push after backgrounding
No, I am not activating or changing anything in the AudioSession. I have a pretty straightforward setup: recording audio using AVAudioEngine and sending it to the server (which is running locally for now). When a PTT notification arrives, I save the URL to the audio file in incomingPushResult, then download and play that sound using AVAudioPlayer when didActivate is executed. It works fine in the foreground, but when the app goes to the background, incomingPushResult is called as expected but didActivate is not, for some reason. Am I missing anything in my setup?
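For reference, a stripped-down sketch of the flow described above; only the two callbacks in question are shown, the rest of the PTChannelManagerDelegate conformance is omitted, and the "audioURL" payload key and playback code are illustrative assumptions rather than the actual app code.

import AVFAudio
import PushToTalk
import UIKit

// Illustrative receiver: these two methods belong to PTChannelManagerDelegate;
// the remaining required callbacks (join/leave, transmit, push token) are omitted.
final class ChannelReceiver: NSObject {
    private var pendingAudioURL: URL?
    private var player: AVAudioPlayer?

    // Called for the incoming PTT push, including when the app is in the background.
    func incomingPushResult(channelManager: PTChannelManager,
                            channelUUID: UUID,
                            pushPayload: [String: Any]) -> PTPushResult {
        if let urlString = pushPayload["audioURL"] as? String,   // assumed payload key
           let url = URL(string: urlString) {
            pendingAudioURL = url
        }
        // Reporting an active remote participant is what asks the system to
        // activate the audio session and, in turn, call didActivate.
        return .activeRemoteParticipant(PTParticipant(name: "Remote", image: nil))
    }

    // Playback should only begin once the system hands over the activated session.
    func channelManager(_ channelManager: PTChannelManager,
                        didActivate audioSession: AVAudioSession) {
        guard let url = pendingAudioURL else { return }
        URLSession.shared.dataTask(with: url) { [weak self] data, _, _ in
            guard let data else { return }
            self?.player = try? AVAudioPlayer(data: data)
            self?.player?.play()
        }.resume()
    }
}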
Topic: App & System Services SubTopic: General Tags:
1w
Sound not working on TestFlight / App Store
I have a Flutter iOS app that has some simple sound FX for button clicks, swipes, etc. In the simulator and on a real device the sound works fine, but when I upload the app to TestFlight (and the App Store) the sound FX don't play. When I install the app on my phone via Xcode I am using the release profile, so I don't see what the difference could be. I have also gone through the archive that I uploaded and verified that the sound files are indeed there. I have other Flutter apps that use sound, but none since the iOS 26 update. I've tried 3 different Flutter sound libraries and all face the same issue. Wondering if anyone else is seeing this issue, or if I'm missing a simple permission or something that has changed recently? Thanks in advance.
2
0
178
1w
Reply to RealityKit / visionOS – Memory not released after dismissing ImmersiveSpace with USDZ models
Filed ticket number FB21327973 with a sample test app demonstrating memory pressure with UnlitMaterial usage. Also seeing an error on some images, although they are still being rendered on the device:
IOSurface creation failed: e00002c2 parentID: 00000000 properties: {
    IOSurfaceAddress = 4380393472;
    IOSurfaceAllocSize = 20027995;
    IOSurfaceCacheMode = 0;
    IOSurfaceMapCacheAttribute = 1;
    IOSurfaceName = CMPhoto;
    IOSurfacePixelFormat = 1246774599;
}
IOSurface creation failed: e00002c2 parentID: 00000000 property: IOSurfaceCacheMode
Topic: Spatial Computing SubTopic: General Tags:
1w
Reply to [iOS 26 Beta] BGTaskScheduler.supportedResources incorrectly reports no GPU support for BGContinuedProcessingTask on capable hardware
I'd like to +1 the need for an official compatibility list. If you haven't already, you're welcome to file a bug on this, and please send me the bug number if you do. However, I'm not sure that's likely to happen: the main reason we use runtime checks like BGContinuedProcessingTaskRequest.Resources.gpu is that they give the engineering team the flexibility to change which devices are supported (generally by expanding the set) without needing to keep the documentation in sync, particularly when the supported device set varies with OS version. Most of our APIs use this approach and don't explicitly describe what hardware they support. Case in point, the form here (something like this, for example) is a specific exception, because UIRequiredDeviceCapabilities is used for an install-time check of the entire app, not just a specific app feature. It's also tied to functionality that was inconsistently implemented across our product lines and system versions, which makes it much more
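To make the runtime-check pattern concrete, here is a minimal sketch using only the property and option named in this thread; treat the exact API shapes as assumptions rather than documented behaviour.

import BackgroundTasks

// Check what the current device and OS report at runtime instead of consulting
// a hard-coded device list; supportedResources and the .gpu option are the
// names referenced in this thread, so their exact shapes are assumptions.
func gpuContinuedProcessingAvailable() -> Bool {
    BGTaskScheduler.supportedResources.contains(.gpu)
}

// Callers can then branch to a CPU-only fallback when GPU support isn't reported.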
1w
Reply to We attempted to run a burn-in test while connected to our MacBook Pro M4 Max, but this crashed about 10 minutes into testing.
We attempted to run a burn-in test while connected to our MacBook Pro M4 Max, but this crashed about 10 minutes into testing. We tried to run a 2-hour burn-in on the M4 Max host while charging the battery from below 5%, running six bus-powered drives (via ATTO/Black Magic/IOmeter), hitting the RJ45 port at 2.5 Gbps (via JPerf), and streaming at least 4K 60 Hz video content to two displays; however, the M4 Max crashed in 20 minutes.
If you haven't already, please file a bug on this and post the bug number back here. In the bug, please attach the full panic logs as well as a sysdiagnose from the machine that panicked. One comment on the log picture you posted: that particular panic message (Halt/Restart Timed Out) comes from a very specific point, which you can actually see in xnu/iokit/Kernel/IOPlatformExpert.cpp:

...
IOShutdownNotificationsTimedOut(
    thread_call_param_t p0,
    thread_call_param_t p1)
{
#if !defined(__x86_64__)
    /* 30 seconds has elapsed - panic */
    panic("Halt/Restart Timed Out");
...

Critically
1w