Greetings! QUESTION: Is there any way to request expedited review of an in-app purchase? BACKGROUND: App Review approved our app yesterday, and I released it from the App Store Connect app for iOS while on the go. Only later, when I logged into App Store Connect from a browser at home, did I realize that our in-app purchases were still In Review. ISSUE: Users get a "no products found" error on our paywall because the IAPs are not yet approved! Apple ID: 6742640564 URL: https://apps.apple.com/us/app/nophone-digital-detox/id6742640564
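For context on how that error surfaces on the client, here is a minimal StoreKit 2 sketch (the product identifiers are hypothetical); while the IAPs are still In Review, the product request can come back empty, so the paywall needs a graceful fallback rather than an error:

import StoreKit

// Hypothetical product identifiers; substitute your own.
let identifiers = ["com.example.nophone.pro.monthly", "com.example.nophone.pro.yearly"]

func loadPaywallProducts() async {
    do {
        // While the IAPs are not yet approved (or the identifiers are wrong),
        // this call can succeed but return an empty array.
        let products = try await Product.products(for: identifiers)
        if products.isEmpty {
            print("No products available yet; hiding the paywall.")
        } else {
            print("Loaded products:", products.map(\.displayName))
        }
    } catch {
        print("Product request failed:", error)
    }
}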
I observed the following crash:

Code Type: ARM-64 (Native)
Parent Process: launchd [1]
User ID: 0
Date/Time: 2025-10-07 13:48:29.082
OS Version: macOS 15.6 (24G84)
Report Version: 12
Anonymous UUID: 8B651788-4B2E-7869-516B-1DA0D60F3744
Crashed Thread: 3  Dispatch queue: NEFlow queue
Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x0000000000000054
...
Thread 3 Crashed: Dispatch queue: NEFlow queue
0 libdispatch.dylib 0x000000019af6da34 dispatch_async + 192
1 libnetworkextension.dylib 0x00000001b0cf8580 __flow_startup_block_invoke.216 + 124
2 com.apple.NetworkExtension 0x00000001adf97da8 __88-[NEExtensionAppProxyProviderContext setInitialFlowDivertControlSocket:extraValidation:]_block_invoke.90 + 860
3 libnetworkextension.dylib 0x00000001b0cf8140 __flow_startup_block_invoke.214 + 172
4 libdispatch.dylib 0x000000019af67b2c _dispatch_call_block_and_release + 32
5 libdispatch.dylib 0x000000019af8185c _dispatch_client_callout + 16
6 libdispatch.dylib 0x000000019af70350 _d
I’m unable to notarize the executable and the .app — the status has been showing “In Progress” for over an hour. Checking with xcrun, the log indicates that the submission ID was not received. I also noticed there’s an Apple Developer Service outage reported since October 8, 2025. Could you please let me know when this outage is expected to be resolved? It would be very helpful.
I would like to know the answer to this as well. I want to integrate Visual Intelligence to process text in my application via the screenshot UI entry point.
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Tags:
Hi everyone, I’m working on an iOS MusicKit app that overlays a metronome on top of Apple Music playback. To line the clicks up perfectly I’d like access to low-level audio analysis data—ideally a waveform / spectrogram or beat grid—while the track is playing. I’ve noticed that several approved DJ apps (e.g. djay, Serato, rekordbox) can already:
• Display detailed scrolling waveforms of Apple Music songs
• Scratch, loop or time-stretch those tracks in real time
That implies they receive decoded PCM frames or at least high-resolution analysis data from Apple Music under a special entitlement. My questions:
1. Does MusicKit (or any public framework) expose real-time audio buffers, FFT bins, or beat markers for streaming Apple Music content?
2. If not, is there an Apple program or entitlement that developers can apply for—similar to the “DJ with Apple Music” initiative—to gain that deeper access?
3. Where can I find official documentation or a point of contact for this kind of request?
I’ve searched the
Topic:
Media Technologies
SubTopic:
Audio
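A small aside on the MusicKit question above: to the best of my knowledge the public MusicKit playback API does not hand out decoded audio buffers, so the sketch below only derives click timing from the player's reported playback position; the tempo value is hypothetical and this is no substitute for a real beat grid:

import Foundation
import MusicKit

// Hypothetical tempo; real beat-grid data is not available from the public API.
let beatsPerMinute = 120.0
let beatInterval = 60.0 / beatsPerMinute

// Poll the system music player's playback position and decide whether a
// metronome click should fire on this tick. Actually playing the click sound
// (e.g. with AVAudioEngine) is out of scope for this sketch.
func shouldClick(previousTime: TimeInterval) -> Bool {
    let currentTime = ApplicationMusicPlayer.shared.playbackTime
    let previousBeat = floor(previousTime / beatInterval)
    let currentBeat = floor(currentTime / beatInterval)
    return currentBeat > previousBeat
}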
I'm in the process of adding some Swift code to an app that is all Objective-C. I'm having trouble with my actual app, so I have worked on a prototype. Here is what I have done:
Created a new Xcode project, selecting App and Objective-C.
Added a blank Swift file and accepted the generation of the -Bridging-Header.
In the project build settings, set Defines Module to Yes and Always Embed Swift Libraries to Yes.
Added the appropriate .h files to the bridging header.
In Build Settings, under Swift Compiler - General, Objective-C Bridging Header has the correct path to the bridging header file.
Set up the single Swift file using @objcMembers like this:
@objcMembers class MySwiftClass: NSObject {
    func myMethod() {
        let output = ...
    }
}
When the project runs I execute this in the Objective-C ViewController:
MySwiftClass *swiftObject = [[MySwiftClass alloc] init];
and get: Use of undeclared identifier 'MySwiftClass'
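For reference, a minimal sketch of the Swift side described above (the class body is filled in with a placeholder, and the product name MyApp is hypothetical). The usual missing piece when Objective-C reports "Use of undeclared identifier" for a Swift class is that the .m file needs to import the Xcode-generated interface header (ProductName-Swift.h); the bridging header only exposes Objective-C to Swift, not the other way around:

import Foundation

// Exposing a Swift class to Objective-C: subclass NSObject and mark the
// class @objcMembers so its members appear in the generated header.
@objcMembers class MySwiftClass: NSObject {
    func myMethod() -> String {
        let output = "hello from Swift"
        return output
    }
}

// On the Objective-C side (shown as a comment because it is not Swift):
//   #import "MyApp-Swift.h"   // "MyApp" is a hypothetical product name
//   MySwiftClass *swiftObject = [[MySwiftClass alloc] init];
//   NSLog(@"%@", [swiftObject myMethod]);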
📱 [iOS 26.1 beta 2] allowCamera restriction not working properly on both supervised and BYOD devices
Details:
Device: iPhone 12 Pro Max
System: iOS 26.1 beta 2
Issue Description: When testing MDM device restriction capabilities on iOS 26.1 beta 2, I found that the allowCamera restriction does not work as expected.
Observed Behavior:
• On a BYOD device: When allowCamera is set to false, the Camera and FaceTime apps disappear from the Home Screen, as expected. However, third-party apps (such as WeChat) can still access the camera and take photos.
• On earlier versions (e.g. iOS 26.0.1): Setting allowCamera to false correctly blocks all apps, including third-party apps, from accessing the camera.
Initially, I assumed Apple might have changed this restriction behavior so that allowCamera only applies to supervised devices. However, after testing on supervised devices, I found that even there, when allowCamera is set to false, the Camera and FaceTime apps are hidden, but third-party apps can still use the camera. This indicates that the restriction is not functioning correctly in iOS 26.1 beta 2.
Expectation: Whe
Topic:
Business & Education
SubTopic:
Device Management
Subject: iOS 26 WKWebView: Remote Pages Become Unresponsive After Loading Local HTML Files
Description: We're experiencing a critical issue with WKWebView in a React Native 0.64.3 application where remote web pages become completely unresponsive after loading local HTML files on iOS 26. It worked correctly before iOS 26.
Environment:
React Native 0.64.3
iOS 26.0
Xcode 26.0.1
Using custom WKWebView implementations in native modules
Problem Details:
The app loads local HTML files using loadFileURL:allowingReadAccessToURL:
Later, when loading remote pages via loadRequest:, the remote pages load successfully but become unresponsive to user interactions
This occurs even when using different WKWebView instances
The issue is reproducible 100% of the time once a local file has been loaded
Restarting the app and loading remote pages directly works fine
Code Example:
// Loading local file (works fine)
[self.webView loadFileURL:localFileURL allowingReadAccessToURL:accessURL];
// Later, loading remote page (loads but becomes unrespon
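For anyone trying to reproduce this outside React Native, here is a minimal sketch of the two-step loading sequence described above, written in Swift rather than the Objective-C in the post; the file name and remote URL are placeholders, and this only reproduces the sequence, it is not a fix:

import UIKit
import WebKit

final class WebContainerViewController: UIViewController {
    private let webView = WKWebView(frame: .zero, configuration: WKWebViewConfiguration())

    override func viewDidLoad() {
        super.viewDidLoad()
        webView.frame = view.bounds
        webView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(webView)

        // Step 1: load a local HTML file (placeholder name), granting read
        // access to its containing directory.
        if let localURL = Bundle.main.url(forResource: "index", withExtension: "html") {
            webView.loadFileURL(localURL, allowingReadAccessTo: localURL.deletingLastPathComponent())
        }
    }

    // Step 2: later, load a remote page. Per the report, on iOS 26 this page
    // renders but stops responding to touches once step 1 has happened.
    func loadRemotePage() {
        if let remoteURL = URL(string: "https://example.com") {
            webView.load(URLRequest(url: remoteURL))
        }
    }
}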
My implementation of LockedCameraCapture does not launch my app when tapped from the locked screen. But when the same widget is in the Control Center, it launches the app successfully. Standard Xcode target template:

Lock_Screen_Capture.swift:
@main
struct Lock_Screen_Capture: LockedCameraCaptureExtension {
    var body: some LockedCameraCaptureExtensionScene {
        LockedCameraCaptureUIScene { session in
            Lock_Screen_CaptureViewFinder(session: session)
        }
    }
}

Lock_Screen_CaptureViewFinder.swift:
import SwiftUI
import UIKit
import UniformTypeIdentifiers
import LockedCameraCapture

struct Lock_Screen_CaptureViewFinder: UIViewControllerRepresentable {
    let session: LockedCameraCaptureSession
    var sourceType: UIImagePickerController.SourceType = .camera

    init(session: LockedCameraCaptureSession) {
        self.session = session
    }

    func makeUIViewController(context: Self.Context) -> UIImagePickerController {
        let imagePicker = UIImagePickerController()
        imagePicker.sourceType = sourceType
        imagePicker.mediaTypes = [UTType.image.identifier, U
Thanks for the post. I see you filed a bug and provided some code in it, thanks for that, but may I ask you to upload a focused project showing the issue? If so, please share a link to your test project, or better, upload it to the FB issue. That'll help us better understand what's going on. If you're not familiar with preparing a test project, take a look at Creating a test project. In Feedback Assistant you can track whether the report is still being investigated, has a potential identifiable fix, or has been resolved in another way. The status appears beside the label Resolution. We're unable to share any updates on specific reports on the forums. For more details on when you'll see updates to your report, please see What to expect after submission. Albert Pascual
Worldwide Developer Relations.
Topic:
UI Frameworks
SubTopic:
SwiftUI
Tags:
Thanks so much for the comment; adding a post here to ask you to give engineering some time to investigate FB20605681. You can see the status of your feedback in Feedback Assistant. There, you can track whether the report is still being investigated, has a potential identifiable fix, or has been resolved in another way. The status appears beside the label Resolution. We're unable to share any updates on specific reports on the forums. For more details on when you'll see updates to your report, please see What to expect after submission. Albert Pascual
Worldwide Developer Relations.
Topic:
UI Frameworks
SubTopic:
SwiftUI
Tags:
I admit I am doing something unusual, and I would not be surprised if it didn't work. I am surprised, however, because after performing the equivalent operations on four bundles, all of the bundles work fine on macOS 15.6.1, but only two of them work on macOS 26.1 (beta 2). I don't know what causes the different outcomes.

What I am trying to do is get Java to pass the macOS 26 AppKit UI SDK linkage checking without having to rebuild the JDK using Xcode 26. Rebuilding works for the latest SDK, but it is very inconvenient and may not work for older JDKs. It usually takes a while before the JDK build team successfully transitions to a new Xcode release.

My approach is to use vtool to update the sdk version in the LC_BUILD_VERSION load command of $JAVA_HOME/bin/java, which is the launching executable for the JDK. I performed this operation on four JDKs: 25, 21, 17, and 11. (I ran vtool on macOS 15.) It was completely successful on JDK 25 and 21. The JDK launches correctly on macOS 15 and macOS 26. On macOS 26, Ap
Topic:
Developer Tools & Services
SubTopic:
General
Tags:
macOS
Linker
Gatekeeper
Signing Certificates
Thanks a lot! "...adding external-accessory is not changing ANYTHING about how your app wakes/sleeps in the background. You will not be allowed to use it, but that's because it's not relevant to your app." In my tests two weeks ago I (possibly through my own mistake) saw a correspondence between external-accessory and the occurrence of callbacks from IOServiceAddMatchingNotification, or at least a correlation with the moments of such callbacks as the application moves from background to foreground. Given your unambiguous statement, I tediously rechecked my environment now and found two possible weak points: A) I conducted the tests above with 2 identical DExts (for 2 different but very similar UserClient apps) formally enabled in Settings | Applications | Drivers, though, of course, only one UserClient was active. I shall repeat the test (in the coming weeks) with a single DExt physically installed. Another Q&A in TestFlight was conducted in a distant country; I shall reconfirm the environment there. B) To distinguish t
Topic:
App & System Services
SubTopic:
Drivers
Tags:
Yeah, I see the following error when trying to prompt the on-device foundation model from within a device activity report extension (DeviceActivityReportExtension), which I believe is the same as what you mentioned:
establishment of session failed with Unrecognized underlying error: Underlying connection was invalidated. Reason: failed at lookup with error 159 - Sandbox restriction
ModelManagerError: got unrecognized error Underlying connection was invalidated. Reason: failed at lookup with error 159 - Sandbox restriction
I am pretty sure that is as designed. To give you more context: when responding to your prompt, the model needs to communicate with the model manager, which is a system daemon that manages inference requests. To protect the user's privacy, however, the system doesn't allow a device activity report extension to move data outside the extension's address space, and that prevents the model from creating a connection to the model manager. Having said that, if you have a compelling case
Topic:
Machine Learning & AI
SubTopic:
Foundation Models
Tags:
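To make the failing call discussed above concrete, here is a minimal sketch of prompting the on-device model with the Foundation Models framework; the prompt text is a placeholder, and per the explanation above this same call is expected to fail with the error 159 sandbox restriction when issued from inside a DeviceActivityReportExtension:

import FoundationModels

// Minimal prompt to the on-device foundation model. In an ordinary app context
// this returns text; inside a device activity report extension the session
// cannot reach the model manager daemon, so it fails as described above.
func summarize(_ text: String) async {
    let session = LanguageModelSession()
    do {
        let response = try await session.respond(to: "Summarize in one sentence: \(text)")
        print(response.content)
    } catch {
        // Expected failure path inside a DeviceActivityReportExtension
        // (failed at lookup with error 159 - Sandbox restriction).
        print("Model request failed:", error)
    }
}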