Hello, I am unable to submit ANY of my apps for external TestFlight review. Every time I add a build to an external group and press Submit for Review, I immediately get this error: "There was an error processing your request. Please try again later." This happens on multiple apps in the same account: LM Mobile (Apple ID: 6755979316) and TPM Mobile (a newly created app). All of the following have been verified and are active: Paid & Free App Agreements; Banking & Tax forms; App Privacy (published); Test Information fully completed; multiple clean builds uploaded (1.0.0, builds 1 and 2); correct distribution via App Store Connect selected in Xcode; Admin + Account Holder role. This strongly suggests the problem is account-level on the backend. Case number: 102766190091. Could an Apple engineer please check and reset the stuck external TestFlight submission state on the server side? Thank you.
Search results for "Apple Maps Guides" (151,844 results found)
A month ago, a company filed a false complaint against one of my apps. I contacted Apple, explained that the claim was untrue, and uploaded videos and screenshots proving that my account and apps comply with Apple's policies and that the complaint was baseless. Despite this, I was surprised to find all my apps removed and my account flagged for deletion in the following days. What should I do? I've been wronged, and I've submitted numerous complaints and contacted technical support by phone and email, but so far I haven't received any response or attention.
It's been a week waiting for the review. Despite requesting an expedited review and contacting technical support via email, there has been no change so far. What can I do? Apple ID: 6756184295
Hi Ed, Thank you for confirming that the native animation APIs are effectively non-interruptible. That validation saves us from chasing a magic configuration that doesn't exist. To share a bit about our use case: the user must be able to interrupt the route preview immediately (e.g., to pan or zoom while the camera is mid-move, say during 10 seconds of zooming). Regarding the performance spikes I mentioned, we have already implemented several optimizations to mitigate the 120Hz/8.33ms constraint you highlighted: Capping the frame rate: we set preferredFrameRateRange to target 30-45 FPS (avoiding the 120Hz ProMotion tax) to reduce the workload, and we adjust it depending on the device spec and how close or far the zoom is. Visual thresholds: we skip rendering frames during CADisplayLink ticks when the change falls below a threshold. Even with these throttles in place, simply calling setVisibleMapRect(..., animated: false) 30 times a second drives CPU usage to 80-100% on the simulator, likely due to the heavy vector tile re-
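For reference, the frame-rate capping described above looks roughly like this (a minimal sketch; the `MapAnimator` type and its method names are hypothetical, not from the actual project):

```swift
import UIKit

final class MapAnimator {
    private var link: CADisplayLink?

    func start() {
        let link = CADisplayLink(target: self, selector: #selector(step(_:)))
        // Ask for 30-45 FPS instead of the full 120 Hz on ProMotion displays.
        link.preferredFrameRateRange = CAFrameRateRange(minimum: 30, maximum: 45, preferred: 30)
        link.add(to: .main, forMode: .common)
        self.link = link
    }

    /// Interruptible: call this from a pan/zoom gesture recognizer.
    func cancel() {
        link?.invalidate()
        link = nil
    }

    @objc private func step(_ link: CADisplayLink) {
        // Skip the frame if the camera delta is below a visual threshold;
        // otherwise update the map camera here with animated: false.
    }
}
```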
Topic: UI Frameworks · SubTopic: UIKit
Thank you for your response and for taking the time to read the Technical Note. As you've now seen, Universal Links themselves behave consistently; when they appear to deviate from expectations, several configuration and logic factors can be at play. When issues involve apple-app-site-association files and associated domains, particularly when they work on the first attempt but fail on subsequent attempts in the same tab, consider the following: JavaScript logic errors: although you mentioned that the code execution is identical, race conditions or state mismanagement in your JavaScript could lead to different outcomes on successive runs; please check for such possibilities. Error handling and logs: strengthen error handling around the associated-domain validation logic and add comprehensive logging to capture any discrepancies during the second invocation. By systematically examining these areas, you should be able to
Topic: Safari & Web · SubTopic: General
There is a good example of animating a map using MKMapCamera in our Look Around sample code project. Using the map camera gives you better control when writing your own animation sequences, whereas the animations you get by setting the animated parameter of setVisibleMapRect to true are decided by the system. However, as you found, those animations are effectively non-interruptible in practice due to how MapKit manages animations, regardless of whether you use the UIView family of animation methods or UIViewPropertyAnimator. Enhancement requests for better integration across the animation APIs are welcomed! I've tried using CADisplayLink to manually animate the camera at the system refresh rate; it works very well, and I can stop the camera movement any time there are touches, but it causes CPU spikes. While you can use CADisplayLink to control rendering of any UIView on a frame-by-frame basis, you have to keep in mind the stringent performance targets you need to hit in order to not introduce
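As a rough illustration of the CADisplayLink approach discussed in this thread (a sketch under assumptions, not the Look Around sample's code; the `CameraFlight` type, duration, and smoothstep easing are all made up for illustration):

```swift
import MapKit

final class CameraFlight {
    private var link: CADisplayLink?
    private var startTime: CFTimeInterval = 0
    private let duration: CFTimeInterval = 2.0
    private let mapView: MKMapView
    private let fromCoordinate: CLLocationCoordinate2D
    private let fromAltitude: CLLocationDistance
    private let to: MKMapCamera

    init(mapView: MKMapView, to: MKMapCamera) {
        self.mapView = mapView
        self.fromCoordinate = mapView.camera.centerCoordinate
        self.fromAltitude = mapView.camera.altitude
        self.to = to
    }

    func start() {
        startTime = CACurrentMediaTime()
        let link = CADisplayLink(target: self, selector: #selector(step(_:)))
        link.add(to: .main, forMode: .common)
        self.link = link
    }

    /// Interruptible: call from a gesture recognizer to stop mid-flight.
    func cancel() {
        link?.invalidate()
        link = nil
    }

    @objc private func step(_ link: CADisplayLink) {
        let t = min(1, (link.targetTimestamp - startTime) / duration)
        let e = t * t * (3 - 2 * t)   // smoothstep easing
        let camera = MKMapCamera()
        camera.centerCoordinate = CLLocationCoordinate2D(
            latitude: fromCoordinate.latitude + (to.centerCoordinate.latitude - fromCoordinate.latitude) * e,
            longitude: fromCoordinate.longitude + (to.centerCoordinate.longitude - fromCoordinate.longitude) * e)
        camera.altitude = fromAltitude + (to.altitude - fromAltitude) * e
        // One non-animated step per display-link tick.
        mapView.setCamera(camera, animated: false)
        if t >= 1 { cancel() }
    }
}
```

This is exactly the pattern where the per-frame cost matters: each `setCamera(_:animated:)` call must complete well inside the frame budget, which motivates the frame-rate capping discussed elsewhere in the thread.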
Topic: UI Frameworks · SubTopic: UIKit
I need to create a Mac application using Objective-C. The application has to use PHPickerViewController to give the user a familiar interface for picking photos. Here is the Objective-C code used to present the photo picker:

// ViewController.h
#import <Cocoa/Cocoa.h>
#import <PhotosUI/PhotosUI.h>

@interface ViewController : NSViewController <PHPickerViewControllerDelegate>
@property (nonatomic, weak) IBOutlet NSImageView *myImageView;
@end

// ViewController.m
@implementation ViewController {
    PHPickerViewController *pickerViewController;
}

- (void)pickPhotos {
    PHPickerConfiguration *config = [[PHPickerConfiguration alloc] init];
    config.selectionLimit = 0;                      // Allow multiple selections
    config.filter = [PHPickerFilter imagesFilter];  // Filter for images
    pickerViewController = [[PHPickerViewController alloc] initWithConfiguration:config];
    pickerViewController.preferredContentSize = NSMakeSize(800, 600);
    pickerViewController.delegate = self;           // Set the delegate to handle selection
    [self presentViewControllerAsModalWindow:pickerViewController];
}

- (IBAction)clicked:(id)sender {
    NSLo
Topic: UI Frameworks · SubTopic: AppKit
Have you tried just uploading the app again with a new build number so it gets reviewed again? In my experience, this problem occurs every now and then in Apple's test environment, without anybody really knowing why, and the reviewers don't really investigate it. On the next review it will probably work. At least that has been my experience over the years.
Topic: App & System Services · SubTopic: StoreKit
Hi, this is a series of questions for the Apple developers, and also for anyone who would like to speculate. How are they able to get trees marked on the in-app map? And how come the trees are fairly, but not completely, accurately marked?
Topic: App & System Services · SubTopic: Maps & Location
Our engineering teams need to investigate this issue, as resolution may involve changes to Apple's software. Please file a bug report, include a small Xcode project and some directions that can be used to reproduce the problem, and post the Feedback number here once you do. If you post the Feedback number here I'll check the status next time I do a sweep of forums posts where I've suggested bug reports. Bug Reporting: How and Why? has tips on creating your bug report.
Topic: Safari & Web · SubTopic: General
Hello, I recently had an unusual experience, and I'm wondering if it's related to Apple's policies, so I wanted to ask. It's not really about policy as such; it's actually about how the system handles notifications generally. While a call is in Picture in Picture (PiP) mode, notification pushes from the same app do not appear. The API is being triggered, but the notification banner does not show on the device. The issue here isn't PiP; it's the fact that the app is awake and considered active. When an app is active, the system does not present notifications directly. Instead, it delivers them to the app through userNotificationCenter(_:willPresent:withCompletionHandler:), and the app then controls whether or not the notification is presented. Note that this is the same reason notifications don't necessarily appear when the app is foregrounded; PiP is just a slightly different variation of active/foreground. Once PiP is closed, the notifications start appearing again.
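Concretely, showing a banner even while the app is active (including during PiP) means implementing that delegate method, along these lines (a minimal sketch; where you store the delegate instance is up to your app):

```swift
import UserNotifications

final class NotificationDelegate: NSObject, UNUserNotificationCenterDelegate {
    // Called when a notification arrives while the app is active.
    func userNotificationCenter(_ center: UNUserNotificationCenter,
                                willPresent notification: UNNotification,
                                withCompletionHandler completionHandler:
                                    @escaping (UNNotificationPresentationOptions) -> Void) {
        // Ask the system to show the banner anyway, rather than suppressing it.
        completionHandler([.banner, .sound])
    }
}

// Early in app launch, e.g. application(_:didFinishLaunchingWithOptions:):
// UNUserNotificationCenter.current().delegate = notificationDelegate
```

Note the delegate must be set before any notification is delivered, and the instance must be kept alive (the `delegate` property is weak).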
Topic: App & System Services · SubTopic: Notifications
Hi everyone, I'm building a task management app that layers on top of EventKit/Reminders, and I also moderate /r/AppleReminders. I see confusion around the semantics of dates on both the developer side and the user side. I'm trying to map the standard productivity mental model onto the EKReminder implementation and hitting some walls. In productivity contexts, a task tends to have three distinct dates: Start date: when the task becomes actionable; don't alert the user before this date. Notification: when the device should buzz/ping the user, meaning they can get started on the task. Due date: the hard deadline; if the system works well, tasks should rarely go past it, since productivity systems are about meeting deadlines rather than missing them. The EventKit reality: here is what I'm seeing in practice, and I'm hoping someone can correct me if I'm wrong. Field: startDateComponents. Docs say: start date of the task. In practice (Reminders app): seemingly unused? I can set it
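For anyone experimenting with this, setting the fields in question looks like the following (a sketch: the title and dates are placeholders, and `requestFullAccessToReminders` assumes the iOS 17+ authorization API):

```swift
import EventKit

let store = EKEventStore()
store.requestFullAccessToReminders { granted, error in
    guard granted else { return }

    let reminder = EKReminder(eventStore: store)
    reminder.calendar = store.defaultCalendarForNewReminders()
    reminder.title = "Draft the report"

    // The three dates from the mental model above:
    reminder.startDateComponents = DateComponents(year: 2025, month: 6, day: 1)  // actionable from
    reminder.dueDateComponents   = DateComponents(year: 2025, month: 6, day: 5)  // hard deadline
    if let alarmDate = Calendar.current.date(
        from: DateComponents(year: 2025, month: 6, day: 3, hour: 9)) {
        reminder.addAlarm(EKAlarm(absoluteDate: alarmDate))                      // notification time
    }

    try? store.save(reminder, commit: true)
}
```

The interesting part is then observing which of these three values the built-in Reminders app actually surfaces and acts on.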
As you’ve noticed, Swift Playground currently doesn’t support the iOS 26 SDK. We understand that there’s strong demand for that, but I can’t offer any advice as to when it’ll happen. If you need to use iOS 26 SDK features right now, your only real option is to create your playground with Xcode. Make sure to create an app playground, as explained on Developer > Swift Student Challenge > Get ready. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
Topic: Community · SubTopic: Swift Student Challenge
This came up in a different context, and I wanted to post the associated bug number (FB21271237) here, just for the record. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
Topic: App & System Services · SubTopic: Networking
I love a question with an easy answer! No. Currently the Foundation Models framework only has access to the on-device model; PCC is never used. Ever. Inference is entirely on-device. For an added layer of privacy and security assurance, Apple does not log anything about the inference other than counting that your app called the model (since it's a shared operating system resource). Apple does not log input or output, so Apple has no access to your prompts, inputs, or results.
Topic: Machine Learning & AI · SubTopic: Foundation Models