Hello everyone, I’m a new developer still learning as I go. I’m building a simple watchOS app that tracks Apple Watch battery consumption, records hourly usage data, and uses that information to predict battery life in hours. I’ve run into an issue where background refresh completely stalls after charging and never recovers, regardless of what I do. The only way to restore normal behavior is to restart the watch. Background refresh can work fine for days, but if the watch is charging and a scheduled background refresh tries to run during that period, it appears to be deferred, and then remains in that deferred state indefinitely. Even reopening the app or scheduling new refreshes doesn’t recover it. Has anyone else encountered this behavior? Is there a reliable workaround? I’ve seen a few reports suggesting that there may be a regression in scheduleBackgroundRefresh() on watchOS 26, where tasks are never delivered after certain states. Any insights or confirmations would be greatly appreciated. Thank you!
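For reference, a minimal sketch of the usual schedule-and-handle chain on watchOS (assuming a WKApplicationDelegate-based setup; BatteryLogger and scheduleNextRefresh are hypothetical placeholders for the app's own code):

import WatchKit

final class ExtensionDelegate: NSObject, WKApplicationDelegate {

    // Ask the system to wake the app roughly an hour from now.
    // Delivery timing is ultimately at the system's discretion.
    func scheduleNextRefresh() {
        WKApplication.shared().scheduleBackgroundRefresh(
            withPreferredDate: Date().addingTimeInterval(60 * 60),
            userInfo: nil
        ) { error in
            if let error {
                print("Failed to schedule background refresh: \(error)")
            }
        }
    }

    func handle(_ backgroundTasks: Set<WKRefreshBackgroundTask>) {
        for task in backgroundTasks {
            if let refreshTask = task as? WKApplicationRefreshBackgroundTask {
                // Record a battery sample here (BatteryLogger is a hypothetical helper).
                // BatteryLogger.shared.recordSample()
                scheduleNextRefresh() // keep the refresh chain alive
                refreshTask.setTaskCompletedWithSnapshot(false)
            } else {
                task.setTaskCompletedWithSnapshot(false)
            }
        }
    }
}

In this pattern each delivered task immediately schedules the next one, which is the chain that reportedly breaks after charging.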
"Once, the App Store empowered indie creators - small teams could launch great apps"

Nothing has changed in that respect. There are more opportunities now for people to build low-quality apps by using low-quality, cross-platform frameworks and tools. But people are still able to launch great apps with discipline, intelligence, and hard work.

"and find users easily"

This part has changed. There are lots more apps to choose from, a radically different marketplace, and different user expectations. It now takes more work and more time to find users.

"But in 2025, that era feels long gone, with algorithms, big players, and pay-to-win discoverability reshaping everything."

Many things have changed over the past 17 years.

"Is there still a real chance for indie iOS developers today, or has Apple’s evolving ecosystem made independent success nearly impossible?"

Many people are having success. They're just focused on their products and providing a great experience for their customers. Posting content on social media
Topic:
App Store Distribution & Marketing
SubTopic:
App Store Connect
Tags:
Hi everyone, I’m working on an iOS MusicKit app that overlays a metronome on top of Apple Music playback. To line the clicks up perfectly I’d like access to low-level audio analysis data (ideally a waveform / spectrogram or beat grid) while the track is playing. I’ve noticed that several approved DJ apps (e.g. djay, Serato, rekordbox) can already:
• Display detailed scrolling waveforms of Apple Music songs
• Scratch, loop or time-stretch those tracks in real time
That implies they receive decoded PCM frames, or at least high-resolution analysis data, from Apple Music under a special entitlement. My questions:
1. Does MusicKit (or any public framework) expose real-time audio buffers, FFT bins, or beat markers for streaming Apple Music content?
2. If not, is there an Apple program or entitlement that developers can apply for, similar to the “DJ with Apple Music” initiative, to gain that deeper access?
3. Where can I find official documentation or a point of contact?
Topic:
Media Technologies
SubTopic:
Audio
I use ARKit for motion tracking. I get the skeleton joint coordinates and use them for animation. I didn't change any code, but after updating from iOS 18 to iOS 26, modelTransform now always returns nil. https://developer.apple.com/documentation/arkit/arskeleton3d/modeltransform(for:) For example: bodyAnchor.skeleton.modelTransform(for: .init(rawValue: head_joint)) where bodyAnchor is an ARBodyAnchor. I see the default skeleton on the screen, but I can no longer get the coordinates out of it. I'm using an example from Apple's WWDC presentation. https://developer.apple.com/documentation/arkit/capturing-body-motion-in-3d Has anything changed in the API, or is this just a bug?
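For context, a minimal sketch of how the joint transform is read (assuming an ARSession running ARBodyTrackingConfiguration, as in Apple's body-tracking sample; the typed .head joint name is used instead of a raw string):

import ARKit

final class BodyTrackingDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            if let headTransform = bodyAnchor.skeleton.modelTransform(for: .head) {
                // Combine with the anchor's transform to get a world-space position.
                let worldTransform = bodyAnchor.transform * headTransform
                print("Head position:", worldTransform.columns.3)
            } else {
                // On iOS 26 this branch reportedly runs even though the skeleton still renders.
                print("modelTransform(for: .head) returned nil")
            }
        }
    }
}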
I'm trying to understand how UIDevice.current.identifierForVendor behaves when an iOS app is restored via iCloud onto a different physical device. Context I'm building an app that needs to detect whether it’s running on a newly restored device (for example, after the user transfers their iPhone via iCloud setup). To do this, I save the value of UIDevice.current.identifierForVendor?.uuidString in persistent storage (e.g., UserDefaults). The question If I install my app on Device A, store the identifierForVendor value, back up the device to iCloud, and then restore that backup onto Device B, will the restored app see the same identifierForVendor value, or a new one? More specifically: Does iCloud backup/restore preserve the underlying “vendor” ID across devices? Is the identifierForVendor tied only to the bundle identifier and vendor prefix, or also to the physical device hardware? If the user deletes all apps from the same vendor, then restores them from iCloud, is the ID reset? What I’ve found so far Apple’s
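A sketch of the detection approach described above: store identifierForVendor on first launch and compare it on later launches. Whether the stored value survives an iCloud restore onto different hardware is exactly the open question; "storedVendorID" is a hypothetical UserDefaults key.

import UIKit

enum RestoreDetector {
    private static let key = "storedVendorID"

    static func checkForRestore() {
        // identifierForVendor can briefly be nil (e.g. right after boot), so bail out safely.
        guard let currentID = UIDevice.current.identifierForVendor?.uuidString else { return }

        let defaults = UserDefaults.standard
        if let storedID = defaults.string(forKey: key) {
            if storedID != currentID {
                // The vendor ID changed: the app may have been restored onto another device,
                // or every app from this vendor was removed and reinstalled.
                print("Vendor ID changed; treating this as a restored install")
                defaults.set(currentID, forKey: key)
            }
        } else {
            defaults.set(currentID, forKey: key)
        }
    }
}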
Hi, I just received the same email from Apple. I just ran a promotional campaign that might have caused a spike in downloads, but I'm not sure how to respond to Apple, as I fear that if I write anything wrong, Apple could just terminate the account completely. Could you share what exactly you've written? And has there been any action from Apple's end?
Topic:
App Store Distribution & Marketing
SubTopic:
General
Tags:
Thanks a lot for your super-fast advice! The quality of my results has improved. I just need a bit more information about when it is best practice to include a description for a @Guide attribute:

@Generable
struct SearchSuggestions {
    @Guide(description: "A list of suggested search terms", .count(4))
    var searchTerms: [String]
}

Listening to the WWDC25 session "Deep dive into Foundation Generation Models" from 9:00 to 9:15:

"Foundation Models will even automatically include details about your Generable type in the prompt, in a specific format that the model has been trained on. You don’t have to tell it about what fields your Generable type has."

I read that as: with a self-descriptive name like SearchSuggestions, writing a description is extra work because the model may already infer what the content should be. Am I right? If so, I'd like to go deeper and learn what should be set in a description. Some examples would be the best way to establish clear best practices 😀
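As a rough illustration (a sketch built on the snippet above, not official guidance), a description seems most worth writing when it encodes constraints that the property name alone cannot express:

import FoundationModels

@Generable
struct SearchSuggestions {
    // The name "searchTerms" already tells the model what to generate, so the
    // description is used here to add constraints the name cannot convey:
    // tone and length, alongside the .count(4) guide.
    @Guide(description: "Short, family-friendly search terms of at most three words", .count(4))
    var searchTerms: [String]
}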
Topic:
Machine Learning & AI
SubTopic:
Foundation Models
Hi everyone! I’m trying to implement a cinematic video shooting mode. While following the guide “Capture cinematic video in your app”, I added the following code to obtain metadata information: metadataOutput.metadataObjectTypes = metadataOutput.requiredMetadataObjectTypesForCinematicVideoCapture However, the following error appears in the console: *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: 'When Cinematic Video is enabled, AVCaptureMetadataOutput's metadataObjectTypes must match requiredMetadataObjectTypesForCinematicVideoCapture.' How can I resolve this issue?
@T.P: Apple has not responded to my ticket. For iOS 26.0, I've worked around the issue as follows:

Original Code

import MapKit
import SwiftUI

struct Demo: View {
    var body: some View {
        NavigationStack {
            Map()
                .navigationTitle(Text("Choose Location"))
                .navigationBarTitleDisplayMode(.inline)
        }
    }
}

Updated Code

import MapKit
import SwiftUI

struct Demo: View {
    var body: some View {
        NavigationStack {
            Map()
                .navigationTitle(Text("Choose Location"))
                .navigationBarTitleDisplayMode(.inline)
                .preserveNavigationBarViaSafeAreaInset()
        }
    }
}

private extension View {
    // HACK: Work around https://developer.apple.com/forums//thread/794351
    func preserveNavigationBarViaSafeAreaInset() -> some View {
        Color.clear
            .safeAreaInset(edge: .bottom) { self }
    }
}

[Before and After screenshots]
Topic:
UI Frameworks
SubTopic:
SwiftUI
Tags:
Once, the App Store empowered indie creators - small teams could launch great apps and find users easily. But in 2025, that era feels long gone, with algorithms, big players, and pay-to-win discoverability reshaping everything. Is there still a real chance for indie iOS developers today, or has Apple’s evolving ecosystem made independent success nearly impossible?
Topic:
App Store Distribution & Marketing
SubTopic:
App Store Connect
Tags:
App Store
App Review
App Store Connect
It seems Apple’s latest ASO update has shifted focus from app titles to overall metadata relevance and user engagement signals.
Topic:
App Store Distribution & Marketing
SubTopic:
App Store Connect
Tags:
In-App Subscriptions Not Fetching in Sandbox or Production (expo-iap / React Native / Bare Workflow)
Hi everyone, I’m encountering an issue with my in-app subscriptions setup. When I test using the StoreKit configuration file in Xcode, everything works correctly — the subscriptions are fetched and I can simulate purchases without any issues. However, when I switch to the Sandbox or Production environment, my app fails to fetch the available products from Apple’s servers. The call to fetchProducts (from the expo-iap library) returns an empty array. Here’s some context about my setup: Framework: React Native (Expo Bare Workflow) Library: expo-iap Products: Auto-renewable subscriptions StoreKit Configuration: Synced with App Store Connect Status: Subscription Plans are approved in App Store Connect I’ve verified the following: The product identifiers in code match exactly with those in App Store Connect. The app is signed with the correct bundle ID. I’m testing with a Sandbox account (logged in via Settings -> Developer -> Sandbox Tester Account). Despite this, the response from Apple’s
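One way to narrow this down (a diagnostic sketch in native StoreKit 2, not the expo-iap API; "com.example.monthly" is a placeholder identifier): if this also returns an empty array in Sandbox, the problem is more likely App Store Connect configuration than the React Native layer.

import StoreKit

func verifyProductsLoad() async {
    do {
        // Fetch the same identifiers the JS layer requests, directly from StoreKit 2.
        let products = try await Product.products(for: ["com.example.monthly"])
        print("Fetched \(products.count) products:", products.map(\.id))
    } catch {
        print("Product fetch failed:", error)
    }
}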
Topic:
App & System Services
SubTopic:
StoreKit
Tags:
Subscriptions
StoreKit
App Store Connect
In-App Purchase
📱 [iOS 26.1 beta 2] allowCamera restriction not working properly on both supervised and BYOD devices
Details:
Device: iPhone 12 Pro Max
System: iOS 26.1 beta 2

Issue Description:
When testing MDM device restriction capabilities on iOS 26.1 beta 2, I found that the allowCamera restriction does not work as expected.

Observed Behavior:
• On a BYOD device: When allowCamera is set to false, the Camera and FaceTime apps disappear from the Home Screen, as expected. However, third-party apps (such as WeChat) can still access the camera and take photos.
• On earlier versions (e.g. iOS 26.0.1): Setting allowCamera to false correctly blocks all apps, including third-party apps, from accessing the camera.

Initially, I assumed Apple might have changed this restriction behavior so that allowCamera only applies to supervised devices. However, after testing on supervised devices, I found that even there, when allowCamera is set to false, the Camera and FaceTime apps are hidden, but third-party apps can still use the camera. This indicates that the restriction is not functioning correctly in iOS 26.1 beta 2.

Expecta
Topic:
App & System Services
SubTopic:
General
Hello, I recently watched the WWDC2025 session titled “Combine Metal 4 machine learning and graphics” (https://developer.apple.com/videos/play/wwdc2025/262/), and I’m very excited about the new Metal 4 features that integrate machine learning with graphics, such as neural ambient occlusion, shader-based ML inference, and the use of MTLTensor and MTL4MachineLearningCommandEncoder. While the session includes helpful code snippets and a compelling debug demo (e.g., the neural ambient occlusion example), the implementation details are not fully shown, and I haven’t been able to find a complete, runnable sample project that demonstrates end-to-end integration of ML and rendering in Metal 4. Would Apple be able to provide a full, working example (such as an Xcode project) that shows how to:
• Export a model to an .mlpackage,
• Convert it to an .mtlpackage,
• Use MTL4MachineLearningCommandEncoder alongside render passes,
• Or embed small neural networks directly in shaders using Shader ML?
Having such a sample would be incredibly helpful.