Search results for “Apple Maps Guides”
149,546 results found


In-App Subscriptions Not Fetching in Sandbox or Production (expo-iap / React Native / Bare Workflow)
Hi everyone, I’m encountering an issue with my in-app subscription setup. When I test using the StoreKit configuration file in Xcode, everything works correctly — the subscriptions are fetched and I can simulate purchases without any issues. However, when I switch to the Sandbox or Production environment, my app fails to fetch the available products from Apple’s servers: the call to fetchProducts (from the expo-iap library) returns an empty array. Here’s some context about my setup:
Framework: React Native (Expo Bare Workflow)
Library: expo-iap
Products: Auto-renewable subscriptions
StoreKit Configuration: Synced with App Store Connect
Status: Subscription plans are approved in App Store Connect
I’ve verified the following: the product identifiers in code match exactly with those in App Store Connect; the app is signed with the correct bundle ID; and I’m testing with a Sandbox account (logged in via Settings -> Developer -> Sandbox Tester Account). Despite this, the response from Apple’s…
0 replies · 0 boosts · 64 views · 6d
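One way to narrow this down (my suggestion, not part of the original post): bypass the expo-iap layer and query StoreKit 2 directly from a small native probe. If this also returns an empty array in the sandbox, the problem is on the App Store Connect side (agreements, product state) or in signing, rather than in the JS bridge. The product identifier below is a placeholder.

import StoreKit

// Minimal StoreKit 2 probe, independent of expo-iap.
func probeProducts() async {
    do {
        // Placeholder product ID: substitute a real identifier from App Store Connect.
        let products = try await Product.products(for: ["com.example.app.monthly"])
        if products.isEmpty {
            // Empty here too: check the Paid Applications agreement, the bundle ID,
            // and the subscription's state in App Store Connect.
            print("StoreKit returned no products either")
        } else {
            products.forEach { print($0.id, $0.displayPrice) }
        }
    } catch {
        print("Product fetch failed: \(error)")
    }
}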
[iOS 26.1 beta 2] allowCamera restriction not working properly on both supervised and BYOD devices
Details:
Device: iPhone 12 Pro Max
System: iOS 26.1 beta 2
Issue Description: When testing MDM device restriction capabilities on iOS 26.1 beta 2, I found that the allowCamera restriction does not work as expected.
Observed Behavior:
• On a BYOD device: when allowCamera is set to false, the Camera and FaceTime apps disappear from the Home Screen, as expected. However, third-party apps (such as WeChat) can still access the camera and take photos.
• On earlier versions (e.g. iOS 26.0.1): setting allowCamera to false correctly blocks all apps, including third-party apps, from accessing the camera.
Initially, I assumed Apple might have changed this restriction behavior so that allowCamera only applies to supervised devices. However, after testing on supervised devices, I found that even there, when allowCamera is set to false, the Camera and FaceTime apps are hidden, but third-party apps can still use the camera. This indicates that the restriction is not functioning correctly in iOS 26.1 beta 2. Expecta…
0 replies · 0 boosts · 85 views · 6d
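For anyone reproducing the third-party-app test described above, here is a rough Swift probe (my own sketch; names and flow are illustrative): if allowCamera=false were enforced for third-party apps, authorization or capture setup should fail.

import AVFoundation

// Reports whether camera capture is actually reachable from a third-party app.
func probeCameraAccess() {
    AVCaptureDevice.requestAccess(for: .video) { granted in
        guard granted else { return print("Capture blocked at authorization") }
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device) else {
            return print("No usable capture device; restriction appears enforced")
        }
        let session = AVCaptureSession()
        if session.canAddInput(input) { session.addInput(input) }
        // startRunning() blocks, so keep it off the main thread.
        DispatchQueue.global().async {
            session.startRunning()
            print("Capture session running: \(session.isRunning)")  // true means the camera is usable
        }
    }
}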
Are there complete code examples available for “Combine Metal 4 machine learning and graphics”?
Hello, I recently watched the WWDC 2025 session titled “Combine Metal 4 machine learning and graphics” (https://developer.apple.com/videos/play/wwdc2025/262/), and I’m very excited about the new Metal 4 features that integrate machine learning with graphics—such as neural ambient occlusion, shader-based ML inference, and the use of MTLTensor and MTL4MachineLearningCommandEncoder. While the session includes helpful code snippets and a compelling debug demo (e.g., the neural ambient occlusion example), the implementation details are not fully shown, and I haven’t been able to find a complete, runnable sample project that demonstrates end-to-end integration of ML and rendering in Metal 4. Would Apple be able to provide a full, working example—such as an Xcode project—that shows how to:
• Export a model to an .mlpackage,
• Convert it to an .mtlpackage,
• Use MTL4MachineLearningCommandEncoder alongside render passes,
• Or embed small neural networks directly in shaders using Shader ML?
Having such a sample would…
3 replies · 0 boosts · 409 views · 6d
iOS 26 WKWebView: Remote page becomes unresponsive after loading local file
Subject: iOS 26 WKWebView: Remote Pages Become Unresponsive After Loading Local HTML Files
Description: We’re experiencing a critical issue with WKWebView in a React Native 0.64.3 application where remote web pages become completely unresponsive after loading local HTML files on iOS 26. It worked fine before iOS 26.
Environment:
React Native 0.64.3
iOS 26.0
Xcode 26.0.1
Custom WKWebView implementations in native modules
Problem Details:
The app loads local HTML files using loadFileURL:allowingReadAccessToURL:. Later, when loading remote pages via loadRequest:, the remote pages load successfully but become unresponsive to user interactions. This occurs even when using different WKWebView instances. The issue is reproducible 100% of the time once a local file has been loaded. Restarting the app and loading remote pages directly works fine.
Code Example:
// Loading local file (works fine)
[self.webView loadFileURL:localFileURL allowingReadAccessToURL:accessURL];
// Later, loading remote page (loads but becomes unrespon…
2 replies · 0 boosts · 774 views · 6d
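For a bug report, the sequence above translates to a small Swift repro (my sketch; the resource name, remote URL, and timing are placeholders):

import UIKit
import WebKit

final class ReproViewController: UIViewController {
    private var webView: WKWebView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // A dedicated configuration per web view, matching the report that the
        // hang occurs even across different WKWebView instances.
        webView = WKWebView(frame: view.bounds, configuration: WKWebViewConfiguration())
        view.addSubview(webView)

        // Step 1: load a local HTML file (placeholder resource name).
        if let local = Bundle.main.url(forResource: "index", withExtension: "html") {
            webView.loadFileURL(local, allowingReadAccessTo: local.deletingLastPathComponent())
        }

        // Step 2: later, load a remote page. Per the report, on iOS 26 it renders
        // but stops responding to touches once step 1 has happened.
        DispatchQueue.main.asyncAfter(deadline: .now() + 5) { [weak self] in
            self?.webView.load(URLRequest(url: URL(string: "https://example.com")!))
        }
    }
}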
LockedCameraCapture Does Not Launch The App from Lock Screen
My implementation of LockedCameraCapture does not launch my app when tapped from the Lock Screen, but when the same widget is in Control Center, it launches the app successfully. Standard Xcode target template:
Lock_Screen_Capture.swift:
@main
struct Lock_Screen_Capture: LockedCameraCaptureExtension {
    var body: some LockedCameraCaptureExtensionScene {
        LockedCameraCaptureUIScene { session in
            Lock_Screen_CaptureViewFinder(session: session)
        }
    }
}
Lock_Screen_CaptureViewFinder.swift:
import SwiftUI
import UIKit
import UniformTypeIdentifiers
import LockedCameraCapture

struct Lock_Screen_CaptureViewFinder: UIViewControllerRepresentable {
    let session: LockedCameraCaptureSession
    var sourceType: UIImagePickerController.SourceType = .camera

    init(session: LockedCameraCaptureSession) {
        self.session = session
    }

    func makeUIViewController(context: Self.Context) -> UIImagePickerController {
        let imagePicker = UIImagePickerController()
        imagePicker.sourceType = sourceType
        imagePicker.mediaTypes = [UTType.image.identifier, U…
1 reply · 0 boosts · 166 views · 6d
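One detail worth checking (a guess on my part, not a confirmed diagnosis): from the Lock Screen the containing app has to be opened explicitly through the capture session, and the app has to handle the resulting user activity. A minimal sketch of that handoff, called from somewhere in the extension UI:

import Foundation
import LockedCameraCapture

// Asks the session to launch the containing app from the capture extension.
func launchContainingApp(from session: LockedCameraCaptureSession) {
    Task {
        let activity = NSUserActivity(activityType: NSUserActivityTypeLockedCameraCapture)
        do {
            try await session.openApplication(for: activity)
        } catch {
            print("openApplication failed: \(error)")
        }
    }
}

On the app side, the activity can then be picked up with .onContinueUserActivity(NSUserActivityTypeLockedCameraCapture) { ... } in SwiftUI.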
Beginner’s question on learning philosophy.
Hello Everyone! I started programming 6 months ago and started Swift/iOS last month. My learning so far has mainly been with Python. I learned a lot of the package ‘SQLAlchemy’, which has very ‘example based’ documentation: if I wanted to learn how to make a many-to-many relationship, there was a demonstration with code. But going into Swift and Apple frameworks, I notice most of the documentation is definitions of structures, modifiers, functions, etc. I wanted to make the equivalent of Python’s datetime in my Swift app. I found the section in the documentation (“Foundation -> Dates & Times”), but I couldn’t figure out how to use it in my code. I assume my goal should not be to memorize every piece of Swift and Apple functionality to be an app developer, so I would appreciate advice on how to approach this aspect of learning programming.
2 replies · 0 boosts · 415 views · 1w
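On the concrete datetime question above, a short sketch of Foundation's Dates & Times types (the format string and component values are just examples):

import Foundation

// Rough Swift equivalents of Python's datetime.now(), strftime, and timedelta.
let now = Date()                           // datetime.now()

let formatter = DateFormatter()
formatter.dateFormat = "yyyy-MM-dd HH:mm"  // strftime-style pattern
print(formatter.string(from: now))

// Date arithmetic goes through Calendar rather than raw seconds.
let calendar = Calendar.current
if let nextWeek = calendar.date(byAdding: .day, value: 7, to: now) {
    print("One week from now: \(formatter.string(from: nextWeek))")
}

// Extracting components, like datetime's .year / .month / .day.
let parts = calendar.dateComponents([.year, .month, .day], from: now)
print(parts.year ?? 0, parts.month ?? 0, parts.day ?? 0)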
EXC_BREAKPOINT in BKSHIDEventDeliveryManager during background scene invalidation
We're seeing a high-velocity crash among our users during background tasks, inside an internal Apple framework. It seems to have started in iOS 17/18, but has increased heavily on iOS 26+.
EXC_BREAKPOINT
0 BackBoardServices -[BKSHIDEventDeliveryManager _initWithConnectionFactory:forTesting:]
1 BackBoardServices ___44+[BKSHIDEventDeliveryManager sharedInstance]_block_invoke
2 libdispatch.dylib __dispatch_client_callout
3 libdispatch.dylib __dispatch_once_callout
4 BackBoardServices +[BKSHIDEventDeliveryManager sharedInstance]
5 UIKitCore _stateMachineSpec_block_invoke_3
6 UIKitCore _handleEvent
7 UIKitCore -[_UIEventDeferringManager _processEventDeferringActions:actionsCount:inScope:forDeferringToken:environments:target:addingRecreationReason:removingRecreationReason:forReason:]
8 UIKitCore -[_UIEventDeferringManager _sceneWillInvalidate:]
9 UIKitCore -[UIScene _invalidate]
10 UIKitCore -[UIWindowScene _invalidate]
11 UIKitCore -[UIApplication workspace:willDestroyScene:withTransitionContext:completion:]
12 U…
0 replies · 0 boosts · 29 views · 1w
Best Practices for Binary Data (“Allows External Storage”) in Core Data with CloudKit Sync
Hello Apple Team, we’re building a CloudKit-enabled Core Data app and would like clarification on the behavior and performance characteristics of Binary Data attributes with “Allows External Storage” enabled when used with NSPersistentCloudKitContainer. Initially, we tried storing image files manually on disk and saving only the metadata (file URLs, dimensions, etc.) in Core Data. While this approach reduced the size of the Core Data store, it introduced instability after app updates and broke sync between devices. We would prefer to use the official Apple-recommended method and have Core Data manage image storage and CloudKit syncing natively. Specifically, we’d appreciate guidance on the following: when a Binary Data attribute is marked “Allows External Storage”, large image files are stored as separate files on device rather than inline in the SQLite store. How effective is this mechanism at keeping the Core Data store size small on device? Are there any recommended size thresholds or…
2 replies · 0 boosts · 220 views · 1w
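For reference, the switch this post asks about looks like this in a programmatic model (a sketch with made-up entity and attribute names; most apps set the same flag in the .xcdatamodeld inspector instead):

import CoreData

// "Allows External Storage" on a Binary Data attribute: Core Data decides per
// value whether to keep it inline in SQLite or spill it to a separate file.
let imageData = NSAttributeDescription()
imageData.name = "imageData"
imageData.attributeType = .binaryDataAttributeType
imageData.isOptional = true  // CloudKit-backed models require optional or defaulted attributes
imageData.allowsExternalBinaryDataStorage = true

let photo = NSEntityDescription()
photo.name = "Photo"
photo.properties = [imageData]

let model = NSManagedObjectModel()
model.entities = [photo]

// CloudKit-syncing stack; the container name is a placeholder.
let container = NSPersistentCloudKitContainer(name: "Model", managedObjectModel: model)
container.loadPersistentStores { _, error in
    if let error { fatalError("Store failed to load: \(error)") }
}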
Reply to Building macOS apps with Xcode 26 on macOS 26 VM
Yeah! It's working! I have a macOS 26.0 host. In UTM I have a macOS 15.7 guest with a Provisioning UUID that had never been added as a device in the developer portal. In my developer account I added a new device with the guest VM's UUID (it's the longer style, with lowercase hex digits). I then edited my existing manual provisioning profile to include the new device. In Xcode (26.1 beta 2) I changed my app's signing from automatic to manual and referenced the updated provisioning profile. I built the app and copied it to a share I use for the VM. I ran the VM and tried to run the updated app. It failed. I had missed one step that others might also miss: in Xcode, go to Settings > Apple Accounts, select your account, then select your team, and click Download Manual Profiles. Once I did that, I rebuilt the app with the now properly updated provisioning profile and recopied the app to the share. And now the app runs in the guest VM! Key steps for those who are not starting from scratch: You need…
1w
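Side note, in case it helps anyone following these steps: on recent macOS the guest's Provisioning UDID (the long lowercase-hex identifier mentioned above) can also be read from the hardware overview, e.g.:

system_profiler SPHardwareDataType | grep "Provisioning UDID"

(The field name may vary by macOS version; System Information > Hardware shows the same value.)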
Use CCID interface instead of CryptoTokenKit API
Hi, is it possible for a macOS (or iOS/iPadOS) app to communicate with a CCID-compliant reader using the CCID interface (i.e., directly sending the PC_TO_RDR_* messages) instead of using the CryptoTokenKit API? Apple's CCID driver (/System/Library/CryptoTokenKit/usbsmartcardreaderd.slotd) seems to support all the PC_TO_RDR and RDR_TO_PC messages: https://blog.apdu.fr/posts/2023/11/apple-own-ccid-driver-in-sonoma/#enable-my-ccid-driver The background for this question is that we develop smartcard products and we'd like to use the finer-grained settings provided by the CCID specification for testing/demo purposes. Thank you.
1 reply · 0 boosts · 43 views · 1w
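To make the question concrete, this is the byte level the post refers to: a Swift sketch of a PC_to_RDR_XfrBlock message as laid out in the USB CCID specification (the APDU is an arbitrary example; whether macOS exposes a transport for sending it outside CryptoTokenKit is exactly the open question):

import Foundation

// Builds a PC_to_RDR_XfrBlock message (CCID bMessageType 0x6F) wrapping an
// APDU for a given slot and sequence number.
func pcToRdrXfrBlock(apdu: [UInt8], slot: UInt8, sequence: UInt8) -> Data {
    var message = Data()
    message.append(0x6F)                              // bMessageType: PC_to_RDR_XfrBlock
    var length = UInt32(apdu.count).littleEndian
    withUnsafeBytes(of: &length) { message.append(contentsOf: $0) }  // dwLength (little-endian)
    message.append(slot)                              // bSlot
    message.append(sequence)                          // bSeq
    message.append(0x00)                              // bBWI (block waiting time)
    message.append(contentsOf: [0x00, 0x00])          // wLevelParameter
    message.append(contentsOf: apdu)                  // abData: the APDU itself
    return message
}

// Example: a SELECT command APDU.
let message = pcToRdrXfrBlock(apdu: [0x00, 0xA4, 0x04, 0x00, 0x00], slot: 0, sequence: 0)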