Search results for

Building for iOS Simulator, but the linked and embedded framework 'XX.framework' was built for

186,349 results found

Post | Replies | Boosts | Views | Activity

Reply to How can I create a more complex XPCPeerRequirement?
Entitlements and code-signing requirements are very different. See the following for more background on each:

TN3125 Inside Code Signing: Provisioning Profiles
TN3127 Inside Code Signing: Requirements

You can use a code-signing requirement to check for an entitlement, for example:

% codesign --verify -R '=entitlement [com.apple.security.app-sandbox] exists' -v /Applications/Pages.app
…
/Applications/Pages.app: explicit requirement satisfied
% codesign --verify -R '=entitlement [com.apple.security.app-sandbox] exists' -v /usr/bin/true
…
test-requirement: code failed to satisfy specified code requirement(s)

However, entitlements are tricky to use in this situation because:

You can’t create a provisioning profile that authorises a custom requirement.
Many of the popular entitlements are either unrestricted on macOS, or only restricted in that they clear the entitlement-validate flag [1].

Given that, I think maintaining your previous approach makes sense, that is, check for the Team ID and a list of code-signi
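As a minimal sketch of that approach (the Team ID "SKMME9E2Y8" and the listener wiring are hypothetical placeholders), a Team ID requirement can be enforced on an incoming connection via NSXPCConnection’s setCodeSigningRequirement(_:), available on macOS 13 and later:

import Foundation

class ServiceDelegate: NSObject, NSXPCListenerDelegate {
    func listener(_ listener: NSXPCListener, shouldAcceptNewConnection newConnection: NSXPCConnection) -> Bool {
        // Hypothetical Team ID; substitute your own. An invalid requirement
        // string raises an exception, so verify it (e.g. with csreq) during development.
        let requirement = "anchor apple generic and certificate leaf[subject.OU] = \"SKMME9E2Y8\""
        guard #available(macOS 13.0, *) else { return false }
        // Every message on the connection is checked against this requirement.
        newConnection.setCodeSigningRequirement(requirement)
        // …configure exportedInterface / exportedObject here…
        newConnection.resume()
        return true
    }
}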
Topic: Code Signing | SubTopic: Entitlements
1w
Reply to MacOS(Apple Silicon) IOKit driver for FPGA DMA transmission, kernel panic.
If you haven't already, take a look at my post here. I haven't checked the code to confirm, but I strongly suspect that calling gen32IOVMSegments with a mask of the wrong size (like 0xFFFFF000) will cause kIOReturnMessageTooLarge, as the true mask range you've specified (0x00000000FFFFF000) is too large to fit in 32 bits.

Yes, I am currently using the inTaskWithPhysicalMask interface in my code, and the masks I have tried are 0x00000000FFFFFFFFULL, 0x00000000FFFFF000ULL, 0xFFFFFFFF, 0xFFFFF000, and 0x0, but the result is the same: gen32IOVMSegments returns the same error. I have also used IOBufferMemoryDescriptor::withOptions and IOBufferMemoryDescriptor::withCapacity to build memory descriptors, but the results have not changed much. If you have time, could you take a look at the code? I have submitted it with the bug report. One more thing to add: I have also tried using the prepare() interface of IODMACommand, but there have been no significant changes. Thanks!!
Topic: App & System Services | SubTopic: Drivers
1w
Reply to iOS Keychain + Derived Credentials: Technical help needed!
Thanks for bringing this to the Apple Developer Forums. First up, I want to double-check that this is for iOS. You mentioned the “System Keychain”, which is a macOS thing [1]. On iOS there is only one keychain, known as the data protection keychain. Within that keychain, credentials exist within a keychain access group. Your app’s access to keychain access groups is moderated by entitlements, as explained in Sharing access to keychain items among a collection of apps.

Note: For a lot more background on keychain APIs, see:

SecItem: Fundamentals
SecItem: Pitfalls and Best Practices

Next, let’s look at your specific questions:

HSB wrote: “1- Is there an API that allows us to create a signature without us having to pass the private key itself”

No. iOS does have the ability to work with keys where the key material isn’t directly accessible to your app. We use this, for example, to allow keys to be protected by the Secure Enclave and to su
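As a minimal sketch of that model, assuming a Secure Enclave-capable device (the key tag "com.example.derived-key" and the helper name are hypothetical): the app only ever holds an opaque SecKey reference, and the signing operation happens inside the enclave.

import Foundation
import Security

func makeKeyAndSign(message: Data) throws -> Data {
    let attributes: [String: Any] = [
        kSecAttrKeyType as String: kSecAttrKeyTypeECSECPrimeRandom,
        kSecAttrKeySizeInBits as String: 256,
        // The private key is generated in, and never leaves, the Secure Enclave.
        kSecAttrTokenID as String: kSecAttrTokenIDSecureEnclave,
        kSecPrivateKeyAttrs as String: [
            kSecAttrIsPermanent as String: true,
            kSecAttrApplicationTag as String: Data("com.example.derived-key".utf8),  // hypothetical tag
        ],
    ]
    var error: Unmanaged<CFError>?
    guard let privateKey = SecKeyCreateRandomKey(attributes as CFDictionary, &error) else {
        throw error!.takeRetainedValue() as Error
    }
    // The raw private key bytes are never exposed to the app; only the
    // resulting signature comes back.
    guard let signature = SecKeyCreateSignature(privateKey, .ecdsaSignatureMessageX962SHA256, message as CFData, &error) else {
        throw error!.takeRetainedValue() as Error
    }
    return signature as Data
}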
Topic: Privacy & Security | SubTopic: General
1w
How to accept CloudKit shares with the new SwiftUI app lifecycle?
In the iOS 13 world, I had code like this:

class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    func windowScene(_ windowScene: UIWindowScene, userDidAcceptCloudKitShareWith cloudKitShareMetadata: CKShare.Metadata) {
        // do stuff with the metadata, eventually call CKAcceptSharesOperation
    }
}

I am migrating my app to the new SwiftUI app lifecycle and can’t figure out where to put this method. It used to live in AppDelegate pre-iOS 13, and I tried going back to that, but the AppDelegate version never gets called. There doesn’t seem to be a SceneDelegateAdaptor akin to UIApplicationDelegateAdaptor that would provide a bridge to the old code. So I’m lost. How do I accept CloudKit shares with the SwiftUI app lifecycle? 🙈
4 | 0 | 1.4k | 1w
Reply to How to accept CloudKit shares with the new SwiftUI app lifecycle?
CloudKit share acceptance still requires a UIWindowSceneDelegate on iOS. You can try this approach:

Keep the SwiftUI lifecycle:

@main
struct MyApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

Add a scene delegate:

import UIKit
import CloudKit

class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    func windowScene(
        _ windowScene: UIWindowScene,
        userDidAcceptCloudKitShareWith metadata: CKShare.Metadata
    ) {
        acceptShare(with: metadata)
    }
}

Register it in Info.plist (under UIApplicationSceneManifest > UISceneConfigurations > UIWindowSceneSessionRoleApplication), using the module-qualified class name:

UISceneDelegateClassName = $(PRODUCT_MODULE_NAME).SceneDelegate
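The acceptShare(with:) helper above is left undefined; a minimal sketch of it, assuming the share should be accepted in the container named by the metadata, might look like this (perShareResultBlock requires iOS 15 or later):

import CloudKit

func acceptShare(with metadata: CKShare.Metadata) {
    // Use the container the share actually belongs to, not necessarily the default one.
    let container = CKContainer(identifier: metadata.containerIdentifier)
    let operation = CKAcceptSharesOperation(shareMetadatas: [metadata])
    operation.perShareResultBlock = { metadata, result in
        switch result {
        case .success:
            print("Accepted share \(metadata.share.recordID)")
        case .failure(let error):
            print("Failed to accept share: \(error)")
        }
    }
    container.add(operation)
}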
Topic: UI Frameworks | SubTopic: SwiftUI
1w
Reply to iOS UDP Multicast: Receiving works but sending silently fails
I’m generally skeptical of using Network framework for multicasts. It should work in general, but:

There are a bunch of things it can’t do.
And even when it can, you often hit weird edge cases.

My general advice — and this makes me very sad — is to stick with BSD Sockets for broadcasts and multicasts. See Extra-ordinary Networking > Broadcasts and Multicasts, Hints and Tips.

Having said that, this is weird:

Anshuman1989 wrote: “Reinstalling the app fixes the issue”

This isn’t a standard pathology I see with Network framework’s multicast support, and you are right to suspect local network privacy in that case. So let’s dig into that. First up, you’ve signed your app with the com.apple.developer.networking.multicast entitlement, right? You didn’t mention that, and it’s very important.

Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = eskimo + 1 + @ + apple.com
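To make that advice concrete, here is a rough sketch of a one-shot IPv4 multicast send with BSD Sockets, assuming the group and port discussed in this thread; error handling is elided and this is illustrative only:

import Darwin
import Foundation

// Open a UDP socket and send one datagram to 239.255.0.1:45454.
let fd = socket(AF_INET, SOCK_DGRAM, 0)
precondition(fd >= 0, "socket() failed")

var addr = sockaddr_in()
addr.sin_family = sa_family_t(AF_INET)
addr.sin_port = in_port_t(45454).bigEndian
addr.sin_addr.s_addr = inet_addr("239.255.0.1")

let message = Array("hello".utf8)
let sent = withUnsafePointer(to: &addr) { ptr in
    ptr.withMemoryRebound(to: sockaddr.self, capacity: 1) { sa in
        sendto(fd, message, message.count, 0, sa, socklen_t(MemoryLayout<sockaddr_in>.size))
    }
}
print("sendto returned \(sent)")
close(fd)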
1w
iOS UDP Multicast: Receiving works but sending silently fails
Hi everyone, I’m working with UDP multicast on iOS (iOS 15+) using Network.framework and facing a confusing issue.

Setup:

Multicast IP: 239.255.0.1
Port: 45454
Using NWConnectionGroup / NWMulticastGroup
NSLocalNetworkUsageDescription is present in Info.plist
Devices are on the same Wi-Fi network

Problem:

Receiving multicast packets works perfectly
Sending multicast packets does NOT work
No errors are thrown
send() completion handler reports success
stateUpdateHandler sometimes doesn’t transition to .ready
No packets are actually transmitted on the network

Observations:

The app can receive data from other multicast senders
Sending appears to be silently blocked
Reinstalling the app fixes the issue
This points to a Local Network permission problem
If permission was denied once, iOS does not re-prompt
Inbound multicast works, outbound multicast is blocked

Questions:

Is it expected on iOS that receiving multicast works even when sending is blocked?
Is reinstalling the app th
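For reference, a minimal sketch of the NWConnectionGroup setup being described, using the multicast address and port from the post (illustrative, not the poster’s actual code); as the reply above notes, this also requires the com.apple.developer.networking.multicast entitlement:

import Network

let descriptor = try! NWMulticastGroup(for: [.hostPort(host: "239.255.0.1", port: 45454)])
let group = NWConnectionGroup(with: descriptor, using: .udp)

// Receiving: the path the poster reports working.
group.setReceiveHandler(maximumMessageSize: 1500, rejectOversizedMessages: true) { message, content, _ in
    print("Received \(content?.count ?? 0) bytes from \(String(describing: message.remoteEndpoint))")
}

group.stateUpdateHandler = { state in
    print("Group state: \(state)")
    if case .ready = state {
        // Sending with no explicit endpoint targets the whole multicast group.
        group.send(content: Data("hello".utf8)) { error in
            print("Send completed, error: \(String(describing: error))")
        }
    }
}

group.start(queue: .main)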
1 | 0 | 134 | 1w
Reply to "Notarization stuck in 'In Progress' for 15+ hours - submission e3dff14c-16ab-41a7-a81c-0d1774c66588"
You can expect that most uploads will be notarised quickly. Occasionally, some uploads are held for in-depth analysis and may take longer to complete. As you notarise your apps, the system will learn how to recognise them, and you should see fewer delays. For lots of additional info about notarisation, see Notarisation Resources. Specifically, it links to a Q&A with the notary service team that’s quite instructive.

Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = eskimo + 1 + @ + apple.com
Topic: Code Signing | SubTopic: Notarization
1w
Reply to How to visualize AR-dependent app if not supported on Simulator for Swift Student Challenge?
If your submission relies on features that are only available on device, I recommend that you build and test it using the Swift Playground app and then request that it be judged in that environment. The major drawback with that is that the current Swift Playground app doesn’t support the iOS 26 SDK )-: I don’t have anything to share about that limitation right now, but if and when I do I’ll update this other thread.

Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = eskimo + 1 + @ + apple.com
1w
How to visualize AR-dependent app if not supported on Simulator for Swift Student Challenge?
Recently, applications for the Swift Student Challenge opened up. I noticed that when you select where to run your app (mine was developed in Xcode 26), and you choose Xcode 26, there is a note underneath that basically says all Xcode projects will be run on the Simulator. What if my project depends on AR? How would I let the judges test my submission?
2 | 0 | 235 | 1w
Core Image for depth maps & segmentation masks: numeric fidelity issues when rendering CIImage to CVPixelBuffer (looking for architecture suggestions)
Hello All,

I’m working on a computer-vision-heavy iOS application that uses the camera, LiDAR depth maps, and semantic segmentation to reason about the environment (object identification, localization, and measurement, not just visualization).

Current architecture:

I initially built the image pipeline around CIImage as a unifying abstraction. It seemed like a good idea because:

CIImage integrates cleanly with Vision, ARKit, AVFoundation, Metal, Core Graphics, etc.
It provides a rich set of out-of-the-box transforms and filters.
It is immutable and thread-safe, which significantly simplified concurrency in a multi-queue pipeline.

The LiDAR depth maps, semantic segmentation masks, etc. were treated as CIImages, with conversion to CVPixelBuffer or MTLTexture only at the edges when required.

Problem:

I’ve run into cases where Core Image transformations do not preserve numeric fidelity for non-visual data. Example: Rendering a CIImage-backed segmentation mask into a larger CVPixelBuffer can cause l
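As a concrete illustration of the render step in question (a hypothetical helper, not the poster’s code): when rendering non-visual data such as masks or depth, passing nil for the color space asks CIContext to skip color matching, which is one common source of value drift.

import CoreImage
import CoreVideo

func render(mask: CIImage, into pixelBuffer: CVPixelBuffer, context: CIContext) {
    // colorSpace: nil disables color management, which would otherwise
    // rewrite pixel values and corrupt label/depth semantics.
    context.render(mask, to: pixelBuffer, bounds: mask.extent, colorSpace: nil)
}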
1 | 0 | 207 | 1w
iOS App Review: Guidelines 5.1.1(i) - Legal - Privacy - Data Collection and 5.1.2(i) - Legal - Privacy - Data Use
Our app (Tenkobo) received a rejection notice after review because we have used Gemini AI since three builds ago. Since then, we have improved the disclosure of the data we collect, explicitly stating all the data, introduced a new feature that checks granular consent and syncs the user’s consent state to the backend, and added controls that gate whether data is sent to the Gemini API service based on that consent state. Moreover, this feature is a premium add-on to a module that already does most things locally on the device and sends data to our cloud infrastructure only to allow storage and sync when users use multiple devices. It is a multi-platform app. However, despite every improvement, we keep getting the same rejection reason:

Review Device: iPad Air 11-inch (M3)
...
The issues we previously identified still need your attention.

Guidelines 5.1.1(i) - Legal - Privacy - Data Collection and 5.1.2(i) - Legal - Privacy - Data Use

The app appears to share the user’s personal
1 | 0 | 86 | 1w
Live Activities widget extension does not reflect updated SwiftUI UI (custom views/assets appear ignored)
Hello Apple Developer Technical Support,

I’m following up on case #102807413324 and submitting this as a code-level support request. We are integrating iOS Live Activities (ActivityKit + a WidgetKit extension written in SwiftUI) into an Expo/React Native app. We’re seeing behavior where the Live Activity UI shown on the Lock Screen appears to “stick” to an older layout and ignores updated SwiftUI code and/or bundled assets, even after rebuilding, reinstalling, and removing existing Live Activities before testing again.

Environment:

Device: iPhone 13
iOS: 26.2
macOS: 15.7.3 (24G419)
Xcode: 16.4 (16F6)
Expo SDK: 52
React Native: 0.76.9
expo-live-activity: ^0.4.2
Build type: Ad-Hoc signed IPA (EAS local build)

Summary:

We have a WidgetKit extension target (LiveActivity.appex, bundle id: stimul8.LiveActivity) using ActivityConfiguration(for: LiveActivityAttributes.self). The extension contains multiple SwiftUI views selected via a “route” (derived from deepLinkUrl / title / subtitl
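For orientation, a sketch of the extension structure being described; the attribute and state fields here are assumptions based on the post (“route”, title, deepLinkUrl), not the actual app’s code.

import ActivityKit
import SwiftUI
import WidgetKit

struct LiveActivityAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var route: String   // selects which SwiftUI view to render
        var title: String
    }
    var deepLinkUrl: String
}

struct LiveActivityWidget: Widget {
    var body: some WidgetConfiguration {
        ActivityConfiguration(for: LiveActivityAttributes.self) { context in
            // Lock Screen / banner UI; the route picks the layout.
            switch context.state.route {
            case "timer":
                Text("Timer: \(context.state.title)")
            default:
                Text(context.state.title)
            }
        } dynamicIsland: { context in
            DynamicIsland {
                DynamicIslandExpandedRegion(.center) {
                    Text(context.state.title)
                }
            } compactLeading: {
                Text("•")
            } compactTrailing: {
                Text("•")
            } minimal: {
                Text("•")
            }
        }
    }
}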
1 | 0 | 84 | 1w
How can responding to user reviews effectively contribute to improving ASO performance?
Responding to reviews helps ASO because it encourages better user sentiment, improves rating recovery, and builds trust for new users checking your app. Converting negative reviews into positive ones has a strong impact on ranking. Engaging consistently with users is one of the simplest ways to strengthen overall ASO performance.
2 | 0 | 140 | 1w