Hi everyone, I'm developing an MR Vision Pro app where I'd like to anchor virtual objects (such as UI elements) around the user's arm. However, I've noticed that Vision Pro seems to mask out the area where the user's real arm is, hiding virtual content in that region so that you see your real arm.
Is there a way to render virtual elements on the user's arm—so that it looks like the object is placed directly on the arm despite the real-world passthrough? I was hoping there might be a way to adjust the depth or behavior of this masked-out region. Any insights or workarounds would be greatly appreciated! Thanks :)
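For context, a minimal sketch of the kind of control I'm hoping exists, based on the upperLimbVisibility preference on an ImmersiveSpace (the app name and RealityView contents are placeholders; whether this fully covers the forearm in a mixed-immersion space is exactly what I'm unsure about):
import SwiftUI
import RealityKit

@main
struct ArmOverlayApp: App {   // placeholder app name
    var body: some Scene {
        ImmersiveSpace(id: "ArmSpace") {
            RealityView { content in
                // Add arm-anchored entities / UI panels here (placeholder).
            }
        }
        // Asks visionOS not to composite the passthrough hands/arms over
        // rendered content, so virtual elements can appear on the arm.
        .upperLimbVisibility(.hidden)
    }
}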
We're building an SDK (let's call it MyFramework) which is distributed as an .xcframework for developers to integrate it into their own apps.
Recently, we've added tvOS support by adding it as a supported destination for the SDK. Essentially, the SDK became a cross-platform framework that is easy to adopt in both iOS and tvOS apps.
The .xcframework is generated fine and everything builds correctly. We've tested the SDK integrated in test apps. We've also submitted an iOS archive app to App Store Connect, just to make sure all is well with the App Store submission.
However, when I tried submitting a tvOS archive app (that integrates the same SDK) to App Store Connect, the build was marked as an invalid binary and we got the following error message:
ITMS-90562: Invalid Bundle - One of the nested bundles is built for a platform which is different from the main bundle platform. Please make sure that all bundles have correct platform specification.
First, I thought that any of the modules of the framework was not correctly configured for tvOS but that was not the case. Everything compiles ok and it runs in debug as expected.
It only fails when trying to submit the tvOS app that integrates the SDK.
After a lot of investigating, we realised that the App Store submission error occurs because we codesign the framework; we use codesign to distribute it as a cryptographically signed XCFramework.
codesign --timestamp -v --sign "Apple Distribution: Company (xxxxxxxxxx)" "MyFramework.xcframework"
As soon as we remove the above line from our CI/CD pipeline that generates the XCFramework, and submit a tvOS archive app that integrates the unsigned framework, the App Store submission error goes away and the TestFlight build can be tested.
I've also tried just stripping the _CodeSignature folder from the .xcframework package, but it still fails. So the only solution currently is to not codesign the XCFramework in order to upload tvOS archive apps that integrate the SDK to App Store Connect.
However, we're still puzzled as to why only the tvOS archive app gets the invalid binary error when it integrates a signed SDK. The error message feels quite misleading if we have to stop signing the SDK XCFramework for it to be accepted during App Store submission.
Does anyone have any idea, or has anyone encountered a similar issue?
Thanks!
Our app supports iOS 12 as the minimum OS and we embed a framework with an iOS 15 minimum. Naturally we weak-link it and use #available(iOS 15, *) to guard accesses to its symbols.
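For reference, the guard pattern in question looks roughly like this (the framework and symbol names here are placeholders, not the real ones):
import Foundation

func useOptionalFrameworkFeature() {
    if #available(iOS 15, *) {
        // Only touch the weak-linked framework's symbols inside this branch,
        // e.g. SomeiOS15OnlyKit.start()   (placeholder call)
    } else {
        // The framework is never referenced on earlier systems.
    }
}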
On iOS 12.5.7 the framework is completely ignored and the app works fine.
On iOS 13.3.1, however, we see this error:
Termination Description: DYLD, dependent dylib '/System/Library/Frameworks/AVFAudio.framework/AVFAudio' not found for '/private/var/containers/Bundle/Application/08DA2D93-4DC2-4523-98AF-FD52884989AE/<OUR_APP>.app/Frameworks/<FRAMEWORK>.framework/<FRAMEWORK>', tried but didn't find: '/System/Library/Frameworks/AVFAudio.framework/AVFAudio' '/System/Library/Frameworks/AVFAudio.framework/AVFAudio'
The framework has a dependency on AVFAudio, which is available only since iOS 14.5, so it makes sense that it wouldn't be found. What's bothering us is why dyld even tries to load the framework's dependencies instead of just ignoring it.
Could this be a bug in 13.3.1? Unfortunately, at this time we don't have other iOS 13 phones to test with.
The 'LC_BUILD_VERSION' command sure enough seems valid:
Load command 10
cmd LC_BUILD_VERSION
cmdsize 32
platform 2
minos 15.0
sdk 17.0
ntools 1
tool 3
version 1015.7
We are seeing network errors in Outlook mail in the Safari browser on both iOS and macOS.
As per our current investigation, we notice these network errors when the user tries to use Outlook after leaving it open in Safari for a while.
Observations:
Issue is present in both macOS and iOS Safari.
Issue is not present in other WebKit browsers like Brave and Edge on iOS.
Issue is reproducible in both mini and full OWA in Safari.
Issue is not related to POST requests being sent in different packets in Safari.
Requests are only blocked for the outlook.office/outlook.live domains.
What does not fix this issue?
Reloading the application
Clearing cookies, local storage, or session storage
Unregistering service workers
Redirecting to a different page and coming back to the Outlook domain
Re-authenticating the users
What fixes this issue?
Reconnecting to Wi-Fi or the mobile network
Reconnecting the VPN
Removing Safari from the background and reopening it
Flushing the DNS in Settings
Since upgrading from Xcode 15 to 16, we have been experiencing a build error during compilation. Building on Xcode 15 still works with no issues. The error happens only on the first build after a clean. Subsequent builds succeed. This is an issue because our CI process archives the project from a clean slate, and this causes it to fail every time. I will attempt to describe the issue and include information I believe is relevant in this document.
The error occurs on this import line within an Objective-C file during the Scan Dependencies step of compiling. This line imports our custom Objective-C to Swift bridging header file - "Swift2Objc.h".
Our custom Objective-C to Swift bridging header file is simply wrapping the project’s auto-generated Objective-C to Swift bridging header file - "KWISwift.h".
The error is specific to the import of the OfflineServices Swift Package.
Specifically, the OfflineServices-Swift.h file - the Swift Package’s auto-generated Objective-C to Swift bridging file.
"Module 'JRE' not found" is the exact error (also included as text at the bottom of the post).
JRE is a third-party library provided to us as an xcframework. It is placed directly into our Swift Package as a binary target.
The xcframework itself is composed of a .a file and a Headers folder, which includes the header files and a module.modulemap.
The module.modulemap file looks like this.
Hello everyone.
I use the Translation framework in my application. During development everything was fine and the framework worked well, but after two or three days of using the production version (which was published on the App Store and available to others as well!), my application stopped working. The Translation framework gives these errors:
Error sending 1 paragraphs Error Domain=TranslationErrorDomain Code=16 "Translation failed" UserInfo={NSLocalizedDescription=Translation failed, NSLocalizedFailureReason=Offline models not available for language pair}
Failed to translate input 0; returning error: Error Domain=TranslationErrorDomain Code=16 "Translation failed" UserInfo={NSLocalizedDescription=Translation failed, NSLocalizedFailureReason=Offline models not available for language pair}
Received unbridged NSError to API, converting to .internalError: Error Domain=TranslationErrorDomain Code=16 "Translation failed" UserInfo={NSLocalizedDescription=Translation failed, NSLocalizedFailureReason=Offline models not available for language pair}
Once again: it worked while I was developing it, it was released on the App Store, and then it suddenly stopped working!
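For reference, a minimal sketch of one way to check for and request the offline models before translating (the locales and view structure here are placeholder assumptions, not the app's actual code):
import SwiftUI
import Translation

struct TranslatingView: View {
    @State private var configuration: TranslationSession.Configuration?
    let input: String

    var body: some View {
        Text(input)
            .translationTask(configuration) { session in
                do {
                    // Prompts the user to download the language pair if it
                    // isn't installed yet, instead of failing with
                    // "Offline models not available for language pair".
                    try await session.prepareTranslation()
                    let response = try await session.translate(input)
                    print(response.targetText)
                } catch {
                    print("Translation failed: \(error)")
                }
            }
            .task {
                let availability = LanguageAvailability()
                let status = await availability.status(
                    from: Locale.Language(identifier: "en"),
                    to: Locale.Language(identifier: "de"))
                print("Language pair status: \(status)")   // .installed / .supported / .unsupported
                configuration = TranslationSession.Configuration(
                    source: Locale.Language(identifier: "en"),
                    target: Locale.Language(identifier: "de"))
            }
    }
}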
Are serialized parameters already available inside -pluginInstanceAddedToDocument via FxParameterRetrievalAPI or are they being read later?
I'm attempting to create a proof of concept of a static library, distributed as an XCFramework, which has two local XCFramework dependencies.
The reason for this is that I want to provide a single statically linked library to a customer, instead of providing them with the static library plus the two dependencies.
The Issue
With a fairly simple example project, I'm not able to access any code from the static library without the compiler throwing a "No such module" error saying that it cannot find one of the dependent modules.
Project Layout
I have an example project that has some example targets with basic example code.
Example Project on GitHub
Target: FrameworkA
Mach-O Type: Dynamic
Build Mergeable Library: Yes
Skip Install: No
Build Libraries For Distribution: Yes
Target: FrameworkB
Mach-O Type: Dynamic
Build Mergeable Library: Yes
Skip Install: No
Build Libraries For Distribution: Yes
XCFrameworks are being generated from these two targets using Apple's recommendations. I've verified that the mergeable metadata is present in both frameworks' Info.plist files.
Each exposes a single struct which will return an example String.
Finally I have my SDK target:
Target: ExampleKit
Mach-O Type: Static
Build Mergeable Library: No
Create Merged Binary: Manual
Skip Install: No
Build Libraries For Distribution: Yes
The two .xcframework files are in the Target's folder structure as well. The "Link Binary With Libraries" build phase includes them and they're Required.
Inside of the ExampleKit target, I have a single public struct which has two static properties which return the example strings from FrameworkA and FrameworkB.
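In rough outline it looks something like this (the type and property names below are illustrative placeholders, not the actual project source):
import FrameworkA
import FrameworkB

public struct ExampleKit {
    // Each property just forwards to one of the mergeable framework dependencies.
    public static var exampleA: String { FrameworkA.ExampleA().text }   // placeholder API
    public static var exampleB: String { FrameworkB.ExampleB().text }   // placeholder API
}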
I then have another script which generates an XCFramework from this target.
Expectations
Based on Apple's documentation and the "Meet mergeable libraries" WWDC session, I would expect that I could make a simple iOS app, link ExampleKit.xcframework, import ExampleKit inside a file, and be able to access the single public struct present in ExampleKit. Unfortunately, all I get is "No such module FrameworkA".
I would have expected FrameworkA and FrameworkB to have been merged into ExampleKit. I'm really unsure where to go from here in debugging this, and more importantly, is this even possible?
Hi.
I have an xcframework that has a dependency on 'RxSwift' and 'RxCocoa'. I deployed it using SPM by embedding it in a Swift Package.
However, when I import the Swift Package into another project, I keep getting the following error:
"Missing required module 'RxCocoaRuntime'"
How can I fix this?
Below are the steps to reproduce the error.
Steps
Create an Xcode project and add dependencies on 'RxSwift' and 'RxCocoa' (it doesn't matter whether this is done through Tuist or CocoaPods).
Create an XCFramework from that project (I used the commands below).
xcodebuild archive \
-workspace SimpleFramework.xcworkspace \
-scheme "SimpleFramework" \
-destination "generic/platform=iOS" \
-archivePath "./SimpleFramework-iphoneos.xcarchive" \
-sdk iphoneos \
SKIP_INSTALL=NO \
BUILD_LIBRARY_FOR_DISTRIBUTION=YES
xcodebuild archive \
-workspace SimpleFramework.xcworkspace \
-scheme "SimpleFramework" \
-archivePath "./SimpleFramework-iphonesimulator.xcarchive" \
-sdk "iphonesimulator" \
SKIP_INSTALL=NO \
BUILD_LIBRARY_FOR_DISTRIBUTION=YES
xcodebuild -create-xcframework \
-framework "./SimpleFramework-iphoneos.xcarchive/Products/Library/Frameworks/SimpleFramework.framework" \
-framework "./SimpleFramework-iphonesimulator.xcarchive/Products/Library/Frameworks/SimpleFramework.framework" \
-output "./SimpleFramework.xcframework"
Embed in Swift Package, and deploy.
// swift-tools-version: 6.0
// The swift-tools-version declares the minimum version of Swift required to build this package.
import PackageDescription
let package = Package(
    name: "SimplePackage",
    platforms: [.iOS(.v16)],
    products: [
        .library(
            name: "SimplePackage",
            targets: ["SimplePackage"]),
    ],
    dependencies: [
        .package(url: "https://github.com/ReactiveX/RxSwift", from: "6.8.0")
    ],
    targets: [
        .binaryTarget(
            name: "SimpleFramework",
            path: "Sources/SimpleFramework.xcframework"
        ),
        .target(
            name: "SimplePackage",
            dependencies: [
                "SimpleFramework",
                "RxSwift",
                .product(name: "RxCocoa", package: "RxSwift")
            ]
        )
    ]
)
Download the Swift Package in another project and import the module.
I resolved this by removing the dependencies from the Swift Package, downloading the package in another project, and fetching the dependencies with CocoaPods.
This works, but I don't want to use another dependency manager alongside SPM.
Development Environment
CPU: Apple M4 Max
macOS: Sequoia 15.3
Xcode: 16.2
I'm using the Network framework to transfer files between two devices. The "secondary" device sends file requests to the "primary" device, and the primary sends the files back.
When the primary gets the request, it responds like this:
do {
    let data = try Data(contentsOf: filePath)
    let priSecDataFilePacket = PriSecDataFilePacket(fileName: filename, dataBlob: data)
    let jsonData = try JSONEncoder().encode(priSecDataFilePacket)
    let message = NWProtocolFramer.Message(priSecMessageType: PriSecMessageType.priToSecDataFile)
    let context = NWConnection.ContentContext(identifier: "TransferUtility", metadata: [message])
    connection.send(content: jsonData, contentContext: context, isComplete: true, completion: .idempotent)
} catch {
    print("\(error)")
}
It works great, even for hundreds of file requests. The problem arises if some files being requested are extremely large, like 600MB. You can see the memory speedometer on the primary quickly ramp up to the yellow zone, at which point iOS kills the app for high memory use, and you see the Jetsam log.
I changed the code to skip JSON-encoding the binary file as a test, and that helped a bit, but memory use still goes too high; the real offender is the step where it loads the 600 MB file into the data var:
let data = try Data(contentsOf: filePath)
If I comment out everything else and just leave that one line, I can still see the memory use spike.
As a fix, I'm rewriting this so the secondary requests the file in 5MB chunks by telling the primary a byte range such as "0-5242880" or "5242881-10485760", and then reassembling the chunks on the secondary once they all come in. So far this seems promising, but it's a fair amount of work.
My question: does the Network framework have a built-in way to stream those bytes straight from disk as it sends them, so that I could send all the data in one single request without having to load the bytes into memory?
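For reference, a minimal sketch of the chunked approach described above, using FileHandle so the whole file never sits in memory (the helper name and framing details here are placeholders, not the actual code):
import Foundation
import Network

func sendFileInChunks(at url: URL, over connection: NWConnection,
                      chunkSize: Int = 5 * 1024 * 1024) throws {
    let handle = try FileHandle(forReadingFrom: url)
    defer { try? handle.close() }

    var offset: UInt64 = 0
    while true {
        try handle.seek(toOffset: offset)
        guard let chunk = try handle.read(upToCount: chunkSize), !chunk.isEmpty else {
            break   // reached end of file
        }
        let isLast = chunk.count < chunkSize
        // Only ~5 MB is resident at a time; mark just the final chunk as complete.
        connection.send(content: chunk,
                        contentContext: .defaultMessage,
                        isComplete: isLast,
                        completion: .idempotent)
        offset += UInt64(chunk.count)
    }
}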
ISAC Client is crashing on macOS 15.4 Beta 1. The crash comes from the WebKit engine, the underlying framework of WKWebView, and "ResourceLoadNotifier" is part of WebKit's internal framework. It seems to be related to a resource-loading failure, potentially triggered by changes in macOS 15.4 Beta.
Hello,
In my iOS app, I have been working on integrating a third-party library's XCFramework. (They don't provide SPM or CocoaPods support.) Whenever I import the XCFramework into my app the build is successful, but when uploading to App Store Connect I receive an email with an error stating that the SwiftSupport folder is missing. This app was made using SwiftUI. I have a sample project linked below. Other apps also use this framework, so I'm not sure where I'm going wrong.
Project
Hello everyone,
I’m encountering an issue when trying to build and archive my library BleeckerCodesLib using Swift Package Manager. My project is structured with two targets:
CBleeckerLib: A C target that contains my image processing code (C source files and public headers).
BleeckerCodesLib: A Swift target that depends on CBleeckerLib and performs an import CBleeckerLib.
Below is the relevant portion of my Package.swift:
// swift-tools-version:5.7
import PackageDescription
let package = Package(
    name: "BleeckerCodesLib",
    platforms: [.iOS(.v16)],
    products: [
        .library(name: "BleeckerCodesLib", targets: ["BleeckerCodesLib"])
    ],
    targets: [
        .target(
            name: "CBleeckerLib",
            publicHeadersPath: "include"
        ),
        .target(
            name: "BleeckerCodesLib",
            dependencies: ["CBleeckerLib"]
        )
    ]
)
Directory Structure
My project directory looks like this:
BleeckerCodesLib/
├── BleeckerCodesLib.xcodeproj/
│   └── xcuserdata/
│       └── robertopitarch.xcuserdatad/
│           └── xcschemes/
│               └── xcschememanagement.plist
├── BleeckerCodesLib.h
├── Package.swift
└── Sources/
    ├── CBleeckerLib/
    │   ├── bleecker-lib.c
    │   └── include/
    │       ├── bleecker-lib.h
    │       └── CBleeckerLib.h
    └── BleeckerCodesLib/
        ├── UIImage+Extensions.swift
        ├── ImageProcessingUtility.swift
        ├── APIManager.swift
        ├── BleeckerCodesLib.swift
        ├── CameraView.swift
        ├── RealTimeCameraView.swift
        └── BleeckerCameraWrapper.swift
Code Example
In my Swift code (for example, in BleeckerCodesLib.swift), I import the C module as follows:
import SwiftUI
import UIKit
import CBleeckerLib // Import the C module
public struct BleeckerCodes {
    public struct DetectedCode {
        public let code: String
        public let corners: [CGPoint]

        public init(code: String, corners: [CGPoint]) {
            self.code = code
            self.corners = corners
        }
    }

    // Initialization function
    public static func initializeLibrary() -> String {
        bleecker_init() // Call the C module function
        return "BleeckerCodesLibrary initialized!"
    }

    // ... other functions
}
The Problem
When I try to compile or archive the project using commands such as:
xcodebuild archive -project BleeckerCodesLib.xcodeproj -scheme BleeckerCodesLib -destination "generic/platform=iOS" -archivePath "archives/BleeckerCodesLib"
I receive the error: "no such module 'CBleeckerLib'"
Any assistance or step-by-step guidance on resolving this integration issue would be greatly appreciated.
Thank you in advance!
Topic: Developer Tools & Services | SubTopic: Xcode
Tags: Swift Packages, Developer Tools, Frameworks, Compiler
In the past I've used if #available to isolate functionality gaps at the code level in my apps.
But yesterday I was trying to integrate the new methods Apple has made available for granting access to contacts, as discussed here: https://developer.apple.com/documentation/contacts/accessing-a-person-s-contact-data-using-contacts-and-contactsui
The problem is that their example is full of code that won't compile for iOS 17, which (from the last stats I found) is still 20% of the user base.
I also experimented with @available, but when I was nearly done integrating the new methods, a compiler bug halted my entire project. It now simply doesn't build (the ol' non-zero exit code) with no further details or errors flagged. I filed a report, as requested in the logs.
In the meantime, I have to start over. What is the recommended method for isolating large blocks of functionality that are only available to a later OS version?
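For context, the kind of isolation I'm describing looks roughly like this; a minimal sketch with made-up type and method names, not something taken from Apple's sample:
import UIKit
import ContactsUI

@available(iOS 18.0, *)
final class LimitedContactsCoordinator {   // placeholder type
    func presentPicker(from viewController: UIViewController) {
        // All iOS 18-only contact-access code lives inside this type.
    }
}

func showContacts(from viewController: UIViewController) {
    if #available(iOS 18.0, *) {
        LimitedContactsCoordinator().presentPicker(from: viewController)
    } else {
        // Fall back to the pre-iOS 18 CNContactPickerViewController flow.
    }
}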
I have static libraries and headers of a C++ project that I believe are correctly built for iOS and iOS Simulator destinations. The C++ project is built via CMake with something like:
cmake dirName \
-G "Unix Makefiles" \
-B buildDir \
-DCMAKE_INSTALL_PREFIX=installDir \
-DCMAKE_SYSTEM_NAME=iOS \
-DCMAKE_SYSTEM_PROCESSOR=arm64 \
-DCMAKE_OSX_ARCHITECTURES=arm64 \
-DCMAKE_OSX_SYSROOT=$(xcrun --sdk iphonesimulator --show-sdk-path) \
-DCMAKE_OSX_DEPLOYMENT_TARGET=15.0
...
cmake --build buildDir --config Release --target install
I believe those are all the important parameters. This gives me a static library (.a) and headers that I believe should be compatible with arm64 iOS simulators, and I do this same thing for x86_64 architecture with simulators and for actual iOS non-simulator via the iphoneos SDK path.
I'm pretty sure this gives me the correct static lib and headers. Let's assume it does, because I'm not actually able to create the XCFramework to know whether they're right. This does work with a macOS lib and headers, but I need iOS for this library. How do I package this into an XCFramework now?
This Apple developer article says I should be able to create an xcframework via xcodebuild -create-xcframework -library libName.a -headers include, but when I try to do this with my iOS arm64 simulator static lib I get:
error: binaries with multiple platforms are not supported '/Users/.../install/ios-arm64-simulator/libName.a
But when I run lipo -info libName.a I get "Non-fat file libName.a is architecture arm64", so I'm not sure what to do here. Trying to extract arm64 from that static library also produces an error, as it is just an arm64 lib.
I'm not really sure what's going on, but from reading online, this specific command, xcodebuild -create-xcframework, is a consistent pain point in the process of trying to get an XCFramework, and seemingly the only workaround is to archive a framework project and then create the xcframework via xcodebuild -create-xcframework -archive MyFramework.xcarchive -framework (or -library).
However, how am I supposed to get this static lib and headers into a suitable xcodeproj so that I can archive it correctly? Every time I try to copy the headers and static lib into the framework xcodeproj and set what I believe are all the correct settings, my .xcarchive is always empty.
Does anyone have any advice here on how to get this to work?
The main impetus for trying to get this C++ static lib and headers into an XCFramework is that it seems like the only valid way to link a third-party C++ lib into an SPM package and have the C++ code be interfaceable with Swift.
When debugging with Xcode 15.2 and a phone running iOS 14.2, I found that an SDK compiled from Objective-C cannot correctly pass data to a project written in Swift. When my phone is connected to Xcode for debugging, the data is transferred and converted normally; when I disconnect the phone from Xcode, the data can no longer be retrieved. So far I have only seen this problem on iOS 14.2; it does not occur when debugging with iOS 17 or iOS 18 devices. Has anyone encountered a similar problem?
Given Apple's new .limited contact authorization introduced in iOS 18, I want to be able to present the ContactAccessPicker directly from my app via Ionic Capacitor. I present the .contactAccessPicker view via a UIHostingController, and I manage the view controller's dismissal accordingly when the ContactAccessPicker completes and is no longer presented.
Bug: after a few searches or interactions with the Contact Access Picker (e.g. searching, selecting contacts, clicking the "show selected" button), the contact access picker crashes and the overlay remains. Any interaction with my app is then blocked, because I can't detect that the contact access picker has disappeared when it crashes, so I can't dismiss the view controller.
Is there a way for me to prevent the contact access picker from crashing, and how can I detect if it does crash, so I can at least dismiss the view controller if that happens?
struct ContactAccessPickerView: View {
    @Binding var isPresented: Bool
    let completion: @MainActor ([String]) -> Void

    var body: some View {
        Group {
            if #available(iOS 18.0, *) {
                Color.clear
                    .contactAccessPicker(isPresented: $isPresented) { result in
                        Task { @MainActor in
                            completion(result)
                        }
                    }
            } else {
            }
        }
    }
}
@objc func selectMoreContacts(_ call: CAPPluginCall) {
    guard isContactsPermissionGranted() else {
        call.resolve(["success": false])
        return
    }

    // Save the call to ensure it's available until we finish
    self.bridge?.saveCall(call)

    DispatchQueue.main.async { [weak self] in
        guard let self = self else { return }
        var isPresented = true
        let picker = ContactAccessPickerView(isPresented: .init(get: { isPresented }, set: { isPresented = $0 })) { contacts in
            call.resolve(["success": true])
            self.dismissAndCleanup(call)
        }
        let hostingController = UIHostingController(rootView: picker)
        hostingController.modalPresentationStyle = .overFullScreen
        self.bridge?.viewController?.present(hostingController, animated: true)
    }
}
Recently we started facing BLE disconnect issues between our BLE peripheral (microphone) and iOS app that we're having trouble solving.
iOS App: Ionic Capacitor using @capacitor-community/bluetooth-le
Microphone Peripheral: esp32 board using ESP-IDF Apache NimBLE stack
App use case:
Our app records a sound clip using the BLE microphone and sends the data via a characteristic. The sound clip is broken up into several packets, all sent over (over 1,600 packets). The microphone has an antenna and a boosted signal as well.
The Issue:
Recently, we've been facing consistent disconnects between the microphone and the iOS app that we think we've narrowed down to the iOS device disconnecting due to too many dropped packets. It seems the phone can't get further than roughly 10 feet before we see packet loss. Up until recently we had little to no range issues with transferring data and didn't get disconnected from the microphone even while being much further away. Nothing has changed on our end, on either the app or microphone firmware side.
We use the same microphone firmware and app on Android and have no issues with range or dropped packets.
It also seems like we can transfer a couple of recordings, like 2 or 3 (each with its own connection, i.e. scan and connect, subscribe to the characteristic and gather all the packets, do some processing, then disconnect and start over), without issue, and then every attempt at gathering the packets starts failing because of disconnects.
Does anyone have any idea what might be going on?
Do we need to fix our connection parameters? This seems to be mostly an issue since the newest iOS updates (18.3, 18.3.1); however, we've tested on previous versions and are now seeing the same BLE range issues.
Any help or guidance on tracking down what's going on is appreciated.
Relevant logs:
I (273409) Task_send_audio:: esp_ble_tx_power_get(ESP_BLE_PWR_TYPE_DEFAULT) = 255
E (286869) main:: No MBUFs available from pool, retry..
E (287519) main:: No MBUFs available from pool, retry..
E (287769) main:: No MBUFs available from pool, retry..
E (287919) main:: No MBUFs available from pool, retry..
...
...
...
E (1622829) Task_send_audio:: send_audio_ble, couldn't send the audio totally, ***** unsubscribe from charactaristic
Peripheral connection parameters:
Hello, why is apple won’t adding Just-In-Time compiler to ”Emulators” in the app store. And/or hypervisor for newer devices.
i feel like UTM (which is a PC Emulator) or other Apps that emulate need JIT to work properly, and will consume significantly less battery to emulate/virtualize, And will have a noticeably better performance than just not enabling JIT, and by the way jit is already being used on iPadOS/iOS 18.3/18.3.1 and newer/older version of that so being enabled by the choice of the developer of the App is more convenient than doing it with tools.
and by the why apple wont let emulators on iPads and newer iPhones do hypervisor, it’s better than JIT but requires a good cpu, like making it available to people with newer/powerful devices, hypervisor is better than JIT by a lot and removing it in iPadOS/iOS 18.4 was an unnecessary choice?, becuase it had a better potential in virtualization instead of emulating, and I feel like enabling it In M1-M2 iPads and A14-18pro and newer devices is just better from having it disabled, to unlock the fullest potential of the iPad it needs to have a app or something to do instead of just running high graphics games/or Apps.
I am developing and distributing an XCFramework, and I want to ensure that it remains valid for as long as possible. I have some questions regarding certificate expiration and revocation:
I understand that if an XCFramework is signed with a timestamp, it remains valid even after the signing certificate expires.
However, if the signing certificate is revoked, the XCFramework immediately becomes unusable.
As far as I know, Apple allows a maximum of two active distribution certificates at the same time.
I assume that once a certificate expires, it will eventually need to be revoked in order to issue a third certificate. Is this correct?
If an expired certificate is later revoked, will the XCFrameworks signed with that certificate also become invalid, even though they were timestamped?
I want to ensure that released XCFrameworks remain valid for as long as possible. What is the best approach to achieve this?
If anyone has insights or official documentation references on how to manage signing certificates for long-term XCFramework validity, I would appreciate your guidance.
Thank you!
Topic: Code Signing | SubTopic: Certificates, Identifiers & Profiles
Tags: Frameworks, Signing Certificates, Code Signing