I'm experiencing a persistent issue with CloudKit sharing in my iOS application. When attempting to present a UICloudSharingController, I receive the error message "Unknown client: ChoreOrganizer" in the console.
App Configuration Details:
App Name: ChoreOrganizer
Bundle ID: com.ProgressByBits.ChoreOrganizer
CloudKit Container ID: iCloud.com.ProgressByBits.ChoreOrganizer
Core Data Model Name: ChoreOrganizer.xcdatamodeld
Core Data Entity: Chore
Error Details:
The error "Unknown client: ChoreOrganizer" occurs when I present the UICloudSharingController
This happens only on the first attempt to share; subsequent attempts during the same app session don't show the error but sharing still doesn't work
All my code executes successfully without errors until UICloudSharingController is presented
Implementation Details:
I'm using NSPersistentCloudKitContainer for Core Data synchronization and UICloudSharingController for sharing. My implementation creates a custom CloudKit zone, saves both a record and a CKShare in that zone, and then presents the sharing controller.
Here's the relevant code:
@MainActor
func presentSharing(from viewController: UIViewController) async throws {
    // Create CloudKit container
    let container = CKContainer(identifier: containerIdentifier)
    let database = container.privateCloudDatabase

    // Define custom zone ID
    let zoneID = CKRecordZone.ID(zoneName: "SharedChores", ownerName: CKCurrentUserDefaultName)

    do {
        // Check if zone exists, create if necessary
        do {
            _ = try await database.recordZone(for: zoneID)
        } catch {
            let newZone = CKRecordZone(zoneID: zoneID)
            _ = try await database.save(newZone)
        }

        // Create record in custom zone
        let recordID = CKRecord.ID(recordName: "SharedChoresRoot", zoneID: zoneID)
        let rootRecord = CKRecord(recordType: "ChoreRoot", recordID: recordID)
        rootRecord["name"] = "Shared Chores Root" as CKRecordValue

        // Create share
        let share = CKShare(rootRecord: rootRecord)
        share[CKShare.SystemFieldKey.title] = "Shared Tasks" as CKRecordValue

        // Save both record and share in same operation
        let recordsToSave: [CKRecord] = [rootRecord, share]
        _ = try await database.modifyRecords(saving: recordsToSave, deleting: [])

        // Present sharing controller
        let sharingController = UICloudSharingController(share: share, container: container)
        sharingController.delegate = shareDelegate

        // Configure popover
        if let popover = sharingController.popoverPresentationController {
            popover.sourceView = viewController.view
            popover.sourceRect = CGRect(
                x: viewController.view.bounds.midX,
                y: viewController.view.bounds.midY,
                width: 1, height: 1
            )
            popover.permittedArrowDirections = []
        }

        viewController.present(sharingController, animated: true)
    } catch {
        throw error
    }
}
Steps I've already tried:
Verified correct bundle ID and container ID match in all places (code, entitlements file, Developer Portal)
Added NSUbiquitousContainers configuration to Info.plist
Ensured proper entitlements in the app
Created and configured proper provisioning profiles
Tried both default zone and custom zone for sharing
Various ways of saving the record and share (separate operations, same operation)
Cleaned build folder, deleted derived data, reinstalled the app
Tried on both simulator and physical device
Confirmed CloudKit container exists in CloudKit Dashboard with correct schema
Verified iCloud is properly signed in on test devices
Console Output:
1. Starting sharing process
2. Created CKContainer with ID: iCloud.com.ProgressByBits.ChoreOrganizer
3. Using zone: SharedChores
4. Checking if zone exists
5. Zone exists
7. Created record with ID: <CKRecordID: 0x3033ebd80; recordName=SharedChoresRoot, zoneID=SharedChores:__defaultOwner__>
8. Created share with ID: <CKRecordID: 0x3033ea920; recordName=Share-C4701F43-7591-4436-BBF4-6FA8AF3DF532, zoneID=SharedChores:__defaultOwner__>
9. About to save record and share
10. Records saved successfully
11. Creating UICloudSharingController
12. About to present UICloudSharingController
13. UICloudSharingController presented
Unknown client: ChoreOrganizer
Additional Information:
When accessing the CloudKit Dashboard, I can see that data is being properly synced to the cloud, indicating that the basic CloudKit integration is working. The issue appears to be specific to the sharing functionality.
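If it helps, here is a minimal diagnostic sketch (the function is mine, not part of the app code above) that confirms at runtime which container the entitlements actually resolve to and whether the iCloud account is available, before any sharing is attempted:

import CloudKit

// Diagnostic sketch only: log the resolved container identifier and the
// account status before presenting the sharing controller.
func logCloudKitState(containerIdentifier: String) async {
    let container = CKContainer(identifier: containerIdentifier)
    do {
        let status = try await container.accountStatus()
        print("Container:", container.containerIdentifier ?? "nil", "account status:", status.rawValue)
    } catch {
        print("accountStatus failed:", error)
    }
}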
I would greatly appreciate any insights or solutions to resolve this persistent "Unknown client" error. Thank you for your assistance.
I have added 2 command line tools to my SwiftUI app for macOS. It worked fine locally, but I get an error when I try to make an archive of it. I'm not sure of the exact reason, but it seems related to sandboxing the command line tools. Since then I have tried multiple solutions but I am unable to resolve this issue. How should I handle the helper command line tools?
I'm getting a fatal error in a preview:
fatalError in Environment+Objects.swift at line 34.
Does anyone have any idea how to find this file?
Hi folks,
Unsure if I've implemented some sort of anti-pattern here, but any help or feedback would be great.
I've created a minimal reproducible sample below that lets you filter a list of people and mark individuals as a favourite.
When invoking the search function on a physical device running iOS 18.3.1, it crashes with Swift/ContiguousArrayBuffer.swift:675: Fatal error: Index out of range.
It runs fine on iOS 17 (physical device) and also on the various simulators I've tried (iOS 18.0, iOS 18.2, iOS 18.3.1).
If I remove the toggle binding, the crash doesn't occur (but I also can't update the toggles in the view model).
I'm expecting to be able to filter the list without a crash occurring and retain the ability to have the toggle switches update the view model.
Sample code is below. Thanks for your time 🙏!
import SwiftUI

struct Person {
    let name: String
    var isFavorite = false
}

@MainActor
class ViewModel: ObservableObject {
    private let originalPeople: [Person] = [
        .init(name: "Holly"),
        .init(name: "Josh"),
        .init(name: "Rhonda"),
        .init(name: "Ted")
    ]

    @Published var filteredPeople: [Person] = []
    @Published var searchText: String = "" {
        didSet {
            if searchText.isEmpty {
                filteredPeople = originalPeople
            } else {
                filteredPeople = originalPeople.filter { $0.name.lowercased().contains(searchText.lowercased()) }
            }
        }
    }

    init() {
        self.filteredPeople = originalPeople
    }
}

struct ContentView: View {
    @ObservedObject var viewModel = ViewModel()

    var body: some View {
        NavigationStack {
            List {
                ForEach($viewModel.filteredPeople, id: \.name) { person in
                    VStack(alignment: .leading) {
                        Text(person.wrappedValue.name)
                        Toggle("Favorite", isOn: person.isFavorite)
                    }
                }
            }
            .navigationTitle("Contacts")
        }
        .searchable(text: $viewModel.searchText)
    }
}

#Preview {
    ContentView()
}
Just learning Swift and SwiftUI, having fun in Xcode. I have the following custom view defined, but when I try to test it, the information doesn't get updated as I expect. When I use the button defined INSIDE the custom view, the update works as expected. When I use the button defined in the Preview body, it doesn't. The "lines" variable of the custom view appears not to be updated in that case.
I know I'm missing something fundamental here about either view state binding or the preview environment, but I'm stumped. Any ideas?
import SwiftUI

struct ConsoleView: View {
    var maxLines: Int = 26
    private enum someIDs { case textID }

    @State private var numLines: Int = 0
    @State var lines = "a\nb\nc\n"

    var body: some View {
        VStack(alignment: .leading, spacing: 0) {
            ScrollViewReader { proxy in
                ScrollView {
                    Button("Scroll to Bottom") {
                        withAnimation {
                            proxy.scrollTo(someIDs.textID, anchor: .bottom)
                        }
                    }
                    Text(lines)
                        .id(someIDs.textID)
                        .multilineTextAlignment(.leading)
                        .frame(maxWidth: .infinity, alignment: .bottomLeading)
                        .padding()
                        .onChange(of: lines) {
                            withAnimation {
                                proxy.scrollTo(someIDs.textID, anchor: .bottom)
                            }
                        }
                }
                Button("Add more") {
                    writeln("More")
                    //writeln(response)
                }
            }
        }
    }

    private func clipIfNeeded() {
        if (numLines >= maxLines) {
            if let i = lines.firstIndex(of: "\n") {
                lines = String(lines.suffix(from: lines.index(after: i)))
            }
        }
    }

    func writeln(_ newText: String) {
        print("adding '\(newText)' to lines")
        //clipIfNeeded()
        write(newText)
        write("\n")
        numLines += 1
        print(lines)
    }

    func write(_ newText: String) {
        lines += newText
    }
}

#Preview {
    VStack() {
        var myConsole = ConsoleView(lines: "x\ny\nz\n")
        myConsole
        Button("Add stuff") {
            myConsole.writeln("Stuff")
        }
    }
}
Hello,
Basically, I am reading and writing an asset.
To simplify, I am just reading the asset and rewriting it into an output video without any modifications.
However, I want to add a fade-out effect to the last three seconds of the output video.
I don’t know how to do this.
So far, before adding the CMSampleBuffer to the output video, I tried reducing its volume using an extension on CMSampleBuffer.
In the extension, I passed 0.4 for testing, aiming to reduce the video's overall volume by 60%.
My question is:
How can I directly adjust the volume of a CMSampleBuffer?
Here is the extension:
extension CMSampleBuffer {
    func adjustVolume(by factor: Float) -> CMSampleBuffer? {
        guard let blockBuffer = CMSampleBufferGetDataBuffer(self) else { return nil }

        var length = 0
        var dataPointer: UnsafeMutablePointer<Int8>?
        guard CMBlockBufferGetDataPointer(blockBuffer, atOffset: 0, lengthAtOffsetOut: nil, totalLengthOut: &length, dataPointerOut: &dataPointer) == kCMBlockBufferNoErr else { return nil }
        guard let dataPointer = dataPointer else { return nil }

        let sampleCount = length / MemoryLayout<Int16>.size
        dataPointer.withMemoryRebound(to: Int16.self, capacity: sampleCount) { pointer in
            for i in 0..<sampleCount {
                let sample = Float(pointer[i])
                pointer[i] = Int16(sample * factor)
            }
        }
        return self
    }
}
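For context, here is a sketch of the guards I think this approach needs (the helper names are mine): the Int16 rebinding above only makes sense if the buffer really holds 16-bit signed-integer linear PCM, and scaled samples should be clamped so they cannot overflow Int16.

import CoreMedia
import AudioToolbox

// Sketch: confirm the sample buffer's audio format before rebinding to Int16.
func isSigned16BitPCM(_ sampleBuffer: CMSampleBuffer) -> Bool {
    guard let format = CMSampleBufferGetFormatDescription(sampleBuffer),
          let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(format)?.pointee else {
        return false
    }
    return asbd.mFormatID == kAudioFormatLinearPCM
        && asbd.mBitsPerChannel == 16
        && (asbd.mFormatFlags & kAudioFormatFlagIsSignedInteger) != 0
}

// Sketch: clamp to the Int16 range instead of letting Int16(_:) trap on overflow.
func scaledSample(_ sample: Int16, by factor: Float) -> Int16 {
    let scaled = Float(sample) * factor
    return Int16(max(Float(Int16.min), min(Float(Int16.max), scaled)))
}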
I've been running into an issue using .fileImporter in SwiftUI for a year now. On the iPhone simulator, Mac Catalyst, and a real iPad it works as expected, but when it comes to testing on a real iPhone, the picker just won't let you select files. It's not a permissions issue: the sheet won't close at all and the callback isn't called. At the same time, if you use UIKit's UIDocumentPickerViewController, everything works as expected, on Mac Catalyst/simulator/iPad as well as on a real iPhone.
Steps to reproduce:
Create a new Xcode project using SwiftUI.
Paste following code:
import SwiftUI

struct ContentView: View {
    @State var sShowing = false
    @State var uShowing = false
    @State var showAlert = false
    @State var alertText = ""

    var body: some View {
        VStack {
            VStack {
                Button("Test SWIFTUI") {
                    sShowing = true
                }
            }
            .fileImporter(isPresented: $sShowing, allowedContentTypes: [.item]) { result in
                alertText = String(describing: result)
                showAlert = true
            }
            VStack {
                Button("Test UIKIT") {
                    uShowing = true
                }
            }
            .sheet(isPresented: $uShowing) {
                DocumentPicker(contentTypes: [.item]) { url in
                    alertText = String(describing: url)
                    showAlert = true
                }
            }
            .padding(.top, 50)
        }
        .padding()
        .alert(isPresented: $showAlert) {
            Alert(title: Text("Result"), message: Text(alertText))
        }
    }
}
DocumentPicker.swift:
import SwiftUI
import UniformTypeIdentifiers

struct DocumentPicker: UIViewControllerRepresentable {
    let contentTypes: [UTType]
    let onPicked: (URL) -> Void

    func makeCoordinator() -> Coordinator {
        Coordinator(self)
    }

    func makeUIViewController(context: Context) -> UIDocumentPickerViewController {
        let documentPicker = UIDocumentPickerViewController(forOpeningContentTypes: contentTypes, asCopy: true)
        documentPicker.delegate = context.coordinator
        documentPicker.modalPresentationStyle = .formSheet
        return documentPicker
    }

    func updateUIViewController(_ uiViewController: UIDocumentPickerViewController, context: Context) {}

    class Coordinator: NSObject, UIDocumentPickerDelegate {
        var parent: DocumentPicker

        init(_ parent: DocumentPicker) {
            self.parent = parent
        }

        func documentPicker(_ controller: UIDocumentPickerViewController, didPickDocumentsAt urls: [URL]) {
            print("Success!", urls)
            guard let url = urls.first else { return }
            parent.onPicked(url)
        }

        func documentPickerWasCancelled(_ controller: UIDocumentPickerViewController) {
            print("Picker was cancelled")
        }
    }
}
Run the project on Mac Catalyst to confirm it working.
Try it out on a real iPhone.
For some reason, I can't attach a video, so I can only show a screenshot
Hello ladies and gentlemen, I'm writing a simple renderer on the main actor using Metal and Swift 6. I am at the stage now where I want to create a render pipeline state using asynchronous API:
@MainActor
class Renderer {
    let opaqueMeshRPS: MTLRenderPipelineState

    init(/*...*/) async throws {
        let descriptor = MTLRenderPipelineDescriptor()
        // ...
        opaqueMeshRPS = try await device.makeRenderPipelineState(descriptor: descriptor)
    }
}
I get a compilation error if I try to use the asynchronous version of the makeRenderPipelineState method:
Non-sendable type 'any MTLRenderPipelineState' returned by implicitly asynchronous call to nonisolated function cannot cross actor boundary
Which is understandable, since MTLRenderPipelineState is not Sendable. But it seems that no matter where or how I try to call this method, I just can't: the API is there, but you can't use it, only the synchronous versions.
Am I missing something or is Metal just not usable with Swift 6 right now?
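One possible workaround (just a sketch, not something I have verified end to end) is to assert sendability manually: the box type and error below are mine, and the @unchecked Sendable relies on the assumption that a pipeline state is safe to pass around because it is immutable once created. Building on the completion-handler API means nothing non-Sendable has to cross the actor boundary.

import Metal

// Sketch: wrap the non-Sendable pipeline state in an @unchecked Sendable box.
private struct PipelineBox: @unchecked Sendable {
    let state: any MTLRenderPipelineState
}

enum PipelineCreationError: Error { case failed }

@MainActor
func makePipelineState(device: any MTLDevice,
                       descriptor: MTLRenderPipelineDescriptor) async throws -> any MTLRenderPipelineState {
    let box: PipelineBox = try await withCheckedThrowingContinuation { continuation in
        // The completion handler runs on a Metal-owned thread, but it only
        // touches the Sendable continuation and the Sendable box.
        device.makeRenderPipelineState(descriptor: descriptor) { state, error in
            if let state {
                continuation.resume(returning: PipelineBox(state: state))
            } else {
                continuation.resume(throwing: error ?? PipelineCreationError.failed)
            }
        }
    }
    return box.state
}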
In the header for UIViewController, the method dismissViewControllerAnimated is declared like this:
- (void)dismissViewControllerAnimated: (BOOL)flag completion: (void (^ __nullable)(void))completion NS_SWIFT_DISABLE_ASYNC API_AVAILABLE(ios(5.0));
NS_SWIFT_DISABLE_ASYNC means that there's no async version exposed like there would normally be of a method that exposes a completion handler. Why is this? And is it unwise / unsafe for me to make my own async version of it using a continuation?
My use case is that I want a method that will sequentially dismiss all view controllers presented by a root view controller. So I could have this extension on UIViewController:
extension UIViewController {
    func dismissAsync(animated: Bool) async {
        await withCheckedContinuation { continuation in
            self.dismiss(animated: animated) {
                continuation.resume()
            }
        }
    }

    func dismissPresentedViewControllers() async {
        while self.topPresentedViewController != self {
            await self.topPresentedViewController.dismissAsync(animated: true)
        }
    }

    var topPresentedViewController: UIViewController {
        var result = self
        while result.presentedViewController != nil {
            result = result.presentedViewController!
        }
        return result
    }
}
Hi all, I am looking for a future-proof way of getting the screen resolution of my display using SwiftUI on macOS. I understand that it can't really be done to the fullest extent, meaning that the closest API we have is GeometryProxy, and that only reports the size of the parent view, which on macOS does not give us the display's screen resolution. The only viable option I am left with is NSScreen.frame.
However, my issue is that Apple seems to be moving aggressively towards SwiftUI, and in order to future-proof my application I'd like to avoid relying on AppKit methods as much as possible. Hence my question: is there a way to get the screen resolution of a display using SwiftUI that Apple itself recommends? If not, can I safely rely on NSScreen's frame API?
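To make the comparison concrete, here is a small sketch of the two options I mean (the view name is illustrative): GeometryReader reports the parent view's size, while NSScreen reports the display's frame.

import SwiftUI
import AppKit

// Sketch: print both the parent view's size (GeometryProxy) and the main
// display's frame (NSScreen) to show the difference between the two APIs.
struct ResolutionProbeView: View {
    var body: some View {
        GeometryReader { proxy in
            let screenFrame = NSScreen.main?.frame ?? .zero
            Text("View: \(Int(proxy.size.width))×\(Int(proxy.size.height)) pts, " +
                 "Screen: \(Int(screenFrame.width))×\(Int(screenFrame.height)) pts")
        }
    }
}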
I have a SwiftUI project which has the following hierarchy:
IOSSceneDelegate (App target) - depends on EntryPoint and Presentation static libs.
Presentation (Static library) - Depends on EntryPoint static lib. Contains UI related logic and updates the UI after querying the data layer.
EntryPoint (Static library) - Contains the entry point, AppDelegate (for its lifecycle aspects) etc.
I've only listed the relevant targets here.
SceneDelegate was initially present in EntryPoint library, because the AppDelegate references it when a scene is created.
public func application(_ application: UIApplication, configurationForConnecting connectingSceneSession: UISceneSession, options: UIScene.ConnectionOptions) -> UISceneConfiguration {
    // Set the SceneDelegate dynamically
    let sceneConfig: UISceneConfiguration = UISceneConfiguration(name: "mainWindow", sessionRole: connectingSceneSession.role)
    sceneConfig.delegateClass = SceneDelegate.self
    return sceneConfig
}
The intent is to move the SceneDelegate to the Presentation library.
When moved, the EntryPoint library fails to compile because it's referencing the SceneDelegate (as shown above).
To remove this reference, I tried to set up the SceneDelegate the old way: declare a scene configuration in the Info.plist file and point it at the SceneDelegate in Presentation.
// In the Info.plist file
<key>UIApplicationSceneManifest</key>
<dict>
    <key>UIApplicationSupportsMultipleScenes</key>
    <true/>
    <key>UISceneConfigurations</key>
    <dict>
        <key>UIWindowSceneSessionRoleApplication</key>
        <array>
            <dict>
                <key>UISceneConfigurationName</key>
                <string>Default Configuration</string>
                <key>UISceneDelegateClassName</key>
                <string>Presentation.SceneDelegate</string>
            </dict>
        </array>
    </dict>
</dict>

// In the AppDelegate
public func application(_ application: UIApplication, configurationForConnecting connectingSceneSession: UISceneSession, options: UIScene.ConnectionOptions) -> UISceneConfiguration {
    // Refer to a static UISceneConfiguration listed in the Info.plist file
    return UISceneConfiguration(name: "Default Configuration", sessionRole: connectingSceneSession.role)
}
As shown above, the Presentation.SceneDelegate is referred in the Info.plist file and the reference is removed from the AppDelegate (in EntryPoint library).
The app target compiles, but when I run it, the SceneDelegate is not invoked. None of the methods from the SceneDelegate (scene(_:willConnectTo:options:), sceneDidDisconnect(_:), sceneDidEnterBackground(_:) etc.) are invoked. I only get the AppDelegate logs.
It seems like the Configuration is ignored because it was incorrect. Any thoughts? Is it possible to move the SceneDelegate in this situation?
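One variation I'm considering is keeping the original delegate-class assignment in the AppDelegate but resolving the class by name at runtime, so EntryPoint no longer needs a compile-time reference to Presentation. This is only a sketch, and it assumes the runtime class name really is "Presentation.SceneDelegate", which I haven't verified:

import UIKit

// In the EntryPoint AppDelegate — no import of Presentation required.
public func application(_ application: UIApplication, configurationForConnecting connectingSceneSession: UISceneSession, options: UIScene.ConnectionOptions) -> UISceneConfiguration {
    let sceneConfig = UISceneConfiguration(name: "mainWindow", sessionRole: connectingSceneSession.role)
    // NSClassFromString returns nil if the string doesn't match the class's
    // runtime name, which would silently leave the scene without a delegate.
    sceneConfig.delegateClass = NSClassFromString("Presentation.SceneDelegate")
    return sceneConfig
}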
Hey, I am building some widgets and I'm quite surprised that swipe gestures are not supported for widgets. It means the user must sacrifice home screen real estate and view multiple widgets to receive the same information. Ideally, swiping left/right inside the widget should let a developer present different views.
I realize that it means that a user would need to swipe outside of the widget, (or swipe to the beginning/end of the series of views inside of the widget) for the page to swipe, but I'd argue that this is the intuitive behavior of what widget scrollview would or should look like anyway.
While debugging with Xcode 15.2 and a phone running iOS 14.2, I found that an SDK compiled from Objective-C cannot pass data to a project written in Swift. When my phone is connected to Xcode for debugging, the data is transferred and converted normally. When I disconnect the phone from Xcode, the data can no longer be retrieved. So far I have only seen this problem on iOS 14.2; it does not occur when I debug on iOS 17 or iOS 18 devices. Has anyone run into a similar issue?
My question is simple: I do not have much experience writing Swift code; I am only using it to create a small executable that I can call from my Python application to handle subscription management.
I was hoping someone with more experience could point out my flaws and give me tips on how to verify that the check is working for my application. Any insight is appreciated, thank you.
import Foundation
import StoreKit

class SubscriptionValidator {
    static func getReceiptURL() -> URL? {
        guard let appStoreReceiptURL = Bundle.main.appStoreReceiptURL else {
            print("No receipt found.")
            return nil
        }
        return appStoreReceiptURL
    }

    static func validateReceipt() -> Bool {
        guard let receiptURL = getReceiptURL(),
              let receiptData = try? Data(contentsOf: receiptURL) else {
            print("Could not read receipt.")
            return false
        }
        let receiptString = receiptData.base64EncodedString()
        let validationResult = sendReceiptToApple(receiptString: receiptString)
        return validationResult
    }

    static func sendReceiptToApple(receiptString: String) -> Bool {
        let isSandbox = Bundle.main.appStoreReceiptURL?.lastPathComponent == "sandboxReceipt"
        let urlString = isSandbox ? "https://sandbox.itunes.apple.com/verifyReceipt" : "https://buy.itunes.apple.com/verifyReceipt"
        let url = URL(string: urlString)!

        let requestData: [String: Any] = [
            "receipt-data": receiptString,
            "password": "0b7f88907b77443997838c72be52f5fc"
        ]
        guard let requestBody = try? JSONSerialization.data(withJSONObject: requestData) else {
            print("Error creating request body.")
            return false
        }

        var request = URLRequest(url: url)
        request.httpMethod = "POST"
        request.httpBody = requestBody
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")

        let semaphore = DispatchSemaphore(value: 0)
        var isValid = false

        let task = URLSession.shared.dataTask(with: request) { data, response, error in
            guard let data = data, error == nil,
                  let jsonResponse = try? JSONSerialization.jsonObject(with: data) as? [String: Any],
                  let status = jsonResponse["status"] as? Int else {
                print("Receipt validation failed.")
                semaphore.signal()
                return
            }

            if status == 0, let receipt = jsonResponse["receipt"] as? [String: Any],
               let inApp = receipt["in_app"] as? [[String: Any]] {
                for purchase in inApp {
                    if let expiresDateMS = purchase["expires_date_ms"] as? String,
                       let expiresDate = Double(expiresDateMS) {
                        let expiryDate = Date(timeIntervalSince1970: expiresDate / 1000.0)
                        if expiryDate > Date() {
                            isValid = true
                        }
                    }
                }
            }
            semaphore.signal()
        }
        task.resume()
        semaphore.wait()
        return isValid
    }
}
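On the semaphore in particular, a rough async variant of the same request is sketched below (same endpoint and payload, shared secret elided, requires the iOS 15/macOS 12 URLSession async API); the in_app expiry parsing from above would slot in where the status is checked.

// Sketch: would replace sendReceiptToApple inside SubscriptionValidator so the
// request does not block a thread on a semaphore.
static func sendReceiptToApple(receiptString: String) async -> Bool {
    let isSandbox = Bundle.main.appStoreReceiptURL?.lastPathComponent == "sandboxReceipt"
    let urlString = isSandbox ? "https://sandbox.itunes.apple.com/verifyReceipt"
                              : "https://buy.itunes.apple.com/verifyReceipt"
    guard let url = URL(string: urlString),
          let requestBody = try? JSONSerialization.data(withJSONObject: [
              "receipt-data": receiptString,
              "password": "<shared secret>"
          ]) else { return false }

    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.httpBody = requestBody
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    do {
        let (data, _) = try await URLSession.shared.data(for: request)
        guard let json = try JSONSerialization.jsonObject(with: data) as? [String: Any],
              let status = json["status"] as? Int else { return false }
        // The expiry check on receipt["in_app"] from the version above goes here.
        return status == 0
    } catch {
        return false
    }
}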
Since updating Xcode to 16.2 I get the following message when attempting to build the project.
SWIFT_VERSION '' is unsupported, supported versions are: 4.0, 4.2, 5.0, 6.0.
The issue I have is that the project is pure Objective-C. There is zero Swift code in the project, so I am struggling to understand why this error is being generated. Any assistance is welcome.
Graeme
We’re developing an iPad application that visualizes 2D and 3D building floor plans, including a mesh network of nodes that control lighting and climate. The node count ranges from 1,000 to 15,000.
We’re using SceneKit to dynamically render the floor plan and node mesh on an iPad 10th generation running iPadOS 18.3. While the core visualization works, we are experiencing significant performance degradation as the node count increases.
Specifically:
At 750–1,000 nodes, UI responsiveness noticeably declines.
At 2,000 nodes, navigating the floor plan becomes nearly unusable.
We attempted to optimize performance with a Geometric Pool algorithm, but the impact was minimal. Strangely, the same iPad handles 30,000+ 3D objects effortlessly when using Unity or Unreal Engine, raising the question of whether SceneKit may not be optimized for this scale.
Our questions:
Is SceneKit suitable for visualizing such large node counts, or are we hitting an inherent limitation of the framework?
Are there best practices or optimization techniques for SceneKit that we might be missing?
Should we consider a hybrid approach or fully transition to a different 3D engine for this use case?
We’ve attached a code sample below demonstrating the issue. Any insights, suggestions, or experiences would be greatly appreciated!
ContentView.swift
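For reference, this is roughly the kind of geometry sharing and node flattening we understand SceneKit offers for cutting draw calls (an illustrative sketch only; the sphere geometry, sizes, and names are placeholders, not our attached sample):

import SceneKit

// Sketch: reuse one SCNGeometry (and material) for every node, then flatten
// the container so children sharing materials collapse into fewer draw calls.
func makeNodeMesh(positions: [SCNVector3]) -> SCNNode {
    let sharedGeometry = SCNSphere(radius: 0.05)   // one geometry object, reused
    sharedGeometry.segmentCount = 8                // keep the mesh cheap

    let container = SCNNode()
    for position in positions {
        let node = SCNNode(geometry: sharedGeometry)
        node.position = position
        container.addChildNode(node)
    }
    return container.flattenedClone()
}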
Is there any way I can get updates when I change CarPlay style settings?
I've tried CPSessionConfigurationDelegate.contentStyleChanged and CPTemplateApplicationSceneDelegate.contentStyleDidChange, but they always produce the same result.
When I choose:
Automatic -> I receive light in case of daylight;
Always Dark and Always Show Dark Map toggle on -> dark
Always Dark and Always Show Dark Map toggle off -> light.
But that seems wrong, because CarPlay's toolbar is still dark while I receive light.
Is there a way to get a dark style when choosing Always Dark and Always Show Dark Map toggle off? Or at least get updates when the Always Show Dark Map toggle changes?
Does iOS provide a callback when a notification is manually removed from the notification tray?
I'm trying to add Assets.xcassets to a .swiftpm project, but I'm getting the warning:
⚠️ Ignoring duplicate build file in build source build phase
(For context: this is about developing in Xcode; the warning does not appear in Swift Playgrounds, even though the project is harder to set up there.)
The problem is that a .swiftpm project has no "Build Phases" tab in Xcode where I could remove the duplicate files manually.
I've already tried adding the path of Assets.xcassets in the resources property of Package.swift, like:
.executableTarget(
    name: "AppModule",
    path: ".",
    resources: [
        .process("Assets.xcassets")
    ]
)
Even so, the warning persists. Does anyone know how to solve this? Is there any way to remove duplicate references or force a cleanup of the Swift Package Manager cache to fix it?
I appreciate any tips! 🙏🚀
Hi, is there any way to use the front camera to do motion capture?
I want to recognize whether the user has raised their hands using the front camera on iPhone.
I was able to do it with the back camera, but not the front.
Also, if there is any sample code or documentation, I would be super happy.
Waiting for your reply!!
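For context, the kind of check I mean looks roughly like this (a sketch using Vision's body-pose request; the function name, the 0.3 confidence threshold, and the "wrists above the nose" rule are illustrative choices, and the frames are assumed to come from an AVCaptureSession):

import Vision
import AVFoundation

// Sketch: decide "hands raised" from a single camera frame using Vision's
// body-pose request. Points are normalized with the origin at bottom-left;
// the front camera may additionally need mirroring/orientation handling.
func handsAreRaised(in pixelBuffer: CVPixelBuffer) -> Bool {
    let request = VNDetectHumanBodyPoseRequest()
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .up)
    guard (try? handler.perform([request])) != nil,
          let observation = request.results?.first,
          let leftWrist = try? observation.recognizedPoint(.leftWrist),
          let rightWrist = try? observation.recognizedPoint(.rightWrist),
          let nose = try? observation.recognizedPoint(.nose),
          leftWrist.confidence > 0.3, rightWrist.confidence > 0.3, nose.confidence > 0.3
    else { return false }
    return leftWrist.location.y > nose.location.y && rightWrist.location.y > nose.location.y
}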