Build, test, and submit your app using Xcode, Apple's integrated development environment.

Posts under the Xcode tag

200 Posts

Post | Replies | Boosts | Views | Activity

Best `AVMediaType` for depth data.
Dear Apple Developer Forum,

I have a question regarding AVCaptureDevice on iOS. We're trying to capture photos in the best quality possible, along with depth data of the highest accuracy possible. We were delighted to see that AVCaptureDevice can be initialized with AVMediaType .depthData, which works as expected (depthData is part of the AVCapturePhoto). However, when setting AVMediaType to .video, we still receive depth data (of the same quality, according to our own internal tests). That confused us. Mind you, we set the device format and depth format as well:

```swift
private func getDeviceFormat() throws -> AVCaptureDevice.Format {
    // Ensures a high photo-quality format with an appropriate color profile.
    let format = camera?.formats.first(where: {
        $0.isHighPhotoQualitySupported &&
        $0.supportedDepthDataFormats.count > 0 &&
        $0.formatDescription.mediaSubType.rawValue == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
    })
    // Check that it's available.
    guard format != nil else {
        throw CaptureDeviceError.necessaryFormatNotAvailable
    }
    return format!
}

private func getDepthDataFormat(for format: AVCaptureDevice.Format) throws -> AVCaptureDevice.Format {
    // Access the 32-bit float depth format.
    let depthDataFormat = format.supportedDepthDataFormats.first(where: {
        $0.formatDescription.mediaSubType.rawValue == kCVPixelFormatType_DepthFloat32
    })
    // Check that it exists.
    guard depthDataFormat != nil else {
        throw CaptureDeviceError.necessaryFormatNotAvailable
    }
    return depthDataFormat!
}
```

We're wondering what steps we can take to ensure the best-quality photo along with the most accurate depth data. Which properties are the most important, which have an effect, and which don't? Are there any ways we can optimize our current configuration? We find it difficult because there are very limited guides and explanations of the media subtypes, for example kCVPixelFormatType_420YpCbCr8BiPlanarFullRange. Is it the best? Is it the best for our use case of a high-quality photo plus the most accurate depth data?

Important note: our app only runs on iPhone 14 Pro, iPhone 15 Pro, and iPhone 16 Pro on the latest iOS versions.

We hope someone with greater knowledge at Apple can help us and guide us on how to get photos of the best quality and depth data with the most accuracy. Thank you very much! Kind regards.
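A minimal sketch of the depth-delivery opt-in that this configuration also needs, assuming a session with an AVCapturePhotoOutput already attached (the helper name and wiring are illustrative, not from the post):

```swift
import AVFoundation

// Hypothetical helper: enables depth delivery and quality prioritization
// on an already-configured AVCapturePhotoOutput.
func makeDepthPhotoSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    // Depth delivery must be enabled on the output before individual
    // capture requests can ask for it.
    if photoOutput.isDepthDataDeliverySupported {
        photoOutput.isDepthDataDeliveryEnabled = true
    }
    // Allow the output to spend extra time producing the best image;
    // per-request prioritization may not exceed this ceiling.
    photoOutput.maxPhotoQualityPrioritization = .quality

    let settings = AVCapturePhotoSettings()
    settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
    settings.isDepthDataFiltered = false  // smoothed depth when true; raw values may suit accuracy-sensitive work
    settings.photoQualityPrioritization = .quality
    return settings
}
```

isDepthDataFiltered trades hole-filling smoothness for fidelity, so it is one of the few documented knobs that plausibly affects depth accuracy here.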
0
0
364
Jan ’25
No longer able to add SSH package dependencies in Xcode 16
Latest version of Xcode, 16.1. I have an existing package dependency which is sitting on a git@ssh.dev.azure.com account. Now, whenever I remove that package dependency, I can no longer add it back within the Xcode UI. There is just no way to add it or find it in the Search or Enter Package URL text field. How on earth are we meant to add SSH packages now? Anyone else have this issue? If so, have you found a workaround that doesn't require manually editing the package dependencies in the project?
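If manifest edits turn out to be acceptable after all, one hedged workaround sketch is to declare the SSH dependency in a small local wrapper package, since SwiftPM manifests still accept scp-style git URLs; the organization, project, repo, and product names below are placeholders:

```swift
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "WrapperPackage",
    products: [
        .library(name: "WrapperPackage", targets: ["WrapperPackage"])
    ],
    dependencies: [
        // Hypothetical Azure DevOps SSH URL; SwiftPM resolves it with
        // the same SSH credentials git itself uses.
        .package(url: "git@ssh.dev.azure.com:v3/MyOrg/MyProject/MyRepo", from: "1.0.0")
    ],
    targets: [
        .target(
            name: "WrapperPackage",
            dependencies: [.product(name: "MyLibrary", package: "MyRepo")]
        )
    ]
)
```

Adding the wrapper as a local package then pulls the SSH dependency in transitively, sidestepping the URL field in the Xcode UI.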
1
2
290
Jan ’25
Metal-cpp-extensions isn't working inside frameworks
I am making a framework in C++ using metal-cpp, basically a small game engine. I am consequently also using the metal-cpp-extensions provided in LearnMetalCPP to make applications work. For one of my classes, I needed to include AppKit.hpp inside a public header file, so I moved it and its associated headers (NSApplication.hpp, NSMenu.hpp, etc.) from Project to Public in the Build Phases' Headers section. However, it then started giving me the error "cast of C pointer type 'void *' to Objective-C pointer type 'Class' requires a bridged cast" at several points in the AppKit headers. The errors don't appear when AppKit and its associated headers are in the Project headers, or when they are in the Private headers and no header imports them. I imagined that disabling Objective-C ARC and enabling "Use __bridge Casts Outside of ARC" in Build Settings would solve it, but it didn't budge. I also suspected that actively changing the headers wouldn't be the answer, but even when I tried putting __bridge before the problematic casts, the compiler didn't recognize __bridge. How do I solve this? And why is it only happening with Public and not Project headers?
2
0
775
Jan ’25
Are Swift Packages supported by String Catalogs?
Hello, do the String Catalogs (new in Xcode 15) support Swift packages? I've tried adding a new Localizable.xcstrings (string catalog) file to my package's resources folder. Great! I then see the expected empty catalog editor (screenshot not shown). All good so far. I then try to build my Swift package... and nothing changes. The string catalog is never populated, and I'm left with the same empty editor. So, do string catalogs not support packages at this time, or am I doing something wrong? I was really hoping String Catalogs would work and save the day, since Export Localizations also does not work for Swift packages that don't support macOS. 😔
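One thing worth checking (a sketch under the assumption that the manifest lacks it): a package generally needs a defaultLocalization declared before localized resources such as string catalogs are processed at build time. Package, target, and folder names here are placeholders:

```swift
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "MyFeature",
    defaultLocalization: "en",  // required before localized resources are extracted
    products: [
        .library(name: "MyFeature", targets: ["MyFeature"])
    ],
    targets: [
        .target(
            name: "MyFeature",
            resources: [.process("Resources")]  // folder containing Localizable.xcstrings
        )
    ]
)
```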
9
0
5.7k
Jan ’25
Converted Model Preview Issues in Xcode
Hello! I have a TrackNet model that I have converted to Core ML (.mlpackage) using coremltools, and the conversion process appears to go smoothly, as I get the .mlpackage file I am looking for, with the weights and the model.mlmodel file in the folder. However, when I drag it into Xcode, it just shows up as four script tags (screenshot not shown) instead of the model "interface" that is typically expected. I was initially concerned that my model was not compatible with Core ML, but upon logging the conversion, everything seems to be converted properly. Here is some code that may be relevant in debugging this issue.

How I use the model:

```python
model = BallTrackerNet()  # the model architecture, defined below
device = self.device  # cpu
model.load_state_dict(torch.load("models/balltrackerbest.pt", map_location=device))  # balltrackerbest is the weights
model = model.to(device)
model.eval()
```

Here is the BallTrackerNet() model itself:

```python
import torch
import torch.nn as nn

class ConvBlock(nn.Module):
    def __init__(self, in_channels, out_channels, kernel_size=3, pad=1, stride=1, bias=True):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_channels, out_channels, kernel_size, stride=stride, padding=pad, bias=bias),
            nn.ReLU(),
            nn.BatchNorm2d(out_channels)
        )

    def forward(self, x):
        return self.block(x)

class BallTrackerNet(nn.Module):
    def __init__(self, out_channels=256):
        super().__init__()
        self.out_channels = out_channels
        self.conv1 = ConvBlock(in_channels=9, out_channels=64)
        self.conv2 = ConvBlock(in_channels=64, out_channels=64)
        self.pool1 = nn.MaxPool2d(kernel_size=2, stride=2)
        self.conv3 = ConvBlock(in_channels=64, out_channels=128)
        self.conv4 = ConvBlock(in_channels=128, out_channels=128)
        self.pool2 = nn.MaxPool2d(kernel_size=2, stride=2)
        self.conv5 = ConvBlock(in_channels=128, out_channels=256)
        self.conv6 = ConvBlock(in_channels=256, out_channels=256)
        self.conv7 = ConvBlock(in_channels=256, out_channels=256)
        self.pool3 = nn.MaxPool2d(kernel_size=2, stride=2)
        self.conv8 = ConvBlock(in_channels=256, out_channels=512)
        self.conv9 = ConvBlock(in_channels=512, out_channels=512)
        self.conv10 = ConvBlock(in_channels=512, out_channels=512)
        self.ups1 = nn.Upsample(scale_factor=2)
        self.conv11 = ConvBlock(in_channels=512, out_channels=256)
        self.conv12 = ConvBlock(in_channels=256, out_channels=256)
        self.conv13 = ConvBlock(in_channels=256, out_channels=256)
        self.ups2 = nn.Upsample(scale_factor=2)
        self.conv14 = ConvBlock(in_channels=256, out_channels=128)
        self.conv15 = ConvBlock(in_channels=128, out_channels=128)
        self.ups3 = nn.Upsample(scale_factor=2)
        self.conv16 = ConvBlock(in_channels=128, out_channels=64)
        self.conv17 = ConvBlock(in_channels=64, out_channels=64)
        self.conv18 = ConvBlock(in_channels=64, out_channels=self.out_channels)
        self.softmax = nn.Softmax(dim=1)
        self._init_weights()

    def forward(self, x, testing=False):
        batch_size = x.size(0)
        x = self.conv1(x)
        x = self.conv2(x)
        x = self.pool1(x)
        x = self.conv3(x)
        x = self.conv4(x)
        x = self.pool2(x)
        x = self.conv5(x)
        x = self.conv6(x)
        x = self.conv7(x)
        x = self.pool3(x)
        x = self.conv8(x)
        x = self.conv9(x)
        x = self.conv10(x)
        x = self.ups1(x)
        x = self.conv11(x)
        x = self.conv12(x)
        x = self.conv13(x)
        x = self.ups2(x)
        x = self.conv14(x)
        x = self.conv15(x)
        x = self.ups3(x)
        x = self.conv16(x)
        x = self.conv17(x)
        x = self.conv18(x)
        # x = self.softmax(x)
        out = x.reshape(batch_size, self.out_channels, -1)
        if testing:
            out = self.softmax(out)
        return out

    def _init_weights(self):
        for module in self.modules():
            if isinstance(module, nn.Conv2d):
                nn.init.uniform_(module.weight, -0.05, 0.05)
                if module.bias is not None:
                    nn.init.constant_(module.bias, 0)
            elif isinstance(module, nn.BatchNorm2d):
                nn.init.constant_(module.weight, 1)
                nn.init.constant_(module.bias, 0)
```

Here is also the metadata of my model:

```json
[
  {
    "metadataOutputVersion" : "3.0",
    "storagePrecision" : "Float16",
    "outputSchema" : [
      {
        "hasShapeFlexibility" : "0",
        "isOptional" : "0",
        "dataType" : "Float32",
        "formattedType" : "MultiArray (Float32 1 × 256 × 230400)",
        "shortDescription" : "",
        "shape" : "[1, 256, 230400]",
        "name" : "var_462",
        "type" : "MultiArray"
      }
    ],
    "modelParameters" : [ ],
    "specificationVersion" : 6,
    "mlProgramOperationTypeHistogram" : {
      "Cast" : 2,
      "Conv" : 18,
      "Relu" : 18,
      "BatchNorm" : 18,
      "Reshape" : 1,
      "UpsampleNearestNeighbor" : 3,
      "MaxPool" : 3
    },
    "computePrecision" : "Mixed (Float16, Float32, Int32)",
    "isUpdatable" : "0",
    "availability" : {
      "macOS" : "12.0",
      "tvOS" : "15.0",
      "visionOS" : "1.0",
      "watchOS" : "8.0",
      "iOS" : "15.0",
      "macCatalyst" : "15.0"
    },
    "modelType" : {
      "name" : "MLModelType_mlProgram"
    },
    "userDefinedMetadata" : {
      "com.github.apple.coremltools.source_dialect" : "TorchScript",
      "com.github.apple.coremltools.source" : "torch==2.5.1",
      "com.github.apple.coremltools.version" : "8.1"
    },
    "inputSchema" : [
      {
        "hasShapeFlexibility" : "0",
        "isOptional" : "0",
        "dataType" : "Float32",
        "formattedType" : "MultiArray (Float32 1 × 9 × 360 × 640)",
        "shortDescription" : "",
        "shape" : "[1, 9, 360, 640]",
        "name" : "input_frames",
        "type" : "MultiArray"
      }
    ],
    "generatedClassName" : "BallTracker",
    "method" : "predict"
  }
]
```

I have been struggling with this conversion for almost 2 weeks now, so any help, ideas, or pointers would be greatly appreciated! Let me know if any other information would be helpful to see as well. Thanks! Michael
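While the Xcode preview is broken, the package can still be exercised programmatically to confirm the model itself is sound; a hedged Swift sketch, with a hypothetical file path and the input/output names taken from the metadata above:

```swift
import CoreML

// A sketch: compile and load the .mlpackage at runtime, bypassing the
// Xcode model editor entirely. The path below is hypothetical.
func loadBallTracker() throws -> MLModel {
    let packageURL = URL(fileURLWithPath: "/path/to/BallTracker.mlpackage")
    // Synchronous compile variant, fine for a one-off sanity check.
    let compiledURL = try MLModel.compileModel(at: packageURL)
    return try MLModel(contentsOf: compiledURL)
}

func runOnce(model: MLModel) throws {
    // Input shape taken from the posted metadata: Float32 [1, 9, 360, 640].
    let input = try MLMultiArray(shape: [1, 9, 360, 640], dataType: .float32)
    let provider = try MLDictionaryFeatureProvider(dictionary: ["input_frames": input])
    let output = try model.prediction(from: provider)
    print(output.featureNames)  // expect "var_462" per the metadata
}
```

If this loads and predicts, the model is fine and the problem is confined to how Xcode displays the package.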
1
0
633
Jan ’25
CoreML Conversion Display Issues
Hello! I have a TrackNet model that I have converted to Core ML (.mlpackage) using coremltools, and the conversion process appears to go smoothly, as I get the .mlpackage file I am looking for, with the weights and the model.mlmodel file in the folder. However, when I drag it into Xcode, it just shows up as four script tags instead of the model "interface" that is typically expected. I was initially concerned that my model was not compatible with Core ML, but upon logging the conversion, everything seems to be converted properly. Here is some code that may be relevant in debugging this issue.

How I use the model:

```python
model = BallTrackerNet()  # the model architecture, defined below
device = self.device  # cpu
model.load_state_dict(torch.load("models/balltrackerbest.pt", map_location=device))  # balltrackerbest is the weights
model = model.to(device)
model.eval()
```

Here is the BallTrackerNet() model itself:

```python
import torch
import torch.nn as nn

class ConvBlock(nn.Module):
    def __init__(self, in_channels, out_channels, kernel_size=3, pad=1, stride=1, bias=True):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_channels, out_channels, kernel_size, stride=stride, padding=pad, bias=bias),
            nn.ReLU(),
            nn.BatchNorm2d(out_channels)
        )

    def forward(self, x):
        return self.block(x)

class BallTrackerNet(nn.Module):
    def __init__(self, out_channels=256):
        super().__init__()
        self.out_channels = out_channels
        self.conv1 = ConvBlock(in_channels=9, out_channels=64)
        self.conv2 = ConvBlock(in_channels=64, out_channels=64)
        self.pool1 = nn.MaxPool2d(kernel_size=2, stride=2)
        self.conv3 = ConvBlock(in_channels=64, out_channels=128)
        self.conv4 = ConvBlock(in_channels=128, out_channels=128)
        self.pool2 = nn.MaxPool2d(kernel_size=2, stride=2)
        self.conv5 = ConvBlock(in_channels=128, out_channels=256)
        self.conv6 = ConvBlock(in_channels=256, out_channels=256)
        self.conv7 = ConvBlock(in_channels=256, out_channels=256)
        self.pool3 = nn.MaxPool2d(kernel_size=2, stride=2)
        self.conv8 = ConvBlock(in_channels=256, out_channels=512)
        self.conv9 = ConvBlock(in_channels=512, out_channels=512)
        self.conv10 = ConvBlock(in_channels=512, out_channels=512)
        self.ups1 = nn.Upsample(scale_factor=2)
        self.conv11 = ConvBlock(in_channels=512, out_channels=256)
        self.conv12 = ConvBlock(in_channels=256, out_channels=256)
        self.conv13 = ConvBlock(in_channels=256, out_channels=256)
        self.ups2 = nn.Upsample(scale_factor=2)
        self.conv14 = ConvBlock(in_channels=256, out_channels=128)
        self.conv15 = ConvBlock(in_channels=128, out_channels=128)
        self.ups3 = nn.Upsample(scale_factor=2)
        self.conv16 = ConvBlock(in_channels=128, out_channels=64)
        self.conv17 = ConvBlock(in_channels=64, out_channels=64)
        self.conv18 = ConvBlock(in_channels=64, out_channels=self.out_channels)
        self.softmax = nn.Softmax(dim=1)
        self._init_weights()

    def forward(self, x, testing=False):
        batch_size = x.size(0)
        x = self.conv1(x)
        x = self.conv2(x)
        x = self.pool1(x)
        x = self.conv3(x)
        x = self.conv4(x)
        x = self.pool2(x)
        x = self.conv5(x)
        x = self.conv6(x)
        x = self.conv7(x)
        x = self.pool3(x)
        x = self.conv8(x)
        x = self.conv9(x)
        x = self.conv10(x)
        x = self.ups1(x)
        x = self.conv11(x)
        x = self.conv12(x)
        x = self.conv13(x)
        x = self.ups2(x)
        x = self.conv14(x)
        x = self.conv15(x)
        x = self.ups3(x)
        x = self.conv16(x)
        x = self.conv17(x)
        x = self.conv18(x)
        # x = self.softmax(x)
        out = x.reshape(batch_size, self.out_channels, -1)
        if testing:
            out = self.softmax(out)
        return out

    def _init_weights(self):
        for module in self.modules():
            if isinstance(module, nn.Conv2d):
                nn.init.uniform_(module.weight, -0.05, 0.05)
                if module.bias is not None:
                    nn.init.constant_(module.bias, 0)
            elif isinstance(module, nn.BatchNorm2d):
                nn.init.constant_(module.weight, 1)
                nn.init.constant_(module.bias, 0)
```

I have been struggling with this conversion for almost 2 weeks now, so any help, ideas, or pointers would be greatly appreciated! Thanks! Michael
13
0
1k
Jan ’25
Swift Package Manager - Package Download Issue
We have developed a custom iOS framework called PaySDK. Earlier we distributed the framework as PaySDK.xcframework.zip through GitHub (private repo), with two dependent xcframeworks. Now one of our clients is asking us to distribute the framework through Swift Package Manager. I created a new private repo in GitHub, created a new release (iOSSDK_SPM_Test) with tag 1.0.0, uploaded the frameworks below as assets, updated the download paths in Package.swift, and pushed to the GitHub main branch:

PaySDK.xcframework.zip
PaySDKDependentOne.xcframework.zip
PaySDKDependentTwo.xcframework.zip

When I try to integrate (testing) https://github.com/YuvaRepo/iOSSDK_SPM_Test in Xcode, I am not able to download the frameworks; the download path points to some old path (maybe a cache: https://github.com/YuvaRepo/iOSSDK_SPM/releases/download/1.2.0/PaySDK.xcframework.zip).

Package.swift:

```swift
// swift-tools-version:5.3
import PackageDescription

let package = Package(
    name: "iOSSDK_SPM_Test",
    platforms: [
        .iOS(.v13)
    ],
    products: [
        // Products define the executables and libraries a package produces,
        // making them visible to other packages.
        .library(
            name: "iOSSDK_SPM_Test",
            targets: ["PaySDK", "PaySDKDependentOne", "PaySDKDependentTwo"]
        )
    ],
    targets: [
        // Targets are the basic building blocks of a package, defining a module
        // or a test suite. Targets can depend on other targets in this package
        // and products from dependencies.
        .binaryTarget(
            name: "PaySDK",
            url: "https://github.com/YuvaRepo/iOSSDK_SPM_Test/releases/download/1.0.0/PaySDK.xcframework.zip",
            checksum: " checksum "
        ),
        .binaryTarget(
            name: "PaySDKDependentOne",
            url: "https://github.com/YuvaRepo/iOSSDK_SPM_Test/releases/download/1.0.0/PaySDKDependentOne.xcframework.zip",
            checksum: " checksum "
        ),
        .binaryTarget(
            name: "PaySDKDependentTwo",
            url: "https://github.com/YuvaRepo/iOSSDK_SPM_Test/releases/download/1.0.0/PaySDKDependentTwo.xcframework.zip",
            checksum: " checksum "
        ),
        .testTarget(
            name: "iOSSDK_SPM_TestTests",
            dependencies: ["PaySDK", "PaySDKDependentOne", "PaySDKDependentTwo"]
        )
    ]
)
```

Steps I followed: I removed the local repo, cloned it fresh, and cleared the caches:

```
rm -rf ~/Library/Caches/org.swift.swiftpm/
rm -rf ~/Library/Developer/Xcode/DerivedData/*
```

Can anyone help identify the issue and resolve it? Thanks in advance.
0
0
397
Jan ’25
"Network Link Conditioner" in "Additional Tools for Xcode 13" can not be loaded on Big Sur 11.6 (20G165)
I've just downloaded "Additional Tools for Xcode 13" after today's release and installed "Network Link Conditioner.prefPane" on my macOS Big Sur 11.6 (20G165). But it just doesn't work: every time I try to open it, an error appears. Does the tool only support macOS Monterey, which Xcode 13.0 (13A233) does not support? BTW, the workaround for me is using the tool from "Additional Tools for Xcode 12.5"...
16
1
28k
Jan ’25
Code with Swift Assist
Hello, I would like to inquire about the release date of Swift Assist’s beta version. Apple has stated that it will be released later this year, but they have not provided a specific date or time. Could you please provide information on the beta version’s release date? Additionally, is there a trial version available? If so, when was it released? Thank you for your assistance.
2
1
2.4k
Jan ’25
Failed Message when running the default app.
Hi, my environment is:

Xcode: Version 16.2 (16C5032a)
macOS Sequoia: Version 15.1

I have two problems. Please advise.

1. Failed message. When I run the automatically generated app as-is, the following error (warning?) messages appear in the terminal:

```
Can't find or decode reasons
Failed to get or decode unavailable reasons
NSBundle file:///System/Library/PrivateFrameworks/MetalTools.framework/ principal class is nil because all fallbacks have failed
```

2. Not on the simulator. The app does not run in the simulator; instead it appears as a window. (The simulator works fine when launched separately, but the app from the current project doesn't show up in it.)
1
0
335
Jan ’25
Xcode 16.2 iOS error = CFMessagePort
Hi, after updating to Xcode 16.2, I am getting an error running all iOS apps. I'm using Sequoia 15.2.

Error creating the CFMessagePort needed to communicate with PPT.

I ran the default Hello World app and got the same error message (plus another error not showing up in my apps: Failed to send CA Event for app launch measurements for ca_event_type: 1 event_name: com.apple.app_launch_measurement.ExtendedLaunchMetrics). Can I ignore the error, or is it truly affecting my apps? (All are TestFlight versions, and it's fine if I don't update them for a while, although not ideal.) Hopefully someone can help! Thanks, -Ashley
11
7
3.2k
Jan ’25
RealityKit Entities Appear to Lag in a Full or Progressive Style Immersive Space When Opened with the Environment Turned On
PLATFORM AND VERSION
visionOS
Development environment: Xcode 16.2, macOS 15.2
Run-time configuration: visionOS 2.3 (on a real device, not the simulator)

Please, someone confirm I'm not crazy and this issue is actually out of my control. I spent hours trying to fix my app and running profiles because I thought it was an issue related to my app's performance. I finally considered the chance that it was an issue with the API itself, made a sample app to isolate the problem, and the issue still existed there.

The issue: when a model entity moves around in a full space that was launched while the system environment immersion was turned up, the entity looks very choppy as it moves. If you take off the headset while still in the space and put it back on, this fixes it, and the entity then moves smoothly as it should. Alternatively, you can leave the space and turn the system environment immersion all the way down before launching the full space again; this also makes the entity move smoothly. If you launch with a mixed immersion style instead of a full immersion style, this issue never arises. The issue only arises if you launch the space with either a full or progressive style while the system immersion level is turned up.

STEPS TO REPRODUCE
https://github.com/nathan-707/ChoppyEntitySample
Open my test project; it's a small, modified visionOS project template that shows the issue clearly. Otherwise:

1. Create an immersive space with either the full or progressive immersion style.
2. Set up an entity in kinematic mode and apply a velocity to it so it passes over your head when the space appears (see the sketch below).
3. If you opened the space while the Apple Vision Pro's system environment was turned up, the entity will look choppy.
4. If you take the headset off while in the same space and put it back on, the issue is fixed and the motion looks smooth.
5. Alternatively, if you open the space with the system immersion environment all the way down, you will not run into the issue.

Again, the issue also does not happen if the space launched uses the mixed style.
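For reference, a minimal sketch of the entity setup described in the steps above (a kinematic body plus a linear velocity); the mesh, sizes, and velocity values are illustrative, not taken from the sample repo:

```swift
import RealityKit

// Build a kinematic sphere and give it a velocity so it passes
// overhead when the immersive space appears.
func makeMovingEntity() -> ModelEntity {
    let entity = ModelEntity(
        mesh: .generateSphere(radius: 0.2),
        materials: [SimpleMaterial(color: .white, isMetallic: false)]
    )
    entity.position = [0, 1.5, 2]  // start in front of and above the viewer

    // Kinematic bodies are driven by velocity rather than by forces.
    entity.components.set(PhysicsBodyComponent(
        massProperties: .default,
        material: nil,
        mode: .kinematic
    ))
    entity.components.set(PhysicsMotionComponent(linearVelocity: [0, 0, -2]))
    return entity
}
```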
1
0
528
Jan ’25
Getting a datagram too large error while writing back to NEAppProxyUDPFlow
I am trying to set up an extension using NEDNSProxyProvider that intercepts DNS traffic over UDP, inserts our custom device identifier, and sends the query to our custom DNS server, whose response I forward to the requesting client. I have been able to append the identifier to the domain name when sending the request to our custom DNS server, and I am getting the response back just fine, but when I try to write the response to the UDP flow I get this error in the Console logs:

```
Error Domain=NEAppProxyFlowErrorDomain Code=9 "The datagram was too large" UserInfo={NSLocalizedDescription=The datagram was too large}
```

Here is what I have tried so far:

1. Truncating the datagram to less than 10 bytes.
2. Sending a dummy Data object when writing to the flow.
3. Double-checking Signing & Capabilities for both targets, the app and the network extension.

Attached below is code from my NEDNSProxyProvider. The DNS request is processed in the handleNewFlow function, which calls processUDPFlow:

```swift
override func handleNewFlow(_ flow: NEAppProxyFlow) -> Bool {
    if flow is NEAppProxyTCPFlow {
        NSLog("BDDNSProxyProvider : Is TCP Flow...")
    } else if let udpFlow = flow as? NEAppProxyUDPFlow {
        NSLog("BDDNSProxyProvider: handleNewFlow : \(udpFlow)")
        processUDPFlow(udpFlow) // <--
    }
    return true
}
```

In the code below I concatenate the domain name in the request with the device ID and send it to our server. The log lines are left in; please ignore them.

```swift
// Read incoming DNS packets from the client.
self.udpAppProxyFlow = udpFlow
udpFlow.readDatagrams { datagrams, error in
    if let error = error {
        NSLog("Error reading datagrams: \(error.localizedDescription)")
        return
    }
    guard let datagrams = datagrams else {
        NSLog("No datagrams received.")
        return
    }
    // Forward each DNS packet to the custom DNS server.
    for (index, packet) in datagrams.enumerated() {
        let dnsMessage = self.parseDNSMessage(from: packet.0)
        NSLog("tDatagram Header: \(dnsMessage.header)")
        for question in dnsMessage.questions {
            NSLog("tDatagram Question: \(question.name), Type: \(question.type), Class: \(question.klass)")
        }
        for answer in dnsMessage.answers {
            NSLog("tDatagram Answer: \(answer.name), Type: \(answer.type), Data: \(answer.data)")
        }
        let oldDomain = self.extractDomainName(from: packet.0)!
        let packetWithNewDomain = self.replaceDomainName(in: packet.0, with: "827-\(oldDomain)") // func to append device ID
        NSLog("Packet's new domain \(self.extractDomainName(from: packetWithNewDomain ?? packet.0) ?? "Found nil")")
        self.sendToCustomDNSServer(packetWithNewDomain!) { responseDatagram in
            guard let responseDatagram = responseDatagram else {
                NSLog("Failed to get a response from the custom DNS server")
                return
            }
            let tDatagram = (responseDatagram, packet.1)
            udpFlow.writeDatagrams([tDatagram]) { error in
                if let error = error {
                    NSLog("Failed to write DNS response back to client: \(error)")
                } else {
                    NSLog("Successfully wrote DNS response back to client.")
                }
            }
        }
    }
    // Continue reading datagrams - DO NOT REMOVE!
    self.processUDPFlow(udpFlow)
}
```

Following is the function I use to replace the domain name:

```swift
// Ensure the datagram is at least the size of the DNS header.
guard datagram.count > 12 else {
    NSLog("Error : Invalid datagram: Too small to contain a DNS header")
    return nil
}
NSLog("BDLine 193")
// Start reading after the header (12 bytes).
var offset = 12
// Parse the original domain name.
while offset < datagram.count {
    let length = Int(datagram[offset]) // Get the length of the next label
    offset += 1
    // Check for the null terminator (end of domain name).
    if length == 0 {
        // Domain name ends here.
        break
    }
    // Validate that the length is within bounds.
    guard offset + length <= datagram.count else {
        NSLog("Error : Invalid datagram: Domain name length exceeds packet size")
        return nil
    }
    // Skip over this label.
    offset += length
}
```

Everything is falling into place other than this last error I get when I try to write back to the flow. What am I missing here, and how can I resolve this issue? Any help would be appreciated. Thanks
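One hedged possibility to rule out (a sketch, not a verified fix for this exact error): classic DNS over UDP caps responses at 512 bytes unless the client advertised a larger EDNS buffer, so an oversized upstream reply can be clamped and flagged as truncated before being written back:

```swift
import Foundation

// A sketch, assuming the oversized upstream reply is the culprit:
// clamp the response and set the TC (truncated) bit so the client
// retries over TCP. A production version should truncate on whole
// resource-record boundaries and fix up the header count fields.
func clampedDNSResponse(_ response: Data, maxSize: Int = 512) -> Data {
    guard response.count > maxSize else { return response }
    var truncated = response.prefix(maxSize)
    // Byte 2 of the DNS header holds QR/Opcode/AA/TC/RD; TC is bit 0x02.
    if truncated.count > 2 {
        truncated[truncated.startIndex + 2] |= 0x02
    }
    return Data(truncated)
}
```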
1
0
330
Jan ’25
Not able to save with SwiftData. "The file “default.store” couldn’t be opened."
I get this message when trying to save my Models:

```
CoreData: error: SQLCore dispatchRequest: exception handling request: <NSSQLSaveChangesRequestContext: 0x303034540>, I/O error for database at /var/mobile/Containers/Data/Application/726ECA8C-6C67-4BFE-89E7-AFD8A83CAA5D/Library/Application Support/default.store. SQLite error code:1, 'no such table: ZCALENDARMODEL' with userInfo of {
    NSFilePath = "/var/mobile/Containers/Data/Application/726ECA8C-6C67-4BFE-89E7-AFD8A83CAA5D/Library/Application Support/default.store";
    NSSQLiteErrorDomain = 1;
}
SwiftData.DefaultStore save failed with error: Error Domain=NSCocoaErrorDomain Code=256 "The file “default.store” couldn’t be opened." UserInfo={NSFilePath=/var/mobile/Containers/Data/Application/726ECA8C-6C67-4BFE-89E7-AFD8A83CAA5D/Library/Application Support/default.store, NSSQLiteErrorDomain=1}
```

The app has Recipes and Calendars, and the user can select a Recipe for each Calendar day. The recipe should not be referenced; it should be saved by SwiftData along with the Calendar.

```swift
import SwiftUI
import SwiftData

enum CalendarSource: String, Codable {
    case created
    case imported
}

@Model
class CalendarModel: Identifiable, Codable {
    var id: UUID = UUID()
    var name: String
    var startDate: Date
    var endDate: Date
    var recipes: [String: RecipeData] = [:]
    var thumbnailData: Data?
    var source: CalendarSource?

    // Computed Properties
    var daysBetween: Int {
        let days = Calendar.current.dateComponents([.day], from: startDate.midnight, to: endDate.midnight).day ?? 0
        return days + 1
    }

    var allDates: [Date] {
        startDate.midnight.allDates(upTo: endDate.midnight)
    }

    var thumbnailImage: Image? {
        if let data = thumbnailData, let uiImage = UIImage(data: data) {
            return Image(uiImage: uiImage)
        } else {
            return nil
        }
    }

    // Initializer
    init(name: String, startDate: Date, endDate: Date, thumbnailData: Data? = nil, source: CalendarSource? = .created) {
        self.name = name
        self.startDate = startDate
        self.endDate = endDate
        self.thumbnailData = thumbnailData
        self.source = source
    }

    // Convenience initializer to create a copy of an existing calendar
    static func copy(from calendar: CalendarModel) -> CalendarModel {
        let copiedCalendar = CalendarModel(
            name: calendar.name,
            startDate: calendar.startDate,
            endDate: calendar.endDate,
            thumbnailData: calendar.thumbnailData,
            source: calendar.source
        )
        // Copy recipes
        copiedCalendar.recipes = calendar.recipes.mapValues { $0 }
        return copiedCalendar
    }

    // Codable Conformance
    private enum CodingKeys: String, CodingKey {
        case id, name, startDate, endDate, recipes, thumbnailData, source
    }

    required init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        id = try container.decode(UUID.self, forKey: .id)
        name = try container.decode(String.self, forKey: .name)
        startDate = try container.decode(Date.self, forKey: .startDate)
        endDate = try container.decode(Date.self, forKey: .endDate)
        recipes = try container.decode([String: RecipeData].self, forKey: .recipes)
        thumbnailData = try container.decodeIfPresent(Data.self, forKey: .thumbnailData)
        source = try container.decodeIfPresent(CalendarSource.self, forKey: .source)
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(id, forKey: .id)
        try container.encode(name, forKey: .name)
        try container.encode(startDate, forKey: .startDate)
        try container.encode(endDate, forKey: .endDate)
        try container.encode(recipes, forKey: .recipes)
        try container.encode(thumbnailData, forKey: .thumbnailData)
        try container.encode(source, forKey: .source)
    }
}
```

```swift
import SwiftUI

struct RecipeData: Codable, Identifiable {
    var id: UUID = UUID()
    var name: String
    var ingredients: String
    var steps: String
    var thumbnailData: Data?

    // Computed property to convert thumbnail data to a SwiftUI Image
    var thumbnailImage: Image? {
        if let data = thumbnailData, let uiImage = UIImage(data: data) {
            return Image(uiImage: uiImage)
        } else {
            return nil // No image
        }
    }

    init(recipe: RecipeModel) {
        self.name = recipe.name
        self.ingredients = recipe.ingredients
        self.steps = recipe.steps
        self.thumbnailData = recipe.thumbnailData
    }
}
```

```swift
import SwiftUI
import SwiftData

@Model
class RecipeModel: Identifiable, Codable {
    var id: UUID = UUID()
    var name: String
    var ingredients: String
    var steps: String
    var thumbnailData: Data? // Store the image data for the thumbnail

    static let fallbackSymbols = ["book.pages.fill", "carrot.fill", "fork.knife", "stove.fill"]

    // Computed property to convert thumbnail data to a SwiftUI Image
    var thumbnailImage: Image? {
        if let data = thumbnailData, let uiImage = UIImage(data: data) {
            return Image(uiImage: uiImage)
        } else {
            return nil // No image
        }
    }

    // MARK: - Initializer
    init(name: String, ingredients: String = "", steps: String = "", thumbnailData: Data? = nil) {
        self.name = name
        self.ingredients = ingredients
        self.steps = steps
        self.thumbnailData = thumbnailData
    }

    // MARK: - Copy Function
    func copy() -> RecipeModel {
        RecipeModel(
            name: self.name,
            ingredients: self.ingredients,
            steps: self.steps,
            thumbnailData: self.thumbnailData
        )
    }

    // MARK: - Codable Conformance
    private enum CodingKeys: String, CodingKey {
        case id, name, ingredients, steps, thumbnailData
    }

    required init(from decoder: Decoder) throws {
        ...
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(id, forKey: .id)
        try container.encode(name, forKey: .name)
        try container.encode(ingredients, forKey: .ingredients)
        try container.encode(steps, forKey: .steps)
        try container.encode(thumbnailData, forKey: .thumbnailData)
    }
}
```
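"no such table: ZCALENDARMODEL" usually indicates the on-disk store predates CalendarModel's membership in the container's schema. A sketch of one thing to verify, assuming a template-style app entry point (the app type name is hypothetical; the model names come from the post):

```swift
import SwiftUI
import SwiftData

@main
struct RecipeCalendarApp: App {  // hypothetical app name
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        // Every @Model type that gets persisted must be part of the schema;
        // if CalendarModel is missing here, its table is never created.
        .modelContainer(for: [CalendarModel.self, RecipeModel.self])
    }
}
```

During development, deleting the app (and with it default.store) after a schema change is often enough to get past this, since SwiftData then recreates the store with the full schema.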
1
0
880
Jan ’25
Why does Xcode's Minimum Deployments only let you select N.6?
I want to set the minimum deployment target to 16.0; however, Xcode (16.2) won't let me select that. The drop-down box shows 18, 17, 16, and 15, but if any of these is selected, Xcode sets it to 18.6, 17.6, 16.6, or 15.6 (screenshot not shown). If an attempt is made to edit the value manually to 16.0, then after the change Xcode just deletes the value and sets it to nothing. What's going on here? Why does Xcode only allow versions ending in .6, and why won't it let you edit the value manually?
1
0
434
Jan ’25
Why doesn’t getAPI() show up in autocomplete despite having a default implementation in a protocol extension?
I’m working on a project in Xcode 16.2 and encountered an issue where getAPI(), which has a default implementation in a protocol extension, doesn’t show up in autocomplete. Here’s a simplified version of the code:

```swift
import Foundation

public protocol Repository {
    func getAPI(from url: String?)
}

extension Repository {
    public func getAPI(from url: String? = "https://...") {
        getAPI(from: url)
    }
}

final class _Repository: Repository {
    func getAPI(from url: String?) {
        // Task...
    }
}

let repo: Repository = _Repository()
repo.getAPI( // Autocomplete doesn't suggest getAPI()
```

I’ve tried the following without success:

• Clean build folder
• Restart Xcode
• Reindexing

Is there something wrong with the code, or is this a known issue with Xcode 16.2? I’d appreciate any insights or suggestions.
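One workaround sketch: default arguments on protocol extension methods are substituted at the call site, and completion on the existential sometimes only offers the protocol requirement, so exposing an explicit zero-argument overload gives it a concrete method to list:

```swift
import Foundation

public protocol Repository {
    func getAPI(from url: String?)
}

extension Repository {
    // Explicit overload instead of a default argument; code completion
    // lists it as an ordinary extension method on the existential.
    public func getAPI() {
        getAPI(from: "https://...")
    }
}
```

This keeps the call sites identical (repo.getAPI()) while sidestepping the default-argument indexing behavior.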
3
0
523
Jan ’25
Xcode "Connect via network" is stuck, controls disabled
Hi, for a while now I've noticed that in Xcode 15 (beta 8), with my iOS device running iOS 17 (latest beta), I can't seem to disable the "Connect via Network" setting in the Devices and Simulators window. The control is disabled and stuck on. I often also have issues with the "Installing to device" step while developing my apps, where a reboot of the iPad is required. My guess is that Xcode gets stuck or confused about how it is supposed to deploy; my network setup is a bit complicated due to VPNs, tight Wi-Fi security, etc. Unpairing the device doesn't help with resetting this setting. After unpairing, the top-right header (with the Take Screenshot controls, etc.) still shows the details of the unpaired device. Has anyone else experienced this, or does anyone know a solution?
10
14
5.9k
Jan ’25