Dive into the world of programming languages used for app development.


Xcode Instruments CPU Profiler not logging os_signpost Points of Interest
If I create a new project with the following code in main.swift and then profile it in Instruments with the CPU Profiler template, nothing is logged in the Points of Interest category. I'm not sure if this is related to the recent macOS 14.2 update, but I'm running Xcode 15.1.

```swift
import Foundation
import OSLog

let signposter = OSSignposter(subsystem: "hofstee.test", category: .pointsOfInterest)

// os_signpost event #1
signposter.emitEvent("foo")

// os_signpost event #2
signposter.withIntervalSignpost("bar") {
    print("Hello, World!")
}
```

If I change the template to the System Trace template and profile again, the two os_signpost events show up as expected. This used to work before, and this is a completely clean Xcode project created from the macOS Command Line Tool template. I'm not sure what's going on, and searching for answers hasn't been fruitful. Changing the bundle ID doesn't have any effect either.
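As a cross-check that the signposts themselves are being emitted (this is standard OSSignposter usage, not a fix for the Instruments template issue), the same interval can also be driven manually with the begin/end API:

```swift
import Foundation
import OSLog

let signposter = OSSignposter(subsystem: "hofstee.test", category: .pointsOfInterest)

// Begin a named interval and keep the returned state token so the
// matching end call can be associated with this specific interval.
let state = signposter.beginInterval("bar")
print("Hello, World!")
signposter.endInterval("bar", state)
```

If these intervals appear under System Trace but not CPU Profiler, that points at the template's instrument configuration rather than the signpost code.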
Replies: 1 · Boosts: 0 · Views: 1.1k · Dec ’23
CPListImageRowItem customisation
Hi, I was trying to build the UI shown above using the following CPListImageRowItem code:

```swift
func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                              didConnect interfaceController: CPInterfaceController) {
    self.interfaceController = interfaceController

    // Create a list row item with images
    let item = CPListImageRowItem(text: "Category",
                                  images: [UIImage(named: "cover.jpeg")!,
                                           UIImage(named: "cover2.jpeg")!,
                                           UIImage(named: "discover.jpeg")!,
                                           UIImage(named: "thumbnail.jpeg")!])

    // Create a list section
    let section = CPListSection(items: [item])

    // Create a list template with the section
    let listTemplate = CPListTemplate(title: "Your Template Title", sections: [section])

    // Set the template on the interface controller
    interfaceController.setRootTemplate(listTemplate, animated: true)
}
```

I get the header and the image items, but I can find no way to set the detail text under the images. Can anyone help me out with this?
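As far as I can tell, CPListImageRowItem does not expose a per-image caption. If detail text under each entry is required, one workaround sketch (the titles and asset names below are hypothetical) is to use regular CPListItems instead, which do support detailText:

```swift
import CarPlay
import UIKit

// Sketch: build a section of CPListItems that carry a caption via detailText.
func makeCaptionedSection() -> CPListSection {
    let titles = ["Cover", "Cover 2", "Discover", "Thumbnail"]
    let assets = ["cover.jpeg", "cover2.jpeg", "discover.jpeg", "thumbnail.jpeg"]
    let items = zip(titles, assets).map { title, asset in
        CPListItem(text: title,
                   detailText: "Detail for \(title)",   // the caption under discussion
                   image: UIImage(named: asset))
    }
    return CPListSection(items: items)
}
```

The trade-off is one item per row rather than a horizontal image row.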
Replies: 1 · Boosts: 1 · Views: 840 · Dec ’23
iOS/iPadOS 17.2 Bug in PDFKit duplicates content of filled PDF form fields
The following situation is given: in your code you have a PDFDocument object containing a PDF file with form fields that are filled with text. If you call the dataRepresentation() method on this PDFDocument object, you get back a Data object:

```swift
let myDocumentData: Data = pdfDocument.dataRepresentation()!
```

If you now initialize a new PDFDocument object with this data object, the contents of the form fields are duplicated within the PDF:

```swift
let newPDF: PDFDocument = PDFDocument(data: myDocumentData)!
```

If you then print the newPDF object by creating a new print job, the form-field content appears duplicated. You only see this behavior when you print or share the PDF; you won't see it inside a PDFView in your application. This behavior only appears since iOS/iPadOS 17.2.

Steps to reproduce:
1. Get a PDF file with form fields, especially text fields.
2. Fill out these text fields with text.
3. Create a PDFDocument object from this PDF file.
4. Call the dataRepresentation() method on this PDFDocument object and store the result in a new variable.
5. Create a new PDFDocument object with the data from the previous step: PDFDocument(data: <data>).
6. Print the newly created PDFDocument on iOS/iPadOS 17.2, or share it, for example via email.

I hope Apple will fix this bug soon!
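Until the bug is fixed, one possible workaround (a sketch, not an official fix) is to flatten the filled document by redrawing each page into a fresh PDF context before printing or sharing. Note that the resulting PDF no longer has editable form fields:

```swift
import PDFKit
import UIKit

// Sketch: render each page of a filled PDFDocument into a new, flattened PDF.
func flattened(_ document: PDFDocument) -> Data {
    let renderer = UIGraphicsPDFRenderer(bounds: .zero)
    return renderer.pdfData { context in
        for index in 0..<document.pageCount {
            guard let page = document.page(at: index) else { continue }
            let bounds = page.bounds(for: .mediaBox)
            context.beginPage(withBounds: bounds, pageInfo: [:])
            // PDF page coordinates are flipped relative to UIKit's.
            context.cgContext.translateBy(x: 0, y: bounds.height)
            context.cgContext.scaleBy(x: 1, y: -1)
            page.draw(with: .mediaBox, to: context.cgContext)
        }
    }
}
```

Hand the returned Data to the print job or share sheet instead of the round-tripped PDFDocument.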
Replies: 5 · Boosts: 4 · Views: 1.2k · Dec ’23
PDFKit since iPadOS 17.2: Annotations are scaled down when saving file
Dear Developer Community, my app saves handwritten notes, forms, and images as annotations in PDF documents using PDFKit. Since iPadOS 17.2, the content of an annotation is scaled down within the annotation boundaries when the annotated PDF file is saved. That is, the annotation boundaries remain unchanged, but the displayed annotation content shrinks and no longer fills the boundaries. This gets worse with every save operation and applies both to newly created annotations and to elements that were saved before iPadOS 17.2. The issue only appeared after updating to iPadOS 17.2; the same code on my test device with iPadOS 17.1 works perfectly. Does anybody have a similar issue and/or a workaround for this problem? Thanks for any ideas!
Replies: 2 · Boosts: 1 · Views: 923 · Dec ’23
Where is help on Swift documentation markup?
I am reluctant to admit that I only learned a few months ago that Swift provides a built-in documentation markup syntax.

```swift
/**
 Test func

 Some description here.

 - Parameters:
   - b: Test
   - d: Test
   - f: Test
 - Returns: Bool
 */
func myMethod(a b: Int, c d: Int, e f: Int) -> Bool {
    b > d
}
```

It seems the markup is pretty simple and has only a few keywords, but I want to read through the complete reference. Any useful pointers?
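For reference, the same markup also works in line-comment form (///) and supports a few more common fields; Xcode renders all of this in Quick Help when you Option-click the symbol:

```swift
/// Returns whether `b` is strictly greater than `d`.
///
/// - Parameter b: The left-hand value.
/// - Parameter d: The right-hand value.
/// - Returns: `true` if `b > d`, otherwise `false`.
/// - Note: Other keywords include `- Throws:`, `- Precondition:`,
///   and `- Important:`.
func isGreater(a b: Int, c d: Int) -> Bool {
    b > d
}
```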
Replies: 2 · Boosts: 0 · Views: 827 · Dec ’23
QLPreviewController identify markup mode
Is there any way to identify whether a QLPreviewController is in markup mode? I have a custom QLPreviewController which is used to preview and edit images in my app. I need to detect when it enters markup mode and make some changes to the navigation bar based on this, but I could not find any property or delegate method for it. Any help would be appreciated.
Replies: 0 · Boosts: 0 · Views: 542 · Dec ’23
Xcode/Swift/macOS app enablement problem
I have a macOS screenshot app that was created in 2014. I've recently updated some code and libraries and am having problems with the transfer of the screenshot enablement to the new app. Basically, if a user had the old version of my app, they have to delete the old enablement, start the new app, and then re-enable the updated version of the app. Why is this happening? It's confusing because the user sees that my app is enabled, but the enablement isn't working.
Replies: 1 · Boosts: 0 · Views: 428 · Dec ’23
How to address elements in an EnvironmentObject Array
I am new to programming apps, and I am trying to figure out how to use one item out of an array of items in a Text line. I am not getting any complaints from Xcode, but the preview crashes, giving me a huge error report that it keeps sending to Apple. I have cut out all the extra stuff from the app to get down to the basics. What I want it to print on the screen is "Hello Bill How are you?", with Bill coming from the observable array. The first picture below is from about 2 seconds after I removed the // from in front of the line that reads Text("\(friend2.friend2[1].name)"). The other two pictures are the main app page and the page where I set up the array. At the very bottom is a text file of the huge report it kept sending to Apple, until I turned off the option of sending to Apple. Would someone please explain what I am doing wrong? Well, aside from probably everything. Error code.txt
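For anyone hitting the same crash, a minimal sketch of the pattern (the type and property names here are hypothetical, since the original project isn't shown) looks like this. Two things commonly cause the preview crash: the preview must inject the object with .environmentObject, and subscripting the array must stay in bounds:

```swift
import SwiftUI

// Hypothetical model matching the description in the post.
class FriendStore: ObservableObject {
    @Published var friends = ["Anna", "Bill", "Cleo"]
}

struct GreetingView: View {
    @EnvironmentObject var store: FriendStore

    var body: some View {
        // indices.contains guards against an out-of-range crash.
        let name = store.friends.indices.contains(1) ? store.friends[1] : "friend"
        Text("Hello \(name) How are you?")
    }
}

#Preview {
    // Without this injection, the preview crashes at runtime.
    GreetingView().environmentObject(FriendStore())
}
```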
Replies: 1 · Boosts: 0 · Views: 649 · Dec ’23
How to declare a Protocol that conforms to Observable
I'm learning Swift. From the language docs, it seems to me that this should be possible, but I get the compiler error shown in the comment below. Can someone explain why this is illegal?

```swift
import SwiftUI

struct ContentView: View {
    // 'init(wrappedValue:)' is unavailable: The wrapped value must be an object that conforms to Observable
    @Bindable var failing: SomeProtocol

    var body: some View {
        Text(failing.foo)
    }
}

protocol SomeProtocol: Observable {
    var foo: String { get }
}
```

The use case is adapting different class hierarchies to be displayed by the same View: final SwiftData model classes, and dynamic models supporting editing and calculation during model creation. Apple's docs on Observation and protocol inheritance have led me to believe this should be possible. Thanks for taking the time to read (and reply).
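One way around this, as a sketch under the assumption that every conforming model is a class: @Bindable wraps a concrete Observable object, not an existential (the protocol type SomeProtocol itself does not conform to Observable), so the view can instead be made generic over the concrete conforming type:

```swift
import SwiftUI

protocol SomeProtocol: Observable {
    var foo: String { get }
}

// Generic over a concrete conforming class, so @Bindable sees an
// Observable object rather than the existential `any SomeProtocol`.
struct ModelView<Model: SomeProtocol & AnyObject>: View {
    @Bindable var model: Model

    var body: some View {
        Text(model.foo)
    }
}
```

The same ModelView then works for both the SwiftData models and the dynamic editing models, with the concrete type fixed at each use site, e.g. `ModelView(model: someConcreteModel)`.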
Replies: 1 · Boosts: 0 · Views: 403 · Dec ’23
Core Bluetooth cannot connect to AirPods
Hello everyone. I'm using Core Bluetooth, but currently I cannot connect to AirPods. Details as follows: I am unable to scan for unconnected devices; if a device is already paired, it can be discovered, but the connection never succeeds. I searched Google for related posts but couldn't find anything that solves the problem. Are there any relevant restrictions on AirPods, or is there another real solution? Thank you all.
Replies: 0 · Boosts: 0 · Views: 381 · Dec ’23
Store location data when app is not running using Xcode 15.1 and SwiftUI
I'm close, but can't seem to get my app to store location data when it is not running. I'm trying to collect driver data. When I start the app, it asks for location permission while using the app, but at no time does it ask for permission to collect location data when I am not using the app. I think the problem revolves around this. I've also tried going into the iOS Settings for the app and setting it to Always, but that didn't help. I'm likely missing something in the app to get it to collect the data when the app is not running. Any help is appreciated. Here is what I've got.

For Signing I have Background Modes enabled with "Location updates". In Info.plist I have the following keys set with descriptions:

- Privacy - Location Always Usage Description
- Privacy - Location When In Use Usage Description
- Privacy - Location Always and When In Use Usage Description

I also have, in the Info file, Required background modes with Item 0 set to "App registers for location updates".

For code, I have the following in the AppDelegate:

```swift
import UIKit
import CoreLocation

class AppDelegate: NSObject, UIApplicationDelegate {
    static let shared = AppDelegate()

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil) -> Bool {
        // Add any custom setup code here
        return true
    }
}

extension AppDelegate {
    func requestLocationAuthorization() {
        let locationManager = CLLocationManager()
        locationManager.requestAlwaysAuthorization()
    }
}
```

Here is how I wire up the app delegate and start the location manager:

```swift
import SwiftUI
import SwiftData

@main
struct ZenTracApp: App {
    @UIApplicationDelegateAdaptor(AppDelegate.self) var appDelegate
    @Environment(\.scenePhase) var scenePhase
    @StateObject var locationManager = LocationManager()
    ...
```

And this is the LocationManager:

```swift
import CoreLocation
import SwiftData

@MainActor
class LocationManager: NSObject, ObservableObject {
    @Published var location: CLLocation?
    //@Published var region = MKCoordinateRegion()
    private let locationManager = CLLocationManager()

    /// Override existing init
    override init() {
        /// Bring in the normal init
        super.init()
        AppDelegate.shared.requestLocationAuthorization()
        locationManager.desiredAccuracy = kCLLocationAccuracyBest
        locationManager.distanceFilter = kCLDistanceFilterNone
        locationManager.requestAlwaysAuthorization()
        locationManager.startUpdatingLocation()
        locationManager.delegate = self
        locationManager.allowsBackgroundLocationUpdates = true
        locationManager.showsBackgroundLocationIndicator = true
        locationManager.startUpdatingLocation()
    }
}

extension LocationManager: CLLocationManagerDelegate {
    /// iOS calls this whenever the location changes
    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let newLocation = locations.last else { return }
        /// Only update if the location has changed significantly, in this case more than 10 meters
        if self.location == nil || location!.distance(from: newLocation) > 10 {
            self.location = newLocation
            // save the location info in this function
            saveEntry(location: self.location!)
        }
    }
}
```
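One likely culprit in the code above (a guess based on the snippet alone): requestLocationAuthorization() creates a throwaway CLLocationManager that is deallocated before the permission prompt can be shown, and Always authorization is normally granted in two stages, with the Always upgrade prompt only available after While Using has been granted. A sketch of the usual pattern:

```swift
import CoreLocation

class AuthorizingLocationManager: NSObject, CLLocationManagerDelegate {
    // Keep a strong reference; a locally scoped manager is deallocated
    // before the authorization prompt can appear.
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        // Stage 1: ask for While Using authorization first.
        manager.requestWhenInUseAuthorization()
    }

    func locationManagerDidChangeAuthorization(_ manager: CLLocationManager) {
        // Stage 2: only after While Using is granted can the
        // Always upgrade prompt be shown.
        if manager.authorizationStatus == .authorizedWhenInUse {
            manager.requestAlwaysAuthorization()
        }
    }
}
```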
Replies: 1 · Boosts: 0 · Views: 777 · Dec ’23
Core ML MLOneHotEncoder Error Post-Update: "unknown category String"
Apple Developer community, I recently updated Xcode and Core ML from version 13.0.1 to 14.1.2 and am facing an issue with the MLOneHotEncoder in my Core ML classifier. The same code and data that worked fine in the previous version now throw an error during prediction:

```
MLOneHotEncoder: unknown category String [TERM] expected one of
```

This seems to suggest that the MLOneHotEncoder is no longer handling unknown strings as it did in the previous version. A brief overview of my situation:

- Core ML model: a classifier that uses MLOneHotEncoder for processing categorical data.
- Data: the same dataset is used for training and predictions, which worked fine before the update.
- Error context: the error occurs at the prediction stage, not during training.

I have checked for data consistency and confirmed that the dataset is the same as the one used with the previous version. My questions:

1. Has there been a change in how MLOneHotEncoder handles unknown categories in Core ML version 14.1.2?
2. Are there any recommended practices for handling unknown string categories with MLOneHotEncoder in the updated Core ML version?
3. Is there a need to modify the model training code or data preprocessing steps to accommodate changes in the new Core ML version?

I would appreciate any insights or suggestions on how to resolve this issue. If additional information is needed, I am happy to provide it. Thank you for your assistance!
Replies: 1 · Boosts: 0 · Views: 655 · Dec ’23
certificate is not trusted
I only recently installed Xcode 10 (yes, I know), and since then, new Swift code I write will compile but not run. It appears to be a "trust" issue:

```
Exception Type:        EXC_CRASH (Code Signature Invalid)
Exception Codes:       0x0000000000000000, 0x0000000000000000
Exception Note:        EXC_CORPSE_NOTIFY
Termination Reason:    Namespace CODESIGNING, Code 0x1
```

```
Bobs-MBP:$ codesign -v -vvv /Users/bobsmith/programming/cocoa/test20/build/Debug/test20.app
/Users/bobsmith/programming/cocoa/test20/build/Debug/test20.app: CSSMERR_TP_NOT_TRUSTED
In architecture: x86_64
```

I looked at my certificate and saw that it wasn't trusted. Changing it to "Always Trust" failed to correct the problem. What should I do? Thanks for taking the time to read this.
Replies: 3 · Boosts: 0 · Views: 481 · Dec ’23
Having issues with SwiftUI ForEach usage
The code below is a JSON parser example. I used the quicktype website to build my structs for reading in the JSON. All of that code works, but I am having an issue with looping through the data that was read. I am reading this in a view, so I cannot use a for statement. I am having trouble learning how to use the ForEach statement to loop over the contacts in this JSON data and print the firstName and lastName of each contact. This is the code:

```swift
let data1JSON = """
[
  {
    "data": {
      "viewer": {
        "__typename": "Member",
        "id": 123,
        "firstName": "d",
        "lastName": "a",
        "emailAddress": "w"
      },
      "league": {
        "id": 1111,
        "name": "a",
        "slug": "b",
        "isMine": true,
        "logo": "g",
        "homePageUrl": "bA",
        "facebookUrl": "www.facebook.com/B",
        "phone": "1",
        "contacts": [
          { "id": 12, "firstName": "", "lastName": "d", "phone": null, "__typename": "Contact" },
          { "id": 10, "firstName": "", "lastName": "c", "phone": null, "__typename": "Contact" }
        ],
        "__typename": "League"
      }
    }
  }
]
"""

// MARK: - ApaResultElement
struct ApaResultElement: Codable {
    let data: DataClass
}

// MARK: - DataClass
struct DataClass: Codable {
    let viewer: Viewer
    let league: League
}

// MARK: - League
struct League: Codable {
    let id: Int
    let name, slug: String
    let isMine: Bool
    let logo: String
    let homePageURL, facebookURL, phone: String
    let contacts: [Viewer]
    let typename: String

    enum CodingKeys: String, CodingKey {
        case id, name, slug, isMine, logo
        case homePageURL = "homePageUrl"
        case facebookURL = "facebookUrl"
        case phone, contacts
        case typename = "__typename"
    }
}

// MARK: - Viewer
struct Viewer: Codable {
    let id: Int
    let firstName, lastName: String
    let phone: JSONNull?
    let typename: String
    let emailAddress: String?

    enum CodingKeys: String, CodingKey {
        case id, firstName, lastName, phone
        case typename = "__typename"
        case emailAddress
    }
}

typealias ApaResult = [ApaResultElement]

// MARK: - Encode/decode helpers
class JSONNull: Codable, Hashable {
    public static func == (lhs: JSONNull, rhs: JSONNull) -> Bool { true }
    public var hashValue: Int { 0 }
    public init() {}
    public required init(from decoder: Decoder) throws {
        let container = try decoder.singleValueContainer()
        if !container.decodeNil() {
            throw DecodingError.typeMismatch(JSONNull.self,
                DecodingError.Context(codingPath: decoder.codingPath,
                                      debugDescription: "Wrong type for JSONNull"))
        }
    }
    public func encode(to encoder: Encoder) throws {
        var container = encoder.singleValueContainer()
        try container.encodeNil()
    }
}

let decoder = JSONDecoder()
let leagueView = data1JSON.data(using: .utf8)!
do {
    try decoder.decode([ApaResultElement].self, from: leagueView)
    print("success")
} catch {
    print("Error found => \(error)")
}
let d1 = try decoder.decode([ApaResultElement].self, from: leagueView)

// This is where I need help to loop over the JSON to extract all the data.
// I see that I need to loop over each array struct to do that.
```

I have looked for a good tutorial on ForEach, but none of the sites I found helped me loop over this JSON data. I am new to Swift and the autocomplete feature, so any tips using the Apple Developer Documentation or code completion would also be appreciated. Thanks.
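With the structs above, a sketch of the view-side loop might look like this (assuming decoding succeeded and the result array is non-empty; ForEach needs an Identifiable element or an explicit id key path, and the contact's id field serves as the latter):

```swift
import SwiftUI

struct ContactListView: View {
    // The decoded [ApaResultElement] from the post's parser code.
    let results: [ApaResultElement]

    var body: some View {
        List {
            // `contacts` is [Viewer]; `id: \.id` gives ForEach a stable identity.
            ForEach(results[0].data.league.contacts, id: \.id) { contact in
                Text("\(contact.firstName) \(contact.lastName)")
            }
        }
    }
}
```

A plain `for` loop is illegal inside a view builder, which is why ForEach (a View type, not a statement) is needed here.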
Replies: 2 · Boosts: 0 · Views: 1.3k · Dec ’23
Very embarrassing beginner's question...
I'm an old C programmer just starting to learn Swift and the Xcode environment. In Apple's "Develop in Swift Fundamentals", on page 19, I found the first exercise: print "Hello World" in the Terminal using the Swift REPL. Weirdly, after following the instructions, I get the following "Badly placed ()'s" error (I'm omitting the Swift subcommands menu):

```
Last login: Tue Dec 26 12:55:07 on ttys000
[MacBook-Pro:~] xxxxx% swift
Welcome to Swift!
[MacBook-Pro:] xxxxx% print("Hello, World!")
Badly placed ()'s.
[MacBook-Pro:] xxxxx%
```

This is a pretty discouraging result! Obviously, I'm missing something (and please, no jokes about requisite neurons).
Replies: 3 · Boosts: 0 · Views: 551 · Dec ’23
AVCaptureDevice.FocusMode cannot be set in iPhone15
We are developing an app which uses AVCaptureSession. Here is part of my code:

```swift
if context.config.basic_settings.auto_focus == 1 {
    // autofocus
    device.focusMode = AVCaptureDevice.FocusMode.continuousAutoFocus
    completion()
} else {
    device.focusMode = AVCaptureDevice.FocusMode.locked
    var _lenspos = Float(context.config.basic_settings.lens_position) ?? 0.5
    if _lenspos > 1.0 { _lenspos = 1.0 }
    if _lenspos < 0.0 { _lenspos = 0.0 }
    device.setFocusModeLocked(lensPosition: _lenspos, completionHandler: { (time) in
        completion()
    })
}
```

This code successfully sets the focus mode to autofocus or manual focus, and also sets the lens position perfectly, if we use an iPhone 13 or iPhone 14. But on iPhone 15, it cannot set the focus mode or lens position. We tried several different iPhone 15 devices, and the result is always the same (it fails to set focus). We also used different iPhone 13 and iPhone 14 devices, and those set the focus correctly every time. Is it a bug in iPhone 15, or is there anything I can do about the issue?
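One thing worth ruling out (a sketch, not a confirmed explanation for the iPhone 15 behavior): AVCaptureDevice configuration changes must be wrapped in lockForConfiguration(), and the selected camera must actually support the requested mode, which can differ between camera modules and device generations:

```swift
import AVFoundation

func setLockedFocus(on device: AVCaptureDevice, lensPosition: Float) throws {
    // All focus configuration must happen inside a configuration lock.
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    guard device.isFocusModeSupported(.locked) else {
        print("Locked focus not supported on \(device.localizedName)")
        return
    }
    // Clamp to the valid 0...1 range before applying.
    let position = min(max(lensPosition, 0.0), 1.0)
    device.setFocusModeLocked(lensPosition: position) { _ in
        // The lens has settled at the requested position.
    }
}
```

Checking isFocusModeSupported per device would also reveal whether the iPhone 15 camera in use simply reports the mode as unsupported.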
Replies: 2 · Boosts: 0 · Views: 800 · Dec ’23
VisionKit can lift a subject, but the bounding rectangle always returns x, y, width, and height as 0, 0, 0, 0
In our app, we need to use the VisionKit framework to lift the subject from an image and crop it. Here is the piece of code:

```swift
if #available(iOS 17.0, *) {
    let analyzer = ImageAnalyzer()
    let analysis = try? await analyzer.analyze(image, configuration: self.visionKitConfiguration)
    let interaction = ImageAnalysisInteraction()
    interaction.analysis = analysis
    interaction.preferredInteractionTypes = [.automatic]

    guard let subject = await interaction.subjects.first else {
        return image
    }
    let s = await interaction.subjects
    print(s.first?.bounds)

    guard let cropped = try? await subject.image else {
        return image
    }
    return cropped
}
```

But s.first?.bounds always returns a CGRect with all zero values. Is there any other way to get the position of the cropped subject? I need the position in the image from which the subject was cropped. Can anyone help?
Replies: 1 · Boosts: 0 · Views: 777 · Dec ’23