I'm following the video tutorial below, using the exact examples, but I was not able to get semantic matches in the results:
https://developer.apple.com/videos/play/wwdc2024/10131
https://developer.apple.com/documentation/corespotlight/building-a-search-interface-for-your-app
In iOS 18 and macOS 15 and later, Spotlight also supports semantic searches of your content, in addition to lexical matching of a search term.
I'm on macOS 15.1, so I'd expect this to work now. Or does this depend on Apple Intelligence for some reason?
Specifically I've indexed the following:
Keyword: "windsurfing carmel"
Literal match:
the best windsurfing carmel county
windsurfing lessons
Semantic match:
sailboarding lessons
the best windsurfing carmel county
windsurfing lessons
Expected: the semantic matches are found.
Actual: only the literal matches were returned.
Because CSUserQuery.prepare is only supported on macOS 15 and later, my switch from CSSearchQuery makes no sense without the semantic search benefits.
Did I miss something? I also added the Core Spotlight delegate extension as directed, but I was not able to hit the breakpoint as shown in the video. I wish there were sample code for this, but I couldn't find any.
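For reference, here is roughly how I set up and run the query (a minimal sketch; the fetch attributes and query string are just my test values):

import CoreSpotlight

func search(for queryString: String) {
    // New in macOS 15 / iOS 18: let Spotlight preload its (semantic) search resources.
    CSUserQuery.prepare()

    let context = CSUserQueryContext()
    context.fetchAttributes = ["title", "contentDescription"]

    let query = CSUserQuery(userQueryString: queryString, userQueryContext: context)
    Task {
        for await element in query.responses {
            switch element {
            case .item(let item):
                print("Matched item:", item)
            case .suggestion(let suggestion):
                print("Suggestion:", suggestion)
            default:
                break
            }
        }
    }
}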
I am trying to convert a string field to an integer field in our database schema. However, the custom migration that I wrote doesn't seem to run.
My Model
// Run 1
// typealias Book = BookSchemaV1.Book

// Run 2
typealias Book = BookSchemaV2.Book

// MARK: - Migration Plan
enum BookModelMigrationPlan: SchemaMigrationPlan {
    static var schemas: [any VersionedSchema.Type] = [
        BookSchemaV1.self,
        BookSchemaV2.self
    ]

    static var stages: [MigrationStage] = [migrateV1toV2]

    static var oldBooks: [BookSchemaV1.Book] = []

    static let migrateV1toV2 = MigrationStage.custom(
        fromVersion: BookSchemaV1.self,
        toVersion: BookSchemaV2.self,
        willMigrate: nil,
        didMigrate: { context in
            oldBooks = try context.fetch(FetchDescriptor<BookSchemaV1.Book>())
            oldBooks.forEach { oldBook in
                do {
                    let newBook = BookSchemaV2.Book(
                        bookID: String(oldBook.bookID),
                        title: oldBook.title
                    )
                    context.insert(newBook)
                    context.delete(oldBook)
                    try context.save()
                } catch {
                    print("New model not saved")
                }
            }
        }
    )
}
// MARK: - Schema Versions
enum BookSchemaV1: VersionedSchema {
    static var models: [any PersistentModel.Type] = [Book.self]
    static var versionIdentifier = Schema.Version(1, 0, 0)

    @Model
    final class Book {
        @Attribute(.unique) var bookID: Int
        var title: String

        init(
            bookID: Int,
            title: String
        ) {
            self.bookID = bookID
            self.title = title
        }
    }
}

enum BookSchemaV2: VersionedSchema {
    static var models: [any PersistentModel.Type] = [Book.self]
    static var versionIdentifier = Schema.Version(2, 0, 0)

    @Model
    class Book {
        @Attribute(.unique) var bookID: String
        var title: String

        init(
            bookID: String,
            title: String
        ) {
            self.bookID = bookID
            self.title = title
        }
    }
}

@MainActor
class AppDataContainer {
    static let shared = AppDataContainer()

    let container: ModelContainer

    private init() {
        do {
            let schema = Schema([Book.self])
            let config = ModelConfiguration(schema: schema)
            container = try ModelContainer(for: schema, migrationPlan: BookModelMigrationPlan.self, configurations: [config])
        } catch {
            fatalError("Could not create ModelContainer: \(error)")
        }
    }
}
A lot of apps just produce a black image (sometimes with a logo) when you take a screenshot within them. It appears the UITextField trick most apps had used no longer works in iOS 18. How can you achieve this?
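For context, this is roughly the "UITextField trick" I mean (it relies on undocumented view and layer internals, which is presumably why it broke; the method name is mine):

import UIKit

extension UIView {
    // Re-parent this view's layer into a secure UITextField's internal layer,
    // so the system blanks the view out in screenshots and recordings.
    func hideFromScreenCapture() {
        DispatchQueue.main.async {
            let field = UITextField()
            field.isSecureTextEntry = true
            field.isUserInteractionEnabled = false
            self.addSubview(field)
            // Undocumented behavior: the secure field's first sublayer is the
            // "canvas" the system hides during capture. Hosting our layer there
            // hid the content on iOS 17 and earlier.
            self.layer.superlayer?.addSublayer(field.layer)
            field.layer.sublayers?.first?.addSublayer(self.layer)
        }
    }
}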
I'm implementing a Quick Look extension through the macOS extension point com.apple.quicklook.preview using the view-based method, where I implement QLPreviewingController to show information about the previewed file URL.
The NSView controlled by my QLPreviewingController supports no interaction, which makes sense, but I see some other Quick Look previews, like the one for videos, having toolbar buttons to open other apps or modify the content.
How can I get similar behaviour?
I have two questions --
1) How can I prevent a modal from being dismissed when the app enters the background?
2) I have a modal I'm presenting that gets dismissed seemingly at random if it's displayed within the first several seconds of app launch, but it stays displayed indefinitely otherwise. No other code is calling dismiss, and none of the UIAdaptivePresentationControllerDelegate dismissal methods get called. What else could cause a modal presentation to be dismissed like that?
In visionOS, I have been trying to implement this view as a background for an information view, but I cannot find any information about it anywhere. Does anyone know what this is called, or any workaround to achieve this look?
Hi, I have a problem that I can't solve, and I hope you can help me:
When I switch tabs or scroll down, the background color of the tab bar changes automatically.
This started happening with Xcode 16.0.
Can you help me please?
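For reference, this is the kind of explicit appearance setup I've been experimenting with as a workaround (just a sketch; I'm assuming a standard UITabBarController, and I'm not sure this is the actual cause):

import UIKit

final class MainTabBarController: UITabBarController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Give the tab bar one explicit, opaque appearance for both the standard
        // and the scroll-edge state, so scrolling content doesn't change its background.
        let appearance = UITabBarAppearance()
        appearance.configureWithOpaqueBackground()
        appearance.backgroundColor = .systemBackground

        tabBar.standardAppearance = appearance
        tabBar.scrollEdgeAppearance = appearance   // iOS 15+
    }
}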
I have an app that uses Storyboard (not SwiftUI). Is there a way to define that a particular WKInterfaceButton should receive the "sent action" (as if the user tapped the button on screen) when the user uses the double-tap gesture with their fingers?
When a custom tool item set is used with PKToolPicker, with each inking item given a non-nil identifier via PKToolPickerInkingItem(type:color:width:identifier:), changing the color or width of an inking item (pen, pencil, etc.) causes an instant crash.
I believe it is a bug in PencilKit.
It seems that when you change the color or width of an inking item that has an identifier in the squeeze tool palette, PencilKit tries to find a tool item without an identifier (the default tool picker's items have no identifiers) in the tool item set. I guess the lookup fails, so the find function returns either -1 or the highest integer (2^63 - 1), and it uses that number as an index without bounds checking. That's why we observe [__NSArrayM replaceObjectAtIndex:withObject:]: index 9223372036854775807 beyond bounds [0 .. 9]
I filed a report on Feedback Assistant with id: FB15519801 too.
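A minimal sketch of the setup that triggers it for me (identifiers, colors, and widths are arbitrary):

import PencilKit
import UIKit

// iOS 18: build a tool picker from custom items that all carry identifiers.
let items: [PKToolPickerItem] = [
    PKToolPickerInkingItem(type: .pen, color: .red, width: 5, identifier: "com.example.pen"),
    PKToolPickerInkingItem(type: .pencil, color: .blue, width: 3, identifier: "com.example.pencil"),
    PKToolPickerEraserItem(type: .vector)
]
let toolPicker = PKToolPicker(toolItems: items)

// Squeeze the Apple Pencil, then change the selected pen's color or width in the
// squeeze palette -> crash in -[PKToolPicker _setSelectedTool:saveState:updateUI:updateLastSelectedTool:].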
The corresponding part of the crash report is as follows:
0 CoreFoundation 0x183e0908c __exceptionPreprocess + 164 (NSException.m:249)
1 libobjc.A.dylib 0x18110b2e4 objc_exception_throw + 88 (objc-exception.mm:356)
2 CoreFoundation 0x183de4048 -[__NSArrayM replaceObjectAtIndex:withObject:] + 1020 (NSArrayM.m:180)
3 PencilKit 0x1c44f73c8 -[PKToolPicker _setSelectedTool:saveState:updateUI:updateLastSelectedTool:] + 800 (PKToolPicker.m:587)
4 PencilKit 0x1c45a5684 -[PKPencilSqueezeControllerPaletteViewDelegateProxy paletteView:didSelectTool:atIndex:] + 200 (PKPencilSqueezeControllerPaletteViewDelegateProxy.m:227)
5 PencilKit 0x1c460906c -[PKSqueezePaletteView _didSelectTool:atIndex:] + 196 (PKSqueezePaletteView.m:441)
6 PencilKit 0x1c462203c -[PKSqueezePaletteViewExpandedInkingToolLayout _didTapStrokeWeightButton:] + 336 (PKSqueezePaletteViewExpandedInkingToolLayout.m:224)
7 UIKitCore 0x18691edd8 -[UIApplication sendAction:to:from:forEvent:] + 100 (UIApplication.m:5797)
8 UIKitCore 0x18691ecb0 -[UIControl sendAction:to:forEvent:] + 112 (UIControl.m:942)
9 UIKitCore 0x18691eb00 -[UIControl _sendActionsForEvents:withEvent:] + 324 (UIControl.m:1013)
10 UIKitCore 0x187080568 -[UIButton _sendActionsForEvents:withEvent:] + 124 (UIButton.m:4192)
11 UIKitCore 0x187081d7c -[UIControl touchesEnded:withEvent:] + 400 (UIControl.m:692)
12 UIKitCore 0x1868675b0 -[UIWindow _sendTouchesForEvent:] + 852 (UIWindow.m:3313)
and the exception reason is
*** -[__NSArrayM replaceObjectAtIndex:withObject:]: index 9223372036854775807 beyond bounds [0 .. 9]
I have a status bar app in Swift. Under Sonoma the app works fine,
but after upgrading to Sequoia I can't get keyboard input into the NSTextField, even though the NSTextField shows the focus ring.
The keyboard input goes to another window every time, such as Xcode, Safari, or the Desktop,
depending on what was last active before I selected a menu item in the status bar app's menu that shows a view or an NSAlert with an NSTextField.
I have spent several hours in debug sessions; a keyDown or keyUp event from the keyboard never arrives.
As I said, under Sonoma this was not a problem and the app works as expected.
To me this looks like a bug in Sequoia.
But maybe someone here has an idea on this topic.
Somebody help me please.
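For reference, roughly how the text field is presented from the status item's menu (simplified; names and strings are placeholders):

import AppKit

// Called from an NSMenuItem action in the status bar menu.
func promptForName() {
    // Activate the menu bar app so the alert's window can become key (macOS 14+).
    NSApp.activate()

    let alert = NSAlert()
    alert.messageText = "Enter a name"
    alert.addButton(withTitle: "OK")
    alert.addButton(withTitle: "Cancel")

    let field = NSTextField(frame: NSRect(x: 0, y: 0, width: 240, height: 24))
    alert.accessoryView = field
    alert.window.initialFirstResponder = field

    // Under Sonoma typing goes into the field; under Sequoia the field only shows
    // its focus ring and key events end up in the previously active app.
    alert.runModal()
    print("Entered:", field.stringValue)
}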
I'm trying to set a specific time for a notification, and that works nicely, but if you need a little more functionality this is where the difficulties appear. I'd like to offer repeating, for example every hour. I know that UNCalendarNotificationTrigger has a repeats value, but when you set repeats to true it remembers the date components, e.g. .minute, and then just repeats the notification every time that minute comes around!
I'm looking for a solution to set a notification at a specific time (e.g. 5:00 pm) and then repeat it every hour (6, 7, 8, 9 pm).
Maybe it's really easy, but I feel stuck 😕
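A minimal sketch of the kind of thing I've been trying (one request per hour, since a single repeating calendar trigger just re-fires on the matched components; titles and identifiers are made up):

import UserNotifications

// Schedule a notification at 5:00 pm and then every following hour (6, 7, 8, 9 pm).
func scheduleHourlyReminders() {
    let center = UNUserNotificationCenter.current()
    for hour in 17...21 {
        var components = DateComponents()
        components.hour = hour
        components.minute = 0

        let content = UNMutableNotificationContent()
        content.title = "Reminder"
        content.body = "It's \(hour):00"

        // repeats: true makes each request fire daily at its own hour.
        let trigger = UNCalendarNotificationTrigger(dateMatching: components, repeats: true)
        center.add(UNNotificationRequest(identifier: "hourly-\(hour)",
                                         content: content,
                                         trigger: trigger))
    }
}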
It appears that starting with macOS Sequoia, Quick Look preview extensions no longer load MapKit maps correctly. Map tiles do not appear, leaving users with a beige background.
Users report that polylines do render correctly, but annotations appear black.
This was previously working fine in prior macOS versions including Sonoma.
STEPS TO REPRODUCE
1. Create a macOS app project with an associated document.
2. Ensure the project has a Quick Look preview extension, with the necessary basic setup.
3. Ensure that the extension mentioned in (2) has an MKMapView. No other cosmetic changes need to be implemented to observe the base issue. Do note that it has been reported that, in addition to the map tiles not loading, annotations don't render correctly either.
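For illustration, the preview controller is essentially just this (a stripped-down sketch; file parsing is omitted and the region is a placeholder):

import Cocoa
import Quartz
import MapKit

class PreviewViewController: NSViewController, QLPreviewingController {

    private let mapView = MKMapView()

    override func loadView() {
        view = NSView(frame: NSRect(x: 0, y: 0, width: 600, height: 400))
        mapView.frame = view.bounds
        mapView.autoresizingMask = [.width, .height]
        view.addSubview(mapView)
    }

    func preparePreviewOfFile(at url: URL, completionHandler handler: @escaping (Error?) -> Void) {
        // On Sequoia the view loads, but the map tiles never render (beige background).
        let center = CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090)
        mapView.setRegion(MKCoordinateRegion(center: center,
                                             latitudinalMeters: 2_000,
                                             longitudinalMeters: 2_000),
                          animated: false)
        handler(nil)
    }
}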
Is it OK to set automaticallyMergesChangesFromParent to true for a managed object context that pins its query generation to .current?
I have heard from a few sources that in most cases, it’s not recommended to set automaticallyMergesChangesFromParent to true for a managed object context that pins its query generation to current.
A source I have seen says:
When you pin a managed object context to a specific query generation, you’re explicitly telling the context to view the data as it existed at the time of that generation token (often referred to as a snapshot). This ensures that the context is isolated from any subsequent changes made by other contexts or background tasks.
But this may conflict with automatic merging because:
The purpose of setting automaticallyMergesChangesFromParent to true is to have the context automatically merge changes from the parent (e.g., background or main context) into itself whenever the parent context saves changes. This behavior conflicts with the concept of a pinned query generation because:
• If the context is pinned to current, it should not be concerned with changes that happen after that pinning.
• Automatic merging would introduce updates from the parent context that occur after the pinning, thereby violating the “snapshot” nature of the query generation and potentially creating inconsistencies.
However, your own Apple sample code titled "" does use both together. Inside the definition of lazy var persistentContainer: NSPersistentCloudKitContainer, the following code is included:
// Pin the viewContext to the current generation token, and set it to keep itself up to date with local changes.
container.viewContext.automaticallyMergesChangesFromParent = true
do {
    try container.viewContext.setQueryGenerationFrom(.current)
} catch {
    fatalError("###\(#function): Failed to pin viewContext to the current generation:\(error)")
}
Even on iOS 18 (16 and 17 also reproduce it) we are seeing a crash in the FamilyActivityPicker when users tap on "Other" or just at random times. We see the following debug message, but have no other way of identifying the issue in code.
[u 3C8AF272-DC4E-55C4-B8C6-34826D2BEB5B:m (null)] [com.apple.FamilyControls.ActivityPickerExtension(1150.1)] Connection to plugin invalidated while in use.
Even with the most basic implementation of FamilyActivityPicker (example below) we can repro this crash consistently.
Big applications (think Opal) see the same issue, but it seems like they see it less, and they are able to intercept the disconnect in order to show an error to the user. My two questions are:
How can we intercept this crash/disconnect in order to alert our user and restart the experience?
Is this EVER gonna get fixed properly?
Usage Example:
var body: some View {
    NavigationView {
        ZStack {
            familyPickerErrorView
                .opacity(isHidden ? 0 : 1)
                .onAppear {
                    DispatchQueue.main.asyncAfter(deadline: .now() + 2) {
                        withAnimation {
                            isHidden = false
                        }
                    }
                }
            VStack {
                Color.clear
                    .frame(height: 1)
                    .background(Color(UIColor.systemBackground))
                FamilyActivityPicker(
                    headerText: "Select Apps To Be Blocked (Maximum of 50)",
                    footerText: "Want to block Safari? Check our FAQs",
                    selection: $familySelection)
                    .ignoresSafeArea(.all)
            }
        }
    }
    .toolbar {
        ToolbarItem(placement: .navigationBarLeading) {
            Button(action: {
                isPresented = false
            }) {
                Text("Cancel")
                    .foregroundColor(.black)
            }
        }
        ToolbarItem(placement: .navigationBarTrailing) {
            Button(action: {
                isPresented = false
            }) {
                Text("Done")
            }
        }
    }
    .navigationBarTitleDisplayMode(.inline)
    .alert(isPresented: $showAlert) {
        Alert(title: Text("Family Activity Picker Issue"), message: Text(alertMessage), dismissButton: .default(Text("OK")))
    }
    .onAppear {
        isPresented = true
    }
}
I have an XCFramework A, which depends on XCFramework B and XCFramework C. Now I need to distribute XCFramework A via SPM.
How do I declare the dependencies B and C for XCFramework A, which is a binary target?
I need to distribute only framework A through SPM, not all of the other dependent frameworks. How can I do this? Is it possible?
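For context, this is the kind of manifest I've been experimenting with (a sketch only; names, URLs, and checksums are placeholders). Since a binaryTarget can't declare dependencies itself, the usual pattern I've seen is a thin wrapper target that depends on all three binaries:

// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "FrameworkA",
    products: [
        // Consumers import the wrapper, which pulls in A, B, and C.
        .library(name: "FrameworkA", targets: ["FrameworkAWrapper"])
    ],
    targets: [
        .binaryTarget(
            name: "FrameworkABinary",
            url: "https://example.com/FrameworkA.xcframework.zip",
            checksum: "<checksum>"
        ),
        .binaryTarget(
            name: "FrameworkB",
            url: "https://example.com/FrameworkB.xcframework.zip",
            checksum: "<checksum>"
        ),
        .binaryTarget(
            name: "FrameworkC",
            url: "https://example.com/FrameworkC.xcframework.zip",
            checksum: "<checksum>"
        ),
        // Binary targets can't have dependencies, so this wrapper target ties
        // them together (it needs at least one source file, e.g. an empty .swift).
        .target(
            name: "FrameworkAWrapper",
            dependencies: ["FrameworkABinary", "FrameworkB", "FrameworkC"],
            path: "Sources/FrameworkAWrapper"
        )
    ]
)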
Hello everyone,
Recently, I encountered an issue in my tvOS app where one UITextField property, isSecureTextEntry, set to true, was preventing another property, keyboardType, from functioning correctly. In my case, keyboardType is set to the .numberPad option.
The problem is that on the first tap on the text field, the default keyboard with numbers, letters, and some special characters opens. However, after the second tap, the correct keyboard type with only numbers appears, as I want. Removing isSecureTextEntry, or setting it to false, solves the problem.
import UIKit

class ViewController: UIViewController {
    private let textField = UITextField()

    override func viewDidLoad() {
        super.viewDidLoad()
        textField.keyboardType = .numberPad
        textField.isSecureTextEntry = true
        view.addSubview(textField)
        setupConstraints()
    }
}
I'd like to share an app's screen in two modes. First in a standard mirroring mode and second in an "additional content" mode (very likely with a session role windowExternalDisplayNonInteractive).
I found that the Keynote app on iOS is a very nice example of what I want to achieve when sharing an iPhone to an Apple TV using AirPlay:
- sharing the screen results in mirroring the screen on the TV
- tapping the play button in Keynote switches to "additional content", where iPhone and TV show different content
- leaving the additional content mode returns to "mirroring", where TV and iPhone show the same content
Is there an example for implementing such a feature?
I am able to successfully use the external display (windowExternalDisplayNonInteractive) and show additional content there.
How can I programmatically "detach" the additional content from the external display and activate mirroring mode?
Searching the Developer Forums for windowExternalDisplayNonInteractive reveals some discussions with valuable information; however, returning to mirroring mode does not seem to be covered.
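For reference, here is roughly how I attach the additional content to the external display today (a sketch, assuming a UIKit app delegate; the configuration names are mine):

import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     configurationForConnecting connectingSceneSession: UISceneSession,
                     options: UIScene.ConnectionOptions) -> UISceneConfiguration {
        // Provide the scene configuration for the non-interactive external display
        // role; this is the part I already have working for the "additional content".
        if connectingSceneSession.role == .windowExternalDisplayNonInteractive {
            return UISceneConfiguration(name: "External Content", sessionRole: connectingSceneSession.role)
        }
        return UISceneConfiguration(name: "Default", sessionRole: connectingSceneSession.role)
    }
}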
We have widgets in our app. We're now working on a Live Activity with a button that calls an app intent. This app intent needs to update our widgets, and we're seeing semi-great results.
When we update the widgets from within the app, it works great. It also usually works from geofence triggers, so we're thinking it might have to do with the "widget update budget"?
According to the docs:
Cases in which WidgetKit doesn’t count reloads against your widget’s budget include when: The widget performs an app intent, such as when the user taps a button or toggles a switch.
But we're not really seeing that. When I run our app from within Xcode, everything runs great all the time and the widget gets updated within milliseconds, but the TestFlight build is more spotty.
To be clear: This is a button in a live activity, calling an app intent, and in turn, the app intent is calling reloadAllTimelines for our "regular" widgets.
The live activity itself always gets updated properly.
My question is basically, am I doing something wrong and can I do something to increase the consistency of the widget updating on time?
Abbreviated example:
final class UserEventIntent: NSObject, LiveActivityIntent {
    @MainActor
    func perform() async throws -> some IntentResult {
        do {
            let newStatus: (stat: Status, wasSame: Bool) = try await eventHelper.performEvent(status: status)
            WidgetCenter.shared.reloadAllTimelines()
        } catch {
            await WidgetCenterBridge.updateLiveActivityForInProgress(false)
        }
        return .result()
    }
}
Dear Senior Developer,
I come to you at a time where I am lost.
Over the last 2-3 months, I have noticed a series of crashes occurring in my app. This all started randomly and has now become a regular occurrence. Usually I would receive some detail relating to the class or view causing the error, but now the only detail I have is that it relates to a UIViewController dealloc, even though I am using SwiftUI.
Below I have attached the stack trace from Firebase Crash Analytics. I have spent months on this, and I am asking for the help of someone much more senior and knowledgeable to assist me in this regard.
Thanks again for your help and I await your response. I am also willing to share my screen LIVE to help you help me identify this issue.
manny.GoblinTools_issue_bfd18ee65a92b459d4ecef3475a9ec34_crash_session_032166c28b8c4764b13a6fdca636d2d6_DNE_0_v2_stacktrace.txt
A very common use case in our iOS app is that users take a large number of pictures (about 30) in low-light conditions using the camera app, and immediately after, they try to upload them to our servers.
We measured the time to load photos from the PHPickerResult. For most photos, it takes less than 100 milliseconds, but for some of them, it takes several seconds—we even saw minutes in some extreme cases.
We believe this started happening with iOS 17, when deferred photo processing was introduced. If users take the pictures using our in-app camera experience, the options to customize the camera are enough to avoid the long waiting times. However, the majority of our users still prefer to take the photos with the camera app, and there is little we can do about that.
In the past few weeks, we tried many combinations:
Without asking for permissions, we tried loadFileRepresentation, loadData, and loadObject.
We explored the PHImageManager route, requesting permission and trying different options for deliveryMode, resizeMode, version, isSynchronous, and allowSecondaryDegradedImage.
We also tried fetching the photos in parallel, with very bad results.
In summary, nothing helped the long waiting times—minutes in some cases.
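For reference, this is roughly how one of those attempts looks (loading file representations; the timing code is just for illustration):

import PhotosUI
import UniformTypeIdentifiers

func loadPickedImages(_ results: [PHPickerResult]) {
    for result in results {
        let provider = result.itemProvider
        guard provider.hasItemConformingToTypeIdentifier(UTType.image.identifier) else { continue }

        let start = Date()
        provider.loadFileRepresentation(forTypeIdentifier: UTType.image.identifier) { url, error in
            // For photos that are still being (deferred-)processed, this callback
            // can take seconds, sometimes minutes.
            if let url, let data = try? Data(contentsOf: url) {
                print("Loaded \(data.count) bytes in \(Date().timeIntervalSince(start)) s")
            } else {
                print("Failed:", error?.localizedDescription ?? "unknown error")
            }
        }
    }
}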
The first question, then, is: is there anything we can do to skip the post-processing of the photos and get them quickly? We could accept the unprocessed images.
At a minimum, we would like to show our users what we are doing and why we are taking so much time. We tried to load thumbnails with loadPreviewImage and put a progress indicator on top. This method consistently gives us an error for all photos:
(lldb) p error.localizedDescription
(String) "Cannot load preview."
We can load thumbnails with the PHImageManager option, but it seems excessive to need to get permissions only for that.
The second question, then: what can we do to load thumbnails without asking for permission?
I created a feedback report with a video and sample code to reproduce -> FB15493683