macOS is the operating system for Mac.

Posts under macOS tag

200 Posts

Post · Replies · Boosts · Views · Activity

How do you observe the count of records in a SwiftData relationship?
What is the correct way to track the number of items in a relationship using SwiftData and SwiftUI? Imagine a macOS application with a sidebar that lists Folders and Tags. An Item can belong to a Folder and have many Tags. In the sidebar, I want to show the name of the Folder or Tag along with the number of Items in it. I feel like I'm missing something obvious within SwiftData to wire this up such that my SwiftUI views correctly update whenever the underlying modelContext is updated.

// The basic schema
@Model final class Item {
    var name = "Untitled Item"
    var folder: Folder? = nil
    var tags: [Tag] = []
}

@Model final class Folder {
    var name = "Untitled Folder"
    var items: [Item] = []
}

@Model final class Tag {
    var name = "Untitled Tag"
    var items: [Item] = []
}

// A SwiftUI view to show a Folder.
struct FolderRowView: View {
    let folder: Folder

    // Should I use an @Query here??
    // @Query var items: [Item]

    var body: some View {
        HStack {
            Text(folder.name)
            Spacer()
            Text(folder.items.count.formatted())
        }
    }
}

The above code works once, but if I then add a new Item to that Folder, this SwiftUI view does not update. I can make it work if I use an @Query with a #Predicate, but even then I'm not quite sure how the #Predicate is supposed to be written. (And it seems excessive to have an @Query on every single row, given how many there could be.)

struct FolderView: View {
    @Query private var items: [Item]

    private var folder: Folder

    init(folder: Folder) {
        self.folder = folder

        // I've read online that this needs to be captured outside the #Predicate?
        let identifier = folder.persistentModelID

        _items = Query(filter: #Predicate { item in
            // Is this syntax correct? The results seem inconsistent in my app...
            if let folder = item.folder {
                return folder.persistentModelID == identifier
            } else {
                return false
            }
        })
    }

    var body: some View {
        HStack {
            Text(folder.name)
            Spacer()
            // This mostly works.
            Text(items.count.formatted())
        }
    }
}

As I try to integrate SwiftData and SwiftUI into a traditional macOS app with a sidebar, content view, and inspector, I'm finding it challenging to understand how to wire everything up. In this particular example, tracking the count, is there a "correct" way to handle this?
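One pattern worth trying (a sketch, not a confirmed fix; the explicit generic parameter on #Predicate and the identifier capture are the points of interest) is to type the predicate and compare the captured persistentModelID through optional chaining:

struct FolderRowView: View {
    @Query private var items: [Item]

    private let folder: Folder

    init(folder: Folder) {
        self.folder = folder

        // Capture the identifier outside the macro; the model object
        // itself can't be referenced inside #Predicate.
        let folderID = folder.persistentModelID

        _items = Query(filter: #Predicate<Item> { item in
            item.folder?.persistentModelID == folderID
        })
    }

    var body: some View {
        HStack {
            Text(folder.name)
            Spacer()
            // @Query re-evaluates whenever the modelContext changes.
            Text(items.count.formatted())
        }
    }
}

If one @Query per row proves too heavy, an alternative is a single @Query over all Items in the sidebar's parent view, reduced to a [PersistentIdentifier: Int] count dictionary, with each row handed its own count.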
Replies: 1 · Boosts: 0 · Views: 133 · Aug ’25
How do you compute backing pixel alignment in SwiftUI's `Layout`?
When performing custom layout in AppKit, it's essential to pixel-align frames using methods like backingAlignedRect, where the alignment depends on the backingScaleFactor of the parent window. When building custom Layouts in SwiftUI, how should you compute the alignment of a subview's frame in placeSubviews() before calling subview.place(...)? Surprisingly, I haven't seen any mention of this in the WWDC videos. Empirically, if I create a Rectangle of width 1px and then position it on fractional coordinates, I get a blurry view, as I would expect. Rounding to whole numbers works, but on Retina screens you should be able to round to 0.5 as well.

func placeSubviews(
    in bounds: CGRect,
    proposal: ProposedViewSize,
    subviews: Subviews,
    cache: inout Void
) {
    // This should be backing-aligned based on the parent window's
    // backing scale factor.
    let frame = CGRect(
        x: 10.3,
        y: 10.8,
        width: 300.6,
        height: 300.1
    )

    subviews.first?.place(
        at: frame.origin,
        anchor: .topLeading,
        proposal: ProposedViewSize(frame.size)
    )
}
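For comparison, here is a minimal sketch of doing the rounding manually, assuming you inject the scale from the call site (for example via @Environment(\.displayScale)), since a Layout has no direct environment access. The rounding mirrors the spirit of backingAlignedRect: origin rounded down, extent rounded up, to multiples of 1/scale, which permits half-point coordinates on 2x displays:

import SwiftUI

struct PixelAlignedLayout: Layout {
    // Injected by the caller, e.g. from @Environment(\.displayScale);
    // this is an assumption, not something Layout provides.
    var scale: CGFloat = 2.0

    func sizeThatFits(proposal: ProposedViewSize, subviews: Subviews, cache: inout Void) -> CGSize {
        proposal.replacingUnspecifiedDimensions()
    }

    func placeSubviews(in bounds: CGRect, proposal: ProposedViewSize, subviews: Subviews, cache: inout Void) {
        let raw = CGRect(x: 10.3, y: 10.8, width: 300.6, height: 300.1)
        let frame = backingAligned(raw)

        subviews.first?.place(
            at: frame.origin,
            anchor: .topLeading,
            proposal: ProposedViewSize(frame.size)
        )
    }

    // Origin rounds down, extent rounds up, to the nearest 1/scale,
    // similar in spirit to NSView.backingAlignedRect(_:options:).
    private func backingAligned(_ rect: CGRect) -> CGRect {
        let minX = (rect.minX * scale).rounded(.down) / scale
        let minY = (rect.minY * scale).rounded(.down) / scale
        let maxX = (rect.maxX * scale).rounded(.up) / scale
        let maxY = (rect.maxY * scale).rounded(.up) / scale
        return CGRect(x: minX, y: minY, width: maxX - minX, height: maxY - minY)
    }
}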
Topic: UI Frameworks · SubTopic: SwiftUI
Replies: 1 · Boosts: 0 · Views: 69 · Aug ’25
Audio Unit v3 host: loading v2 third-party plugins
Hi, I have just implemented an Audio Unit v3 host.

AgsAudioUnitPlugin *audio_unit_plugin;
AVAudioUnitComponentManager *audio_unit_component_manager;
NSArray<AVAudioUnitComponent *> *av_component_arr;
AudioComponentDescription description;

guint i, i_stop;

if(!AGS_AUDIO_UNIT_MANAGER(audio_unit_manager)){
  return;
}

audio_unit_component_manager = [AVAudioUnitComponentManager sharedAudioUnitComponentManager];

/* effects */
description = (AudioComponentDescription) {0,};
description.componentType = kAudioUnitType_Effect;

av_component_arr = [audio_unit_component_manager componentsMatchingDescription:description];

i_stop = [av_component_arr count];

for(i = 0; i < i_stop; i++){
  ags_audio_unit_manager_load_component(audio_unit_manager, (gpointer) av_component_arr[i]);
}

/* instruments */
description = (AudioComponentDescription) {0,};
description.componentType = kAudioUnitType_MusicDevice;

av_component_arr = [audio_unit_component_manager componentsMatchingDescription:description];

i_stop = [av_component_arr count];

for(i = 0; i < i_stop; i++){
  ags_audio_unit_manager_load_component(audio_unit_manager, (gpointer) av_component_arr[i]);
}

But this doesn't show me Audio Unit v2 plugins. Why?

regards,
Joël
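Not an answer, but a Swift sketch (assuming only AVFoundation and AudioToolbox) that may help narrow things down: dump every registered effect component along with whether its flags mark it as a v3 unit. If v2 plugins are registered with the system at all, they should show up here too:

import AVFoundation
import AudioToolbox

var description = AudioComponentDescription()
// Only the type is set; the remaining zeroed fields act as wildcards.
description.componentType = kAudioUnitType_Effect

for component in AVAudioUnitComponentManager.shared().components(matching: description) {
    let flags = AudioComponentFlags(rawValue: component.audioComponentDescription.componentFlags)
    let isV3 = flags.contains(.isV3AudioUnit)
    print("\(component.manufacturerName) - \(component.name) - v3: \(isV3)")
}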
Replies: 3 · Boosts: 0 · Views: 350 · Aug ’25
In SwiftUI for macOS, how can you detect if a view or any ancestor is "hidden"?
Given a View in SwiftUI for macOS, how can I tell if that view is hidden, either because its opacity (or any ancestor's) is 0.0, or because the .hidden modifier has been applied? Presumably I can do this manually with an Environment value set on the ancestor view, but I'm curious whether this can be done more idiomatically. An example use case: I have views that run long-running Tasks via the .task(id:) modifier. These tasks only need to be running if the View itself is visible to the user. When the View is hidden, the task should stop. When the View reappears, the Task should restart. This happens automatically when Views are created and destroyed, but does not happen when a view is only hidden.
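A sketch of the Environment-based approach mentioned above (the isVisible key is invented for illustration; SwiftUI has no built-in equivalent):

import SwiftUI

private struct IsVisibleKey: EnvironmentKey {
    static let defaultValue = true
}

extension EnvironmentValues {
    var isVisible: Bool {
        get { self[IsVisibleKey.self] }
        set { self[IsVisibleKey.self] = newValue }
    }
}

struct LongRunningView: View {
    @Environment(\.isVisible) private var isVisible

    var body: some View {
        Text("Working…")
            // .task(id:) cancels the old task and starts a new one
            // whenever visibility flips.
            .task(id: isVisible) {
                guard isVisible else { return }
                // Long-running work goes here.
            }
    }
}

// Ancestors must opt in manually when hiding a subtree:
// LongRunningView().opacity(0).environment(\.isVisible, false)

The drawback, as the post notes, is that every hiding ancestor has to set the value by hand; there doesn't appear to be a way to have SwiftUI derive it.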
Topic: UI Frameworks · SubTopic: SwiftUI
Replies: 2 · Boosts: 0 · Views: 92 · Aug ’25
Squircle app icons on macOS 26
Just posted this feedback regarding macOS 26 "Tahoe" (FB19853155) - please support it with additional submissions if you share my view. I will miss the beautiful, individually designed icons of the past!

"macOS 26 enforces squircles for app icons, falling back to a grey background for 3rd party apps without a compliant AppIcon asset. As a result, many original app icons are reduced in size and hard to distinguish because they share the same background color. Although I respect Apple's push for an iOS-like UI on Macs, a smooth transition path would be more user- and developer-friendly ... e.g. with some Info.plist property to opt out of icon migration, potentially ignored by a future macOS version. The current solution hurts usability and makes the system look inconsistent, as many apps - especially free ones - will not be updated with new icon designs. Please reconsider this bad design decision!"
Replies: 1 · Boosts: 2 · Views: 238 · Aug ’25
AVAudioUnit host - PCM buffer output silent
Hi, I just started to develop audio unit hosting support in my application. Offline rendering seems to work, except that I hear no output. Why? I suspect something goes wrong with the player. I connect to CoreAudio at a different location in the code.

Here are some error messages I have faced so far:

2025-08-14 19:42:04.132930+0200 com.gsequencer.GSequencer[34358:18611871] [avae] AVAudioEngineGraph.mm:4668 Can't retrieve source node to play sequence because there is no output node!
2025-08-14 19:42:04.151171+0200 com.gsequencer.GSequencer[34358:18611871] [avae] AVAudioEngineGraph.mm:4668 Can't retrieve source node to play sequence because there is no output node!
2025-08-14 19:43:08.344530+0200 com.gsequencer.GSequencer[34358:18614927] AUAudioUnit.mm:1417 Cannot set maximumFramesToRender while render resources allocated.
2025-08-14 19:43:08.346583+0200 com.gsequencer.GSequencer[34358:18614927] [avae] AVAEInternal.h:104 [AVAudioSequencer.mm:121:-[AVAudioSequencer(AVAudioSequencer_Player) startAndReturnError:]: (impl->Start()): error -10852
** (<unknown>:34358): WARNING **: 19:43:08.346: error during audio sequencer start - -10852

I have implemented an AVAudioEngine based AudioUnit host. Here I instantiate player and effect:

/* audio engine */
audio_engine = [[AVAudioEngine alloc] init];

fx_audio_unit_audio->audio_engine = (gpointer) audio_engine;

av_format = (AVAudioFormat *) fx_audio_unit_audio->av_format;

/* av audio player node */
av_audio_player_node = [[AVAudioPlayerNode alloc] init];

/* av audio unit */
av_audio_unit_effect = [[AVAudioUnitEffect alloc] initWithAudioComponentDescription:[((AVAudioUnitComponent *) AGS_AUDIO_UNIT_PLUGIN(base_plugin)->component) audioComponentDescription]];

av_audio_unit = (AVAudioUnit *) av_audio_unit_effect;
fx_audio_unit_audio->av_audio_unit = av_audio_unit;

/* audio sequencer */
av_audio_sequencer = [[AVAudioSequencer alloc] initWithAudioEngine:audio_engine];

fx_audio_unit_audio->av_audio_sequencer = (gpointer) av_audio_sequencer;

/* output node */
/* note: this instance is created but never attached or assigned;
   the engine provides its own outputNode */
[[AVAudioOutputNode alloc] init];

/* audio player and audio unit */
[audio_engine attachNode:av_audio_player_node];
[audio_engine attachNode:av_audio_unit];

[audio_engine connect:av_audio_player_node to:av_audio_unit format:av_format];
[audio_engine connect:av_audio_unit to:[audio_engine outputNode] format:av_format];

ns_error = NULL;

[audio_engine enableManualRenderingMode:AVAudioEngineManualRenderingModeOffline format:av_format maximumFrameCount:buffer_size error:&ns_error];

if(ns_error != NULL && [ns_error code] != noErr){
  g_warning("enable manual rendering mode error - %d", [ns_error code]);
}

ns_error = NULL;

[[av_audio_unit AUAudioUnit] allocateRenderResourcesAndReturnError:&ns_error];

if(ns_error != NULL && [ns_error code] != noErr){
  g_warning("Audio Unit allocate render resources returned error - ErrorCode %d", [ns_error code]);
}

Then I render in a dedicated thread:

ns_error = NULL;

[audio_engine startAndReturnError:&ns_error];

if(ns_error != NULL && [ns_error code] != noErr){
  g_warning("error during audio engine start - %d", [ns_error code]);
}

[av_audio_sequencer prepareToPlay];

ns_error = NULL;
[av_audio_sequencer startAndReturnError:&ns_error];

if(ns_error != NULL && [ns_error code] != noErr){
  g_warning("error during audio sequencer start - %d", [ns_error code]);
}

[av_audio_player_node play];

while(is_running){
  /* pre sync */

  /* IO buffers */
  av_output_buffer = (AVAudioPCMBuffer *) scope_data->av_output_buffer;
  av_input_buffer = (AVAudioPCMBuffer *) scope_data->av_input_buffer;

  /* fill input buffer */

  /* schedule av input buffer */
  frame_position = 0; // (gint64) ((note_offset * absolute_delay) + delay_counter) * buffer_size;

  av_audio_player_node = (AVAudioPlayerNode *) fx_audio_unit_audio->av_audio_player_node;

  AVAudioTime *av_audio_time = [[AVAudioTime alloc] initWithHostTime:frame_position sampleTime:frame_position atRate:((double) samplerate)];

  [av_audio_player_node scheduleBuffer:av_input_buffer atTime:av_audio_time options:0 completionHandler:nil];

  /* render */
  ns_error = NULL;

  status = [audio_engine renderOffline:AGS_FX_AUDIO_UNIT_AUDIO_FIXED_BUFFER_SIZE toBuffer:av_output_buffer error:&ns_error];

  if(ns_error != NULL && [ns_error code] != noErr){
    g_warning("render offline error - %d", [ns_error code]);
  }
}

regards,
Joël
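For comparison, a minimal offline-rendering skeleton in Swift following the documented ordering: manual rendering mode is enabled before the engine starts, and the engine's own outputNode is the render target (the snippet above allocates a separate AVAudioOutputNode that is never attached). This is a sketch, not a drop-in fix:

import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)!

engine.attach(player)
// Manual rendering replaces the hardware behind the engine's own output node.
engine.connect(player, to: engine.outputNode, format: format)

do {
    try engine.enableManualRenderingMode(.offline, format: format, maximumFrameCount: 4096)
    try engine.start()
    player.play()

    let output = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                                  frameCapacity: engine.manualRenderingMaximumFrameCount)!

    // Schedule source audio on the player before pulling frames,
    // otherwise the rendered buffers are silence.
    let status = try engine.renderOffline(4096, to: output)
    print(status == .success ? "rendered \(output.frameLength) frames" : "status: \(status.rawValue)")
} catch {
    print("engine error: \(error)")
}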
Replies: 3 · Boosts: 0 · Views: 405 · Aug ’25
Avoid trackpad gesture conflict between dragging and accessibility zooming when using three fingers
"Double-tap three fingers and drag to change zoom" should suppress "Three Finger Drag". Currently these gestures are triggered simultaneously, for no real reason. I saw different behaviors in different environments, but none is desired. This seems like an issue, so I filed a feedback.
Replies: 0 · Boosts: 0 · Views: 717 · Aug ’25
Java remote debugging stymied by connection refused on local network
I am trying to set up remote Java debugging between two machines running macOS (15.6 and 26). I am able to get the Java program to listen on a socket. However, I can connect to that socket only from the same machine, not from another machine on my local network. I use nc to test the connection. It reports "Connection refused" when trying to connect from the other machine.

This issue sounds like it could be caused by the Java program lacking the Local Network system permission. I am familiar with that issue arising when a program attempts to connect to a port on the local network. In that case, a dialog is displayed and System Settings can be used to grant Local Network permission to the client program. I don't know whether the same permission is required on the program that is receiving client requests. If it is, then I don't know how to grant that permission. There is no dialog, and System Settings does not provide any obvious way to grant permission to a program that I specify. Note that a Java application is a program run by the java command, not a bundled application. The java command contains a hard-wired Info.plist which, annoyingly, requests permission to use the microphone, but not Local Network access.
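Separate from the permission question, one classic cause of exactly this symptom is worth ruling out (an assumption on my part, since the post doesn't show the JDWP options): since JDK 9 the debug agent binds to localhost only unless the address explicitly names all interfaces. MyApp and port 5005 below are placeholders:

# JDK 9+: binds to localhost only; remote machines get "Connection refused"
java -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005 MyApp

# Binds to all interfaces and is reachable from the local network
java -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=*:5005 MyApp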
Replies: 5 · Boosts: 1 · Views: 373 · Aug ’25
Is there any future for screensavers on macOS?
I haven't been looking at screensavers for a long time because of Apple's lack of will (or resources?) to provide a public version of the modern private SDK that Apple itself has been using for a very long time now. I'm now looking at the Screen Saver pane in System Settings (the What-If version of System Preferences in an alternate universe where all screens are in portrait mode).

In macOS Sequoia, it seems like 3rd party screensavers are not welcome, considering that they are relegated to the "Other" section at the bottom of the list and you have to click Show All to start seeing them. I also had a quick look at macOS Tahoe Beta 3, and it looks like all the real screensavers are gone (3rd party and the ones from Apple: Hello, Message, Flurry, etc.), or at least you need a Nobel Prize to find them (and the Search field is not useful). I tried to install a 3rd party screen saver on macOS Tahoe Beta 3; it doesn't show up in the list.

To summarize:
- No public access to modern APIs, AFAIK.
- A UI that is hostile to 3rd party screen savers on macOS Sequoia.
- Apparently only screensavers that are slideshows or movies curated by Apple in macOS Tahoe b3.

Hence the question: Is there any future for screen savers on macOS? Because if there's none, I won't waste my time trying to update some old screen savers.
Replies: 3 · Boosts: 0 · Views: 511 · Aug ’25
macOS is experiencing a severe memory leak issue.
After installing macOS Tahoe 26.0 beta 5 (25A5338b) or beta 4, a memory and disk leak appears, likely caused by Spotlight indexing. The mds_stores process consumes up to 60 GB of RAM, while disk writes continue until an additional 50 GB is filled. Once the maximum is reached, the system frees the space and the cycle repeats, leading to severe memory leakage. Of the available space, system data occupies as much as 150 GB.

System versions affected: macOS Tahoe 26.0 beta 5 (25A5338b) and beta 4.

Reproduction steps: Occurs immediately after booting up, with no additional user actions.

Workaround: Booting into Safe Mode prevents the issue from occurring.

Possible cause: A bug in Spotlight indexing. In Safe Mode, Launchpad cannot search for applications, and no other issues have been observed so far. This strongly suggests that Spotlight indexing is the root cause.

Potentially affected software: Developer-related tools such as Node.js, Maven, Vue.js, etc. It is possible that Spotlight is indexing developer project dependency directories (e.g., node_modules), which contain a large number of small files that do not require indexing, leading to this problem.
Replies: 7 · Boosts: 4 · Views: 296 · Aug ’25
macOS is experiencing a severe memory leak issue.
Problem description: After these two versions, a memory and disk leak appeared, possibly caused by Spotlight indexing. The mds_stores process uses up to 60 GB of RAM, and the disk is continuously written with another 50 GB. Once this reaches its maximum, the system frees the space and starts writing again, leading to a severe memory leak. Of the available space, system data occupies as much as 150 GB.

System versions: macOS Tahoe 26.0 beta 5 (25A5338b) and beta 4.

Reproduction steps: Starts right at boot, with no further user actions.

Workaround: The issue did not reproduce in Safe Mode.

Possible cause: A bug in Spotlight indexing. In Safe Mode, Launchpad cannot search for applications; no other issues have been found so far. Spotlight indexing is very likely the culprit.

Potentially affected software: Node.js, Maven, Vue, and other developer-related software. Spotlight may be indexing developer project dependency directories (e.g., node_modules), which contain an extremely large number of small files that do not need indexing; this may be the cause.
Replies: 3 · Boosts: 0 · Views: 260 · Aug ’25
How can I connect NSTableCellView.textField to a SwiftUI view?
When using NSTableView or NSOutlineView, if you use an NSTableCellView and wire up the .imageView and .textField properties, then you get some "free" behaviour with respect to styling and sizing of those fields (e.g. they reflect the user's preferred "Sidebar Icon Size" as selected in Settings). If I'm using a SwiftUI View inside an NSTableCellView, is there any way to connect a Text or Image to those properties? Consider the following pseudo code:

struct MyCellView: View {
    let text: String
    let url: URL?

    var body: some View {
        HStack {
            Image(...) // How to indicate this is .imageView?
            Text(...)  // How to indicate this is .textField?
        }
    }
}

final class MyTableCellView: NSTableCellView {
    private var hostingView: NSHostingView<MyCellView>!

    init() {
        self.hostingView = NSHostingView(rootView: MyCellView(text: "", url: nil))
        self.addSubview(self.hostingView)
    }

    func configureWith(text: String, url: URL) {
        let rootView = MyCellView(text: text, url: url)
        hostingView.rootView = rootView

        // How can I make this connection?
        self.textField = rootView.???
        self.imageView = rootView.???
    }
}

I'm ideally looking for a solution that works on macOS 15+.
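One workaround sketch, purely an assumption rather than a known-supported pattern: wire hidden AppKit mirrors to the outlets so the table machinery has something to inspect, and keep them in sync with the values handed to SwiftUI. This only helps with whatever behaviour AppKit derives from the outlets; it does not style the SwiftUI content itself:

import AppKit
import SwiftUI

// A concrete stand-in for the post's MyCellView.
struct MyRowView: View {
    var text: String

    var body: some View {
        Text(text)
    }
}

final class MirroredTableCellView: NSTableCellView {
    private let hostingView = NSHostingView(rootView: MyRowView(text: ""))
    private let mirrorTextField = NSTextField(labelWithString: "")
    private let mirrorImageView = NSImageView()

    override init(frame frameRect: NSRect) {
        super.init(frame: frameRect)
        // Hidden mirrors give AppKit outlets to inspect; the visible
        // content stays in SwiftUI.
        mirrorTextField.isHidden = true
        mirrorImageView.isHidden = true
        addSubview(mirrorTextField)
        addSubview(mirrorImageView)
        addSubview(hostingView)
        textField = mirrorTextField
        imageView = mirrorImageView
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    func configure(text: String) {
        mirrorTextField.stringValue = text   // keep the outlet in sync
        hostingView.rootView = MyRowView(text: text)
    }
}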
Replies: 2 · Boosts: 0 · Views: 106 · Aug ’25
Missing flows for content filter on macOS 15 Sequoia
We use a content filter in our app to monitor flows; we gather data about each flow and block flows deemed suspicious. Our content filter is activated/deactivated by a UI app, but the flows are reported via XPC to a separate daemon process for analysis.

As of macOS 15, we are seeing cases where flows are missing or flows are not received at all by the content filter. The behaviour is not consistent: some devices seem to receive flows normally but others don't. It appears Intel devices are much less prone to showing the problem, whereas Arm devices routinely exhibit missing flows. On macOS 14 or earlier, there is no sign of missing flows. Testing on earlier beta versions of macOS 15 did not appear to show the problem; however, I can't rule out that the issue was present but wasn't spotted. Experimenting with simple examples of using a content filter (e.g. QNE2FilterMac) does not appear to reproduce the issue.

Questions:
- What has changed between macOS 14 and 15 that could be the cause of the lack of flows?
- Is our approach of using an app-activated content filter reporting to a daemon connected via XPC unsupported?
Replies: 7 · Boosts: 1 · Views: 972 · Aug ’25
Initializing LanguageModelSession crashes app on macOS
Whenever I try to initialize a LanguageModelSession (let session = LanguageModelSession()), my app crashes with EXC_BAD_ACCESS, even though SystemLanguageModel.default.availability returns available. I tried running the two sample projects I found that use Foundation Models, FoundationModelsTripPlanner and SwiftTranscriptionSampleApp, and they both also crash, immediately on launch. I commented out the Foundation Models logic from the SwiftTranscriptionSampleApp and ran it again, and it no longer crashed. I'm on macOS 26 Beta 4 on an M1 Pro device. I'm based in Austria (EU), if that matters.
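For reference, a minimal sketch of the described flow (names per the Foundation Models framework; the crash reportedly occurs on the init line even when availability reports available):

import FoundationModels

switch SystemLanguageModel.default.availability {
case .available:
    // Reported to crash here with EXC_BAD_ACCESS despite availability.
    let session = LanguageModelSession()
    print("Session created: \(session)")
default:
    print("Model unavailable")
}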
Replies: 7 · Boosts: 4 · Views: 296 · Aug ’25
General > Login Items > Allow in Background (User visible item names) in Ventura
In the latest beta of Ventura (and perhaps earlier versions) there is a section of the System Settings > General > Login Items pane called "Allow in Background". It appears that helpers (LaunchAgents/LaunchDaemons) that are installed by apps are listed here. On my test system I have 3 such items installed: the per-user LaunchAgent for the Google Updater, the Wireshark LaunchDaemon for the ChmodBPF script, and the LaunchDaemon for my userspace CoreAudio driver (labelled "Metric Halo Distribution, Inc."). The Wireshark and Google Updater items have nice, user-identifiable names associated with them, whereas my LaunchDaemon only has my company name associated with it. I don't see anything in the plists for Wireshark or Google Updater that seems to specify this user-visible string, nor in the bundles the plists point to.

How do I go about annotating my LaunchDaemon plist or the helper tool's plist so that the string in this pane helps the user properly identify what this background item is for, so that they don't accidentally turn it off and disable the driver they need to use our audio hardware? Obviously we will document this, but just as obviously users don't always read the docs, and it would be better if the user could make the immediate association that this background item is needed for our CoreAudio driver.
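For what it's worth, I believe the AssociatedBundleIdentifiers launchd plist key (introduced alongside the Ventura login-items work in ServiceManagement) is how a background item gets attributed to an app's display name and icon. A sketch with hypothetical identifiers and paths:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.example.coreaudio-driver-helper</string>
    <key>Program</key>
    <string>/Library/Audio/Plug-Ins/HAL/ExampleDriver.driver/Contents/MacOS/helper</string>
    <!-- Attributes this item to the named app in Login Items.
         The bundle identifier below is hypothetical. -->
    <key>AssociatedBundleIdentifiers</key>
    <array>
        <string>com.example.ControlApp</string>
    </array>
</dict>
</plist>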
Replies: 5 · Boosts: 2 · Views: 2.2k · Aug ’25
Disable QUIC/HTTP3 support for a specific macOS application
Hello, I am currently investigating whether we can disable the usage of QUIC at the application level. I know we can set enable_quic in /Library/Preferences/com.apple.networkd.plist to false, but that has a global impact: since this is a system file, all applications on the machine will stop using QUIC. I don't want that. What I am looking for is to disable QUIC only for my application. Is there any way I can modify the URLSession object in my application to disable QUIC, or modify the URLSessionConfiguration so the system will not use QUIC?
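One per-request knob that does exist is URLRequest's assumesHTTP3Capable, which controls whether the initial attempt may race HTTP/3. Whether it is sufficient to keep a connection off QUIC in all cases (e.g. after an Alt-Svc upgrade) is an open question, so treat this as a sketch:

import Foundation

var request = URLRequest(url: URL(string: "https://example.com")!)
// false (the default) means the first attempt won't assume HTTP/3;
// the system may still learn HTTP/3 support via Alt-Svc later.
request.assumesHTTP3Capable = false

let task = URLSession.shared.dataTask(with: request) { data, response, error in
    if let http = response as? HTTPURLResponse {
        print("Status: \(http.statusCode)")
    }
}
task.resume()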
Replies: 3 · Boosts: 0 · Views: 145 · Aug ’25
Unsigned macOS app installed in /Applications does not appear in Launchpad
Hello, I have a macOS app built with Flutter's macOS target (native Xcode project). The app is unsigned (no Developer ID code signing / notarization). The .app bundle looks valid:

- CFBundlePackageType = APPL
- Unique CFBundleIdentifier
- No LSUIElement or LSBackgroundOnly
- Executable exists and is runnable
- Placed at /Applications/MyApp.app (top-level), runs fine from Finder

However, it does not show up in Launchpad.

What I tried:

Remove quarantine:
xattr -dr com.apple.quarantine "/Applications/MyApp.app"

Force Launch Services registration:
/System/Library/Frameworks/CoreServices.framework/Frameworks/LaunchServices.framework/Support/lsregister -f "/Applications/MyApp.app"

Rebuild LS caches:
/System/Library/Frameworks/CoreServices.framework/Frameworks/LaunchServices.framework/Support/lsregister -kill -r -domain local -domain system -domain user

Reset Launchpad DB and restart Dock:
defaults write com.apple.dock ResetLaunchPad -bool true && killall Dock

Verified bundle structure/type:
mdls -name kMDItemContentType -name kMDItemKind "/Applications/MyApp.app" → shows com.apple.application-bundle / Application

Questions:
- Is code signing/notarization required for an app to appear in Launchpad (even if it runs from Finder)?
- What additional conditions cause Launchpad to skip an otherwise valid, unsigned .app in /Applications?
- Are there deeper Launch Services or Dock database checks I can run to diagnose why this specific app is excluded?
Topic: Code Signing · SubTopic: General
Replies: 2 · Boosts: 0 · Views: 111 · Aug ’25
Quick Look Extension does not load MapKit map properly anymore, after macOS Sequoia
It appears that starting with macOS Sequoia, the Quick Look Preview extension no longer loads MapKit maps correctly. Map tiles do not appear, leaving users with a beige background. Users report that polylines do render correctly, but annotations appear black. This was previously working fine in prior macOS versions, including Sonoma.

STEPS TO REPRODUCE
1. Create a macOS app project with an associated document.
2. Ensure the project has a Quick Look Preview extension, with the necessary basic setup.
3. Ensure that the extension mentioned in (2) has an MKMapView.

No other cosmetic changes need to be implemented to observe the base issue. Do note that it has been reported that, in addition to the map tiles not loading, annotations don't render correctly either.
Replies: 3 · Boosts: 2 · Views: 850 · Aug ’25
macOS 26: NSTokenField crashes due to NSGenericException caused by too many Update Constraints
This example application crashes when entering any text into the token field with:

FAULT: NSGenericException: The window has been marked as needing another Update Constraints in Window pass, but it has already had more Update Constraints in Window passes than there are views in the window.

The app uses controlTextDidChange to update a live preview, where it accesses the objectValue of the token field. If one character is entered, it also looks like the NSTokenFieldDelegate methods

tokenField(_:styleForRepresentedObject:)
tokenField(_:editingStringForRepresentedObject:)
tokenField(_:representedObjectForEditing:)

are called more than 10,000 times until the example app crashes on macOS Tahoe 26 beta 6. I reported this issue with beta 1 as FB18088608, but haven't heard back so far. I have multiple occurrences of this issue in my app, which works fine on previous versions of macOS. I haven't found a workaround yet, and I'm getting anxious about this issue persisting into the official release.
Replies: 0 · Boosts: 0 · Views: 95 · Aug ’25