Consider the code from my previous question: https://developer.apple.com/forums/thread/776592
How can I change the background color of a focused item?
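In case it helps future readers, here is a generic sketch (independent of the linked code, which isn't reproduced here): for an NSTableView, the focused/selected appearance is drawn by NSTableRowView, so one option is vending a row view that overrides drawSelection(in:). Whether this matches the linked setup is an assumption.

import AppKit

// Hypothetical row view: custom background when the row is selected and the
// table has focus (isEmphasized).
class FocusAwareRowView: NSTableRowView {
    override func drawSelection(in dirtyRect: NSRect) {
        if isEmphasized {
            // The table (or its window) has keyboard focus: use a custom fill.
            NSColor.systemOrange.setFill()
            bounds.fill()
        } else {
            super.drawSelection(in: dirtyRect)
        }
    }
}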
I just made a simple AppKit app, but don't know how to remove the borders of rows when they're swiped.
SwiftUI's List does not have this problem, though.
Attaching a GIF demo and the code:
import SwiftUI
import AppKit

struct NSTableViewWrapper: NSViewRepresentable {
    @State var data: [String]

    class Coordinator: NSObject, NSTableViewDataSource, NSTableViewDelegate {
        var parent: NSTableViewWrapper
        weak var tableView: NSTableView?

        init(parent: NSTableViewWrapper) {
            self.parent = parent
        }

        func numberOfRows(in tableView: NSTableView) -> Int {
            self.tableView = tableView
            return parent.data.count
        }

        func tableView(_ tableView: NSTableView, viewFor tableColumn: NSTableColumn?, row: Int) -> NSView? {
            let cell = tableView.makeView(withIdentifier: NSUserInterfaceItemIdentifier("Cell"), owner: nil) as? NSTextField
                ?? NSTextField(labelWithString: "")
            cell.identifier = NSUserInterfaceItemIdentifier("Cell")
            cell.stringValue = parent.data[row]
            cell.isBordered = false
            return cell
        }

        func tableView(_ tableView: NSTableView, rowActionsForRow row: Int, edge: NSTableView.RowActionEdge) -> [NSTableViewRowAction] {
            guard edge == .trailing else { return [] }
            let deleteAction = NSTableViewRowAction(style: .destructive, title: "Delete") { action, index in
                self.deleteRow(at: index, in: tableView)
            }
            return [deleteAction]
        }

        private func deleteRow(at index: Int, in tableView: NSTableView) {
            guard index < parent.data.count else { return }
            NSAnimationContext.runAnimationGroup({ context in
                context.duration = 0.3
                tableView.removeRows(at: IndexSet(integer: index), withAnimation: .slideUp)
            }, completionHandler: {
                DispatchQueue.main.async {
                    self.parent.data.remove(at: index)
                    tableView.reloadData()
                }
            })
        }
    }

    func makeCoordinator() -> Coordinator {
        return Coordinator(parent: self)
    }

    func makeNSView(context: Context) -> NSScrollView {
        let scrollView = NSScrollView()
        let tableView = NSTableView()
        let column = NSTableColumn(identifier: NSUserInterfaceItemIdentifier("Column"))
        column.width = 200
        tableView.addTableColumn(column)
        tableView.delegate = context.coordinator
        tableView.dataSource = context.coordinator
        tableView.backgroundColor = .clear
        tableView.headerView = nil
        tableView.rowHeight = 50
        tableView.style = .inset
        scrollView.documentView = tableView
        scrollView.hasVerticalScroller = true
        scrollView.additionalSafeAreaInsets = .init(top: 0, left: 0, bottom: 6, right: 0)
        return scrollView
    }

    func updateNSView(_ nsView: NSScrollView, context: Context) {
        (nsView.documentView as? NSTableView)?.reloadData()
    }
}

struct ContentView: View {
    @State private var itemsString = Array(0..<40).map(\.description)

    var body: some View {
        NSTableViewWrapper(data: itemsString)
    }
}

func createAppWindow() {
    let window = NSWindow(
        contentRect: .zero,
        styleMask: [.titled],
        backing: .buffered,
        defer: false
    )
    window.title = "NSTableView from AppKit"
    window.contentViewController = NSHostingController(rootView: ContentView())
    window.setContentSize(NSSize(width: 759, height: 300))
    window.center()
    window.makeKeyAndOrderFront(nil)
}

class AppDelegate: NSObject, NSApplicationDelegate {
    func applicationDidFinishLaunching(_ notification: Notification) {
        createAppWindow()
    }
}

let delegate = AppDelegate()
NSApplication.shared.delegate = delegate
NSApplication.shared.run()
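One thing worth trying (an untested sketch, not a confirmed fix): vend a custom NSTableRowView that suppresses separator drawing, since the inset style draws row separators that become visible while a row is swiped.

import AppKit

// Hypothetical row view that skips separator drawing entirely.
class BorderlessRowView: NSTableRowView {
    override func drawSeparator(in dirtyRect: NSRect) {
        // Intentionally empty: no separator, so no border shows during a swipe.
    }
}

extension NSTableViewWrapper.Coordinator {
    // Vend the custom row view from the existing delegate.
    func tableView(_ tableView: NSTableView, rowViewForRow row: Int) -> NSTableRowView? {
        return BorderlessRowView()
    }
}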
When I present a view controller whose view is a SwiftUI view via presentAsModalWindow(_:), the presented window is no longer centered horizontally on the screen; instead, its origin is placed at the horizontal center. I know this issue occurs on macOS 15.2+, but I can't tell whether it starts at 15.0. I couldn't find any documentation on why this was changed.
Here's an example code that represents my architecture:
class RootViewController: NSViewController {
    private lazy var button: NSButton = NSButton(
        title: "Present",
        target: self,
        action: #selector(presentView))

    override func viewDidLoad() {
        super.viewDidLoad()
        // Add button to tree
    }

    @objc func presentView() {
        presentAsModalWindow(PresentedViewController())
    }
}

class PresentedViewController: NSViewController {
    override func loadView() {
        view = NSHostingView(rootView: MyView())
    }
}

struct MyView: View {
    /* impl */
}
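A hedged workaround sketch rather than a documented fix: re-center the window once it is on screen. The assumption is that by viewDidAppear() the modal window exists and can be moved safely.

class PresentedViewController: NSViewController {
    override func loadView() {
        view = NSHostingView(rootView: MyView())
    }

    override func viewDidAppear() {
        super.viewDidAppear()
        // Assumption: the window is set by now, so we can manually re-center
        // it to compensate for the changed behavior on macOS 15.2+.
        view.window?.center()
    }
}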
In our application we have two use cases for a hotkey/shortcut identification API.
We have some predefined shortcuts that will ship with our macOS application. They may change dynamically, based on what the user has already set as shortcuts/hotkeys, and also to avoid clashing with any important system-wide shortcuts the user may or may not have changed.
We allow the user to customize the shortcuts/hotkeys in our application, so we want to show which shortcuts the user already has in use system-wide and across their OS experience.
This gives rise to the need for an API that tells us which shortcuts/hotkeys are currently in use by the user, as well as the current system-wide OS shortcuts in use.
Please let me know if there are any APIs in AppKit or SwiftUI we can use for the above.
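I'm not aware of a single public AppKit or SwiftUI API for this; one partial option is Carbon's CopySymbolicHotKeys, which lists the system's symbolic hotkeys (but not other apps' shortcuts). A minimal sketch, with the dictionary keys written as string literals matching the kHISymbolicHotKey* constants from HIToolbox:

import Carbon

func symbolicSystemHotKeys() -> [[String: Any]] {
    var hotKeysRef: Unmanaged<CFArray>?
    // CopySymbolicHotKeys follows the Copy rule, hence takeRetainedValue().
    guard CopySymbolicHotKeys(&hotKeysRef) == noErr,
          let hotKeys = hotKeysRef?.takeRetainedValue() as? [[String: Any]] else {
        return []
    }
    return hotKeys
}

for hotKey in symbolicSystemHotKeys() {
    // Keys mirror the kHISymbolicHotKey* constants.
    let keyCode = hotKey["kHISymbolicHotKeyCode"] as? Int
    let modifiers = hotKey["kHISymbolicHotKeyModifiers"] as? Int
    let isEnabled = hotKey["kHISymbolicHotKeyEnabled"] as? Bool
    print(keyCode ?? -1, modifiers ?? 0, isEnabled ?? false)
}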
I have an NSViewController as the root view, with a SwiftUI view embedded in it via NSHostingView.
override func loadView() {
    self.view = NSHostingView(rootView: SwiftUiView())
}
In the SwiftUiView, I have a TextField and an NSTextView embedded using NSViewRepresentable, along with a few buttons. There is also a menu:
Menu {
    ForEach(menuItems, id: \.self) { item in
        Button {
            buttonClicked()
        } label: {
            Text(item)
        }
    }
} label: {
    Image("DropDown")
        .contentShape(Rectangle())
        .frame(maxWidth: .infinity)
        .frame(maxHeight: .infinity)
}
The NSTextView and TextField work fine, and I can type in them until I click on the menu or show an alert. After that, I can no longer place my cursor in the text fields. I am able to select the text but not type in it. When I click on the NSTextView or TextField, nothing happens.
At first, I thought it was just a cursor visibility issue and tried typing, but I received an alert sound. I've been trying to fix this for a couple of days and haven't found any related posts. Any help would be greatly appreciated.
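One thing I plan to try (a hypothetical workaround, not a confirmed fix): explicitly hand first-responder status back to the text view after the menu or alert is dismissed. textView here stands in for the NSTextView inside the representable.

import AppKit

// Hypothetical helper: call this once the menu/alert has closed.
func restoreFocus(to textView: NSTextView) {
    DispatchQueue.main.async {
        textView.window?.makeFirstResponder(textView)
    }
}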
In some places of our app we make use of NSAccessibilityElement subclasses to vend some extra items to accessibility clients.
We need to know which item has the VoiceOver focus so we can keep track of it.
setAccessibilityFocused: does not get called when accessibility clients focus NSAccessibilityElements. This method is only called when accessibility clients focus view-based accessibility elements (i.e. when an NSView subclass gets focused).
At the same time we need to programmatically move VoiceOver focus to those items when something happens. Those accessibility elements inherit from NSObject so we can't make them first responder.
Is this the expected behavior? What are our options in terms of reacting to VoiceOver cursor moving around? What are our options in terms of programmatically moving the VoiceOver cursor to a different element?
Here's a sample project that demonstrates the first part of the issue: https://github.com/vendruscolo/apple-rdars/tree/master/DTS12368714%20-%20NSAccessibilityElement%20focus%20tracking
If you run the app, a window will show up. It contains a button and a red square. If you enable VoiceOver, you'll be able to move the cursor over the red square, and a message will be logged. You'll also notice there's an extra element after the red square. That element is available to VoiceOver; however, when it gets focused, no message gets logged.
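For the second part (programmatically moving the VoiceOver cursor), here is a sketch of the approach I'd expect to work, posting the focus-changed notification for the vended element; treat it as an assumption rather than documented behavior:

import AppKit

// Sketch: ask accessibility clients (VoiceOver) to move their cursor to a
// vended NSAccessibilityElement by posting focusedUIElementChanged for it.
func moveVoiceOverFocus(to element: NSAccessibilityElement) {
    NSAccessibility.post(element: element, notification: .focusedUIElementChanged)
}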
Where does an NSRulerView get its magnification from, and how? I am not using NSScrollView's automatic magnification but my own mechanism. How do I relay the zoom factor to the NSRulerView?
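One workaround sketch, assuming the zoom is applied by your own mechanism: register a measurement unit whose conversion factor folds in the zoom, and re-register/re-assign it whenever the zoom changes. The unit name "ZoomedPoints" is made up for illustration.

import AppKit

// Hypothetical helper: make the ruler's units track an externally managed zoom.
func updateRuler(_ rulerView: NSRulerView, zoomFactor: CGFloat) {
    NSRulerView.registerUnit(
        withName: NSRulerView.UnitName("ZoomedPoints"),
        abbreviation: "pt",
        unitToPointsConversionFactor: zoomFactor,
        stepUpCycle: [2.0],
        stepDownCycle: [0.5]
    )
    rulerView.measurementUnits = NSRulerView.UnitName("ZoomedPoints")
    rulerView.needsDisplay = true
}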
So I am looking to use a custom NSWindow in my application (so I can implement some enhanced resizing/dragging behavior that is only possible by overriding NSWindow).
The problem is my whole application is currently SwiftUI-based (see the project here: https://github.com/msdrigg/Roam/blob/50a2a641aa5f2fccb4382e14dbb410c1679d8b0c/Roam/RoamApp.swift).
I know there is a way to make this work by dropping my @main SwiftUI app and replacing it with a SwiftUI root view hosted in a standard AppKit root app, but that feels like I'm going backwards.
Is there another way to get access (and override) the root NSWindow for a SwiftUI app?
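I don't know of a supported way to swap in an NSWindow subclass under a SwiftUI App, but if access to the existing window is enough (tweaking its behavior rather than replacing its class), the common representable trick is a sketch like this; WindowAccessor is my own name for it:

import SwiftUI
import AppKit

// Common pattern: a zero-sized NSView whose only job is to report the NSWindow
// it lands in, one turn of the run loop after insertion.
struct WindowAccessor: NSViewRepresentable {
    var onWindow: (NSWindow?) -> Void

    func makeNSView(context: Context) -> NSView {
        let view = NSView()
        DispatchQueue.main.async {
            onWindow(view.window)
        }
        return view
    }

    func updateNSView(_ nsView: NSView, context: Context) {}
}

// Usage: .background(WindowAccessor { window in /* customize the window */ })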
I'm attempting to write a macOS version of https://stackoverflow.com/a/74935849/2178159.
From my understanding, I should be able to set the menu property of an NSResponder and it will automatically show on right click.
I've tried a couple things:
A: set menu on an NSHostingController's view - when I do this and right or ctrl click, nothing happens.
B: set menu on NSHostingController directly - when I do this I get a crash Abstract method -[NSResponder setMenu:] called from class _TtGC7SwiftUI19NSHostingControllerGVS_21_ViewModifier_...__. Subclasses must override
C: manually call NSMenu.popup in the rightMouseDown method of a custom subclass of NSHostingController or NSView - nothing happens.
extension View {
    func contextMenu(menu: NSMenu) -> some View {
        modifier(ContextMenuViewModifier(menu: menu))
    }
}

struct ContextMenuViewModifier: ViewModifier {
    let menu: NSMenu

    func body(content: Content) -> some View {
        Interaction_UI(
            view: { content },
            menu: menu
        )
        .fixedSize()
    }
}

private struct Interaction_UI<Content: View>: NSViewRepresentable {
    typealias NSViewType = NSView

    @ViewBuilder var view: Content
    let menu: NSMenu

    func makeNSView(context: Context) -> NSView {
        let v = NSHostingController(rootView: view)
        // option A - no effect
        v.view.menu = menu
        // option B - crash
        v.menu = menu
        return v.view
    }

    func updateNSView(_ nsView: NSViewType, context: Context) {
        // part of option A
        nsView.menu = menu
    }
}
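One more avenue worth trying (call it option D, an untested sketch): keep the menu on a plain NSView container that overrides menu(for:), and host the SwiftUI content as a subview, so AppKit's own right-click path supplies the menu:

private final class MenuHostView: NSView {
    var contextMenu: NSMenu?

    // AppKit asks the clicked view for a menu on right/ctrl-click before
    // consulting the responder chain, so this bypasses NSHostingController.
    override func menu(for event: NSEvent) -> NSMenu? {
        contextMenu ?? super.menu(for: event)
    }
}

// In makeNSView: create MenuHostView, set contextMenu, then add the hosting
// controller's view as an autoresized subview instead of returning it directly.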
In the good old days, it was possible to retrieve dynamically the UnknownFSObjectIcon.icns icon using:
[[NSWorkspace sharedWorkspace] iconForFileType:NSFileTypeForHFSTypeCode(kUnknownFSObjectIcon)];
Now this solution is considered deprecated (though it still works) by recent macOS SDKs.
[Q] What is the modern equivalent of this solution?
Notes:
Yes, reading the file directly works but is more fragile than using a System API.
Yes, Xcode suggests using the iconForContentType: method, but I haven't found which UTType should be used.
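A hedged guess rather than a confirmed mapping: UTType.item ("public.item") is the base type for any file-system object, so its icon may be the closest modern equivalent; whether it actually resolves to UnknownFSObjectIcon.icns is an assumption.

import AppKit
import UniformTypeIdentifiers

// Assumption: .item is the generic "unknown file-system object" type.
let unknownObjectIcon = NSWorkspace.shared.icon(for: UTType.item)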
All the threads only contain system calls. The crashed thread only contains a single call to my app's code which is main.swift:13.
What could cause such a crash?
crash.crash
Is there some reason UIKit's and AppKit's animate(with:changes:completion:) methods are marked deprecated in iOS 18 when they were also first made available in iOS 18? If they are indeed already deprecated, is there a replacement method we are supposed to use? This method allows the developer to use SwiftUI animations to animate UIKit and AppKit views.
I'm looking to develop a very rich networking macOS app (like a social media app) used by a very large number of users, where each user can create a number of windows, operate/view each of them, customize the app to their liking, etc. The UI is expected to be very rich and dynamic.
The question is, should I choose AppKit or SwiftUI?
I have a basic understanding of SwiftUI, its declarative way of defining UI layouts, and populating them with data. I'm not sure whether SwiftUI can handle a very rich and dynamic UI customized by a large number of users.
Any thoughts? What works best in this scenario? What is Apple's recommendation?
In our macOS application, we are using SwiftUI as the entry point and attaching an app delegate using NSApplicationDelegateAdaptor.
We are using NSViewControllerRepresentable to add a view controller to the hierarchy so that we can store the view controller's instance and add content to it programmatically.
@main
struct TWMainApp: App {
    @NSApplicationDelegateAdaptor private var appDelegate: TWAppDelegate

    internal var body: some Scene {
        TWInitialScene()
    }
}

TWInitialScene:

public struct TWInitialScene: Scene {
    public var body: some Scene {
        WindowGroup {
            TWInitialView()
        }
    }
}
TWInitialView:

struct TWInitialView: View {
    @Environment(\.scenePhase) private var scenePhase

    var body: some View {
        TWAppKitToSwiftUIBridge()
    }
}
TWAppKitToSwiftUIBridge:

struct TWAppKitToSwiftUIBridge: NSViewControllerRepresentable {
    func makeNSViewController(context: Context) -> TWNSViewController {
        return TWStaticContext.sViewController
    }

    func updateNSViewController(_ nsViewController: TWNSViewController, context: Context) {
    }
}
@objc
public class TWStaticContext: NSObject {
    public static let sViewController = TWNSViewController()

    public override init() {}

    @objc
    public static func GetViewController() -> TWNSViewController {
        return TWStaticContext.sViewController
    }
}

public class TWNSViewController: NSViewController {
    override public func viewDidLoad() {
        super.viewDidLoad()
    }
}
To add content to the hierarchy, we access the view controller's instance and add content to it like this:
public func PaintInitialScreen() {
    let label = NSTextField(labelWithString: "TW window")
    label.frame = NSRect(x: 100, y: 200, width: 200, height: 200)

    // Adding content to the view controller
    TWStaticContext.sViewController.view.addSubview(label)
}
We are using this approach because we have a constraint in our application: we have to update the UI programmatically, and at compile time we don't know what we want to show. We add content at runtime based on how many buttons we want, what labels we want, where to place them, and so on.
When we were using a purely AppKit application, doing things programmatically was simple, but since SwiftUI is declarative, we have to use the above approach.
The rationale for shifting to a SwiftUI entry point is that we want our application to be future-safe, and since Apple is leaning toward SwiftUI, we want to design our entry flow around a SwiftUI entry point. SwiftUI being declarative, we use AppKit to add content to the hierarchy programmatically.
We have used a similar approach on iOS, where we use UIApplicationDelegateAdaptor in place of NSApplicationDelegateAdaptor, and UIViewControllerRepresentable in place of NSViewControllerRepresentable.
Is this the right approach to use?
Hi all, I am looking for a future-proof way of getting the screen resolution of my display device using SwiftUI on macOS. I understand that it can't really be done to the fullest extent, meaning that the closest API we have is GeometryProxy, and that only yields the resolution of the parent view, which on macOS does not give us the display's screen resolution. The only viable option I am left with is NSScreen's frame.
However, my issue is that Apple seems to be moving toward SwiftUI aggressively, and to future-proof my application I want to avoid relying on AppKit methods as much as possible. Hence my question: is there a way to get the screen resolution of a display using SwiftUI that Apple itself recommends? If not, can I safely rely on NSScreen's frame API?
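For reference, a minimal sketch of the AppKit route, which is still the supported way to read display geometry (points plus the backing scale factor give you pixels):

import AppKit

if let screen = NSScreen.main {
    let pointSize = screen.frame.size         // in points
    let scale = screen.backingScaleFactor     // e.g. 2.0 on Retina displays
    let pixelSize = CGSize(width: pointSize.width * scale,
                           height: pointSize.height * scale)
    print("Resolution: \(pixelSize)")
}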
I am considering shifting my codebase from an AppKit to a SwiftUI entry point.
In AppKit we get control over each NSWindow, so we can hide/resize a window, close it, and control when to present a specific window, because I have access to the NSWindow instance, which I can store to perform these actions.
But when I shift to a SwiftUI entry point, I declare a struct conforming to SwiftUI's Scene, and new windows are created from instances of this scene.
I am using NSViewControllerRepresentable to add an NSViewController to the hierarchy of these scenes, and adding content to this NSViewController's instance to show on screen.
I need help controlling the size of these windows. How can I close a specific window? Resize a specific window? Or hide a specific window?
If I used purely SwiftUI views, I could do this via the Environment properties, using dismissWindow to close a window or openWindow with an identifier to open a specific window by passing the specifier.
But I am using AppKit's NSViewController, where I will add buttons to the hierarchy from which I want to trigger these events. In that case, how can I control a specific window in a multi-window application?
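One sketch, assuming the buttons live inside the represented NSViewController: once the controller's view is installed in a scene's window, view.window identifies that specific NSWindow, so the usual AppKit window operations apply. The class and method names below are hypothetical.

import AppKit

class TWContentViewController: NSViewController {
    // Hypothetical actions wired to the programmatically added buttons.
    @objc func closeThisWindow() {
        view.window?.close()
    }

    @objc func hideThisWindow() {
        view.window?.orderOut(nil)
    }

    @objc func resizeThisWindow() {
        view.window?.setContentSize(NSSize(width: 800, height: 600))
    }
}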
macOS: 15.0
macFUSE: 4.8.3
I am using rclone + macFUSE and mount my netdisk where it has created three subdirectories in its root directory: /user, /share, and /group.
When I save a file to /[root]/user using NSSavePanel and name it test.txt, I expect the file to be saved as:
/[root]/user/test.txt
However, occasionally, the delegate method:
- (BOOL)panel:(id)sender validateURL:(NSURL *)url error:(NSError **)outError {
}
returns an incorrect path:
/[root]/test.txt
This issue only occurs when selecting /user. The same operation works correctly for /share and /group.
Are there any logs I could provide to help solve this issue?
Many thanks!
Our app presents an NSOpenPanel with an accessory view implemented in SwiftUI and presented via NSHostingView. TextFields and pickers are working OK, but Buttons and Toggles (checkboxes) aren’t, although Toggles styled with .switch are functioning as expected. Specifically:
Toggles styled with .checkbox fail with no feedback. Overriding NSHostingView mouseDown() shows that the mouse event is completely ignored by the Toggle
Buttons “see” the mouseDown event (the button highlights when pressed, and the event doesn’t fall through to the hosting view), but the button action isn’t triggered until the dialog is dismissed
Any idea on how to get these controls functional?
I'm not quite sure where the problem is, but I will describe what I am doing to recreate the issue, and am happy to provide whatever information I can to be more useful.
I am changing the ActivationPolicy for my app in order to make it unobtrusive when in the background (e.g. hiding it from the dock and using only a menu bar status item). When the user activates the app with a hotkey, it changes from NSApplicationActivationPolicyAccessory back to NSApplicationActivationPolicyRegular. This allows normal usage (dock icon, menu bar, etc.)
This works fine, except in a rare situation which I finally just tracked down. If there is a window open in the app and I use the hotkey to convert back to an accessory, and then disconnect and reconnect the display on which the app was previously displayed, when I convert the app back to "regular mode", the menu bar has disappeared (and I am left with an empty space at the top of the screen). I can also trigger this bug by having the display in question briefly mirror the other display (effectively "orphaning" the hidden app), and then restoring the original side-by-side configuration before activating the app again.
The app otherwise works, but the menu bar is missing. Switching back and forth with other apps does not fix the problem. Quitting and restarting the app resolves the issue. As does disabling the accessory only mode and forcing the app to always remain in "regular mode" with a dock icon (there is a preference for this in my app). Once fixed, I can then re-enable the "accessory mode" and all is well until the bug is triggered again.
The bug would normally occur quite sporadically, presumably requiring a particular combination of changing Spaces or displays, or having the computer go to sleep while this app was in accessory mode. Thus far, the above is the only way I have found that can replicate this issue on demand.
If I close all windows before hiding the app, then it works fine when I revert to "regular mode". It only happens if there is a window open at the time.
Using applicationDidChangeScreenParameters: on my AppDelegate indicates that there is a change in screen, and logging window.screen.frame for each open window in [NSApp orderedWindows] shows that the size changes from e.g. 1920x1080 to 0x0 and back while the display is disconnected or mirrored.
There is also an error in the console in Xcode when this happens -- invalid display identifier <some UUID>.
I have tried various options for window collectionBehavior, as well as various settings for Spaces (which I normally use). None of these changes has fixed the behavior thus far.
I use [NSApp hide:self]; from my AppDelegate to hide the app, and [[NSRunningApplication currentApplication] activateWithOptions:NSApplicationActivateAllWindows];[NSApp unhide:self]; to bring it back to the front.
I welcome any ideas for things to chase down, or requests for more specific information that would be useful.
Thank you!
Fletcher