I'm looking to develop a rich networking macOS app (along the lines of a social media app) used by a very large number of users. Each user can create a number of windows, operate and view each of them, customize the app to their liking, and so on. The UI is expected to be very rich and dynamic.
The question is, should I choose AppKit or SwiftUI?
I have a basic understanding of SwiftUI and its declarative way of defining UI layouts and populating them with data. I'm not sure whether SwiftUI can handle a very rich, dynamic UI customized by a large number of users.
Any thoughts? What works best in this scenario? What is Apple's recommendation?
AppKit
Construct and manage a graphical, event-driven user interface for your macOS app using AppKit.
In our macOS application, we are using SwiftUI as the entry point and attaching an app delegate using NSApplicationDelegateAdaptor.
We are using NSViewControllerRepresentable to add a view controller to the hierarchy, so that we can store an instance of the view controller and add content to it programmatically.
@main
struct TWMainApp: App {
    @NSApplicationDelegateAdaptor private var appDelegate: TWAppDelegate

    internal var body: some Scene {
        TWInitialScene()
    }
}
TWInitialScene:
public struct TWInitialScene: Scene {
    public var body: some Scene {
        WindowGroup {
            TWInitialView()
        }
    }
}
TWInitialView:
struct TWInitialView: View {
    @Environment(\.scenePhase) private var scenePhase

    var body: some View {
        TWAppKitToSwiftUIBridge()
    }
}
TWAppKitToSwiftUIBridge:
struct TWAppKitToSwiftUIBridge: NSViewControllerRepresentable {
    func makeNSViewController(context: Context) -> TWNSViewController {
        // Return the shared view controller instance so content can be added to it later.
        let viewHierarchy: TWNSViewController = TWStaticContext.sViewController
        return viewHierarchy
    }

    func updateNSViewController(_ nsViewController: TWNSViewController, context: Context) {
    }
}
@objc
public class TWStaticContext: NSObject {
    public static let sViewController = TWNSViewController()

    public override init() {}

    @objc
    public static func GetViewController() -> TWNSViewController {
        return TWStaticContext.sViewController
    }
}
public class TWNSViewController: NSViewController {
    override public func viewDidLoad() {
        super.viewDidLoad()
    }
}
To add content to the hierarchy, we access the view controller's instance and add content to it like this:
public func PaintInitialScreen() {
    let label = NSTextField(labelWithString: "TW window")
    label.frame = NSRect(x: 100, y: 200, width: 200, height: 200)
    // Adding content to the view controller
    TWStaticContext.sViewController.view.addSubview(label)
}
We are using this approach because we have a constraint in our application: we have to update the UI programmatically, and at compile time we don't know what we want to show. We will add content at runtime based on how many buttons we want, which labels we want, where to place them, and so on.
When we were using a purely AppKit application, doing things programmatically was simple, but since SwiftUI is declarative we have to use the approach above.
The rationale for shifting to a SwiftUI entry point is that we want our application to be future-proof, and since Apple is leaning more toward SwiftUI, we want to design our entry flow around a SwiftUI entry point. And because SwiftUI is declarative, we are using AppKit to add content to the hierarchy programmatically.
We have used a similar approach on iOS, where we use UIApplicationDelegateAdaptor in place of NSApplicationDelegateAdaptor and UIViewControllerRepresentable in place of NSViewControllerRepresentable.
Is this the right approach to use?
Hi all, I am looking for a future-proof way of getting the screen resolution of my display device using SwiftUI on macOS. I understand that it can't really be done to the fullest extent, meaning that the closest API we have is GeometryProxy, and that only gives the size of the parent view, which on macOS does not give us the display's screen resolution. The only viable option I am left with is NSScreen's frame.
However, my issue is that Apple seems to be moving toward SwiftUI aggressively, and in order to future-proof my application I would prefer not to rely on AppKit methods too much. Hence my question: Is there a way to get the screen resolution of a display using SwiftUI that Apple itself recommends? If not, can I safely rely on NSScreen's frame API?
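For context, the AppKit fallback I'm considering looks roughly like this (a minimal sketch; the view name is just a placeholder):
import AppKit
import SwiftUI

struct ScreenInfoView: View {
    // NSScreen.main is the screen containing the key window; falls back to .zero if unavailable.
    private var screenSize: CGSize {
        NSScreen.main?.frame.size ?? .zero
    }

    var body: some View {
        Text("Screen: \(Int(screenSize.width)) × \(Int(screenSize.height)) points")
    }
}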
I am considering shifting my codebase from AppKit to a SwiftUI entry point.
In AppKit, we get control over each NSWindow, so we can hide, resize, or close a window and control when to present a specific window, because I have access to the NSWindow instance, which I can store and use to perform these actions.
But when I shift to the SwiftUI entry point, I declare a struct conforming to SwiftUI's Scene, and new windows are created from an instance of this scene.
I am using NSViewControllerRepresentable to add an NSViewController to the hierarchy of these scenes, and I add content to this NSViewController instance to show on screen.
I need help controlling the size of these windows. How can I close a specific window? Resize a specific window? Or hide a specific window?
If I were using purely SwiftUI views, I could do this through the environment properties: dismissWindow to close a window, or openWindow with an identifier to open a specific window.
But I am using AppKit's NSViewController, where I will add buttons to the hierarchy from which I want to trigger these events. In that case, how can I control a specific window in a multi-window application?
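For illustration, the kind of control I'm after from the AppKit side would look roughly like this (a sketch; the view controller name is a placeholder for my real one, and the button action is assumed to live in the controller whose view is inside the window):
import Cocoa

final class WindowControlsViewController: NSViewController {   // placeholder for my real view controller
    @objc func closeButtonClicked(_ sender: NSButton) {
        guard let window = view.window else { return }           // the specific NSWindow hosting this hierarchy
        window.close()                                            // close this window
        // window.setFrame(NSRect(x: 0, y: 0, width: 800, height: 600), display: true)  // resize it
        // window.orderOut(nil)                                   // hide it without closing
    }
}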
macOS: 15.0
macFUSE: 4.8.3
I am using rclone + macFUSE and mount my netdisk, which has three subdirectories in its root directory: /user, /share, and /group.
When I save a file to /[root]/user using NSSavePanel and name it test.txt, I expect the file to be saved as:
/[root]/user/test.txt
However, occasionally, the delegate method:
- (BOOL)panel:(id)sender validateURL:(NSURL *)url error:(NSError **)outError {
    // url is occasionally /[root]/test.txt instead of /[root]/user/test.txt
    return YES;
}
returns an incorrect path:
/[root]/test.txt
This issue only occurs when selecting /user. The same operation works correctly for /share and /group.
Are there any logs I could provide to help solve this issue?
Many thanks!
Our app presents an NSOpenPanel with an accessory view implemented in SwiftUI and presented via NSHostingView. TextFields and pickers are working OK, but Buttons and Toggles (checkboxes) aren’t, although Toggles styled with .switch are functioning as expected. Specifically:
Toggles styled with .checkbox fail with no feedback. Overriding NSHostingView mouseDown() shows that the mouse event is completely ignored by the Toggle
Buttons “see” the mouseDown event (the button highlights when pressed, and the event doesn’t fall through to the hosting view), but the button action isn’t triggered until the dialog is dismissed
Any idea on how to get these controls functional?
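For reference, the accessory view is attached roughly like this (a simplified sketch; AccessoryOptionsView stands in for our actual SwiftUI view):
import AppKit
import SwiftUI

// Placeholder standing in for our real SwiftUI accessory view.
struct AccessoryOptionsView: View {
    var body: some View { Toggle("Include hidden files", isOn: .constant(false)) }
}

let panel = NSOpenPanel()
let accessory = NSHostingView(rootView: AccessoryOptionsView())
accessory.frame = NSRect(x: 0, y: 0, width: 400, height: 80)
panel.accessoryView = accessory
panel.isAccessoryViewDisclosed = true
panel.begin { response in
    // handle the chosen URLs here
}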
I'm not quite sure where the problem is, but I will describe what I am doing to recreate the issue, and am happy to provide whatever information I can to be more useful.
I am changing the ActivationPolicy for my app in order to make it unobtrusive when in the background (e.g. hiding it from the dock and using only a menu bar status item). When the user activates the app with a hotkey, it changes from NSApplicationActivationPolicyAccessory back to NSApplicationActivationPolicyRegular. This allows normal usage (dock icon, menu bar, etc.)
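In Swift terms, the switch is roughly this (a simplified sketch; my actual code is Objective-C):
import AppKit

func enterBackgroundMode() {
    // Hide the Dock icon and main menu; the status item stays available.
    NSApp.setActivationPolicy(.accessory)
}

func enterForegroundMode() {
    // Restore the Dock icon and menu bar, then bring the app forward.
    NSApp.setActivationPolicy(.regular)
    NSRunningApplication.current.activate(options: .activateAllWindows)
}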
This works fine, except in a rare situation which I finally just tracked down. If there is a window open in the app and I use the hotkey to convert back to an accessory, and then disconnect and reconnect the display on which the app was previously displayed, when I convert the app back to "regular mode", the menu bar has disappeared (and I am left with an empty space at the top of the screen). I can also trigger this bug by having the display in question briefly mirror the other display (effectively "orphaning" the hidden app), and then restoring the original side-by-side configuration before activating the app again.
The app otherwise works, but the menu bar is missing. Switching back and forth with other apps does not fix the problem. Quitting and restarting the app resolves the issue. As does disabling the accessory only mode and forcing the app to always remain in "regular mode" with a dock icon (there is a preference for this in my app). Once fixed, I can then re-enable the "accessory mode" and all is well until the bug is triggered again.
The bug would normally occur quite sporadically, presumably requiring a particular combination of changing Spaces or displays, or having the computer go to sleep while this app was in accessory mode. Thus far, the above is the only way I have found that can replicate this issue on demand.
If I close all windows before hiding the app, then it works fine when I revert to "regular mode". It only happens if there is a window open at the time.
Using applicationDidChangeScreenParameters: on my AppDelegate indicates that there is a change in screen, and logging window.screen.frame for each open window in [NSApp orderedWindows] shows that the size changes from e.g. 1920x1080 to 0x0 and back while the display is disconnected or mirrored.
There is also an error in the console in Xcode when this happens -- invalid display identifier <some UUID>.
I have tried various options for window collectionBehavior, as well as various settings for Spaces (which I normally use). None of these changes has fixed the behavior thus far.
I use [NSApp hide:self]; from my AppDelegate to hide the app, and [[NSRunningApplication currentApplication] activateWithOptions:NSApplicationActivateAllWindows];[NSApp unhide:self]; to bring it back to the front.
I welcome any ideas for things to chase down, or requests for more specific information that would be useful.
Thank you!
Fletcher
Hi all,
I use the FileManager trashItem function to put a file in the trash.
If it is only one file, the "Put Back" option is available.
If, however, several files are deleted, the option to put them back is only available for the first deleted file. All the others cannot be put back.
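For reference, the batch delete is roughly this (a sketch; urls is the array of files to trash):
import Foundation

func moveToTrash(_ urls: [URL]) {
    for url in urls {
        do {
            // Passing nil because we don't need the resulting trash URL.
            try FileManager.default.trashItem(at: url, resultingItemURL: nil)
        } catch {
            NSLog("Trashing \(url.path) failed: \(error)")
        }
    }
}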
The problem has been known for at least 10 years.
See Put back only works for the first file.
NSWorkspace recycle has the same problem.
It seems to be related to the .DS_Store file in the trash, where the trashed files are recorded. This may also lead you to believe that the trashItem function is working properly, because the deleted files are still listed in the .DS_Store file.
If I call trashItem or recycle several times and wait 2 seconds between calls, then the "Put Back" option is available for all of them.
That obviously can't be the solution; waiting less than 2 seconds only offers to put the first file back.
So trashItem and recycle end up being the same as removing the file, with the difference that you can still look at the files in the trash, but not put them back.
Are there other ways?
The Finder can also delete multiple files and put them all back.
If I use NSAlert the buttons look like this:
The Cancel button has a gray background. We got complaints about the bad contrast and people pointed out that the alerts from System Settings look like this:
Here the Cancel button has a white background. Unfortunately, I could not figure out how to make the buttons in my own alerts look like those in System Settings; setting the button's bezel color to white did not work. Any help would be highly appreciated. Thanks.
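For reference, the alert is created roughly like this (a simplified sketch; the strings are placeholders):
let alert = NSAlert()
alert.messageText = "Delete the selected items?"
alert.alertStyle = .warning
alert.addButton(withTitle: "Delete")   // first button becomes the default (accented) button
alert.addButton(withTitle: "Cancel")   // second button is the one with the gray background
if alert.runModal() == .alertSecondButtonReturn {
    // user chose Cancel
}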
Best regards,
Marc
All the threads only contain system calls. The crashed thread only contains a single call to my app's code which is main.swift:12.
What could cause such a crash?
crash.txt
I have a complex app that requires the main SwiftUI view of the app to be embedded inside an NSHostingView which is a subview of an NSViewController's view. Then this NSViewController is wrapped using NSViewControllerRepresentable to be presented using SwiftUI's Window. And if I have a TimelineView inside my SwiftUI view hierarchy, it causes constant recalculation of the layout.
Here's a simplified demo:
@main
struct DogApp: App {
    private let dogViewController = DogViewController()

    var body: some Scene {
        Window("Dog", id: "main") {
            DogViewControllerUI()
        }
    }
}

private struct DogViewControllerUI: NSViewControllerRepresentable {
    let dogViewController = DogViewController()

    func makeNSViewController(context: Context) -> NSViewController { dogViewController }

    func updateNSViewController(_ nsViewController: NSViewController, context: Context) {}

    func sizeThatFits(_ proposal: ProposedViewSize, nsViewController: NSViewController, context: Context) -> CGSize? {
        debugPrint("sizeThatFits", proposal)
        return nil
    }
}

public class DogViewController: NSViewController {
    public override func viewDidLoad() {
        super.viewDidLoad()
        let mainView = MainView()
        let hostingView = NSHostingView(rootView: mainView)
        view.addSubview(hostingView)
        hostingView.translatesAutoresizingMaskIntoConstraints = false
        hostingView.topAnchor.constraint(equalTo: view.topAnchor).isActive = true
        hostingView.leadingAnchor.constraint(equalTo: view.leadingAnchor).isActive = true
        hostingView.trailingAnchor.constraint(equalTo: view.trailingAnchor).isActive = true
        hostingView.bottomAnchor.constraint(equalTo: view.bottomAnchor).isActive = true
    }
}

struct MainView: View {
    var body: some View {
        VStack {
            TimelineView(.animation) { _ in
                Color.random
                    .frame(width: 100, height: 100)
            }
        }
    }
}

extension Color {
    static var random: Color {
        Color(
            red: .random(in: 0...1),
            green: .random(in: 0...1),
            blue: .random(in: 0...1)
        )
    }
}
When running, it prints this repeatedly (multiple times a second):
"sizeThatFits" SwiftUI.ProposedViewSize(width: Optional(559.0), height: Optional(528.0))
"sizeThatFits" SwiftUI.ProposedViewSize(width: Optional(0.0), height: Optional(0.0))
"sizeThatFits" SwiftUI.ProposedViewSize(width: Optional(559.0), height: Optional(528.0))
"sizeThatFits" SwiftUI.ProposedViewSize(width: Optional(0.0), height: Optional(0.0))
"sizeThatFits" SwiftUI.ProposedViewSize(width: Optional(559.0), height: Optional(528.0))
"sizeThatFits" SwiftUI.ProposedViewSize(width: Optional(0.0), height: Optional(0.0))
"sizeThatFits" SwiftUI.ProposedViewSize(width: Optional(559.0), height: Optional(528.0))
If I run equivalent code on an iPad, it only prints twice. If I comment out TimelineView on macOS, then it only prints the above logs when resizing the app window.
The main reason this is an issue is that it's clearly causing a dramatic degradation in performance. After I submitted a TSI, I was told to file a bug report so a SwiftUI engineer could investigate it. Case-ID: 7461887. FB13810482. This was back in May, but I have received no response. LLMs are no help, and I've experimented with all sorts of workarounds. My last hope is this forum; maybe someone has an idea of what might be going on and why the recalculation happens constantly on macOS.
In an AppKit document-based project created by Xcode, overriding canConcurrentlyReadDocuments(ofType:) to return true allows new documents to open normally in Swift 5, but switching to Swift 6 causes an error.
Judging from the error message, it seems to be a threading issue, but I'm not sure how to adjust the code to support the Swift 6 environment.
The project is the most basic code from an Xcode-created document-based project without any modifications, except for changing the Swift version to 6 and overriding canConcurrentlyReadDocuments(ofType:) to return true.
Source code: https://drive.google.com/file/d/1ryb2TaU6IX884q0h5joJqqZwSX95Q335/view?usp=sharing
AppDelegate.swift
import Cocoa

@main
class AppDelegate: NSObject, NSApplicationDelegate {

    func applicationDidFinishLaunching(_ aNotification: Notification) {
        // Insert code here to initialize your application
    }

    func applicationWillTerminate(_ aNotification: Notification) {
        // Insert code here to tear down your application
    }

    func applicationSupportsSecureRestorableState(_ app: NSApplication) -> Bool {
        return true
    }
}
Document.swift
import Cocoa

class Document: NSDocument {

    override init() {
        super.init()
        // Add your subclass-specific initialization here.
    }

    override class var autosavesInPlace: Bool {
        return true
    }

    override class func canConcurrentlyReadDocuments(ofType typeName: String) -> Bool {
        true
    }

    override func canAsynchronouslyWrite(to url: URL, ofType typeName: String, for saveOperation: NSDocument.SaveOperationType) -> Bool {
        true
    }

    override func makeWindowControllers() {
        // Returns the Storyboard that contains your Document window.
        let storyboard = NSStoryboard(name: NSStoryboard.Name("Main"), bundle: nil)
        let windowController = storyboard.instantiateController(withIdentifier: NSStoryboard.SceneIdentifier("Document Window Controller")) as! NSWindowController
        self.addWindowController(windowController)
    }

    override func data(ofType typeName: String) throws -> Data {
        // Insert code here to write your document to data of the specified type, throwing an error in case of failure.
        // Alternatively, you could remove this method and override fileWrapper(ofType:), write(to:ofType:), or write(to:ofType:for:originalContentsURL:) instead.
        // throw NSError(domain: NSOSStatusErrorDomain, code: unimpErr, userInfo: nil)
        return Data()
    }

    override func read(from data: Data, ofType typeName: String) throws {
        // Insert code here to read your document from the given data of the specified type, throwing an error in case of failure.
        // Alternatively, you could remove this method and override read(from:ofType:) instead.
        // If you do, you should also override isEntireFileLoaded to return false if the contents are lazily loaded.
        // throw NSError(domain: NSOSStatusErrorDomain, code: unimpErr, userInfo: nil)
    }
}
ViewController.swift
import Cocoa

class ViewController: NSViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
    }

    override var representedObject: Any? {
        didSet {
            // Update the view, if already loaded.
        }
    }
}
Hi,
My macOS app shows sensitive content, and I don't want the user to take screenshots or record the screen.
I tried window.sharingType = .none. With this, the user can still record the screen.
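What I tried looks roughly like this (a sketch applied to the app's main window):
if let window = NSApplication.shared.windows.first {
    // Exclude the window's contents from screen capture by other processes.
    window.sharingType = .none
}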
I know the user can record with an external device, but we don't want them to record using screen capture.
Can you please tell me how to detect when screen recording is active in a macOS app, or how to prevent screen recording in macOS apps?
Thanks
In macOS, I am encountering an issue where the system API fails to grant permission to open a file despite the necessary read/write permissions being enabled in the sandbox. Could you please explain the reasons behind this behavior? Thanks!
func finderOpenFileSystem(at path: String) {
    let fileURL = URL(fileURLWithPath: path)

    guard FileManager.default.fileExists(atPath: path) else {
        print("Error: File does not exist at path: \(path)")
        return
    }

    let success = NSWorkspace.shared.open(fileURL)
    if success {
        print("File opened successfully: \(fileURL)")
    } else {
        print("Error: Failed to open file: \(fileURL)")
    }
}
When scrolling a basic NSScrollView there seems to be a sudden jump after each flick. Scrolling does not appear smooth and is disorientating.
A scroll jump seems to happen directly after letting go of a scroll flick using a trackpad/mouse. Right at that moment the scroll turns into a momentum scroll, slowly decreasing in speed. But on the first frame after the gesture, the content jumps forward more than expected.
Observations:
Counterintuitively, scrolling appears to be smoother when disabling NSScrollView.isCompatibleWithResponsiveScrolling. If it is disabled using a custom NSScrollView subclass (see the sketch after these observations), there is no large jump anymore.
Scrolling also appears to be smoother using a SwiftUI ScrollView. I assume that has the same behaviour as having isCompatibleWithResponsiveScrolling disabled.
Ironically, a WKWebView scrolls much more smoothly. No sudden jump is observable. It also seems to scroll with faster acceleration, but the individual frames do appear smoother. Why is this better than a native NSScrollView?
Elastic scrolling at the bounds of the scroll view also appears much smoother for WKWebViews. When pulling to refresh there is a jump for NSScrollView/SwiftUI, but not for WKWebView.
When using an NSScrollView with isCompatibleWithResponsiveScrolling disabled, scrolling appears just as smooth as WKWebView on macOS 13 Ventura and below. On macOS 14 Sonoma scrolling behaviour is suddenly different.
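The subclass used to disable responsive scrolling is roughly this (a minimal sketch):
import AppKit

final class NonResponsiveScrollView: NSScrollView {
    // Opting out of responsive scrolling removes the large jump after a flick.
    override class var isCompatibleWithResponsiveScrolling: Bool { false }
}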
Please see a sample project with 4 different scroll views side by side:
https://github.com/floorish/ScrollTest
Screen recordings show the sudden jumps when scrolling and when elastic scrolling.
Tested on Intel & Arm Macs, macOS 11 Big Sur through 15 Sequoia, built with Xcode 16.
Should isCompatibleWithResponsiveScrolling be disabled on Sonoma+? Are there any drawbacks?
There is also no overdraw anymore since Monterey, as described in https://developer.apple.com/library/archive/releasenotes/AppKit/RN-AppKitOlderNotes/#10_9Scrolling
Even with responsive scrolling disabled, why is WKWebView scrolling much smoother than NSScrollView?
I have an application that binds a menu item to trigger on ⌘]. When I set the US input source, I press ⌘] in order to trigger that item. However, when I switch the input source to QWERTZ (German), the trigger changes to ⌘Ä automatically by the OS. It seems to translate keystrokes for different input sources.
The problem is that I also render the keybindings in a window in my application, and my application is not aware of this translation. Furthermore, I have other key shortcuts in my application which are not bound to menu items, and I want to make sure those get translated too.
Does AppKit expose a way to look up what a keystroke will be when macOS translates it, i.e. look up ⌘Ä from ⌘] when the current layout is QWERTZ? I can't find anything in Apple's docs.
I tried converting a character to a virtual key code based on the US layout and then mapping it back to a character based on the QWERTZ layout. That doesn't seem to be the same, because it ends up converting ] to + instead, which seems to be based on the physical key location, different from how the key equivalents are handled.
Update: I notice similar behavior in VS Code's menu bar, e.g. in their "Terminal" menu. Switching to German changes some bindings. This does not occur at all in iTerm's menu bar, I suspect because their menu items are specified in a different way: xib files with hard-coded key equivalents.
Let's say I want to build a simple photo management app on Mac or iPad with SwiftUI. The app has multi-window support. My photos are organized inside albums. I should be able to drag photos between windows from one album to another.
I struggle to get this working properly with SwiftUI. Writing modern code, I would like to use Transferable. Let's say my photos are not real files, so I can't use a FileRepresentation. Instead I use a CodableRepresentation and encode an identifier. This identifier is later used to drive the move operation between albums.
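My transfer type looks roughly like this (a minimal sketch; the type name and identifier string are just placeholders):
import CoreTransferable
import UniformTypeIdentifiers

// Placeholder payload used to drive the move between albums.
struct PhotoReference: Codable, Transferable {
    let photoID: UUID

    static var transferRepresentation: some TransferRepresentation {
        CodableRepresentation(contentType: .photoReference)
    }
}

extension UTType {
    // Assumes a matching exported type identifier in the app's Info.plist.
    static let photoReference = UTType(exportedAs: "com.example.photo-reference")
}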
I ran into some limitations here:
Transferable seems to be meant for copy-like drag-and-drop operations. I have no way to get a "move" cursor on macOS (it's always the copy cursor with the green + sign). Also, the API reads like it is about importing/exporting, not moving.
When using dropDestination on ForEach, I completely lack the possibility to deny a drop (e.g. when a photo is already part of the album).
I'd like to have a modifier key to switch between copying and moving.
Sometimes a drop should be redirected to a different index. How to do that?
Is there any chance to do this with Transferable? It doesn't even seem to be easy with NSItemProvider and onDrop/onDrag. Or should we still use a plain old UICollectionView/NSCollectionView if we want more sophisticated control over drag-and-drop validation?
Hi, we are developing a cross-platform library for creating desktop applications in C++: https://github.com/aseprite/laf
For this reason, on macOS we cannot rely on the default NSApplication.run() event loop, so we decided to implement our own event loop using the nextEventMatchingMask method.
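The loop is implemented in C++, but it does roughly the equivalent of this sketch (assuming the app has already finished launching):
import AppKit

func runCustomEventLoop() {
    let app = NSApplication.shared
    while true {
        // Block until the next event arrives, then dispatch it the way NSApplication.run() would.
        if let event = app.nextEvent(matching: .any,
                                     until: Date.distantFuture,
                                     inMode: .default,
                                     dequeue: true) {
            app.sendEvent(event)
            app.updateWindows()
        }
    }
}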
Then, when a window is in fullscreen mode, for some reason the window stops receiving mouseMove events when the mouse pointer enters an area at the top of the window.
You can see this issue in action by trying the following example project:
https://github.com/martincapello/custom-event-loop-issue
This project just opens one window and uses a custom event loop. It displays the current mouse position for every mouseMove event received, and when the aforementioned area is entered it suddenly stops updating.
There is also a video showing how to reproduce it.
I was able to see that when the position stops updating, we still receive mouseMove events, but for a different window: a borderless window that is added to the NSApplication.windows collection when switching to fullscreen, and which seems to be taking the mouseMove events before they reach the main window.
Also, this issue doesn't happen when using the default NSApplication.run method, despite the borderless window being added as well.
macOS 15 includes a neat section in System Settings to change the dynamic text size, as outlined here: https://support.apple.com/guide/mac-help/make-text-and-icons-bigger-mchld786f2cd/mac
However, it's not immediately clear a) how to get one's app in this list, and b) if the usual methods from iOS to react to text size even work on macOS. Does anyone have any experience here? Or should I implement my own controls in my app's settings and call it a day?
For context, my app is a macOS-native SwiftUI app.
How do I implement the same NavigationSplitView with a sidebar in AppKit?
Basically I have this code:
import SwiftUI

struct ContentView: View {
    var body: some View {
        NavigationSplitView {
            // Sidebar
            List {
                NavigationLink("Item 1", value: "Item 1 Details")
                NavigationLink("Item 2", value: "Item 2 Details")
                NavigationLink("Item 3", value: "Item 3 Details")
            }
            .navigationTitle("Items")
        } content: {
            // Main content (detail view for selected item)
            Text("Select an item to see details.")
                .padding()
        } detail: {
            // Detail view (for the selected item)
            Text("Select an item from the sidebar to view details.")
                .padding()
        }
    }
}

@main
struct MyApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}
and wanted to somehow convert it to AppKit. I tried to use an NSSplitViewController, but I still don't have that sidebar and the button to collapse it. How do I go about this?
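What I have so far looks roughly like this (a sketch; SidebarViewController, ContentViewController, and DetailViewController are placeholders for my real view controllers):
import Cocoa

// Placeholder controllers standing in for the real sidebar/content/detail view controllers.
final class SidebarViewController: NSViewController { override func loadView() { view = NSView() } }
final class ContentViewController: NSViewController { override func loadView() { view = NSView() } }
final class DetailViewController: NSViewController { override func loadView() { view = NSView() } }

final class MainSplitViewController: NSSplitViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // A sidebar-style item gets the sidebar material and can be collapsed
        // via the standard toggleSidebar(_:) action (e.g. from a toolbar button).
        let sidebarItem = NSSplitViewItem(sidebarWithViewController: SidebarViewController())
        let contentItem = NSSplitViewItem(contentListWithViewController: ContentViewController())
        let detailItem = NSSplitViewItem(viewController: DetailViewController())

        addSplitViewItem(sidebarItem)
        addSplitViewItem(contentItem)
        addSplitViewItem(detailItem)
    }
}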