Construct and manage a graphical, event-driven user interface for your macOS app using AppKit.

AppKit Documentation

Posts under AppKit tag

233 Posts
Post not yet marked as solved
3 Replies
879 Views
Shortly after presenting a UIDocumentPickerViewController on Mac Catalyst, the system throws this exception and I keep hitting my exception breakpoint: NSView valueForUndefinedKey: this class is not key value coding-compliant for the key cell. Appears to be related to the system using Touch Bar APIs (my app isn't currently using Touch Bar APIs directly, but the NSOpenPanel created by UIDocumentPickerViewController is).

#2 0x0000000198de2828 in -[NSObject(NSKeyValueCoding) valueForUndefinedKey:] ()
#3 0x000000019884ef3c in -[NSObject(NSKeyValueCoding) valueForKey:] ()
#4 0x0000000200b75a68 in -[NSObject(UIAccessibilitySafeCategory) __axValueForKey:] ()
#5 0x0000000200b75960 in __57-[NSObject(UIAccessibilitySafeCategory) safeValueForKey:]_block_invoke ()
#6 0x0000000200b75f9c in -[NSObject(UIAccessibilitySafeCategory) _accessibilityPerformSafeValueKeyBlock:withKey:onClass:] ()
#7 0x0000000200b754d0 in -[NSObject(UIAccessibilitySafeCategory) safeValueForKey:] ()
#8 0x0000000222206b60 in -[NSTouchBarItemAccessibility__UIKit__AppKit _accessibilityPopulateAccessibiltiyInfoFromUIKit] ()
#9 0x0000000222206b1c in -[NSTouchBarItemAccessibility__UIKit__AppKit _itemViewMinSize:maxSize:preferredSize:stretchesContent:] ()
#10 0x000000019b3f70bc in -[NSCompressionGroupLayout item:minSize:maxSize:preferredSize:] ()
#11 0x000000019aff24f4 in -[NSTouchBarItemContainerView _updateMeasuredSizes] ()
#12 0x000000019aff2358 in -[NSTouchBarItemContainerView minSize] ()
#13 0x000000019accae98 in -[NSTouchBarLayout _aggregateWidthOfItems:sharesLeftEdge:sharesRightEdge:widthMeasurement:] ()
#14 0x000000019accb22c in -[NSTouchBarLayout _attributesOfItems:centerItems:givenSize:sharesLeftEdge:sharesRightEdge:xOrigin:] ()
#15 0x000000019acca930 in -[NSTouchBarLayout attributesOfItems:centerItems:givenSize:] ()
#16 0x000000019b52bbb0 in -[NSTouchBarView _positionSubviews] ()
#17 0x000000019b52ba6c in -[NSTouchBarView layout] ()

The exception does get caught (my app doesn't crash), but I hit the exception breakpoint over and over again while the NSOpenPanel is on screen, which is annoying.
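For reference, a minimal sketch of the kind of presentation path described above (the content type and presenting view controller are placeholder assumptions, not taken from the post); on Mac Catalyst the picker is backed by an NSOpenPanel, which is where the Touch Bar accessibility code in the trace gets involved:

import UIKit
import UniformTypeIdentifiers

// Minimal Catalyst-side sketch: present a document picker from some view controller.
// The content type (.plainText) is only an example.
func presentPicker(from presenter: UIViewController) {
    let picker = UIDocumentPickerViewController(forOpeningContentTypes: [.plainText])
    picker.allowsMultipleSelection = false
    presenter.present(picker, animated: true)
}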
Post not yet marked as solved
2 Replies
2.1k Views
I'm developing a media player for Mac (AppKit, not Catalyst) that plays local and remote content. I have AirPlay working with AVPlayer (with an AVRoutePickerView assigning the route), but while I get the metadata that I've set for the MPNowPlayingInfoCenter on the AirPlay device (a TV in this case), I don't get any album art (but I do in the macOS now playing menu bar/control centre applet). It looks like this (imgur link because I can't get it to upload in the forum): https://i.imgur.com/2JBIYCw.jpg

My code for setting the metadata:

    NSImage *artwork = [currentTrack coverImage];
    CGSize artworkSize = [artwork size];
    MPMediaItemArtwork *mpArtwork = [[MPMediaItemArtwork alloc] initWithBoundsSize:artworkSize requestHandler:^NSImage * _Nonnull(CGSize size) {
        return artwork;
    }];
    [songInfo setObject: mpArtwork forKey:MPMediaItemPropertyArtwork];

I noticed that it doesn't resize, but it seems at least macOS doesn't care. I tried modifying the code to resize the artwork in the callback, but that also doesn't change anything. I noticed in the logs that I get a message about a missing entitlement:

    2023-01-29 14:00:37.889346-0400 Submariner[42682:9794531] [Entitlements] MSVEntitlementUtilities - Process Submariner PID[42682] - Group: (null) - Entitlement: com.apple.mediaremote.external-artwork-validation - Entitled: NO - Error: (null)

...however, this seems to be a private entitlement and the only reference I can find to it is WebKit. Using it makes LaunchServices very angry at me, and I presume it's a red herring.
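A Swift sketch (assuming a hypothetical coverImage as the artwork source, not the post's actual code) of setting the now-playing artwork while resizing it inside the request handler, which is one of the variations the post mentions trying:

import AppKit
import MediaPlayer

// Build an MPMediaItemArtwork that redraws the cover image at whatever size the
// system requests, then store it in the now-playing dictionary.
func setNowPlayingArtwork(from coverImage: NSImage, into info: inout [String: Any]) {
    let artwork = MPMediaItemArtwork(boundsSize: coverImage.size) { requestedSize in
        let resized = NSImage(size: requestedSize)
        resized.lockFocus()
        coverImage.draw(in: NSRect(origin: .zero, size: requestedSize),
                        from: .zero,
                        operation: .copy,
                        fraction: 1.0)
        resized.unlockFocus()
        return resized
    }
    info[MPMediaItemPropertyArtwork] = artwork
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}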
Post not yet marked as solved
4 Replies
1.1k Views
Filed as rdar://FB11975037

When macOS Ventura is run as a guest OS within the virtualization framework, the main menu bar items will not be displayed correctly if VZMacGraphicsDisplayConfiguration defines a large resolution. The menu bar titles appear to be using the same color as the menu bar itself. When the Appearance is set to Light, the menu bar items are effectively invisible. When the Appearance is set to Dark, the menu bar items are drawn in what looks like a disabled state. This only affects the menu bar item titles on the left-hand side. The date-time and menu bar icons on the right side are always displayed in the correct color. This appears to be a regression in macOS Ventura, as this issue is not present in macOS 12 running as a guest. This bug can be easily reproduced using Apple's own Virtualization sample code titled "Running macOS in a Virtual Machine on Apple Silicon Macs".

Steps to reproduce:
1. Follow the sample code instructions for building and installing a VM.bundle.
2. Before running 'macOSVirtualMachineSampleApp', change the VZMacGraphicsDisplayConfiguration to use: width = 5120, height = 2880, ppi = 144.
3. Run 'macOSVirtualMachineSampleApp' and notice that the menu bar titles on the left side of the screen are not correctly drawn in the guest instance.

This has been tested on:
Host: macOS 13.1
Guest: macOS 13.x (all versions)
Hardware: MBP 14" M1 Pro 32GB/2TB

Is there anything that can be done to resolve this issue?
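For context, a Swift sketch of the display-configuration change from the reproduction steps, as it would be applied to the sample project (the function name is illustrative, not part of the sample):

import Virtualization

// Build a graphics device whose single display uses the large resolution from the
// reproduction steps (5120 x 2880 at 144 ppi).
func makeLargeDisplayGraphicsDevice() -> VZMacGraphicsDeviceConfiguration {
    let graphics = VZMacGraphicsDeviceConfiguration()
    graphics.displays = [
        VZMacGraphicsDisplayConfiguration(widthInPixels: 5120,
                                          heightInPixels: 2880,
                                          pixelsPerInch: 144)
    ]
    return graphics
}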
Post not yet marked as solved
0 Replies
359 Views
I have several applications (written in Java, although I doubt that matters). Some of these applications define custom dock menus and some do not. None of them are NSDocument based. Most of the time, the dock menus for these applications include at the top a list of the document's windows. Windows that are hidden (minimized?) are included in the list with a diamond icon. I don't think that my applications or Java are creating these menu items. What I don't understand is that when the application is hidden, sometimes these menu items disappear from the dock menu and sometimes they remain. Is this a bug, or is there some rational explanation? (I would prefer that the menu items remain.) I'm running macOS 13.3.1 (Ventura).
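For comparison, this is how a native AppKit app delegate supplies a custom Dock menu (a sketch with placeholder item titles, not the Java apps' code); the window list that appears above these items is added by the system itself:

import AppKit

// The system asks the app delegate for extra Dock menu items; any window list
// shown above them is provided by macOS, not by this code.
final class AppDelegate: NSObject, NSApplicationDelegate {
    func applicationDockMenu(_ sender: NSApplication) -> NSMenu? {
        let menu = NSMenu()
        menu.addItem(withTitle: "New Document", action: nil, keyEquivalent: "")
        return menu
    }
}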
Post not yet marked as solved
0 Replies
537 Views
Apparently, the call to performDefaultImplementation for an AppleScript command is made on the main run loop in the default mode only, so it is not called while a menu (or contextual menu) is open. This is very unfortunate, as it means the execution of these scripts is delayed until after the menu is closed. I'm looking for a way to change this behaviour, but I have no clue how... Thanks! Daniel
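A speculative sketch of run loop mode scheduling (an assumption, not a confirmed fix: it does not change when the Apple event itself is delivered, but it illustrates how follow-up work can be scheduled in the common modes, which also run during menu tracking):

import Foundation

// Schedule a block in the common run loop modes; `work` is a placeholder for
// whatever the script command ultimately needs to do.
func scheduleScriptWork(_ work: @escaping () -> Void) {
    RunLoop.main.perform(inModes: [.common], block: work)
}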
Post not yet marked as solved
5 Replies
512 Views
I am running on macOS 13.3.1 (a). I have a longstanding macOS app with a document icon. I have changed the app's icon and document icon in the latest release. The new application icon shows correctly, but the new document icon does not display properly in the title bar of my NSDocument windows. All I did was change the application and document icons in the Asset catalog and recompile with Xcode 14.3. My documents are file packages. There are no copies of the application with the old document icon anywhere on my disk. If I create a new document using the app, the document displays the correct--new--icon, but when the app opens the document, the icon that appears in the title bar is the old document icon. When I discovered this I rebuilt the Launch Services database, but the problem remains. From where could the system be getting that old icon? Does anybody have any idea how to fix this?
Post not yet marked as solved
1 Reply
565 Views
So in my app users can select a larger block of connected elements with a triple-click, just like selecting a paragraph works in an NSTextView. However, a small subset of these elements also has properties that can be edited, and a double-click switches to edit mode for these elements. This leads to the current, unfortunate behaviour: if a user triple-clicks a non-editable element, the entire block of elements is highlighted, no problem. If a user triple-clicks an editable element, an edit is triggered and the third click is ignored. What would be the best approach to enable triple-click on editable elements, i.e. something like waiting to see whether a third mouse click follows and enabling edit mode only then? Some dispatch-after mechanism with a delay of NSEvent.doubleClickInterval maybe? Any suggestions on how to implement this in a robust manner would be much appreciated! Cheers, Jay
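A sketch of the delayed-commit idea mentioned above (an assumption about the view class, not the app's actual code): on a double-click, defer entering edit mode by one doubleClickInterval and cancel that work item if a third click arrives.

import AppKit

final class ElementView: NSView {
    private var pendingEdit: DispatchWorkItem?

    override func mouseDown(with event: NSEvent) {
        switch event.clickCount {
        case 2:
            // Wait one double-click interval before committing to edit mode.
            let work = DispatchWorkItem { [weak self] in self?.beginEditing() }
            pendingEdit = work
            DispatchQueue.main.asyncAfter(deadline: .now() + NSEvent.doubleClickInterval,
                                          execute: work)
        case 3:
            // A third click arrived in time: cancel the pending edit and select the block.
            pendingEdit?.cancel()
            pendingEdit = nil
            selectWholeBlock()
        default:
            super.mouseDown(with: event)
        }
    }

    private func beginEditing() { /* switch the element into edit mode */ }
    private func selectWholeBlock() { /* highlight the connected block of elements */ }
}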
Post not yet marked as solved
1 Reply
678 Views
I am using SwiftUI to build a macOS app. I have to print papers and I'm trying to see if I'm actively connected to a printer. I have a default printer set, as you can see in the below image, but it's currently offline. My issue is that the code I have returns true that I am connected to a printer, because I have a saved printer device. Is there a way to check if the printer is offline even when I'm connected? In the image it says the printer is offline, and I need to know how to get that. Code I'm currently using that returns true:

func isConnectedToPrinter() -> Bool {
    let printers = NSPrinter.printerNames
    return !printers.isEmpty
}

This returns true because the printer is still remembered by my Mac even though it's offline (powered down). Any idea on how macOS can determine the printer is "Offline"? Also, here is my current code to print the pdfDocument. Is there anything I can add here to help?

private func printPDFDocument(forDoc document: PDFDocument) {
    let printInfo = NSPrintInfo.shared
    printInfo.horizontalPagination = .fit
    printInfo.verticalPagination = .fit
    printInfo.orientation = .portrait
    printInfo.topMargin = 0
    printInfo.bottomMargin = 0
    printInfo.leftMargin = 0
    printInfo.rightMargin = 0
    printInfo.isHorizontallyCentered = true
    printInfo.isVerticallyCentered = true
    let scale: PDFPrintScalingMode = .pageScaleDownToFit
    let printOp = document.printOperation(for: printInfo, scalingMode: scale, autoRotate: true)
    DispatchQueue.main.async {
        let result = printOp?.run()
        self.showLoadingText = false
    }
}
Post not yet marked as solved
0 Replies
425 Views
In Monterey, when a user was at the top of a ScrollView implemented inside an NSHostingController's view (that was itself embedded in a window with an NSToolbar), the window's toolbar background would be hidden until the user scrolled away from the top. In Ventura, this behavior is different, with the toolbar's background visible all of the time unless a traditional NSScrollView is used (which means no SwiftUI). Is there a way to change this behavior within SwiftUI somehow now?
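One SwiftUI-side knob that may be worth trying on macOS 13 (an assumption, not a confirmed fix; it requests a fixed visibility rather than restoring the scroll-edge behavior) is toolbarBackground(_:for:):

import SwiftUI

// Ask for the window toolbar's background to stay hidden for this view's window.
struct ContentList: View {
    var body: some View {
        ScrollView {
            LazyVStack(alignment: .leading) {
                ForEach(0..<50, id: \.self) { Text("Row \($0)") }
            }
        }
        .toolbarBackground(.hidden, for: .windowToolbar)
    }
}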
Post not yet marked as solved
0 Replies
254 Views
As picture 1 shows, I have an NSTableView, and the table view has left and right insets between the scroll view and itself. However, I don't know why there are 2 NSTableBackgroundViews added in the insets. On macOS 13.4 the NSTableBackgroundViews are transparent, but on macOS 10.13 they are black, as picture 2 shows. Can anyone give some information about NSTableBackgroundView, and how to prevent them from being added automatically?
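A speculative mitigation sketch (not a documented way to suppress the private NSTableBackgroundView): have the table and its enclosing scroll view draw no background of their own, so whatever shows through the inset area matches the surrounding view.

import AppKit

// Make the table view and its scroll view transparent.
func makeTableBackgroundsTransparent(_ tableView: NSTableView) {
    tableView.backgroundColor = .clear
    if let scrollView = tableView.enclosingScrollView {
        scrollView.drawsBackground = false
    }
}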
Post not yet marked as solved
0 Replies
520 Views
I have a macOS app that has a lot of form-type inputs throughout. As such, I have subclassed NSPopUpButton and implemented canBecomeKeyView to return YES. This works great, as users can tab through fields and pop-up menus to enter data. The issue is that when tabbing to a pop-up, I expect the user to see instant changes as they type; instead, it takes 1-2 seconds before the menu selection changes. If the user hits the down arrow key, it expands the menu and then type selection is instantaneous. Is there any way to make NSPopUpButton respond instantly to typing, like a select menu in a web form? As it is, it inhibits fast data input, since you have to wait for it to recognize your input before tabbing to the next field.
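A sketch of the subclass described above (assumed, since the post's own code isn't shown), exposing the pop-up button to the key view loop so it can be tabbed to:

import AppKit

// Let the pop-up button participate in the key view loop and accept first responder.
final class KeyablePopUpButton: NSPopUpButton {
    override var canBecomeKeyView: Bool { true }
    override var acceptsFirstResponder: Bool { true }
}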
Post not yet marked as solved
1 Reply
748 Views
I have a Mac app, Notenik, written with AppKit, that dynamically generates an Edit view that is used to edit multiple fields for a Note. Sometimes, depending on a user's request, multiple NSTextViews end up appearing within the same parent view (a tab, if that matters). Lately I and some of my users have noticed that, when moving from one NSTextView to another (either by tabbing or by clicking), the cursor initially fails to appear. Once the user starts editing within the NSTextView, the cursor appears and acts normally. And this doesn't occur when moving from an ordinary text field to an NSTextView -- only when moving from one NSTextView to another. But it is troubling and confusing to users that the cursor initially disappears. I'm not sure exactly when this started happening, but it has not been happening for very long. Although I make frequent updates to the app, this section of code has had no recent changes, so I am at a loss to explain this behavior, or to correct it. Any help would be appreciated.
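A speculative workaround sketch (an assumption, not Notenik's actual code): when one of the generated text views becomes first responder, explicitly ask it to refresh its insertion point.

import AppKit

// Force the insertion point to be (re)drawn as soon as the text view takes focus.
final class EditFieldTextView: NSTextView {
    override func becomeFirstResponder() -> Bool {
        let ok = super.becomeFirstResponder()
        if ok {
            updateInsertionPointStateAndRestartTimer(true)
            needsDisplay = true
        }
        return ok
    }
}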
Post not yet marked as solved
0 Replies
335 Views
Let's say I have an NSTextView embedded in an NSClipView and NSScrollView. Let's say that the application using this NSTextView is run in a right-to-left (RTL) language. Let's say that because the text visible in the text view will always be LTR, the text view alignment is forced to left instead of the default natural setting, and that this text is big enough to require vertical and horizontal scrollers. What I'm experiencing so far is that AppKit will scroll the NSTextView to the top right corner. Which would make sense if the contents of the text view were RTL. But they are not. I looked at the documentation and headers, tried playing with a subclass of NSScrollView and its tile method, and with the userInterfaceLayoutDirection property of the scroll view. No success so far. [Q] How do you tell NSScrollView or NSTextView that the default scroll position is the top left corner and not the top right corner in such a case? Environment: macOS 10.14.6 ; AppKit ; Objective-C (if that matters)
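A speculative workaround sketch (not from the original post): after the text has been laid out, explicitly scroll the clip view back to the top-left corner.

import AppKit

// Scroll an NSScrollView so its document view's top-left corner is visible.
func scrollToTopLeft(of scrollView: NSScrollView) {
    guard let documentView = scrollView.documentView else { return }
    // NSTextView is flipped, so its top-left corner is at y = 0.
    let topLeft = NSPoint(x: 0,
                          y: documentView.isFlipped ? 0 : documentView.bounds.maxY)
    scrollView.contentView.scroll(to: topLeft)
    scrollView.reflectScrolledClipView(scrollView.contentView)
}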
Post not yet marked as solved
1 Replies
346 Views
The documentation is silent on this question. I can imagine several possibilities:
- The block is performed regardless of the run loop mode.
- The block is performed in the default run loop mode.
- The block is performed in any of the common run loop modes.
(I shouldn't have to write a test program to figure this out!)
Post not yet marked as solved
0 Replies
1k Views
Hello there! We have a (custom, borderless) NSWindow which is moved / resized by app methods, not the standard window-titlebar-live-window-server-assisted method. In the macOS 14 Sonoma first developer preview, AppKit seems to have been rearchitected and window management is all new. All window modifications now seem to be synchronised with the display (v-sync) and thus slower. That's fine, but if we have two windows moved at the same time (we have a transparent child/parent window which is moved together with the window), it's two times slower. That's very slow. NSDisableScreenUpdates and CATransaction don't seem to have any effect like they did before. Any ideas?
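A speculative sketch (an assumption about the setup, not the app's code): if the transparent overlay is attached as a child window, only the parent's frame needs to be set during a move, and AppKit keeps the child positioned relative to it.

import AppKit

// Attach the overlay once; afterwards, moving the parent moves the child with it.
func attachOverlay(_ overlay: NSWindow, to parent: NSWindow) {
    parent.addChildWindow(overlay, ordered: .above)
}

func move(_ parent: NSWindow, toOrigin origin: NSPoint) {
    parent.setFrameOrigin(origin)   // the child window follows automatically
}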
Post not yet marked as solved
0 Replies
478 Views
I've got a macOS app that has a collection view all done with bindings and no datasource implementation. I've got it so I can drag and drop items, highlight selection and delete items. The last item is to bind the collection view selection to my array controller. In IB I select the collection view then in the bindings inspector I select the array controller under selection indexes and enter the controller key of selectionIndexes. But this does nothing. If I log the collection view selectionIndexes it shows correctly but the array controller will show nothing. I've fussed with this the entire day and am starting to believe it just won't work like it does with a table view. I thought about updating the array controller selection through the collection view delegate selection methods -didSelectItemsAtIndexPaths / didDeselectItemsAtIndexPaths but it feels like I must be missing something. Should this be possible and if not what are my alternatives?
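A sketch of the delegate-based fallback the post considers (an assumption, not a confirmed answer to the bindings question), mirroring the collection view's selection into the array controller by hand; it assumes a single-section collection view:

import AppKit

final class CollectionSelectionSync: NSObject, NSCollectionViewDelegate {
    @IBOutlet var arrayController: NSArrayController!
    @IBOutlet var collectionView: NSCollectionView!

    // Copy the collection view's selection (item indexes in section 0) into the controller.
    private func syncSelection() {
        let indexes = IndexSet(collectionView.selectionIndexPaths.map { $0.item })
        arrayController.setSelectionIndexes(indexes)
    }

    func collectionView(_ collectionView: NSCollectionView,
                        didSelectItemsAt indexPaths: Set<IndexPath>) {
        syncSelection()
    }

    func collectionView(_ collectionView: NSCollectionView,
                        didDeselectItemsAt indexPaths: Set<IndexPath>) {
        syncSelection()
    }
}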
Post not yet marked as solved
0 Replies
317 Views
In a macOS application, I use SetToolTip on menuItemImage or menuItemButton. How can the tooltip font size be changed?
Post marked as solved
1 Reply
417 Views
Hello! Here is the reproducer of the problem:

#include <Foundation/Foundation.h>
#include <Cocoa/Cocoa.h>

@interface MyWindow : NSWindow {
}
- (instancetype)init;
- (BOOL)windowShouldClose:(id)sender;
@end

@interface MyView : NSView {
}
- (BOOL)acceptsFirstResponder;
- (void)mouseDown:(NSEvent *)event;
- (void)mouseDragged:(NSEvent *)event;
- (void)mouseUp:(NSEvent *)event;
@end

@implementation MyWindow

- (instancetype)init {
    [super initWithContentRect:NSMakeRect(0, 0, 300, 300)
                     styleMask:NSWindowStyleMaskTitled | NSWindowStyleMaskClosable | NSWindowStyleMaskMiniaturizable | NSWindowStyleMaskResizable
                       backing:NSBackingStoreBuffered
                         defer:NO];
    [self setTitle:@"My Window"];
    [self center];
    MyView* contentView = [[MyView alloc] init];
    [contentView setWantsLayer:TRUE];
    [[contentView layer] setBackgroundColor:[[NSColor lightGrayColor] CGColor]];
    [self setContentView: contentView];
    [self setIsVisible:YES];
    return self;
}

- (BOOL)windowShouldClose:(id)sender {
    [NSApp terminate:sender];
    return YES;
}

@end

@implementation MyView

- (BOOL)acceptsFirstResponder {
    return YES;
}

- (void)mouseDown:(NSEvent *)event {
    const NSUInteger pmb = [NSEvent pressedMouseButtons];
    NSLog(@"mouseDown: pmb=%llu, event=%@", (unsigned long long)pmb, event);
    [super mouseDown:event];
}

- (void)mouseDragged:(NSEvent *)event {
    const NSUInteger pmb = [NSEvent pressedMouseButtons];
    NSLog(@"mouseDragged: pmb=%llu, event=%@", (unsigned long long)pmb, event);
    [super mouseDragged:event];
}

- (void)mouseUp:(NSEvent *)event {
    const NSUInteger pmb = [NSEvent pressedMouseButtons];
    NSLog(@"mouseUp: pmb=%llu, event=%@", (unsigned long long)pmb, event);
    [super mouseUp:event];
}

@end

int main(int argc, char* argv[]) {
    [NSAutoreleasePool new];
    [NSApplication sharedApplication];
    [[[[MyWindow alloc] init] autorelease] makeMainWindow];
    [NSApp run];
    return 0;
}

What you need to do: run the app and make a few (~10) short mouse drags (left mouse button press -> drag -> left mouse button release) inside the window. In the application's output you'll find line(s) containing the following:

mouseDragged: pmb=0, event=...

which means that [NSEvent pressedMouseButtons] returned 0 for a drag event, i.e. there were no pressed buttons. Documentation for [NSResponder mouseDragged(with:)] says that the mouse has been moved with the pressed button. Doesn't it mean that [NSEvent pressedMouseButtons] is supposed to return this pressed button? Just in case, my setup is MBP 16" 2019 + macOS 12.6.3, and mouse actions are performed via the trackpad (not a mouse).
Post not yet marked as solved
0 Replies
610 Views
How do I make a macOS app without the entire rectangular border/window, so that only its component is showing? Like Grammarly: it's a button floating beside any text field on Mac. So far I have done:
- Set the app as an agent in the Info.plist to remove it from the Dock
- Hidden the title bar
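A sketch of one common approach to this kind of floating accessory (an assumption, not Grammarly's implementation): a borderless, non-activating panel at floating level, combined with the agent setting already mentioned.

import AppKit

// Create a borderless floating panel; only its content view will be visible.
func makeFloatingButtonPanel(contentRect: NSRect) -> NSPanel {
    let panel = NSPanel(contentRect: contentRect,
                        styleMask: [.borderless, .nonactivatingPanel],
                        backing: .buffered,
                        defer: false)
    panel.level = .floating                         // stays above normal windows
    panel.isOpaque = false
    panel.backgroundColor = .clear                  // no window chrome or background
    panel.hidesOnDeactivate = false
    panel.collectionBehavior = [.canJoinAllSpaces, .fullScreenAuxiliary]
    return panel
}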
Post not yet marked as solved
1 Reply
908 Views
I am trying to add custom attributes on-the-fly. To make it work, I subclassed NSTextLayoutFragment and overrode the .textLineFragments property to return custom-made NSTextLineFragment objects. But if I override it, TextKit 2 no longer renders the text, and selection also doesn't work. It's the same even if I provide NSTextLineFragment objects with exactly the same properties (attributed string and range). In the WWDC 22 video, you told me that NSTextLayoutFragment and NSTextLineFragment are all immutable and have value semantics. But it doesn't work with a different object, so they seem to still have very strong reference semantics. How do I make it work with custom attributes? P.S. I also reported this as a feedback -- https://feedbackassistant.apple.com/feedback/12443016
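For comparison, a sketch of the customization point usually used for per-fragment decoration (an assumption about the setup, not a confirmed fix): return a custom NSTextLayoutFragment from the layout manager delegate and override its drawing, rather than overriding textLineFragments.

import AppKit

// A fragment subclass that draws extra decoration and then lets TextKit 2 draw the text.
final class DecoratedFragment: NSTextLayoutFragment {
    override func draw(at point: CGPoint, in context: CGContext) {
        // Custom decoration would go here.
        super.draw(at: point, in: context)
    }
}

// Hand the custom fragment class to TextKit 2 via the layout manager delegate.
final class LayoutDelegate: NSObject, NSTextLayoutManagerDelegate {
    func textLayoutManager(_ textLayoutManager: NSTextLayoutManager,
                           textLayoutFragmentFor location: NSTextLocation,
                           in textElement: NSTextElement) -> NSTextLayoutFragment {
        DecoratedFragment(textElement: textElement, range: textElement.elementRange)
    }
}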