Objective-C

Objective-C is a programming language for writing iOS, iPadOS, and macOS apps.

Posts under Objective-C tag

96 Posts

-applicationDockMenu: method on NSApplicationDelegate doesn't work when attached to debugger
When I add a simple menu to the dock via the NSApplicationDelegate method -applicationDockMenu: and run the app from Xcode, it doesn't work.

    - (NSMenu *)applicationDockMenu:(NSApplication *)sender {
        NSMenu *dockMenu = [self buildDockMenu];
        if (dockMenu != nil) {
            NSLog(@"Returning dock menu.");
            return dockMenu;
        } else {
            NSLog(@"Not ready to build dock menu");
            return nil;
        }
    }

When I run the app, my main app window shows up, but nothing is logged from -applicationDockMenu: until I click outside my app's window (the desktop background, a Finder window, or whatever). Only after I click outside my app's main window does this get logged: "Returning dock menu." The "Not ready to build dock menu" message never appears. But when I right-click the dock icon, the menu doesn't show up. If I stop the app in Xcode and run it without the debugger attached, the dock menu does show up, which makes the debugging/testing situation not ideal.
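The -buildDockMenu helper referenced above isn't shown in the post; for context only, a minimal hypothetical version might look like the sketch below (the menu title and action selector are placeholders, not the poster's code):

    // Hypothetical stand-in for the -buildDockMenu helper mentioned above.
    - (NSMenu *)buildDockMenu {
        NSMenu *menu = [[NSMenu alloc] initWithTitle:@""];
        [menu addItemWithTitle:@"New Document"          // placeholder item
                        action:@selector(newDocument:)  // placeholder action
                 keyEquivalent:@""];
        return menu;
    }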
3 replies · 0 boosts · 92 views · May ’25
When using WKWebView in iOS 18.0 (Xcode 16.2) to open a local H5 page, the request for server resources cannot carry cookies.
In our project, we download H5 resources to the local device and then open the H5 pages through WKWebView (-loadFileURL:allowingReadAccessToURL:). When the H5 pages request server resources, cookies are required. Before opening the H5 page, we set the required cookies in the WKHTTPCookieStore using the setCookie method. Additionally, we set the allowFileAccessFromFileURLs and allowUniversalAccessFromFileURLs properties for the WebView. On other phones the cookies are sent normally, but on phones running iOS 18.0 they are not. Moreover, this problem only emerged after we upgraded Xcode to version 16.2. We've also tried injecting cookies via JavaScript, but it didn't work (document.cookie = "xx=${xx}; path=/; expires=weekday, xx Jan xxxx xx:xx:xx GMT; Domain=example.com; Secure; SameSite=None";). Can anyone help me with this? Thanks in advance.
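For reference, this is one way cookies are typically injected into WKHTTPCookieStore before loading local content; a minimal sketch only, with a hypothetical cookie name, value, domain, and file URLs that are not taken from the post:

    #import <WebKit/WebKit.h>

    // Assumes self.webView is an existing WKWebView; all values are placeholders.
    NSURL *localIndexURL = [NSURL fileURLWithPath:@"/path/to/local/index.html"]; // placeholder
    NSURL *localRootURL  = [localIndexURL URLByDeletingLastPathComponent];

    NSHTTPCookie *cookie = [NSHTTPCookie cookieWithProperties:@{
        NSHTTPCookiePath:   @"/",
        NSHTTPCookieName:   @"session",     // hypothetical
        NSHTTPCookieValue:  @"abc123",      // hypothetical
        NSHTTPCookieDomain: @"example.com",
        NSHTTPCookieSecure: @"TRUE"
    }];

    WKHTTPCookieStore *store = self.webView.configuration.websiteDataStore.httpCookieStore;
    [store setCookie:cookie completionHandler:^{
        // Only load the local page once the cookie is confirmed in the store.
        [self.webView loadFileURL:localIndexURL allowingReadAccessToURL:localRootURL];
    }];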
2 replies · 0 boosts · 199 views · May ’25
Query GPU metrics
Hello! I'm a developer working on a plugin for the Elgato Stream Deck, called GPU Metrics. The plugin currently only works on Windows, but I'd like to bring it to macOS. However, based on forum posts I've read (and StackOverflow), there isn't a very clear path to query GPU metrics like usage, temperature, used GPU memory, and power consumption. There are some tools out there that do similar things, but I wanted to see what the recommendation from Apple's engineering team would be for getting this data via a public API. Requirements:
Access to GPU utilization, temperature, memory usage, and power usage
A C/C++ based API for querying the metrics so I can expose the data to JavaScript via a Node addon
No need to be compatible with Intel-based Macs; Apple silicon is fine for now
Plugin GitHub
Thank you! Noah
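There is no documented public API for most of these metrics. One approach people use in practice (undocumented, may change between hardware and OS versions, and only covers utilization, not temperature or power) is reading the PerformanceStatistics dictionary that IOAccelerator services publish in the I/O Registry. A sketch under those assumptions, compiled with -framework IOKit -framework CoreFoundation:

    #include <IOKit/IOKitLib.h>
    #include <CoreFoundation/CoreFoundation.h>
    #include <stdio.h>

    // Prints GPU utilization by reading the (undocumented) "PerformanceStatistics"
    // dictionary of IOAccelerator services. Key names are not API and may vary;
    // on macOS < 12 use kIOMasterPortDefault instead of kIOMainPortDefault.
    int main(void) {
        io_iterator_t iter;
        if (IOServiceGetMatchingServices(kIOMainPortDefault,
                                         IOServiceMatching("IOAccelerator"),
                                         &iter) != KERN_SUCCESS) return 1;

        io_registry_entry_t entry;
        while ((entry = IOIteratorNext(iter))) {
            CFMutableDictionaryRef props = NULL;
            if (IORegistryEntryCreateCFProperties(entry, &props,
                                                  kCFAllocatorDefault, 0) == KERN_SUCCESS) {
                CFDictionaryRef stats = CFDictionaryGetValue(props, CFSTR("PerformanceStatistics"));
                if (stats) {
                    CFNumberRef util = CFDictionaryGetValue(stats, CFSTR("Device Utilization %"));
                    int value = -1;
                    if (util && CFNumberGetValue(util, kCFNumberIntType, &value)) {
                        printf("GPU utilization: %d%%\n", value);
                    }
                }
                CFRelease(props);
            }
            IOObjectRelease(entry);
        }
        IOObjectRelease(iter);
        return 0;
    }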
0 replies · 0 boosts · 106 views · May ’25
AVAssetWriterInputTaggedPixelBufferGroupAdaptor Hanging With Tagged Buffers
We've successfully implemented an AVAssetWriter to produce HLS streams (all code is Objective-C++ for interop with an existing codebase) but are struggling to extend the operation to use tagged buffers. We're starting to wonder if the tagged buffers required for an MV-HEVC signal are fully supported when producing HLS segments in a live-stream setting. We generate a live stream of data using something like:

    UTType *t = [UTType typeWithIdentifier:AVFileTypeMPEG4];
    m_writter = [[AVAssetWriter alloc] initWithContentType:t];
    // - videoHint describes HEVC and width/height
    // - m_videoConfig includes compression settings and, when using MV-HEVC,
    //   the correct keys are added (i.e. kVTCompressionPropertyKey_MVHEVCVideoLayerIDs).
    //   The app was throwing an exception without these, which was
    //   useful to know when we got the configuration right.
    m_video = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
                                             outputSettings:m_videoConfig
                                           sourceFormatHint:videoHint];

For either path we're producing CVPixelBufferRefs that contain the raw pixel information (i.e. 32BGRA), so we use an adaptor to keep that as simple as possible. If we use a single view and an AVAssetWriterInputPixelBufferAdaptor, things work out very well: we produce segments and the delegate is called. However, if we use the AVAssetWriterInputTaggedPixelBufferGroupAdaptor as exampled in the SideBySideToMVHEVC demo project, things go poorly. We create the tagged buffers with something like:

    CMTagCollectionRef collections[2];

    CMTag leftTags[] = {
        CMTagMakeWithSInt64Value(kCMTagCategory_VideoLayerID, (int64_t)0),
        CMTagMakeWithSInt64Value(kCMTagCategory_StereoView, kCMStereoView_LeftEye)
    };
    CMTagCollectionCreate(kCFAllocatorDefault, leftTags, 2, &(collections[0]));

    CMTag rightTags[] = {
        CMTagMakeWithSInt64Value(kCMTagCategory_VideoLayerID, (int64_t)1),
        CMTagMakeWithSInt64Value(kCMTagCategory_StereoView, kCMStereoView_RightEye)
    };
    CMTagCollectionCreate(kCFAllocatorDefault, rightTags, 2, &(collections[1]));

    CFArrayRef tagCollections = CFArrayCreate(kCFAllocatorDefault, (const void **)collections, 2, &kCFTypeArrayCallBacks);

    CVPixelBufferRef buffers[] = {*b, *alt};
    CFArrayRef b = CFArrayCreate(kCFAllocatorDefault, (const void **)buffers, 2, &kCFTypeArrayCallBacks);

    CMTaggedBufferGroupRef bufferGroup;
    OSStatus res = CMTaggedBufferGroupCreate(kCFAllocatorDefault, tagCollections, b, &bufferGroup);

Perhaps there's something about this Objective-C code that I've buggered up? Hopefully! Anyway, when I submit this tagged buffer group to the adaptor:

    if (![mvVideoAdapter appendTaggedPixelBufferGroup:bufferGroup withPresentationTime:pts]) {
        // report error...
    }

Appending does not raise any errors; eventually it just hangs and we never return from it.

Real issue: so either
1. the delegate assigned to the AVAssetWriter doesn't fire its assetWriter callback which should produce the segments, or
2. the adaptor hangs on appendTaggedPixelBufferGroup before a segment is ready to be completed (but succeeds for a number of buffer groups before this happens).

This is the same delegate class that's assigned to the non-multi-view code path when MV-HEVC is turned off, which works perfectly.
1 reply · 0 boosts · 69 views · Apr ’25
Xcode 15.4: "Swift.h" file not found for simulator
I'm currently adding Swift widgets to my existing Objective-C project, and building it for ios-simulator causes a "Swift.h" file not found error. It works without issue for device builds. I can see the file compiled under DerivedData at and set the Header Search Paths and User Header Search Paths to: $(CONFIGURATION_TEMP_DIR)/$(PROJECT_NAME).build/DerivedSources, but it still doesn't work. Removing the #import "proj-Swift.h" line fixes the issue, but I need to import it to use WidgetCenter's reloadAllTimelines. I checked that the file is being generated correctly by viewing the autogenerated file and its contents. Any advice and direction would be a great help. Been stuck on this all week and I can't think of a different solution.
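Not a fix, but a small diagnostic sometimes used while chasing this kind of error: guard the import so the build log tells you explicitly whether the generated interface header is visible for the simulator slice. The header name below is the one from the post and is assumed to follow the usual <ProductModuleName>-Swift.h convention:

    // "proj-Swift.h" is the name used in the post; adjust to your product module name.
    #if __has_include("proj-Swift.h")
    #import "proj-Swift.h"
    #else
    #warning "Generated Swift interface header not found for this build (check search paths / platform)"
    #endif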
1 reply · 0 boosts · 85 views · Apr ’25
NSTask-launch path not accessible
I'm trying to launch a command-line app from my Objective-C application (sandboxed) using NSTask, and I keep getting "launch path not accessible". Here is the path:

    [task setLaunchPath:@"/usr/local/bin/codeview"];

I have set the appropriate attributes for codeview, it works perfectly when I use it from the command line, and /usr/local/bin IS in $PATH. I know I have NSTask configured correctly because this WILL work:

    [task setLaunchPath:@"/usr/bin/hexdump"];

with the exception being that I'm using a command already in /usr/bin. But I can't copy codeview into /usr/bin due to SIP. I've tried moving codeview to various other non-SIP-protected locations, all to no avail. Must all NSTask commands come from /usr/bin? Where might I put codeview so that it can be launched? Today I'm going to use an older computer, disable SIP to put my command in /usr/bin, and see if that works. If it does, I will do it on my main machine.
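For context, a sandboxed app generally cannot execute arbitrary binaries outside its own bundle, regardless of where they sit on disk. A common pattern is to ship the helper inside the app bundle and launch it from there. A minimal sketch, assuming codeview has been copied into the bundle (for example via a Copy Files build phase) so NSBundle can find it as an auxiliary executable:

    // Launch a helper bundled with the app. The "codeview" name and arguments
    // are taken from the post; the bundling setup itself is an assumption.
    NSURL *toolURL = [[NSBundle mainBundle] URLForAuxiliaryExecutable:@"codeview"];
    NSTask *task = [[NSTask alloc] init];
    task.executableURL = toolURL;
    task.arguments = @[@"--version"];   // placeholder arguments
    NSError *error = nil;
    if (![task launchAndReturnError:&error]) {
        NSLog(@"Failed to launch codeview: %@", error);
    }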
6 replies · 0 boosts · 141 views · Apr ’25
What is the correct syntax to continue in app for custom intent?
I have a custom intent. When my app is unable to complete the resolution of a parameter within the app extension, I need to be able to continue within the app. I am unable to figure out what the correct Objective-C syntax is to enable the execution to continue with the app. Here is what I have tried:

    completion([[PickWoodIntentResponse init] initWithCode:PickWoodIntentResponseCodeContinueInApp userActivity:nil]);

This results in the following error:

    Implicit conversion from enumeration type 'enum PickWoodIntentResponseCode' to different enumeration type 'INAnswerCallIntentResponseCode' (aka 'enum INAnswerCallIntentResponseCode')

I have no idea why it is referring to the enum type 'INAnswerCallIntentResponseCode', which is unrelated to my app. I have also tried:

    PickWoodIntentResponse *response = [[PickWoodIntentResponse init] initWithCode:PickWoodIntentResponseCodeContinueInApp userActivity:nil];
    completion(response);

but that results in 2 errors:

    Implicit conversion from enumeration type 'enum PickWoodIntentResponseCode' to different enumeration type 'INAnswerCallIntentResponseCode' (aka 'enum INAnswerCallIntentResponseCode')

and

    Incompatible pointer types passing 'PickWoodIntentResponse *' to parameter of type 'INStringResolutionResult *'

The relevant autogenerated code provided to me with the creation of my intent is as follows:

    @class PickWoodIntentResponse;

    @protocol PickWoodIntentHandling <NSObject>
    - (void)resolveVarietyForPickWood:(PickWoodIntent *)intent
                       withCompletion:(void (^)(INStringResolutionResult *resolutionResult))completion
        NS_SWIFT_NAME(resolveVariety(for:with:))
        API_AVAILABLE(ios(13.0), macos(11.0), watchos(6.0));
    @end

    typedef NS_ENUM(NSInteger, PickWoodIntentResponseCode) {
        PickWoodIntentResponseCodeUnspecified = 0,
        PickWoodIntentResponseCodeReady,
        PickWoodIntentResponseCodeContinueInApp,
        PickWoodIntentResponseCodeInProgress,
        PickWoodIntentResponseCodeSuccess,
        PickWoodIntentResponseCodeFailure,
        PickWoodIntentResponseCodeFailureRequiringAppLaunch
    };

    @interface PickWoodIntentResponse : INIntentResponse
    - (instancetype)init NS_UNAVAILABLE;
    - (instancetype)initWithCode:(PickWoodIntentResponseCode)code userActivity:(nullable NSUserActivity *)userActivity NS_DESIGNATED_INITIALIZER;
    @property (readonly, NS_NONATOMIC_IOSONLY) PickWoodIntentResponseCode code;
    @end

Am I overlooking something? What would be the proper syntax to have within the completion block to satisfy the compiler?
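Two things stand out, offered as a sketch rather than a confirmed answer. First, +init and +alloc are swapped in the snippets above; the response would normally be built with [[PickWoodIntentResponse alloc] initWithCode:...]. Second, judging from the generated protocol shown, the completion being called is the one for resolveVariety..., which expects an INStringResolutionResult, not an intent response; a ContinueInApp response belongs in the intent's handling method. Assuming the generated protocol also declares a handlePickWood:completion: requirement and exposes the parameter as a variety property (both assumptions), it might look like this:

    // Parameter resolution: this completion takes an INStringResolutionResult.
    - (void)resolveVarietyForPickWood:(PickWoodIntent *)intent
                       withCompletion:(void (^)(INStringResolutionResult *))completion {
        if (intent.variety.length > 0) {                 // "variety" property is assumed
            completion([INStringResolutionResult successWithResolvedString:intent.variety]);
        } else {
            completion([INStringResolutionResult needsValue]);
        }
    }

    // Intent handling (assumed method name): this completion takes the intent
    // response, so ContinueInApp is returned here, with alloc/init in the usual order.
    - (void)handlePickWood:(PickWoodIntent *)intent
                completion:(void (^)(PickWoodIntentResponse *))completion {
        PickWoodIntentResponse *response =
            [[PickWoodIntentResponse alloc] initWithCode:PickWoodIntentResponseCodeContinueInApp
                                            userActivity:nil];
        completion(response);
    }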
1 reply · 0 boosts · 88 views · Apr ’25
What's the idea behind the changes in the Objective-C flavor of the Foundation documentation?
I just noticed that when you check the online documentation for Foundation using the delicious Objective-C flavor, some values are no longer the expected ones: https://developer.apple.com/documentation/foundation/filemanager/copyitem(at:to:)?language=objc#return-value "true if the item was copied successfully or the file manager’s delegate stopped the operation deliberately. Returns false if an error occurred." Considering that a BOOL has been YES or NO for the last quarter of a century, I have the following question: [Q] What is the idea behind these disturbing changes to the Objective-C documentation?
5 replies · 0 boosts · 127 views · Apr ’25
Detached Keychain Suggestion Transparent UI when Programmatically Focusing NSSecureTextField (AppKit/Objective-C)
Environment:
• macOS: Sequoia 15.3.2
• Xcode: 16.2
• Framework: AppKit (Objective-C)
Issue: When programmatically setting the first responder to an NSSecureTextField shortly after its containing window loads or becomes key, a visual anomaly intermittently occurs (roughly 50% of the time). A semi-transparent UI element, likely related to the system's Keychain password suggestion/autofill feature, appears detached from the text field. Instead of anchoring to the field, it renders elsewhere on the screen. I found similar issues discussed here:
https://stackoverflow.com/questions/74220070/strange-transparent-view-appears-beneath-textfield-in-mac-catalyst-app
https://stackoverflow.com/questions/73277582/swiftui-view-with-textfield-and-securefield-buggy-on-macos-shows-strange-view/73615876#73615876
https://developer.apple.com/forums/thread/708075
1 reply · 0 boosts · 60 views · Mar ’25
Custom keypad touchUpInside events not working in iOS18
I have a custom keypad to accept numeric input on iPads that I have been using for many years now; this is longstanding, working code. With iOS 18, the touchUpInside (and other) events are no longer delivered to the underlying Objective-C file's-owner module when the buttons are activated from the interface. The buttons seem to be properly activated based on the visual cues (they change color when pressed). This occurs both in simulators and on hardware, and setting the target OS version does not help. What could the cause and/or solution be?
0 replies · 0 boosts · 50 views · Mar ’25
Healthstore read only in objective-C ?
I have an app in Objective-C that uses Health data (walk/run, cycling) to give advice to users. I do not want/need to write any data to HealthKit. If I do this (with the 3 usage-description values in the Info.plist):

    [self.healthStore requestAuthorizationToShareTypes:nil readTypes:readDataTypes ...]

my request crashes:

    *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: 'Must request authorization for at least one data type'
    *** First throw call stack:
    (
    0 CoreFoundation   0x00000001804b910c __exceptionPreprocess + 172
    1 libobjc.A.dylib  0x0000000180092da8 objc_exception_throw + 72
    2 CoreFoundation   0x00000001804b901c -[NSException initWithCoder:] + 0
    3 HealthKit        0x000000019da034d4 -[HKHealthStore _validateAuthorizationRequestWithShareTypes:readTypes:] + 92
    4 HealthKit        0x000000019da03670 -[HKHealthStore requestAuthorizationToShareTypes:readTypes:shouldPrompt:completion:] + 292

BUT in Swift:

    healthStore.requestAuthorization(toShare: nil, read: readTypes)

works, and presents only my two read-only data types, on the same iOS version and the same phone, without crashing. What is the difference? Are a nil object in Objective-C and a nil object in Swift not the same? How do I make read-only requests in Objective-C?
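For reference, the Objective-C method does accept a nil share set (the header declares it nullable), and the exception quoted above usually means the read set itself ended up empty or nil at runtime, which is worth verifying. A minimal read-only sketch; the two quantity types below are placeholders, not necessarily the ones the app uses:

    #import <HealthKit/HealthKit.h>

    // Assumes self.healthStore is an existing HKHealthStore instance.
    NSSet<HKObjectType *> *readTypes = [NSSet setWithArray:@[
        [HKObjectType quantityTypeForIdentifier:HKQuantityTypeIdentifierDistanceWalkingRunning],
        [HKObjectType quantityTypeForIdentifier:HKQuantityTypeIdentifierDistanceCycling]
    ]];
    [self.healthStore requestAuthorizationToShareTypes:nil
                                             readTypes:readTypes
                                            completion:^(BOOL success, NSError *error) {
        if (!success) {
            NSLog(@"HealthKit authorization failed: %@", error);
        }
    }];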
5 replies · 0 boosts · 370 views · Mar ’25
macOS maximum CPU usage of application
My audio and MIDI sequencer application consumes about 600% of CPU power with 10 different instruments during playback, and approximately 100% while idle. What is the maximum CPU power that an application can consume? Are there any limits, and could they be modified? I am asking because if I add more instruments, the real-time behaviour gets bad at around 700% of CPU power. I have the following hardware: MacBook Pro 14-inch, Nov 2024, Apple M4 Pro, 24 GB.
1 reply · 0 boosts · 192 views · Mar ’25
Adjusting the width of a UISlider in self.navigationItem.titleView
I set the titleView of a view controller to a UISlider in viewDidLoad like so:

    UISlider *slider = [UISlider new];
    self.navigationItem.titleView = slider;

The desired outcome is that the slider takes the full width of the title view. This works fine when the view is loaded in the wider landscape mode, and the slider adjusts its size as expected when rotating to portrait mode. However, when the view is loaded in the narrower portrait mode and then the device is rotated to landscape, the slider does not grow in width to make use of the newly available space. Why is that so, and how can I get the desired outcome described above?
(Screenshots in the original post: after viewDidLoad, and after rotating.)
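One workaround often suggested for titleView sizing, offered here as a sketch rather than an official answer: wrap the slider in a container view that reports an expanded intrinsic content size, so the navigation bar stretches it to whatever width is available after rotation. FullWidthTitleView is a hypothetical name:

    // Hypothetical container whose intrinsic size asks for as much room as possible;
    // the navigation bar then clamps it to the available title-view width.
    @interface FullWidthTitleView : UIView
    @end

    @implementation FullWidthTitleView
    - (CGSize)intrinsicContentSize {
        return UILayoutFittingExpandedSize;
    }
    @end

    // In viewDidLoad:
    FullWidthTitleView *container = [[FullWidthTitleView alloc] initWithFrame:CGRectZero];
    UISlider *slider = [UISlider new];
    slider.translatesAutoresizingMaskIntoConstraints = NO;
    [container addSubview:slider];
    [NSLayoutConstraint activateConstraints:@[
        [slider.leadingAnchor constraintEqualToAnchor:container.leadingAnchor],
        [slider.trailingAnchor constraintEqualToAnchor:container.trailingAnchor],
        [slider.centerYAnchor constraintEqualToAnchor:container.centerYAnchor]
    ]];
    self.navigationItem.titleView = container;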
1 reply · 0 boosts · 259 views · Mar ’25
How do we retrieve UnknownFSObjectIcon.icns these days?
In the good old days, it was possible to retrieve the UnknownFSObjectIcon.icns icon dynamically using:

    [[NSWorkspace sharedWorkspace] iconForFileType:NSFileTypeForHFSTypeCode(kUnknownFSObjectIcon)];

Now this solution is considered deprecated (though still working) by recent macOS SDKs.
[Q] What is the modern equivalent of this solution?
Notes:
Yes, reading the file directly works but is more fragile than using a system API.
Yes, Xcode suggests using the iconForContentType: method, but I haven't found which UTType should be used.
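A sketch of the replacement API for reference, with the caveat that which UTType actually maps to UnknownFSObjectIcon is exactly the open question here; UTTypeItem is used below purely as an illustration, not as a confirmed equivalent:

    #import <AppKit/AppKit.h>
    #import <UniformTypeIdentifiers/UniformTypeIdentifiers.h>

    // macOS 11+: icon lookup by UTType. Whether UTTypeItem (or UTTypeData, etc.)
    // reproduces the old kUnknownFSObjectIcon artwork is an assumption to verify.
    NSImage *icon = [[NSWorkspace sharedWorkspace] iconForContentType:UTTypeItem];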
1 reply · 0 boosts · 315 views · Mar ’25
Objective-C: instantiating a Class object
My company wants to ensure that if my Objective-C to Swift conversions fail in any way, the app can revert to using the older Objective-C code. By using a remotely controllable flag, the app can switch which code runs, as both are compiled into the app. Essentially, I create a protocol that describes the original class, then both classes (with an "s" or "o" appended to them) conform to the protocol.

Protocol: Object
Objective-C class: oObject
Swift class: sObject

That said, I hit one issue that I just can't seem to reason out. I created an Objective-C function that returns the appropriate class:

    Class<Object> classObject(void) {
        if (myFlag) {
            return [sObject class];
        } else {
            return [oObject class];
        }
    }

Swift deals with this really well; I can create an initialized object using:

    let object = classObject().init()

but I cannot find a way to do this in Objective-C:

    Object *object = [[classObject() alloc] init];

fails with "No known class method for selector 'alloc'". Is there a way to do this?
David
PS: my workaround is to return an allocated object:

    Object *createObject(void) {
        if (myFlag) {
            return [sObject alloc];
        } else {
            return [oObject alloc];
        }
    }
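For what it's worth, the error comes from the Class<Object> type: with a protocol-qualified Class, the compiler only accepts class methods declared in that protocol, and +alloc is an NSObject class method, not part of the protocol. A minimal sketch of one way around it, using the names from the post (casting to plain Class, or changing the function's return type):

    // Cast away the protocol qualification so +alloc is visible again,
    // then keep the protocol type on the resulting instance.
    id<Object> object = [[(Class)classObject() alloc] init];

    // Alternatively, declare the factory as returning a plain Class:
    //   Class classObject(void);
    // and then [[classObject() alloc] init] compiles directly.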
4 replies · 0 boosts · 466 views · Feb ’25
How to validate a property list has the right structure
I need to read data from the user. For convenience, the data will be in a property list, so it's easy to get a dictionary containing the property list data. But, since it's coming from outside, I need to validate that the data is in the required format, i.e. it has the right keys and the right sort of data for each key, e.g. <name> has a string, <keys> has an array of appropriate values. Since this is part of a long-established product, and targets 10.13, I want to do this in Objective-C if possible. I've been working mostly with Swift in recent years, so I've forgotten a lot of what I used to know about Objective-C, I'm sure. My first thought was to obtain the value for each key and check the class type with isa, but I see that's deprecated in macOS 13 with no replacement. I don't see another way to check the class. I'm sure other people have solved the same problem, but my searches have not turned up any answers.
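Direct isa access is deprecated, but class-membership checks themselves are not; -isKindOfClass: is the usual tool and is available well before macOS 10.13. A minimal validation sketch: the key names "name" and "keys" come from the post's example, everything else is illustrative:

    // Returns YES if the dictionary has the expected shape:
    // "name" -> NSString, "keys" -> NSArray of NSStrings.
    // Note: messaging a nil value returns NO, so missing keys fail validation too.
    static BOOL ValidatePlist(NSDictionary *plist) {
        if (![plist isKindOfClass:[NSDictionary class]]) return NO;

        NSString *name = plist[@"name"];
        if (![name isKindOfClass:[NSString class]]) return NO;

        NSArray *keys = plist[@"keys"];
        if (![keys isKindOfClass:[NSArray class]]) return NO;
        for (id entry in keys) {
            if (![entry isKindOfClass:[NSString class]]) return NO;
        }
        return YES;
    }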
3 replies · 0 boosts · 357 views · Feb ’25
swiftui fileimporter inside UIHostingController
I'm working on an old iOS app that started with Objective-C + UIKit and has been migrated to Swift + SwiftUI. Its code is now mostly Swift + SwiftUI, but it still has some Objective-C and some UIKit view controllers. One of the SwiftUI views uses fileImporter to open the Files app and select a file from the device. This worked well until iOS 18 launched. With iOS 18 the file picker does not launch correctly and is frozen in every simulator (the only real device I could test with iOS 18 seemed to work correctly). I managed to clone my project and strip it down to the minimal set of files needed to reproduce this error. This is the code:

AppDelegate.h

    #import <UIKit/UIKit.h>

    @interface AppDelegate : UIResponder <UIApplicationDelegate> {}
    @property (strong, nonatomic) UIWindow *window;
    @end

AppDelegate.m

    #import "AppDelegate.h"
    #import "MyApp-Swift.h"

    @interface AppDelegate ()
    @end

    @implementation AppDelegate
    - (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
        self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
        self.window.backgroundColor = [UIColor whiteColor];
        [self.window makeKeyAndVisible];
        FirstViewBuilder *viewBuilder = [[FirstViewBuilder alloc] init];
        [viewBuilder show];
        return YES;
    }
    @end

FirstViewBuilder.swift

    import SwiftUI

    @objc class FirstViewBuilder: NSObject {
        private var view: UIHostingController<FirstView>

        @objc override init() {
            self.view = MyHostingController(rootView: FirstView())
        }

        @objc func show() {
            let app = UIApplication.shared.delegate as? AppDelegate
            let window = app?.window
            window?.backgroundColor = .white
            // Use navigationController or view directly depending on use
            window?.rootViewController = view
        }
    }

FirstView.swift

    import SwiftUI

    struct FirstView: View {
        @State var hasToOpenFilesApp = false

        var body: some View {
            VStack(alignment: .leading, spacing: 0) {
                Button("Open Files app") {
                    hasToOpenFilesApp = true
                }.fileImporter(isPresented: $hasToOpenFilesApp, allowedContentTypes: [.text]) { result in
                    switch result {
                    case .success(let url):
                        print(url.debugDescription)
                    case .failure(let error):
                        print(error.localizedDescription)
                    }
                }
            }
        }
    }

And finally, MyHostingController

    import SwiftUI

    class MyHostingController<Content>: UIHostingController<Content> where Content: View {
        override init(rootView: Content) {
            super.init(rootView: rootView)
        }

        @objc required dynamic init?(coder aDecoder: NSCoder) {
            fatalError("init(coder:) has not been implemented")
        }

        override func viewDidLoad() {
            super.viewDidLoad()
            navigationItem.hidesBackButton = true
        }
    }

Launching this in an iPhone 13 Pro (18.2) simulator, I click on Open Files app; it takes 2 seconds to open, and it opens full screen (not like a modal). Buttons at the top are behind the status bar and buttons at the bottom are behind the Home indicator. Worse, the user can't interact with this view; it's frozen. I created a fresh SwiftUI project with just this single view and the file import worked as expected, so I thought the problem was due to embedding the SwiftUI view inside the UIHostingController. So I made these modifications to the minimal project: remove the files AppDelegate, FirstViewBuilder and MyHostingController, and create this SwiftUI App file:

    import SwiftUI

    @main
    struct MyApp: App {
        var body: some Scene {
            WindowGroup {
                FirstView()
            }
        }
    }

And again, the same problem with iOS 18. But if I launch this exact project in an iPhone 13 Pro (17.4) simulator and open the Files app (now it opens almost instantly), it works fine and shows the file picker as a modal, as expected, and I can interact with it and select files. The last thing I tried was removing LaunchScreen.xib from my project and the "Launch screen interface file base name" key from my Info.plist, but the problem keeps happening. I guess it must be due to my (very old) project configuration, but I have no more ideas of where to look. Moving the old project into a fresh SwiftUI project could take me several weeks, so I'm discarding that for the moment. Could I use another method to select files from SwiftUI views with iOS 18?
2 replies · 0 boosts · 433 views · Feb ’25
Populating Now Playing with Objective-C
Hello. I am attempting to display the music inside my app in Now Playing. I've tried a few different methods and keep running into unknown issues. I'm new to Objective-C and Apple development, so I'm at a loss for how to continue. Currently, I have an external call to viewDidLoad upon initialization. Then, when I'm ready to play the music, I call playMusic. I have it hardcoded to play an mp3 called "1". I believe I have all the signing set up, as the music keeps playing after I exit the app. However, there is nothing in Now Playing. There are no errors or issues that I can see while the app is running. This is the only file I have in Xcode relating to this feature. Please let me know where I'm going wrong or if there is another object I need to use!

    #import <Foundation/Foundation.h>
    #import <UIKit/UIKit.h>
    #import <MediaPlayer/MediaPlayer.h>
    #import <AVFoundation/AVFoundation.h>

    @interface ViewController : UIViewController <AVAudioPlayerDelegate>
    @property (nonatomic, strong) AVPlayer *player;
    @property (nonatomic, strong) MPRemoteCommandCenter *commandCenter;
    @property (nonatomic, strong) MPMusicPlayerController *controller;
    @property (nonatomic, strong) MPNowPlayingSession *nowPlayingSession;
    @end

    @implementation ViewController

    - (void)viewDidLoad {
        [super viewDidLoad];
        NSLog(@"viewDidLoad started.");
        [self setupAudioSession];
        [self initializePlayer];
        [self createNowPlayingSession];
        [self configureNowPlayingInfo];
        NSLog(@"viewDidLoad completed.");
    }

    - (void)setupAudioSession {
        AVAudioSession *audioSession = [AVAudioSession sharedInstance];
        NSError *setCategoryError = nil;
        if (![audioSession setCategory:AVAudioSessionCategoryPlayback error:&setCategoryError]) {
            NSLog(@"Error setting category: %@", [setCategoryError localizedDescription]);
        } else {
            NSLog(@"Audio session category set.");
        }
        NSError *activationError = nil;
        if (![audioSession setActive:YES error:&activationError]) {
            NSLog(@"Error activating audio session: %@", [activationError localizedDescription]);
        } else {
            NSLog(@"Audio session activated.");
        }
    }

    - (void)initializePlayer {
        NSString *soundFilePath = [NSString stringWithFormat:@"%@/base/game/%@", [[NSBundle mainBundle] resourcePath], @"bgm/1.mp3"];
        if (!soundFilePath) {
            NSLog(@"Audio file not found.");
            return;
        }
        NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
        self.player = [AVPlayer playerWithURL:soundFileURL];
        NSLog(@"Player initialized with URL: %@", soundFileURL);
    }

    - (void)createNowPlayingSession {
        self.nowPlayingSession = [[MPNowPlayingSession alloc] initWithPlayers:@[self.player]];
        NSLog(@"Now Playing Session created with players: %@", self.nowPlayingSession.players);
    }

    - (void)configureNowPlayingInfo {
        MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
        CMTime duration = self.player.currentItem.duration;
        Float64 durationSeconds = CMTimeGetSeconds(duration);
        CMTime currentTime = self.player.currentTime;
        Float64 currentTimeSeconds = CMTimeGetSeconds(currentTime);
        NSDictionary *nowPlayingInfo = @{
            MPMediaItemPropertyTitle: @"Example Title",
            MPMediaItemPropertyArtist: @"Example Artist",
            MPMediaItemPropertyPlaybackDuration: @(durationSeconds),
            MPNowPlayingInfoPropertyElapsedPlaybackTime: @(currentTimeSeconds),
            MPNowPlayingInfoPropertyPlaybackRate: @(self.player.rate)
        };
        infoCenter.nowPlayingInfo = nowPlayingInfo;
        NSLog(@"Now Playing info configured: %@", nowPlayingInfo);
    }

    - (void)playMusic {
        [self.player play];
        [self createNowPlayingSession];
        [self configureNowPlayingInfo];
    }

    - (void)pauseMusic {
        [self.player pause];
        [self configureNowPlayingInfo];
    }

    @end
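One detail that often matters here, offered as a hedged suggestion since it isn't in the code above: the system generally treats an app as the Now Playing client only when it registers handlers with MPRemoteCommandCenter; setting nowPlayingInfo alone isn't always enough. A minimal sketch that could be called once, e.g. from viewDidLoad:

    // Register remote command handlers so the system shows this app in Now Playing
    // and can route play/pause from the lock screen and Control Center.
    - (void)setupRemoteCommands {
        MPRemoteCommandCenter *center = [MPRemoteCommandCenter sharedCommandCenter];
        [center.playCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
            [self playMusic];
            return MPRemoteCommandHandlerStatusSuccess;
        }];
        [center.pauseCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
            [self pauseMusic];
            return MPRemoteCommandHandlerStatusSuccess;
        }];
    }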
2 replies · 0 boosts · 531 views · Feb ’25