Objective-C

RSS for tag

Objective-C is a programming language for writing iOS, iPadOS, and macOS apps.

Posts under Objective-C tag

136 Posts

Post

Replies

Boosts

Views

Activity

Custom keypad touchUpInside events not working in iOS 18
I have a custom keypad that accepts numeric input on iPads and that I have been using for many years. This is longstanding, working code. With iOS 18, the touchUpInside (and other) events in the underlying Objective-C modules are no longer called in the file's owner module when triggered from the interface. The buttons appear to be properly activated based on the visual cues (they change color when pressed). This occurs both in simulators and on hardware. Setting the target OS version does not help. What could be causing this, and how can it be fixed?
0
0
24
1w
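For the custom-keypad question above, one way to narrow down whether the Interface Builder connections or the control events themselves are at fault is to attach the target-action in code and log the event. This is a minimal diagnostic sketch, not a fix; keypadButton and numberPressed: are hypothetical names standing in for the real outlets and actions.

- (void)viewDidLoad {
    [super viewDidLoad];
    // Re-attach the action in code; if this fires while the IB-connected
    // action does not, the nib/outlet connections are the problem rather
    // than UIControl event delivery itself.
    [self.keypadButton addTarget:self
                          action:@selector(numberPressed:)
                forControlEvents:UIControlEventTouchUpInside];
}

- (IBAction)numberPressed:(UIButton *)sender {
    NSLog(@"touchUpInside received from button tag %ld", (long)sender.tag);
}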
macOS maximum CPU usage of application
My audio and MIDI sequencer application consumes about 600% of CPU power with 10 different instruments during playback, and approximately 100% while idle. What is the maximum CPU power that an application can consume? Are there any limits, and could they be modified? I am asking because if I add more instruments, the real-time behaviour gets bad at around 700% of CPU power. I have the following hardware: MacBook Pro 14-inch, Nov 2024, Apple M4 Pro, 24 GB.
1
0
138
1w
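Regarding the CPU-usage question above: in the top/Activity Monitor convention, 100% corresponds to one fully busy core, so the practical ceiling for a single process is roughly the number of available cores times 100%, minus whatever the rest of the system needs; on an M4 Pro, sustained real-time audio work may effectively be bounded by the performance cores rather than the total. A small sketch to query the relevant counts at runtime:

#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        NSProcessInfo *info = [NSProcessInfo processInfo];
        // processorCount: all cores on the machine.
        // activeProcessorCount: cores currently available to this process.
        // Each one corresponds to "100%" in the per-process CPU figure.
        NSLog(@"Processor count: %lu", (unsigned long)info.processorCount);
        NSLog(@"Active processor count: %lu", (unsigned long)info.activeProcessorCount);
    }
    return 0;
}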
WiFi connect error, NEHotspotConfigurationErrorDomain code=11
Hi everybody, when I use the following code to connect to a WiFi network, an error message of "error=null" or "error='Error Domain=NEHotspotConfigurationErrorDomain Code=11 "" UserInfo={NSLocalizedDescription=}'" occurs. It has been reported via Feedback. Feedback ID: FB16819345 (WiFi - unable to join the network)

NEHotspotConfiguration *hotspotConfig = [[NEHotspotConfiguration alloc] initWithSSID:ssid
                                                                           passphrase:psk
                                                                                isWEP:NO];
[[NEHotspotConfigurationManager sharedManager] applyConfiguration:hotspotConfig
                                                 completionHandler:^(NSError * _Nullable error) {
}];
7
0
259
1d
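The sketch below only shows how to surface more information from the completion handler of the call in the post above; it does not explain code 11 itself. As prerequisites, the Hotspot Configuration entitlement must be present and the call is expected to be made while the app is in the foreground.

[[NEHotspotConfigurationManager sharedManager] applyConfiguration:hotspotConfig
                                                 completionHandler:^(NSError * _Nullable error) {
    if (error == nil) {
        NSLog(@"Configuration applied (or the device was already associated).");
        return;
    }
    if (error.code == NEHotspotConfigurationErrorAlreadyAssociated) {
        // Already joined to this SSID: usually safe to treat as success.
        NSLog(@"Already associated with %@", ssid);
        return;
    }
    NSLog(@"applyConfiguration failed: domain=%@ code=%ld %@",
          error.domain, (long)error.code, error.localizedDescription);
}];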
HealthStore read-only in Objective-C?
I have an app in Objective-C that uses Health data (walk/run, cycling) to give advice to users. I do not want/need to write any data to HealthKit. If I do (with the 3 values in the plist / Info.plist):

self.healthStore requestAuthorizationToShareTypes:nil readTypes:readDataTypes

my request crashes:

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: 'Must request authorization for at least one data type'
*** First throw call stack:
(
0 CoreFoundation   0x00000001804b910c __exceptionPreprocess + 172
1 libobjc.A.dylib  0x0000000180092da8 objc_exception_throw + 72
2 CoreFoundation   0x00000001804b901c -[NSException initWithCoder:] + 0
3 HealthKit        0x000000019da034d4 -[HKHealthStore _validateAuthorizationRequestWithShareTypes:readTypes:] + 92
4 HealthKit        0x000000019da03670 -[HKHealthStore requestAuthorizationToShareTypes:readTypes:shouldPrompt:completion:] + 292
)

BUT in Swift:

healthStore.requestAuthorization(toShare: nil, read: readTypes)

works, presents only my two data types to read, on the same iOS version and the same phone, without crashing. What is the difference? Is nil in Objective-C not the same as nil in Swift? How do I make read-only requests in Objective-C?
5
0
324
1w
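For the read-only HealthKit question above: both type-set parameters are nullable in the Objective-C signature as well, and the exception is thrown when the store ends up with no types at all to request, so one likely cause is that readDataTypes is nil or empty at the time of the call. A minimal read-only request sketch in Objective-C; the specific quantity-type identifiers and the completion block are illustrative.

NSSet<HKObjectType *> *readTypes = [NSSet setWithObjects:
    [HKObjectType quantityTypeForIdentifier:HKQuantityTypeIdentifierStepCount],
    [HKObjectType quantityTypeForIdentifier:HKQuantityTypeIdentifierDistanceCycling],
    nil];

// Share types may be nil for a read-only request, as long as readTypes is
// non-nil and non-empty; otherwise HealthKit throws
// "Must request authorization for at least one data type".
[self.healthStore requestAuthorizationToShareTypes:nil
                                          readTypes:readTypes
                                         completion:^(BOOL success, NSError * _Nullable error) {
    if (!success) {
        NSLog(@"HealthKit authorization failed: %@", error);
    }
}];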
Adjusting the width of a UISlider in self.navigationItem.titleView
I set the titleView of a view controller to a UISlider in viewDidLoad like so:

UISlider *slider = [UISlider new];
self.navigationItem.titleView = slider;

The desired outcome is that the slider takes the full width of the title view. This works fine when the view is loaded in the wider landscape mode, and the slider adjusts its size as expected when rotating to portrait. However, when the view is loaded in the narrower portrait mode and the device is then rotated to landscape, the slider does not grow in width to make use of the newly available space. Why is that, and how can I get the desired outcome? [Screenshots: after viewDidLoad, and after rotating.]
1
0
215
3w
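A possible workaround for the titleView question above: give the slider an initial frame and a flexible-width autoresizing mask so the navigation bar can resize it when the available title width changes. A sketch, under the assumption that the title view is sized by autoresizing rather than by constraints:

UISlider *slider = [[UISlider alloc] initWithFrame:CGRectMake(0, 0, 320, 44)];
// Let the navigation bar stretch or shrink the slider whenever the bar's
// layout width changes (for example on rotation).
slider.autoresizingMask = UIViewAutoresizingFlexibleWidth;
self.navigationItem.titleView = slider;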
How do we retrieve UnknownFSObjectIcon.icns these days?
In the good old days, it was possible to retrieve the UnknownFSObjectIcon.icns icon dynamically using:

[[NSWorkspace sharedWorkspace] iconForFileType:NSFileTypeForHFSTypeCode(kUnknownFSObjectIcon)];

This solution is now considered deprecated (but still works) by recent macOS SDKs. [Q] What is the modern equivalent of this solution? Notes: Yes, reading the file directly works, but that is more fragile than using a system API. Yes, Xcode suggests using the iconForContentType: method, but I haven't found which UTType should be used.
1
0
261
3w
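For the icon question above, the replacement API is -[NSWorkspace iconForContentType:]; which UTType reproduces UnknownFSObjectIcon exactly is the open part, so the generic types below are an assumption to compare visually, not a confirmed answer.

#import <UniformTypeIdentifiers/UniformTypeIdentifiers.h>

// Candidate generic types — verify visually against UnknownFSObjectIcon.icns.
NSImage *itemIcon = [[NSWorkspace sharedWorkspace] iconForContentType:UTTypeItem];
NSImage *dataIcon = [[NSWorkspace sharedWorkspace] iconForContentType:UTTypeData];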
Objective-C: instantiating a Class object
My company wants to ensure that if my Objective-C to Swift conversions fail in any way, the app can revert to using the older Objective-C code. By using a remotely controllable flag, the app can switch which code runs, as both are compiled into the app. Essentially, I create a protocol that describes the original class, then both classes (with an "s" or "o" prepended to them) conform to the protocol.

Protocol: Object
Objective-C class: oObject
Swift class: sObject

That said, I hit one issue that I just can't seem to reason out. I create an Objective-C function that returns the appropriate class:

Class<Object> classObject(void) {
    if (myFlag) {
        return [sObject class];
    } else {
        return [oObject class];
    }
}

Swift deals with this really well - I can create an initialized object using:

let object = classObject().init()

but I cannot find a way to do this in Objective-C:

Object *object = [[classObject() alloc] init];

fails with "No known class method for selector 'alloc'". Is there a way to do this? David

PS: my workaround is to return an allocated object:

Object *createObject(void) {
    if (myFlag) {
        return [sObject alloc];
    } else {
        return [oObject alloc];
    }
}
4
0
398
3w
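For the Class-instantiation question above: +alloc is declared on NSObject, not on the Object protocol, so a value typed Class<Object> only exposes the protocol's class methods. Assigning the result to a plain Class variable (or casting it) lets the compiler see NSObject's class methods again. A sketch, assuming sObject and oObject both ultimately inherit from NSObject so alloc/init are available at runtime:

// classObject() is declared as in the post; alloc/init become visible
// once the value is typed as a plain Class.
Class cls = (Class)classObject();
id<Object> object = [[cls alloc] init];

// Or inline, with a cast:
id<Object> another = [[(Class)classObject() alloc] init];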
SwiftUI fileImporter inside UIHostingController
I'm working on an old iOS app that started with Objective-C + UIKit and has been migrated to Swift + SwiftUI. Currently its code is mostly Swift + SwiftUI, but it still has some Objective-C and some UIKit view controllers. One of the SwiftUI views uses fileImporter to open the Files app and select a file from the device. This worked well until iOS 18 was released. With iOS 18 the file picker does not launch correctly and is frozen in every simulator (the only real device I could test with iOS 18 seemed to work correctly). I managed to clone my project and reduce it to the minimal number of files that reproduce this error. This is the code:

AppDelegate.h

#import <UIKit/UIKit.h>

@interface AppDelegate : UIResponder <UIApplicationDelegate> {}
@property (strong, nonatomic) UIWindow *window;
@end

AppDelegate.m

#import "AppDelegate.h"
#import "MyApp-Swift.h"

@interface AppDelegate ()
@end

@implementation AppDelegate

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    self.window.backgroundColor = [UIColor whiteColor];
    [self.window makeKeyAndVisible];
    FirstViewBuilder *viewBuilder = [[FirstViewBuilder alloc] init];
    [viewBuilder show];
    return YES;
}

@end

FirstViewBuilder.swift

import SwiftUI

@objc class FirstViewBuilder: NSObject {
    private var view: UIHostingController<FirstView>

    @objc override init() {
        self.view = MyHostingController(rootView: FirstView())
    }

    @objc func show() {
        let app = UIApplication.shared.delegate as? AppDelegate
        let window = app?.window
        window?.backgroundColor = .white
        // Use navigationController or view directly depending on use
        window?.rootViewController = view
    }
}

FirstView.swift

import SwiftUI

struct FirstView: View {
    @State var hasToOpenFilesApp = false

    var body: some View {
        VStack(alignment: .leading, spacing: 0) {
            Button("Open Files app") {
                hasToOpenFilesApp = true
            }.fileImporter(isPresented: $hasToOpenFilesApp, allowedContentTypes: [.text]) { result in
                switch result {
                case .success(let url):
                    print(url.debugDescription)
                case .failure(let error):
                    print(error.localizedDescription)
                }
            }
        }
    }
}

And finally, MyHostingController

import SwiftUI

class MyHostingController<Content>: UIHostingController<Content> where Content: View {
    override init(rootView: Content) {
        super.init(rootView: rootView)
    }

    @objc required dynamic init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        navigationItem.hidesBackButton = true
    }
}

Launching this in an iPhone 13 Pro (18.2) simulator, I click on "Open Files app", it takes 2 seconds to open, and it opens full screen (not like a modal). Buttons at the top are behind the status bar and buttons at the bottom are behind the Home indicator. Worse, the user can't interact with this view; it's frozen. I created a fresh SwiftUI project with just this one view and the file import worked as expected, so I thought the problem was due to embedding the SwiftUI view inside the UIHostingController. So I made these modifications to the minimal project: remove the files AppDelegate, FirstViewBuilder and MyHostingController, and create this SwiftUI App file:

import SwiftUI

@main
struct MyApp: App {
    var body: some Scene {
        WindowGroup {
            FirstView()
        }
    }
}

And again the same problem with iOS 18. But if I launch this exact project in an iPhone 13 Pro (17.4) simulator and open the Files app (now it opens almost instantly), it works OK, shows the file picker as a modal as expected, and I can interact with it and select files. The last thing I tried was removing LaunchScreen.xib from my project and the "Launch screen interface file base name" key from my Info.plist, but the problem keeps happening. I guess it must be due to my (very old) project configuration, but I have no more ideas of where to look. Having a fresh SwiftUI project and "moving" the old project into it could take me several weeks, so I'm discarding that for the moment. Could I use another method to select files from SwiftUI views with iOS 18?
2
0
372
Feb ’25
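One alternative for the question above, since the project already mixes UIKit and SwiftUI: present a UIDocumentPickerViewController directly from UIKit code (for example from the hosting controller) instead of relying on fileImporter. A hedged Objective-C sketch; whether it avoids the iOS 18 freeze in this particular project is untested.

#import <UIKit/UIKit.h>
#import <UniformTypeIdentifiers/UniformTypeIdentifiers.h>

// In a view controller that declares <UIDocumentPickerDelegate>.
- (void)openFilesApp {
    UIDocumentPickerViewController *picker =
        [[UIDocumentPickerViewController alloc] initForOpeningContentTypes:@[ UTTypeText ]];
    picker.delegate = self;
    [self presentViewController:picker animated:YES completion:nil];
}

- (void)documentPicker:(UIDocumentPickerViewController *)controller
    didPickDocumentsAtURLs:(NSArray<NSURL *> *)urls {
    NSLog(@"Picked: %@", urls.firstObject);
}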
How to validate a property list has the right structure
I need to read data from the user. For convenience, the data will be in a property list, so it's easy to get a dictionary containing the property list data. But, since it's coming from outside, I need to validate that the data is in the required format, i.e. it has the right keys and the right sort of data for each key, e.g. <name> has a string, <keys> has an array of appropriate values. Since this is part of a long-established product, and targets 10.13, I want to do this in Objective-C if possible. I've been working mostly with Swift in recent years, so I've forgotten a lot of what I used to know about Objective-C, I'm sure. My first thought was to obtain the value for each key and check the class type with isa, but I see that's deprecated in macOS 13 with no replacement. I don't see another way to check the class. I'm sure other people have solved the same problem, but my searches have not turned up any answers.
3
0
311
4w
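For the property-list validation question above: the usual Objective-C replacement for poking at isa is isKindOfClass:, which is not deprecated and works back to 10.13. A minimal sketch, with the key names and expected types chosen as illustrative assumptions:

// Validate that dict has a string under @"name" and an array of strings
// under @"keys". Returns NO (with an explanation) on the first mismatch.
static BOOL ValidatePlistDictionary(NSDictionary *dict, NSString **errorMessage) {
    id name = dict[@"name"];
    if (![name isKindOfClass:[NSString class]]) {
        if (errorMessage) *errorMessage = @"'name' must be a string";
        return NO;
    }
    id keys = dict[@"keys"];
    if (![keys isKindOfClass:[NSArray class]]) {
        if (errorMessage) *errorMessage = @"'keys' must be an array";
        return NO;
    }
    for (id element in (NSArray *)keys) {
        if (![element isKindOfClass:[NSString class]]) {
            if (errorMessage) *errorMessage = @"'keys' must contain only strings";
            return NO;
        }
    }
    return YES;
}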
How to Implement Screen Mirroring in iOS for Google TV?
I am developing an iOS application that supports screen mirroring to Google TV (or Chromecast with Google TV). My goal is to mirror the iPhone/iPad screen in real time to a Google TV device. What I Have Tried So Far I have explored multiple approaches but haven't found a direct way to achieve low-latency screen mirroring. Here are some of my findings: Google Cast SDK: Google Cast SDK is primarily designed for casting media (videos, images, audio) rather than real-time mirroring. It supports custom receiver applications, but there are no direct APIs for full screen mirroring. Casting a recorded video is possible, but it introduces latency and is not real-time. ReplayKit for Screen Capture: RPScreenRecorder.shared().startCapture(handler: ...) allows capturing the iPhone screen as a video stream. However, sending this stream to Google TV in real time is a challenge. I could potentially encode the video as HLS and stream it, but the delay is significant. RTSP/UDP Streaming: Some third-party libraries support RTSP/UDP streaming for real-time screen sharing. Google TV does not natively support RTSP, making this approach difficult. My Questions: Is it possible to achieve real-time screen mirroring on Google TV using Google Cast SDK? Does Google TV support WebRTC or any low-latency streaming protocol that can be used from iOS? Are there any alternative approaches to mirror an iOS screen to Google TV with minimal latency? I would appreciate any guidance, code examples, or references to relevant documentation.
0
1
365
Feb ’25
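The ReplayKit capture step mentioned in the post above looks like this in Objective-C; it only captures sample buffers in-process and says nothing about getting them to a Google TV with low latency — the encoding and transport parts remain the open question.

#import <ReplayKit/ReplayKit.h>

// Start in-process screen capture; each video/audio sample arrives as a
// CMSampleBuffer that would still need to be encoded and streamed.
[[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef sampleBuffer,
                                                             RPSampleBufferType bufferType,
                                                             NSError * _Nullable error) {
    if (bufferType == RPSampleBufferTypeVideo && error == nil) {
        // Hand the buffer to an encoder (e.g. VideoToolbox) here.
    }
} completionHandler:^(NSError * _Nullable error) {
    if (error) {
        NSLog(@"Screen capture failed to start: %@", error);
    }
}];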
Confusion About Objective-C's Memory Management (Cocoa)
Hello everyone, there is one thing about Objective-C's memory management that confuses me: the lifetime of an object returned from a method whose name doesn't start with "alloc", "new", "copy", or "mutableCopy". Take this as an example: when using NSBitmapImageRep's representationUsingType:properties: method, it returns an NSData object (reference: https://developer.apple.com/documentation/appkit/nsbitmapimagerep/representation(using:properties:)?language=objc). While testing this out, the NSData seemed to behave like an owned object (it doesn't get released until the end of the program). From what I understand, it may be an autoreleased object, which is released at the end of an autorelease pool block. Could someone explain this in more detail? What if I want to release that NSData object before the end of the autorelease pool block? How can I know which object is autoreleased, borrowed, or owned?
3
0
522
Jan ’25
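For the autorelease question above: under the naming convention the poster describes, a method not named alloc/new/copy/mutableCopy returns an object the caller does not own; in practice it is often autoreleased, so it lives until the innermost autorelease pool drains. To reclaim it earlier, wrap the work in a local pool. A small sketch, assuming ARC and an existing NSBitmapImageRep named rep:

@autoreleasepool {
    // Any autoreleased temporaries created here, including the returned
    // NSData if it was autoreleased, are released when this pool drains.
    NSData *pngData = [rep representationUsingType:NSBitmapImageFileTypePNG
                                        properties:@{}];
    [pngData writeToFile:@"/tmp/out.png" atomically:YES];
}   // pngData can be reclaimed here rather than at the outer pool's end.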
Populating Now Playing with Objective-C
Hello. I am attempting to display the music inside of my app in Now Playing. I've tried a few different methods and keep running into unknown issues. I'm new to Objective-C and Apple development so I'm at a loss of how to continue. Currently, I have an external call to viewDidLoad upon initialization. Then, when I'm ready to play the music, I call playMusic. I have it hardcoded to play an mp3 called "1". I believe I have all the signing set up as the music plays after I exit the app. However, there is nothing in Now Playing. There are no errors or issues that I can see while the app is running. This is the only file I have in Xcode relating to this feature. Please let me know where I'm going wrong or if there is another object I need to use!

#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#import <MediaPlayer/MediaPlayer.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController <AVAudioPlayerDelegate>
@property (nonatomic, strong) AVPlayer *player;
@property (nonatomic, strong) MPRemoteCommandCenter *commandCenter;
@property (nonatomic, strong) MPMusicPlayerController *controller;
@property (nonatomic, strong) MPNowPlayingSession *nowPlayingSession;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    NSLog(@"viewDidLoad started.");
    [self setupAudioSession];
    [self initializePlayer];
    [self createNowPlayingSession];
    [self configureNowPlayingInfo];
    NSLog(@"viewDidLoad completed.");
}

- (void)setupAudioSession {
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    NSError *setCategoryError = nil;
    if (![audioSession setCategory:AVAudioSessionCategoryPlayback error:&setCategoryError]) {
        NSLog(@"Error setting category: %@", [setCategoryError localizedDescription]);
    } else {
        NSLog(@"Audio session category set.");
    }
    NSError *activationError = nil;
    if (![audioSession setActive:YES error:&activationError]) {
        NSLog(@"Error activating audio session: %@", [activationError localizedDescription]);
    } else {
        NSLog(@"Audio session activated.");
    }
}

- (void)initializePlayer {
    NSString *soundFilePath = [NSString stringWithFormat:@"%@/base/game/%@",
                               [[NSBundle mainBundle] resourcePath], @"bgm/1.mp3"];
    if (!soundFilePath) {
        NSLog(@"Audio file not found.");
        return;
    }
    NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
    self.player = [AVPlayer playerWithURL:soundFileURL];
    NSLog(@"Player initialized with URL: %@", soundFileURL);
}

- (void)createNowPlayingSession {
    self.nowPlayingSession = [[MPNowPlayingSession alloc] initWithPlayers:@[self.player]];
    NSLog(@"Now Playing Session created with players: %@", self.nowPlayingSession.players);
}

- (void)configureNowPlayingInfo {
    MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
    CMTime duration = self.player.currentItem.duration;
    Float64 durationSeconds = CMTimeGetSeconds(duration);
    CMTime currentTime = self.player.currentTime;
    Float64 currentTimeSeconds = CMTimeGetSeconds(currentTime);
    NSDictionary *nowPlayingInfo = @{
        MPMediaItemPropertyTitle: @"Example Title",
        MPMediaItemPropertyArtist: @"Example Artist",
        MPMediaItemPropertyPlaybackDuration: @(durationSeconds),
        MPNowPlayingInfoPropertyElapsedPlaybackTime: @(currentTimeSeconds),
        MPNowPlayingInfoPropertyPlaybackRate: @(self.player.rate)
    };
    infoCenter.nowPlayingInfo = nowPlayingInfo;
    NSLog(@"Now Playing info configured: %@", nowPlayingInfo);
}

- (void)playMusic {
    [self.player play];
    [self createNowPlayingSession];
    [self configureNowPlayingInfo];
}

- (void)pauseMusic {
    [self.player pause];
    [self configureNowPlayingInfo];
}

@end
2
0
468
Feb ’25
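One thing the code in the post above never does is register any remote command handlers; in practice the system Now Playing UI generally won't attach to an app that publishes nowPlayingInfo but handles no commands. A hedged sketch of that usual extra step (it may not be the only issue here):

- (void)setupRemoteCommands {
    MPRemoteCommandCenter *center = [MPRemoteCommandCenter sharedCommandCenter];

    [center.playCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
        [self playMusic];
        return MPRemoteCommandHandlerStatusSuccess;
    }];

    [center.pauseCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
        [self pauseMusic];
        return MPRemoteCommandHandlerStatusSuccess;
    }];
}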
macOS dark mode while window is in background
Hi, I detect dark mode on macOS as follows:

NSAppearance *appearance = NSApp.mainWindow.effectiveAppearance;
NSString *interface_style = appearance.name;
NSAppearanceName basicAppearance = [appearance bestMatchFromAppearancesWithNames:@[
    NSAppearanceNameAqua,
    NSAppearanceNameDarkAqua
]];
if([basicAppearance isEqualToString:NSAppearanceNameDarkAqua]){
    theme = "Adwaita:dark";
    dark_mode = TRUE;
}
if([interface_style isEqualToString:NSAppearanceNameDarkAqua]){
    theme = "Adwaita:dark";
    dark_mode = TRUE;
}else if([interface_style isEqualToString:NSAppearanceNameVibrantDark]){
    theme = "Adwaita:dark";
    dark_mode = TRUE;
}else if([interface_style isEqualToString:NSAppearanceNameAccessibilityHighContrastAqua]){
    theme = "HighContrast";
}else if([interface_style isEqualToString:NSAppearanceNameAccessibilityHighContrastDarkAqua]){
    theme = "HighContrast:dark";
    dark_mode = TRUE;
}else if([interface_style isEqualToString:NSAppearanceNameAccessibilityHighContrastVibrantDark]){
    theme = "HighContrast:dark";
    dark_mode = TRUE;
}

But this doesn't work if my window is in the background. When the application window goes into the background, it loses dark mode. How do I fix it? Regards, Joël
2
0
456
Jan ’25
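A likely reason for the behaviour above is that NSApp.mainWindow can be nil while the app is inactive or in the background, so the appearance read from it no longer reflects the user's setting. Reading the application-wide appearance avoids depending on any particular window; a sketch:

// Application-wide appearance, independent of which window is key/main.
NSAppearance *appearance = NSApp.effectiveAppearance;
NSAppearanceName match = [appearance bestMatchFromAppearancesWithNames:@[
    NSAppearanceNameAqua,
    NSAppearanceNameDarkAqua
]];
BOOL darkMode = [match isEqualToString:NSAppearanceNameDarkAqua];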
Cannot mimic fullscreen behavior when using custom event loop
Hi, we are developing a cross-platform library for creating desktop applications in C++: https://github.com/aseprite/laf For this reason, on macOS we cannot rely on the default NSApplication.run() event loop, so we decided to implement our own event loop using the nextEventMatchingMask method. Then, when a window is in fullscreen mode, for some reason the window stops receiving mouseMove events when the mouse pointer enters an area at the top of the window. You can see this issue in action by trying the following example project: https://github.com/martincapello/custom-event-loop-issue This project just opens one window and uses a custom event loop; it displays the current mouse position at every mouseMove event received, and when the aforementioned area is entered it suddenly stops updating. There is also a video showing how to reproduce it. I was able to see that when the position stops updating, we still receive mouseMove events, but for a different window: a borderless window that is added to the NSApplication.windows collection when switching to fullscreen, and which seems to be taking the mouseMove events before they reach the main window. Also, this issue doesn't happen when using the default NSApplication.run method, despite the borderless window being added as well.
8
0
551
Jan ’25
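For reference on the post above, a bare-bones custom AppKit event loop of the kind described usually looks like the sketch below; the event's window can legitimately differ from the app's main window in fullscreen (for example the system's titlebar-reveal window near the top edge), which is consistent with the observation. This is illustrative only, not the laf implementation.

// Minimal manual event loop (replacement for [NSApp run]).
[NSApp finishLaunching];
for (;;) {
    @autoreleasepool {
        NSEvent *event = [NSApp nextEventMatchingMask:NSEventMaskAny
                                            untilDate:[NSDate distantFuture]
                                               inMode:NSDefaultRunLoopMode
                                              dequeue:YES];
        if (event == nil) {
            continue;
        }
        // event.window may not be the application's main window here.
        [NSApp sendEvent:event];
        [NSApp updateWindows];
    }
}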
Swift/objC combined with Swift/C++ interop
Consider this Swift struct:

public struct Example {
    public func foo(callback: () -> Void) { .... }
    public func blah(i: Int) { .... }
    ....
}

Using Swift/C++ interop, I can create Example objects and call methods like blah. But I can't call foo because Swift/C++ interop doesn't currently support passing closures (right?). On the other hand, Swift/ObjC does support passing ObjC blocks to Swift functions. But I can't use that here because Example is a Swift struct, not a class. So I could change it to a class, and update everything to work with reference rather than value semantics; but then I also have to change the ObjC++ code to create the object and call its methods using ObjC syntax. I'd like to avoid that. Is there some hack that I can use to make this possible? I'm hoping that I can wrap a C++ std::function in some sort of opaque wrapper and pass that to Swift, or something. Thanks for any suggestions!
1
0
581
Jan ’25
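One partial answer to the question above: an Objective-C++ translation unit can capture a std::function inside an Objective-C block, which gives you the "opaque wrapper" half of the idea. The remaining requirement is that something @objc-visible on the Swift side (a small wrapper class or function) receives the block, since the Example struct itself cannot be exposed to ObjC. A sketch of the wrapping part only:

// Wrapper.mm (Objective-C++)
#include <functional>

typedef void (^VoidBlock)(void);

// The block captures the std::function by value; copying the block to the
// heap copies the callable with it, so it outlives the current scope.
static VoidBlock BlockFromFunction(std::function<void()> fn) {
    return [^{ fn(); } copy];
}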
What's the AppKit equivalent of SwiftUI's NavigationSplitView?
How do I implement the same navigation split view with a sidebar in AppKit? Basically I have this code:

import SwiftUI

struct ContentView: View {
    var body: some View {
        NavigationSplitView {
            // Sidebar
            List {
                NavigationLink("Item 1", value: "Item 1 Details")
                NavigationLink("Item 2", value: "Item 2 Details")
                NavigationLink("Item 3", value: "Item 3 Details")
            }
            .navigationTitle("Items")
        } content: {
            // Main content (detail view for selected item)
            Text("Select an item to see details.")
                .padding()
        } detail: {
            // Detail view (for the selected item)
            Text("Select an item from the sidebar to view details.")
                .padding()
        }
    }
}

struct MyApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

and want to somehow convert it to AppKit. I tried to use an NSSplitViewController, but I still don't have that sidebar and the button to collapse it. How do I go about this?
2
0
580
Dec ’24
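For the question above, the AppKit counterpart is NSSplitViewController with a sidebar-style NSSplitViewItem: the sidebar item gets the source-list material and the system collapse behaviour, and a toolbar button or menu item can call toggleSidebar:. A minimal Objective-C sketch, with the three child view controllers assumed to exist already:

NSSplitViewController *splitVC = [[NSSplitViewController alloc] init];

// Sidebar: gets source-list styling and can be collapsed by the system.
NSSplitViewItem *sidebarItem =
    [NSSplitViewItem sidebarWithViewController:sidebarViewController];
[splitVC addSplitViewItem:sidebarItem];

// Content and detail columns.
[splitVC addSplitViewItem:
    [NSSplitViewItem contentListWithViewController:contentViewController]];
[splitVC addSplitViewItem:
    [NSSplitViewItem splitViewItemWithViewController:detailViewController]];

// Wire a toolbar item or menu item to collapse/expand the sidebar:
// [splitVC toggleSidebar:sender];

On macOS 11 and later, NSTrackingSeparatorToolbarItem can additionally be used to align toolbar items with the split, which gets closer to the full SwiftUI look.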
FirstResponderView/FirstResponderIndexPath in TableView
I found that when I put a webView on the screen and then remove it, several properties of the table view, including firstResponderView, firstResponderIndexPath, and firstResponderViewType, change. These properties are hidden (private) and I cannot change them. firstResponderView strongly holds my cell, with the result that didEndDisplayingCell is not called as expected when the cell slides out of the screen. What should I do to prevent firstResponderView from strongly holding my cell, or how can I get it released?
1
0
306
Dec ’24
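A hedged workaround for the post above: before removing the web view, explicitly resign the current first responder inside the table view so UIKit drops its internal reference; whether this releases the private firstResponderView in every case is not guaranteed.

// Before removing the web view / cell content, make sure nothing inside
// the table view hierarchy is still the first responder.
[self.tableView endEditing:YES];   // resigns whatever is currently editing
// or, if the responder view is known directly:
// [someResponderView resignFirstResponder];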
AppleScriptLauncher Menu App
I'm primarily an AppleScript writer and only a novice programmer, using ChatGPT to help me with the legwork. It has helped me write a functioning app that builds a menu structure based on the scripts I have in the Scripts directory used by the script menu, and then runs the AppleScripts. When I distribute the app to my desktop and run it, the scripts that access other apps, like InDesign, cause that app to launch but not actually do anything. I included the IDs for each app in the entitlements dictionary and have given the app Full Disk Access in System Settings, but it's not functioning as I'd expect. I know there are apps like Alfred that allow you to run scripts from a keystroke, but I'm building this for others I work with so they can also access info about each script, what it does, and how to use it from the menu, as well as key commands to run them. Not sure what else to say, but if this sounds like a simple fix to anyone, please let me know.
0
0
413
Dec ’24
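For the script-launcher post above, the usual culprits for "the target app launches but nothing happens" are the Apple Events permission pieces rather than Full Disk Access: under the hardened runtime the app generally needs the automation entitlement plus an Info.plist usage string so the user gets the consent prompt, and a sandboxed app additionally needs scripting-target exceptions. This is a hedged checklist, not a confirmed diagnosis of this particular app.

<!-- Entitlements file -->
<key>com.apple.security.automation.apple-events</key>
<true/>

<!-- Info.plist -->
<key>NSAppleEventsUsageDescription</key>
<string>This app runs AppleScripts that control other applications such as InDesign.</string>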
[UIViewController willMoveToParentViewController:] provides an incorrect navStack when popping a view controller in iPadOS 18.2
The Apple documentation for [UIViewController willMoveToParentViewController:] states, Called just before the view controller is added or removed from a container view controller. Hence, the view controller being popped should appear at the top of the navStack when willMoveToParentViewController: is called. In iPadOS 16 and 17 the view controller being popped appears at the top of the navStack when willMoveToParentViewController: is called. In iPadOS 18 the view controller being popped has already been (incorrectly) removed from the navStack. We confirmed this bug using iPadOS 18.2 as well as 18.0. Our app relies upon the contents of the navStack within willMoveToParentViewController: to track the user's location within a folder hierarchy. Our app works smoothly on iPadOS 16 and 17. Conversely, our customers running iPadOS 18 have lost client data and corrupted their data folders as a result of this bug! These customers are angry -- not surprisingly, some have asked for refunds and submitted negative app reviews. Why doesn't willMoveToParentViewController: provide the correct navStack in iPadOS 18.2?
1
0
567
Dec ’24
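A possible mitigation for the post above while the behaviour differs across iPadOS versions: instead of reading the navigation stack inside willMoveToParentViewController:, detect the pop from the view controller itself, where the answer does not depend on whether it has already been removed from the stack. A sketch of one common pattern; it is a workaround suggestion, not Apple's stated fix, and the hook method name is hypothetical.

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    // True when this controller is being removed (popped) from its parent,
    // regardless of whether the navStack still contains it at this point.
    if (self.isMovingFromParentViewController) {
        [self noteUserLeftThisFolder];   // hypothetical app-specific hook
    }
}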