Objective-C


Objective-C is a programming language for writing iOS, iPadOS, and macOS apps.

Posts under Objective-C tag

106 Posts
Each post below is followed by its Replies, Boosts, Views, and Activity counts.

When referencing the value of an instance variable of type NSString, it may sometimes appear as <__NSMallocBlock__: 0x106452d60>.
When the app launches, it assigns the correct string stored in NSUserDefaults to the instance variable (serverAddress) of type NSString.
Afterwards, when the app transitions to a suspended state and receives a VoIP push notification approximately 4 hours later, causing the app to enter the background state, referencing serverAddress may show <__NSMallocBlock__: 0x106452d60> even though it has not been explicitly deallocated. This does not occur every time. The specific class definition is as follows:

@implementation NewPAPEOPLEClient
{
    NSString* serverAddress;
    NSString* apiKey;
    dispatch_semaphore_t lock;
    NSMutableArray* errCallID;
}

#define PREF_STR(key)  [[NSUserDefaults standardUserDefaults] stringForKey:(key)]

The instance variable (serverAddress) is set with the following value when the application starts:

serverAddress = PREF_STR(@"APP_CONTACT_SERVICE_SERVER_ADDRESS")

Based on the above, please answer the following questions:
(1) If the instance variable's value becomes <__NSMallocBlock__: 0x106452d60>, does that mean its memory has been released?
(2) Can the memory of an instance variable be released automatically without being explicitly released? Can an NSString instance variable that is not autoreleased be released automatically?
(3) How can I prevent the memory from being released automatically? Will using retain prevent automatic release?
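A minimal sketch of the retain pattern asked about in (3), assuming manual reference counting (the snippet is illustrative, not from the original post): by the ownership conventions, -stringForKey: returns an object the caller does not own, so under MRC an ivar that merely points at it can dangle once the enclosing autorelease pool drains, and the reused memory may later hold an unrelated object such as a heap-allocated block.

    // Sketch (MRC assumed): take ownership when assigning the ivar...
    serverAddress = [[[NSUserDefaults standardUserDefaults]
                        stringForKey:@"APP_CONTACT_SERVICE_SERVER_ADDRESS"] retain];

    // ...and balance it in dealloc.
    - (void)dealloc
    {
        [serverAddress release];
        [super dealloc];
    }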
1
0
135
1d
How to get a PID from an AudioObjectID on macOS pre-Sonoma
I am working on an application to detect when an input audio device is being used. Basically I want to know which application is using the microphone (built-in or external). This app runs on macOS. For Mac versions starting from Sonoma I can use this code:

int getAudioProcessPID(AudioObjectID process)
{
    pid_t pid;

    if (@available(macOS 14.0, *)) {
        constexpr AudioObjectPropertyAddress prop {
            kAudioProcessPropertyPID,
            kAudioObjectPropertyScopeGlobal,
            kAudioObjectPropertyElementMain
        };
        UInt32 dataSize = sizeof(pid);
        OSStatus error = AudioObjectGetPropertyData(process, &prop, 0, nullptr, &dataSize, &pid);

        if (error != noErr) {
            return -1;
        }
    } else {
        // Pre-Sonoma code goes here
    }

    return pid;
}

which works. However, kAudioProcessPropertyPID was added in the macOS 14.0 SDK. Does anyone know how to achieve the same functionality on previous versions?
1
0
195
14h
How to Handle App Focus When Launching Another App During Secure Event Input
Hi everyone, I'm encountering a behaviour related to Secure Event Input on macOS and wanted to understand if it's expected or if there's a recommended approach to handle it.

Scenario: App A enables secure input using EnableSecureEventInput(). While secure input is active, App A launches App B using NSWorkspace or similar. App B launches successfully, but it does not receive foreground focus; it starts in the background. The system retains focus on App A, seemingly to preserve the secure input session.

Observed behaviour: From what I understand, macOS prevents app switching during Secure Event Input to avoid accidental or malicious focus stealing (e.g., to prevent UI spoofing during password entry). So input focus remains locked to App A, and App B runs but cannot become frontmost unless the secure input session ends or App B is brought to the front by manual intervention or by running a terminal script. This is consistent with system security behaviour, but it presents a challenge when App A needs to launch and hand off control to another app.

Questions: Is this behaviour officially documented anywhere? Is there a recommended pattern for safely transferring focus to another app while Secure Event Input is active? Would calling DisableSecureEventInput() just before launching App B be the appropriate (and safe) solution, or is there a better way to defer the transition? Thanks in advance for any clarification or advice.
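One possible shape for the hand-off in the last question, sketched under the assumption that ending the secure session before launching is acceptable (the method name and launch details are illustrative, not from the post):

    #import <Carbon/Carbon.h>
    #import <AppKit/AppKit.h>

    // Hypothetical hand-off: balance the earlier EnableSecureEventInput()
    // call, then launch App B and ask the system to bring it frontmost.
    - (void)handOffToAppAtURL:(NSURL *)appBURL
    {
        if (IsSecureEventInputEnabled()) {
            DisableSecureEventInput();
        }

        NSWorkspaceOpenConfiguration *config = [NSWorkspaceOpenConfiguration configuration];
        config.activates = YES; // request foreground focus for App B

        [[NSWorkspace sharedWorkspace] openApplicationAtURL:appBURL
                                              configuration:config
                                          completionHandler:nil];
    }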
0
0
24
3d
Compatibility Issues with Main Event Loop in macOS 26 Beta
I have a standard Mac project created using Xcode templates with Storyboard. I manually removed the window from the Storyboard file and use ViewController.h as the base class for other ViewControllers. I create new windows and ViewControllers programmatically, implementing the UI entirely through code.

However, I've discovered that on all macOS versions prior to the macOS 26 beta, the application would trigger automatic termination due to App Nap under certain circumstances. The specific steps are:

1. Close the application window (the application doesn't exit immediately at this point).
2. Click on other windows or the desktop, causing the application to lose focus (though this description might not be entirely accurate; the app might have already lost focus when the window closed, but the top menu bar still shows the current application immediately after window closure).
3. The application automatically terminates.

In previous versions, I worked around this issue by manually executing [[NSApplication sharedApplication] run]; to create an additional main event loop, keeping the application active (although I understand this isn't an ideal approach). However, in the macOS 26 beta, I've found that calling [[NSApplication sharedApplication] run]; causes all NSButton controls I created in the window to become unresponsive to click events, though they still respond to mouse-enter events.

Through recent research, I've learned that the best practice for preventing App Nap automatic termination is to use [[NSProcessInfo processInfo] disableAutomaticTermination:@"User interaction required"]; to increment the automatic termination reference count. My questions are:

1. Why did calling [[NSApplication sharedApplication] run]; work properly on macOS versions prior to 15.6.1?
2. Why does calling [[NSApplication sharedApplication] run]; in the macOS 26 beta cause buttons to become unresponsive to click events?
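For reference, a minimal sketch of the reference-counted NSProcessInfo API mentioned above (the pairing shown is an assumption about usage, not taken from the post):

    // Increment the automatic-termination count while the app must stay alive...
    [[NSProcessInfo processInfo] disableAutomaticTermination:@"User interaction required"];

    // ...and balance it with the same reason string once the work is done.
    [[NSProcessInfo processInfo] enableAutomaticTermination:@"User interaction required"];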
0
1
77
1w
Audio Unit v3 host: v2 third-party plugins
Hi, I have just implemented an Audio Unit v3 host.

AgsAudioUnitPlugin *audio_unit_plugin;
AVAudioUnitComponentManager *audio_unit_component_manager;
NSArray<AVAudioUnitComponent *> *av_component_arr;
AudioComponentDescription description;
guint i, i_stop;

if(!AGS_AUDIO_UNIT_MANAGER(audio_unit_manager)){
    return;
}

audio_unit_component_manager = [AVAudioUnitComponentManager sharedAudioUnitComponentManager];

/* effects */
description = (AudioComponentDescription) {0,};
description.componentType = kAudioUnitType_Effect;

av_component_arr = [audio_unit_component_manager componentsMatchingDescription:description];

i_stop = [av_component_arr count];

for(i = 0; i < i_stop; i++){
    ags_audio_unit_manager_load_component(audio_unit_manager, (gpointer) av_component_arr[i]);
}

/* instruments */
description = (AudioComponentDescription) {0,};
description.componentType = kAudioUnitType_MusicDevice;

av_component_arr = [audio_unit_component_manager componentsMatchingDescription:description];

i_stop = [av_component_arr count];

for(i = 0; i < i_stop; i++){
    ags_audio_unit_manager_load_component(audio_unit_manager, (gpointer) av_component_arr[i]);
}

But this doesn't show me Audio Unit v2 plugins. Why?

regards, Joël
3
0
295
1w
AVAudioUnit host - PCM buffer output silent
Hi, I just started to develop audio unit hosting support in my application. Offline rendering seems to work, except that I hear no output. But why? I suspect something goes wrong with the player. I connect to Core Audio in a different location in the code. Here are some error messages I have faced so far:

2025-08-14 19:42:04.132930+0200 com.gsequencer.GSequencer[34358:18611871] [avae] AVAudioEngineGraph.mm:4668 Can't retrieve source node to play sequence because there is no output node!
2025-08-14 19:42:04.151171+0200 com.gsequencer.GSequencer[34358:18611871] [avae] AVAudioEngineGraph.mm:4668 Can't retrieve source node to play sequence because there is no output node!
2025-08-14 19:43:08.344530+0200 com.gsequencer.GSequencer[34358:18614927] AUAudioUnit.mm:1417 Cannot set maximumFramesToRender while render resources allocated.
2025-08-14 19:43:08.346583+0200 com.gsequencer.GSequencer[34358:18614927] [avae] AVAEInternal.h:104 [AVAudioSequencer.mm:121:-[AVAudioSequencer(AVAudioSequencer_Player) startAndReturnError:]: (impl->Start()): error -10852
** (<unknown>:34358): WARNING **: 19:43:08.346: error during audio sequencer start - -10852

I have implemented an AVAudioEngine based Audio Unit host. Here I instantiate the player and the effect:

/* audio engine */
audio_engine = [[AVAudioEngine alloc] init];

fx_audio_unit_audio->audio_engine = (gpointer) audio_engine;

av_format = (AVAudioFormat *) fx_audio_unit_audio->av_format;

/* av audio player node */
av_audio_player_node = [[AVAudioPlayerNode alloc] init];

/* av audio unit */
av_audio_unit_effect = [[AVAudioUnitEffect alloc] initWithAudioComponentDescription:[((AVAudioUnitComponent *) AGS_AUDIO_UNIT_PLUGIN(base_plugin)->component) audioComponentDescription]];

av_audio_unit = (AVAudioUnit *) av_audio_unit_effect;

fx_audio_unit_audio->av_audio_unit = av_audio_unit;

/* audio sequencer */
av_audio_sequencer = [[AVAudioSequencer alloc] initWithAudioEngine:audio_engine];

fx_audio_unit_audio->av_audio_sequencer = (gpointer) av_audio_sequencer;

/* output node */
[[AVAudioOutputNode alloc] init];

/* audio player and audio unit */
[audio_engine attachNode:av_audio_player_node];
[audio_engine attachNode:av_audio_unit];

[audio_engine connect:av_audio_player_node to:av_audio_unit format:av_format];
[audio_engine connect:av_audio_unit to:[audio_engine outputNode] format:av_format];

ns_error = NULL;
[audio_engine enableManualRenderingMode:AVAudioEngineManualRenderingModeOffline format:av_format maximumFrameCount:buffer_size error:&ns_error];

if(ns_error != NULL && [ns_error code] != noErr){
    g_warning("enable manual rendering mode error - %d", [ns_error code]);
}

ns_error = NULL;
[[av_audio_unit AUAudioUnit] allocateRenderResourcesAndReturnError:&ns_error];

if(ns_error != NULL && [ns_error code] != noErr){
    g_warning("Audio Unit allocate render resources returned error - ErrorCode %d", [ns_error code]);
}

Then I render in a dedicated thread:

ns_error = NULL;
[audio_engine startAndReturnError:&ns_error];

if(ns_error != NULL && [ns_error code] != noErr){
    g_warning("error during audio engine start - %d", [ns_error code]);
}

[av_audio_sequencer prepareToPlay];

ns_error = NULL;
[av_audio_sequencer startAndReturnError:&ns_error];

if(ns_error != NULL && [ns_error code] != noErr){
    g_warning("error during audio sequencer start - %d", [ns_error code]);
}

[av_audio_player_node play];

while(is_running){
    /* pre sync */

    /* IO buffers */
    av_output_buffer = (AVAudioPCMBuffer *) scope_data->av_output_buffer;
    av_input_buffer = (AVAudioPCMBuffer *) scope_data->av_input_buffer;

    /* fill input buffer */

    /* schedule av input buffer */
    frame_position = 0; // (gint64) ((note_offset * absolute_delay) + delay_counter) * buffer_size;

    av_audio_player_node = (AVAudioPlayerNode *) fx_audio_unit_audio->av_audio_player_node;

    AVAudioTime *av_audio_time = [[AVAudioTime alloc] initWithHostTime:frame_position sampleTime:frame_position atRate:((double) samplerate)];

    [av_audio_player_node scheduleBuffer:av_input_buffer atTime:av_audio_time options:0 completionHandler:nil];

    /* render */
    ns_error = NULL;

    status = [audio_engine renderOffline:AGS_FX_AUDIO_UNIT_AUDIO_FIXED_BUFFER_SIZE toBuffer:av_output_buffer error:&ns_error];

    if(ns_error != NULL && [ns_error code] != noErr){
        g_warning("render offline error - %d", [ns_error code]);
    }
}

regards, Joël
3
0
343
1w
hyperthreading with arm64
Hi, I am curious whether hyperthreading is enabled or disabled on my MacBook Pro M1 or M4. How do I figure that out? I am using macOS 15.5. Further, I develop a multi-threaded audio sequencer that creates threads per instrument. I use vector operations to increase performance. I noticed that lowering the synchronization rate from 250 Hz to 60 Hz gives additional performance advantages. How can I programmatically check whether hyperthreading is enabled or disabled, and how can I enable or disable it programmatically? After some research I found sysctl() and nvram SMTDisable=%01. https://support.apple.com/en-us/101870 Can anyone provide me an Objective-C example? regards, Joël
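A minimal sketch of the sysctl() check asked for above (the helper name is illustrative; note this only compares core counts, it does not toggle SMT):

    #import <Foundation/Foundation.h>
    #include <sys/sysctl.h>

    // SMT is effectively active when there are more logical CPUs than
    // physical cores; Apple silicon Macs report the two counts as equal.
    static BOOL isSMTActive(void)
    {
        int physical = 0;
        int logical = 0;
        size_t size = sizeof(int);

        sysctlbyname("hw.physicalcpu", &physical, &size, NULL, 0);

        size = sizeof(int);
        sysctlbyname("hw.logicalcpu", &logical, &size, NULL, 0);

        return (logical > physical);
    }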
2
0
109
Jul ’25
NoobScript: a new (PHP-esque) programming language for Mac/iOS development
Noob Software has been working on a new programming language designed primarily for Mac development, with a syntax similar to PHP and some aspects from JavaScript. To demonstrate this language I've released the code for one of my apps, "Noob Music": https://github.com/noobsoftware/NoobMusic3 I would like to hear if people are interested in using this programming language. I think there are many benefits to the high-level syntax and high-level thinking, and to not having to define datatypes and such. The language also supports multithreading using the "async" keyword prefixed in front of the function keyword, like you would do in JavaScript, only with real multithreading instead of interleaved processing. Pushing to an array is thread-safe, as is some other functionality, and it is recommended to use a new class instance within each async function. I have relied heavily on webviews for UI, so I have developed a layout engine for native view elements in Cocoa, which at this point are primarily WebViews. The webviews can define a callback function to receive messages from JavaScript, so combining these methods you get a full-fledged "web" development feeling for Mac development, with HTML + CSS + JavaScript and NoobScript.
0
0
319
Jul ’25
On macOS Tahoe (26.0), the call sysctlbyname for kern.osproductversion returns 16.0
Objective-C code using the kernel API sysctlbyname for ‘kern.osproductversion’ returns 16.0 instead of 26.0 on macOS Tahoe:

sysctlbyname("kern.osproductversion", version, &size, NULL, 0)

The command ‘sysctl kern.osproductversion’ returns ‘kern.osproductversion: 26.0’ on the same macOS Tahoe installation. Note: the Objective-C code was built using Xcode 16.3.
2
0
208
Jul ’25
Metal Compositor Service & Persona (VisionOS)
Hello, I'm currently trying to make a collaborative app, but it only works in RealityView; when I tried to use CompositorLayer as below, the Personas disappeared.

ImmersiveSpace(id: "ImmersiveSpace-Metal") {
    CompositorLayer(configuration: MetalLayerConfiguration()) { layerRenderer in
        SpatialRenderer_InitAndRun(layerRenderer)
    }
}

Is there any potential solution to see Personas in a Metal view? Thanks in advance!
1
0
645
Jul ’25
Exposing Objective-C API to Swift inside a Framework (Private Framework API)
My framework has private Objective-C API that is only used within the framework. It should not be exposed in the public interface (so it shouldn't be imported in the umbrella header). To expose this API to Swift within the framework only, the documentation seems to indicate that it needs to be imported in the umbrella header:

Import Code Within a Framework Target

To use the Objective-C declarations in files in the same framework target as your Swift code, configure an umbrella header as follows:

1. Under Build Settings, in Packaging, make sure the Defines Module setting for the framework target is set to Yes.
2. In the umbrella header, import every Objective-C header you want to expose to Swift.

Swift sees every header you expose publicly in your umbrella header. The contents of the Objective-C files in that framework are automatically available from any Swift file within that framework target, with no import statements. Use classes and other declarations from your Objective-C code with the same Swift syntax you use for system classes.

I would imagine that there must be a way to do this?
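One approach sometimes used for this (an assumption here, not something the documentation quoted above confirms) is a private module map alongside the framework's public module, so Swift inside the target can import the private headers without adding them to the umbrella header:

    // Hypothetical Sample.private.modulemap; "PrivateAPI.h" and the
    // module name are placeholders for this framework's private headers.
    framework module Sample_Private {
        header "PrivateAPI.h"
        export *
    }

Swift files in the same target would then use import Sample_Private instead of relying on the umbrella header; whether this fits the project's build setup would need verifying.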
0
0
252
Jul ’25
"illegal character encoding in string literal" warnings in Xcode
Good day! I have a long-term project ported all the way up from old Think C through many versions of Xcode. Its source files are encoded in "Western (Mac OS Roman)". Some of my error messages have characters outside the straight ASCII character set (e.g. "å"). The editor correctly displays these, but I get plenty of "illegal character encoding" warnings and the messages do not display properly. I imagine there's a way to have separate files of localized text for internationalized applications, but I am the only end user of this application, and it used to just plain work in earlier Xcode versions. Furthermore, there must be developers throughout Europe who use such characters in string literals, just typing in their native languages, straight off their keyboards. I was thinking that there must be a Clang setting or something, but have been unable to find it, and an internet search turns up no solution except to cumbersomely escape each individual character. I can't imagine that a French programmer does that every time they want to type "è", "é", or "à"! Any help? (Disclaimer: I'm an English speaker and only use such characters whimsically, but want to keep them for legacy's sake.) Thanks.... p.s. using Xcode 15.3, and under Settings > Text Editing > Editing, "Western (Mac OS Roman)" is already selected as the default text encoding with "Convert existing files on save" checked.
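For reference, a sketch of the two usual workarounds (assumptions about this project, not a confirmed fix): re-save the sources as UTF-8, which Clang expects, or spell the character as a universal character name in the literal, which is the per-character escaping the post calls cumbersome:

    // Clang assumes UTF-8 source; a Mac OS Roman "å" byte (0x8C) inside a
    // literal is what triggers the "illegal character encoding" warning.
    NSString *escaped = @"\u00e5";   // \u00e5 is "å" by Unicode code point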
3
0
140
Jun ’25
How to capture audio from the stream that's playing on the speakers?
Good day, ladies and gents. I have an application that reads audio from the microphone. I'd like it to also be able to read from the Mac's audio output stream. (A bonus would be if it could detect when the Mac is playing music.) I'd eventually be able to figure it out reading docs, but if someone can give a hint, I'd be very grateful, and would owe you the libation of your choice. Here's the code used to set up the AudioUnit:

-(NSString*) configureAU
{
    AudioComponent component = NULL;
    AudioComponentDescription description;
    OSStatus err = noErr;
    UInt32 param;
    AURenderCallbackStruct callback;

    if( audioUnit ) {
        AudioComponentInstanceDispose( audioUnit ); // was CloseComponent
        audioUnit = NULL;
    }

    // Open the AudioOutputUnit
    description.componentType = kAudioUnitType_Output;
    description.componentSubType = kAudioUnitSubType_HALOutput;
    description.componentManufacturer = kAudioUnitManufacturer_Apple;
    description.componentFlags = 0;
    description.componentFlagsMask = 0;

    if( component = AudioComponentFindNext( NULL, &description ) ) {
        err = AudioComponentInstanceNew( component, &audioUnit );
        if( err != noErr ) {
            audioUnit = NULL;
            return [NSString stringWithFormat: @"Couldn't open AudioUnit component (ID=%d)", err];
        }
    }

    // Configure the AudioOutputUnit:
    // You must enable the Audio Unit (AUHAL) for input and output for the same device.
    // When using AudioUnitSetProperty the 4th parameter refers to an AudioUnitElement.
    // When using an AudioOutputUnit for input the element will be '1' and the output element will be '0'.
    param = 1; // Enable input on the AUHAL
    err = AudioUnitSetProperty( audioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Input, 1, &param, sizeof(UInt32) );
    chkerr("Couldn't set first EnableIO prop (enable input) (ID=%d)");

    param = 0; // Disable output on the AUHAL
    err = AudioUnitSetProperty( audioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Output, 0, &param, sizeof(UInt32) );
    chkerr("Couldn't set second EnableIO property on the audio unit (disable output) (ID=%d)");

    param = sizeof(AudioDeviceID); // Select the default input device
    AudioObjectPropertyAddress OutputAddr = { kAudioHardwarePropertyDefaultInputDevice, kAudioObjectPropertyScopeGlobal, kAudioObjectPropertyElementMaster };
    err = AudioObjectGetPropertyData( kAudioObjectSystemObject, &OutputAddr, 0, NULL, &param, &inputDeviceID );
    chkerr("Couldn't get default input device (ID=%d)");

    // Set the current device to the default input unit
    err = AudioUnitSetProperty( audioUnit, kAudioOutputUnitProperty_CurrentDevice, kAudioUnitScope_Global, 0, &inputDeviceID, sizeof(AudioDeviceID) );
    chkerr("Failed to hook up input device to our AudioUnit (ID=%d)");

    // Setup render callback, to be called when the AUHAL has input data
    callback.inputProc = AudioInputProc;
    callback.inputProcRefCon = self;
    err = AudioUnitSetProperty( audioUnit, kAudioOutputUnitProperty_SetInputCallback, kAudioUnitScope_Global, 0, &callback, sizeof(AURenderCallbackStruct) );
    chkerr("Could not install render callback on our AudioUnit (ID=%d)");

    param = sizeof(AudioStreamBasicDescription); // get hardware device format
    err = AudioUnitGetProperty( audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 1, &deviceFormat, &param );
    chkerr("Could not get the hardware stream format (ID=%d)");

    audioChannels = MAX( deviceFormat.mChannelsPerFrame, 2 );

    // Twiddle the format to our liking
    actualOutputFormat.mChannelsPerFrame = audioChannels;
    actualOutputFormat.mSampleRate = deviceFormat.mSampleRate;
    actualOutputFormat.mFormatID = kAudioFormatLinearPCM;
    actualOutputFormat.mFormatFlags = kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked | kAudioFormatFlagIsNonInterleaved;

    if( actualOutputFormat.mFormatID == kAudioFormatLinearPCM && audioChannels == 1 )
        actualOutputFormat.mFormatFlags &= ~kLinearPCMFormatFlagIsNonInterleaved;

#if __BIG_ENDIAN__
    actualOutputFormat.mFormatFlags |= kAudioFormatFlagIsBigEndian;
#endif

    actualOutputFormat.mBitsPerChannel = sizeof(Float32) * 8;
    actualOutputFormat.mBytesPerFrame = actualOutputFormat.mBitsPerChannel / 8;
    actualOutputFormat.mFramesPerPacket = 1;
    actualOutputFormat.mBytesPerPacket = actualOutputFormat.mBytesPerFrame;

    // Set the AudioOutputUnit output data format
    err = AudioUnitSetProperty( audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 1, &actualOutputFormat, sizeof(AudioStreamBasicDescription));
    chkerr("Could not change the stream format of the output device (ID=%d)");

    param = sizeof(UInt32); // Get the number of frames in the IO buffer(s)
    err = AudioUnitGetProperty( audioUnit, kAudioDevicePropertyBufferFrameSize, kAudioUnitScope_Global, 0, &audioSamples, &param );
    chkerr("Could not determine audio sample size (ID=%d)");

    err = AudioUnitInitialize( audioUnit ); // Initialize the AU
    chkerr("Could not initialize the AudioUnit (ID=%d)");

    // Allocate our audio buffers
    audioBuffer = [self allocateAudioBufferListWithNumChannels: actualOutputFormat.mChannelsPerFrame size: audioSamples * actualOutputFormat.mBytesPerFrame];
    if( audioBuffer == NULL ) {
        [self cleanUp];
        return [NSString stringWithFormat: @"Could not allocate buffers for recording (ID=%d)", err];
    }

    return nil;
}

(...again, it would be nice to know if audio output is active and thereby choose the clean output stream over the noisy mic, but that would be a different chunk of code, and my main question may just be a quick edit to this chunk.) Thanks for your attention! ==Dave

[p.s. if I get more than one useful answer, can I "Accept" more than one, to spread the credit around?]
1
0
107
Jun ’25
Module not found when using inheritance
PLATFORM AND VERSION
Development environment: Xcode 16.4 (16F6), macOS 15.5 (24F74)
Run-time configuration: iOS 18.3.1

DESCRIPTION OF PROBLEM
I cannot build my app due to a "module not found" error when importing the Sample-Swift.h header into an Objective-C file. I need to use Swift code from an Objective-C file, where the Swift code uses an external XCFramework. It seems to be related to mixing Objective-C and Swift when the Swift code inherits from an FZWorkoutKit class. With a full Swift project, there is no issue.

STEPS TO REPRODUCE
Project: https://fizzup.s3.amazonaws.com/files/public/Sample-WK.zip

1. Open the provided project and try to build it. The issue occurs.
2. In the MyObject.m file, comment out the first import (Sample-Swift.h): no issue occurs, but I cannot use my Swift code (the MyFile class).
3. Uncomment the import from the previous step.
4. Go to MyFile.swift and change the class inheritance from GoModeController (from the FZWorkoutKit framework) to UIView (from UIKit). No issue occurs.
0
0
63
Jun ’25
WKNavigationDelegate methods not getting called even though the delegate is set
Hi everyone, we work on a macOS plugin which gets loaded into another application. I'm trying to load a webpage in that application through our plugin using a WKWebView, and I set my class as the navigationDelegate for it. I do not receive any callbacks for the WKNavigationDelegate methods, although I have debugged and made sure that the navigationDelegate is actually set to my class. Here is a sample of what I'm doing:

[[NSApplication sharedApplication] runModalForWindow:self.mWindowController.window];

- (void)windowDidLoad
{
    [super windowDidLoad];

    [self.window makeKeyAndOrderFront:self];
    [self.window orderFrontRegardless];

    if (self.mLoadingView == nil) {
        self.mLoadingView = [[LoadingView alloc] initWithFrame:[[self.window contentView] frame]];
    }
    [[self.window contentView] addSubview:self.mLoadingView];
    [self.mLoadingView showLoadingView];

    [self.window setLevel:NSMainMenuWindowLevel];

    [self loadWebPage];
}

- (void)loadWebPage
{
    [self.mWKWebView setUIDelegate:self];
    [self.mWKWebView setNavigationDelegate:self];
    [self.mWKWebView stopLoading];

    NSURL *lURL = [self samplePageURL];
    WKNavigation *lNavigation = [self.mWKWebView loadRequest:[NSURLRequest requestWithURL:lURL]];
}

- (void)webView:(WKWebView *)pWKWebView didFinishNavigation:(WKNavigation *)pNavigation
{
    [self removeLoadingview];
    [self.mWKWebView evaluateJavaScript:@"document.body.setAttribute('oncontextmenu', 'event.preventDefault();');" completionHandler:nil];
}

I do not get any calls in webView:didFinishNavigation:, and I also tried other methods like webView:didStartProvisionalNavigation: and webView:didFailProvisionalNavigation:withError:, but did not receive any calls in those either. Instead of runModalForWindow:, if I use showWindow: on the mWindowController the webpage somehow loads, but I still don't get any callbacks, so the loading view subview is also still present. The WKWebView is placed in a storyboard, and I have an IBOutlet connected to it in my class. Has anyone faced a similar issue or can point me to something I might be missing? All help is appreciated. Thanks in advance.
2
0
71
Jun ’25
Crashed: com.apple.CFNetwork.Connection
Hi, I have a crash received in my Firebase Crashlytics. I couldn't figure out the root cause of the issue. Could anyone please help me with it?

Crashed: com.apple.CFNetwork.Connection
0  libobjc.A.dylib 0x20b8 objc_retain_x19 + 16
1  CFNetwork 0x47398 HTTP3Fields::appendField(NSString*, NSString*) + 72
2  CFNetwork 0x41250 invocation function for block in HTTP3Stream::_buildRequestHeaders() + 240
3  CoreFoundation 0x249f0 __NSDICTIONARY_IS_CALLING_OUT_TO_A_BLOCK__ + 24
4  CoreFoundation 0x565dc ____NSDictionaryEnumerate_block_invoke_2 + 56
5  CoreFoundation 0x55b10 CFBasicHashApply + 148
6  CoreFoundation 0x8abfc __NSDictionaryEnumerate + 520
7  CFNetwork 0x793d4 HTTP3Stream::scheduleAndOpenWithHandler(CoreSchedulingSet const*, void (__CFHTTPMessage*, NSObject<OS_dispatch_data>*, CFStreamError const*) block_pointer, void (unsigned char) block_pointer) + 1120
8  CFNetwork 0x1665c HTTPProtocol::useNetStreamInfoForRequest(MetaNetStreamInfo*, HTTPRequestMessage const*, unsigned char) + 4044
9  CFNetwork 0x80c80 HTTP3ConnectionCacheEntry::enqueueRequestForProtocol(MetaConnectionCacheClient*, HTTPRequestMessage const*, MetaConnectionOptions) + 2540
10 CFNetwork 0x7fab8 HTTP3ConnectionCacheWrapper::ingestTube(Tube*, bool) + 2924
11 CFNetwork 0x257dc TubeManager::newTubeReady(Tube*, CFStreamError) + 4284
12 CFNetwork 0x57b64 invocation function for block in TubeManager::_onqueue_createNewTube(HTTPConnectionCacheKey*) + 72
13 CFNetwork 0x2fe30 Tube::_onqueue_invokeCB(CFStreamError) + 360
14 CFNetwork 0x2fc20 NWIOConnection::_signalEstablished() + 652
15 CFNetwork 0x4ba1c invocation function for block in NWIOConnection::_handleEvent_ReadyFinish() + 748
16 CFNetwork 0x4b5b0 invocation function for block in Tube::postConnectConfiguration(NSObject<OS_tcp_connection>*, NSObject<OS_nw_parameters>*, void () block_pointer) + 860
17 CFNetwork 0x4b220 BlockHolderVar<std::__1::shared_ptr<NetworkProxy>, bool, CFStreamError>::invoke_normal(std::__1::shared_ptr<NetworkProxy>, bool, CFStreamError) + 64
18 CFNetwork 0x32f2c ProxyConnectionEstablishment::postProxyConnectionConfiguration(__CFAllocator const*, std::__1::shared_ptr<TransportConnection>, NSObject<OS_nw_parameters>*, __CFHTTPMessage*, HTTPConnectionCacheKey*, std::__1::shared_ptr<MetaAuthClient>, SmartBlockWithArgs<std::__1::shared_ptr<NetworkProxy>, bool, CFStreamError>) + 664
19 CFNetwork 0x32bbc Tube::postConnectConfiguration(NSObject<OS_tcp_connection>*, NSObject<OS_nw_parameters>*, void () block_pointer) + 744
20 CFNetwork 0xc19b0 invocation function for block in NWIOConnection::_setupConnectionEvents() + 2360
21 libdispatch.dylib 0x132e8 _dispatch_block_async_invoke2 + 148
22 libdispatch.dylib 0x40d0 _dispatch_client_callout + 20
23 libdispatch.dylib 0xb6d8 _dispatch_lane_serial_drain + 744
24 libdispatch.dylib 0xc214 _dispatch_lane_invoke + 432
25 libdispatch.dylib 0xd670 _dispatch_workloop_invoke + 1732
26 libdispatch.dylib 0x17258 _dispatch_root_queue_drain_deferred_wlh + 288
27 libdispatch.dylib 0x16aa4 _dispatch_workloop_worker_thread + 540
28 libsystem_pthread.dylib 0x4c7c _pthread_wqthread + 288
29 libsystem_pthread.dylib 0x1488 start_wqthread + 8

Here is the complete crash report: https://developer.apple.com/forums/content/attachment/58b5bb7d-7c90-4eec-906c-4fb76861d44b
2
0
72
Jun ’25
Essentials of macOS to read and write mp3 and mp4 audio files
Hi, on macOS I used to open MP3 and MP4 files with ExtAudioFile. For a few years it hasn't worked anymore, so I decided to try a different macOS API, using the AudioFileID of the AudioToolbox framework. I decided to write a test: https://gist.github.com/joelkraehemann/7f5b241b52ca38c3a765c138fb647588 It fails right here: AudioFileOpenWithCallbacks(), returning OSStatus error 1954115647, which means kAudioFileUnsupportedFileTypeError. The filename was set to an MP4 file: ~/Music/test.mp4 How do I fix this? regards, Joël
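One thing worth checking (an assumption about the failure, not a confirmed diagnosis): when opening via callbacks there is no filename extension to sniff, so the file type hint parameter matters for container formats like MP4. A sketch with placeholder callback names:

    // Passing a nonzero AudioFileTypeID hint tells AudioToolbox what
    // container to expect; client_data/read_proc/get_size_proc are
    // placeholders for the callbacks in the test above.
    AudioFileID audio_file = NULL;

    OSStatus status = AudioFileOpenWithCallbacks(client_data,
                                                 read_proc,
                                                 NULL,                 // no write proc (read-only)
                                                 get_size_proc,
                                                 NULL,                 // no set-size proc
                                                 kAudioFileMPEG4Type,  // hint for .mp4 audio
                                                 &audio_file);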
1
0
100
Jun ’25
NEFilterManager saveToPreferences fails with "permission denied" on TestFlight build
I'm working on enabling a content filter in my iOS app using NEFilterManager and NEFilterProviderConfiguration. The setup works perfectly in debug builds when running via Xcode, but fails on TestFlight builds with the following error:

Failed to save filter settings: permission denied

Here is my current implementation:

- (void)startContentFilter
{
    NSUserDefaults *userDefaults = [NSUserDefaults standardUserDefaults];
    [userDefaults synchronize];

    [[NEFilterManager sharedManager] loadFromPreferencesWithCompletionHandler:^(NSError * _Nullable error) {
        dispatch_async(dispatch_get_main_queue(), ^{
            if (error) {
                NSLog(@"Failed to load filter: %@", error.localizedDescription);
                [self showAlertWithTitle:@"Error" message:[NSString stringWithFormat:@"Failed to load content filter: %@", error.localizedDescription]];
                return;
            }

            NEFilterProviderConfiguration *filterConfig = [[NEFilterProviderConfiguration alloc] init];
            filterConfig.filterSockets = YES;
            filterConfig.filterBrowsers = YES;

            NEFilterManager *manager = [NEFilterManager sharedManager];
            manager.providerConfiguration = filterConfig;
            manager.enabled = YES;

            [manager saveToPreferencesWithCompletionHandler:^(NSError * _Nullable error) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (error) {
                        NSLog(@"Failed to save filter settings: %@", error.localizedDescription);
                        [self showAlertWithTitle:@"Error" message:[NSString stringWithFormat:@"Failed to save filter settings: %@", error.localizedDescription]];
                    } else {
                        NSLog(@"Content filter enabled successfully!");
                        [self showAlertWithTitle:@"Success" message:@"Content filter enabled successfully!"];
                    }
                });
            }];
        });
    }];
}

What I've tried:
- Ensured the com.apple.developer.networking.networkextension entitlement is set in both the app and the system extension.
- The network extension target includes content-filter-provider.
- Tested only on physical devices.
- The app works in a development build, but not from TestFlight.

My questions:
- Why does saveToPreferencesWithCompletionHandler fail with "permission denied" on TestFlight?
- Are there special entitlements required for using NEFilterManager in production/TestFlight builds?
- Is MDM (Mobile Device Management) required to deploy apps using content filters?
- Has anyone successfully implemented NEFilterProviderConfiguration in production, and if so, how?
1
0
126
Jun ’25
Error in bnns.h - Missing ')'
Greetings! I have an app that builds and runs just fine in Xcode 11 on Catalina on an old Intel MBP. I've recently purchased an M3 Max machine and want to bring this app forward. Just going from Xcode 11 to Xcode 12 on the old machine, not even trying yet to go up to Xcode 14 or 15 on the new machine, I get two build errors that I have no idea how to resolve, both in bnns.h. I have not deliberately included this framework, and a text search in my project and dependencies finds no occurrence of "bnns" at all. The errors occur in the function prototypes for BNNSApplyMultiheadAttention and BNNSApplyMultiheadAttentionBackward. These prototypes look okay to me, and again, I didn't deliberately invoke them in the first place. Does anybody have any idea how I can get past this roadblock? p.s. this is what the first prototype looks like; no unbalanced parens here (spacing edited for readability):

int BNNSApplyMultiheadAttention(BNNSFilter F,
                                size_t batch_size,
                                void const* query, size_t query_stride,
                                void const* key, size_t key_stride,
                                BNNSNDArrayDescriptor const* _Nullable key_mask, size_t key_mask_stride,
                                void const* value, size_t value_stride,
                                void *output, size_t output_stride,
                                BNNSNDArrayDescriptor const* _Nullable add_to_attention,
                                size_t * _Nullable backprop_cache_size, void * _Nullable backprop_cache,
                                size_t * _Nullable workspace_size, void * _Nullable workspace)
3
0
83
May ’25
dlopen and dlsym loadable modules located in app directory
Hi, after updating macOS to Sequoia 15.5, I encounter problems with plugins loaded via dlopen() and dlsym().

$ file /Applications/com.gsequencer.GSequencer.app/Contents/Plugins/ladspa/cmt.dylib
/Applications/com.gsequencer.GSequencer.app/Contents/Plugins/ladspa/cmt.dylib: Mach-O universal binary with 2 architectures: [x86_64:Mach-O 64-bit bundle x86_64] [arm64:Mach-O 64-bit bundle arm64]
/Applications/com.gsequencer.GSequencer.app/Contents/Plugins/ladspa/cmt.dylib (for architecture x86_64): Mach-O 64-bit bundle x86_64
/Applications/com.gsequencer.GSequencer.app/Contents/Plugins/ladspa/cmt.dylib (for architecture arm64): Mach-O 64-bit bundle arm64

I am currently investigating what goes wrong. My application runs in a sandboxed environment.
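While investigating, a minimal sketch like the following can surface the loader's own diagnostic, which usually names the blocked library or missing symbol (the path is from the post; the LADSPA entry-point lookup is an assumption about how the plugin is resolved):

    #include <dlfcn.h>
    #include <stdio.h>

    // Log dlerror() on failure: under the sandbox or a changed dyld policy,
    // the message explains why the bundle or symbol could not be loaded.
    void *handle = dlopen("/Applications/com.gsequencer.GSequencer.app/Contents/Plugins/ladspa/cmt.dylib",
                          RTLD_NOW | RTLD_LOCAL);

    if(handle == NULL){
        fprintf(stderr, "dlopen failed: %s\n", dlerror());
    }else{
        void *descriptor = dlsym(handle, "ladspa_descriptor");

        if(descriptor == NULL){
            fprintf(stderr, "dlsym failed: %s\n", dlerror());
        }
    }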
2
0
70
Jun ’25