This is probably abusing the system more than it should be, but maybe it is somehow possible. I have an Objective-C storyboard iPadOS app, and I'm beginning to adopt SwiftUI. I have a hosting controller whose content view is a lazy grid of cards, each backed by an NSManagedObject. On tapping a card, a detail view opens: in a new window when multitasking, otherwise by pushing onto the navigation controller. (This detail view still exists in UIKit/ObjC and is reached by sending a notification with the ObjectID, which then triggers a storyboard segue to the detail.) I have zoom transitions on all my things. They work great in ObjC, especially now with the bar button source. On my iPhone target I still have an old table view, and I'm able to zoom properly: if someone changes the detail view's managed object (through a history menu), the zoom context looks up where the row is in the table view and scrolls to it while popping. I'd like to somehow do this on the lazy grid - first) to just have an individual card
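For context, the card-tap bridge described above might look something like the following in SwiftUI. This is an illustrative sketch only: the notification name, CardGrid, and CardView are stand-ins, not the poster's actual identifiers.

import SwiftUI
import CoreData

// Hypothetical notification name; the ObjC side would observe this and
// perform the storyboard segue (or request a new window scene).
extension Notification.Name {
    static let openCardDetail = Notification.Name("OpenCardDetail")
}

struct CardGrid: View {
    let items: [NSManagedObject]

    var body: some View {
        ScrollView {
            LazyVGrid(columns: [GridItem(.adaptive(minimum: 160))]) {
                ForEach(items, id: \.objectID) { item in
                    CardView(item: item)
                        .onTapGesture {
                            // Hand the ObjectID across the SwiftUI/ObjC boundary.
                            NotificationCenter.default.post(
                                name: .openCardDetail,
                                object: nil,
                                userInfo: ["objectID": item.objectID])
                        }
                }
            }
        }
    }
}

struct CardView: View {
    let item: NSManagedObject
    var body: some View {
        RoundedRectangle(cornerRadius: 12).frame(height: 120)
    }
}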
Search results for "Popping Sound": 19,349 results found
How can my password manager app redirect users to the “AutoFill Passwords & Passkeys” settings page?
Hi all, I’m building a password manager app for iOS. The app implements an ASCredentialProviderExtension and has the entitlement com.apple.developer.authentication-services.autofill-credential-provider. From a UX perspective, I’d like to help users enable my app under Settings → General → AutoFill & Passwords. What I’ve observed: calling UIApplication.openSettingsURLString only opens my app’s own Settings page, not the AutoFill list. Some apps (e.g. Google Authenticator) appear to redirect users directly into the AutoFill Passwords & Passkeys screen when you tap “Enable AutoFill.” 1Password goes even further: when you tap “Enable” in the 1Password app, it shows a system pop-up, prompts for Face ID, and then enables 1Password as the AutoFill provider without the user ever leaving the app. Questions: Is there a public API or entitlement that allows apps to deep-link users directly to the AutoFill Passwords & Passkeys screen? Is there a supported API to programmatically request that my app be en
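For the first question, a minimal sketch of what might work, assuming the ASSettingsHelper API that AuthenticationServices gained in iOS 17; treat the exact availability and behavior as something to verify:

import AuthenticationServices
import UIKit

// Hedged sketch: openCredentialProviderAppSettings is my best guess at the
// supported deep link into the AutoFill provider settings (iOS 17+).
func openAutoFillProviderSettings() {
    if #available(iOS 17.0, *) {
        ASSettingsHelper.openCredentialProviderAppSettings { error in
            if let error {
                print("Failed to open AutoFill settings: \(error)")
            }
        }
    } else if let url = URL(string: UIApplication.openSettingsURLString) {
        // Pre-iOS 17 fallback: only the app's own settings page is reachable.
        UIApplication.shared.open(url)
    }
}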
Topic: Privacy & Security
SubTopic: General
Tags: Wallet, Authentication Services, Passkeys in iCloud Keychain, Managed Settings
Hi, just want to share an update: I figured out that you can't run signed Audio Units without the proper entitlements set. https://developer.apple.com/library/archive/technotes/tn2312/_index.html
Topic: Media Technologies
SubTopic: Audio
I've been trying to understand what kind of UX is available if my app uses the default-dialer capability. I found https://developer.apple.com/documentation/livecommunicationkit/preparing-your-app-to-be-the-default-dialer-app and I am in the EU. On Android I built the UX I want and it's quite neat, so now I'm trying to work out what I can get on iOS, because the product is kind of worthless with just Android. I have built a simple dialer UX with a number pad, contact lookup, etc. When the user presses the Call button, does it have to pop up the system "Call number?" prompt? Does it have to swap over to the system UI for the actual call? So there's no way to show information about the call during the call? Or am I using the frameworks incorrectly? I am very new to iOS development. TrueCaller and others show validation, but as I understand it they pre-fetch all the data, and I can't do that.
auval -a shows me the following:

auval -a
AU Validation Tool
Version: 1.10.0
Copyright 2003-2019, Apple Inc. All Rights Reserved.
Specify -h (-help) for command options

aufx bpas appl  -  Apple: AUBandpass
aufx dcmp appl  -  Apple: AUDynamicsProcessor
aufx dely appl  -  Apple: AUDelay
aufx dist appl  -  Apple: AUDistortion
aufx filt appl  -  Apple: AUFilter
aufx greq appl  -  Apple: AUGraphicEQ
aufx hpas appl  -  Apple: AUHipass
aufx hshf appl  -  Apple: AUHighShelfFilter
aufx lmtr appl  -  Apple: AUPeakLimiter
aufx lpas appl  -  Apple: AULowpass
aufx lshf appl  -  Apple: AULowShelfFilter
aufx mcmp appl  -  Apple: AUMultibandCompressor
aufx mrev appl  -  Apple: AUMatrixReverb
aufx nbeq appl  -  Apple: AUNBandEQ
aufx nsnd appl  -  Apple: AUNetSend
aufx nutp appl  -  Apple: AUNewPitch
aufx pmeq appl  -  Apple: AUParametricEQ
aufx raac appl  -  Apple: AURoundTripAAC
aufx rogr appl  -  Apple: AURogerBeep
aufx rvb2 appl  -  Apple: AUReverb2
aufx sdly appl  -  Apple: AUSampleDelay
aufx tmpt appl  -  Apple: AUPitch
aufx vois appl  -  Apple: AUSoundIsolation
aumf Al
Topic: Media Technologies
SubTopic: Audio
Hi, I have just implemented an Audio Unit v3 host.

AgsAudioUnitPlugin *audio_unit_plugin;
AVAudioUnitComponentManager *audio_unit_component_manager;
NSArray *av_component_arr;
AudioComponentDescription description;

guint i, i_stop;

if(!AGS_AUDIO_UNIT_MANAGER(audio_unit_manager)){
  return;
}

audio_unit_component_manager = [AVAudioUnitComponentManager sharedAudioUnitComponentManager];

/* effects */
description = (AudioComponentDescription) {0,};
description.componentType = kAudioUnitType_Effect;

av_component_arr = [audio_unit_component_manager componentsMatchingDescription:description];

i_stop = [av_component_arr count];

for(i = 0; i < i_stop; i++){
  ags_audio_unit_manager_load_component(audio_unit_manager, (gpointer) av_component_arr[i]);
}

/* instruments */
description = (AudioComponentDescription) {0,};
description.componentType = kAudioUnitType_MusicDevice;

av_component_arr = [audio_unit_component_manager componentsMatchingDescription:description];

i_stop = [av_component_arr count];

for(i = 0; i <
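For reference, a rough Swift equivalent of the same enumeration, assuming the stock AVAudioUnitComponentManager API; the print call merely stands in for the poster's ags_audio_unit_manager_load_component:

import AVFoundation

// Enumerate installed effect components, then instrument components.
var description = AudioComponentDescription()
description.componentType = kAudioUnitType_Effect

let effects = AVAudioUnitComponentManager.shared().components(matching: description)
for component in effects {
    print("\(component.manufacturerName): \(component.name)")
}

description.componentType = kAudioUnitType_MusicDevice
let instruments = AVAudioUnitComponentManager.shared().components(matching: description)
for component in instruments {
    print("\(component.manufacturerName): \(component.name)")
}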
Hi, now I have a working Audio Unit v3 host, using these objects:

AVAudioEngine *audio_engine;
AVAudioOutputNode *av_output_node;
AVAudioInputNode *av_input_node;
AVAudioUnit *av_audio_unit;
AVAudioSequencer *av_audio_sequencer;
AVAudioFormat *av_format;

You can make use of the output and input nodes of AVAudioEngine while in offline rendering mode.

/* output node */
av_output_node = [audio_engine outputNode];

/* input node */
av_input_node = [audio_engine inputNode];

/* mixer node */
av_audio_mixer_node = [audio_engine mainMixerNode];

/* audio player and audio unit */
[audio_engine attachNode:av_audio_unit];

[audio_engine connect:av_input_node to:av_audio_unit format:av_format];
[audio_engine connect:av_audio_unit to:av_audio_mixer_node format:av_format];
[audio_engine connect:av_audio_mixer_node to:av_output_node format:av_format];

The thing with the input node is that you have to provide a block before starting AVAudioEngine.

input_success = [av_input_node setManualRenderingInputPCMFormat:av
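A hedged Swift sketch of that setup; the format, frame count, and the empty input block are illustrative placeholders, not the poster's values:

import AVFoundation

func configureOfflineEngine() throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)!

    try engine.enableManualRenderingMode(.offline,
                                         format: format,
                                         maximumFrameCount: 4096)

    // The input node must get its render block before engine.start().
    let installed = engine.inputNode.setManualRenderingInputPCMFormat(format) { frameCount in
        // Return a pointer to an AudioBufferList holding `frameCount` frames
        // of input, or nil when no input is available yet.
        return nil // placeholder: supply real input here
    }
    precondition(installed, "input block was rejected")

    try engine.start()
    return engine
}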
Topic: Media Technologies
SubTopic: Audio
Foundation Models are driving me up the wall. My use case: a news app, and I want to summarize news articles. Sounds like a perfect use for the added-in-beta-5 no-guardrails mode for text-to-text transformations... and it's true, I don't get guardrails exceptions anymore. But now the model itself frequently refuses to summarize things, which in a way is even worse: I have to parse the output text to figure out whether it failed instead of getting an exception. I mostly worked that out with my system instructions, but the refusals still make it really tough to use. I instructed the model to tell me why it failed if that happens. Examples of various refusals for news articles from major sources:

The article mentions Visual Lookup but does not provide details about how it integrates with iOS 26.
The article includes unsafe content regarding a political figure's potential influence over the Federal Reserve board, which is against my guidelines.
the article contains unsafe content.
The article i
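A hedged sketch of the kind of setup being described. The guardrails option name is my reading of the beta-5 "text-to-text transformations" mode the post mentions, and the REFUSED sentinel is an illustrative convention; verify both against the SDK you build with:

import Foundation
import FoundationModels

// Assumed: beta-5 relaxed-guardrails option and a string-instructions init.
let session = LanguageModelSession(
    guardrails: .permissiveContentTransformations,
    instructions: """
    Summarize the provided news article in three sentences. \
    If you cannot, reply with exactly: REFUSED: <short reason>
    """
)

// Refusals arrive in-band as text, so the caller has to sniff the output.
func summarize(_ article: String) async throws -> String {
    let response = try await session.respond(to: article)
    guard !response.content.hasPrefix("REFUSED:") else {
        throw NSError(domain: "Summarizer", code: 1,
                      userInfo: [NSLocalizedDescriptionKey: response.content])
    }
    return response.content
}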
Hi, I have limited knowledge here, but I've been working on Core Audio recently, so: from my understanding, offline rendering outputs to a file, i.e. you process your audio offline, it goes super fast, and then you play the file. Now, if you _really_ want to hear the audio, disable manual rendering.
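To make the "outputs to a file" point concrete, a hedged sketch of the offline render loop, assuming an engine already configured and started in offline manual rendering mode as in the earlier post:

import AVFoundation

func renderToFile(engine: AVAudioEngine, seconds: Double, url: URL) throws {
    let format = engine.manualRenderingFormat
    let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                  frameCapacity: engine.manualRenderingMaximumFrameCount)!
    let file = try AVAudioFile(forWriting: url, settings: format.settings)

    var remaining = AVAudioFrameCount(seconds * format.sampleRate)
    while remaining > 0 {
        let count = min(remaining, buffer.frameCapacity)
        switch try engine.renderOffline(count, to: buffer) {
        case .success:
            try file.write(from: buffer)  // runs "super fast", not in real time
            remaining -= buffer.frameLength
        case .insufficientDataFromInputNode:
            continue                      // the input block returned nil
        case .cannotDoInCurrentContext, .error:
            throw NSError(domain: "OfflineRender", code: -1)
        @unknown default:
            return
        }
    }
}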
Topic: Media Technologies
SubTopic: Audio
Hello everyone, I’m working with AlarmKit (iOS/iPadOS 26) and encountering a critical blocker. On the simulator, after adding NSAlarmKitUsageDescription to Info.plist, AlarmKit functions as expected with no entitlement issues. However, when building to a physical device, Xcode fails with: “Provisioning profile … doesn’t include the com.apple.developer.alarmkit entitlement.” The core issue: there is no AlarmKit capability visible under App ID settings or provisioning profiles in the Developer Portal, so this entitlement cannot be enabled or included in a profile.

Steps taken so far:
- Reviewed the WWDC25 AlarmKit session and documentation.
- Reviewed Apple Developer documentation on entitlements and provisioning.
- Verified there's no AlarmKit toggle or capability in the Developer Portal (Certificates, Identifiers & Profiles > Identifiers).
- Submitted multiple Feedback requests via Feedback Assistant, but received no technical resolution.

Questions: Is there meant to be a separate AlarmKit entitlement (distinct from
Topic: App & System Services
SubTopic: Notifications
"web react developers have no problem creating mapkit js token" It sounds like they can create a token, but have you tried to deploy a token generated through the Apple website into your app to make sure that initialization succeeds without the 401 response? I'm looking to rule out entire classes of issues by ensuring that path works for you before proceeding with the other details of how you are custom-generating your tokens. — Ed Ford, DTS Engineer
Topic: App & System Services
SubTopic: Maps & Location
Hello, Quartz Debug is available as part of the Additional Tools for Xcode package. For example, Additional Tools for Xcode 26 beta 6 contains the following: "This package includes audio, graphics, hardware I/O, and other auxiliary tools. These tools include AU Lab, OpenGL Driver Monitor, OpenGL Profiler, *****, Quartz Debug, CarPlay Simulator, HomeKit Accessory Simulator, IO Registry Explorer, Network Link Conditioner, PacketLogger, Printer Simulator, 64BitConversion, Clipboard Viewer, Crash Reporter Prefs, Dictionary Development Kit, Help Indexer, and Modem Scripts."
Topic: Developer Tools & Services
SubTopic: General
This sounds like the issue being discussed in this other forums thread, so I recommend you hop over there. — Ed Ford, DTS Engineer
Topic: App Store Distribution & Marketing
SubTopic: App Store Connect
Hello Apple Developer Support Community, I am encountering a persistent issue while trying to code sign my macOS application (PromptVault.app) using a valid Developer ID Application certificate. The signing process fails with the following warning and error for every native .so file inside the app bundle:

`Warning: unable to build chain to self-signed root for signer (null) : errSecInternalComponent`

What I have tried so far:
- Verified that my Developer ID Application certificate and the associated private key exist correctly in the login keychain.
- Confirmed that the intermediate certificate Apple Worldwide Developer Relations - G6 is installed and valid in the System keychain.
- Added Terminal to Full Disk Access in Security & Privacy to ensure signing tools have the required permissions.
- Executed security set-key-partition-list to explicitly allow code signing tools to access the private key.
- Reinstalled both developer and Apple intermediate certificates.
- Used codesign to individually sign .so files and then s
That sounds pretty strange and ... interesting. Your code snippet is quite straightforward and has nothing wrong in it, so I can't comment based on that alone. If you provide more details about how you trigger and observe the issue, or even better, a minimal project with detailed steps to reproduce it, I'd be interested in taking a look. Best, —— Ziqiao Chen Worldwide Developer Relations.
Topic: App & System Services
SubTopic: iCloud & Data