Does the alert field serve only the following purposes: it is required for start events, and required for any event that needs to alert the user (with sound, or by having the Live Activity show in its expanded presentation), while its content does not matter? In other words, only the presence of the field and its attributes matter. The only time its content matters is when the receiving device is an Apple Watch. Is that correct?
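For reference, the local ActivityKit alert API appears to behave the same way: when you update a Live Activity from the app with an alert configuration, the title and body are only surfaced on a paired Apple Watch, while on iPhone the alert wakes the Live Activity and plays the sound. A minimal sketch, assuming a hypothetical DeliveryAttributes type:

import ActivityKit

// Hedged sketch; DeliveryAttributes and the text values are placeholders.
// The alert's title and body only appear on a paired Apple Watch.
func sendAlertingUpdate(to activity: Activity<DeliveryAttributes>,
                        state: DeliveryAttributes.ContentState) async {
    let alert = AlertConfiguration(title: "Order update",
                                   body: "Your driver is nearby.",
                                   sound: .default)
    await activity.update(ActivityContent(state: state, staleDate: nil),
                          alertConfiguration: alert)
}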
Hi, is there an Audio Unit logo I can show on my website? I would love to show that my application is able to host Audio Unit plugins. Regards, Joël
I’m using the shared instance of AVAudioSession. After activating it with .setActive(true), I observe the outputVolume, and it correctly reports the device’s volume. However, after deactivating the session using .setActive(false), changing the volume, and then reactivating it again, the outputVolume returns the previous volume (before deactivation), not the current device volume. The correct volume is only reported after the user manually changes it again using physical buttons or Control Center, which triggers the observer. What I need is a way to retrieve the actual current device volume immediately after reactivating the audio session, even on the second and subsequent activations. Disabling and re-enabling the audio session is essential to how my application functions. I’ve tested this behavior with my colleagues, and the issue is consistently reproducible on iOS 18.0.1, iOS 18.1, iOS 18.3, iOS 18.5 and iOS 18.6.2. On devices running iOS 17.6.1 and iOS 16.0.3, outputVolume correctly refl
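For anyone trying to reproduce this, here is a minimal sketch of the setup described above: activate the shared session, observe outputVolume via KVO, then deactivate, reactivate, and read the property again. VolumeWatcher is just a placeholder name.

import AVFoundation

// Minimal sketch of the reported scenario.
final class VolumeWatcher {
    private let session = AVAudioSession.sharedInstance()
    private var observation: NSKeyValueObservation?

    func start() throws {
        try session.setActive(true)
        print("Volume on first activation:", session.outputVolume)
        // outputVolume is KVO-observable; this fires when the user changes
        // the volume with the hardware buttons or Control Center.
        observation = session.observe(\.outputVolume, options: [.new]) { _, change in
            print("Volume changed:", change.newValue ?? 0)
        }
    }

    func cycle() throws {
        try session.setActive(false)
        // ...the user changes the hardware volume while the session is inactive...
        try session.setActive(true)
        // On iOS 18.x this reportedly still returns the pre-deactivation value
        // until the user changes the volume again.
        print("Volume after reactivation:", session.outputVolume)
    }
}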
So if you log in to developer.apple.com and click the Account tab, you see the missing team under your name? But in Xcode's Settings, you don't see that team under your account? That does sound frustrating. It has never happened to me, so I can't help directly. There is a link to Membership and Account help here: https://developer.apple.com/contact/topic/select Good luck!
Topic: Code Signing
SubTopic: Certificates, Identifiers & Profiles
"Is the iPhone mic actually capturing at 8 kHz, or is it recording at 48 kHz and then downsampling to 8 kHz internally?" The latter. Your own investigation indicates this. The microphone doesn't record, it provides samples; the AVAudioRecorder records, and you gave it the settings to use. "Is there any way to force the hardware to record natively at 8 kHz?" Have you tried setPreferredSampleRate? The docs say it is hardware dependent; the microphone might provide samples at a single fixed rate, or a few fixed rates. "If not, what's the recommended approach for telephony-quality audio (true 8 kHz) on iOS devices?" Well, 8 kHz 16-bit recordings from a modern iPhone microphone probably yield better than telephone quality. What are you actually trying to achieve? Making it sound like an old analog telephone is probably a job for a narrow bandwidth filter (300 Hz to 3.5 kHz), the injection of noise at about 45 dB below the maximum amplitude, and some distortion and clipping. See https://developer.apple.com/documentation/avfo
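To make the setPreferredSampleRate suggestion concrete, a minimal sketch follows. Note that the preferred rate is only a request, so check sampleRate afterwards to see what the hardware actually granted.

import AVFoundation

// Minimal sketch: ask the hardware for 8 kHz, then check what it actually grants.
func requestNarrowbandCapture() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default)
    try session.setPreferredSampleRate(8_000)   // a request, not a guarantee
    try session.setActive(true)
    print("Requested 8 kHz, hardware is running at:", session.sampleRate)
}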
Topic: Media Technologies
SubTopic: Audio
Hello @Crixoule , thank you for your question! You mention the animation is working fine in RCP, which sounds like it was imported successfully! Were you able to use AnimationLibraryComponent to preview it in RCP? For your use case specifically, I recommend taking a look at our BOT-anist sample, specifically how the backpack and head are attached to the main body of the robot. In RealityKit, the skeleton joint positions are not separate entities with transform components, as you might expect. This means you don't interact with them by placing other Entities as descendants of the skeleton joints. Instead, to get the position of a joint for the purpose of attaching another Entity to it, you calculate its transform by multiplying a chain of joint transforms together each frame. There are a few steps to this: first you will need to get the indices for the chain of joints, starting from the root joint, that connect to your head or backpack or arm_cannon or whatever joint you're looking for. In BOT-anist, see t
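As a rough illustration of the joint-chain multiplication described above, here is a hedged sketch (not the BOT-anist code itself). It assumes the entries in ModelEntity.jointNames are slash-separated paths from the root joint, which is how USD-based rigs typically appear; adjust to your skeleton's naming.

import RealityKit
import simd

// Hedged sketch: compute a joint's transform in the model's space by walking the
// path of ancestor joints and multiplying their local transforms in root-first order.
func modelSpaceTransform(ofJointNamed jointName: String,
                         in model: ModelEntity) -> simd_float4x4? {
    // Find the joint whose full path ends with the requested name (e.g. "head").
    guard let fullPath = model.jointNames.first(where: { $0.hasSuffix(jointName) }) else {
        return nil
    }
    var transform = matrix_identity_float4x4
    var partialPath = ""
    for component in fullPath.split(separator: "/") {
        partialPath = partialPath.isEmpty ? String(component) : partialPath + "/" + String(component)
        guard let index = model.jointNames.firstIndex(of: partialPath) else { return nil }
        // jointTransforms[index] is the joint's transform relative to its parent joint.
        transform = transform * model.jointTransforms[index].matrix
    }
    return transform
}

Recomputing this in a per-frame update (for example a SceneEvents.Update subscription) and combining it with the model entity's own transform would give you a position to attach the backpack or head entity to each frame.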
Topic: Spatial Computing
SubTopic: Reality Composer Pro
This certainly sounds like a bug! Have you filed a feedback about this? Please post the FB ID (ideally attach the sample app you're using) here so we can track this, and fix it appropriately.
Topic: UI Frameworks
SubTopic: UIKit
I've noticed that when I'm in a moving vehicle going faster than 15 mph, I can't tap a WhatsApp message and get the contextual menu (copy, forward, etc.). It's like the app or iOS disregards the tactile input above 15 mph. I need to try it with other apps to see if it happens with them. I'm guessing it's an iOS thing, because I doubt WhatsApp has access to the speed value from the GPS. Maybe that's not a correct assumption. It's really a pain when you can't get the contextual menu to pop up. Can the feature that blocks popup menus at speeds above 15 mph be turned off?
We do allow you to embed multiple network extensions in the same app. That's true on all our platforms, although the mechanics vary based on whether you're using a system extension (sysex) or an app extension (appex). For more info about what packaging is supported on what platforms, see TN3134 Network Extension provider deployment. It sounds like you're using a sysex, and that presents a decision point: You can have multiple independent sysexes, each with a single NE provider. Or you can have a single sysex with multiple NE providers. The latter is generally easier to wrangle. In that case you have a single sysex target with multiple NE provider subclasses linked into the resulting executable, and then list all of those in the NEProviderClasses dictionary in your sysex's Info.plist. IMPORTANT: If you set things up this way there's a single sysex process and all of your NE providers run within that. This has pros (you can share data between them) and cons (if one crashes, they all crash). Of specific
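A hedged sketch of the single-sysex layout described above: two provider subclasses (hypothetical names) linked into one executable, each of which you would then list as a value in the NEProviderClasses dictionary, keyed by its extension point (for example com.apple.networkextension.packet-tunnel and com.apple.networkextension.filter-data).

import NetworkExtension

// Hypothetical provider subclasses living in the same sysex target. Their class
// names are what gets listed in NEProviderClasses in the sysex's Info.plist.
final class MyPacketTunnelProvider: NEPacketTunnelProvider {
    override func startTunnel(options: [String: NSObject]?,
                              completionHandler: @escaping (Error?) -> Void) {
        // Tunnel configuration would go here.
        completionHandler(nil)
    }
}

final class MyFilterDataProvider: NEFilterDataProvider {
    override func startFilter(completionHandler: @escaping (Error?) -> Void) {
        // Filter setup would go here.
        completionHandler(nil)
    }
}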
Topic: App & System Services
SubTopic: Networking
Apparently on my previous tries, I must have had an error of some sort. Thanks now to @daneover and the great step-by-step guide they provided, I'm now listening to some tunes on YouTube via my Mackie Onyx 1220i's FireWire connection!! Mac mini 2023 with M2 Pro | 32GB RAM, macOS Sequoia 15.6.1. Followed the instructions in sections 1, 2, and 3. No apparent issues installing the FW kext from the AppleFWAudioVentura.pkg. Downloaded the ZIP file and extracted the .pkg to my desktop. Note: one thing I did not do is have the Mackie connected to my Mini via FireWire during the whole install/config. I waited until the last step in part 3 (after the double reboot) to finally get connected. Also of note, I have had success with two different ways of connecting to my Mini. In previous attempts, I tried the dongle daisy-chain method with a FW800 (male) > FW400 (female) adapter connected to the Apple FW800 > TB2 adapter, and finally the TB2 > TB3 adapter. This time around, that collection of adapters is working great. Prior t
Topic: Media Technologies
SubTopic: General
Hi, I need to bundle an additional binary alongside my already-published application. It is an Audio Unit test application; my published application implements Audio Unit plugin support. But the upload is always rejected: Validation failed (409) Invalid Provisioning Profile. The provisioning profile included in the bundle com.gsequencer.GSequencer [com.gsequencer.GSequencer.pkg/Payload/com.gsequencer.GSequencer.app] is invalid. [Missing code-signing certificate.] For more information, visit the macOS Developer Portal. (ID: ****) I have followed the instructions here: Embedding a helper tool in a sandboxed app, but no luck. Does anyone know what's going on? I use Transporter to upload the application; the embedded.provisioningprofile is copied from the Xcode build and code signing is done manually.
Topic: Code Signing
SubTopic: Certificates, Identifiers & Profiles
Tags: macOS, Provisioning Profiles, Code Signing
Just opened an iOS 18 project in the latest Xcode 26 (beta 7) and the list reordering animation is broken and jerky. On iOS 18, a change to one of the list items would smoothly move it to its correct place, but on iOS 26 the items jerk around, disappear, then pop up in the correct order in the list. I am using this to filter and sort the events:

if searchQuery.isEmpty {
    return events.sort(on: selectedSortOption)
} else {
    let filteredEvents = events.compactMap { event in
        // Check if the event title contains the search query (case-insensitive).
        let titleContainsQuery = event.title.range(of: searchQuery, options: .caseInsensitive) != nil
        return titleContainsQuery ? event : nil
    }
    return filteredEvents.sort(on: selectedSortOption)
}

Is there a better way for iOS 26?
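Not an answer to the animation regression, but the same filter-and-sort could be written more compactly. A hedged sketch, assuming the elements of events expose a String title, Event is the element type, and sort(on:) is the existing sorting helper:

// Hedged sketch; Event, events, searchQuery, selectedSortOption, and sort(on:)
// are taken from the code above.
var visibleEvents: [Event] {
    let matching = searchQuery.isEmpty
        ? events
        : events.filter { $0.title.localizedStandardContains(searchQuery) }
    return matching.sort(on: selectedSortOption)
}

localizedStandardContains is case- and diacritic-insensitive, which matches the intent of the case-insensitive range check.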
Topic: UI Frameworks
SubTopic: SwiftUI
Hi, I believe I've found a potential error in the sample code on the documentation page for creating and using a process tap with an aggregate device. The issue is in the section explaining how to add a tap to the aggregate device. I have already filed a Feedback Assistant ticket on this (ID: FB17411663) but haven't heard back for months. The page is Capturing system audio with Core Audio taps. The sample code for modifying kAudioAggregateDevicePropertyTapList incorrectly uses the tapID as the target AudioObjectID when calling AudioObjectSetPropertyData:

// (Code to get the list and potentially modify listAsArray)
if var listAsArray = list as? [CFString] {
    // ... (modification logic) ...

    // Set the list back on the aggregate device. <--- The comment is correct
    list = listAsArray as CFArray
    _ = withUnsafeMutablePointer(to: &list) { list in
        // INCORRECT: This call uses tapID as the target object.
        AudioObjectSetPropertyData(tapID, &propertyAddress, 0, nil, propertySize, list)
    }
}

The kAudioAg
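For comparison, a hedged sketch of what the corrected call would look like, assuming aggregateDeviceID is the AudioObjectID of the aggregate device that owns the tap list:

// Corrected target object: the aggregate device, not the tap.
list = listAsArray as CFArray
_ = withUnsafeMutablePointer(to: &list) { listPtr in
    AudioObjectSetPropertyData(aggregateDeviceID, &propertyAddress, 0, nil,
                               propertySize, listPtr)
}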
Thanks for filing FB19955494. Looking at the bug it seems to have landed in a reasonable place. However, there’s no sysdiagnose log attached and, as I mention in Bug Reporting: How and Why?, that’s important even if the bug is readily reproducible. I recommend that you reproduce the problem and then trigger a sysdiagnose log and attach that. Also, it sounds like this bug should reproduce in a small test project. If so, it’d be great if you could attach that as well. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
Topic: UI Frameworks
SubTopic: UIKit
Hi everyone, I'm testing audio recording on an iPhone 15 Plus using AVFoundation. Here's a simplified version of my setup:

let settings: [String: Any] = [
    AVFormatIDKey: Int(kAudioFormatLinearPCM),
    AVSampleRateKey: 8000,
    AVNumberOfChannelsKey: 1,
    AVLinearPCMBitDepthKey: 16,
    AVLinearPCMIsFloatKey: false
]
audioRecorder = try AVAudioRecorder(url: fileURL, settings: settings)
audioRecorder?.record()

When I check the recorded file's sample rate, it logs:

Actual sample rate: 8000.0

However, when I inspect the hardware sample rate:

try session.setCategory(.playAndRecord, mode: .default)
try session.setActive(true)
print("Hardware sample rate:", session.sampleRate)

I consistently get:

Hardware sample rate: 48000.0

My questions are: Is the iPhone mic actually capturing at 8 kHz, or is it recording at 48 kHz and then downsampling to 8 kHz internally? Is there any way to force the hardware to record natively at 8 kHz? If not, what's the recommended approach for telephony-quality audio (true 8 kHz) on i