Search results for

Popping Sound

19,348 results found

Post | Replies | Boosts | Views | Activity

Reply to XCode Refuses to Load Team
So if you log in to developer.apple.com and click the Account tab, you see the missing team under your name, but in Xcode's Settings you don't see that team under your account? That does sound frustrating. It has never happened to me, so I can't help directly, but there is a link to Membership and Account help here: https://developer.apple.com/contact/topic/select. Good luck!
3w
Why does AVAudioRecorder show 8 kHz when iPhone hardware is 48 kHz?
Hi everyone, I’m testing audio recording on an iPhone 15 Plus using AVFoundation. Here’s a simplified version of my setup:

    let settings: [String: Any] = [
        AVFormatIDKey: Int(kAudioFormatLinearPCM),
        AVSampleRateKey: 8000,
        AVNumberOfChannelsKey: 1,
        AVLinearPCMBitDepthKey: 16,
        AVLinearPCMIsFloatKey: false
    ]
    audioRecorder = try AVAudioRecorder(url: fileURL, settings: settings)
    audioRecorder?.record()

When I check the recorded file’s sample rate, it logs:

    Actual sample rate: 8000.0

However, when I inspect the hardware sample rate:

    try session.setCategory(.playAndRecord, mode: .default)
    try session.setActive(true)
    print("Hardware sample rate:", session.sampleRate)

I consistently get:

    Hardware sample rate: 48000.0

My questions are:

1. Is the iPhone mic actually capturing at 8 kHz, or is it recording at 48 kHz and then downsampling to 8 kHz internally?
2. Is there any way to force the hardware to record natively at 8 kHz?
3. If not, what’s the recommended approach for telephony-quality audio (true 8 kHz) on iOS devices?
1
0
171
3w
Reply to Why does AVAudioRecorder show 8 kHz when iPhone hardware is 48 kHz?
Is the iPhone mic actually capturing at 8 kHz, or is it recording at 48 kHz and then downsampling to 8 kHz internally?

The latter; your own investigation indicates this. The microphone doesn't record, it provides samples. The AVAudioRecorder records, using the settings you gave it.

Is there any way to force the hardware to record natively at 8 kHz?

Have you tried setPreferredSampleRate? The docs say it is hardware dependent; the microphone might provide samples at a single fixed rate, or at a few fixed rates.

If not, what’s the recommended approach for telephony-quality audio (true 8 kHz) on iOS devices?

Well, an 8 kHz, 16-bit recording from a modern iPhone microphone probably yields better than telephone quality. What are you actually trying to achieve? Making it sound like an old analog telephone is probably a job for a narrow bandwidth filter (300 Hz to 3.5 kHz), the injection of noise at about 45 dB below the maximum amplitude, and some distortion and clipping. See https://developer.apple.com/documentation/avfo
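The "old analog telephone" recipe above can be sketched in plain Swift on a buffer of mono Float samples. This is a minimal illustration, not AVFoundation code: the one-pole filters are a simplification, and the cutoffs (300 Hz / 3.5 kHz), the -45 dB noise floor, and the clip level are the values suggested in the reply.

```swift
import Foundation

// Sketch of the telephone effect: band-limit, add hiss, then clip.
// Assumes mono Float samples in -1...1; all tuning values are illustrative.
func telephoneEffect(_ samples: [Float], sampleRate: Float = 8000) -> [Float] {
    let twoPi = 2 * Float.pi
    // One-pole coefficients for a ~300 Hz high-pass and ~3.5 kHz low-pass.
    let hpAlpha = 1 / (1 + twoPi * 300 / sampleRate)
    let lpAlpha = (twoPi * 3500 / sampleRate) / (1 + twoPi * 3500 / sampleRate)
    let noiseAmp = powf(10, -45 / 20)   // noise ~45 dB below full scale
    let clipLevel: Float = 0.8          // hard clip for a little distortion

    var hpPrevIn: Float = 0, hpPrevOut: Float = 0, lpPrev: Float = 0
    return samples.map { x in
        // High-pass: removes content below ~300 Hz.
        let hp = hpAlpha * (hpPrevOut + x - hpPrevIn)
        hpPrevIn = x; hpPrevOut = hp
        // Low-pass: removes content above ~3.5 kHz.
        lpPrev += lpAlpha * (hp - lpPrev)
        // Inject hiss, then clip.
        let noisy = lpPrev + noiseAmp * Float.random(in: -1...1)
        return max(-clipLevel, min(clipLevel, noisy))
    }
}
```

In a real app you would apply this per-buffer in an AVAudioEngine tap or AUAudioUnit rather than to a whole array at once; the filter state variables would then persist across buffers.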
Topic: Media Technologies SubTopic: Audio
3w
Reply to Bones/joints data issue - USD file export from Blender to RCP
Hello @Crixoule, thank you for your question! You mention the animation is working fine in RCP, which sounds like it was imported successfully! Were you able to use AnimationLibraryComponent to preview it in RCP?

For your use case specifically, I recommend taking a look at our BOT-anist sample, specifically how the backpack and head are attached to the main body of the robot.

In RealityKit, the skeleton joint positions are not separate entities with transform components, as you might expect. This means you don't interact with them by placing other Entities as descendants of the skeleton joints. Instead, to get the position of a joint for the purpose of attaching another Entity to it, you calculate its transform by multiplying a chain of joint transforms together each frame.

There are a few steps to this. First, you will need to get the indices for the chain of joints, starting from the root joint, that connect to your head or backpack or arm_cannon or whatever joint you're looking for. In BOT-anist, see t
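The chain-multiplication step described above can be shown in isolation. This is a conceptual sketch using plain row-major 4x4 matrices rather than RealityKit's simd types or its skeletal-pose API; in real code you would use simd_float4x4 and the joint transforms read from the entity's skeletal pose each frame.

```swift
import Foundation

// A 4x4 row-major transform matrix, stood in for simd_float4x4.
typealias Mat4 = [[Float]]

func multiply(_ a: Mat4, _ b: Mat4) -> Mat4 {
    var r = Array(repeating: Array(repeating: Float(0), count: 4), count: 4)
    for i in 0..<4 {
        for j in 0..<4 {
            for k in 0..<4 { r[i][j] += a[i][k] * b[k][j] }
        }
    }
    return r
}

func translation(_ x: Float, _ y: Float, _ z: Float) -> Mat4 {
    [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]
}

// The model-space transform of a joint is the product of every joint
// transform from the skeleton root down to that joint, in order.
func modelTransform(ofJointChain chain: [Mat4]) -> Mat4 {
    let identity: Mat4 = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
    return chain.reduce(identity, multiply)
}
```

For example, a hypothetical root -> spine -> head chain of pure translations composes additively: translations of (0, 1, 0), (0, 0.5, 0), and (0.2, 0, 0) put the head joint at (0.2, 1.5, 0) in model space. To attach an Entity, you would set its transform to this product each frame.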
3w
No contextual menus above 15 mph?
I've noticed that when I'm in a moving vehicle going faster than 15 mph, I can't tap a WhatsApp message and get the contextual menu (copy, forward, etc.). It's as if the app or iOS disregards touch input above 15 mph. I need to try it with other apps to see if it happens with them too. I'm guessing it's an iOS thing, because I doubt WhatsApp has access to the speed value from the GPS; maybe that's not a correct assumption. It's really a pain when you can't get the contextual menu to pop up. Can the feature that blocks popup menus at speeds above 15 mph be turned off?
1
0
55
3w
Reply to Network Extension App for MacOS with 3 Extensions
We do allow you to embed multiple network extensions in the same app. That’s true on all our platforms, although the mechanics vary based on whether you’re using a system extension (sysex) or an app extension (appex). For more info about what packaging is supported on what platforms, see TN3134 Network Extension provider deployment.

It sounds like you’re using sysexes, and that presents a decision point. You can have multiple independent sysexes, each with a single NE provider. Or you can have a single sysex with multiple NE providers. The latter is generally easier to wrangle. In that case you have a single sysex target with multiple NE provider subclasses linked into the resulting executable, and then list all of those in the NEProviderClasses dictionary in your sysex’s Info.plist.

IMPORTANT: If you set things up this way, there’s a single sysex process and all of your NE providers run within it. This has pros (you can share data between them) and cons (if one crashes, they all crash). Of specific
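As a sketch of the NEProviderClasses approach, the sysex’s Info.plist maps each Network Extension extension point to the provider class that implements it. The class names here (PacketTunnelProvider, FilterDataProvider) are hypothetical placeholders for your own subclasses; the extension-point keys are the ones Apple defines.

```xml
<key>NetworkExtension</key>
<dict>
    <key>NEProviderClasses</key>
    <dict>
        <key>com.apple.networkextension.packet-tunnel</key>
        <string>$(PRODUCT_MODULE_NAME).PacketTunnelProvider</string>
        <key>com.apple.networkextension.filter-data</key>
        <string>$(PRODUCT_MODULE_NAME).FilterDataProvider</string>
    </dict>
</dict>
```

With this layout, both providers are linked into the one sysex executable and run in the same process, which is what enables the data sharing (and the shared-fate crashes) mentioned above.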
3w
Reply to Ventura Hack for FireWire Core Audio Support on Supported MacBook Pro and others...
Apparently on my previous tries, I must have had an error of some sort. Thanks now to @daneover and the great step-by-step guide they provided, I'm now listening to some tunes on YouTube via my Mackie Onyx-1220i's FireWire connection!!

Mac mini 2023 with M2 Pro | 32GB RAM | macOS Sequoia 15.6.1

Followed the instructions in sections 1, 2, and 3. No apparent issues installing the FW kext from the AppleFWAudioVentura.pkg. Downloaded the ZIP file and extracted the .pkg to my desktop.

Note: One thing I did not do is have the Mackie connected to my Mini via FW during the whole install/config. I waited until the last step in part 3 (after the double reboot) to finally get connected.

Also of note, I have had success with 2 different ways of connecting to my Mini. In previous attempts, I tried the dongle daisy-chain method: a FW800 (male) > FW400 (female) adapter connected to the Apple FW800 > TB2 adapter, and finally the TB2 > TB3 adapter. This time around, that collection of adapters is working great. Prior t
Topic: Media Technologies SubTopic: General
3w
Reply to [NSBitmapImageRep imageRepsWithContentsOfFile] error with HDR
Thanks for filing FB19955494. Looking at the bug it seems to have landed in a reasonable place. However, there’s no sysdiagnose log attached and, as I mention in Bug Reporting: How and Why?, that’s important even if the bug is readily reproducible. I recommend that you reproduce the problem and then trigger a sysdiagnose log and attach that. Also, it sounds like this bug should reproduce in a small test project. If so, it’d be great if you could attach that as well. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
Topic: UI Frameworks SubTopic: UIKit
3w
Reply to Xcode 26.0 beta 7 (17A5305k) / Window 'Device and Simulator' / top part is blurred and overblends important info and toggle
SOLVED … by myself. Resolution is as follows: In the Devices and Simulators window (where the top area of my right pane was blurred), right-click as if you intended to customize the toolbar. The pop-up for toolbar customization (Icon Only, Icon and Text, Text Only) appears. Choose something other than Icon and Text ('Icon and Text' is what blurs the top part!). For example, with the Icon Only setting, everything looks fine as expected. So, Apple, please fix this; other users will also run into it. In short: do not set the toolbar to 'Icon and Text'. P.S. I don't know how this initially happened to me. No clue.
3w
Best Practices for Continuous Background Biometric Monitoring on Apple Watch
Hello, everyone! I'm seeking some guidance on the App Store review process and technical best practices for a watchOS app. My goal is to create an app that uses HealthKit to continuously monitor a user's heart rate in the background for sessions lasting between 30 minutes and 3 hours. This app would not be a fitness or workout tracker. My primary question is about the best way to achieve this reliably while staying within the App Store Review Guidelines. Is it advisable to use the WorkoutKit framework to start a custom, non-fitness session for the purpose of continuous background monitoring? Are there any other recommended APIs or frameworks for this kind of background data collection on watchOS that I should be aware of? What are the key review considerations I should be mindful of, particularly regarding Guideline 4.1 (Design) and the intended use of APIs? My app's core functionality would require this kind of data for a beneficial purpose. I want to ensure my approach is technically sound and has
1
0
255
4w
Reply to Best Practices for Continuous Background Biometric Monitoring on Apple Watch
In general, there is no way to continuously run a watchOS app in the background for a long time if it doesn't have an active background session (such as workout, audio, or location). There is no way to directly access the heart rate sensor data on an Apple Watch either; you will need to go through HealthKit. How background tasks and HealthKit work on watchOS has been discussed here and here. You can take a look and see if they help. If your app is for research purposes, however, you can probably use SensorKit to access the heart rate sensor data. Access to those data types is limited to research uses and requires an app to have a private entitlement, which is reviewed separately for each study. See Accessing SensorKit Data for more info. You can also check whether ResearchKit can help. Best, —— Ziqiao Chen  Worldwide Developer Relations.
4w
Reply to XCode reverts CoreData's .xccurrentversion
I had been too swamped to continue this conversation because of highly prioritized work related to WWDC25, and after that I lost track of this thread. Sorry for that. Thanks to another developer reaching out via DTS with the same issue, I am now picking up this thread, hopefully not too late. I'd like to confirm that I can now reproduce the issue. As folks have mentioned, the issue happens only in an Xcode project using folders, not groups. I've always used groups, which helps me better organize files logically, and so had not seen the issue before. Xcode 26 Beta 6 doesn't fix the issue. Until the issue is fixed on the system side, the workarounds folks mentioned, like using groups instead or giving the new model version a name that sorts alphabetically last, sound good to me. Best, —— Ziqiao Chen  Worldwide Developer Relations.
4w
Looking for solutions to build a video chat app (Omegle/Chatroulette style) on Vision Pro
Hi everyone, We are working on a prototype app for Apple Vision Pro that is similar in functionality to Omegle or Chatroulette, but exclusively for Vision Pro owners. The core idea is: – a matching system where one user connects to another through a virtual persona; – real-time video and audio transmission; – time limits for sessions with the ability to extend them; – users can skip a match and move on to the next one. We have explored WebRTC and Twilio, but unfortunately, they don’t fit our use case. Question: What alternative services or SDKs are available for implementing real-time video/audio communication on Vision Pro that would work with this scenario? Has anyone encountered a similar challenge and can recommend which technologies or tools to use? Thanks in advance!
2
0
227
Aug ’25