Hi everyone, I'm testing audio recording on an iPhone 15 Plus using AVFoundation. Here's a simplified version of my setup:

```swift
let settings: [String: Any] = [
    AVFormatIDKey: Int(kAudioFormatLinearPCM),
    AVSampleRateKey: 8000,
    AVNumberOfChannelsKey: 1,
    AVLinearPCMBitDepthKey: 16,
    AVLinearPCMIsFloatKey: false
]
audioRecorder = try AVAudioRecorder(url: fileURL, settings: settings)
audioRecorder?.record()
```

When I check the recorded file's sample rate, it logs:

```
Actual sample rate: 8000.0
```

However, when I inspect the hardware sample rate:

```swift
try session.setCategory(.playAndRecord, mode: .default)
try session.setActive(true)
print("Hardware sample rate:", session.sampleRate)
```

I consistently get:

```
Hardware sample rate: 48000.0
```

My questions are:

- Is the iPhone mic actually capturing at 8 kHz, or is it recording at 48 kHz and then downsampling to 8 kHz internally?
- Is there any way to force the hardware to record natively at 8 kHz?
- If not, what's the recommended approach for telephony-quality audio (true 8 kHz) on iOS?
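Not an answer from the thread, but a small sketch that may help verify what the recorder actually produced: for linear PCM, the payload size is fully determined by duration, sample rate, channel count, and bit depth, so you can check whether the file on disk really contains 8 kHz worth of samples even though the hardware runs at 48 kHz (Core Audio resamples between the mic and your client format). The helper below is hypothetical, not from the original post:

```swift
import Foundation

// Expected raw PCM payload size in bytes for a given duration and format.
// For 16-bit mono at 8 kHz this is 8000 samples/s * 2 bytes = 16_000 B/s,
// so a file whose size grows at roughly that rate (plus a small container
// header) indicates the recorder really wrote 8 kHz data.
func expectedPCMByteCount(seconds: Double,
                          sampleRate: Double,
                          channels: Int,
                          bitDepth: Int) -> Int {
    let bytesPerFrame = channels * bitDepth / 8
    return Int(seconds * sampleRate) * bytesPerFrame
}
```

For example, one second of 8 kHz, 16-bit mono should be `expectedPCMByteCount(seconds: 1, sampleRate: 8000, channels: 1, bitDepth: 16)`, i.e. 16 000 bytes of sample data. Note also that `AVAudioSession.setPreferredSampleRate(_:)` is only a request; on iPhone hardware the session typically stays at 48 kHz and the resampling to 8 kHz happens in software.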
Search results for "Popping Sound": 19,349 results found
I am working on an application to detect when an input audio device is being used. Basically I want to know which application is using the microphone (built-in or external). This app runs on macOS. For macOS versions starting from Sonoma I can use this code:

```objc
int getAudioProcessPID(AudioObjectID process) {
    pid_t pid;
    if (@available(macOS 14.0, *)) {
        constexpr AudioObjectPropertyAddress prop {
            kAudioProcessPropertyPID,
            kAudioObjectPropertyScopeGlobal,
            kAudioObjectPropertyElementMain
        };
        UInt32 dataSize = sizeof(pid);
        OSStatus error = AudioObjectGetPropertyData(process, &prop, 0, nullptr, &dataSize, &pid);
        if (error != noErr) {
            return -1;
        }
    } else {
        // Pre-Sonoma code goes here
    }
    return pid;
}
```

which works. However, kAudioProcessPropertyPID was added in macOS SDK 14.0. Does anyone know how to achieve the same functionality on previous versions?
SOLVED … by myself. The resolution is as follows: In the Devices and Simulators window (where the top area of my right pane was blurred), right-click as if you intend to customize the toolbar. The toolbar-customization pop-up appears, with options like icon-only, icon and text, and text-only. Choose something other than 'Icon and Text' ('Icon and Text' is what blurs the top part!). E.g. with the 'Icon only' setting, everything looks fine as expected. So, Apple, please fix this; other users will fall into this trap too. In short: do not set the toolbar to 'Icon and Text'. PS: I don't know how this initially happened to me. No clue.
Topic:
Developer Tools & Services
SubTopic:
Xcode
Tags:
I use:

```javascript
import * as StoreReview from 'expo-store-review';
```

In my onboarding I do:

```javascript
const requestAppRating = async () => {
  try {
    console.log('Requesting app rating via StoreReview');
    if (await StoreReview.isAvailableAsync()) {
      console.log('Store review is available, requesting rating');
      await StoreReview.requestReview();
    } else {
      console.log('Store review not available on this device');
      // Fallback: show alert encouraging manual rating
      Alert.alert(
        'Rate Oddible',
        'Enjoying the app? Please take a moment to rate us on the App Store!',
        [
          { text: 'Maybe Later', style: 'cancel' },
          { text: 'Rate Now', onPress: () => {
            console.log('User chose to rate manually');
            // You can add a link to the App Store here if needed
          }}
        ]
      );
    }
  } catch (error) {
    console.error('Error requesting app rating:', error);
  }
};
```

This works perfectly in my development build, but my production build from the App Store does not pop up to request a review from the user. Any idea why this could be the case?
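One likely explanation worth noting (not from the thread): expo-store-review wraps Apple's StoreKit review-request API, and Apple documents that the system, not your code, decides whether the prompt is actually displayed. In App Store builds the prompt is rate-limited (at most a few times per year per user) and may be silently suppressed, while in development builds it is always shown, which matches the behavior described above. A minimal sketch of the underlying native call, for reference:

```swift
import StoreKit
import UIKit

// Sketch of the native call that expo-store-review wraps.
// In App Store builds this may be a no-op: the system rate-limits
// review prompts and can silently decline to show one.
func requestReviewIfPossible() {
    if let scene = UIApplication.shared.connectedScenes
        .compactMap({ $0 as? UIWindowScene })
        .first(where: { $0.activationState == .foregroundActive }) {
        SKStoreReviewController.requestReview(in: scene)
    }
}
```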
In general, there is no way to continuously run a watchOS app in background for long time, if it doesn't have an active background session (such as workout, audio, or location). There is no way to directly access the heart rate sensor data on an Apple Watch either. You will need to go through HealthKit. How background tasks and HealthKit work on watchOS was somehow discussed here and here. You can take a look if they help. If your app is for research purpose, however, you can probably use SensorKit to access the heart rate sensor data. Access to those data types is limited to research uses and requires an app to have a private entitlement, which is reviewed separately for each study. See Accessing SensorKit Data for more info. You can also check if ResearchKit can help. Best, —— Ziqiao Chen Worldwide Developer Relations.
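To make the HealthKit route above concrete, here is a hedged sketch (not from the reply; the API names are standard HealthKit, but treat the details as illustrative): on watchOS, an active workout session keeps the app running in the background, and heart-rate samples can then be streamed with an anchored query whose update handler fires as new samples arrive. Authorization, error handling, and the `HKWorkoutSession` lifecycle are omitted.

```swift
import HealthKit

// Sketch: stream heart-rate samples on watchOS while an active
// workout session keeps the app alive in the background.
let store = HKHealthStore()
let heartRateType = HKQuantityType.quantityType(forIdentifier: .heartRate)!

func startStreamingHeartRate() {
    let query = HKAnchoredObjectQuery(type: heartRateType,
                                      predicate: nil,
                                      anchor: nil,
                                      limit: HKObjectQueryNoLimit) { _, samples, _, _, _ in
        report(samples)                       // initial batch
    }
    query.updateHandler = { _, samples, _, _, _ in
        report(samples)                       // live updates
    }
    store.execute(query)
}

func report(_ samples: [HKSample]?) {
    let bpm = HKUnit.count().unitDivided(by: .minute())
    for sample in (samples as? [HKQuantitySample]) ?? [] {
        print("HR:", sample.quantity.doubleValue(for: bpm), "bpm")
    }
}
```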
Topic:
App & System Services
SubTopic:
Health & Fitness
Tags:
I had been too swamped to be able to continue this conversation because of the highly prioritized work related to WWDC25, and after that I lost track of this thread. Sorry for that. Thanks to another developer reaching out via DTS with the same issue, I am now picking up this thread, which hopefully is not too late. I'd like to confirm that I can now reproduce the issue. As folks have mentioned, the issue happens only in an Xcode project using folders, not groups. I've always used groups, which help me better organize files logically, and so had not seen the issue before. Xcode 26 Beta 6 doesn't fix the issue. Before the issue is fixed from the system side, the workarounds folks mentioned, like using groups instead or giving the new model version a name that sorts alphabetically last, sound good to me. Best, —— Ziqiao Chen Worldwide Developer Relations.
Topic:
Developer Tools & Services
SubTopic:
Xcode
Tags:
Hello, everyone! I'm seeking some guidance on the App Store review process and technical best practices for a watchOS app. My goal is to create an app that uses HealthKit to continuously monitor a user's heart rate in the background for sessions lasting between 30 minutes and 3 hours. This app would not be a fitness or workout tracker. My primary question is about the best way to achieve this reliably while staying within the App Store Review Guidelines. Is it advisable to use the WorkoutKit framework to start a custom, non-fitness session for the purpose of continuous background monitoring? Are there any other recommended APIs or frameworks for this kind of background data collection on watchOS that I should be aware of? What are the key review considerations I should be mindful of, particularly regarding Guideline 4.1 (Design) and the intended use of APIs? My app's core functionality would require this kind of data for a beneficial purpose. I want to ensure my approach is technically sound and has
Topic:
App & System Services
SubTopic:
Health & Fitness
Tags:
SensorKit
Health and Fitness
watchOS
Watch Complications
The main problem we are currently facing is that we cannot access the camera and transmit audio. We seem to have managed to do this through LiveKit, but it needs additional testing. Maybe someone knows how PersonaParty or TelePeeps do it?
Topic:
Media Technologies
SubTopic:
Video
Tags:
It sounds like badgeLabel is the API you're looking for.
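Assuming the question concerned badging the app's Dock icon on macOS (the thread context isn't shown here, so this is a guess), `NSDockTile` exposes a `badgeLabel` string:

```swift
import AppKit

// Hypothetical usage, assuming the Dock icon badge on macOS was meant:
NSApp.dockTile.badgeLabel = "3"     // show "3" on the Dock icon
// NSApp.dockTile.badgeLabel = nil  // clear the badge
```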
Topic:
App & System Services
SubTopic:
General
Is that the exact wording of the message? It sounds like your app has code that checks a backend service and has logic that determines when to display such a message that you'll need to debug. — Ed Ford, DTS Engineer
Topic:
Developer Tools & Services
SubTopic:
General
My app will not open for me or for my clients, and shows a pop-up that claims: "App Outdated. The app version you are using is no longer supported. Please install he latest Heartland update through the App Store." However, there have been no updates to the app.
Topic:
Developer Tools & Services
SubTopic:
General
So, let me start here and correct a basic misunderstanding: Even the app is not started yet. Fundamentally, CallKit is an interface framework, albeit a very specialized one, not a calling framework. In other words, the ONLY reason the user ever sees the incoming call UI from your app is because your app told the system to show it. By definition, any time CallKit is active, your app is running. hi Kevin, actually my question is all about the audio call only. So basically my concern is there anyway when user slide to answer the audio call, then they unlock phone, can we open the app to foreground? Why? Why do you want to do this? The direct answer here is no, that's not possible; however, that's basically because whether a call is a video call is the metadata CallKit uses to determine what should happen when the user unlocks the phone. __ Kevin Elliott DTS Engineer, CoreOS/Hardware
Topic:
App & System Services
SubTopic:
General
Tags:
A framework is a fancy wrapper around a shared library. Shared libraries have numerous benefits on Apple platforms:

- They can help reduce build times.
- If the same code is used by multiple programs in your product, like an app and an app extension, putting that code in a shared library can reduce the product's size.
- If those programs can run simultaneously, a shared library can reduce your product's memory footprint because only one copy of the code is loaded at a time.
- In some cases — and this is particularly important with Objective-C — using a static library can result in incorrect behaviour at runtime [1].

Oh, and when you bundle a shared library into a framework, there's one more benefit: a framework can have resources used by the shared library's code. Having said that, they're not the right choice in all situations. And that brings me to this: [quote='798530021, raunits, /thread/798530, /profile/raunits'] However, I have found that the bundle size is more for Project 2 as compared to the bundle size of p
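To make the resources point concrete, here is a minimal sketch (the class and resource names are made up for illustration, not from the post): code inside a framework can locate assets bundled alongside it by resolving the bundle that contains one of the framework's own types.

```swift
import Foundation

// Hypothetical type defined inside the framework; used only to
// identify which bundle the framework's code lives in.
final class SomeTypeInFramework {}

// Resolve the framework's bundle, then look up a bundled resource.
let frameworkBundle = Bundle(for: SomeTypeInFramework.self)
let configURL = frameworkBundle.url(forResource: "DefaultConfig",
                                    withExtension: "plist")
// `configURL` is non-nil only if DefaultConfig.plist actually ships
// in the framework's resource directory.
```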
Topic:
Developer Tools & Services
SubTopic:
General
Tags:
Hi everyone, We are working on a prototype app for Apple Vision Pro that is similar in functionality to Omegle or Chatroulette, but exclusively for Vision Pro owners. The core idea is: – a matching system where one user connects to another through a virtual persona; – real-time video and audio transmission; – time limits for sessions with the ability to extend them; – users can skip a match and move on to the next one. We have explored WebRTC and Twilio, but unfortunately, they don’t fit our use case. Question: What alternative services or SDKs are available for implementing real-time video/audio communication on Vision Pro that would work with this scenario? Has anyone encountered a similar challenge and can recommend which technologies or tools to use? Thanks in advance!
hi Kevin, actually my question is all about the audio call only. So basically my concern is there anyway when user slide to answer the audio call, then they unlock phone, can we open the app to foreground? Even the app is not started yet.
Topic:
App & System Services
SubTopic:
General
Tags: