We found that on some iPad devices (such as iPad8,9 running iPadOS 18.5), while media is playing, the recorded human voice is very quiet and is essentially suppressed by the sound of the playing media. When no media is playing, the voice is recorded normally. Similarly, when using screen recording with the microphone enabled, the ambient sound is also largely suppressed by the media audio.
Search results for "Popping Sound" (19,349 results found)
Regarding "the helper has no UI because it's not an app, it's just a standalone executable": actually, the helper does have a UI, allowing the user to set the app's parameters and view its activity, but the only way for the user to access it is through the menu bar icon. We can also pop up the main window via a terminal command, and the UI works correctly on macOS 26 as well. As for the other statements, yes, they are correct.
Topic:
App & System Services
SubTopic:
Core OS
I really don’t appreciate answers along the lines of “Why would you want to do this? You don't need it.” That kind of response is exactly why people are increasingly turning to AI tools for help: they aim to assist without being dismissive or patronizing. I asked the question because I do notice a difference.

I’m currently using a bitmap-style font from this site: https://int10h.org/oldschool-pc-fonts/fontlist. They provide real bitmap fonts, but macOS doesn’t support them natively, so I'm forced to use outline versions instead. Creating a custom font won't help in this case, since it won't behave any differently with respect to anti-aliasing.

Here’s the context: I’m developing a music app for macOS that simulates the look of old audio equipment, specifically the monochrome dot-matrix LCDs and VFDs. The font is rendered at a specific size such that each ‘fake bitmap pixel’ maps to a 3×3 block of screen pixels. I then apply a mask so that only the 4 top-left pixels of each block are visible, mimicking the
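The masking step described in that post can be reduced to pure pixel logic. A minimal sketch, assuming the 3×3 block size and 2×2 visible region from the description above; the names are illustrative, not from the original post:

```swift
// Each 'fake bitmap pixel' covers a 3x3 block of real screen pixels.
// Only the top-left 2x2 pixels of each block are kept visible, which
// leaves a 1-pixel gap between fake pixels, mimicking an LCD/VFD grid.
let blockSize = 3
let visibleSize = 2

/// Returns true when the screen pixel at (x, y) should be drawn.
func isPixelVisible(x: Int, y: Int) -> Bool {
    return (x % blockSize) < visibleSize && (y % blockSize) < visibleSize
}
```

In practice one would render the glyphs into a bitmap context first and then apply this predicate per pixel, or more efficiently draw a tiled mask image over the text.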
Topic:
UI Frameworks
SubTopic:
SwiftUI
Tags:
The Apple documentation for [UIViewController willMoveToParentViewController:] states: "Called just before the view controller is added or removed from a container view controller." Hence, the view controller being popped should appear at the top of the navStack when willMoveToParentViewController: is called. In iPadOS 16 and 17, the view controller being popped does appear at the top of the navStack when willMoveToParentViewController: is called. In iPadOS 18, the view controller being popped has already been (incorrectly) removed from the navStack. We confirmed this bug using iPadOS 18.2 as well as 18.0. Our app relies upon the contents of the navStack within willMoveToParentViewController: to track the user's location within a folder hierarchy. Our app works smoothly on iPadOS 16 and 17. Conversely, our customers running iPadOS 18 have lost client data and corrupted their data folders as a result of this bug! These customers are angry -- not surprisingly, some have asked for refunds and
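One way to avoid depending on the exact contents of the navigation stack inside willMoveToParentViewController: is to maintain your own folder-path stack, pushed and popped alongside navigation. A minimal sketch of the idea; all names here are hypothetical, not from the original post:

```swift
/// Tracks the user's location in a folder hierarchy independently of
/// UINavigationController's viewControllers array, so OS-version
/// differences in *when* that array is mutated cannot corrupt the path.
struct FolderPathTracker {
    private(set) var path: [String] = []

    /// Call when pushing a folder's view controller.
    mutating func enter(_ folder: String) {
        path.append(folder)
    }

    /// Call when a folder's view controller is popped.
    mutating func leave() {
        if !path.isEmpty { path.removeLast() }
    }

    var currentFolder: String? { path.last }
}
```

Because the tracker is updated explicitly at push/pop time rather than read back from the navStack, it behaves identically on iPadOS 16, 17, and 18.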
Thank you for the clarification. It seems that the solution based on IOKit may be too complex and unstable, so I believe the best approach is to use Core Audio (HAL). Thank you very much!
Topic:
App & System Services
SubTopic:
Core OS
Sorry about the delay responding. It’s been a busy week. [quote='849355022, Andrewfromvictoria, /thread/791461?answerId=849355022#849355022, /profile/Andrewfromvictoria'] there are very large libraries of audio content available for our program, and the install scripts help them with various options for installing/using this content [/quote] OK. But is that content ultimately stored in your installer package? Or is the script getting it from elsewhere? Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
Topic:
Developer Tools & Services
SubTopic:
General
Hi @DTS Engineer, our new metrics indicate that the one-way audio issue caused by the audio session not being activated accounts for more than 50% of cases. Given this significant percentage, could you please revisit and re-examine this issue? Thank you.
Topic:
App & System Services
SubTopic:
General
"Is there an Apple-recommended approach to implement such blocking more securely?"

No, not really. I'm not expert enough in the details of the audio system to make any specific recommendation there, but all of the different options within the audio system would really only change the small details of behavior (for example, performance), not provide some fundamental improvement. However, I can say that IOKit:

"Maybe some solution which is based on IOKit."

...will definitely NOT work, nor would I expect any approach outside of the audio system to really work. Here is what I'd be concerned about:

1. Bluetooth doesn't really live in the kernel in the same way other I/O paths do. I haven't looked closely, but I'm not sure it's possible to block Bluetooth audio devices using IOKit.

2. The range of hardware here is so large that I think it would be VERY difficult to build an IOKit-based solution that I'd be confident could block everything.

3. Further complicating #2, I believe
Topic:
App & System Services
SubTopic:
Core OS
Hey @Perazim, it sounds like you're building a pretty cool experience. In order to fully understand your situation, it would be great if you could provide me with a sample project that replicates your issue. Our engineering teams need to investigate this issue, as resolution may involve changes to Apple's software. I'd greatly appreciate it if you could open a bug report, include an Xcode project that replicates the issue, and post the FB number here once you do. Bug Reporting: How and Why? has tips on creating your bug report. Thanks, Michael
Topic:
UI Frameworks
SubTopic:
SwiftUI
I'm working in Swift/SwiftUI, running Xcode 16.3 on macOS 15.4, and I've seen this when running in the iOS Simulator and in a macOS app run from Xcode. I've also seen this behaviour with 3 different audio files. Nothing in the documentation says that the speechRecognitionMetadata property on an SFSpeechRecognitionResult will be nil until isFinal, but that's the behaviour I'm seeing. I've stripped my class down to the following:

private var isAuthed = false

// I call this in a .task {} in my SwiftUI View
public func requestSpeechRecognizerPermission() {
    SFSpeechRecognizer.requestAuthorization { authStatus in
        Task {
            self.isAuthed = authStatus == .authorized
        }
    }
}

public func transcribe(from url: URL) {
    guard isAuthed else { return }
    let locale = Locale(identifier: "en-US")
    let recognizer = SFSpeechRecognizer(locale: locale)
    let recognitionRequest = SFSpeechURLRecognitionRequest(url: url)
    // the behaviour occurs whether I set this to true or not; I recently set
    // it to true to see if it made a difference
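Given the behaviour described above, a defensive pattern is to treat metadata as meaningful only once isFinal is true. A minimal sketch of that idea using a stand-in type (SFSpeechRecognitionResult isn't constructible directly, so the names below are illustrative):

```swift
// Stand-in for the relevant bits of SFSpeechRecognitionResult.
struct RecognitionSnapshot {
    let isFinal: Bool
    let metadataDescription: String?  // stands in for speechRecognitionMetadata
}

/// Only trust metadata on the final result; partial results may report nil
/// even though the documentation does not say so.
func usableMetadata(from result: RecognitionSnapshot) -> String? {
    guard result.isFinal else { return nil }
    return result.metadataDescription
}
```

In the real recognition callback, the equivalent check is `guard result.isFinal else { return }` before reading `result.speechRecognitionMetadata`.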
Hello, We have SpatialAudioComponent set on the sphere to play audio when certain Entities within the sphere are tapped. That part works great. However, when we switch out the image material on the sphere with a VideoMaterial, the spatial sound doesn't appear to be enabled. The volume actually decreases. Is there some meta data we are missing somewhere to spatially play the audio in the video? Thank you! bvsdev
Topic:
Spatial Computing
SubTopic:
General
I've tried following Apple's documentation to apply a video material on a ModelEntity, but I have encountered a compile error while attempting to specify the spatial audio type. It is a 360° video on a sphere which plays just fine, but the audio is too quiet compared to the volume I get when I preview the video in Xcode. So I tried to configure the audio playback mode on the material, but it gives me a compile error:

'audioInputMode' is unavailable in visionOS
'audioInputMode' has been explicitly marked unavailable here (RealityFoundation.VideoPlaybackController.audioInputMode)

https://developer.apple.com/documentation/realitykit/videomaterial/

Code:

let player = AVPlayer(url: url)

// Instantiate and configure the video material.
let material = VideoMaterial(avPlayer: player)

// Configure audio playback mode.
material.controller.audioInputMode = .spatial // this line won’t compile

visionOS 2.4, Xcode 16.4; also tried Xcode 26 beta 2. The videos are HEVC MPEG-4 codecs. Is the
We seem to be having the same issue as well: whenever the window goes into an inactive state before foreground, the audio session gets interrupted. If you have only one window and you close it with the system button, it goes into inactive (causing the session to be interrupted) and then foreground. Closing it programmatically (the dismissWindow action) is handled differently: observing the scene phases, the window goes directly into foreground. Also, if you open window A, then window B, and then close window B and then A, that interrupts the audio. In this case, window B goes into background, but window A goes into inactive and then background. However, closing window A while B is open, and then closing window B, does not interrupt it; both windows go into the background state directly. This makes me think that it is the inactive state that is causing the interruption.
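The hypothesis in that post (interruption correlates with a window passing through the inactive phase before background) can be expressed as a small diagnostic predicate over logged phase transitions. A hypothetical sketch with a local stand-in for SwiftUI's ScenePhase:

```swift
// Stand-in for SwiftUI.ScenePhase so the logic is testable in isolation.
enum Phase { case active, inactive, background }

/// Returns true if the observed phase sequence hits .inactive before
/// reaching .background, the pattern suspected of interrupting the
/// audio session.
func wentInactiveBeforeBackground(_ phases: [Phase]) -> Bool {
    for phase in phases {
        switch phase {
        case .inactive:   return true
        case .background: return false
        case .active:     continue
        }
    }
    return false
}
```

Feeding this predicate phase transitions recorded from `onChange(of: scenePhase)` for each window would let you confirm whether interruptions line up with the inactive-first pattern.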
Topic:
Spatial Computing
SubTopic:
General
I have a visionOS app that plays audio using AVAudioEngine and presents both a window and an immersive space. If I close the window, the audio session gets interrupted and attempting to restart the session and audio engine has no effect. I need to dismiss the app, then reopen it, which reopens the main window, in order for audio to start playing again. This is in all visionOS 2 betas. Note that I have background audio enabled for my app.
I've been aware for some time that push notifications work on the iOS Simulator now -- I see them pop up while I'm working. However, it would seem that SILENT push notifications do not work. I came to this conclusion only after several frustrating hours of debugging my app, thinking either the app was broken or the server wasn't sending the notification. Finally I tested it on a device and found that it actually works fine. Why does such a limitation exist? If I can't depend on the Simulator to handle ALL of the notifications, I'd rather it didn't work at all. Having it work only part of the time, on only some notifications, is really confusing.