I have an app that plays sound files stored locally. I'm using a single SwiftUI view with an MPVolumeView so the user can control system volume from the player in my app. When I play the sound file on the iPhone, the volume slider operates as expected. When I AirPlay to my Apple TV, the slider still controls the volume, but when I hit play in my app, the slider snaps to a different value while the actual sound volume doesn't change. Control still works after that. Flipping to Control Center, I see a mismatch between the system volume and the MPVolumeView.
Here's the code that I use to put the slider in my app.
struct VolumeSlider: UIViewRepresentable {
    func makeUIView(context: Context) -> MPVolumeView {
        let volumeView = MPVolumeView(frame: .zero)
        volumeView.showsVolumeSlider = true
        // Hide the slider thumb by setting an empty image.
        volumeView.setVolumeThumbImage(UIImage(), for: .normal)
        return volumeView
    }

    func updateUIView(_ uiView: MPVolumeView, context: Context) {
        // No need to update the view in this case.
    }
}
I'm using AVFoundation and AVAudioPlayer to play back the sound file, and MediaPlayer to give MPNowPlayingInfoCenter the track info and album art. Audio control via Control Center works perfectly. The same thing happens whether I target iOS 16 or 17.
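For context, the now-playing update is roughly like this (a sketch with placeholder metadata; coverImage and the player instance are assumptions, not the app's actual code):

import AVFoundation
import MediaPlayer
import UIKit

// Sketch only: placeholder title/artwork; the real app builds these from the local file's metadata.
func publishNowPlayingInfo(title: String, coverImage: UIImage, player: AVAudioPlayer) {
    let artwork = MPMediaItemArtwork(boundsSize: coverImage.size) { _ in coverImage }
    let info: [String: Any] = [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyArtwork: artwork,
        MPMediaItemPropertyPlaybackDuration: player.duration,
        MPNowPlayingInfoPropertyElapsedPlaybackTime: player.currentTime
    ]
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}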
Is this a bug with the MPVolumeView or the way I added it to the app?
Audio
Dive into the technical aspects of audio on your device, including codecs, format support, and customization options.
I'm a musician/DJ. I jumped from a 14 Pro Max on iOS 17 to a 16 Pro Max on iOS 18.1 b4. For each audio source (Music app, YT Music, etc.) I compared the same track/EQ/volume side by side. With the new device, everything is a bit muffled overall, and the music is damped when highs and lows are mixed. It's most noticeable when listening to high vocals and acoustic instruments. Drum and bass sound much like on an old Nokia. On the 14 Pro it's nothing like that. Thank you
Hi,
In my app I am using MusicLibraryRequest<Artist> to fetch all of the artists in someone's library collection. With this response I then fetch each artist's albums: artist.with([.albums]).
The response from this only gives the albums in the user's library collection. I would like to augment it with all of the albums for an artist from the full catalogue.
I'm using MusicKit and targeting iOS 18 and visionOS 2.
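Roughly, the current fetch looks like this (a minimal, untested sketch of the flow described above; the loop and property loading are simplified):

import MusicKit

// Fetch every artist in the user's library, then load each artist's albums.
// Loading the .albums relationship here returns library albums only.
func libraryArtistAlbums() async throws {
    let request = MusicLibraryRequest<Artist>()
    let response = try await request.response()

    for artist in response.items {
        let detailedArtist = try await artist.with([.albums])
        print(artist.name, detailedArtist.albums?.count ?? 0)
    }
}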
Could someone please point me towards the best way to approach this?
Hello! The new lower-latency support for AirPods in Game Mode is impressive, but I'm not sure of the best way to handle the transition into/out of Game Mode while audio is playing. To lower the latency, the system appears to drop some number of samples, which is what produces the reduction. My use case is macOS, where it's easy to switch in and out of the fullscreen game (a simple swipe left), which causes more issues for Game Mode since the audio is playing the entire time. It would be nice if offscreen games could remain in Game Mode, but I understand not wanting to give developers that control.
Are there any best practices for avoiding or masking the audio glitch caused by this skip-ahead? Is there a system event I can receive to know when Game Mode is about to be enabled or disabled, where I could perhaps fade out the audio? My callback checks the inTimestamp->mSampleTime value to detect gaps, but it only rarely detects a Game Mode gap, even though the audio skip-ahead always happens.
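For reference, the gap check is along these lines (a simplified sketch of the idea; the tolerance and bookkeeping are placeholders, and printing from a render callback is for illustration only, not real-time safe):

import AudioToolbox

// Sketch: detect a discontinuity in the render timeline by comparing the
// host-reported sample time with the one predicted from the previous callback.
final class TimelineGapDetector {
    private var lastSampleTime: Float64 = -1
    private var lastFrameCount: UInt32 = 0

    func check(_ timeStamp: UnsafePointer<AudioTimeStamp>, frameCount: UInt32) {
        let current = timeStamp.pointee.mSampleTime
        if lastSampleTime >= 0 {
            let expected = lastSampleTime + Float64(lastFrameCount)
            if abs(current - expected) > 1.0 {
                // A skip-ahead (or dropout) happened; this is where a short
                // fade could be triggered to mask the glitch.
                print("Timeline gap: expected \(expected), got \(current)")
            }
        }
        lastSampleTime = current
        lastFrameCount = frameCount
    }
}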
BTW, I am currently only developing on macOS (15.0) and I'm working at a low level with AudioUnit callbacks and a SpatialMixer. I am not currently using any higher-level audio APIs.
And here's a few questions I don't necessarily expect answers to, but it doesn't hurt to ask: Is there any additional technical details about how this latency reduction works, or exactly how much of a reduction is achieved (or said another way, how many samples are dropped)? How much does this affect AirPods battery life? And finally, is there a way to query the actual latency value? I check the value for kAudioDevicePropertyLatency but it seems to always report 160ms for AirPods. Thanks!
Hi, I recently updated to iOS 18. Music does play in the background with the Camera app open, but I'm unable to play music with the Notes app open.
Hello,
I'm getting an unknown, never-before-seen error at application launch, when running my iOS SpriteKit game on the iOS 18 arm64 simulator from Xcode 16.0 (16A242d) —
AudioConverterOOP.cpp:847 Failed to prepare AudioConverterService: -302
This occurs on all iOS 18 simulator devices, between application(_:didFinishLaunchingWithOptions:) and the first applicationDidBecomeActive(_:) — the SKScene object may already have been initialized by SpriteKit, but the scene's didMove(to:) method hasn't been called yet.
Also, note that the error message is being emitted from a secondary (non-main) thread, obviously not created by the app.
After the error occurs, no SKScene is able to play audio — this had never occurred on iOS versions prior to 18, neither on physical devices nor on the simulator.
Has anyone seen anything like this on a physical device running 18?
Unfortunately, at the moment I cannot test myself on an 18 device, only on the simulator...
Thank you,
D.
iPhone 13 mini updated to iOS 18.
CarPlay is wired on my 2021 RAM Laramie.
After the update, premium audio is lost; I can only hear low-quality audio.
When I manually change to Bluetooth instead of USB on the truck, the audio comes out in speaker mode on my phone and not on the truck.
I am fetching playlist songs from the user's library and also need the releaseDate (or year) of each song for my use case. However, the releaseDate is always nil since I upgraded to Sequoia. I am pretty sure this was working before the upgrade, but I couldn't find any documentation on changes related to this.
Furthermore, I noticed the IDs now also seem to be the catalog IDs instead of the global ones like i.PkdZbQXsPJ4DX04.
Here is, in a nutshell, what I am doing:
func fetchSongs(playlist: Playlist) async throws {
    let detailedPlaylist = try await playlist.with([.tracks])
    var currentTracks: MusicItemCollection<Track>? = detailedPlaylist.tracks
    repeat {
        for track in currentTracks ?? [] {
            guard case .song(let song) = track else {
                print("This is not a song")
                continue
            }
            print(song.releaseDate)
        }
        // Fetch the next page of tracks, if any.
        currentTracks = try await currentTracks?.nextBatch()
    } while currentTracks != nil
}
I can successfully return strings, arrays, and other data through a custom AudioObjectPropertySelector, but only as fixed values. Whenever I modify the property to return dynamic data, it results in an error. Below is my code.
case kPlugIn_CustomPropertyID:
{
    *((CFStringRef*)outData) = CFSTR("qin@@@123");
    *outDataSize = sizeof(CFStringRef);
}
break;

case kPlugIn_ContainDic:
{
    CFMutableDictionaryRef mutableDic1 = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                                                   0,
                                                                   &kCFTypeDictionaryKeyCallBacks,
                                                                   &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(mutableDic1, CFSTR("xingming"), CFSTR("qinmu"));
    *((CFDictionaryRef*)outData) = mutableDic1;
    *outDataSize = sizeof(CFPropertyListRef);
    // *((CFPropertyListRef*)outData) = mutableDic;
}
break;

case kPlugIn_ContainArray:
{
    CFMutableArrayRef mutableArray = CFArrayCreateMutable(kCFAllocatorDefault, 0, &kCFTypeArrayCallBacks);
    CFArrayAppendValue(mutableArray, CFSTR("Hello"));
    CFArrayAppendValue(mutableArray, CFSTR("World"));
    *((CFArrayRef*)outData) = mutableArray;
    *outDataSize = sizeof(CFArrayRef);
}
break;
These are fixed returns, and there are no issues when I retrieve the data.
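On the client side, the fixed string above is read with an ordinary property query, roughly like this (a hypothetical Swift sketch; pluginObjectID and the 'PCst' selector value are placeholders and must match what the plug-in actually registers):

import CoreAudio

// Sketch: read a custom CFString property from the plug-in object.
func readCustomStringProperty(from pluginObjectID: AudioObjectID) -> CFString? {
    var address = AudioObjectPropertyAddress(
        mSelector: AudioObjectPropertySelector(0x50437374), // 'PCst', placeholder value
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain
    )
    var value: CFString?
    var dataSize = UInt32(MemoryLayout<CFString?>.size)
    let status = withUnsafeMutablePointer(to: &value) { ptr in
        AudioObjectGetPropertyData(pluginObjectID, &address, 0, nil, &dataSize, ptr)
    }
    return status == noErr ? value : nil
}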
When I change the return data in kPlugIn_ContainDic to the following, the first time I restart the CoreAudio service and retrieve the data, it works fine. However, when I attempt to retrieve it again, it results in an error:
case kPlugIn_ContainDic:
{
    *outDataSize = sizeof(CFPropertyListRef);
    *((CFPropertyListRef*)outData) = mutableDic;
}
break;
error code:
HALC_ShellDevice::CreateIOContextDescription: failed to get a description from the server
HAL_HardwarePlugIn_ObjectGetPropertyData: no object
HALPlugIn::ObjectGetPropertyData: got an error from the plug-in routine, Error: 560947818 (!obj)
The declaration and usage of mutableDic are as follows:
static CFMutableDictionaryRef mutableDic;

static OSStatus BlackHole_Initialize(AudioServerPlugInDriverRef inDriver, AudioServerPlugInHostRef inHost)
{
    OSStatus theAnswer = 0;
    gPlugIn_Host = inHost;
    if (mutableDic == NULL) {
        mutableDic = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                               100,
                                               &kCFTypeDictionaryKeyCallBacks,
                                               &kCFTypeDictionaryValueCallBacks);
    }
    return theAnswer;
}

static OSStatus BlackHole_AddDeviceClient(AudioServerPlugInDriverRef inDriver, AudioObjectID inDeviceObjectID, const AudioServerPlugInClientInfo* inClientInfo)
{
    // Store per-client info keyed by the client ID.
    CFStringRef string = CFStringCreateWithFormat(kCFAllocatorDefault, NULL, CFSTR("%u"), inClientInfo->mClientID);
    CFMutableDictionaryRef dic = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                                           0,
                                                           &kCFTypeDictionaryKeyCallBacks,
                                                           &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(dic, CFSTR("clientID"), string);
    CFDictionarySetValue(dic, CFSTR("bundleID"), inClientInfo->mBundleID);
    CFDictionarySetValue(mutableDic, string, dic);
    return 0;
}
Can someone tell me why this happens?
A couple of weeks ago I got help here to play one song, and the solution to my problem was that I wasn't adding the song (Track type) to the queue correctly. Now I want to be able to add a playlist's worth of songs to the queue, but when I try to add an array of the Track type I get an error. The other part of this issue for me is: how do I access an individual song off of the queue after I add it? I see I can do ApplicationMusicPlayer.shared.queue.currentItem, but I think I'm missing/misunderstanding something here. Anyway, I'll post the code I have to show how I'm attempting to do this at the moment.
In this scenario, we're getting passed a playlist from another view.
import SwiftUI
import MusicKit

struct PlayBackView: View {
    @State var song: Track?
    @State private var songs: [Track] = []
    @State var playlist: Playlist

    private let player = ApplicationMusicPlayer.shared

    var body: some View {
        VStack {
            // Album Cover
            HStack(spacing: 20) {
                if let artwork = player.queue.currentEntry?.artwork {
                    ArtworkImage(artwork, height: 100)
                } else {
                    Image(systemName: "music.note")
                        .resizable()
                        .frame(width: 100, height: 100)
                }

                VStack(alignment: .leading) {
                    // Song Title
                    Text(player.queue.currentEntry?.title ?? "Song Title Not Found")
                        .font(.title)
                        .fixedSize(horizontal: false, vertical: true)
                }
            }
        }
        .padding()
        .task {
            await loadTracks()

            // It's here I thought I could do something like this
            player.queue = tracks

            // Since I can do this with one singular track
            player.queue = [song]

            do {
                try await player.queue.insert(songs, position: .afterCurrentEntry)
            } catch {
                print(error.localizedDescription)
            }
        }
    }

    @MainActor
    private func loadTracks() async {
        do {
            let detailedPlaylist = try await playlist.with([.tracks])
            let tracks = detailedPlaylist.tracks ?? []
            setTracks(tracks)
        } catch {
            print(error.localizedDescription)
        }
    }

    @MainActor
    private func setTracks(_ tracks: MusicItemCollection<Track>) {
        songs = Array(tracks)
    }
}
I use Universal Product Codes (UPC) in my app to reliably identify albums, after having used album IDs for a time. Album IDs can change over time for no obvious reason (see here for song IDs), so I switched to UPCs since I believed they cannot change. Well, apparently they can.
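For reference, the lookup I'm doing is essentially this (a sketch; error handling omitted, and the upc filter name is assumed from MusicKit's Album filters):

import MusicKit

// Look up a catalog album by its UPC.
func album(forUPC upc: String) async throws -> Album? {
    let request = MusicCatalogResourceRequest<Album>(matching: \.upc, equalTo: upc)
    let response = try await request.response()
    return response.items.first
}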
A few days ago I populated a JSON with UPCs, including 196871067713. Today, performing a MusicCatalogResourceRequest for that UPC does not return anything. When using that UPC in an Apple Music link like https://music.apple.com/de/album/folge-89-im-geistergarten/1683337782?l=en-GB it redirects to https://music.apple.com/de/album/folge-89-im-geistergarten/1683337782?l=en-GB, so I assume the UPC has changed from 196871067713 to 1683337782.
Apple Music can handle that and redirects to the new UPC, both in the app and on the website.
But a MusicCatalogResourceRequest cannot do that. I filed a suggestion for that (FB15167146), but I need a solution sooner. Can I somehow detect where the URL is redirecting to? Is there a way MusicCatalogResourceRequest can do this? Performing a MusicCatalogSearchRequest could be an option, but it seems unreliable when using the title as the search term. Other ideas?
Thank you
Hi,
I'm facing an issue with my Unity-based app when deploying it to the AVP. Often, after building and running the app on the device, the audio gets muted. I couldn't find any setting that lets me unmute it. The only solution I've found is to reset the device settings, which makes the audio work again.
Here are a few things I’ve noticed:
The sound works fine when I reset my device’s settings.
I haven't changed any sound or audio settings on the device before or after deploying the app.
The issue doesn’t always occur immediately, but when it does, resetting settings seems to be the only fix.
Could there be something in the AVP audio configuration that causes this problem? I’d appreciate any advice or suggestions.
Thanks!
How can I search for a single song by using its song ID? I tried the following code but it's not working:
MusicCatalogResourceRequest(matching: Song.self, equalTo: song.id.rawValue)
I get the following errors in Xcode:
Cannot convert value of type 'Song.Type' to expected argument type 'KeyPath<MusicItemType.FilterType, String>'
Generic parameter 'MusicItemType' could not be inferred
I have the song ID saved as a string inside song, so I just want to grab the full MusicKit MusicItemType from the API. I looked at the documentation, but it doesn't make any sense to me.
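For reference, a sketch of a filter-based lookup by ID (untested here; names assumed from MusicKit, with MusicItemID wrapping the raw string):

import MusicKit

// Fetch a single catalog song from its raw ID string.
func fetchSong(withID rawID: String) async throws -> Song? {
    let request = MusicCatalogResourceRequest<Song>(matching: \.id, equalTo: MusicItemID(rawID))
    let response = try await request.response()
    return response.items.first
}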
Please help!
On Xcode 16 (16A242), app execution and the UI will stall/lag as soon as an AVAudioPlayer is initialized.
let audioPlayer = try AVAudioPlayer(contentsOf: url) // url: file URL of the sound
audioPlayer.volume = 1.0
audioPlayer.delegate = self
audioPlayer.prepareToPlay()
Typically you would not notice this in a music app, for example, but it is especially noticeable in games where multiple sounds are being played using multiple instances of AVAudioPlayer. The entire app slows down because of it.
This is similar to this issue from last year.
I have reported it to Apple in FB15144369, as this messes up my production games where fps goes down to nothing when sounds are enabled.
Unfortunately I cannot find a solution. Anyone?
1. I saw the NullAudio custom property static const AudioObjectPropertySelector kPlugIn_CustomPropertyID = 'PCst'; but I don't know how to use this in a project.
2. What is the difference between the PlugIn's and the Device's custom properties?
3. When I try to customize the PropertySelector for the device: after adding kAudioObjectPropertyCustomPropertyInfoList to the NullAudio_HasDeviceProperty method, compiling again, and restarting the Core Audio service, I found that the virtual device doesn't show up.
Here are the crash logs:
NSInvalidArgumentException - *** -[AVContentKeyRequest processContentKeyResponse:] AVContentKeySession's keySystem is not same as that of keyResponse
We observed that a few devices on iOS 16.x.x are not able to download audio media content, and if they try multiple times, the app crashes with the above error.
Please let us know if there are any known issues.
Haptics are often represented as audio for presentation purposes by Apple in videos and learning resources.
I am curious if:
...Apple has released, or is willing to release, any tools that may have been used to synthesize audio to represent haptic patterns (such as in their WWDC19 Audio-Haptic presentation)?
...there are any current tools available that take haptic instructions as input (like AHAP) and output an audio file?
...there is some low-level access to the signal that drives the Taptic Engine, so that it can be repurposed as an audio stream?
...you have any other suggestions!
I can imagine some crude solutions that hack together preexisting synthesizers and fudge together a process to convert AHAP to MIDI instructions, dialing in some synth settings to mimic the behaviour of an actuator, but I'm not too interested in that rabbit hole just yet.
Any thoughts? Very curious what the process was for the WWDC videos and audio examples in their documentation...
Thank you!
We are using the MediaPlayer API to provide CarPlay support. Starting in iOS 18, we are having issues updating the content list. The initial list of items will populate on a fresh instance, but soon thereafter an error shows up saying we are not entitled to "com.apple.mediaremote.external-artwork-validation". From that point onward, no changes we make to our MPPlayableContentDataSource are reflected in CarPlay, even after restarting the device.
While the MediaPlayer API is marked as deprecated, we are still using it to provide CarPlay support going back to iOS 10.
Has anyone else run into this or have suggestions for workarounds?
Setting the default output device using Core Audio stops working after using continuity with AirPods
Hi, I made an app which manages the sound input/output for the user, and I'm facing an unexpected behavior which can be replicated using Apple's HALLab tool too.
Initially the app is able to set the input/output AudioDevice and everything works as expected. However, if you switch from your Mac to your iPhone while using AirPods Pro and then switch back again, the app is no longer able to set the output device. There's no error; it simply switches back to the AirPods immediately.
You can replicate the issue with HALLab (an app provided by Apple as part of "Additional Tools for Xcode").
How to reproduce:
1. Open HALLab, put on your AirPods, and play some media.
2. Try changing the input and output source and observe the expected behavior.
3. Unlock your iPhone, play some media, and wait for the AirPods Pro to switch to the iPhone.
4. Go back to your Mac, play some media, and wait for the AirPods to switch back to the Mac.
5. Try changing the output source in HALLab; notice that it immediately reverts to the AirPods, which is the unexpected behavior.
Changing the source from System Settings keeps working as expected.
Any ideas on what's going on and how to handle that?
I'm on macOS 15.1, using the following code to set the device:
private func setDevice(deviceID: AudioDeviceID, forType type: AudioDeviceType) throws {
    guard isDeviceAvailable(deviceID) else {
        throw AudioDeviceError.deviceNotAvailable
    }
    print("setting the device.")

    var propertyAddress = AudioObjectPropertyAddress(
        mSelector: type == .input ? kAudioHardwarePropertyDefaultInputDevice : kAudioHardwarePropertyDefaultOutputDevice,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain
    )
    let dataSize = UInt32(MemoryLayout<AudioDeviceID>.size)
    var mutableDeviceID = deviceID // Create a mutable copy

    let status = AudioObjectSetPropertyData(AudioObjectID(kAudioObjectSystemObject), &propertyAddress, 0, nil, dataSize, &mutableDeviceID)
    guard status == noErr else {
        throw AudioDeviceError.deviceNotSet(status: status)
    }
}
I am using an AVQueuePlayer to play a series of audio files. While implementing the Now Playing functionality for the iOS Lock Screen, I got a little confused about how to use MPNowPlayingInfo. When you update MPNowPlayingInfo, one of the fields is MPNowPlayingInfoPropertyElapsedPlaybackTime, which leads me to believe you need to set it at least once a second to keep it up to date.
But if you don't call it that frequently, the Now Playing UI still updates correctly as it's playing, so that makes me think you only need to set it once you start playing?
It feels very expensive to keep calling it every time in my periodic time observer, but is that the correct approach? Or do you just call it when you play/pause/skip, etc.?
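For context, the per-update call in question is roughly this (a sketch; the player is a placeholder, and whether it belongs in a periodic observer or only in play/pause/skip handlers is exactly the open question above):

import AVFoundation
import MediaPlayer

// Sketch: push the elapsed time and playback rate into the Now Playing info,
// reading both from an AVQueuePlayer.
func updateNowPlayingProgress(for player: AVQueuePlayer) {
    let center = MPNowPlayingInfoCenter.default()
    var info = center.nowPlayingInfo ?? [:]
    info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = player.currentTime().seconds
    info[MPNowPlayingInfoPropertyPlaybackRate] = player.rate
    center.nowPlayingInfo = info
}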