I have created a demo iOS app that makes a BLE connection to nearby headphones.
I am able to connect to the headphones successfully through my demo iOS app. I can also see in the iPhone's Bluetooth settings that the headphones are connected, but when I play music from Spotify/YouTube, the music is not played through the headphones. It still uses the iPhone speakers.
First I scan for surrounding Bluetooth devices through CBCentralManager and then connect to one of the found devices.
cBCenteralManager.scanForPeripherals(withServices: nil, options: nil)
For connecting:
cBCenteralManager.connect(peripheral, options: nil)
Do I need to make any code changes when connecting via BLE?
I am expecting that when I connect to the headphones via my demo app, the same connection is visible in the iPhone's Bluetooth settings too, and then when I play music on Spotify/YouTube, the sound is played on the headphones and not on the iPhone speakers.
Audio
Dive into the technical aspects of audio on your device, including codecs, format support, and customization options.
Are the AudioObject APIs (such as AudioObjectGetPropertyData, AudioObjectSetPropertyData, etc.) thread-safe? Meaning, for the same AudioObjectID is it safe to do things like:
Get a property in one thread while setting the same property in another thread
Set the same property in two different threads
Add and remove property listeners in different threads
Put differently, is there any internal synchronization or mutex for this kind of usage or is the burden on the caller?
I was unable to find any documentation either way which makes me think that the APIs are not thread-safe.
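If the burden does turn out to be on the caller, the usual defensive pattern is to funnel every get, set, and listener change for a given AudioObjectID through one serial queue. Below is a minimal sketch of that pattern only; the `[String: UInt32]` dictionary is a stand-in for the real AudioObjectGetPropertyData/AudioObjectSetPropertyData calls, which need a running HAL.

```swift
import Foundation
import Dispatch

// Caller-side synchronization sketch: every get/set for one object goes
// through a single serial queue, so a get on one thread can never
// interleave with a set on another. The dictionary is a stand-in for
// the real AudioObject* property calls.
final class SerializedPropertyAccess {
    private let queue = DispatchQueue(label: "audio-object.properties")
    private var store: [String: UInt32] = [:]

    func setProperty(_ name: String, to value: UInt32) {
        queue.sync { store[name] = value }
    }

    func property(_ name: String) -> UInt32? {
        queue.sync { store[name] }
    }
}

let access = SerializedPropertyAccess()

// Hammer the same "property" from many threads at once; with the serial
// queue in place this is safe regardless of what the HAL guarantees.
DispatchQueue.concurrentPerform(iterations: 1000) { i in
    access.setProperty("volume", to: UInt32(i))
    _ = access.property("volume")
}
print(access.property("volume") != nil)  // prints "true"
```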
【Procedure】
Launch the application.
ImmersiveSpace1 is opened and the animation of the 3D object is played.
When the animation finishes, ImmersiveSpace1 is dismissed and ImmersiveSpace2 is opened.
【Expected behavior】
When ImmersiveSpace1 is opened, the background music should play, and when ImmersiveSpace2 is opened, the background music should continue to play.
【Result】
When ImmersiveSpace1 is opened, the background music plays, but when ImmersiveSpace2 is opened, the background music stops.
【Environment】
This problem occurs on a physical device (visionOS 2).
It does not occur on the simulator.
Xcode: Version 15.2 (15C500b)
【Log】
Output on the actual device when ImmersiveSpace2 is opened. It is not output on the simulator.
AVAudioSession_iOS.mm:2223 Server returned an error from destroySession:. Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service with pid 39 named com.apple.audio.AudioSession was invalidated from this process." UserInfo={NSDebugDescription=The connection to service with pid 39 named com.apple.audio.AudioSession was invalidated from this process.}
Is there any feasible way to get a Core Audio device's system effect status (Voice Isolation, Wide Spectrum)?
AVCaptureDevice provides convenience properties for system effects for video devices. I need to get this status for Core Audio input devices.
Hi, I'm trying to play multiple video/audio files with AVPlayer using AVMutableComposition. Each video/audio file can play simultaneously, so I put each one in an individual track. I use only local files.
let second = CMTime(seconds: 1, preferredTimescale: 1000)
let duration = CMTimeRange(start: .zero, duration: second)
var currentTime = CMTime.zero
for _ in 0...4 {
    let mutableTrack = composition.addMutableTrack(
        withMediaType: .audio,
        preferredTrackID: kCMPersistentTrackID_Invalid
    )
    try mutableTrack?.insertTimeRange(
        duration,
        of: audioAssetTrack,
        at: currentTime
    )
    currentTime = currentTime + second
}
When I set many audio tracks (maybe more than 5), the first part sounds a little different from the original when playback starts. It seems like the front part of the audio is skipped.
But when I set only two tracks, AVPlayer plays the same as the original file.
avPlayer.play()
How can I fix it? Why do audio tracks that don't have any parts playing at the start affect playback? Please let me know.
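The timeline arithmetic in the loop above can be checked in isolation. In the sketch below, `RationalTime` is a hypothetical stand-in for CMTime (a value over a timescale, mirroring `CMTime(seconds: 1, preferredTimescale: 1000)`), used only to confirm that the five one-second insertions land back to back with no gaps:

```swift
import Foundation

// Hypothetical stand-in for CMTime: a rational value/timescale pair.
struct RationalTime {
    var value: Int64
    var timescale: Int32
    static let zero = RationalTime(value: 0, timescale: 1000)
    static func + (a: RationalTime, b: RationalTime) -> RationalTime {
        precondition(a.timescale == b.timescale)
        return RationalTime(value: a.value + b.value, timescale: a.timescale)
    }
    var seconds: Double { Double(value) / Double(timescale) }
}

let second = RationalTime(value: 1000, timescale: 1000)
var currentTime = RationalTime.zero
var insertionPoints: [Double] = []

// Same shape as the composition loop: five tracks, each one-second
// range inserted where the previous one ended.
for _ in 0...4 {
    insertionPoints.append(currentTime.seconds)
    currentTime = currentTime + second
}
print(insertionPoints)  // prints "[0.0, 1.0, 2.0, 3.0, 4.0]"
```

So the insertion points themselves are contiguous; the skipped-audio symptom appears to come from playback, not from the timing math.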
Hi.
I saw that in the iOS 18 beta there is a "transition" property on MusicKit's ApplicationMusicPlayer. However, in my app I am using MPMusicPlayerApplicationController because I want to play Apple Music songs, local songs, and podcasts. But I didn't find an analogous property on MPMusicPlayerApplicationController to specify transitions between songs. Am I missing something?
Thanks,
Dirk
Hello,
We are seeing some strange behaviour when using MPRemoteCommandCenter as well as UIKeyCommand in our app on iOS 17.
We are finding that when a UIKeyCommand is triggered via an external keyboard to start playing some music (via our own custom Core Audio driver), the keyboard becomes unresponsive for a few seconds before UIKeyCommands are triggered again.
Strangely enough, if we comment out all our MPRemoteCommandCenter code, the UIKeyCommands work without going into the unresponsive state for a few seconds.
i.e.
UIKeyCommands:
override open var keyCommands: [UIKeyCommand] {
    let commands = [UIKeyCommand(title: String(localized: "__PLAY_STOP__"), action: #selector(shortcutPlayStop(_:)), input: " "),
                    UIKeyCommand(title: String(localized: "__PAUSE__"), action: #selector(shortcutPause(_:)), input: "."), /** etc **/]
    commands.forEach { $0.wantsPriorityOverSystemBehavior = true }
    return commands
}
and MPRemoteCommands:
MPRemoteCommandCenter.shared().playCommand.addTarget { [weak self] _ in
    self?.doStuff()
    // etc
    return .success
}
Note this issue did not occur prior to iOS 17 🤨
Any ideas what could be going on?
Thank you!
Hi, I'm working on a web project that uses the Media Session API to interface with the media notification on iOS. The issue I'm experiencing occurs after pressing the play button in the media session modal: the session seems NOT to fire the event handler callback and also kills the media session itself. It's strange behaviour considering that the pause callback works fine.
audio_source = new Audio(url);
navigator.mediaSession.metadata = new MediaMetadata({
    // Metadata here
});
navigator.mediaSession.setActionHandler('play', (details) => {
    audio_source.play();
});
navigator.mediaSession.setActionHandler('pause', (details) => {
    audio_source.pause();
});
Hello,
I'm having a problem with my iPhone 12:
iOS version: 17.5.1
Model name: iPhone 12
When I call someone it's impossible to communicate; it seems that the microphone is deactivated or that it doesn't respond.
I was able to extract the .ips file from the "data":
stacks+audiomxd-.... Here's an extract:
{"bug_type":"288","timestamp":"2024-06-16 22:44:25.00 +0200","os_version":"iPhone OS 17.5.1 (21F90)","roots_installed":0,"incident_id":"7B8604DF-3863-4760-806C-591A90A7A7A4"}
{
"crashReporterKey" : "6928f591dd9e4d26541855e6d4b6a20d408bdfd1",
"exception" : "0xbe18d1ee",
"frontmostPids" : [
34
],
"tuning" : {
},
"absoluteTime" : 2954071508880,
"product" : "iPhone13,2",
"kernel" : "Darwin Kernel Version 23.5.0: Wed May 1 20:35:15 PDT 2024; root:xnu-10063.122.3~3\/RELEASE_ARM64_T8101",
"date" : "2024-06-16 22:44:25.08 +0200",
"reason" : "XPC message timeout in -[AVAudioSessionRemoteXPCClient getProperty:propertyName:MXProperty:reply:], probably deadlocked. Writing a stackshot and terminating.",
"codeSigningMonitor" : 1,
"incident" : "7B8604DF-3863-4760-806C-591A90A7A7A4",
"build" : "iPhone OS 17.5.1 (21F90)",
"roots_installed" : 0,
"bug_type" : "288",
"pid" : 102,
"memoryStatus" : {"compressorSize":38088,"compressions":25391066,"decompressions":20835948,"busyBufferCount":3,"memoryPressureDetails":{"pagesWanted":467,"pagesReclaimed":2085},"pageSize":16384,"memoryPressure":false,"memoryPages":{"active":67555,"throttled":0,"fileBacked":60679,"wired":57187,"purgeable":3679,"inactive":65541,"free":1533,"speculative":2364}},
As you can see, one line states :
"reason" : "XPC message timeout in -[AVAudioSessionRemoteXPCClient getProperty:propertyName:MXProperty:reply:], probably deadlocked. Writing a stackshot and terminating.",
Deadlock occurs when two or more processes are waiting for resources held by the others, creating a situation where none of the processes can make progress.
In this case, it seems that the process tried to retrieve an audio property but remained blocked indefinitely.
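The deadlock shape described here can be reproduced in miniature. This is an illustrative sketch only (not the actual audiomxd code): two workers each hold one lock while waiting for the other's, and a semaphore timeout plays the role of the watchdog that wrote the stackshot and terminated the process.

```swift
import Foundation
import Dispatch

// Classic two-lock deadlock: each worker acquires one lock, then blocks
// waiting for the lock the other worker holds. A semaphore with a
// timeout acts as a watchdog so this demo itself does not hang.
let lockA = NSLock()
let lockB = NSLock()
let done = DispatchSemaphore(value: 0)

DispatchQueue.global().async {
    lockA.lock()
    Thread.sleep(forTimeInterval: 0.1)  // let the other worker grab lockB
    lockB.lock()                        // blocks forever: worker 2 holds lockB
    done.signal()
    lockB.unlock(); lockA.unlock()
}
DispatchQueue.global().async {
    lockB.lock()
    Thread.sleep(forTimeInterval: 0.1)  // let the other worker grab lockA
    lockA.lock()                        // blocks forever: worker 1 holds lockA
    done.signal()
    lockA.unlock(); lockB.unlock()
}

// Neither worker can finish, so the wait times out -- the situation a
// watchdog detects before writing a stackshot and terminating.
let result = done.wait(timeout: .now() + 1)
print(result == .timedOut ? "deadlocked" : "finished")
```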
What reinforces my doubts is that at the end of the .ips there is a line that seems to give some information as to where the problem is:
"notes" : ["Requested by audiomxd","_dyld_process_info_create(623) for resampling UUIDs failed with 1","_dyld_process_info_create(2535) for resampling UUIDs failed with 1","_dyld_process_info_create(3181) for resampling UUIDs failed with 1","_dyld_process_info_create(3183) for resampling UUIDs failed with 1","_dyld_process_info_create(3503) for resampling UUIDs failed with 1","resampled 409 of 1813 threads with truncated backtraces from 0 pids: ","resampled 625 of 37 images missing from 175 pids: 75,93,98,178,190,210,...,3627"],
Here's what my research yielded. Could you please help me?
All calls are useless because the microphone no longer works.
Regards
Has anyone noticed that when you connect your phone to CarPlay, the screen shows up, but when you play audio it still comes out of the iPhone's speaker?
Just installed macOS Sequoia and observed that the mClientID and mProcessID fields in the AudioServerPlugInClientInfo structure are empty when the AddDeviceClient and RemoveDeviceClient functions of the AudioServerPlugInDriverInterface are called.
This data is essential to identify the connected client, and its absence breaks the basic functionality of the HAL plugins.
FB13858951 ticket filed.
Hi there, whenever I use any third-party editing software, the third clip and the clips after it have no audio. Here's how I did it:
take any third-party editing software
put in 2 clips, cut one in half and delete the other half, then cut a half of the 2nd clip
Any fixes?
I'm getting an issue where even unencrypted video playback is failing with status "failed".
Error Domain=CoreMediaErrorDomain Code=-12927 "(null)"
I'm unable to find any info on the above error code.
Is there some way to look it up?
Sample master M3U8 is shared below.
Note: If I use any variant M3U8 directly, then it plays fine.
We are using a VoiceProcessingIO audio unit in our VoIP application on Mac. In certain scenarios, the AudioComponentInstanceNew call blocks for up to five seconds (at least two). We are using the following code to initialize the audio unit:
OSStatus status;
AudioComponentDescription desc;
AudioComponent inputComponent;
desc.componentType = kAudioUnitType_Output;
desc.componentSubType = kAudioUnitSubType_VoiceProcessingIO;
desc.componentFlags = 0;
desc.componentFlagsMask = 0;
desc.componentManufacturer = kAudioUnitManufacturer_Apple;
inputComponent = AudioComponentFindNext(NULL, &desc);
status = AudioComponentInstanceNew(inputComponent, &unit);
We are having the issue with current macOS versions on a host of different Macs (Intel and Apple silicon alike). It takes two to three seconds until AudioComponentInstanceNew returns.
We also see the following errors in the log multiple times:
AUVPAggregate.cpp:2560 AggInpStreamsChanged wait failed
and those right after (which I don't know if they matter to this issue):
KeystrokeSuppressorCore.cpp:44 ERROR: KeystrokeSuppressor initialization was unsuccessful. Invalid or no plist was provided. AU will be bypassed.
vpStrategyManager.mm:486 Error code 2003332927 reported at GetPropertyInfo
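As a workaround while the call is slow, the blocking work can be moved off the main thread so the UI stays responsive. This is only a sketch of that pattern: `createVoiceProcessingUnit()` is a placeholder for the AudioComponentFindNext/AudioComponentInstanceNew sequence above, simulated here with a short sleep.

```swift
import Foundation
import Dispatch

// Placeholder for the slow AudioComponentInstanceNew sequence; the
// sleep stands in for the multi-second delay observed in practice.
func createVoiceProcessingUnit() -> Bool {
    Thread.sleep(forTimeInterval: 0.2)
    return true
}

let ready = DispatchSemaphore(value: 0)

// Do the potentially blocking creation on a background queue.
DispatchQueue.global(qos: .userInitiated).async {
    _ = createVoiceProcessingUnit()
    ready.signal()  // tell waiters the unit is usable
}

// The main thread stays free for UI work and only blocks, with a bound,
// at the point where the unit is actually needed.
let result = ready.wait(timeout: .now() + 5)
print(result == .success)  // prints "true"
```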
I'm developing an app which uses the "System Audio Recording Only" API to capture system audio.
Is there any API to check whether the app is authorized, so I can instruct the user to grant my app this permission?
Thanks.
Hello,
my app works as an AUv3 plugin.
I am interested in copying/pasting the Logic Pro chord track.
After I copy the chord track in Logic Pro and read UIPasteboard.general in the app, I can see:
["LogicPasteBoardMarker": <OS_dispatch_data: data[0x3024599c0] = { leaf, size = 1, buf = 0x10a758000 }>]
How can I access this data? Thank you.
Sometimes when I call AudioWorkIntervalCreate, the call hangs with the following stack trace. The call is made on the main thread.
mach_msg2_trap 0x00007ff801f0b3ce
mach_msg2_internal 0x00007ff801f19d80
mach_msg_overwrite 0x00007ff801f12510
mach_msg 0x00007ff801f0b6bd
HALC_Object_AddPropertyListener 0x00007ff8049ea43e
HALC_ProxyObject::HALC_ProxyObject(unsigned int, unsigned int, unsigned int, unsigned int) 0x00007ff8047f97f2
HALC_ProxyObjectMap::_CreateObject(unsigned int, unsigned int, unsigned int, unsigned int) 0x00007ff80490f69c
HALC_ProxyObjectMap::CopyObjectByObjectID(unsigned int) 0x00007ff80490ecd6
HALC_ShellPlugIn::_ReconcileDeviceList(bool, bool, std::__1::vector<unsigned int, std::__1::allocator<unsigned int>>&, std::__1::vector<unsigned int, std::__1::allocator<unsigned int>>&) 0x00007ff8045d68cf
HALB_CommandGate::ExecuteCommand(void () block_pointer) const 0x00007ff80492ed14
HALC_ShellObject::ExecuteCommand(void () block_pointer) const 0x00007ff80470f554
HALC_ShellPlugIn::ReconcileDeviceList(bool, bool) 0x00007ff8045d6414
HALC_ShellPlugIn::ConnectToServer() 0x00007ff8045d74a4
HAL_HardwarePlugIn_InitializeWithObjectID(AudioHardwarePlugInInterface**, unsigned int) 0x00007ff8045da256
HALPlugInManagement::CreateHALPlugIn(HALCFPlugIn const*) 0x00007ff80442f828
HALSystem::InitializeDevices() 0x00007ff80442ebc3
HALSystem::CheckOutInstance() 0x00007ff80442b696
AudioObjectAddPropertyListener_mac_imp 0x00007ff80469b431
auoop::WorkgroupManager_macOS::WorkgroupManager_macOS() 0x00007ff8040fc3d5
auoop::gWorkgroupManager() 0x00007ff8040fc245
AudioWorkIntervalCreate 0x00007ff804034a33
Using the hardware volume buttons on the iPhone, you have 16 steps you can adjust your volume to. I want to implement a volume control slider in my app. I am updating the value of the slider using AVAudioSession.sharedInstance().outputVolume. The problem is that this returns values rounded to the nearest 0.05 (the hundredths digit is always 0 or 5). This makes the slider jump around. (.formatted() is not causing this problem.)
You can recreate the problem using code below.
import SwiftUI
import Combine
import AVFoundation

@main
struct VolumeTestApp: App {
    init() {
        try? AVAudioSession.sharedInstance().setActive(true)
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    @State private var volume = Double()
    @State private var difference = Double()

    var body: some View {
        VStack {
            Text("The volume changed by \(difference.formatted())")
            Slider(value: $volume, in: 0...1)
        }
        .onReceive(AVAudioSession.sharedInstance().publisher(for: \.outputVolume), perform: { value in
            volume = Double(value)
        })
        .onChange(of: volume) { oldValue, newValue in // Only used to make the problem more obvious
            if oldValue > newValue {
                difference = oldValue - newValue
            } else {
                difference = newValue - oldValue
            }
        }
    }
}
Here is a video of the problem in action:
https://share.icloud.com/photos/00fmp7Vq1AkRetxcIP5EXeAZA
What am I doing wrong, or what can I do to avoid this?
Thank you
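For reference, the jumpiness can be reproduced with plain arithmetic, assuming (as described above) that the hardware moves in 16 equal steps of 1/16 = 0.0625 while the reported value is rounded to a 0.05 grid:

```swift
import Foundation

// Hardware volume moves in 16 equal steps of 1/16 = 0.0625 ...
let hardwareSteps = (0...16).map { Double($0) / 16.0 }

// ... but the observed values look rounded to the nearest 0.05,
// matching the behavior described in the post.
func roundedToNearestFiveHundredths(_ v: Double) -> Double {
    (v * 20).rounded() / 20
}
let reported = hardwareSteps.map(roundedToNearestFiveHundredths)

// Consecutive reported values differ by 0.05 or 0.10 instead of a
// steady 0.0625, which is why the slider jumps around. (Diffs are
// re-rounded to two decimals to suppress floating-point noise.)
let jumps = zip(reported.dropFirst(), reported)
    .map { ((($0 - $1) * 100).rounded()) / 100 }
print(Set(jumps).sorted())  // prints "[0.05, 0.1]"
```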
Tested with library songs on an app targeted to Mac (Designed for iPad).
The same app running on iOS queries the same library songs and the duration is expressed correctly in seconds, as expected for the TimeInterval type.
Xcode 15.3
macOS 14.5
FB13821671
Hi,
I am running into a trap. Please check the stack trace; how do I fix this?
Regards, Joël
stack trace with ExtAudioFileWrite