I'm working on a media app that would like to be able to tell whether the TV connected to the Apple TV is running at 59.94 Hz or 60.00 Hz, so it can optimize a video stream. It looks like the best I can currently do is check whether the user has Match Content Rate enabled and, when setting displayManager.preferredDisplayCriteria to change video modes, guess which rate their TV might end up in. That's not ideal, because not all TVs support both of these rates, and my request for 59.94 might end up as 60 and vice versa.
I dug around and can't find any available method in UIScreen to get this info. The odd thing is, the data is right there in currentMode when I look in the debugger, but it seems to be in a private or undocumented class. Is there any way to get at it?
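For context, here's roughly what I'm doing today when Match Content Rate is on (a sketch only; the explicit AVDisplayCriteria initializer needs a recent tvOS, and .sdr just happens to match my stream):

import AVKit
import UIKit

// Ask tvOS to switch the TV into a mode close to `rate`. As far as I can
// tell, there's no way to confirm whether the panel actually lands on
// 59.94 or 60.00.
func requestRefreshRate(_ rate: Float, in window: UIWindow) {
    let displayManager = window.avDisplayManager
    guard displayManager.isDisplayCriteriaMatchingEnabled else {
        // Match Content Rate is off; the TV stays in its current mode.
        return
    }
    displayManager.preferredDisplayCriteria = AVDisplayCriteria(
        refreshRate: rate,
        videoDynamicRange: .sdr
    )
}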
I have an iPad app that I want to run on Apple Silicon Macs.
Everything works fine except for VNDocumentCameraViewController. According to the docs this class is available on:
iOS 13.0+ iPadOS 13.0+ Mac Catalyst 13.1+ visionOS 1.0+
yet when I try to use it, I get "Document camera is not available" on my Mac Studio running macOS 15.2.
Is this expected behaviour?
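For reference, this is the guard I've added around presenting the scanner (a minimal sketch; my assumption is that the "Document camera is not available" message corresponds to isSupported returning false on this hardware):

import UIKit
import VisionKit

func presentScannerIfAvailable(from presenter: UIViewController) {
    // isSupported reports whether document scanning works on the current device.
    guard VNDocumentCameraViewController.isSupported else {
        print("Document camera is not supported on this device")
        return
    }
    let scanner = VNDocumentCameraViewController()
    // scanner.delegate = ... (omitted for brevity)
    presenter.present(scanner, animated: true)
}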
Thanks
Hello,
I am experiencing slow image retrieval when using the requestImageForAsset:targetSize:contentMode:options:resultHandler: method in my application. The delay is significantly impacting the performance of my app.
Here are the details of my implementation:
for (PHAsset *asset in assets) {
    @autoreleasepool {
        PHImageManager *imageManager = [PHImageManager defaultManager];
        PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
        options.synchronous = YES;
        options.deliveryMode = PHImageRequestOptionsDeliveryModeFastFormat;
        options.resizeMode = PHImageRequestOptionsResizeModeNone;
        [imageManager requestImageForAsset:asset
                                targetSize:CGSizeMake(100, 100)
                               contentMode:PHImageContentModeAspectFill
                                   options:options
                             resultHandler:^(UIImage *thumbnail, NSDictionary *info) {
            CameraRollCellDto *cellDto = [[CameraRollCellDto alloc] init];
            cellDto.index = index;
            cellDto.thumbnail = thumbnail;
            cellDto.propertyDate = asset.creationDate;
            if (self.segmentedTorikomi.selectedSegmentIndex == SEG_INDEX_IKKATSU) {
                cellDto.isSelected = YES;
            } else {
                cellDto.isSelected = NO;
            }
            [list addObject:cellDto];
        }];
        index++;
    }
}
Has anyone else encountered this issue? Are there any known solutions or optimizations that can help improve the speed of image retrieval using this method?
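One direction I'm considering is batching the requests through PHCachingImageManager with asynchronous, opportunistic delivery (a sketch in Swift, assuming a fixed 100×100 target size; I haven't measured it against my data set yet):

import Photos
import UIKit

let cachingManager = PHCachingImageManager()
let thumbnailSize = CGSize(width: 100, height: 100)

func loadThumbnails(for assets: [PHAsset],
                    completion: @escaping (Int, UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .opportunistic   // fast degraded image first, better one later
    options.resizeMode = .fast
    options.isNetworkAccessAllowed = false  // skip slow iCloud round trips for thumbnails

    // Warm the cache so later requests hit already-decoded thumbnails.
    cachingManager.startCachingImages(for: assets,
                                      targetSize: thumbnailSize,
                                      contentMode: .aspectFill,
                                      options: options)

    for (index, asset) in assets.enumerated() {
        cachingManager.requestImage(for: asset,
                                    targetSize: thumbnailSize,
                                    contentMode: .aspectFill,
                                    options: options) { image, _ in
            completion(index, image)
        }
    }
}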
Thank you for your assistance.
Topic:
Media Technologies
SubTopic:
Photos & Camera
Hello! I am trying to determine the best approach with AVPlayer for implementing auto-play, that is, playback that automatically starts without user initiation. Ideally this would work for both local and streaming audio.
My current approach uses KVO on an AVPlayerItem's status, waiting for it to become readyToPlay, but I was wondering whether there is a better property or state to use, or whether this use case is already handled when automaticallyWaitsToMinimizeStalling is true, so that I could simply write:
player.replaceCurrentItem(with: AVPlayerItem(url: streamingUrl))
player.rate = 1
or
let playerItem = AVPlayerItem(url: streamingUrl)
player = AVPlayer(playerItem: playerItem)
player.rate = 1
and expect the item to be auto-played when ready.
In the context of user-initiated playback, I've typically seen code that makes a button's enabled state contingent on player.currentItem.duration, e.g. in AVFoundationSimplePlayer-iOS. On the other hand, AVAutoWait, which utilizes automaticallyWaitsToMinimizeStalling, does not seem to do this.
As a side note, I am not using an AVQueuePlayer.
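For reference, this is the observation-based version I have today (a sketch with modern KVO; the commented-out line reflects my assumption that setting rate = 1 up front just means "start as soon as the item can play without stalling"):

import AVFoundation

final class AutoPlayer {
    private let player = AVPlayer()
    private var statusObservation: NSKeyValueObservation?

    func autoPlay(url: URL) {
        let item = AVPlayerItem(url: url)
        player.automaticallyWaitsToMinimizeStalling = true

        // Option A: observe readiness explicitly, then start playback.
        statusObservation = item.observe(\.status, options: [.new]) { [weak self] item, _ in
            switch item.status {
            case .readyToPlay:
                self?.player.play()
            case .failed:
                print("Item failed:", item.error ?? "unknown error")
            default:
                break
            }
        }
        player.replaceCurrentItem(with: item)

        // Option B (what I'm asking about): skip the observation and rely on
        // the player to begin once it can sustain playback.
        // player.rate = 1
    }
}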
Topic:
Media Technologies
SubTopic:
Streaming
Hey folks, I'm running into an odd issue suddenly with an app that had a working MusicKit integration before.
I'm using ApplicationMusicPlayer to play Apple Music albums and songs. I'm testing on a physical device, signed in to Apple ID, and with a valid subscription. Apple Music via the first-party app works entirely fine on this device.
Attempting to play back any content at all gives the log:
<ICUserIdentityStoreACAccountBackend: 0x1070bf3e0> Failed to initialize primary apple account, error=Error Domain=ICError Code=-7013 "Client is not entitled to access account store" UserInfo={NSDebugDescription=Client is not entitled to access account store}
[ICUserIdentityStore] - initializing account histories with activeAccountDSID = nil, activeLockerAccountDSID = nil, timestamp = 14605951908
[ICUserIdentityStore] Failed to fetch local store account with error: Error Domain=ICError Code=-7013 "Client is not entitled to access account store" UserInfo={NSDebugDescription=Client is not entitled to access account store}.
The album artwork, track names, etc, all appear in the control center playback controls, but the music doesn't play. Trying to trigger playback with control center just results in it skipping to the next track, which doesn't play either.
This exact code used to work. I have the MusicKit service selected in Apple Connect. Since this isn't entitlement-based, I'm not sure how else to check that I'm set up correctly.
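For reference, the playback path is roughly this (simplified; the album value stands in for whatever my catalog request returned, and authorization succeeds at this point):

import MusicKit

func play(_ album: Album) async {
    guard await MusicAuthorization.request() == .authorized else { return }

    let player = ApplicationMusicPlayer.shared
    player.queue = [album]          // queue the album for playback
    do {
        try await player.play()     // this is where playback silently fails for me
    } catch {
        print("Playback failed:", error)
    }
}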
I've tried deleting/reinstalling the app, restarting the device, cleaning/rebuilding, and deleting DerivedData, to no avail.
Any help?
Running Xcode 16.4 (16F6), testing on iOS 18.5 (22F76)
Hi. I am working on an audio app for iOS. I have implemented UI and handling which allows the user to change playback rate of audio. When the user selects a different rate, I update the rate property on my AVQueuePlayer. This is working well on device.
When I use AirPlay, it works for some devices and not for others. Some devices won't change playback rate and will always play at 1x speed.
Is this possibly a limitation of those third-party devices? Or is there something I'm missing or should check? I would love to get playback-rate changes working across all AirPlay devices with our app.
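In case it's relevant, this is roughly how the rate change is applied (a sketch; the explicit time-pitch algorithm and defaultRate lines are things I'm experimenting with, not something I've confirmed makes a difference over AirPlay):

import AVFoundation

func setPlaybackRate(_ rate: Float, on player: AVQueuePlayer) {
    // Use a time-pitch algorithm that supports arbitrary rates without
    // changing pitch; .varispeed also works but shifts pitch with speed.
    player.currentItem?.audioTimePitchAlgorithm = .timeDomain

    // defaultRate (iOS 16+) keeps the chosen speed across pause/resume.
    if #available(iOS 16.0, *) {
        player.defaultRate = rate
    }
    player.rate = rate
}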
Kind regards.
Among the millions of users of our online product, our metrics show that the rate of silent captured audio on iPadOS 18.4.1 and 18.5 has increased abnormally. However, we are unable to reproduce the issue. Has anyone encountered something similar? The parameters we use are as follows:
AudioSession:
category:AVAudioSessionCategoryPlayAndRecord
mode:AVAudioSessionModeDefault
option:77
preferredSampleRate:48000.000000
preferredIOBufferDuration:0.010000
AudioUnit
format.mFormatID = kAudioFormatLinearPCM;
format.mSampleRate = 48000.0;
format.mChannelsPerFrame = 2;
format.mBitsPerChannel = 16;
format.mFramesPerPacket = 1;
format.mBytesPerFrame = format.mChannelsPerFrame * 16 / 8;
format.mBytesPerPacket = format.mBytesPerFrame * format.mFramesPerPacket;
format.mFormatFlags = kAudioFormatFlagsNativeEndian | kLinearPCMFormatFlagIsPacked | kLinearPCMFormatFlagIsSignedInteger;
component.componentType = kAudioUnitType_Output;
component.componentSubType = kAudioUnitSubType_RemoteIO;
component.componentManufacturer = kAudioUnitManufacturer_Apple;
component.componentFlags = 0;
component.componentFlagsMask = 0;
We integrate with FCP X using a custom share destination and the AppleScript interface. This had been working fine until the recent version 11 update of FCP X.
With this update we no longer receive the open event when the export has completed. We get the Apple event to create the asset, and the file is exported to the location we set in the response; there is just no open event after that. I suspect something is wrong with our scripting support, but I have no idea what, or how to troubleshoot it.
This works fine in 10.8.1 and below.
This is my native module implementation.
I'm receiving a base64-encoded PCM string from the server and passing it to my native PCM player module to play the audio.
App.tsx
PcmPlayer.writeChunk(e.data);
PcmPlayer.swift
import AVFoundation
@objc(PcmPlayer)
class PcmPlayer: RCTEventEmitter {
    private var engine: AVAudioEngine?
    private var playerNode: AVAudioPlayerNode?
    private var format: AVAudioFormat?
    private var bufferQueue = [Data]()
    private var isPlaying = false
    private var hasEnded = false
    private var scheduledBufferCount = 0
    private let minBufferBytes = 50000
    private let pcmQueue = DispatchQueue(label: "pcm.queue")

    override init() {
        super.init()
    }

    override func supportedEvents() -> [String]! {
        return ["onStatus", "onMessage"]
    }

    @objc(initPlayer:channels:bitsPerSample:)
    func initPlayer(_ sampleRate: NSNumber,
                    channels: NSNumber,
                    bitsPerSample: NSNumber) {
        pcmQueue.async {
            self.stopInternal()

            let session = AVAudioSession.sharedInstance()
            do {
                try session.setCategory(.playback, mode: .default, options: [])
                try session.setActive(true, options: .notifyOthersOnDeactivation)
                try session.setMode(.default)
                print("🔈 Audio session active. Output route:", session.currentRoute.outputs)
            } catch {
                print("❌ Audio session setup failed:", error)
                return
            }

            self.engine = AVAudioEngine()
            self.playerNode = AVAudioPlayerNode()
            guard let engine = self.engine, let playerNode = self.playerNode else {
                print("❌ Engine or playerNode is nil")
                return
            }
            engine.attach(playerNode)

            self.format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                        sampleRate: sampleRate.doubleValue,
                                        channels: AVAudioChannelCount(channels.uintValue),
                                        interleaved: false)
            guard let format = self.format else {
                print("❌ Failed to create AVAudioFormat")
                return
            }
            engine.connect(playerNode, to: engine.mainMixerNode, format: format)

            do {
                try engine.start()
                playerNode.play()
                engine.mainMixerNode.outputVolume = 1.0
                print("✅ AVAudioEngine started with format:", format)
            } catch {
                print("❌ Engine start failed:", error)
            }
            self.hasEnded = false
        }
    }

    @objc(writeChunk:)
    func writeChunk(_ base64Pcm: String) {
        pcmQueue.async {
            guard base64Pcm.count >= 10 else {
                print("⚠️ Skipping short base64 string")
                return
            }
            var padded = base64Pcm
            let mod4 = base64Pcm.count % 4
            if mod4 > 0 {
                padded += String(repeating: "=", count: 4 - mod4)
            }
            guard let data = Data(base64Encoded: padded, options: .ignoreUnknownCharacters) else {
                print("❌ Failed to decode base64")
                return
            }
            self.bufferQueue.append(data)
            print("📥 Received PCM chunk (\(data.count) bytes)")
            print("📥 writeChunk called. isPlaying=\(self.isPlaying), bufferQueue.count=\(self.bufferQueue.count)")

            if !self.isPlaying {
                self.isPlaying = true
                self.waitForBufferAndStartPlayback()
            } else if self.scheduledBufferCount == 0 {
                self.isPlaying = true
                self.waitForBufferAndStartPlayback()
            }
        }
    }

    private func waitForBufferAndStartPlayback() {
        DispatchQueue.global().async {
            while self.queueSize() < self.minBufferBytes && !self.hasEnded {
                Thread.sleep(forTimeInterval: 0.01)
            }
            self.writeLoop()
        }
    }

    private func writeLoop() {
        DispatchQueue.global().async {
            writeLoop: while self.isPlaying {
                if self.bufferQueue.isEmpty {
                    for _ in 0..<100 {
                        Thread.sleep(forTimeInterval: 0.01)
                        if !self.bufferQueue.isEmpty { break }
                    }
                    if self.bufferQueue.isEmpty {
                        print("🔇 No more data to play after waiting")
                        self.isPlaying = false
                        break writeLoop
                    }
                }

                var data: Data?
                self.pcmQueue.sync {
                    if !self.bufferQueue.isEmpty {
                        data = self.bufferQueue.removeFirst()
                    }
                }
                guard let chunk = data else {
                    print("⚠️ No data to process")
                    continue
                }

                if let buffer = self.pcmBufferFromData(chunk) {
                    self.scheduledBufferCount += 1
                    self.playerNode?.scheduleBuffer(buffer, completionHandler: {
                        self.pcmQueue.async {
                            self.scheduledBufferCount -= 1
                            if self.bufferQueue.isEmpty && self.scheduledBufferCount == 0 {
                                print("ℹ️ Playback idle - waiting for more data")
                                self.isPlaying = false
                            }
                        }
                    })
                }
            }
        }
    }

    private func pcmBufferFromData(_ data: Data) -> AVAudioPCMBuffer? {
        guard let format = self.format else { return nil }
        let frameCount = UInt32(data.count / 2)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else {
            print("❌ Failed to create AVAudioPCMBuffer")
            return nil
        }
        buffer.frameLength = frameCount
        guard let floatChannelData = buffer.floatChannelData?[0] else {
            print("❌ floatChannelData is nil")
            return nil
        }
        data.withUnsafeBytes { (rawBuffer: UnsafeRawBufferPointer) in
            let int16Buffer = rawBuffer.bindMemory(to: Int16.self)
            let count = min(int16Buffer.count, Int(frameCount))
            for i in 0..<count {
                floatChannelData[i] = Float32(int16Buffer[i]) / Float32(Int16.max)
            }
        }
        return buffer
    }

    @objc(stopPlayer)
    func stopPlayer() {
        pcmQueue.async {
            self.stopInternal()
        }
    }

    private func stopInternal() {
        print("🛑 stopInternal called")
        self.playerNode?.stop()
        self.engine?.stop()
        self.engine?.reset()
        self.playerNode = nil
        self.engine = nil
        self.format = nil
        self.bufferQueue.removeAll()
        self.isPlaying = false
        self.hasEnded = true
        self.scheduledBufferCount = 0
    }

    @objc(canWrite:rejecter:)
    func canWrite(_ resolve: @escaping RCTPromiseResolveBlock,
                  rejecter reject: RCTPromiseRejectBlock) {
        pcmQueue.async {
            resolve(self.bufferQueue.count < 20)
        }
    }

    @objc(flushPlayer:rejecter:)
    func flushPlayer(_ resolve: @escaping RCTPromiseResolveBlock,
                     rejecter reject: RCTPromiseRejectBlock) {
        pcmQueue.async {
            self.bufferQueue.removeAll()
            resolve(nil)
        }
    }

    @objc
    static override func requiresMainQueueSetup() -> Bool {
        return false
    }

    private func queueSize() -> Int {
        return pcmQueue.sync {
            return self.bufferQueue.reduce(0) { $0 + $1.count }
        }
    }
}
I can't hear any audio on my real iOS device, although it works fine on the simulator.
Topic:
Media Technologies
SubTopic:
Streaming
In Final Cut Pro, keyframes for transform parameters (such as Position, Scale, and Rotation) are automatically set to “Smooth” interpolation. This often results in undesired easing between keyframes, especially when linear motion is required.
Currently, we have to manually adjust each keyframe to "Linear" using the Video Animation Editor, which can be time-consuming when working with many keyframes.
Would it be possible to add an option to set the default keyframe interpolation to "Linear"—either globally in Preferences or per parameter in the Inspector?
This would greatly streamline the animation workflow for many editors.
Thank you for considering this request!
I'm developing a music app and running into a weird issue: I can see Now Playing info on every platform except tvOS. As far as I can tell, I have correctly configured MPNowPlayingInfoCenter:
MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
MPNowPlayingInfoCenter.default().playbackState = .playing
Are there any extra requirements to get my app's Now Playing info showing in Control Center on tvOS? Another strange issue that might be related: I can use the Apple TV remote to pause audio but not to resume playback, so I feel like I'm missing something about registering audio playback on tvOS specifically.
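In case it matters, here's the remote-command setup I'd expect to need (a sketch; my assumption is that tvOS won't surface Now Playing controls or resume from the remote unless the play/pause commands have registered handlers):

import AVFoundation
import MediaPlayer

func configureRemoteCommands(for player: AVPlayer) {
    let center = MPRemoteCommandCenter.shared()

    center.playCommand.addTarget { _ in
        player.play()
        return .success
    }
    center.pauseCommand.addTarget { _ in
        player.pause()
        return .success
    }
}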
In iOS 18, CarPlay shows an error: “There was a problem loading this content” after playback starts. Audio works fine, but the Now Playing screen doesn’t load. I’m using MPPlayableContentManager. This worked fine in iOS 17. Anyone else seeing this error in iOS 18?
Howdy. I'm trying to access media from a user's song library and receive:
<ICUserIdentityStoreACAccountBackend: 0x148f8af30> Failed to initialize active account, error=Error Domain=ICError Code=-7013 "Client is not entitled to access account store" UserInfo={NSDebugDescription=Client is not entitled to access account store}
I'm told I need to add a Media Library Access Capability. Nothing like this shows up in Xcode under Signing & Capabilities > +Capabilities. Also I can't find anything like this in my account in dev.apple.com.
How do I enable myself and a test user using another iPhone device to access my music and their music respectively?
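For reference, here's the authorization path I'm attempting (a minimal sketch; it assumes an NSAppleMusicUsageDescription string in Info.plist, and it only covers asking for permission, not the capability question above):

import MediaPlayer
import MusicKit

func requestLibraryAccess() {
    // Authorization for the on-device media library.
    MPMediaLibrary.requestAuthorization { status in
        print("Media library authorization:", status.rawValue)
    }

    // MusicKit authorization for Apple Music catalog / cloud library access.
    Task {
        let musicStatus = await MusicAuthorization.request()
        print("MusicKit authorization:", musicStatus)
    }
}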
Thanks!
Topic:
Media Technologies
SubTopic:
General
Tags:
App Tracking Transparency
Media Player
iOS
MusicKit
Hello, I'm trying to create a web browser, but currently, when signed into the Apple Music web player, I get the following message whenever I attempt to play in any version of my browser:
Not available on the web
You can listen to this in the Apple Music app.
Is there a way to set up DRM (assuming this is the issue) with Apple to allow my web browser to play this content?
I believe Apple TV is also affected.
Thank you ahead of time.
I am working on an app that needs to play an audio alert while it is in the background.
During debug testing the feature works fine,
but when I run the app standalone, without the debugger attached, it does not work: the sound only plays once I bring the app back to the foreground.
Is there any way to trace this issue?
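For reference, this is the minimal setup I understand is needed for background playback (a sketch; it assumes the "audio" entry is present under UIBackgroundModes in Info.plist and that the sound is played through an API that uses the shared audio session):

import AVFoundation

// Configure the session once, e.g. at launch. With the "audio" background
// mode enabled, audio that is already playing through this session can
// continue after the app moves to the background.
func configureBackgroundAudio() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback, mode: .default, options: [])
        try session.setActive(true)
    } catch {
        print("Audio session setup failed:", error)
    }
}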
I am new to using Swift for a Mac application. I am trying to control an external UVC-compliant camera's focus and other capabilities. However, I'm having trouble with this and don't know where to start. I have downloaded an application from the App Store that can control the focus and other capabilities.
I've tried IOKit, but it seems complicated and I couldn't get it to return any capabilities or control the camera.
I also tried AVFoundation and was able to open the camera, but the following code did not work for me: device.isFocusPointOfInterestSupported returns false, and without that check the app crashes.
@IBAction func focusChanged(_ sender: NSSlider) {
    do {
        guard let device = videoDevice else { return }
        try device.lockForConfiguration()

        // Check if focus mode and point of interest are supported
        if device.isFocusModeSupported(.locked) {
            device.focusMode = .locked
        }
        if device.isFocusPointOfInterestSupported {
            // Map the slider value (0.0 to 1.0) to the focus point's X coordinate
            let focusX = CGFloat(sender.doubleValue)
            let focusPoint = CGPoint(x: focusX, y: 0.5) // Y coordinate is typically 0.5 (centered vertically)
            device.focusPointOfInterest = focusPoint
        } else {
            print("Focus point of interest is not supported on this device.")
        }
        device.unlockForConfiguration()

        // Log focus settings
        print("Focus point: \(device.focusPointOfInterest)")
        print("Focus mode: \(device.focusMode.rawValue)")
    } catch {
        print("Error adjusting focus: \(error)")
    }
}
Any help or advice is much appreciated.
When using the Apple Devices app to sync Apple Music to an iPhone, where is the Apple Devices backup written?
Apple Devices->music->sync.
I am not trying to back up the iPhone via the Apple Devices app.
Hi,
Not sure if this is the right forum to ask this question in, but could you please advise if I can use Apple Digital Masters logo (badge) in my iOS app that is playing music from Apple Music service?
Topic:
Media Technologies
SubTopic:
Audio
My app is a camera app that supports Picture-in-Picture (PiP) mode.
Normally, when the device rotates, I get the device orientation from iOS and use it to rotate the camera feed so that the preview stays correctly aligned.
However, when the app enters PiP mode, it is considered to be in the background, and I can no longer receive orientation updates from the system.
As a result, I can’t apply rotation corrections to the camera video in PiP mode.
Is there any way to retrieve device orientation while the app is in the background (specifically during PiP mode)?
Any guidance would be greatly appreciated.
Thank you!
I am developing a VOD playback app, but when I play video on an external monitor connected via a Lightning-to-HDMI adapter on iOS 18 or later, the screen goes dark and I cannot see playback.
The app does not detect the HDMI connection and present a separate player on the external display; it simply mirrors the screen.
We have confirmed that the same issue occurs with some other services, while playback works with others, such as Apple TV.
Please let us know if there are any other required settings, such as video certificates, needed for playback.
We would also like to know whether this problem is specific to iOS 18 and later.
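For what it's worth, one thing we are experimenting with (a sketch; our assumption is that letting AVPlayer present video directly on the external display, rather than relying on mirroring, may behave differently for protected content):

import AVFoundation

func configureExternalPlayback(for player: AVPlayer) {
    // Allow AVPlayer to take over the connected display for video
    // instead of relying on screen mirroring.
    player.allowsExternalPlayback = true
    player.usesExternalPlaybackWhileExternalScreenIsActive = true
}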
Topic:
Media Technologies
SubTopic:
Audio