Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic

AVPlayer HLS: Restrict Stream Resolution
Hello, we have an HLS streaming app and we use AVPlayer for playback. We want to implement a dynamic resolution feature driven by the user's selection: for example, if the user selects 1080p, the player should play only the 1080p variant. We have tried the preferredMaximumResolution and preferredPeakBitRate parameters, but AVPlayer does not enforce them: with preferredMaximumResolution = CGSize(width: 1920, height: 1080), the player does not stick to the 1080p profile and drops to 720p, which we do not want when the user has explicitly selected 1080p. Is there any method to force a single variant, even if the stream stalls? Thank you in advance.
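For reference, a minimal sketch of how these two properties are typically applied (the URL and bitrate value are illustrative). Note that both act as upper bounds rather than fixed targets, which matches the behavior described in the post:

import AVFoundation

// Sketch: cap variant selection on an AVPlayerItem.
// Both properties are ceilings, not fixed targets; AVPlayer may still
// drop to a lower variant when bandwidth degrades, and there is no
// public API to pin playback to a single HLS variant.
let url = URL(string: "https://example.com/stream/master.m3u8")! // illustrative URL
let item = AVPlayerItem(url: url)
item.preferredMaximumResolution = CGSize(width: 1920, height: 1080)
item.preferredPeakBitRate = 8_000_000 // illustrative; match the 1080p variant's BANDWIDTH
let player = AVPlayer(playerItem: item)
player.play()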
0 replies · 0 boosts · 540 views · Oct ’24

API to switch the mode of AirPods Pro 2
Hi, may I ask if there is any API or similar mechanism inside an iOS app to set or switch the Transparency and ANC (Active Noise Cancellation) modes of the AirPods Pro 2? One workaround is to create a Shortcut and activate it from the app, but that requires setting up the shortcut manually, which is not convenient. Thanks for any advice!
0 replies · 1 boost · 210 views · Nov ’24

AVAudioUnitTimePitch: speeding up introduces artifacts
For an upcoming update of one of my apps, I'm facing an issue: the .rate parameter of an AVAudioUnitTimePitch allows me to slow down an audio track without any problems. Setting .rate to 0.7 or 0.8 results in almost perfect playback without changing pitch. However, whenever the .rate parameter is greater than 1 (e.g. 1.1 or 1.15), I start to hear audio artifacts ("fluttering") in the output, which is not so nice (even at .overlap = 32). Intuitively, I would have thought that speeding up the file should introduce fewer artifacts than slowing it down. I've tried different sample rates (44.1 kHz and 48 kHz), but got the same result. Grateful for any input on this 🙏
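For context, a minimal sketch of the kind of chain being described (the resource name is illustrative); the artifact question above concerns the timePitch.rate setting:

import AVFoundation

// Sketch: play a file through AVAudioUnitTimePitch at a faster rate.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let timePitch = AVAudioUnitTimePitch()
timePitch.rate = 1.15   // > 1.0 speeds playback up without changing pitch
timePitch.overlap = 32  // maximum overlap; per the post, artifacts remain audible

engine.attach(player)
engine.attach(timePitch)

do {
    // "track.caf" is an illustrative resource name
    let fileURL = Bundle.main.url(forResource: "track", withExtension: "caf")!
    let file = try AVAudioFile(forReading: fileURL)
    engine.connect(player, to: timePitch, format: file.processingFormat)
    engine.connect(timePitch, to: engine.mainMixerNode, format: file.processingFormat)
    try engine.start()
    player.scheduleFile(file, at: nil)
    player.play()
} catch {
    print("Audio setup failed:", error)
}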
0 replies · 0 boosts · 404 views · Nov ’24

How to correctly set up AVSampleBufferDisplayLayer
How can I correctly set up an AVSampleBufferDisplayLayer for video display when my input picture format is kCVPixelFormatType_32BGRA? Currently the video is visible in the Simulator but not on an iPhone; am I missing something? Render code:

var pixelBuffer: CVPixelBuffer?
let attrs: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
    kCVPixelBufferWidthKey as String: width,
    kCVPixelBufferHeightKey as String: height,
    kCVPixelBufferBytesPerRowAlignmentKey as String: width * 4,
    kCVPixelBufferIOSurfacePropertiesKey as String: [:]
]
let status = CVPixelBufferCreateWithBytes(
    nil, width, height,
    kCVPixelFormatType_32BGRA,
    img, width * 4,
    nil, nil,
    attrs as CFDictionary,
    &pixelBuffer
)
guard status == kCVReturnSuccess, let pb = pixelBuffer else { return }

var formatDesc: CMVideoFormatDescription?
CMVideoFormatDescriptionCreateForImageBuffer(
    allocator: nil,
    imageBuffer: pb,
    formatDescriptionOut: &formatDesc
)
guard let format = formatDesc else { return }

var timingInfo = CMSampleTimingInfo(
    duration: .invalid,
    presentationTimeStamp: currentTime,
    decodeTimeStamp: .invalid
)
var sampleBuffer: CMSampleBuffer?
CMSampleBufferCreateForImageBuffer(
    allocator: kCFAllocatorDefault,
    imageBuffer: pb,
    dataReady: true,
    makeDataReadyCallback: nil,
    refcon: nil,
    formatDescription: format,
    sampleTiming: &timingInfo,
    sampleBufferOut: &sampleBuffer
)
if let sb = sampleBuffer {
    if CMSampleBufferGetPresentationTimeStamp(sb) == .invalid {
        print("Invalid video timestamp")
    }
    if displayLayer.status == .failed {
        displayLayer.flush()
    }
    DispatchQueue.main.async { [weak self] in
        guard let self = self else {
            print("Lost reference to self drawing")
            return
        }
        self.displayLayer.enqueue(sb)
    }
    frameIndex += 1
}
0 replies · 0 boosts · 63 views · Apr ’25

Anyone know the output power of the headphone jack of a MacBook Pro for each percentage of volume?
Hello! I'm trying to create a headphone-safety prototype that gives warnings if I listen to music too loud, by inputting my headphones' impedance, sensitivity, and target SPL level. All I need is data on the amount of power each volume percentage outputs (I'm assuming the MacBook Pro has a 1-100% volume scale). If anyone has this info, or can direct me to someone who does, that would be great! Also, do I contact Apple Support for things like this? I'm not too sure... Thanks!!
0 replies · 0 boosts · 298 views · Oct ’24

Apple TV: detecting that an HDMI-connected device has been turned off
Hello, we have an HLS streaming app on Apple TV. Our streams are DRM protected. We have a problem when the output device is turned off: for example, a user starts watching our DRM-protected HLS content, and after some time turns off the display connected via HDMI (a monitor or TV). Our app does not detect that the HDMI output device has been turned off. Is there any way in Swift to detect that the HDMI-connected device has been powered off?
0 replies · 0 boosts · 380 views · Jan ’25

VNDocumentCameraViewController not working on macOS
I have an iPad app that I want to run on Apple Silicon Macs. Everything works fine except for VNDocumentCameraViewController. According to the docs, this class is available on iOS 13.0+, iPadOS 13.0+, Mac Catalyst 13.1+, and visionOS 1.0+, yet when I try using it I get "Document camera is not available" on my Mac Studio running macOS 15.2. Is this expected behaviour? Thanks
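A minimal sketch of the availability check worth running before presenting the scanner (the presenting function is illustrative); VNDocumentCameraViewController.isSupported reflects whether the document camera can be used on the current device:

import UIKit
import VisionKit

// Sketch: guard on isSupported before presenting the document scanner.
func presentScanner(from presenter: UIViewController,
                    delegate: VNDocumentCameraViewControllerDelegate) {
    guard VNDocumentCameraViewController.isSupported else {
        // This is presumably the state the post describes on a Mac Studio.
        print("Document camera is not available on this device")
        return
    }
    let scanner = VNDocumentCameraViewController()
    scanner.delegate = delegate
    presenter.present(scanner, animated: true)
}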
0 replies · 0 boosts · 341 views · Jan ’25

Ambisonic B-Format Playback Issues on Vision Pro
I'm trying to implement Ambisonic B-Format audio playback on Vision Pro with head tracking. So far audio plays, head tracking works, and the sound appears to be stereo. The problem is that it is not proper binaural playback when compared to playing back the audio file in a DAW. Has anyone successfully implemented B-Format playback on Vision Pro? Any suggestions on my current implementation:

func playAmbiAudioForum() async {
    do {
        try AVAudioSession.sharedInstance().setCategory(.playback)
        try AVAudioSession.sharedInstance().setActive(true)

        // Audio file loading/preparation
        guard let testFileURL = Bundle.main.url(forResource: "audiofile", withExtension: "wav") else {
            print("Test file not found")
            return
        }
        let audioFile = try AVAudioFile(forReading: testFileURL)
        let audioFileFormat = audioFile.fileFormat

        // Create AVAudioFormat with Ambisonic B-Format channel layout
        guard let layout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_Ambisonic_B_Format) else {
            print("layout failed")
            return
        }
        let format = AVAudioFormat(
            commonFormat: audioFile.processingFormat.commonFormat,
            sampleRate: audioFile.fileFormat.sampleRate,
            interleaved: false,
            channelLayout: layout
        )

        // Read the audio file into a buffer
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: UInt32(audioFile.length)) else {
            print("buffer failed")
            return
        }
        try audioFile.read(into: buffer)

        playerNode.renderingAlgorithm = .HRTF

        // Connect nodes
        audioEngine.attach(playerNode)
        audioEngine.connect(playerNode, to: audioEngine.outputNode, format: format)
        audioEngine.prepare()

        playerNode.scheduleBuffer(buffer, at: nil) {
            print("File finished playing")
        }
        try audioEngine.start()
        playerNode.play()
    } catch {
        print("Setup error:", error)
    }
}
0 replies · 0 boosts · 433 views · Jan ’25

How to change PHLivePhoto EXIF metadata
I have an app that allows the user to change a photo's EXIF metadata. To do this, I request a content editing input, get the full-size image, modify its properties, create a content editing output, write the output image to the rendered content URL, then call performChanges on the PHPhotoLibrary, creating an asset change request for that asset and setting its content editing output. This works as expected for regular photos, but Live Photos get converted to regular photos. To address this, I do something similar by changing the properties of the .photo frame of the Live Photo: I detect when the content editing input has a Live Photo, create a Live Photo editing context, set a frame processor that (when the frame type is .photo) returns the frame's image with its properties set to the updated properties, then create the content editing output and save the Live Photo to that output. It modifies the Live Photo successfully, but the metadata is not updated: if you get the full-size image again, the properties are the original ones, and if you look at the EXIF metadata using an app like Metapho, it remains unchanged. What am I doing wrong here? Thanks!

let imageURL = contentEditingInput.fullSizeImageURL!
let inputImage = CIImage(contentsOf: imageURL, options: [.applyOrientationProperty: true])!
var metadata: [AnyHashable: Any] = inputImage.properties
// Edit the metadata as desired...

let editingContext = PHLivePhotoEditingContext(livePhotoEditingInput: contentEditingInput)!
editingContext.frameProcessor = { frame, error -> CIImage? in
    // Edit only the still photo
    if frame.type == .photo {
        return frame.image.settingProperties(metadata)
    }
    return frame.image
}

let contentEditingOutput = try await withCheckedThrowingContinuation { continuation in
    let editingOutput = PHContentEditingOutput(contentEditingInput: contentEditingInput)
    editingOutput.adjustmentData = adjustmentData
    editingContext.saveLivePhoto(to: editingOutput) { success, error in
        if success {
            continuation.resume(returning: editingOutput)
        } else {
            continuation.resume(throwing: error!)
        }
    }
}

try await PHPhotoLibrary.shared().performChanges {
    let request = PHAssetChangeRequest(for: asset)
    request.contentEditingOutput = contentEditingOutput
}
0 replies · 0 boosts · 521 views · Nov ’24

changePlaybackRateCommand does not work on iOS.
Hi. I work on an audio app for iOS which successfully uses MPRemoteCommandCenter for commands like next, back, skip forward, and skip backward. I am trying to implement playback rate controls (so that users can change the playback speed of audio to 0.5x or 2x, for example). While the above commands work, changePlaybackRateCommand does not seem to. I have enabled the command, given it a target/handler, and set supported rates. For the other commands, this caused the UI on the Lock Screen, in Control Center, etc. to change by adding a control for the command (a next button for the next command, for example). However, it does not seem to do anything for the playback rate command. I can implement my own "rate button" UI and rate-change handling, but I'm wondering if this is a known bug on Apple's side? Looking online, other people seem to face the same issue and haven't been able to get this command to work. Why is this API provided if it doesn't seem to do anything? Is there something I'm missing? Kind regards.
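For reference, a minimal sketch of the setup the post describes (the player reference is illustrative); per the post, even with this wiring no rate control appears in the system UI on iOS:

import MediaPlayer

// Sketch: enable the playback-rate remote command.
let center = MPRemoteCommandCenter.shared()
center.changePlaybackRateCommand.isEnabled = true
center.changePlaybackRateCommand.supportedPlaybackRates = [0.5, 1.0, 1.5, 2.0]
center.changePlaybackRateCommand.addTarget { event in
    guard let event = event as? MPChangePlaybackRateCommandEvent else {
        return .commandFailed
    }
    // "audioPlayer" is an illustrative AVPlayer owned by the app:
    // audioPlayer.rate = event.playbackRate
    print("Requested playback rate:", event.playbackRate)
    return .success
}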
0 replies · 0 boosts · 324 views · Jan ’25

Audio session activation occasionally fails from CarPlay
I'm working on adding CarPlay support to an audio app and am running into an issue. Occasionally, when a user opens the app from CarPlay while the main app scene is either not connected or currently in the background, I receive an error when attempting to activate the audio session. The code below mimics my setup:

do {
    try AVAudioSession.sharedInstance().setCategory(.playback, mode: .spokenAudio)
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print(error) // NSOSStatusErrorDomain - 560557684: Session activation failed
}

That error code maps to AVAudioSession.ErrorCode.cannotInterruptOthers. Once in this state, all subsequent attempts to play different pieces of content fail. However, things start working normally if the user opens the app on their phone and tries again from CarPlay (while the app is in the foreground on the phone). I'm not sure why it behaves this way, and I'll note that I do have the audio background mode capability enabled. Has anyone else encountered this? Are there any workarounds or changes I could make to prevent this from happening?
0 replies · 1 boost · 100 views · Apr ’25

AVAudioMixerNode outputVolume range?
According to the header file, the outputVolume property's supported range is 0.0-1.0:

/*! @property outputVolume
    @abstract The mixer's output volume.
    @discussion This accesses the mixer's output volume (0.0-1.0, inclusive).
*/
@property (nonatomic) float outputVolume;

However, when setting the volume to 2.0, the audio does indeed play louder. Is the header file out of date, and if so, what is the supported range for outputVolume? Thanks
0 replies · 0 boosts · 49 views · Apr ’25

Playing fMP4 Raw Chunks in AVPlayer on iOS
Hello Apple Community,

We are working on a real-time streaming feature where we receive chunks of raw MP4 data through a custom protocol and store them in a buffer (array). Our goal is to use these data chunks to play a continuous video stream in AVPlayer.

What we've tried: a custom URL scheme with AVAssetResourceLoaderDelegate.
- We implemented a custom URL scheme (customscheme://) to serve the buffered data using AVAssetResourceLoaderDelegate.
- The method shouldWaitForLoadingOfRequestedResource is called only during the initial allocation; it doesn't get triggered when new chunks are appended to the buffer.
- Despite appending new data to the buffer, AVPlayer doesn't request further chunks from the delegate.

What we need: a solution where
- the player continuously fetches data from the buffer as new chunks are added;
- playback remains smooth and uninterrupted, even with real-time data being appended;
- ideally, this works with AVPlayer while adhering to HLS-like behavior, without implementing an HLS server.

Questions:
- Is AVAssetResourceLoaderDelegate the right approach for this use case? If so, how can we ensure shouldWaitForLoadingOfRequestedResource is called whenever new data is available in the buffer?
- Are there alternative APIs or recommended patterns for playing real-time MP4 data chunks in AVPlayer?
- Would implementing a custom FFmpeg-based player be necessary, or can this be achieved using AVPlayer and its APIs?

We appreciate any guidance, suggestions, or examples that can help us achieve this. Thank you!
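One pattern worth sketching for the first question: retaining loading requests and answering them as chunks arrive, instead of expecting the delegate to be called again. Type and property names here are illustrative, and this alone does not guarantee smooth playback, since AVPlayer expects well-formed, seekable media through this path:

import AVFoundation

// Sketch: hold pending loading requests and service them as data arrives.
final class ChunkResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
    private var pendingRequests: [AVAssetResourceLoadingRequest] = []
    private var buffer = Data() // appended to by the custom protocol

    // Called by the networking layer whenever a new chunk arrives.
    func append(chunk: Data) {
        buffer.append(chunk)
        servicePendingRequests()
    }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        // Don't answer synchronously; keep the request and fill it as data arrives.
        pendingRequests.append(loadingRequest)
        servicePendingRequests()
        return true
    }

    private func servicePendingRequests() {
        for request in pendingRequests where !request.isFinished {
            guard let dataRequest = request.dataRequest else { continue }
            let offset = Int(dataRequest.currentOffset)
            guard offset < buffer.count else { continue }
            dataRequest.respond(with: buffer.subdata(in: offset..<buffer.count))
            // A real implementation must also fill contentInformationRequest
            // (content type, length, byte-range support) and eventually call
            // request.finishLoading().
        }
        pendingRequests.removeAll { $0.isFinished || $0.isCancelled }
    }
}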
0 replies · 1 boost · 504 views · Jan ’25

Custom Share Destination stopped working in FCP X 11
We integrate with FCP X using a custom share destination and the AppleScript interface. This had been working fine until the recent version 11 update of FCP X. With this update, we no longer receive the open event when the export has completed. We get the Apple event to create the asset, and the file is exported to the location we set in the response; there is just no open event after that. I suspect something is wrong with our scripting support, but I have no idea what or how to troubleshoot. This works fine in 10.8.1 and below.
0 replies · 0 boosts · 354 views · Dec ’24

Mediastreamvalidator Error: Invalid URL
I have a low-latency HLS setup with fragmented MP4. When I try to validate it with the mediastreamvalidator tool, it gives the following error:

Detail: '(null)' is not a valid URL
Source: media playlist URL - segment URL in that playlist

What does that error mean?
0 replies · 0 boosts · 286 views · Dec ’24

Custom FairPlay DRM error handling mechanics
Hi, I have a use case where I'd like to handle and prevent automatic retries whenever certain errors occur during FairPlay content key requests. Here's the current flow:

1. The FairPlay certificate is requested and obtained from my server.
2. makeStreamingContentKeyRequestData is called on the keyRequest.
3. The license server returns a 403 along with a body containing a JSON payload with the detailed code and message.
4. The error is caught and handled properly by calling AVContentKeyRequest.processContentKeyResponseError.
5. The AVContentKeySession automatically retries up to 8 times by providing a new key request through func contentKeySession(_ session: AVContentKeySession, didProvide keyRequest: AVContentKeyRequest).
6. My license server gets hit with 8 requests that will always result in a 403; these retries are useless.
7. My custom error is successfully caught later down the line through AVPlayerItem.observe(\.status), which is great.

Thing is, I'd like to catch the 403 error and prevent any retry from being made at step 5, ideally through func contentKeySession(_ session: AVContentKeySession, contentKeyRequest keyRequest: AVContentKeyRequest, didFailWithError err: Error). I've looked for quite a while and just can't seem to find any way of achieving this. Is this not supported at all?
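A sketch of one workaround pattern: remembering the fatal license failure in the delegate and short-circuiting later key requests rather than re-contacting the server. Whether this actually suppresses AVContentKeySession's internal retries is exactly the open question here, and the error values are illustrative:

import AVFoundation

// Sketch: flag a fatal license failure and stop hitting the server.
final class KeyDelegate: NSObject, AVContentKeySessionDelegate {
    private var fatalLicenseError = false

    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVContentKeyRequest) {
        guard !fatalLicenseError else {
            // Illustrative error: fail fast instead of re-requesting a license.
            let error = NSError(domain: "CustomDRMErrorDomain", code: 403)
            keyRequest.processContentKeyResponseError(error)
            return
        }
        // ... build the SPC with makeStreamingContentKeyRequestData,
        //     contact the license server, process the CKC ...
    }

    func contentKeySession(_ session: AVContentKeySession,
                           contentKeyRequest keyRequest: AVContentKeyRequest,
                           didFailWithError err: Error) {
        // Mark the failure so subsequent key requests are short-circuited above.
        fatalLicenseError = true
    }
}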
0 replies · 8 boosts · 700 views · Dec ’24

Apple Music Feed parquet files
Hi, I need to get the total number of parquet files present in the Apple Music Feed API for songs and artists. There are limit and offset options, but limit is capped at 200 records and offset is uncertain. How can I get the total number of parquet files without querying the Apple Music Feed API multiple times? Need help regarding this. Thanks!
0 replies · 0 boosts · 418 views · Nov ’24