Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under the Media Technologies topic

Post · Replies · Boosts · Views · Activity

AVExternalStorageDevice permissions behavior completely broken on iOS 18?
I'm attempting to use AVExternalStorageDevice.requestAccess on iOS 18 using Xcode 16. When calling requestAccess, a dialog does appear, but the completionHandler closure is never called to indicate whether access was granted. If using the async version, the function just never returns. Calling requestAccess also results in a mediaServicesWereReset (-11819) error without fail. Supposedly, "the system only presents the dialog to a person the first time your app calls the method." That also doesn't appear to be the case. The dialog appears every time requestAccess is called, regardless of previous invocations and whether "Allow" or "Don't Allow" was selected. The dialog itself says "You can change this in Privacy settings." I cannot find this permission anywhere in the Settings app, neither under Privacy & Security nor under the app-specific settings page. Has anyone else experienced these issues? Am I missing something here? I did suspect permissions issues and tried adding a NSRemovableVolumesUsageDescription entry to the app. This did not appear to change anything.
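A minimal sketch of the call pattern described above, assuming iOS 18 and Xcode 16; only the API surface the post itself mentions is used, and the behaviour it exercises is exactly what is reported as broken:

```swift
import AVFoundation

// Minimal sketch of the flow described in the post. The async variant of
// AVExternalStorageDevice.requestAccess is expected to show the dialog once
// and return the user's choice; per the post, on iOS 18 it never returns
// (and the completion-handler variant never fires its closure).
func requestExternalStorageAccess() async {
    let granted = await AVExternalStorageDevice.requestAccess()
    print("External storage access granted: \(granted)")
}
```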
1
2
654
Sep ’24
Challenges with Remotely Controlling iPhone Camera from Mac: Need Guidance
Hello everyone, I am working on an iOS app that involves capturing images automatically, and I would like to control the start/stop of the capture process remotely from a Mac app. I explored the iPhone Mirroring feature, which allows some remote control but has the limitation of only functioning when the iPhone is locked, and it doesn’t permit access to the iPhone’s camera from the Mac. Ideally, I am looking for a solution that would allow me to: Remotely control the camera capture process in the iOS app from the Mac app. Ensure the iPhone’s camera remains fully operational and controllable from the Mac during the capture process. I have considered using options like Handoff for communication between the apps, but ran into some issues communicating between the iOS and Mac apps. I would like to know if there is a better solution within Apple’s ecosystem, or if there are APIs I might have overlooked. Any advice or guidance on how to achieve this functionality would be greatly appreciated! Thanks in advance!
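One API that might fit the start/stop-over-the-local-network part, offered as an assumption rather than something from the post: MultipeerConnectivity, with the Mac browsing/inviting and the iPhone advertising. The service type string and the plain-text command format below are illustrative only, and iOS additionally requires NSLocalNetworkUsageDescription and NSBonjourServices entries in Info.plist:

```swift
import MultipeerConnectivity

// Sketch: a small command channel between the Mac and iOS apps.
// The iOS side advertises and accepts the Mac's invitation; either side
// can then send short commands such as "start" or "stop".
final class CaptureCommandLink: NSObject, MCSessionDelegate, MCNearbyServiceAdvertiserDelegate {
    private let peerID = MCPeerID(displayName: "capture-peer")
    private lazy var session = MCSession(peer: peerID, securityIdentity: nil, encryptionPreference: .required)
    private lazy var advertiser = MCNearbyServiceAdvertiser(peer: peerID, discoveryInfo: nil, serviceType: "capture-ctrl")

    var onCommand: ((String) -> Void)?   // called with "start" / "stop"

    func startAdvertising() {
        session.delegate = self
        advertiser.delegate = self
        advertiser.startAdvertisingPeer()
    }

    func send(_ command: String) {
        guard !session.connectedPeers.isEmpty, let data = command.data(using: .utf8) else { return }
        try? session.send(data, toPeers: session.connectedPeers, with: .reliable)
    }

    // MARK: MCNearbyServiceAdvertiserDelegate
    func advertiser(_ advertiser: MCNearbyServiceAdvertiser, didReceiveInvitationFromPeer peerID: MCPeerID,
                    withContext context: Data?, invitationHandler: @escaping (Bool, MCSession?) -> Void) {
        invitationHandler(true, session)   // accept the Mac's invitation
    }

    // MARK: MCSessionDelegate
    func session(_ session: MCSession, didReceive data: Data, fromPeer peerID: MCPeerID) {
        if let command = String(data: data, encoding: .utf8) { onCommand?(command) }
    }
    func session(_ session: MCSession, peer peerID: MCPeerID, didChange state: MCSessionState) {}
    func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didStartReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, with progress: Progress) {}
    func session(_ session: MCSession, didFinishReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, at localURL: URL?, withError error: Error?) {}
}
```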
1
0
572
Sep ’24
iOS 18 Voicemail feature
I am facing an issue with the voicemail feature. When someone calls me, the call goes to voicemail only if I tap the voicemail icon. If I do not respond at all to the incoming call, the caller is not given the option to record a voicemail. I have tried switching voicemail off and on; it works on my family members' other devices (iPhone 14 and 15), just not for me. How can I resolve this?
1
0
629
Sep ’24
AVPlayer Error in iOS 18.0
When attempting to play a video using AVPlayer on iOS 18.0, I am encountering an error that does not occur on versions earlier than 18.0. Could you please advise what might be causing this issue? These are the error codes: Error Domain=AVFoundationErrorDomain Code=-11828 and Error Domain=NSOSStatusErrorDomain Code=-12847 "(null)". This is the response header information I retrieved using the curl command: HTTP/1.1 200 Content-Disposition: inline;filename="sample.mp4" Accept-Ranges: bytes ETag: sample.mp4 Last-Modified: Tue, 20 Jan 1970 23:52:10 GMT Expires: Mon, 07 Oct 2024 09:15:49 GMT Content-Range: bytes 0-987561/987561 Content-Type: application/octet-stream Content-Length: 987561 Date: Mon, 30 Sep 2024 09:15:49 GMT
2
0
767
Oct ’24
Apple Music Bug
Since the iOS 18 update, there's been a bug that always occurs when listening to music through Apple Music: when music is playing and the iPhone enters or exits standby mode, the music pauses by itself for 1 second. The conditions are otherwise the same as before, when this problem didn’t exist.
2
0
630
Oct ’24
AudioConverterFillComplexBuffer not working for (E)AC3 in tvOS 18
Since upgrading to tvOS 18, AudioConverterFillComplexBuffer isn't working for me when converting an (E)AC3 stream. It does work for decoding AAC, however. https://developer.apple.com/documentation/audiotoolbox/1503098-audioconverterfillcomplexbuffer?language=objc I pass a valid ioOutputDataPacketSize in, but it always comes out as zero. Has anyone else observed this too? I wonder if this is related to the widely discussed issue of 5.1 sound being broken for many people after upgrading to tvOS 18? https://discussions.apple.com/thread/255769102?login=true&sortBy=rank EDIT: further information: the callback gets called once, asking for 1 packet (which is OK). I give it one packet and return noErr. However, after this, the callback is never invoked again. Must be a bug? EDIT 2: the same code continues to work correctly on macOS when decoding the same audio stream.
4
2
658
Oct ’24
session.openApplication() -- how to pass data from the extension to the application?
Hello, I apologize if the answer is obvious, but I'm having a hard time figuring this one out. Let's say the user taps an "Edit" button in my LockedCameraCaptureSession. The extension calls: activity.userInfo = ["ActivityKey": "ID"] try await session.openApplication(for: activity) Can I retrieve, in my application, the data stored in activity.userInfo (let's say, a flag "open editor"), or is data passing exclusively handled via the appContext of CameraCaptureIntent? Thank you!
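A minimal sketch of how the launched app would normally pick up such an activity, assuming the NSUserActivity is delivered to the SwiftUI scene through onContinueUserActivity; the activity type string "com.example.lockedCapture.edit" and the "ActivityKey" handling are illustrative, and whether userInfo actually survives this particular hand-off is the post's open question:

```swift
import SwiftUI

// Hypothetical sketch: receiving the NSUserActivity in the main app.
// Assumes the extension created the activity with a known activity type
// (the string below is made up) and stored values in userInfo before
// calling session.openApplication(for:).
struct ContentView: View {
    var body: some View { Text("Home") }
}

@main
struct MyApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
                .onContinueUserActivity("com.example.lockedCapture.edit") { activity in
                    // Read back the value the extension stored in userInfo.
                    if let value = activity.userInfo?["ActivityKey"] as? String {
                        print("Received from extension: \(value)")
                    }
                }
        }
    }
}
```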
3
0
419
Oct ’24
AVDevice is ignoring 60fps
Hello, I try to get the Video from an HDMI USB capture card and show it in a PreviewLayer with 60fps. The device I am using (ShadowCast 2) is supporting 1080p with 60fps in "yuvs" and "420v". This is my code with stripped away uninteresting stuff and removed error handling to build the previewLayer. I am using the AVFrameRateRange because the capture device is not directly supporting 60.00 but <AVFrameRateRange: 0x600000875680 60.00 - 60.00 (1000000 / 60000240 - 1000000 / 60000240)> fps. @Observable final class AVFoundationService: AVService { // Live View private let session: AVCaptureSession = .init() var previewLayer: AVCaptureVideoPreviewLayer { let layer = AVCaptureVideoPreviewLayer(session: session) layer.videoGravity = .resizeAspect return layer } var activeVideoDevice: AVCaptureDevice? { // TODO: implement correct logic if let device = videoDevices.first(where: { $0.localizedName.contains("Shadow") }) { return device } return AVCaptureDevice.default(for: .video) } func setupStreamDemo(completion: @escaping (Error?) -> Void) { session.beginConfiguration() if let device = activeVideoDevice { do { let input = try AVCaptureDeviceInput(device: device) if session.canAddInput(input) { session.addInput(input) } else { print("explode") } for format in device.formats { let dimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription) if dimensions.width == 1920 && dimensions.height == 1080 && format.formatDescription.mediaSubType.description == "'yuvs'" { let foundFPS = format.videoSupportedFrameRateRanges.first { Int($0.minFrameRate) == 60 && Int($0.minFrameRate) == 60 } try device.lockForConfiguration() device.activeFormat = format device.activeVideoMinFrameDuration = foundFPS!.minFrameDuration device.activeVideoMaxFrameDuration = foundFPS!.minFrameDuration device.unlockForConfiguration() } } } catch { return completion(error) } } session.commitConfiguration() session.startRunning() completion(nil) } } I am using the following code in SwiftUI to show the AVCaptureVideoPreviewLayer. struct VideoPreviewView: NSViewRepresentable { private let previewLayer: AVCaptureVideoPreviewLayer func makeNSView(context: Context) -> NSView { let view = NSView() view.layer = self.previewLayer view.layer?.frame = view.bounds return view } func updateNSView(_ nsView: NSView, context: Context) { if let layer = nsView.layer as? AVCaptureVideoPreviewLayer { layer.session = self.previewLayer.session } } } When I now run my app, it will ignore whatever I set on device.activeVideoMinFrameDuration and/or device.activeVideoMaxFrameDuration. If I set it to 10 fps - it's running with 30, if I set 60 it is running with 30. If I start in parallel to my app QuickTime and start a "Recording" from my USB Capture Card, it will switch to 60fps mode. I am on Mac Sequoia 15.0 with Xcode 16.0. What I am doing wrong?
1
1
629
Oct ’24
AVAudioEngine change playback rate in real time (AVAudioUnitVarispeed is non real time)
I am using AVAudioEngine to play back samples in an iOS game. I would like to change the playback rate of a sample in real time. When using AVAudioUnitVarispeed to change the playback rate, it creates stutters in the game, as it isn't processed in real time (as stated here: AVAudioUnitTimeEffect). The other option I found to change the rate is to use an AVAudioEnvironmentNode and change the rate of the AVAudioPlayerNode. That works without creating stutters but limits the valid values for the rate to 0.5 through 2.0 (I need rates higher than 2.0). See here: AVAudio3DMixing. Are there any other ways to play back a sample with rate control in real time?
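For reference, a minimal sketch of the AVAudioEnvironmentNode approach mentioned above, assuming a mono source so the AVAudio3DMixing rate property takes effect; as noted in the post, the rate is clamped to 0.5–2.0, which is exactly the limitation being asked about:

```swift
import AVFoundation

// Sketch of the workaround described in the post: play a sample through an
// AVAudioEnvironmentNode and drive AVAudioPlayerNode's AVAudio3DMixing `rate`.
// Assumes a mono source file; 3D mixing (and therefore `rate`) applies to
// mono inputs of an environment node. Valid range is 0.5...2.0.
final class VariableRatePlayer {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private let environment = AVAudioEnvironmentNode()

    func play(fileURL: URL, rate: Float) throws {
        let file = try AVAudioFile(forReading: fileURL)

        engine.attach(player)
        engine.attach(environment)

        // player -> environment (mono) -> main mixer
        let monoFormat = AVAudioFormat(standardFormatWithSampleRate: file.processingFormat.sampleRate,
                                       channels: 1)
        engine.connect(player, to: environment, format: monoFormat)
        engine.connect(environment, to: engine.mainMixerNode, format: nil)

        // `rate` comes from AVAudio3DMixing; values outside 0.5–2.0 are clamped here.
        player.rate = max(0.5, min(rate, 2.0))

        try engine.start()
        player.scheduleFile(file, at: nil)
        player.play()
    }
}
```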
0
0
316
Oct ’24
Receiving eventMetadata from AVPlayerItemMetadataOutput stops working on iOS 18 only
Case-ID: 9391388 Our application uses timed metadata as part of a rating control system. We noticed a problem in production, and diagnosis shows that we stop receiving timed metadata on iOS 18 only. Our live streams are primed with metadata at least once per second, but we are seeing extended gaps in receiving this content, in excess of 10 minutes. We have also observed that this happens more as the player climbs the bitrate ladder, and it doesn't happen if we cap to a low resolution, i.e. a preferredMaximumResolution of 768x432. Furthermore, if we throttle network conditions after we stop receiving metadata, then we start receiving it again. Following is a simple example that demonstrates the above behaviour. Unfortunately, I cannot publicly share the live stream endpoint that is primed with metadata, but I can provide it privately to Apple to reproduce the problem. import UIKit import AVKit class ViewController: UIViewController, AVPlayerItemMetadataOutputPushDelegate { var player: AVPlayer? var itemMetadataOutput: AVPlayerItemMetadataOutput? override func viewDidAppear(_ animated: Bool) { guard let url = URL(string: "endpoint redacted") else { return } let player = AVPlayer(url: url) let controller = AVPlayerViewController() controller.player = player self.player = player present(controller, animated: true) { player.play() let currentItem = player.currentItem let itemMetadataOutput = AVPlayerItemMetadataOutput(identifiers: nil) self.itemMetadataOutput = itemMetadataOutput self.itemMetadataOutput?.setDelegate(self, queue: .main) currentItem?.add(itemMetadataOutput) } } public func metadataOutput(_ output: AVPlayerItemMetadataOutput, didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup], from track: AVPlayerItemTrack?) { print("received metadata \(Date())") } }
5
4
873
Oct ’24
Accessing Events from Video Device
I have an intra-oral video device that's supported by Apple's AVCaptureDevice. I can use the AV classes to connect to the device and get video. However, this device has a button that's used to acquire still images from the video stream. I can't use the IOUSBDeviceInterface to do an asynchronous read, because the Apple driver has the device opened exclusively. How do I go about receiving the button event in this scenario? I know which pipe to read, based on a bus analyzer capture from running this on Windows; I just need to know how to access that pipe when the device is opened by another process.
1
0
624
Oct ’24
CMSampleBuffer: audio PCM to MP4 AAC
Hello, As explained in this link, AVAssetReaderTrackOutput.copyNextSampleBuffer() returns a CMSampleBuffer in linear PCM audio format. I want to place this audio buffer into an AVAssetWriterInput of type kAudioFormatMPEG4AAC, but I can't manage the conversion. Could you help me by providing an extension that returns a CMSampleBuffer converted from linear PCM audio format to kAudioFormatMPEG4AAC? Example: extension CMSampleBuffer { func fromPCMToAAC() -> CMSampleBuffer? { // Here, get a new AudioStreamBasicDescription, create a CMSampleBuffer and a CMBlockBuffer } } I've tried multiple times but without success. Software: iOS 18.1 Xcode: 16.0 Thank you!
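One possible angle, sketched below under the assumption that the buffers end up in an AVAssetWriter anyway: an AVAssetWriterInput created with AAC output settings accepts uncompressed PCM sample buffers and performs the encode itself, so an explicit CMSampleBuffer-level conversion may not be needed. The sample rate, channel count, and bit rate are assumptions to adjust to the source:

```swift
import AVFoundation

// Sketch (not a drop-in CMSampleBuffer extension): let AVAssetWriter do the
// PCM -> AAC encode. The writer input is configured for kAudioFormatMPEG4AAC
// and the uncompressed buffers from AVAssetReaderTrackOutput are appended as-is.
func makeAACAudioInput() -> AVAssetWriterInput {
    let outputSettings: [String: Any] = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVSampleRateKey: 44_100,
        AVNumberOfChannelsKey: 2,
        AVEncoderBitRateKey: 128_000
    ]
    let input = AVAssetWriterInput(mediaType: .audio, outputSettings: outputSettings)
    input.expectsMediaDataInRealTime = false
    return input
}

// Usage sketch: pull PCM buffers from the reader and append them to the AAC input.
// A real implementation would drive this from requestMediaDataWhenReady(on:using:)
// and call markAsFinished() when copyNextSampleBuffer() returns nil.
func transcode(readerOutput: AVAssetReaderTrackOutput, writerInput: AVAssetWriterInput) {
    while writerInput.isReadyForMoreMediaData,
          let sampleBuffer = readerOutput.copyNextSampleBuffer() {
        writerInput.append(sampleBuffer)   // the writer encodes PCM to AAC internally
    }
}
```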
1
0
775
Oct ’24
dyld[434]: Library not loaded: error when running LockedCameraCapture compatible app on iOS 15
Hello, I am getting the following error while attempting to run my LockedCameraCapture compatible app on an iOS 15 device: dyld[434]: Library not loaded: '/System/Library/Frameworks/LockedCameraCapture.framework/LockedCameraCapture' Referenced from: '/private/var/containers/Bundle/Application/.../MyApp.app/MyApp.debug.dylib' Reason: tried: '/System/Library/Frameworks/LockedCameraCapture.framework/LockedCameraCapture' (no such file) Of course iOS 15 doesn't have the library for LockedCameraCapture, but I have had no issue including Lock Screen Widgets (which require iOS 16), so I am not sure why the error is popping up. Thank you!
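A common mitigation for this class of dyld error, offered here as an assumption rather than a confirmed fix for LockedCameraCapture specifically, is to weak-link the newer framework so older systems don't try to resolve it at launch, and to guard every use with availability checks:

```swift
// Assumed mitigation, not verified against this exact project setup:
// 1) Weak-link the framework so iOS 15 doesn't try to load it at launch,
//    e.g. mark it Optional in "Link Binary With Libraries" or add
//    -weak_framework LockedCameraCapture to Other Linker Flags.
// 2) Only touch the framework's symbols behind an availability check.
import SwiftUI

enum CaptureEntryPoint {
    static func lockedCaptureUIIfAvailable() -> AnyView? {
        if #available(iOS 18.0, *) {
            // LockedCameraCapture-specific code lives behind this check.
            return AnyView(Text("Locked camera capture available"))
        }
        return nil
    }
}
```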
2
0
463
Oct ’24
Why Aren't All Songs Being Added to the Queue?
Hi, I've recently been working with the Apple Music API and have had success in loading all the playlists on my account, loading songs from each playlist, and adding songs to the ApplicationMusicPlayer.share.queue. The problem I'm running into is that not all songs from the playlist are being added to the queue, despite confirming all the songs are being based on the PlaybackView.swift I'm about to share with you. I would also like to answer other underlying questions if possible. I am also open to any other suggestions. In this scenario were also assuming isShuffled is true every time. How can I determine when a song has ended? How can I get the album title information? How can I get the current song title, album information, and artist information without pressing play? I can only seem to update the screen when I select my play meaning ApplicationMusicPlayer.shared.play() is being called. How do I get the endTime of the song? I believe it should be ApplicationMusicPlayer.shared.queue.currentEntry.endTime but this doesn't seem to work. // // PlayBackView.swift // // Created by Justin on 8/16/24. // import SwiftUI import MusicKit import Foundation enum PlayState { case play case pause } struct PlayBackView: View { @State var song: Track @Binding var songs: [Track]? @State var isShuffled = false @State private var playState: PlayState = .pause @State private var isFirstPlay = true private let player = ApplicationMusicPlayer.shared private var isPlaying: Bool { return (player.state.playbackStatus == .playing) } var body: some View { VStack { // Album Cover HStack(spacing: 20) { if let artwork = player.queue.currentEntry?.artwork { ArtworkImage(artwork, height: 100) } else { Image(systemName: "music.note") .resizable() .frame(width: 100, height: 100) } VStack(alignment: .leading) { // Song Title Text(player.queue.currentEntry?.title ?? "Song Title Not Found") .font(.title) .fixedSize(horizontal: false, vertical: true) // How do I get AlbumTitle from here?? // Artist Name Text(player.queue.currentEntry?.subtitle ?? "Artist Name Not Found") .font(.caption) } } .padding() Spacer() // Progress View // endTime doesn't work and not sure why. ProgressView(value: player.playbackTime, total: player.queue.currentEntry?.endTime ?? 1.00) .progressViewStyle(.linear) .tint(.red.opacity(0.5)) // Duration View HStack { Text(durationStr(from: player.playbackTime)) .font(.caption) Spacer() if let duration = player.queue.currentEntry?.endTime { Text(durationStr(from: duration)) .font(.caption) } } Spacer() Button { Task { do { try await player.skipToNextEntry() } catch { print(error.localizedDescription) } } } label: { Label("", systemImage: "forward.fill") .tint(.white) } Spacer() // Play/Pause Button Button(action: { handlePlayButton() isFirstPlay = false }, label: { Text(playState == .play ? "Pause" : isFirstPlay ? 
"Play" : "Resume") .frame(maxWidth: .infinity) }) .buttonStyle(.borderedProminent) .padding() .font(.largeTitle) .tint(.red) } .padding() .onAppear { if isShuffled { songs = songs?.shuffled() if let songs, let firstSong = songs.first { player.queue = .init(for: songs, startingAt: firstSong) player.state.shuffleMode = .songs } } } .onDisappear { player.stop() player.queue = [] player.playbackTime = .zero } } private func handlePlayButton() { Task { if isPlaying { player.pause() playState = .pause } else { playState = .play await playTrack() } } } @MainActor private func playTrack() async { do { try await player.play() } catch { print(error.localizedDescription) } } private func durationStr(from duration: TimeInterval) -> String { let seconds = Int(duration) let minutes = seconds / 60 let remainder = seconds % 60 // Format the string to ensure two digits for the remainder (seconds) return String(format: "%d:%02d", minutes, remainder) } } //#Preview { // PlayBackView() //}
2
0
627
Oct ’24
How to insert multiple AVAssets into AVMutableCompositionTrack with silence in between?
Hi, I'm recording videos frame by frame and occasionally a sound plays (from an MP3 asset). I want to composite these sounds into the video at the correct timings. But this doesn't work. Really pulling my hair out here. I've tried everything, including adding one after another and then inserting silence in between (allegedly this pushes subsequent clips back) but nothing works. Here, _currentTime is the current time according to the video frames added, which are added at 20Hz. You can see I am adding silence long enough to cover the time from the end of the last audio clip to now, plus extra padding to contain the audio we are about to add. Doesn't matter if I remove this, it just doesn't work. Sometimes I can get two pieces of audio to play but never a third and usually, only the first audio plays, and then nothing after. I'm completely stumped. func addFrame(_ pixelBuffer: CVPixelBuffer) { guard CGSize(width: pixelBuffer.width, height: pixelBuffer.height) == _outputSize else { return } let frameTime = CMTimeMake(value: Int64(_frameCount), timescale: _frameRate) if _videoInput?.isReadyForMoreMediaData == true { _pixelBufferAdaptor?.append(pixelBuffer, withPresentationTime: frameTime) _frameCount += 1 _currentTime = frameTime } } func addMP3AudioClip(_ audioData: Data) async throws { let tempURL = FileManager.default.temporaryDirectory.appendingPathComponent(UUID().uuidString + ".mp3") try audioData.write(to: tempURL) let asset = AVAsset(url: tempURL) let duration = try await asset.load(.duration) let audioTrack = try await asset.loadTracks(withMediaType: .audio).first! let currentAudioTime = _currentTime.convertScale(duration.timescale, method: .default) _audioTrack?.insertEmptyTimeRange(CMTimeRangeFromTimeToTime(start: _lastAudioClipEndTime, end: currentAudioTime)) _audioTrack?.insertEmptyTimeRange(CMTimeRangeFromTimeToTime(start: currentAudioTime, end: CMTimeAdd(currentAudioTime, duration))) let timeRange = CMTimeRangeMake(start: .zero, duration: duration) try _audioTrack?.insertTimeRange(timeRange, of: audioTrack, at: currentAudioTime) _lastAudioClipEndTime = CMTimeAdd(currentAudioTime, duration) try FileManager.default.removeItem(at: tempURL) _audioClipTimeRanges.append(CMTimeRangeMake(start: _currentTime, duration: duration)) } Thank you, -- B.
0
0
358
Oct ’24
Take correctly sized screenshots with ScreenCaptureKit
I've been using CGWindowListCreateImage which automatically creates an image with the size of the captured window. But SCScreenshotManager.captureImage(contentFilter:configuration:) always creates images with the width and height specified in the provided SCStreamConfiguration. I could be setting the size explicitly by reading SCWindow.frame or SCContentFilter.contentRect and multiplying the width and height by SCContentFilter.pointPixelScale , but it won't work if I want to keep the window shadow with SCStreamConfiguration.ignoreShadowsSingleWindow = false. Is there a way and what's the best way to take full-resolution screenshots of the correct size? import Cocoa import ScreenCaptureKit class ViewController: NSViewController { @IBOutlet weak var imageView: NSImageView! override func viewDidAppear() { imageView.imageScaling = .scaleProportionallyUpOrDown view.wantsLayer = true view.layer!.backgroundColor = .init(red: 1, green: 0, blue: 0, alpha: 1) Task { let windows = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true).windows let window = windows[0] let filter = SCContentFilter(desktopIndependentWindow: window) let configuration = SCStreamConfiguration() configuration.ignoreShadowsSingleWindow = false configuration.showsCursor = false configuration.width = Int(Float(filter.contentRect.width) * filter.pointPixelScale) configuration.height = Int(Float(filter.contentRect.height) * filter.pointPixelScale) print(filter.contentRect) let windowImage = try await SCScreenshotManager.captureImage(contentFilter: filter, configuration: configuration) imageView.image = NSImage(cgImage: windowImage, size: CGSize(width: windowImage.width, height: windowImage.height)) } } }
3
0
744
Oct ’24
Glitch in AVPlayer while playing HLS videos
We have observed consistent glitches in video playback when using AVPlayer to stream HLS (HTTP Live Streaming) videos on iOS. The issue manifests as intermittent frame drops, stuttering, and playback instability during HLS streams. However, the same behavior is not present when playing MP4 videos using the same AVPlayer instance. The HLS streams being used follow standard encoding practices, and network conditions have been ruled out as a cause for this problem. https://drive.google.com/file/d/1lhdpHTyjPYCYLHjzvb6ZF6P6jehIuwY0/view?usp=sharing Steps to Reproduce: 1. Load an HLS video into AVPlayer and initiate playback. 2. Observe intermittent glitches and stuttering during video playback. 3. Load and play an MP4 video in the same AVPlayer instance. 4. Notice that MP4 playback is smooth without any glitches.
3
0
636
Oct ’24
Cancel or quit from loadValuesAsynchronouslyForKeys?
The app needs to play remote videos. Sometimes it takes a very long time (~10 seconds) to load the media and play it with AVPlayer. So I use a timer to check, and try to play the next video if loading takes over 5 seconds: AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];// line 1 NSArray *keys = @[@"playable"]; mediaLoaded = NO; [asset loadValuesAsynchronouslyForKeys:keys completionHandler:^() { // line 2 mediaLoaded = YES; // line 4 dispatch_async(dispatch_get_main_queue(), ^{ [self.player replaceCurrentItemWithPlayerItem:[AVPlayerItem playerItemWithAsset:asset]]; [self.player playImmediatelyAtRate:playSpeed]; }); }]; dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(5 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{ if (!mediaLoaded) { [self playNextVideo]; // line 3 } }); So the flow is: line 1 (of video 1) - line 2 (of video 1) - line 3 (if over 5 seconds and video 1 is not playing) - line 1 (of video 2) - ... Now the problem is that line 2 seems to be blocking line 1: only after line 4 (for video 1, after ~10 seconds) or the completionHandler has executed will line 2 (for video 2) be executed. Can anybody give any insight? Thanks!
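On the "cancel or quit" part of the question, a sketch of one assumed approach: AVAsset's cancelLoading() abandons in-flight key loading started by loadValuesAsynchronouslyForKeys, so the 5-second timeout can cancel the slow asset before moving on. Shown in Swift rather than the Objective-C above, and whether cancelling actually unblocks the stall described in the post is an assumption to verify:

```swift
import AVFoundation

// Sketch: give each asset 5 seconds to become playable, otherwise cancel the
// pending key loading and move on. cancelLoading() is documented on AVAsset;
// the helper names here (onPlayable/onTimeout) are illustrative only.
final class SequentialVideoLoader {
    private var currentAsset: AVURLAsset?

    func load(url: URL,
              onPlayable: @escaping (AVURLAsset) -> Void,
              onTimeout: @escaping () -> Void) {
        let asset = AVURLAsset(url: url)
        currentAsset = asset

        asset.loadValuesAsynchronously(forKeys: ["playable"]) {
            var error: NSError?
            let status = asset.statusOfValue(forKey: "playable", error: &error)
            DispatchQueue.main.async {
                guard asset === self.currentAsset, status == .loaded else { return }
                onPlayable(asset)   // e.g. replaceCurrentItem and play
            }
        }

        DispatchQueue.main.asyncAfter(deadline: .now() + 5) { [weak self] in
            guard let self, asset === self.currentAsset else { return }
            if asset.statusOfValue(forKey: "playable", error: nil) != .loaded {
                asset.cancelLoading()   // abandon the slow asset
                onTimeout()             // e.g. call playNextVideo()
            }
        }
    }
}
```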
1
0
380
Oct ’24