Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

AirPlay not working with encrypted HLS m3u8 URL using AVPlayer
Hi, I am using AVPlayer and want to stream my encrypted HLS m3u8 URL over AirPlay. Here is my code snippet:

```swift
let contentUrl = URL(string: videoUrl)
let headers = ["token": token]
let asset = AVURLAsset(url: contentUrl!, options: ["AVURLAssetHTTPHeaderFieldsKey": headers])
let playerItem = AVPlayerItem(asset: asset)
self.avPlayer?.replaceCurrentItem(with: playerItem)
self.avPlayer?.play()
```

AirPlay does not work on my TV when I start the stream; the encrypted URL won't play. Is there any way to stream AirPlay with an encrypted URL? Stuck here........
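A likely factor worth noting: an AirPlay receiver fetches the HLS playlist itself, so custom request headers set via AVURLAssetHTTPHeaderFieldsKey may never reach it. A minimal sketch of one commonly suggested workaround, assuming the server can accept the token as a query parameter instead of a header:

```swift
// Sketch (assumption: the CDN/license server accepts `?token=...`).
// Moving the token into the URL lets the AirPlay receiver fetch the
// playlist without needing the custom header.
var components = URLComponents(string: videoUrl)!
components.queryItems = (components.queryItems ?? []) + [URLQueryItem(name: "token", value: token)]
let asset = AVURLAsset(url: components.url!)
```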
Replies: 0 · Boosts: 0 · Views: 300 · May ’24
How to play an encrypted HLS m3u8 URL via AirPlay using AVPlayer
I have an AVPlayer with an encrypted m3u8 URL, and this is my code snippet:

```swift
let contentUrl = URL(string: videoUrl)
let headers = ["token": token]
let asset = AVURLAsset(url: contentUrl!, options: ["AVURLAssetHTTPHeaderFieldsKey": headers])
let playerItem = AVPlayerItem(asset: asset)
self.avPlayer?.replaceCurrentItem(with: playerItem)
self.avPlayer?.play()
```

When I try to stream over AirPlay, the content does not display; AirPlay is not working with an encrypted URL. How can I stream it? Is there any way?
Replies: 0 · Boosts: 0 · Views: 300 · May ’24
Will the new iPad Pro support RAW capture?
Just watched the new product release, and I'm really hoping the new iPad Pro, advertised as the next creative tool for filmmakers and artists, will finally allow RAW capture in the native Camera app or the AVFoundation API (currently the list of RAW-capable formats returns 0 on the previous iPad Pro). With all these fancy multicam features and camera hardware, I don't think it takes much to enable ProRAW and Action Mode on the software side of the iPad. Unless their strategy is to make us "shoot on iPhone and edit on iPad" (as implied in their video credits), which has been my workflow with the iPhone 15 and 2022 iPad Pro :( :(
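For reference, a minimal sketch of the availability check the post alludes to, assuming `photoOutput` is an AVCapturePhotoOutput already attached to a configured AVCaptureSession:

```swift
import AVFoundation

// Both of these come back empty/false on devices without RAW support.
print("Bayer RAW formats:", photoOutput.availableRawPhotoPixelFormatTypes)
print("Apple ProRAW supported:", photoOutput.isAppleProRAWSupported)
```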
Replies: 0 · Boosts: 0 · Views: 337 · May ’24
Question regarding the kVTVideoEncoderList_IsHardwareAccelerated flag
I am a bit confused about whether certain Video Toolbox (VT) encoders support hardware acceleration. When I query the list of VT encoders with VTCopyVideoEncoderList(nil, &encoderList) on an iPhone 14 Pro, the kVTVideoEncoderList_IsHardwareAccelerated flag is absent for the avc1 (AVC / H.264) and hvc1 (HEVC / H.265) encoders, which, based on the documentation found in VTVideoEncoderList.h, means the encoders do not support hardware acceleration:

"optional. CFBoolean. If present and set to kCFBooleanTrue, indicates that the encoder is hardware accelerated."

In fact, no encoder in this list returns this flag as true, and most of them do not include the flag in their dictionaries at all.

On the other hand, when I create a compression session using VTCompressionSessionCreate() and pass kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder as true in the encoder specification, and then query kVTCompressionPropertyKey_UsingHardwareAcceleratedVideoEncoder, I get a CFBoolean value of true for both the H.264 and H.265 encoders. In fact, I get true (for both of the aforementioned encoders) even if I don't specify kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder when creating the compression session (note here that this flag was introduced in iOS 17.4 ^1).

So the question is: are those encoders actually hardware accelerated on my device, and if so, why isn't that reflected in the VTCopyVideoEncoderList() call?
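The post does not include the query code itself; a minimal sketch of such a check, under the assumption of a plain H.264 session with no explicit hardware request:

```swift
import VideoToolbox

var session: VTCompressionSession?
let status = VTCompressionSessionCreate(
    allocator: nil, width: 1920, height: 1080,
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: nil,       // no explicit hardware request
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil, refcon: nil,
    compressionSessionOut: &session)

if status == noErr, let session {
    // Ask the live session whether it ended up on a hardware encoder.
    var value: CFTypeRef?
    VTSessionCopyProperty(session,
                          key: kVTCompressionPropertyKey_UsingHardwareAcceleratedVideoEncoder,
                          allocator: nil,
                          valueOut: &value)
    print("Using hardware encoder:", (value as? Bool) ?? false)
}
```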
Replies: 3 · Boosts: 1 · Views: 378 · May ’24
Does HLS support MVHEVC HDR10?
I am setting up an HLS server for MVHEVC files. I find that if I tag the mp4 files using AVAssetWriter with

```swift
let colorPropertySettings = [
    AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_709_2,
    AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_709_2,
    AVVideoTransferFunctionKey: AVVideoTransferFunction_ITU_R_709_2
]
```

the HLS stream plays well in Safari. But if I tag the mp4 files with

```swift
let colorPropertySettings = [
    AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_2020,
    AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_2020,
    AVVideoTransferFunctionKey: AVVideoTransferFunction_SMPTE_ST_2084_PQ
]
```

the HLS stream cannot play in Safari. It looks like HLS does not support MVHEVC HDR10. Or have I missed a setting for MVHEVC HDR10? Thanks.
Replies: 0 · Boosts: 0 · Views: 271 · May ’24
iOS 17 MusicKit Song lastPlayedDate is always nil
Hey, I've been trying to fetch my Apple Music recently played songs for an app I'm working on, and I want to access the lastPlayedDate field. If I'm not mistaken, this field should exist for a Song according to Apple's documentation: https://developer.apple.com/documentation/musickit/song/lastplayeddate

However, whenever I try to fetch this data, the lastPlayedDate field is always nil. All the other data I'm looking for seems to fetch without issue. Here's the code I'm using (RecentlyPlayedSong and MusicArtworkType are my own types):

```swift
// Request as described in Apple MusicKit:
// https://developer.apple.com/documentation/musickit/musicrecentlyplayedrequestable
var request = MusicRecentlyPlayedRequest<Song>()
request.limit = 30
do {
    let response = try await request.response()
    let songs = response.items.compactMap { song -> RecentlyPlayedSong? in
        let songName = song.title
        let songArtist = song.artistName
        let songAlbum = song.albumTitle
        let artwork: MusicArtworkType
        let preview_url = song.previewAssets?.first?.url?.absoluteString
        if let appleMusicArtwork = song.artwork {
            print("Found a song, \(song) with lastPlayedDate \(song.lastPlayedDate)")
            artwork = .AppleMusic(appleMusicArtwork)
            return RecentlyPlayedSong(name: songName, artist: songArtist, album: songAlbum,
                                      artwork: artwork, preview_url: preview_url,
                                      lastPlayedDate: song.lastPlayedDate ?? Date())
        }
        return nil
    }
} catch {
    print(error)
}
```

I'm mapping the response into a custom struct, but here's a sample of what gets printed to the logs:

```
Found a song, Song(id: "1676362342", title: "pwdr Blu (feat. Brother.)", artistName: "Kx5, deadmau5 & Kaskade") with lastPlayedDate nil
Found a song, Song(id: "881289980", title: "Worlds Apart (feat. Kerli)", artistName: "Seven Lions") with lastPlayedDate nil
Found a song, Song(id: "1501540431", title: "What’s Done Is Done", artistName: "Seven Lions & HALIENE") with lastPlayedDate nil
```

Even though I listened to these songs just a few minutes ago. Has anyone run into this issue before? Any settings I need to change to get this to show?
Replies: 1 · Boosts: 0 · Views: 320 · May ’24
HDR10 MVHEVC cannot play on Safari
Hi, I just generated an HDR10 MVHEVC file; mediainfo reports:

```
Color range              : Limited
Color primaries          : BT.2020
Transfer characteristics : PQ
Matrix coefficients      : BT.2020 non-constant
Codec configuration box  : hvcC+lhvC
```

Then I generated the segment files with the command below and uploaded the segments and prog_index.m3u8 to a web server:

```
mediafilesegmenter --iso-fragmented -t 4 -f av_1 av_new_1.mov
```

I find that I cannot play the HLS stream in Safari (the URL is http://ip/vod/prog_index.m3u8).

If I remove the "Transfer characteristics : PQ" tag when generating the MVHEVC file, run the same mediafilesegmenter command, and upload the files to the web server, the new version of the HLS stream does play in Safari.

Is there any way to play HLS PQ video in Safari? Thanks.
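One thing worth checking, based on Apple's HLS authoring guidelines (an observation added here, not from the original post): HDR10 variants are expected to be declared through a multivariant playlist carrying VIDEO-RANGE=PQ, rather than loaded directly from the media playlist. A sketch with placeholder BANDWIDTH, CODECS, and RESOLUTION values:

```
#EXTM3U
# Placeholder values; the multivariant playlist sits alongside prog_index.m3u8.
#EXT-X-STREAM-INF:BANDWIDTH=25000000,CODECS="hvc1.2.4.L153.B0",VIDEO-RANGE=PQ,RESOLUTION=3840x2160
prog_index.m3u8
```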
Replies: 0 · Boosts: 0 · Views: 311 · May ’24
Error on first photo captured after device orientation change
I am using AVFoundation to capture a photo. This was all working fine until I realized the photos were all saving to the photo library in portrait orientation. I wanted them to save in the orientation the device was in when the camera took the picture, much as the built-in Camera app does on iOS. So I added this code:

```swift
if let videoConnection = photoOutput.connection(with: .video),
   videoConnection.isVideoOrientationSupported {
    // from() is just a helper to get a video orientation from the device orientation.
    videoConnection.videoOrientation = .from(UIDevice.current.orientation)
    print("Photo orientation set to \(videoConnection.videoOrientation).")
}
```

With this addition, the first photo taken after a device rotation logs this error in the debugger:

```
<<<< FigCaptureSessionRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSessionRemote.m:866) - (err=-12784)
```

Subsequent photos do not repeat the error. Once you rotate the device again, the same behavior recurs. Photos taken after the app loads, but before any rotation, do not produce this error. I have tried many things, no dice. If I comment this code out it works without error, but of course the photos all save in portrait orientation again.
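The post doesn't show the from() helper; a typical implementation looks like this (a hypothetical stand-in, not the poster's code; note the landscape cases are intentionally crossed, since UIDeviceOrientation and AVCaptureVideoOrientation define landscape in opposite senses):

```swift
import AVFoundation
import UIKit

extension AVCaptureVideoOrientation {
    // Hypothetical equivalent of the poster's helper.
    static func from(_ device: UIDeviceOrientation) -> AVCaptureVideoOrientation {
        switch device {
        case .landscapeLeft:      return .landscapeRight
        case .landscapeRight:     return .landscapeLeft
        case .portraitUpsideDown: return .portraitUpsideDown
        default:                  return .portrait // covers .portrait, .faceUp, .faceDown, .unknown
        }
    }
}
```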
Replies: 0 · Boosts: 2 · Views: 298 · May ’24
Deadlock in AudioWorkIntervalCreate
Sometimes when I call AudioWorkIntervalCreate the call hangs with the following stack trace. The call is made on the main thread.

```
mach_msg2_trap                                                                  0x00007ff801f0b3ce
mach_msg2_internal                                                              0x00007ff801f19d80
mach_msg_overwrite                                                              0x00007ff801f12510
mach_msg                                                                        0x00007ff801f0b6bd
HALC_Object_AddPropertyListener                                                 0x00007ff8049ea43e
HALC_ProxyObject::HALC_ProxyObject(unsigned int, unsigned int, unsigned int, unsigned int) 0x00007ff8047f97f2
HALC_ProxyObjectMap::_CreateObject(unsigned int, unsigned int, unsigned int, unsigned int) 0x00007ff80490f69c
HALC_ProxyObjectMap::CopyObjectByObjectID(unsigned int)                         0x00007ff80490ecd6
HALC_ShellPlugIn::_ReconcileDeviceList(bool, bool, std::__1::vector<unsigned int, std::__1::allocator<unsigned int>>&, std::__1::vector<unsigned int, std::__1::allocator<unsigned int>>&) 0x00007ff8045d68cf
HALB_CommandGate::ExecuteCommand(void () block_pointer) const                   0x00007ff80492ed14
HALC_ShellObject::ExecuteCommand(void () block_pointer) const                   0x00007ff80470f554
HALC_ShellPlugIn::ReconcileDeviceList(bool, bool)                               0x00007ff8045d6414
HALC_ShellPlugIn::ConnectToServer()                                             0x00007ff8045d74a4
HAL_HardwarePlugIn_InitializeWithObjectID(AudioHardwarePlugInInterface**, unsigned int) 0x00007ff8045da256
HALPlugInManagement::CreateHALPlugIn(HALCFPlugIn const*)                        0x00007ff80442f828
HALSystem::InitializeDevices()                                                  0x00007ff80442ebc3
HALSystem::CheckOutInstance()                                                   0x00007ff80442b696
AudioObjectAddPropertyListener_mac_imp                                          0x00007ff80469b431
auoop::WorkgroupManager_macOS::WorkgroupManager_macOS()                         0x00007ff8040fc3d5
auoop::gWorkgroupManager()                                                      0x00007ff8040fc245
AudioWorkIntervalCreate                                                         0x00007ff804034a33
```
Replies: 2 · Boosts: 0 · Views: 303 · May ’24
AVCaptureVideoPreviewLayer transforms are not working. Need pan/zoom solution.
I am implementing pan and zoom features for an iPadOS app that uses a custom USB camera device. I am using an update function (shown below) to apply transforms for scale and translation, but they are not working. By re-enabling the animation I can see that the scale transform seems to initially take effect, but then the image animates back to its original scale; this all happens in a fraction of a second, but I can see it. The translation transform seems to have no effect at all. Printing the value of AVCaptureVideoPreviewLayer.transform before and after does show that my values have been applied.

```swift
private func updateTransform() {
#if false
    // Disable default animation.
    CATransaction.begin()
    CATransaction.setDisableActions(true)
    defer { CATransaction.commit() }
#endif

    // Apply the transform.
    logger.debug("\(String(describing: self.videoPreviewLayer.transform))")
    let transform = CATransform3DIdentity
    let translate = CATransform3DTranslate(transform, translationX, translationY, 0)
    let scale = CATransform3DScale(transform, scale, scale, 1)
    videoPreviewLayer.transform = CATransform3DConcat(translate, scale)
    logger.debug("\(String(describing: self.videoPreviewLayer.transform))")
}
```

My question is this: how can I properly implement pan/zoom for an AVCaptureVideoPreviewLayer? Or even better, if you see a problem with my current approach or understand why the transforms I am applying do not work, please share that information.
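One approach worth trying (a sketch under assumptions, not a confirmed fix): apply the transform to a hosting UIView instead of the preview layer itself, which sidesteps implicit CALayer animations and any internal adjustments to the preview layer. Here previewContainerView is a hypothetical UIView whose layer hosts the AVCaptureVideoPreviewLayer:

```swift
// Hypothetical container view hosting the preview layer.
previewContainerView.transform = CGAffineTransform.identity
    .translatedBy(x: translationX, y: translationY)
    .scaledBy(x: scale, y: scale)
```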
Replies: 0 · Boosts: 0 · Views: 319 · May ’24
Extracting metadata from MXF files
I'm trying to read metadata from MXF files, without success: the AVAsset metadata array comes back empty. I saw mentions of "MTRegisterProfessionalVideoWorkflowFormatReaders", but there is absolutely no documentation and I don't know where to look. Has anyone encountered this? Please help with any information.
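A sketch of the registration approach the post mentions, under the assumption that the function behaves as its name suggests (it is public in the MediaToolbox framework on macOS but undocumented):

```swift
import AVFoundation
import MediaToolbox

// Assumption: registering the professional-video-workflow format readers
// before creating the asset enables MXF parsing. The path is a placeholder.
MTRegisterProfessionalVideoWorkflowFormatReaders()

let asset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/file.mxf"))
Task {
    let metadata = try await asset.load(.metadata)
    print(metadata)
}
```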
Replies: 1 · Boosts: 0 · Views: 255 · May ’24
iOS app crashes while converting CVPixelBuffer to JPG
I have been seeing some crash reports for my app on some devices (not all of them). The crash occurs while converting a CVPixelBuffer captured from video to a JPG using VTCreateCGImageFromCVPixelBuffer from VideoToolbox. I have not been able to reproduce the crash on local devices, even under adverse memory conditions (many apps running in the background). The field crash reports show that VTCreateCGImageFromCVPixelBuffer does the conversion on another thread, and that thread crashed in a call to vConvert_420Yp8_CbCr8ToARGB8888_vec. Any suggestions on how to debug this further would be helpful.
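Not a diagnosis, but a possible cross-check: routing the same conversion through Core Image (sketch below) can help isolate whether the crash is specific to the VideoToolbox path:

```swift
import CoreImage

// Alternative JPEG path via Core Image, for comparison with
// VTCreateCGImageFromCVPixelBuffer.
func jpegData(from pixelBuffer: CVPixelBuffer) -> Data? {
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    guard let colorSpace = CGColorSpace(name: CGColorSpace.sRGB) else { return nil }
    return context.jpegRepresentation(of: ciImage, colorSpace: colorSpace)
}
```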
Replies: 2 · Boosts: 0 · Views: 354 · Apr ’24
AVCapturePhoto always highest resolution with builtInDualWideCamera and constituent device photo delivery enabled
I have built a camera application which uses an AVCaptureSession with the AVCaptureDevice set to .builtInDualWideCamera and isVirtualDeviceConstituentPhotoDeliveryEnabled = true to enable delivery of "simultaneous" photos (AVCapturePhoto) for a single capture request. I am using the hd1920x1080 preset, but both the wide and ultra-wide photos are being delivered at the highest possible resolution (4224x2376). I've tried to disable every setting that suggests it should use that 4K resolution rather than 1080p on the AVCapturePhotoOutput, AVCapturePhotoSettings, and AVCaptureDevice, but nothing has worked.

Some debugging that I've done:

When I turn off constituent photo delivery by commenting out the line of code below, I get a single photo delivered at the 1080p resolution, as you'd expect:

```swift
// photoSettings.virtualDeviceConstituentPhotoDeliveryEnabledDevices = captureDevice.constituentDevices
```

I tried constituent photo delivery with .builtInDualCamera and got only 4K results (same as described above).

I tried using an AVCaptureMultiCamSession with .builtInDualWideCamera and also got only 4K imagery.

I inspected the resolved settings on photo.resolvedSettings.photoDimensions; the dimensions suggest the imagery should be 1080p, but when I inspect the UIImage, it is always 4K:

```swift
guard let imageData = photo.fileDataRepresentation() else { return }
guard let capturedImage = UIImage(data: imageData) else { return }
print("photo.resolvedSettings.photoDimensions", photo.resolvedSettings.photoDimensions) // 1920x1080
print("capturedImage.size", capturedImage.size) // 4224x2376
```

Any help here would be greatly appreciated, because I've run out of things to try and documentation to follow 🙏
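One more thing that may be worth ruling out (an assumption, not a confirmed fix): on iOS 16 and later, still-photo resolution is governed by maxPhotoDimensions rather than the session preset, so capping it explicitly might bring the constituent photos down to 1080p. A sketch, reusing the photoOutput and photoSettings from the post:

```swift
// Assumption: the requested dimensions must match one of the active
// format's supported photo dimensions.
let cap = CMVideoDimensions(width: 1920, height: 1080)
photoOutput.maxPhotoDimensions = cap
photoSettings.maxPhotoDimensions = cap
```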
Replies: 2 · Boosts: 0 · Views: 426 · Apr ’24
Empty space in ScreenCaptureKit full-screen window recording
I am using ScreenCaptureKit to capture windows and record them. But from macOS Sonoma onwards I see weird behaviour when I try to capture a window that is in full-screen mode: the CMSampleBuffer returned by ScreenCaptureKit has empty space at the top of the full-screen window content, and the contentRect attachment in the CMSampleBuffer includes this empty space, so there is no way to know what the actual window content in the CMSampleBuffer is. The CaptureSample sample code provided by Apple does not enumerate full-screen windows; I made a change to enumerate them, and the issue reproduces there as well. I'm attaching an image showing the empty space. Has anybody encountered this issue?
Replies: 4 · Boosts: 0 · Views: 356 · Apr ’24
RealityView with FairPlay
I'm trying to implement playback of HLS content with FairPlay, and I want to insert it into a RealityView using a VideoMaterial on a sphere. When I use unencrypted HLS content everything works correctly, but with FairPlay it doesn't. To initialize FairPlay I am using the following in the view:

```swift
let contentKeyDelegate = ContentKeySessionDelegate(licenseURL: licenseURL, certificateURL: certificateURL)
// Create the Content Key Session using the FairPlay Streaming key system.
let contentKeySession = AVContentKeySession(keySystem: .fairPlayStreaming)
contentKeySession.setDelegate(contentKeyDelegate, queue: DispatchQueue.main)
contentKeySession.addContentKeyRecipient(asset)
```

Has anyone else encountered this problem?

Note: I'm testing on Vision Pro directly because the simulator doesn't support FairPlay.
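For context, a sketch of the playback side this setup would feed into, assuming `asset` is the AVURLAsset registered above and `sphere` is a ModelEntity already added to the RealityView's content:

```swift
import AVFoundation
import RealityKit

// Assumes `asset` was added as a content key recipient before playback.
let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
let material = VideoMaterial(avPlayer: player)
sphere.model?.materials = [material]
player.play()
```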
Replies: 2 · Boosts: 0 · Views: 402 · Apr ’24
Using "include=albums" on the catalog/<storefront>/songs endpoint with a filter causes 504, gateway timeouts.
When accessing the REST API, if you apply "include=albums" to a catalog/<storefront>/songs request with a filter on ISRC, the API will, without fail, return a 504 error status. If you remove "include=albums", or replace it with something like "include=artists", it works fine. It has been like this for months, and we need to get album details back with these requests. Could the Apple team please respond and verify the issue, as it's blocking production for us. Thanks.
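For reference, a sketch of the failing request shape (placeholder storefront, ISRC, and developer token, not the poster's literal values):

```swift
import Foundation

// Placeholder values: "us", "{ISRC}", "{DEVELOPER_TOKEN}".
var components = URLComponents(string: "https://api.music.apple.com/v1/catalog/us/songs")!
components.queryItems = [
    URLQueryItem(name: "filter[isrc]", value: "{ISRC}"),
    URLQueryItem(name: "include", value: "albums")
]
var request = URLRequest(url: components.url!)
request.setValue("Bearer {DEVELOPER_TOKEN}", forHTTPHeaderField: "Authorization")
// Reported behavior: this combination returns HTTP 504, while dropping
// include=albums (or using include=artists) succeeds.
```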
Replies: 0 · Boosts: 0 · Views: 278 · Apr ’24
Compile error in MyTarget-Swift.h: unknown class name SCContentSharingPickerObserver
I am working on a ScreenCaptureKit sample with SCContentSharingPickerObserver. My target is SwiftUI-based and calls an Objective-C class method, so I added [MyTarget]-Bridging.h and [MyTarget]-Swift.h. I get a compile error, "unknown class name SCContentSharingPickerObserver", in [MyTarget]-Swift.h, but I do not know how to fix this error since [MyTarget]-Swift.h is an Xcode-generated file. My macOS deployment target is 14.0 and the Swift language version is 5. Does anyone know how to fix this error, or should I wait for an Xcode update?
Replies: 0 · Boosts: 0 · Views: 256 · Apr ’24