Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic

Post

Replies

Boosts

Views

Activity

Floating point value support in CMFormatDescriptionExtension
I updated macOS to 15.0 yesterday, and floating-point value support under CMFormatDescriptionExtensions and CVPixelBuffer attachments seems to be broken. When I call CMSampleBufferCreateReadyWithImageBuffer() with a CVPixelBuffer, macOS 15.0 always fails when floating-point values are present.

a. kCMFormatDescriptionExtension_GammaLevel

macOS 14.x works with a double value like:

```
NSString* keyGamma = (__bridge NSString*)kCMFormatDescriptionExtension_GammaLevel;
extensions[keyGamma] = @(2.2);
```

b. kCMFormatDescriptionExtension_CleanAperture

I am not sure yet, but the same non-integer value issue also seems to apply to CleanAperture:

kCMFormatDescriptionKey_CleanApertureWidth
kCMFormatDescriptionKey_CleanApertureHeight
kCMFormatDescriptionKey_CleanApertureHorizontalOffset
kCMFormatDescriptionKey_CleanApertureVerticalOffset

Also, when I add rational values to the extensions, they do not pass CMVideoFormatDescriptionMatchesImageBuffer() with:

kCMFormatDescriptionKey_CleanApertureWidthRational
kCMFormatDescriptionKey_CleanApertureHeightRational
kCMFormatDescriptionKey_CleanApertureHorizontalOffsetRational
kCMFormatDescriptionKey_CleanApertureVerticalOffsetRational

Is there any known workaround?
0
0
340
Sep ’24
FairPlay Streaming certificate - expiry date and renewal
The list of certificates on the Apple Developer web console shows the expiry of my FairPlay Streaming certificate as 'Never'. However, if I download the same certificate and import it into my Keychain, the certificate details show the expiry as 11 Oct 2023. Which of these is correct? If the expiry in the certificate is correct, how do I renew it safely? In my app, the lines below fail in the call to streamingContentKeyRequestData:

```
guard let contentIdData = (loadingRequest.request.url?.host ?? "").data(using: .utf8),
      let spcData = try? loadingRequest.streamingContentKeyRequestData(
          forApp: certificate!, // This certificate is expired
          contentIdentifier: contentIdData,
          options: nil
      )
else {
    print("Error: Failed to generate SPC data due to expired certificate.")
    loadingRequest.finishLoading(with: NSError(domain: "com.example.error", code: -3, userInfo: nil))
    return false
}
```
1
0
686
Sep ’24
Audio Muted After Building Unity App to iOS Device – Only Resetting Device Settings Fixes It
Hi, I'm facing an issue with my Unity-based app when deploying it to the AVP. Often, after building and running the app on the device, the audio gets muted, and I couldn't find any setting that lets me unmute it. The only solution I've found is to reset the device settings, which makes the audio work again. Here are a few things I've noticed: The sound works fine after I reset my device's settings. I haven't changed any sound or audio settings on the device before or after deploying the app. The issue doesn't always occur immediately, but when it does, resetting the settings seems to be the only fix. Could there be something in the AVP audio configuration that causes this problem? I'd appreciate any advice or suggestions. Thanks!
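Not an answer, but a minimal sketch of one thing worth ruling out on the native side: explicitly configuring and activating the shared AVAudioSession at launch, so the session isn't left in an unexpected state by the Unity build. The category choice here is an assumption, not a confirmed fix:

```swift
import AVFAudio

// Explicitly claim a playback audio session at launch. This is a
// troubleshooting sketch to rule out session-category problems,
// not a confirmed remedy for the muting issue described above.
func configureAudioSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback, mode: .default)
        try session.setActive(true)
    } catch {
        print("Audio session setup failed: \(error)")
    }
}
```

Calling this early (e.g. from the app delegate or a Unity native plugin) makes the app's intended audio behavior explicit instead of relying on defaults.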
0
0
280
Sep ’24
Native camera and AVCapture image difference
We are trying to build a simple image-capture app using AVFoundation and AVCaptureDevice, with custom settings for exposure point and bias. But when an image is captured with the front camera, the image captured by our app does not match the one from the native camera app: the app's image includes more area than the native one, and the tilt angle also differs between the two images. So is there any way to capture an image exactly matching the native camera using AVFoundation and AVCaptureDevice? (Attached: "Native" and "Custom" captures.)
0
0
609
Sep ’24
fcpxml asset-clip "tcFormat" attribute question
I'm trying to write code that generates an fcpxml file so I can automate Final Cut Pro timeline (project) creation. Here's an XML element that FCP imports successfully (and from which it creates a project/timeline):

```
<project name="2013-08-09 19_23_07 (id).mov">
    <sequence format="r1">
        <spine>
            <asset-clip ref="r2" offset="0s" name="2013-08-09 19_23_07 (id).mov" start="146173027/60000s" duration="871871/60000s" tcFormat="DF" audioRole="dialogue"></asset-clip>
        </spine>
    </sequence>
</project>
```

The example above was generated by exporting a simple timeline with a single clip. The problem I'm having is that the media asset has timecode, which yields a start time relative to that timecode. When I remove the timecode attributes and change the start time to "0s",

```
<asset-clip ref="r2" offset="0s" name="2013-08-09 19_23_07 (id).mov" start="0s" duration="871871/60000s" audioRole="dialogue"></asset-clip>
```

FCP complains with this import error:

2013-08-09 19_23_07 (id).fcpxml Invalid edit with no respective media. (/fcpxml[1]/project[1]/sequence[1]/spine[1]/asset-clip[1])

I guess the question is: does AVAsset provide a way to get the timecode information and the timecode-based start offset, or is there a way to tell FCP to use a default start time independent of timecode?
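For what it's worth, a minimal sketch of reading the starting timecode frame from a movie via AVFoundation, assuming the file actually carries a 'tmcd' timecode track (error handling and format-description details elided):

```swift
import AVFoundation
import CoreMedia

// Read the first sample of the asset's timecode track. For 'tmcd' media,
// that sample holds the starting frame number as a 32-bit big-endian integer.
func startTimecodeFrame(of url: URL) throws -> UInt32? {
    let asset = AVAsset(url: url)
    guard let track = asset.tracks(withMediaType: .timecode).first else { return nil }

    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
    reader.add(output)
    guard reader.startReading(),
          let sample = output.copyNextSampleBuffer(),
          let blockBuffer = CMSampleBufferGetDataBuffer(sample) else { return nil }

    var rawFrame: UInt32 = 0
    _ = CMBlockBufferCopyDataBytes(blockBuffer, atOffset: 0,
                                   dataLength: MemoryLayout<UInt32>.size,
                                   destination: &rawFrame)
    return UInt32(bigEndian: rawFrame) // starting frame number of the timecode track
}
```

Multiplying that frame number by the track's nominal frame duration (from the timecode format description) would then give a `start` value in the same rational form FCP writes into the asset-clip.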
1
0
380
Sep ’24
MusicKit UPCs changing and handling that
I use Universal Product Codes (UPCs) in my app to reliably identify albums, after having used album IDs for a time. Album IDs can change over time for no obvious reason (see here for song IDs), so I switched to UPCs, believing they cannot change. Well, apparently they can. A few days ago I populated a JSON file with UPCs, including 196871067713. Today, performing a MusicCatalogResourceRequest for that UPC does not return anything. Putting that UPC into an Apple Music link redirects to https://music.apple.com/de/album/folge-89-im-geistergarten/1683337782?l=en-GB, so I assume the UPC has changed from 196871067713 to 1683337782. Apple Music can handle that and redirects to the new UPC both in the app and on the website, but a MusicCatalogResourceRequest cannot. I filed a suggestion for this (FB15167146), but I need a solution sooner. Can I somehow detect where the URL is redirecting to? Is there a way MusicCatalogResourceRequest can do this? Performing a MusicCatalogSearchRequest could be an option, but it seems unreliable when using the title as the search term. Other ideas? Thank you
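One possible stopgap, sketched below: resolve the redirect yourself with a URLSession task delegate that captures the redirect target, then pull the identifier out of the final path. The lookup URL built from the UPC is an assumption modeled on the links above, not a documented endpoint:

```swift
import Foundation

// Capture the URL a music.apple.com UPC link redirects to,
// without actually following the redirect.
final class RedirectCatcher: NSObject, URLSessionTaskDelegate {
    var target: URL?
    func urlSession(_ session: URLSession, task: URLSessionTask,
                    willPerformHTTPRedirection response: HTTPURLResponse,
                    newRequest request: URLRequest,
                    completionHandler: @escaping (URLRequest?) -> Void) {
        target = request.url
        completionHandler(nil) // stop here; we only want the redirect target
    }
}

func resolveCurrentIdentifier(forUPC upc: String) async -> String? {
    let catcher = RedirectCatcher()
    let session = URLSession(configuration: .ephemeral, delegate: catcher, delegateQueue: nil)
    // Hypothetical lookup URL built from the UPC, mirroring the links in the post.
    guard let url = URL(string: "https://music.apple.com/de/album/\(upc)") else { return nil }
    _ = try? await session.data(from: url)
    return catcher.target?.lastPathComponent // e.g. the numeric identifier in the final path
}
```

The returned identifier could then be fed back into a MusicCatalogResourceRequest, though this obviously depends on undocumented redirect behavior.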
1
1
590
Sep ’24
How To Add Multiple Songs/Playlist to the Queue?
A couple of weeks ago I got help here to play one song, and the solution to my problem was that I wasn't adding the song (Track type) to the queue correctly. Now I want to add a whole playlist's worth of songs to the queue. The problem is that when I try to add an array of the Track type, I get an error. The other part of this issue for me is: how do I access an individual song off the queue after I add it? I see I can do ApplicationMusicPlayer.shared.queue.currentItem, but I think I'm missing/misunderstanding something here. Anyway, I'll post the code I have to show how I'm attempting this at the moment. In this scenario we're getting passed a playlist from another view.

```
import SwiftUI
import MusicKit

struct PlayBackView: View {
    @State var song: Track?
    @State private var songs: [Track] = []
    @State var playlist: Playlist
    private let player = ApplicationMusicPlayer.shared

    var body: some View {
        VStack {
            // Album Cover
            HStack(spacing: 20) {
                if let artwork = player.queue.currentEntry?.artwork {
                    ArtworkImage(artwork, height: 100)
                } else {
                    Image(systemName: "music.note")
                        .resizable()
                        .frame(width: 100, height: 100)
                }

                VStack(alignment: .leading) {
                    // Song Title
                    Text(player.queue.currentEntry?.title ?? "Song Title Not Found")
                        .font(.title)
                        .fixedSize(horizontal: false, vertical: true)
                }
            }
        }
        .padding()
        .task {
            await loadTracks()

            // It's here I thought I could do something like this:
            //     player.queue = tracks
            // since I can do this with one singular track:
            //     player.queue = [song]
            do {
                try await player.queue.insert(songs, position: .afterCurrentEntry)
            } catch {
                print(error.localizedDescription)
            }
        }
    }

    @MainActor
    private func loadTracks() async {
        do {
            let detailedPlaylist = try await playlist.with([.tracks])
            let tracks = detailedPlaylist.tracks ?? []
            setTracks(tracks)
        } catch {
            print(error.localizedDescription)
        }
    }

    @MainActor
    private func setTracks(_ tracks: MusicItemCollection<Track>) {
        songs = Array(tracks)
    }
}
```
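For reference, a minimal sketch of handing a full track collection to the player in one step (assuming MusicKit authorization and a loaded playlist; ApplicationMusicPlayer.Queue can be built directly from a collection of playable items):

```swift
import MusicKit

// Replace the player's queue with every track of a playlist,
// then start playback from the first entry.
@MainActor
func playAll(of playlist: Playlist) async throws {
    let detailed = try await playlist.with([.tracks])
    guard let tracks = detailed.tracks, !tracks.isEmpty else { return }

    let player = ApplicationMusicPlayer.shared
    player.queue = ApplicationMusicPlayer.Queue(for: tracks) // whole collection at once
    try await player.play()

    // Individual entries are then visible via player.queue.entries,
    // and the playing one via player.queue.currentEntry.
}
```

This avoids inserting item-by-item; the queue's entries can afterwards be inspected or reordered individually.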
1
0
615
Sep ’24
Calling AVPlayer's pause method blocks the main thread
Device: iPhone 11, iOS 17.5.1.

```
Main Thread
libsystem_kernel.dylib  __ulock_wait +8
libdispatch.dylib  _dlock_wait +52
libdispatch.dylib  _dispatch_thread_event_wait_slow +52
libdispatch.dylib  __DISPATCH_WAIT_FOR_QUEUE__ +364
libdispatch.dylib  _dispatch_sync_f_slow +144
MediaToolbox  fpic_CopyCurrentEvent +132
AVFCore  __104-[AVPlayer _setRate:withVolumeRampDuration:playImmediately:rateChangeReason:affectsCoordinatedPlayback:]_block_invoke_2 +244
AVFCore  -[AVPlayer _setRate:withVolumeRampDuration:playImmediately:rateChangeReason:affectsCoordinatedPlayback:] +276
AVFCore  -[AVPlayer setRate:] +56
(call to AVPlayer pause)

Thread 81 name: fpic-sync
libsystem_kernel.dylib  __ulock_wait +8
libdispatch.dylib  _dlock_wait +52
libdispatch.dylib  _dispatch_thread_event_wait_slow +52
libdispatch.dylib  __DISPATCH_WAIT_FOR_QUEUE__ +364
libdispatch.dylib  _dispatch_sync_f_slow +144
MediaToolbox  itemasync_CopyProperty +588
MediaToolbox  fpic_CurrentItemMoment +184
MediaToolbox  __fpic_EstablishCurrentEventForCurrentItem_block_invoke +136
libdispatch.dylib  _dispatch_client_callout +16
libdispatch.dylib  _dispatch_lane_barrier_sync_invoke_and_complete +52
MediaToolbox  fpic_ServiceCurrentEvent +600
MediaToolbox  __fpic_NotifyServiceCurrentEvent_block_invoke +912
libdispatch.dylib  _dispatch_call_block_and_release +28
libdispatch.dylib  _dispatch_client_callout +16
libdispatch.dylib  _dispatch_lane_serial_drain +744
libdispatch.dylib  _dispatch_lane_invoke +428
libdispatch.dylib  _dispatch_root_queue_drain +388
libdispatch.dylib  _dispatch_worker_thread +256
libsystem_pthread.dylib  _pthread_start +132
libsystem_pthread.dylib  thread_start +4

Thread 93 name: com.apple.coremedia.player.async.0x303c60240.P/GR
libsystem_kernel.dylib  mach_msg2_trap +8
libsystem_kernel.dylib  mach_msg2_internal +76
libsystem_kernel.dylib  mach_msg_overwrite +432
libsystem_kernel.dylib  mach_msg +20
libdispatch.dylib  _dispatch_mach_send_and_wait_for_reply +540
libdispatch.dylib  dispatch_mach_send_with_result_and_wait_for_reply +56
libxpc.dylib  xpc_connection_send_message_with_reply_sync +260
CoreMedia  FigXPCConnectionSendSyncMessageCreatingReply +288
CoreMedia  FigXPCRemoteClientSendSyncMessageCreatingReply +44
MediaToolbox  remoteXPCPlayer_SetRateWithOptions +148
MediaToolbox  playerasync_runOneCommand +768
MediaToolbox  playerasync_runAsynchronousCommandOnQueue +180
libdispatch.dylib  _dispatch_client_callout +16
libdispatch.dylib  _dispatch_lane_serial_drain +744
libdispatch.dylib  _dispatch_lane_invoke +428
libdispatch.dylib  _dispatch_root_queue_drain +388
libdispatch.dylib  _dispatch_worker_thread +256
libsystem_pthread.dylib  _pthread_start +132
libsystem_pthread.dylib  thread_start +4
```
1
0
429
Sep ’24
Custom AudioObjectPropertySelector on audio plugins to get the data
I successfully retrieved strings, arrays, and other data through a custom AudioObjectPropertySelector, but only as fixed return values. Whenever I modify the code to use dynamic data, it results in an error. Below is my code:

```
case kPlugIn_CustomPropertyID:
{
    *((CFStringRef*)outData) = CFSTR("qin@@@123");
    *outDataSize = sizeof(CFStringRef);
}
break;

case kPlugIn_ContainDic:
{
    CFMutableDictionaryRef mutableDic1 = CFDictionaryCreateMutable(kCFAllocatorDefault, 0, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(mutableDic1, CFSTR("xingming"), CFSTR("qinmu"));
    *((CFDictionaryRef*)outData) = mutableDic1;
    *outDataSize = sizeof(CFPropertyListRef);
    // *((CFPropertyListRef*)outData) = mutableDic;
}
break;

case kPlugIn_ContainArray:
{
    CFMutableArrayRef mutableArray = CFArrayCreateMutable(kCFAllocatorDefault, 0, &kCFTypeArrayCallBacks);
    CFArrayAppendValue(mutableArray, CFSTR("Hello"));
    CFArrayAppendValue(mutableArray, CFSTR("World"));
    *((CFArrayRef*)outData) = mutableArray;
    *outDataSize = sizeof(CFArrayRef);
}
break;
```

These are fixed returns, and there are no issues when I retrieve the data. When I change the return data in kPlugIn_ContainDic to the following, the first time I restart the CoreAudio service and retrieve the data, it works fine. However, when I attempt to retrieve it again, it results in an error:

```
case kPlugIn_ContainDic:
{
    *outDataSize = sizeof(CFPropertyListRef);
    *((CFPropertyListRef*)outData) = mutableDic;
}
break;
```

Error output:

```
HALC_ShellDevice::CreateIOContextDescription: failed to get a description from the server
HAL_HardwarePlugIn_ObjectGetPropertyData: no object
HALPlugIn::ObjectGetPropertyData: got an error from the plug-in routine, Error: 560947818 (!obj)
```

The declaration and usage of mutableDic are as follows:

```
static CFMutableDictionaryRef mutableDic;

static OSStatus BlackHole_Initialize(AudioServerPlugInDriverRef inDriver, AudioServerPlugInHostRef inHost)
{
    OSStatus theAnswer = 0;
    gPlugIn_Host = inHost;
    if (mutableDic == NULL) {
        mutableDic = CFDictionaryCreateMutable(kCFAllocatorDefault, 100, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
    }
}

static OSStatus BlackHole_AddDeviceClient(AudioServerPlugInDriverRef inDriver, AudioObjectID inDeviceObjectID, const AudioServerPlugInClientInfo* inClientInfo)
{
    CFStringRef string = CFStringCreateWithFormat(kCFAllocatorDefault, NULL, CFSTR("%u"), inClientInfo->mClientID);
    CFMutableDictionaryRef dic = CFDictionaryCreateMutable(kCFAllocatorDefault, 0, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(dic, CFSTR("clientID"), string);
    CFDictionarySetValue(dic, CFSTR("bundleID"), inClientInfo->mBundleID);
    CFDictionarySetValue(mutableDic, string, dic);
}
```

Can someone tell me why this happens?
0
0
548
Sep ’24
Strange behaviour after modifying exposure duration and going back to AVCaptureExposureModeContinuousAutoExposure
When I set a custom exposure duration, like 1/8, and then switch back to continuous auto exposure, the exposure duration in areas that were previously 1/17 changes to something like 1/5 or 1/10. As a result, the screen becomes laggy and overexposed. I'm not sure why this is happening.
0
0
404
Sep ’24
Song releaseDate always nil
I am fetching playlist songs from the user's library and also need the releaseDate (or year) of each song for my use case. However, releaseDate is always nil since I upgraded to Sequoia. I am pretty sure this was working before the upgrade, but I couldn't find any documentation on changes related to it. Furthermore, I noticed the IDs now seem to be the catalog IDs instead of the global ones like i.PkdZbQXsPJ4DX04. Here, in a nutshell, is what I am doing:

```
func fetchSongs(playlist: Playlist) async throws {
    let detailedPlaylist = try await playlist.with([.tracks])
    var currentTracks: MusicItemCollection<Track>? = detailedPlaylist.tracks
    repeat {
        for track in currentTracks! {
            guard case .song(let song) = track else {
                print("This is not a song")
                continue
            }
            print(song.releaseDate)
        }
        currentTracks = try await currentTracks?.nextBatch()
    } while currentTracks != nil
}
```
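A possible stopgap, sketched below under the assumption that the library song now carries a catalog ID: re-fetch the song from the catalog, where releaseDate is typically populated:

```swift
import MusicKit

// Fall back to the catalog record when a library song's releaseDate is nil.
func releaseDate(for song: Song) async throws -> Date? {
    if let date = song.releaseDate { return date }
    var request = MusicCatalogResourceRequest<Song>(matching: \.id, equalTo: song.id)
    request.limit = 1
    let response = try await request.response()
    return response.items.first?.releaseDate
}
```

This costs an extra network request per song, so batching or caching would be sensible for whole playlists.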
1
0
386
Sep ’24
iOS18 and Xcode16 using AVPlayer prints lots of warning logs
```
<<<< FigPlayerInterstitial >>>> fpic_ServiceCurrentEvent signalled err=-15671 (kFigPlayerInterstitialError_ClientReleased) (no primary) at FigPlayerInterstitialCoordinator.m:7885
```

(the line above repeats continuously during playback)

My project uses AVPlayer (AVPlayerViewController) to play video. There are continuous warning logs while playing, and when the player is deallocated, it prints the information below:

```
<<<< PlayerRemoteXPC >>>> remoteXPCItem_handleSetProperty signalled err=-12860 (kFigPlayerError_ParamErr) (propertyValue should be MTAudioProcessingTap) at FigPlayer_RemoteXPC.m:2760
```

This only happens on iOS 18, and I have no idea what it means. There is no documentation for FigPlayerInterstitial or the rest of it.
1
2
592
Sep ’24
CarPlay audio lost after iOS 18
iPhone 13 mini updated to iOS 18. CarPlay is wired in my 2021 Ram Laramie. After the update, premium audio is lost; I can only hear low-quality audio. When I manually switch the truck from USB to Bluetooth, the audio comes out in speaker mode on my phone instead of on the truck.
2
2
466
Sep ’24
Apple Relay Registration
I was wondering if anyone could assist with the following query. Apple's Private Relay functionality requires companies to register all email-sending subdomains for the service to function properly. One department has 26 markets with 3 subdomains per market, and another department has around 20 markets with even more subdomains, so the limit of 100 sending domains is exceeded. As a result, we're unable to register all the domains currently being used to send email to our customers. Does anyone have any recommendations to overcome this?
0
0
481
Sep ’24
MusicKit Queue broke in iOS18
It's simple to reproduce. The bug: when you queue a bunch of songs to play, it always queues fewer than you gave it. Here, I'm attempting to play an Apple-curated playlist, and it only queues a subset, usually fewer than 15, and as low as 1 out of 100. Use the system's forward and backward controls to test it. Here is the code; just paste it into the ContentView file and make sure you have the capability to run it.

```
import SwiftUI
import MusicKit

struct ContentView: View {
    var body: some View {
        VStack {
            Button("Play Music") {
                Task {
                    await playMusic()
                }
            }
        }
    }
}

func getOnlySongsFromTracks(tracks: MusicItemCollection<Track>?) async throws -> MusicItemCollection<Song>? {
    var songs: [Song]?
    if let t = tracks {
        songs = [Song]()
        for track in t {
            if case let .song(song) = track {
                songs?.append(song)
                print("track is song \(track.debugDescription)")
            } else {
                print("track not song \(track.debugDescription)")
            }
        }
    }
    if let songs = songs {
        let topSongs = MusicItemCollection(songs)
        return topSongs
    }
    return nil
}

func playMusic() async {
    // Request authorization
    let status = await MusicAuthorization.request()
    guard status == .authorized else {
        print("Music authorization denied.")
        return
    }
    do {
        // Perform a hardcoded search for a playlist
        let searchTerm = "2000"
        let request = MusicCatalogSearchRequest(term: searchTerm, types: [Playlist.self])
        let response = try await request.response()
        guard let playlist = response.playlists.first else {
            print("No playlists found for the search term '\(searchTerm)'.")
            return
        }

        // Fetch the songs in the playlist
        let detailedPlaylist = try await playlist.with([.tracks])
        guard let songCollection = try await getOnlySongsFromTracks(tracks: detailedPlaylist.tracks) else {
            print("no songs found")
            return
        }
        guard let t = detailedPlaylist.tracks else {
            print("no tracks")
            return
        }

        // Create a queue and play
        let musicPlayer = ApplicationMusicPlayer.shared
        let q = ApplicationMusicPlayer.Queue(for: t)
        musicPlayer.queue = q
        try await musicPlayer.play()
        print("Now playing playlist: \(playlist.name)")
    } catch {
        print("An error occurred: \(error.localizedDescription)")
    }
}
```
3
1
710
Sep ’24
iOS 18 arm64 simulator disables audio output with unknown "AudioConverterService" error
Hello, I'm getting an unknown, never-before-seen error at application launch when running my iOS SpriteKit game on the iOS 18 arm64 simulator from Xcode 16.0 (16A242d):

AudioConverterOOP.cpp:847 Failed to prepare AudioConverterService: -302

This occurs on all iOS 18 simulator devices, between application(_:didFinishLaunchingWithOptions:) and the first applicationDidBecomeActive(_:); the SKScene object may already have been initialized by SpriteKit, but the scene's didMove(to:) method hasn't been called yet. Also note that the error message is emitted from a secondary (non-main) thread, obviously not created by the app. After the error occurs, no SKScene is able to play audio. This never occurred on iOS versions prior to 18, neither on physical devices nor on the simulator. Has anyone seen anything like this on a physical device running 18? Unfortunately, at the moment I cannot test on an 18 device myself, only on the simulator... Thank you, D.
2
0
857
Sep ’24
How best to handle AirPods audio glitches in Game Mode?
Hello! The new lower latency support for AirPods in Game Mode is impressive, but I'm not sure of the best way to handle the transition into/out of Game Mode while audio is playing. In order to lower the latency, the system appears to drop some number of samples, with the result being a good deal less latency. My use case is macOS where it's easier to switch in/out of the fullscreen game (a simple swipe left), thus causing more issues for Game Mode since the audio is playing the entire time. It would be nice if offscreen games could remain in game mode, but I understand not wanting to give developers that control. Are there any best practices for avoiding or masking the audio glitch caused by this skip-ahead? Is there a system event I can receive to know when Game Mode is about to be enabled or disabled, where I could perhaps fade out the audio? My callback checks the inTimestamp->mSampleTime value to detect gaps, but it only rarely detects a Game Mode gap, even though the audio skip-ahead always happens. BTW, I am currently only developing on macOS (15.0) and I'm working at a low level with AudioUnit callbacks and a SpatialMixer. I am not currently using any higher-level audio APIs. And here's a few questions I don't necessarily expect answers to, but it doesn't hurt to ask: Is there any additional technical details about how this latency reduction works, or exactly how much of a reduction is achieved (or said another way, how many samples are dropped)? How much does this affect AirPods battery life? And finally, is there a way to query the actual latency value? I check the value for kAudioDevicePropertyLatency but it seems to always report 160ms for AirPods. Thanks!
1
0
1k
Sep ’24