Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic

iOS: Want to use USB client mode to send data.
I want my iOS app to be able to use USB client mode to send LiDAR data and camera frames to another device. What are my options for doing this? I've found IOUSBHost for host mode, but I want to see all my options. The device I want driving the bus is a Meta Quest 3, which, for clarity, is essentially an Android device. The iPhone is to be used as a sensor hub, sending data to the Quest 3 for further processing. As a workaround, I could let the iPhone drive the bus and have the Quest 3 use Android's accessory mode, which lets other devices drive the USB bus. But there are more USB devices I want to attach for my project, and doing this would make that more difficult, so I want to avoid it.
Replies: 0 · Boosts: 0 · Views: 294 · Activity: Jan ’25
New playback error on iOS/tvOS 18.x "CoreMediaErrorDomain Code=-15486"
Hello, Our users have started to see a new fatal AVPlayer error during playback starting with iOS/tvOS 18.0. The error is defined as "CoreMediaErrorDomain Code=-15486". We have not been able to reproduce this issue locally within our development team. Is there any documentation on the cause of this error or steps to recover from this error? Thank you, Howard
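Since the error can't be reproduced locally, a small logging sketch like the one below may help capture what affected users actually hit. It only uses standard AVPlayerItem observation; the class name is illustrative, and this is a sketch, not a diagnosis of -15486 itself.

import AVFoundation

final class PlaybackErrorLogger {
    private var statusObservation: NSKeyValueObservation?

    func attach(to item: AVPlayerItem) {
        // Observe the item's status so the underlying NSError is captured when playback fails.
        statusObservation = item.observe(\.status, options: [.new]) { item, _ in
            if item.status == .failed, let error = item.error as NSError? {
                print("Playback failed: \(error.domain) code=\(error.code) userInfo=\(error.userInfo)")
            }
        }

        // Also listen for mid-playback failures and dump the item's error log entries.
        NotificationCenter.default.addObserver(
            forName: .AVPlayerItemFailedToPlayToEndTime,
            object: item,
            queue: .main
        ) { note in
            let error = note.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? NSError
            print("Failed to play to end: \(String(describing: error))")
            for event in item.errorLog()?.events ?? [] {
                print("errorLog: \(event.errorDomain) \(event.errorStatusCode) \(event.errorComment ?? "")")
            }
        }
    }
}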
Replies: 0 · Boosts: 2 · Views: 633 · Activity: Feb ’25
Strange behaviour after modifying exposure duration and going back to AVCaptureExposureModeContinuousAutoExposure
When I set a custom exposure duration, like 1/8, and then switch back to continuous auto exposure, scenes that previously metered at around 1/17 now end up at something like 1/5 or 1/10. As a result, the preview becomes laggy and overexposed. I'm not sure why this is happening.
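For reference, a minimal sketch of the sequence being described, clamping the custom duration to the active format's supported range before handing control back to auto exposure. Here device stands for the already-configured AVCaptureDevice; this is a sketch of the API usage, not a fix for the metering behaviour described above.

import AVFoundation

func setCustomExposure(on device: AVCaptureDevice, seconds: Double) throws {
    try device.lockForConfiguration()
    // Clamp the requested duration to what the active format supports.
    let format = device.activeFormat
    let requested = CMTime(seconds: seconds, preferredTimescale: 1_000_000)
    let clamped = min(max(requested, format.minExposureDuration), format.maxExposureDuration)
    device.setExposureModeCustom(duration: clamped, iso: AVCaptureDevice.currentISO) { _ in }
    device.unlockForConfiguration()
}

func restoreAutoExposure(on device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    if device.isExposureModeSupported(.continuousAutoExposure) {
        device.exposureMode = .continuousAutoExposure
    }
    device.unlockForConfiguration()
}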
Replies: 0 · Boosts: 0 · Views: 405 · Activity: Sep ’24
Usage of colorCurves CIFilter
How can I use my RGB curve points:

let redCurve = [CIVector(x: 0, y: 0), CIVector(x: 0.235, y: 0.152), CIVector(x: 0.5, y: 0.5), CIVector(x: 1, y: 1)]
let greenCurve = [CIVector(x: 0, y: 0), CIVector(x: 0.247, y: 0.196), CIVector(x: 0.5, y: 0.5), CIVector(x: 1, y: 1)]
let blueCurve = [CIVector(x: 0, y: 0), CIVector(x: 0.235, y: 0.184), CIVector(x: 0.466, y: 0.466), CIVector(x: 1, y: 1)]

in the colorCurves filter which I've found in the Apple docs:

func colorCurves(inputImage: CIImage) -> CIImage {
    let colorCurvesEffect = CIFilter.colorCurves()
    colorCurvesEffect.inputImage = inputImage
    colorCurvesEffect.curvesDomain = CIVector(x: 0, y: 1)
    colorCurvesEffect.curvesData = Data(
        bytes: [Float32]([
            0.0, 0.0, 0.0,
            0.8, 0.8, 0.8,
            1.0, 1.0, 1.0
        ]), count: 36)
    colorCurvesEffect.colorSpace = CGColorSpaceCreateDeviceRGB()
    return colorCurvesEffect.outputImage!
}
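One possible way to bridge the two representations, offered as a hedged sketch: CIColorCurves takes curvesData as evenly spaced, interleaved RGB samples across curvesDomain, so the control points above can be resampled by linear interpolation before packing them into Data. The sample count and helper names below are assumptions, not from the original post.

import CoreImage
import CoreImage.CIFilterBuiltins

// Linearly interpolate a piecewise-linear curve (control points sorted by x) at position x.
func sample(_ points: [CIVector], at x: CGFloat) -> Float32 {
    guard let first = points.first, let last = points.last else { return Float32(x) }
    if x <= first.x { return Float32(first.y) }
    if x >= last.x { return Float32(last.y) }
    for i in 1..<points.count {
        let p0 = points[i - 1], p1 = points[i]
        if x <= p1.x {
            let t = (x - p0.x) / (p1.x - p0.x)
            return Float32(p0.y + t * (p1.y - p0.y))
        }
    }
    return Float32(last.y)
}

// Pack evenly spaced R/G/B samples into the interleaved Float32 layout curvesData expects.
func curvesData(red: [CIVector], green: [CIVector], blue: [CIVector], sampleCount: Int = 32) -> Data {
    var samples = [Float32]()
    samples.reserveCapacity(sampleCount * 3)
    for i in 0..<sampleCount {
        let x = CGFloat(i) / CGFloat(sampleCount - 1)
        samples.append(sample(red, at: x))
        samples.append(sample(green, at: x))
        samples.append(sample(blue, at: x))
    }
    return samples.withUnsafeBufferPointer { Data(buffer: $0) }
}

func applyColorCurves(to inputImage: CIImage, red: [CIVector], green: [CIVector], blue: [CIVector]) -> CIImage? {
    let filter = CIFilter.colorCurves()
    filter.inputImage = inputImage
    filter.curvesDomain = CIVector(x: 0, y: 1)
    filter.curvesData = curvesData(red: red, green: green, blue: blue)
    filter.colorSpace = CGColorSpaceCreateDeviceRGB()
    return filter.outputImage
}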
Replies: 0 · Boosts: 0 · Views: 345 · Activity: Jan ’25
MPRemoteCommand play conflicting with .playPause gesture
We develop a video playback app on Apple TV which has the following two features:

Its content browsing screen installs a gesture recognizer for presses on the Play/Pause Siri Remote button in order to directly launch a playback. The gesture recognizer is attached to the content browsing UIViewController view.

It presents its own custom playback UI with an AVPlayerLayer for the video and supports MPNowPlayingSession in order to publish current playback information and respond to remote commands. It also supports switching between fullscreen and Picture in Picture playback.

Both features work fine, i.e. the playback is launched when pressing the Play/Pause Siri Remote button and, during playback, the playback info is properly advertised on other devices and remote commands are triggered as expected.

However, when pressing the Play/Pause Siri Remote button while the video is playing in PiP, the "pause" remote command is sometimes triggered instead of the .playPause gesture recognizer. The issue may not occur on the first press, but it does on subsequent Play/Pause presses. Navigating a bit in the app UI seems to help prevent the issue from occurring. Finally, the issue only occurs if the video is playing: if the video is paused, the Play/Pause Siri Remote button gesture is always recognized instead of the remote command.

Please note that, before using MPNowPlayingSession (and the corresponding MPRemoteCommandCenter), the app was using the default MPRemoteCommandCenter to support remote commands and the issue did not occur.

We don't reproduce this issue with the Apple TV app, so there's probably something we are not doing right. Does anyone have any clue?
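For context, a minimal sketch of the kind of Play/Pause press recognizer described above; the view controller name and action are illustrative only.

import UIKit

final class BrowseViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Recognize presses of the Siri Remote Play/Pause button on the browsing screen.
        let playPauseRecognizer = UITapGestureRecognizer(target: self, action: #selector(handlePlayPause))
        playPauseRecognizer.allowedPressTypes = [NSNumber(value: UIPress.PressType.playPause.rawValue)]
        view.addGestureRecognizer(playPauseRecognizer)
    }

    @objc private func handlePlayPause() {
        // Launch playback of the focused item here.
    }
}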
Replies: 0 · Boosts: 0 · Views: 589 · Activity: Nov ’24
ProRAW to CIRAWFilter to HEIF producing borked HDR results
Following WWDC 2023 "Support HDR images in your app", I'm trying to save 48-megapixel ProRAWs (taken on an iPhone 14 Pro Max) as HDR HEICs to the Photo Library. After processing the ProRAW file using CIRAWFilter, whether I use CIContext.heif10Representation() or convert to a CGImage, then UIImage, and use UIImage.heicData(), I get photos that behave oddly in the Photo Library. They appear too dark, and visibly brighten when first viewed, but more problematic is that the photos brighten a great deal more when you edit them with the Photos editor. This is the behavior when using the itur_2100_PQ color space, but itur_2100_HLG behaves similarly, except that it gets dramatically darker when edited. This behavior occurs whether CIRAWFilter.extendedDynamicRangeAmount is set to 0.0, or 2.0, or not set at all. So what am I doing wrong? Here is a minimal iOS app -- well, just the ContentView -- that demonstrates the issue. You also need a .dng ProRAW file included in the project directory named test.dng. I'd love to include such a file, but I can't. Be prepared for a multi-second wait when you save the photo.

import SwiftUI
import Photos

struct ContentView: View {
    let context = CIContext()
    let hdrColorSpace = CGColorSpace(name: CGColorSpace.itur_2100_PQ)!

    var body: some View {
        VStack(spacing: 100) {
            Button("Save Photo From CGImage/UIImage") {
                savePhotoFromUIImage()
            }
            Button("Save Photo From CIImage") {
                savePhotoDirectFromCIImage()
            }
        }.padding(60)
    }

    // convert RAW with CIRAWFilter to CIImage, then convert to CGImage, then UIImage, then HEIF
    private func savePhotoFromUIImage() {
        if let ciImage = processRAW(url: Bundle.main.url(forResource: "test", withExtension: "dng")!) {
            guard let outputCGImage = context.createCGImage(ciImage, from: ciImage.extent, format: .RGB10, colorSpace: hdrColorSpace) else { return }
            let uiImage = UIImage(cgImage: outputCGImage)
            if let heicData = uiImage.heicData() {
                saveHEIFPhotoToLibrary(imageData: heicData)
            } else {
                print("Failed to convert UIImage to HEIC")
            }
        }
    }

    // convert RAW with CIRAWFilter to CIImage, then to HEIF
    private func savePhotoDirectFromCIImage() {
        if let ciImage = processRAW(url: Bundle.main.url(forResource: "test", withExtension: "dng")!) {
            do {
                let heif = try context.heif10Representation(of: ciImage, colorSpace: hdrColorSpace)
                saveHEIFPhotoToLibrary(imageData: heif)
            } catch {
                print("Failed to get HEIF representation from CIContext")
            }
        }
    }

    private func processRAW(url: URL) -> CIImage? {
        guard let coreRawFilter = CIRAWFilter(imageURL: url) else { return nil }
        coreRawFilter.extendedDynamicRangeAmount = 2.0 // the issue persists whether this is not set, or set to 0, or set to, say, 2.0
        guard let ciImage = coreRawFilter.outputImage else { return nil }
        return ciImage
    }

    private func saveHEIFPhotoToLibrary(imageData: Data) {
        PHPhotoLibrary.shared().performChanges({
            let creationRequest = PHAssetCreationRequest.forAsset()
            let options = PHAssetResourceCreationOptions()
            creationRequest.addResource(with: .photo, data: imageData, options: options)
        }) { success, error in
            if let error = error {
                print("Error saving photo: \(error.localizedDescription)")
            } else {
                print("Photo saved.")
            }
        }
    }
}
Replies: 0 · Boosts: 1 · Views: 522 · Activity: Jan ’25
Is it possible to retrieve EXIF metadata from PHAsset without downloading photos (even if offloaded to iCloud Photo Library)?
The iOS (official) Photos app can display some EXIF-related metadata (e.g. camera and lens info, ISO, shutter speed, F-number) even when photos are offloaded to iCloud and the device is not connected to the internet (e.g. airplane mode). However, with the Photos framework, we need to download photos to retrieve that metadata (which means it will not work in airplane mode). I tried the following methods, but none of them worked when photos were offloaded to iCloud and the device was in airplane mode:

Requesting data with PHImageManager.default().requestImageDataAndOrientation
Result: It does not return Data if the photo is not stored locally on the device, even with options.deliveryMode = .fastFormat

Converting PHAsset#localIdentifier to an AssetsLibrary.framework URL (assets-library://asset/...) (I am aware that AssetsLibrary.framework is deprecated, but this was just a test.)
Result: If PHImageManager does not return Data, ALAsset#defaultRepresentation().metadata() returns an empty NSDictionary
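A small sketch of the pattern being tested, reading EXIF from the image data only when it is available locally (isNetworkAccessAllowed = false keeps Photos from reaching out to iCloud). This does not solve the offloaded case; it only makes the local/remote distinction explicit. The function name is illustrative.

import Photos
import ImageIO

func localEXIF(for asset: PHAsset, completion: @escaping ([String: Any]?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = false   // never download from iCloud
    options.version = .current

    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, info in
        let isDegraded = (info?[PHImageResultIsDegradedKey] as? Bool) ?? false
        guard let data, !isDegraded,
              let source = CGImageSourceCreateWithData(data as CFData, nil),
              let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [String: Any] else {
            completion(nil)   // not available locally (or metadata unreadable)
            return
        }
        completion(properties[kCGImagePropertyExifDictionary as String] as? [String: Any])
    }
}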
Replies: 0 · Boosts: 1 · Views: 583 · Activity: Nov ’24
Slow performance decoding large images with Core Image.
I'm building a camera app that does some post processing after the photo has been taken. With 12MP images the processing is pretty good, but with larger 24MP images it is very slow. I created a very simple example to demonstrate the issue, which loads an image and then renders it to JPEG data.

let context = CIContext()
let imageUrl = Bundle.main.url(forResource: "12mp", withExtension: "jpg")!
let inputData = try! Data(contentsOf: imageUrl)
let ciImage = CIImage(data: inputData)!

let start = CFAbsoluteTimeGetCurrent()
let jpegData = context.jpegRepresentation(of: ciImage, colorSpace: context.workingColorSpace!)
print(jpegData?.count)
print("Resize Completed: " + String(CFAbsoluteTimeGetCurrent() - start))

Running this code on an iPhone 16 Pro with different images produces these benchmarks:

12MP => 0.03s
24MP => 1.22s
48MP => 2.98s

I understand that processing time will increase with resolution, but it doesn't seem linear. I have tried setting different CIContext options such as .useSoftwareRenderer: false, but it has made no difference. From profiling the process, it looks like JPEG decoding is the bottleneck (the profile referenced here was for a 48MP image). Is there any way this can be improved?
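One commonly suggested alternative, offered as an assumption rather than something from this thread: when a smaller working size is acceptable, let ImageIO decode and downscale in a single step before handing the result to Core Image, which avoids a full-resolution JPEG decode.

import ImageIO
import CoreImage

// Decode a JPEG at a reduced size using ImageIO's thumbnail path, then wrap it in a CIImage.
func downscaledCIImage(at url: URL, maxPixelSize: Int) -> CIImage? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }
    let options: [CFString: Any] = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ]
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary) else { return nil }
    return CIImage(cgImage: cgImage)
}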
Replies: 0 · Boosts: 0 · Views: 581 · Activity: Dec ’24
Help with CoreAudio Input level monitoring
I have spent the past 2 weeks diving into CoreAudio and have seemingly run into a wall. For my initial test, I am simply trying to create an AUGraph for monitoring input levels from a user-chosen audio input device (multi-channel in my case). I was not able to find any way to monitor input levels of a single AUHAL input device, so I decided to create a simple AUGraph for input level monitoring. The graph looks like:

[AUHAL Input Device] -> [B1] -> [MatrixMixerAU] -> [B2] -> [AUHAL Output Device]

where B1 is an audio stream consisting of all the input channels available from the input device. The MatrixMixer has metering mode turned on, and level meters are read from each submix of the MatrixMixer using kMatrixMixerParam_PostAveragePower. B2 is a stereo (2 channel) stream from the MatrixMixerAU to the default audio device; however, since I don't really want to pass audio through to an actual output, I have the volume muted on the MatrixMixerAU output channel. I tried using a GenericOutputAU instead of the default system output; however, the GenericOutputAU never seems to pull data from the ring buffer (the graph's renderProc is never called if a GenericOutputAU is used instead of the AUHAL default output device).

I have not been able to get this simple graph to work. I do not see any errors when creating and initializing the graph, and I have verified that the inputProc is being called to fill the ring buffer, but when I read the levels from the MatrixMixer, they are always -758 (silence).

I have posted my demo project on GitHub in hopes I can find someone with CoreAudio expertise to help with this problem. I am willing to move this to DTS Code Level Support if there is someone in DTS with CoreAudio experience.

Notes:
My app is not sandboxed in this test.
I have tried with and without hardened runtime with Audio Input checked.
The multichannel audio device I am using for testing is the Audient iD14 USB-C audio device. It supports 12 input channels and 6 output channels. All input channels have been tested and are working in Ableton Live and Logic Pro.
Of particular interest is that I can't even get the Apple CAPlayThrough demo to work on my system. I see no errors when creating the graph, but all I hear is silence. The MatrixMixerTest from the Apple documentation archives does work, but note that that demo does not use audio input devices; it reads audio into the graph from an audio file.

Link to GitHub project page. Diagram of AUGraph for initial test (code that is on GitHub).

Once I get audio input level metering to work, my plan is to implement something like Phase 2 below, with the purpose of capturing a stereo input stream, mixing to mono, and sending it to lowpass, bandpass, and highpass AUs, again using the MatrixMixer to monitor the levels out of each filter. I have no plans for passthrough audio (sending actual audio out to devices); I am simply monitoring input levels.

Diagram of ultimate scope: rendering audio levels of a stereo-to-mono stream after passing through various filters.
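For comparison, a hedged sketch of how matrix mixer metering is typically enabled and polled. The exact scope and element to query depend on how the mixer's busses are configured, so treat the constants below as a starting point rather than a fix; mixerUnit stands for the AudioUnit obtained from the graph node.

import AudioToolbox

// Enable metering on the matrix mixer, then poll a post-average power level.
func enableMetering(on mixerUnit: AudioUnit) -> OSStatus {
    var metering: UInt32 = 1
    return AudioUnitSetProperty(mixerUnit,
                                kAudioUnitProperty_MeteringMode,
                                kAudioUnitScope_Global,
                                0,
                                &metering,
                                UInt32(MemoryLayout<UInt32>.size))
}

func postAveragePower(of mixerUnit: AudioUnit, element: AudioUnitElement) -> Float32 {
    var level: AudioUnitParameterValue = 0
    // Output scope here; input-side meters would use kAudioUnitScope_Input instead.
    AudioUnitGetParameter(mixerUnit,
                          kMatrixMixerParam_PostAveragePower,
                          kAudioUnitScope_Output,
                          element,
                          &level)
    return level   // dBFS; very large negative values mean silence
}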
Replies: 0 · Boosts: 0 · Views: 373 · Activity: Nov ’24
Slow Image Retrieval Using requestImageForAsset:targetSize:contentMode:options:resultHandler:
Hello, I am experiencing slow image retrieval when using the requestImageForAsset:targetSize:contentMode:options:resultHandler: method in my application. The delay is significantly impacting the performance of my app. Here are the details of my implementation:

for (PHAsset *asset in assets) {
    @autoreleasepool {
        PHImageManager *imageManager = [PHImageManager defaultManager];
        PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
        options.synchronous = YES;
        options.deliveryMode = PHImageRequestOptionsDeliveryModeFastFormat;
        options.resizeMode = PHImageRequestOptionsResizeModeNone;

        [imageManager requestImageForAsset:asset
                                targetSize:CGSizeMake(100, 100)
                               contentMode:PHImageContentModeAspectFill
                                   options:options
                             resultHandler:^(UIImage *thumbnail, NSDictionary *info) {
            CameraRollCellDto *cellDto = [[CameraRollCellDto alloc] init];
            cellDto.index = index;
            cellDto.thumbnail = thumbnail;
            cellDto.propertyDate = asset.creationDate;

            if (self.segmentedTorikomi.selectedSegmentIndex == SEG_INDEX_IKKATSU) {
                cellDto.isSelected = YES;
            } else {
                cellDto.isSelected = NO;
            }

            [list addObject:cellDto];
        }];

        index++;
    }
}

Has anyone else encountered this issue? Are there any known solutions or optimizations that can help improve the speed of image retrieval using this method? Thank you for your assistance.
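A hedged sketch of the usual alternative pattern: asynchronous, opportunistic requests through a PHCachingImageManager so thumbnails stream in instead of blocking the loop. Shown in Swift for brevity; the cell DTO from the post is omitted and the function name is illustrative.

import Photos
import UIKit

let cachingManager = PHCachingImageManager()

func loadThumbnails(for assets: [PHAsset], targetSize: CGSize = CGSize(width: 100, height: 100)) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .opportunistic   // fast degraded image first, better one later
    options.resizeMode = .fast
    options.isNetworkAccessAllowed = true

    // Warm the cache for the assets that are about to be displayed.
    cachingManager.startCachingImages(for: assets, targetSize: targetSize, contentMode: .aspectFill, options: options)

    for asset in assets {
        cachingManager.requestImage(for: asset, targetSize: targetSize, contentMode: .aspectFill, options: options) { image, _ in
            DispatchQueue.main.async {
                // Update the cell that corresponds to `asset` here.
                _ = image
            }
        }
    }
}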
Replies: 0 · Boosts: 0 · Views: 285 · Activity: Jan ’25
Silent output MP4 when using AVAssetReader
Hello, I am trying to create an MP4 by obtaining the content from another source MP4. The source MP4 would be read with AVAssetReader and the output written with AVAssetWriter. I wanted to do partial tests: first, I placed only the video in the output MP4. Now, I am trying to place only the audio in the output MP4. I even managed to get the output MP4 to have the same length (in seconds) as the source MP4. But the problem is simple: the output MP4 is simply silent. Naturally, I want it to have audio. Below are two excerpts from the source code: reading and writing. Note: the variable videoURL belongs to the class where the function writeVideo() is located. Its assignment happens in another scope, already debugged. Snippet:

let semaphore = DispatchSemaphore(value: 0)

func writeVideo() {
    var audioReaderBuffers = [CMSampleBuffer]()

    // File directory
    url = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0].appendingPathComponent("*****/output.mp4")
    guard let url = url else { return }
    try FileManager.default.createDirectory(at: url.deletingLastPathComponent(), withIntermediateDirectories: true)
    if FileManager.default.fileExists(atPath: url.path()) {
        try FileManager.default.removeItem(at: url)
    }

    if let videoURL = videoURL {
        let videoAsset = AVAsset(url: videoURL)
        Task {
            let audioTrack = try await videoAsset.loadTracks(withMediaType: .audio).first!
            let reader = try AVAssetReader(asset: videoAsset)
            let audioSettings = [
                AVFormatIDKey: kAudioFormatLinearPCM,
                AVSampleRateKey: 44100,
                AVNumberOfChannelsKey: 2
            ] as [String: Any]
            let audioOutputTest = try await audioTrack.getAudioSettings()
            let readerAudioOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: audioSettings)
            reader.add(readerAudioOutput)
            reader.startReading()

            while let sampleBuffer = readerAudioOutput.copyNextSampleBuffer() {
                audioReaderBuffers.append(sampleBuffer)
            }
            semaphore.signal()
        }
    }
    semaphore.wait()

    let audioInput = createAudioInput(sampleBuffer: audioReaderBuffers[0])
    let assetWriter = try AVAssetWriter(outputURL: url, fileType: .mp4)
    assetWriter.add(audioInput)
    assetWriter.startWriting()
    assetWriter.startSession(atSourceTime: .zero)

    for (index, buffer) in audioReaderBuffers.enumerated() {
        while !audioInput.isReadyForMoreMediaData {
            usleep(1000)
        }
        audioInput.append(buffer)
    }

    assetWriter.finishWriting {
        switch assetWriter.status {
        case .completed:
            print("Operation completed successfully: \(url.absoluteString)")
        case .failed:
            if let error = assetWriter.error {
                print("Error description: \(error.localizedDescription)")
            } else {
                print("Error not found.")
            }
        default:
            print("Error not found.")
        }
    }
}

Below is the createAudioInput method:

func createAudioInput(sampleBuffer: CMSampleBuffer) -> AVAssetWriterInput {
    let audioSettings = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVSampleRateKey: 48000,
        AVEncoderBitRatePerChannelKey: 64000,
        AVNumberOfChannelsKey: 1
    ] as [String: Any]
    let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioSettings, sourceFormatHint: sampleBuffer.formatDescription)
    audioInput.expectsMediaDataInRealTime = false
    return audioInput
}

I await your help, please.
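Not a diagnosis of the silence, but for comparison, a hedged sketch of the more conventional writing loop: pulling buffers on demand with requestMediaDataWhenReady and anchoring the session at the first buffer's presentation timestamp instead of .zero (relevant only if the source audio does not start at zero, which is an assumption). Helper names are illustrative, and the writer is assumed to have had startWriting() called already.

import AVFoundation

// Pull decoded audio from the reader output and feed it to the writer input on demand.
func transferAudio(from readerOutput: AVAssetReaderTrackOutput,
                   to writerInput: AVAssetWriterInput,
                   writer: AVAssetWriter,
                   completion: @escaping () -> Void) {
    let queue = DispatchQueue(label: "audio.transfer")
    var sessionStarted = false

    writerInput.requestMediaDataWhenReady(on: queue) {
        while writerInput.isReadyForMoreMediaData {
            guard let buffer = readerOutput.copyNextSampleBuffer() else {
                writerInput.markAsFinished()
                writer.finishWriting(completionHandler: completion)
                return
            }
            if !sessionStarted {
                // Anchor the session at the first sample's timestamp rather than .zero.
                writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(buffer))
                sessionStarted = true
            }
            if !writerInput.append(buffer) {
                print("append failed: \(String(describing: writer.error))")
            }
        }
    }
}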
Replies: 0 · Boosts: 0 · Views: 566 · Activity: Sep ’24
Trouble with getting extended Album info from user library
Hello! I have a problem with getting extended album info from the user's library. Note that the app is authorized to use Apple Music according to the documentation. I get albums from the user's library with this code:

func getLibraryAlbums() async throws -> MusicItemCollection<Album> {
    let request = MusicLibraryRequest<Album>()
    let response = try await request.response()
    return response.items
}

This is an example of the albums request response:

{
  "data" : [
    {
      "meta" : {
        "musicKit_identifierSet" : {
          "isLibrary" : true,
          "id" : "1945382328890400383",
          "dataSources" : [ "localLibrary", "legacyModel" ],
          "type" : "Album",
          "deviceLocalID" : { "databaseID" : "37336CB19CF51727", "value" : "1945382328890400383" },
          "catalogID" : { "kind" : "adamID", "value" : "1173535954" }
        }
      },
      "id" : "1945382328890400383",
      "type" : "library-albums",
      "attributes" : {
        "artwork" : { "url" : "musicKit:\/\/artwork\/transient\/{w}x{h}?id=4A2F444C%2D336D%2D49EA%2D90C8%2D13C547A5B95B", "width" : 0, "height" : 0 },
        "genreNames" : [ "Pop" ],
        "trackCount" : 1,
        "artistName" : "Сара Окс",
        "isAppleDigitalMaster" : false,
        "audioVariants" : [ "lossless" ],
        "playParams" : { "catalogId" : "1173535954", "id" : "1945382328890400383", "musicKit_persistentID" : "1945382328890400383", "kind" : "album", "musicKit_databaseID" : "37336CB19CF51727", "isLibrary" : true },
        "name" : "Нимфомания - Single",
        "isCompilation" : false
      }
    },
    {
      "meta" : {
        "musicKit_identifierSet" : {
          "isLibrary" : true,
          "id" : "-8570883332059662437",
          "dataSources" : [ "localLibrary", "legacyModel" ],
          "type" : "Album",
          "deviceLocalID" : { "value" : "-8570883332059662437", "databaseID" : "37336CB19CF51727" },
          "catalogID" : { "kind" : "adamID", "value" : "1618488499" }
        }
      },
      "id" : "-8570883332059662437",
      "type" : "library-albums",
      "attributes" : {
        "isCompilation" : false,
        "genreNames" : [ "Pop" ],
        "trackCount" : 1,
        "artistName" : "TIMOFEEW & KURYANOVA",
        "isAppleDigitalMaster" : false,
        "audioVariants" : [ "lossless" ],
        "playParams" : { "catalogId" : "1618488499", "musicKit_persistentID" : "-8570883332059662437", "kind" : "album", "id" : "-8570883332059662437", "musicKit_databaseID" : "37336CB19CF51727", "isLibrary" : true },
        "artwork" : { "url" : "musicKit:\/\/artwork\/transient\/{w}x{h}?id=BEA6DBD3%2D8E14%2D4A10%2D97BE%2D8908C7C5FC2C", "width" : 0, "height" : 0 },
        "name" : "Не звони - Single"
      }
    },
    ...
  ]
}

In AlbumView, using the task: view modifier, I request extended information about the album with this code:

func loadExtendedInfo(_ album: Album) async throws -> Album {
    let response = try await album.with([.tracks, .audioVariants, .recordLabels], preferredSource: .library)
    return response
}

but in the response some of the fields are always nil, for example recordLabels, releaseDate, url, editorialNotes, copyright. Please tell me what I'm doing wrong?
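Fields such as editorialNotes, copyright and recordLabels generally come from the catalog rather than the local library, so one thing that may be worth trying (an assumption, not something confirmed here) is asking MusicKit to resolve the extended properties from the catalog source instead:

import MusicKit

// Ask MusicKit to resolve extended properties from the catalog instead of the local library.
func loadCatalogBackedInfo(_ album: Album) async throws -> Album {
    try await album.with([.tracks, .audioVariants, .recordLabels], preferredSource: .catalog)
}

Whether this actually populates the missing fields for a library-sourced Album (presumably via the catalogID visible in the response above) is something to verify on a device with the affected library.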
Replies: 0 · Boosts: 0 · Views: 425 · Activity: Dec ’24
Replace MWPhotoBrowser with modern alternative
I have an iPad app, written in Objective-C and distributed through the Enterprise program, as it is not for public use but specific to some large companies. The app has a local database and works offline. For some functions of the app I need to display images (not edit or crop them, just display them). Right now the integrated viewer is MWPhotoBrowser, which has not been maintained for almost 10 years, so in addition to compiler warnings I have to fight with some historical bugs, especially on high resolution images. https://github.com/mwaterfall/MWPhotoBrowser Do you know of a modern and maintained OFFLINE photo viewer? I'm evaluating both free and paid options (maybe an SDK). My needs are very basic. I have found this one, https://github.com/TimOliver/TOCropViewController, but I would need to disable the photo editing features and, more importantly, I would lose the useful feature of displaying multiple images (MWPhotoBrowser showed a gallery for multiple images).
Replies: 0 · Boosts: 0 · Views: 368 · Activity: Nov ’24
Debug MediaExtension plugin in system executable?
I am developing a macOS 15 MediaExtension plugin to enable additional codecs and container formats in AVFoundation. My plugin is sort of working, but I'd like to debug the XPC process that AVFoundation 'hoists' for me from the calling app (i.e. the process hosting my plugin instance that is managing the MESampleBuffer protocol calls, for example). Is there a method to configure Xcode for attaching to this background process for interactive debugging? Right now I have to use Console + print, which is not fun or productive. Does Apple have a working example of a MediaExtension anywhere? This is an exciting API that is very under-documented. I'm willing to spend a Code Review 'credit' for this, but my issues are not quite focused. Any assistance is highly appreciated!
Replies: 0 · Boosts: 0 · Views: 493 · Activity: Dec ’24
Drawing shapes and interacting with them
I have an app that lets the user draw a circle or line over a live video feed. There's a view for circles and a view for lines. These are displayed in ContentView, and their position in the ZStack is altered based on the toolbar selection by the user. Whichever view is on top of the stack is active and usable. That all works fine, but I want the user to be able to select any shape on screen, whether it is a circle or a line. My question is: what is best practice for managing tool selection and drawing the shapes? I'm guessing I should draw all shapes onto one view, but I would like to avoid this as I have all my logic in each shape's respective view.
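Not the only answer, but one common pattern is to keep a single array of shape models and render and hit-test them in one Canvas, so selection works across shape kinds while per-shape logic lives in the model. A rough SwiftUI sketch; every name below is illustrative.

import SwiftUI

enum DrawnShape: Identifiable {
    case circle(id: UUID, center: CGPoint, radius: CGFloat)
    case line(id: UUID, start: CGPoint, end: CGPoint)

    var id: UUID {
        switch self {
        case .circle(let id, _, _), .line(let id, _, _): return id
        }
    }
}

struct OverlayCanvas: View {
    @Binding var shapes: [DrawnShape]
    @Binding var selectedID: UUID?

    var body: some View {
        Canvas { context, _ in
            for shape in shapes {
                var path = Path()
                switch shape {
                case .circle(_, let center, let radius):
                    path.addEllipse(in: CGRect(x: center.x - radius, y: center.y - radius,
                                               width: radius * 2, height: radius * 2))
                case .line(_, let start, let end):
                    path.move(to: start)
                    path.addLine(to: end)
                }
                let isSelected = shape.id == selectedID
                context.stroke(path, with: .color(isSelected ? .yellow : .white), lineWidth: isSelected ? 4 : 2)
            }
        }
        // Hit-test all shapes in one place so any shape can be selected regardless of tool.
        .gesture(DragGesture(minimumDistance: 0).onEnded { value in
            selectedID = shapes.last(where: { hits($0, at: value.location) })?.id
        })
    }

    private func hits(_ shape: DrawnShape, at point: CGPoint, tolerance: CGFloat = 20) -> Bool {
        switch shape {
        case .circle(_, let center, let radius):
            return abs(hypot(point.x - center.x, point.y - center.y) - radius) < tolerance
        case .line(_, let start, let end):
            // Distance from the point to the line segment.
            let dx = end.x - start.x, dy = end.y - start.y
            let lengthSquared = dx * dx + dy * dy
            guard lengthSquared > 0 else { return hypot(point.x - start.x, point.y - start.y) < tolerance }
            let t = max(0, min(1, ((point.x - start.x) * dx + (point.y - start.y) * dy) / lengthSquared))
            let proj = CGPoint(x: start.x + t * dx, y: start.y + t * dy)
            return hypot(point.x - proj.x, point.y - proj.y) < tolerance
        }
    }
}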
Replies: 0 · Boosts: 0 · Views: 324 · Activity: Sep ’24
Increased delay when AUGraph's output device is system output
I'm using an AUGraph to mix audio from different sources for a real time streaming application. Whenever the audio device used as the graph's output device is also the Mac's default output device, the measured latency increases by about 35 milliseconds for wired devices. Any idea why this is? Is there a way around this besides nagging the user not to use the system output in our app?
Replies: 0 · Boosts: 0 · Views: 288 · Activity: Dec ’24
Is it worth using Accelerate to convert 16 bpc RGB to 8 bpc RGB
I am working on an image processing app that requires 8 bit per channel (bpc) images. Sometimes, input images are 16 bpc (e.g. sRGB IEC61966-2.1; extended range). The app already draws the input image in a CGContext producing an 8 bpc CGImage. This works fine if the input is 16 bpc, and I get an 8 bpc image. I am wondering whether it would be better for image quality to convert the 16 bpc images to 8 bpc using Accelerate before that CGContext draw, or does that draw essentially do the equivalent?
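If it helps to experiment, a hedged sketch of the vImage conversion path, assuming tightly packed interleaved 4-channel buffers. Treating width × channels as planar samples is a common trick that lets the planar 16U-to-8U converter handle interleaved data in one pass; function and parameter names are illustrative.

import Accelerate

// Convert an interleaved 16-bit-per-channel buffer to 8 bits per channel.
func convert16to8(src: UnsafeMutableRawPointer, width: Int, height: Int, channels: Int = 4) -> [UInt8] {
    var source = vImage_Buffer(data: src,
                               height: vImagePixelCount(height),
                               width: vImagePixelCount(width * channels),
                               rowBytes: width * channels * MemoryLayout<UInt16>.size)
    var output = [UInt8](repeating: 0, count: width * height * channels)
    output.withUnsafeMutableBytes { outPtr in
        var dest = vImage_Buffer(data: outPtr.baseAddress,
                                 height: vImagePixelCount(height),
                                 width: vImagePixelCount(width * channels),
                                 rowBytes: width * channels)
        // Rescales 0...65535 to 0...255 with rounding.
        _ = vImageConvert_16UToPlanar8(&source, &dest, vImage_Flags(kvImageNoFlags))
    }
    return output
}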
Replies: 0 · Boosts: 0 · Views: 308 · Activity: Sep ’24
iOS AVPlayer Subtitles / Captions
As of iOS 18, as far as I can tell, it appears there's still no AVPlayer option that allows users to toggle the caption / subtitle track on and off. Does anyone know of a way to do this with AVPlayer or with SwiftUI's VideoPlayer? The following code reproduces this issue. It can be pasted into an app playground. This is a random video and a random VTT file I found on the internet.

import SwiftUI
import AVKit
import UIKit

struct ContentView: View {
    private let video = URL(string: "https://server15700.contentdm.oclc.org/dmwebservices/index.php?q=dmGetStreamingFile/p15700coll2/15.mp4/byte/json")!
    private let captions = URL(string: "https://gist.githubusercontent.com/samdutton/ca37f3adaf4e23679957b8083e061177/raw/e19399fbccbc069a2af4266e5120ae6bad62699a/sample.vtt")!

    @State private var player: AVPlayer?

    var body: some View {
        VStack {
            VideoPlayerView(player: player)
                .frame(maxWidth: .infinity, maxHeight: 200)
        }
        .task {
            // Captions won't work for some reason
            player = try? await loadPlayer(video: video, captions: captions)
        }
    }
}

private struct VideoPlayerView: UIViewControllerRepresentable {
    let player: AVPlayer?

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        controller.modalPresentationStyle = .overFullScreen
        return controller
    }

    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {
        uiViewController.player = player
    }
}

private func loadPlayer(video: URL, captions: URL?) async throws -> AVPlayer {
    let videoAsset = AVURLAsset(url: video)
    let videoPlusSubtitles = AVMutableComposition()
    try await videoPlusSubtitles.add(videoAsset, withMediaType: .video)
    try await videoPlusSubtitles.add(videoAsset, withMediaType: .audio)

    if let captions {
        let captionAsset = AVURLAsset(url: captions)
        // Must add as .text. .closedCaption and .subtitle don't work?
        try await videoPlusSubtitles.add(captionAsset, withMediaType: .text)
    }

    return await AVPlayer(playerItem: AVPlayerItem(asset: videoPlusSubtitles))
}

private extension AVMutableComposition {
    func add(_ asset: AVAsset, withMediaType mediaType: AVMediaType) async throws {
        let duration = try await asset.load(.duration)
        try await asset.loadTracks(withMediaType: mediaType).first.map { track in
            let newTrack = self.addMutableTrack(withMediaType: mediaType, preferredTrackID: kCMPersistentTrackID_Invalid)
            let range = CMTimeRangeMake(start: .zero, duration: duration)
            try newTrack?.insertTimeRange(range, of: track, at: .zero)
        }
    }
}
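For streams that already carry a legible (subtitle/caption) track, toggling is normally done through the asset's media selection rather than a composition. A hedged sketch of that API is below; it does not address side-loading an external VTT, which is what the composition above attempts.

import AVFoundation

// Turn the first available subtitle/caption option on, or pass nil to turn captions off.
func setCaptionsEnabled(_ enabled: Bool, for playerItem: AVPlayerItem) async throws {
    guard let group = try await playerItem.asset.loadMediaSelectionGroup(for: .legible) else { return }
    if enabled {
        if let option = group.options.first {
            playerItem.select(option, in: group)
        }
    } else if group.allowsEmptySelection {
        playerItem.select(nil, in: group)
    }
}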
Replies: 0 · Boosts: 0 · Views: 77 · Activity: Apr ’25
No charts genres for some storefronts
Hello, This is about the Get Catalog Top Charts Genres endpoint: GET https://api.music.apple.com/v1/catalog/{storefront}/genres I noticed that for some storefronts, no genre is returned. You can try with the following storefront values: France (fr), Poland (pl), Kyrgyzstan (kg), Uzbekistan (uz), Turkmenistan (tm). Is that a bug or is it on purpose? Thank you.
Replies: 0 · Boosts: 0 · Views: 417 · Activity: Dec ’24