Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under the Media Technologies topic
Each entry below shows the post title and excerpt, followed by its Replies, Boosts, Views, and Activity.

Using an image instead of circles in a particle system
Hi all, I'm working on a particle system. I got it working with drawn circles, and now I want to replace the circles with an image. I'm trying to do so in the draw section, but I'm not sure that's the right place. Any suggestions for how to connect the image from BIN to Xcode and replace the particles with the image(s)? Thank you!
1
0
332
Oct ’24
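If the particle system above is SpriteKit-based (the post doesn't say which framework is in use), the usual way to swap drawn circles for an image is a texture on the emitter. A minimal sketch, with a hypothetical asset-catalog name:

import SpriteKit

// Minimal sketch (assumes SpriteKit): an emitter whose particles render an
// image from the asset catalog instead of programmatically drawn circles.
let emitter = SKEmitterNode()
emitter.particleTexture = SKTexture(imageNamed: "spark") // "spark" is a hypothetical asset name
emitter.particleBirthRate = 80
emitter.particleLifetime = 2.0
emitter.particleSpeed = 120
emitter.particleAlphaSpeed = -0.5
// Then add it inside your SKScene's didMove(to:), e.g. addChild(emitter)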
[iPadOS 18 Beta 3] Camera app cannot detect QR codes on iPad 7th gen
Hi, after installing iPadOS 18 Beta 3 on my iPad 7th gen, the default Camera app no longer detects QR codes. I tried updating to Beta 7, but the issue remains. Third-party apps that use AVCaptureMetadataOutput from the AVFoundation framework to detect QR codes no longer work either. You can reproduce the issue by running the default Camera app, or the AVFoundation sample code from the Apple developer site, on an iPad 7th gen with the iPadOS 18 beta installed: https://developer.apple.com/documentation/avfoundation/capture_setup/avcambarcode_detecting_barcodes_and_faces Has anyone else experienced this issue? I'd like to know whether it occurs on other iPad models as well. It's similar to an issue that previously occurred with iPadOS 17.4: https://support.apple.com/en-lamr/118614 https://developer.apple.com/forums/thread/748092
5
0
962
Oct ’24
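For anyone trying to reproduce the report above outside the sample project, a minimal AVCaptureMetadataOutput setup for QR detection looks roughly like this (error handling is trimmed and the delegate wiring is left as a comment):

import AVFoundation

// Minimal QR-detection capture setup; names are illustrative.
let session = AVCaptureSession()
guard let camera = AVCaptureDevice.default(for: .video),
      let input = try? AVCaptureDeviceInput(device: camera),
      session.canAddInput(input) else { fatalError("no camera input") }
session.addInput(input)

let output = AVCaptureMetadataOutput()
guard session.canAddOutput(output) else { fatalError("cannot add metadata output") }
session.addOutput(output)
// Restrict to QR codes; this must be set *after* adding the output to the session.
output.metadataObjectTypes = [.qr]
// output.setMetadataObjectsDelegate(self, queue: .main)
session.startRunning()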
MusicKit currentEntry.item is nil but currentEntry exists.
I'm trying to get the item that's assigned to the currentEntry when playing any song; it currently comes up nil while the song is playing. Note that currentEntry returns:

MusicPlayer.Queue.Entry(id: "evn7UntpH::ncK1NN3HS", title: "SONG TITLE")

I'm a bit stumped on the API usage. If the song is playing, how can the queue item be nil?

if queueObserver == nil {
    queueObserver = ApplicationMusicPlayer.shared.queue.objectWillChange
        .sink { [weak self] in
            self?.handleNowPlayingChange()
        }
}

private func handleNowPlayingChange() {
    if let currentItem = ApplicationMusicPlayer.shared.queue.currentEntry {
        if let song = currentItem.item {
            self.currentlyPlayingID = song.id
            self.objectWillChange.send()
            print("Song ID: \(song.id)")
        } else {
            print("NO ITEM: \(currentItem)")
        }
    } else {
        print("No Entries: \(String(describing: ApplicationMusicPlayer.shared.queue.currentEntry))")
    }
}
0
0
365
Oct ’24
PHImageManager cannot resize images
I am using PHImageRequestOptions and PHImageManager to load images into my app, with version .original plus resizeMode .none, and version .original plus resizeMode .exact. Both used to work well, but since iOS 18, version .original with resizeMode .exact doesn't work anymore: the images are loaded but not shown (only the frames?). Does anyone know why? Thank you for reading.
0
0
251
Oct ’24
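For reference, the request pattern described in the post above would look roughly like this (an illustrative sketch; the asset fetch is a stand-in for the poster's own):

import Photos
import UIKit

// Sketch: original version with the exact resize mode, as in the report above.
let asset = PHAsset.fetchAssets(with: .image, options: nil).firstObject!
let options = PHImageRequestOptions()
options.version = .original
options.resizeMode = .exact
options.isNetworkAccessAllowed = true // in case the original lives in iCloud

PHImageManager.default().requestImage(
    for: asset,
    targetSize: CGSize(width: 300, height: 300),
    contentMode: .aspectFill,
    options: options
) { image, info in
    // The poster reports `image` arriving on iOS 18 but not rendering.
    print("got image:", image as Any, "degraded:", info?[PHImageResultIsDegradedKey] as Any)
}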
Slow-motion video playback "ramp-up" stage
In iOS, when I use AVPlayerViewController to play back a slow-motion video, it has a "ramp-up" stage at the start and a "ramp-down" stage at the end, during which the video plays at normal speed (i.e., not slow motion). My question is: are these non-slow-motion stages defined in the video file itself (e.g., some kind of metadata)? Or is it just a playback approach used by AVPlayerViewController? Thanks!
1
1
503
Oct ’24
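One way to investigate the question above: slow-motion clips delivered by Photos are often AVComposition objects whose track segments carry a timeMapping, so the ramps may live in the asset rather than in AVPlayerViewController. A diagnostic sketch (assumes the modern async loading API; a segment whose source and target durations differ plays at a non-1x rate):

import AVFoundation

// Prints the effective playback rate of each segment of the first video track.
func inspectRateSegments(of asset: AVAsset) async throws {
    guard let track = try await asset.loadTracks(withMediaType: .video).first else { return }
    let segments = try await track.load(.segments)
    for segment in segments {
        let mapping = segment.timeMapping
        let rate = mapping.source.duration.seconds / mapping.target.duration.seconds
        print("segment plays at ~\(rate)x")
    }
}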
Spatial video export fails with AVAssetExportSession
We captured a spatial video with an iPhone 15 Pro. When we try to export the video with AVAssetExportSession and AVAssetExportPresetMVHEVC960x960, it always goes to the failed state, and exportSession.error?.localizedDescription yields an "Operation Stopped" error. The implementation is straightforward; other HEVC files export fine. The problem occurs only with MV-HEVC files.

func exportSpatialVideo(videoFilePath: String, outputUrl: URL) {
    let url = URL(fileURLWithPath: videoFilePath)
    let asset = AVAsset(url: url)
    print(asset.description)
    print(asset.tracks.first?.mediaType.rawValue ?? "no tracks")
    let preset = "AVAssetExportPresetMVHEVC960x960"
    let exportSession = AVAssetExportSession(asset: asset, presetName: preset)!
    exportSession.outputURL = outputUrl
    exportSession.shouldOptimizeForNetworkUse = true
    exportSession.outputFileType = .mov
    exportSession.exportAsynchronously {
        switch exportSession.status {
        case .unknown: print("Unknown error")
        case .waiting: print("waiting ...")
        case .exporting: print("exporting ...")
        case .completed: print("completed.")
        case .failed: print("failed. \(String(describing: exportSession.error?.localizedDescription))")
        case .cancelled: print("cancelled.")
        @unknown default: break
        }
    }
}

Is there any solution for it?
1
0
454
Oct ’24
Some problems using AVCaptureControl
I set 3 controls on the AVCaptureSession and then remove them all. The number of controls in the session is indeed 0, but the Camera Control button still shows the previous 3 controls. Going from 3 to 2 or 3 to 1 updates normally; 3 to 0 does not, while 0 to 3 works.

if (self.captureControl.zoom) {
    if (self.zoomScaleControl) {
        self.zoomScaleControl.enabled = false;
        [_session removeControl:self.zoomScaleControl];
    }
    AVCaptureSlider *zoomSlider = [self.captureControl.zoom fetchCaptureSlider];
    [zoomSlider setActionQueue:dispatch_get_main_queue() action:^(float zoomFactor) {
        @strongify(self);
        if ([self.dataOutputDelegate respondsToSelector:@selector(videoCaptureSession:tryChangeZoomScale:)]) {
            [self.dataOutputDelegate videoCaptureSession:self tryChangeZoomScale:zoomFactor];
        }
    }];
    self.zoomScaleControl = zoomSlider;
} else {
    if (self.zoomScaleControl) {
        self.zoomScaleControl.enabled = false;
        [_session removeControl:self.zoomScaleControl];
    }
    self.zoomScaleControl = nil;
}

if (self.captureControl.exposure) {
    if (self.exposureBiasControl) {
        self.exposureBiasControl.enabled = false;
        [_session removeControl:self.exposureBiasControl];
    }
    AVCaptureSlider *exposureSlider = [self.captureControl.exposure fetchCaptureSlider];
    [exposureSlider setActionQueue:dispatch_get_main_queue() action:^(float bias) {
        @strongify(self);
        if ([self.dataOutputDelegate respondsToSelector:@selector(videoCaptureSession:tryChangeExposureBias:)]) {
            [self.dataOutputDelegate videoCaptureSession:self tryChangeExposureBias:bias];
        }
    }];
    self.exposureBiasControl = exposureSlider;
} else {
    if (self.exposureBiasControl) {
        self.exposureBiasControl.enabled = false;
        [_session removeControl:self.exposureBiasControl];
    }
    self.exposureBiasControl = nil;
}

if (self.captureControl.len) {
    if (self.lenControl) {
        self.lenControl.enabled = false;
        [_session removeControl:self.lenControl];
    }
    ORLenCaptureControlCustomModel *len = self.captureControl.len;
    AVCaptureIndexPicker *picker = [len fetchCaptureSlider];
    [picker setActionQueue:dispatch_get_main_queue() action:^(NSInteger selectedIndex) {
        @strongify(self);
        if ([self.dataOutputDelegate respondsToSelector:@selector(videoCaptureSession:didChangeLenIndex:datas:)]) {
            [self.dataOutputDelegate videoCaptureSession:self didChangeLenIndex:selectedIndex datas:self.captureControl.len.indexDatas];
        }
    }];
    self.lenControl = picker;
} else {
    if (self.lenControl) {
        self.lenControl.enabled = false;
        [_session removeControl:self.lenControl];
    }
    self.lenControl = nil;
}

if ([_session canAddControl:self.zoomScaleControl]) {
    [_session addControl:self.zoomScaleControl];
} else {
    self.zoomScaleControl = nil;
}
if ([_session canAddControl:self.lenControl]) {
    [_session addControl:self.lenControl];
} else {
    self.lenControl = nil;
}
if ([_session canAddControl:self.exposureBiasControl]) {
    [_session addControl:self.exposureBiasControl];
} else {
    self.exposureBiasControl = nil;
}
if (_session.controlsDelegate == nil) {
    [_session setControlsDelegate:self queue:GetCaptureControlQueue()];
}
0
0
424
Oct ’24
Sending '$0' risks causing data races
I've had no luck compiling sample code provided by Apple with Xcode 16.0 beta 5: the ScreenCaptureKit demo (https://developer.apple.com/documentation/screencapturekit/capturing_screen_content_in_macos). The part that fails is:

streamOutput.capturedFrameHandler = { continuation.yield($0) }

And the error message is:

Sending '$0' risks causing data races
Task-isolated '$0' is passed as a 'sending' parameter; uses in callee may race with later task-isolated uses

Could someone enlighten me as to why this is an issue and how to avoid it? Thanks in advance!
3
2
2.6k
Oct ’24
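One pattern that often satisfies the Swift 6 checker here is to make the payload type Sendable and yield it by name rather than via the implicit $0; whether that is actually safe depends on the sample's CapturedFrame type, so treat this as a speculative sketch rather than the fix:

// Speculative: only valid if CapturedFrame is genuinely an immutable value type.
extension CapturedFrame: @unchecked Sendable {}

streamOutput.capturedFrameHandler = { frame in
    continuation.yield(frame) // a Sendable value may legally cross isolation domains
}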
Reality Composer file size and iOS 18
Hi folks, I've been creating .reality files in Reality Composer for over a year. Some of the files are up to 500 MB and, prior to the last month, they opened fine as AR projected experiences on even basic iPhones and iPads. Now, I think since iOS 18, a 64 MB file will still open as an AR experience, but files from about 350 MB up don't open: Files just shows a window displaying the file's name, that it's a .reality file, and its size, and no longer opens it into either an AR or Object view of the scene. Has a new size limit been put on .reality files that Files will open, or what else is going on here? I have a client who was about to launch an experience based on a .reality file I can no longer open. Please help.
0
0
502
Oct ’24
Is there a recommended approach to safeguarding against audio recording losses via app crash?
AVAudioRecorder leaves a completely useless chunk of file if a crash happens while recording, and I need to be able to recover. I'm thinking of streaming the recording to disk. I know that's possible with AVAudioEngine, but I also know that API is a headache that will lead to unexpected crashes unless you're lucky or you're the person who built it. Does Apple have a recommended strategy for failsafe audio recordings? I'm considering chunking recordings using many instances of AVAudioRecorder and then stitching those chunks together.
1
0
406
Oct ’24
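On the streaming idea in the post above, a minimal tap-based sketch (one possible approach, not an Apple-recommended recipe): each render buffer is appended to the file immediately, so a crash loses at most the last buffer.

import AVFoundation

// Stream microphone input straight to a CAF file via an input-node tap.
let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)
let url = FileManager.default.temporaryDirectory.appendingPathComponent("take.caf")
let file = try! AVAudioFile(forWriting: url, settings: format.settings)

input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
    do { try file.write(from: buffer) } // appends; data already on disk survives a crash
    catch { print("write failed: \(error)") }
}
engine.prepare()
try! engine.start()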
Upstream Service Error on Apple Music Feed API
Hi, I'm working on an integration with the Apple Music Feed, but over the last day I've been getting a 500 Upstream Service Error on 99% of the API calls I make. Remarkably, retrying the same endpoint 20-30 times sometimes gives the correct response, but mostly it's a 500 error. To give an example, this:

GET https://api.media.apple.com/v1/feed/album/latest

returns this generic error:

{
  "errors": [
    {
      "id": "U4ARRA2QDCGYKYRI2IPEJVBTHY",
      "title": "Upstream Service Error",
      "detail": "Call to get metadata for album feed failed",
      "status": "500",
      "code": "50001"
    }
  ]
}

The same goes for other feeds like song, artist, and so on. Before today I did get the same error occasionally, but a few retries would resolve it. Any insight on what's happening and/or an ETA for a fix? Thank you.
2
1
543
Oct ’24
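While waiting for a server-side fix for the errors above, a small retry wrapper with exponential backoff is a common stopgap. A sketch using URLSession (the attempt count and delays are arbitrary choices, not Apple guidance):

import Foundation

// Retries a request on HTTP 500, doubling the delay after each failed attempt.
func fetchWithRetry(_ request: URLRequest, attempts: Int = 5) async throws -> Data {
    var delay: UInt64 = 500_000_000 // 0.5 s in nanoseconds
    for _ in 0..<(attempts - 1) {
        let (data, response) = try await URLSession.shared.data(for: request)
        if (response as? HTTPURLResponse)?.statusCode != 500 { return data }
        try await Task.sleep(nanoseconds: delay)
        delay *= 2
    }
    // Final attempt: return whatever comes back, 500 or not.
    let (data, _) = try await URLSession.shared.data(for: request)
    return data
}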
What format for writeHEIFRepresentation preserves HDR?
In the WWDC24 session "Use HDR for dynamic image experiences in your app", it's noted that this is how you save edits for Adaptive HDR:

SDR + HDR:
writeHEIFRepresentation(of: sdrImage, to: url, colorSpace: p3Space, options: [.hdrImage: hdrImage])

SDR + Gain:
writeHEIFRepresentation(of: sdrImage, to: url, colorSpace: p3Space, options: [.hdrGainMapImage: gainImage])

This won't compile because the format argument is missing. What format should be used? In the WWDC23 session "Support HDR images in your app", RGBAf, RGBAh, RGBA16, and RGB10 were mentioned, but I'm not sure which one to use. If relevant, I'm editing photos from the user's photo library, so the image was probably, though not necessarily, taken on an iPhone. Thanks!
1
0
674
Oct ’24
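For context on the question above, the full CIContext method does take a format argument. A plausible call for an 8-bit SDR base image plus gain map would look like the following, where the .RGBA8 choice is an assumption (a typical pick for an SDR base image) rather than something the session confirms, and the placeholder images and URL stand in for the poster's own:

import CoreImage

let context = CIContext()
// Placeholders for the poster's sdrImage / gainImage / p3Space / url.
let sdrImage = CIImage(color: .gray).cropped(to: CGRect(x: 0, y: 0, width: 64, height: 64))
let gainImage = CIImage(color: .white).cropped(to: CGRect(x: 0, y: 0, width: 64, height: 64))
let p3Space = CGColorSpace(name: CGColorSpace.displayP3)!
let url = URL(fileURLWithPath: "/tmp/out.heic")

try context.writeHEIFRepresentation(
    of: sdrImage,
    to: url,
    format: .RGBA8, // assumption: 8-bit base; .RGB10/.RGBA16 are the deeper options
    colorSpace: p3Space,
    options: [.hdrGainMapImage: gainImage]
)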
Swift fails to fetch Photo Albums Synced using Windows iTunes from External Drive
See configuration details at the end of this message. Despite numerous attempts, I have been unable to determine the correct syntax to fetch photo albums from my iPad Pro 13-inch using Xcode and Swift. All the photo albums were synced to the iPad using the latest version of Apple iTunes for Windows from an external Western Digital G-Drive hard drive (no iCloud). All synced albums appear under "From My Mac" on the iPad. I only want to access each album's photo and video count. See the sample code snippet below. I have tried multiple subtype options and album types without success: zero albums are always returned, despite there being around 3900 albums in the photo library. Authorization to the photo library does not appear to be the problem.

PHPhotoLibrary.requestAuthorization { status in
    if status == .authorized {
        let result = PHAssetCollection.fetchAssetCollections(with: .album, subtype: .any, options: nil)
        if result.count == 0 {
            print("No albums found.")
            return
        }
    }
}

Any help or suggestions would be greatly appreciated.

Configuration details:
iPad Pro 13-inch (M4) (iPad16,6)
iPadOS 17.7
iCloud turned off
Photo library albums: ~3900
Photo library photos: ~118,000
Photo library videos: ~4800
macOS Sonoma 14.6.1
Xcode 16.0
Swift 5.0 (Xcode default)
Microsoft Windows 10 Pro 22H2
Apple iTunes for Windows 12.13.4.4
1
0
296
Oct ’24
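One variant worth trying for the post above (a sketch, untested against an iTunes-synced library): synced albums have a dedicated PHAssetCollection subtype, so the generic .album/.any fetch may not surface them.

import Photos

// Fetch specifically the iTunes/Finder-synced album subtype and count assets.
PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
    guard status == .authorized else { return }
    let synced = PHAssetCollection.fetchAssetCollections(
        with: .album, subtype: .albumSyncedAlbum, options: nil)
    synced.enumerateObjects { collection, _, _ in
        let assets = PHAsset.fetchAssets(in: collection, options: nil)
        print("\(collection.localizedTitle ?? "untitled"): \(assets.count) items")
    }
}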
Video Hardware Acceleration on Mac
I observe significant performance differences when encoding a video in MP4 format (H.264). The code I use is standard (AVAssetWriter, AVAssetWriterInput, ...). Here is what I notice when I run the same code on different platforms:

On an iPhone, the video is encoded in 3 seconds (iPhone 13, 14, 15, 16, Pro...).
On a Mac with an M2 Pro, the video is encoded in 50 seconds.
On a Mac with an Intel processor (2.3 GHz Intel Xeon W, 18 cores), the video is encoded in 2 minutes.

Encoding on an iPhone is very fast thanks to hardware acceleration. However, I don't understand why I don't get similar performance on the Mac with an M2 Pro, which also has a dedicated component for hardware acceleration (the H.264 media engine). Is hardware acceleration disabled on the Mac?
0
0
439
Oct ’24
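A way to probe the question above is to ask VideoToolbox for a session that requires the hardware encoder; if creation fails, the hardware path isn't available for that configuration. A diagnostic sketch (macOS; the 1920x1080 dimensions are arbitrary):

import VideoToolbox

// Require the hardware H.264 encoder; a non-noErr status means it's unavailable.
var session: VTCompressionSession?
let spec: [CFString: Any] = [
    kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder: true
]
let status = VTCompressionSessionCreate(
    allocator: nil,
    width: 1920, height: 1080,
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: spec as CFDictionary,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil,
    refcon: nil,
    compressionSessionOut: &session)
print(status == noErr ? "hardware encoder available" : "hardware encoder unavailable: \(status)")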
High / prohibitive memory usage for AVAssetExportSession
I am making an app that can take two videos and then stitch them together on screen (one video on the top half and the other on the bottom half). This is achieved with AVMutableComposition, and then I use AVAssetExportSession to export an mp4 file:

guard let export = AVAssetExportSession(asset: composition, presetName: AVAssetExportPreset1920x1080) else { return }
export.exportAsynchronously { ...

When the two input videos are around 1 GB each, starting the export session immediately increases memory usage by ~2 GB, as if it moved the input files into memory up front (my guess), and at some point my app is killed for using too much memory. Is there a way to avoid this upfront memory usage and/or avoid getting killed?
1
3
347
Oct ’24
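The streaming alternative to AVAssetExportSession is an AVAssetReader/AVAssetWriter pipeline, which moves one sample buffer at a time so memory stays flat regardless of input size. A sketch that copies a single video track in passthrough mode; the poster's top/bottom stacking would additionally need an AVVideoComposition and re-encoding:

import AVFoundation

// Streams samples from one asset to an .mp4 without loading the file into memory.
func streamCopy(from asset: AVAsset, to outputURL: URL) async throws {
    let track = try await asset.loadTracks(withMediaType: .video)[0]
    let reader = try AVAssetReader(asset: asset)
    let readerOutput = AVAssetReaderTrackOutput(track: track, outputSettings: nil) // nil = passthrough
    reader.add(readerOutput)

    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
    let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: nil)
    writer.add(writerInput)

    reader.startReading()
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    writerInput.requestMediaDataWhenReady(on: DispatchQueue(label: "export")) {
        while writerInput.isReadyForMoreMediaData {
            guard let buffer = readerOutput.copyNextSampleBuffer() else {
                writerInput.markAsFinished()
                writer.finishWriting { print("done, status: \(writer.status.rawValue)") }
                return
            }
            writerInput.append(buffer)
        }
    }
}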
arm64 Logic Leaking Plugins (Not Calling AP_Close)
I'm running into an issue where, in some cases, when the AUHostingServiceXPC_arrow process is shut down by Logic, the process is terminated abruptly without AP_Close being called on all of the plugins hosted in the process. In our case, we have filesystem resources to clean up, and stale files left over from the last run can cause issues in new sessions, so this leak has some pretty gnarly effects. I can reproduce the issue using only Apple sample plugins, and it seems to be triggered by a timeout: if I have two different AU plugins in the session and add a 1-second sleep to the destructor of one of the sample plugins, Logic force-terminates the process and the remaining destructors are not called (even for the plugins without the sleep). Is there a way to avoid this behavior, or to safely clean up our plugin even if other plugins in the session take a second to tear down?
1
1
517
Oct ’24
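A defensive pattern for the leak above (a sketch, not an Apple-sanctioned fix) is to sweep stale files from previously abruptly terminated instances when the plugin initializes, rather than relying on teardown that may never run. The directory and file-extension scheme here are hypothetical:

import Foundation

// Remove scratch files a cleanly shut-down predecessor would have deleted itself.
func sweepStaleScratchFiles(in pluginScratchDir: URL) {
    let fm = FileManager.default
    guard let items = try? fm.contentsOfDirectory(at: pluginScratchDir,
                                                  includingPropertiesForKeys: nil) else { return }
    for item in items where item.pathExtension == "scratch" {
        try? fm.removeItem(at: item)
    }
}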
Get audio volume from microphone
Hello. We are trying to get the audio volume from the microphone, and we have two questions.

1. Can anyone tell me about AVAudioEngine.InputNode.volume? We expected it to return 0 in silence and a float value up to 1.0 depending on the volume, but it appears to return 1.0 (the default value) at all times. In which cases does it return 0.5, or 0? Sample code is below; the microphone itself works correctly.

// instance members
private var engine: AVAudioEngine!
private var node: AVAudioInputNode!

// start method
self.engine = .init()
self.node = engine.inputNode
engine.prepare()
try! engine.start()

// volume getter
print("\(self.node.volume)")

2. What is the best practice for getting the audio volume from the microphone? Requirements: without AVAudioRecorder (we use the engine for streaming audio), and it should withstand high-frequency access.

Testing info: device iPhone XR, OS version iOS 18.

Best regards.
2
0
815
Oct ’24
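On question 2 above: volume on an input node is a gain control applied to the bus, not a level meter, which is why it reads 1.0. The usual approach is to compute RMS from a tap's sample data; a sketch using Accelerate:

import AVFoundation
import Accelerate

// Measure input level by computing RMS over each tap buffer.
let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
    guard let samples = buffer.floatChannelData?[0] else { return }
    var rms: Float = 0
    vDSP_rmsqv(samples, 1, &rms, vDSP_Length(buffer.frameLength)) // root mean square
    let db = 20 * log10(max(rms, .leastNonzeroMagnitude))         // convert to dBFS
    print("level: \(db) dBFS")
}
engine.prepare()
try! engine.start()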
Play audio data coming from serial port
Hi. I want to read ADPCM-encoded audio data coming from an external device to my Mac via serial port (/dev/cu.usbserial-0001) in 256-byte chunks, and feed it into an audio player. So far I am using Swift and SwiftSerial (https://github.com/mredig/SwiftSerial) to get the data via serialPort.asyncBytes() into an AsyncStream, but I am struggling to understand how to feed the stream to an AVAudioPlayer or similar. I am new to Swift and macOS audio development, so any help to get me on the right track is greatly appreciated. Thx
0
0
268
Oct ’24
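A sketch of the playback side for the post above, assuming the ADPCM has already been decoded to 16-bit PCM (AVAudioPlayerNode can't play ADPCM directly) and assuming a 16 kHz mono stream; match the format to the actual device. Decoded chunks are converted to float buffers and scheduled on a player node as they arrive:

import AVFoundation

// Engine with a player node; buffers queue up behind each other as scheduled.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let format = AVAudioFormat(standardFormatWithSampleRate: 16_000, channels: 1)! // assumed stream parameters
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: format)
try! engine.start()
player.play()

// Call this for each decoded chunk coming off the AsyncStream.
func enqueue(pcmChunk: [Int16]) {
    guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                        frameCapacity: AVAudioFrameCount(pcmChunk.count)) else { return }
    buffer.frameLength = AVAudioFrameCount(pcmChunk.count)
    let dst = buffer.floatChannelData![0]
    for (i, sample) in pcmChunk.enumerated() {
        dst[i] = Float(sample) / Float(Int16.max) // Int16 -> Float32 for the mixer
    }
    player.scheduleBuffer(buffer, completionHandler: nil) // queues after prior buffers
}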