Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.


PHPhotoLibrary.shared().performChanges raised error 3303
I want to modify a photo's EXIF information. That means passing the original image through CGImageDestinationAddImageFromSource and supplying the modified EXIF dictionary as the properties argument. Here is the complete process:

static func setImageMetadata(asset: PHAsset, exif: [String: Any]) {
    let options = PHContentEditingInputRequestOptions()
    options.canHandleAdjustmentData = { (adjustmentData) -> Bool in
        return true
    }
    asset.requestContentEditingInput(with: options, completionHandler: { input, map in
        guard let inputNN = input else { return }
        guard let url = inputNN.fullSizeImageURL else { return }
        let output = PHContentEditingOutput(contentEditingInput: inputNN)
        let adjustmentData = PHAdjustmentData(formatIdentifier: AppInfo.appBundleId(),
                                              formatVersion: AppInfo.appVersion(),
                                              data: Data())
        output.adjustmentData = adjustmentData
        let outputURL = output.renderedContentURL
        guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return }
        guard let dest = CGImageDestinationCreateWithURL(outputURL as CFURL, UTType.jpeg.identifier as CFString, 1, nil) else { return }
        CGImageDestinationAddImageFromSource(dest, source, 0, exif as CFDictionary)
        let d = CGImageDestinationFinalize(dest)
        // d is true, and I checked the content of outputURL: the image was
        // written correctly and converts to a valid UIImage.
        PHPhotoLibrary.shared().performChanges {
            let changeReq = PHAssetChangeRequest(for: asset)
            changeReq.contentEditingOutput = output
        } completionHandler: { succ, err in
            if !succ {
                print(err) // 3303 here, always!
            }
        }
    })
}
1 reply · 0 boosts · 113 views · 5d
Do we need an MFi authentication chip to send video/commands to iPhone?
Hello, my company is developing a product that will send data to/from the phone over cable and Wi-Fi. I have three questions:

1. Do we need an MFi authentication chip in our product if we plan to send video and commands to the iPhone/iPad over a USB or Lightning cable?
2. Likewise, do we need an MFi authentication chip for communication over Wi-Fi? (Informal research suggests the answer is no.)
3. Do we even still need MFi certification at all for Wi-Fi communications? (We are not using HomeKit.)

Thank you!
0 replies · 0 boosts · 208 views · 6d
SFSpeechRecognizer throws "User denied access to speech recognition"
I have created an app where you can speak using SFSpeechRecognizer; it recognizes your speech as text, translates it, and then speaks the translation back using speech synthesis. All locales for SFSpeechRecognizer, and switching between them, work fine while the app is in the foreground. But after I turn off the screen (the app is still running; I only turned off the screen) and try to create a new recognitionTask, the task receives this error: "User denied access to speech recognition". The weird thing is that this only happens with some languages: the error occurs with the Croatian or Hungarian locale but not with the English or Spanish locale.
0 replies · 0 boosts · 160 views · 6d
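A detail that may separate the failing locales from the working ones is on-device model availability; server-backed recognition can behave differently once the device locks. This is only a diagnostic sketch (the locale identifiers are illustrative), not a confirmed explanation:

import Speech

// Diagnostic sketch: compare availability and on-device support across
// the locales in question (identifiers here are illustrative).
func dumpRecognizerCapabilities() {
    for id in ["hr-HR", "hu-HU", "en-US", "es-ES"] {
        guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: id)) else {
            print(id, "recognizer unavailable")
            continue
        }
        print(id,
              "available:", recognizer.isAvailable,
              "on-device:", recognizer.supportsOnDeviceRecognition)
    }
}

If the failing locales report no on-device support, that would point at the server-backed path as the variable to investigate.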
PHPickerViewController update preselectedAssetIdentifiers
I am trying to recreate the iOS Messages app photo selection UI, where a PHPickerViewController is displayed half screen, with the message text field and a scrolling photo viewer on top. I have that UI mostly working, but cannot figure out how to allow a user to remove an image from my scrolling photo viewer (just like in the iOS Messages app). When the picker is initially displayed, I can show selected images using the preselectedAssetIdentifiers. However, if the user taps the "x" to remove an image from the scrolling photo viewer, there is no way that I have found to update that selection in the picker. I can dismiss/show a new picker with the animated property set to false, but that creates a very apparent bounce in the screen. Are there any ways I am missing to accomplish this? Here is what I have so far:
1 reply · 0 boosts · 186 views · 1w
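If the app can target iOS 16 or later, the picker's selection can be updated in place without dismissing it. A minimal sketch, assuming the app tracks the selected identifiers itself (the names below are placeholders):

import PhotosUI

// Sketch: build a reusable picker and update its selection in place.
func makePicker(preselecting selectedIDs: [String]) -> PHPickerViewController {
    var config = PHPickerConfiguration(photoLibrary: .shared())
    config.selectionLimit = 0            // no fixed limit
    config.selection = .ordered          // keep user ordering stable
    config.preselectedAssetIdentifiers = selectedIDs // assumed app-side state
    return PHPickerViewController(configuration: config)
}

// iOS 16+: when the user taps "x" on a thumbnail in the custom strip,
// remove it from the live picker instead of recreating the picker.
func remove(_ assetIdentifier: String, from picker: PHPickerViewController) {
    picker.deselectAssets(withIdentifiers: [assetIdentifier])
}

This avoids the dismiss/re-present bounce, since the same picker instance stays on screen.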
Can we say that HLS uses UDP?
Hello, Apple video engineers. According to the official documentation, HLS is built on HTTP and traditionally ran on top of TCP. However, with the introduction of HTTP/3, which uses QUIC (running on top of UDP), I would like to clarify the following:

- Has the official HLS specification changed in a way that allows it to be considered UDP-based when using HTTP/3?
- Is it fair to say that HLS supports UDP, since the transport can go over HTTP/3 and QUIC?
- Would it be more accurate to say that HLS remains HTTP-dependent, and the transport protocol (TCP or QUIC) only determines how the HTTP requests are delivered?

My thoughts: since HTTP/3 uses QUIC running over UDP, we still can't say that HLS supports UDP in the classical sense in which RTP, RTSP, and SRT do.
0 replies · 0 boosts · 201 views · 1w
[FairPlay Streaming] Time Manipulation Detection in DRM License Servers
Our team conducted security testing and found one vulnerability in FairPlay license acquisition. Our QA engineer manually changed the device's system date and time (setting it 4 days into the future) and was still able to obtain a license response and initiate playback on an iOS device. On an Android device, however, the license acquisition failed. Can you please tell us whether Time Manipulation Detection is available in the FairPlay Streaming SDK?
0 replies · 0 boosts · 213 views · 1w
[Crash Report] PHAssetResourceManager.writeDataForAssetResource causes app crash on iOS 18
We are encountering a critical, intermittently occurring crash when accessing photo data using PHAssetResourceManager.writeDataForAssetResource on iOS 18. The problem does not arise on iOS 17 or earlier versions. We have been unable to identify a consistent reproduction path. Based on user feedback, the issue seems to involve Live Photo and RAW image files. Our investigation has revealed that the crash occurs in the +[PISchema identifier] method of the PhotoImaging framework: when called manually, this method crashes on iOS 18 but works without issues on iOS 17.

Reproduction steps:
1. Fetch a PHAsset.
2. Get its PHAssetResource via [PHAssetResource assetResourcesForAsset:].
3. Call [PHAssetResourceManager writeDataForAssetResource:toFile:options:completionHandler:].

Crash log:

Incident Identifier: CFD60092-FDB1-43B4-BA42-3F507F7B8B96
CrashReporter Key:   260b4780989083a54e0cb451930fe9a3bed64862
Hardware Model:      iPhone13,4
AppStoreTools:       16C5031b
AppVariant:          1:iPhone13,4:18
Code Type:           ARM-64 (Native)
Role:                Foreground
Parent Process:      launchd [1]
Date/Time:           2025-02-15 19:07:57.7054 +0800
Launch Time:         2025-02-15 19:07:55.4106 +0800
OS Version:          iPhone OS 18.3.1 (22D72)
Release Type:        User
Baseband Version:    5.20.03
Report Version:      104

Exception Type:  EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: SIGNAL 6 Abort trap: 6
Terminating Process: mCloud_iPhone [11109]
Triggered by Thread: 11
Application Specific Information: abort() called

Thread 11 name: Dispatch queue: com.apple.NSXPCConnection.m-user.com.apple.photos.service
Thread 11 Crashed:
0  libsystem_kernel.dylib    0x1e850b2d4 __pthread_kill + 8
1  libsystem_pthread.dylib   0x221b4959c pthread_kill + 268
2  libsystem_c.dylib         0x19ec24b08 abort + 128
3  NeutrinoCore              0x1bdcdbdec -[NUAssertionPolicyAbort notifyAssertion:] + 68
4  NeutrinoCore              0x1bdcdbbf4 -[NUAssertionPolicyComposite notifyAssertion:] + 160
5  NeutrinoCore              0x1bdcdc098 -[NUAssertionPolicyUnique notifyAssertion:] + 176
6  NeutrinoCore              0x1bdcdb524 -[NUAssertionHandler handleFailureInFunction:file:lineNumber:currentlyExecutingJobName:description:arguments:] + 156
7  NeutrinoCore              0x1bdcdc4bc _NUAssertFailHandler + 176
8  NeutrinoCore              0x1bdc8ea98 -[NUIdentifier initWithNamespace:name:version:] + 2352
9  NeutrinoCore              0x1bdc8eba8 -[NUIdentifier initWithName:version:] + 84
10 NeutrinoCore              0x1bdc8ec10 -[NUIdentifier initWithName:] + 68
11 PhotoImaging              0x1bda54ce4 +[PISchema identifier] + 36
12 PhotoImaging              0x1bda550fc +[PISchema registeredPhotosSchemaIdentifier] + 32
13 PhotoImaging              0x1bd9d7128 +[PIPhotoEditHelper newComposition] + 28
14 PhotoImaging              0x1bd940798 +[PICompositionSerializer deserializeCompositionFromAdjustments:metadata:formatIdentifier:formatVersion:sidecarData:error:] + 160
15 PhotoImaging              0x1bd9412ec +[PICompositionSerializer deserializeCompositionFromData:formatIdentifier:formatVersion:sidecarData:error:] + 224
16 PhotoLibraryServices      0x1afabf75c -[PLPhotoEditPersistenceManager loadCompositionFrom:formatIdentifier:formatVersion:sidecarData:error:] + 1856
17 PhotoLibraryServices      0x1afabffe4 +[PLPhotoEditPersistenceManager validateAdjustmentData:formatIdentifier:formatVersion:error:] + 108
18 Photos                    0x1af4ac360 __167+[PHContentEditingInputRequestContext contentEditingInputRequestContextForAsset:requestID:managerID:networkAccessAllowed:downloadIntent:progressHandler:resultHandler:]_block_invoke + 260
19 Photos                    0x1af4ac67c -[PHAdjustmentData(ContentEditingInput) _contentEditing_readableByClientWithVerificationBlock:] + 136
20 Photos                    0x1af4ac4b0 -[PHAdjustmentData(ContentEditingInput) _contentEditing_requiredBaseVersionReadableByClient:verificationBlock:] + 88
21 Photos                    0x1af4abb8c -[PHContentEditingInputRequestContext _adjustmentBaseVersionFromResult:request:canHandleAdjustmentData:] + 404
22 Photos                    0x1af4a911c -[PHContentEditingInputRequestContext produceChildRequestsForRequest:reportingIsLocallyAvailable:isDegraded:result:] + 624
23 Photos                    0x1af2c1d10 -[PHMediaRequestContext _produceChildRequestsForRequest:withResult:] + 88
24 Photos                    0x1af2c11e8 -[PHMediaRequestContext mediaRequest:didFinishWithResult:] + 88
25 Photos                    0x1af505184 -[PHAdjustmentDataRequest _finishFromAsynchronousCallback] + 124
26 Photos                    0x1af5050a0 __39-[PHAdjustmentDataRequest startRequest]_block_invoke + 584
27 PhotoLibraryServicesCore  0x1b001be8c __106-[PLAssetsdResourceClient adjustmentDataForAsset:networkAccessAllowed:trackCPLDownload:completionHandler:]_block_invoke.86 + 864
28 CoreFoundation            0x196dd8e34 __invoking___ + 148
29 CoreFoundation            0x196dd7e7c -[NSInvocation invoke] + 428
30 Foundation                0x195a64ae0 __NSXPCCONNECTION_IS_CALLING_OUT_TO_EXPORTED_OBJECT__ + 16
31 Foundation                0x195a63514 -[NSXPCConnection _decodeAndInvokeReplyBlockWithEvent:sequence:replyInfo:] + 532
32 Foundation                0x195a6653c __88-[NSXPCConnection _sendInvocation:orArguments:count:methodSignature:selector:withProxy:]_block_invoke_5 + 188
33 libxpc.dylib              0x221babb80 _xpc_connection_reply_callout + 116
34 libxpc.dylib              0x221b9e2d0 _xpc_connection_call_reply_async + 80
35 libdispatch.dylib         0x19eb6b028 _dispatch_client_callout3 + 20
36 libdispatch.dylib         0x19eb88b64 _dispatch_mach_msg_async_reply_invoke + 340
37 libdispatch.dylib         0x19eb7242c _dispatch_lane_serial_drain + 352
38 libdispatch.dylib         0x19eb73158 _dispatch_lane_invoke + 432
39 libdispatch.dylib         0x19eb7e38c _dispatch_root_queue_drain_deferred_wlh + 288
40 libdispatch.dylib         0x19eb7dbd8 _dispatch_workloop_worker_thread + 540
41 libsystem_pthread.dylib   0x221b44680 _pthread_wqthread + 288
42 libsystem_pthread.dylib   0x221b42474 start_wqthread + 8
0 replies · 0 boosts · 150 views · 1w
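For reference, the three reproduction steps above translate to roughly the following Swift; asset and fileURL are placeholders:

import Photos

// Sketch of the reproduction path; `asset` is a placeholder for a
// Live Photo or RAW PHAsset fetched elsewhere (step 1).
func writeFirstResource(of asset: PHAsset, to fileURL: URL) {
    let resources = PHAssetResource.assetResources(for: asset)      // step 2
    guard let resource = resources.first else { return }
    let options = PHAssetResourceRequestOptions()
    options.isNetworkAccessAllowed = true
    PHAssetResourceManager.default().writeData(for: resource,       // step 3
                                               toFile: fileURL,
                                               options: options) { error in
        if let error { print("write failed:", error) }
    }
}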
How to receive AVMetricEvent performance data?
I would like to look at AVMetricEvent data during video playback, so I have added this code to a test video player app:

let playerItem: AVPlayerItem = ...
let allMetrics = playerItem.allMetrics()
Task.init {
    print("metrics task started")
    do {
        for try await metricEvent in allMetrics {
            print("metric event: \(metricEvent.description)")
        }
    } catch {
        print("unexpected metric iterator error \(error)")
    }
}

Running this in the Simulator on iPhone 16 Pro (18.0) does not result in any "metric event" diagnostic messages being printed while the video associated with this AVPlayerItem is playing; only the "metrics task started" message is seen. What am I doing wrong that prevents metric data from being received?
1 reply · 0 boosts · 163 views · 1w
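One thing worth ruling out is the Simulator itself, since its playback pipeline may not emit the same metrics as a device; testing on hardware could already change the result. Separately, a narrower subscription can confirm whether any events arrive at all. A sketch assuming the same iOS 18 AVMetrics API:

import AVFoundation

// Sketch: subscribe to a single event type; if these fire on hardware
// but not in the Simulator, the original loop is likely fine as written.
func logLikelyToKeepUpEvents(for item: AVPlayerItem) {
    Task {
        for await event in item.metrics(forType: AVMetricPlayerItemLikelyToKeepUpEvent.self) {
            print("likely-to-keep-up event at", event.date)
        }
    }
}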
AVSpeechSynthesizer & Bluetooth Issues
Hello, I have a CarPlay navigation app and use AVSpeechSynthesizer to speak directions to the user. Everything works great in the CarPlay simulator as well as when plugged into my GMC truck. However, I found out yesterday that for one of my users with a Ford truck, the audio would cut in and out. After much troubleshooting, I was able to replicate this on my own truck when using Bluetooth to connect to CarPlay; my user was also using Bluetooth. Has anyone else experienced this? Is there a fix for the problem?

import SwiftUI
import AVFoundation

class TextToSpeechService: NSObject, ObservableObject, AVSpeechSynthesizerDelegate {
    private var speechSynthesizer = AVSpeechSynthesizer()
    static let shared = TextToSpeechService()

    override init() {
        super.init()
        speechSynthesizer.delegate = self
    }

    func configureAudioSession() {
        speechSynthesizer.delegate = self
        do {
            try AVAudioSession.sharedInstance().setCategory(.playback, mode: .voicePrompt, options: [.mixWithOthers, .allowBluetooth])
        } catch {
            print("Failed to set audio session category: \(error.localizedDescription)")
        }
    }

    func speak(_ text: String) {
        Task(priority: .high) {
            let speechUtterance = AVSpeechUtterance(string: text)
            speechUtterance.voice = AVSpeechSynthesisVoice(language: AVSpeechSynthesisVoice.currentLanguageCode())
            try AVAudioSession.sharedInstance().setActive(true, options: .notifyOthersOnDeactivation)
            speechSynthesizer.speak(speechUtterance)
        }
    }

    func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer, didFinish utterance: AVSpeechUtterance) {
        Task {
            stopSpeech()
            try AVAudioSession.sharedInstance().setActive(false)
        }
    }

    func stopSpeech() {
        speechSynthesizer.stopSpeaking(at: .immediate)
    }
}
0 replies · 0 boosts · 191 views · 1w
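One possibility, offered as an assumption rather than a confirmed fix: .allowBluetooth permits the low-bandwidth hands-free (HFP) route, which some head units handle poorly, whereas .allowBluetoothA2DP prefers the higher-quality stereo route. A sketch of the changed option set:

import AVFoundation

// Sketch (assumption, not a verified fix): prefer the A2DP route over
// hands-free (HFP). Behavior on a given head unit is untested.
func configureAudioSessionPreferringA2DP() throws {
    try AVAudioSession.sharedInstance().setCategory(
        .playback,
        mode: .voicePrompt,
        options: [.mixWithOthers, .allowBluetoothA2DP])
}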
AVMIDIPlayer.play() function crashes on iOS 18
This only occurs on iOS 18+. Backtrace attached below.

Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: SIGNAL 6 Abort trap: 6
Terminating Process: NoteKeys [24384]
Triggered by Thread: 0

Last Exception Backtrace:
0  CoreFoundation    0x1a2d4c7cc __exceptionPreprocess + 164 (NSException.m:249)
1  libobjc.A.dylib   0x1a001f2e4 objc_exception_throw + 88 (objc-exception.mm:356)
2  CoreFoundation    0x1a2e47748 +[NSException raise:format:] + 128 (NSException.m:0)
3  AVFAudio          0x1bd41f4c8 -[AVMIDIPlayer play:] + 300 (AVMIDIPlayer.mm:145)
4  NoteKeys          0x1023c0670 SoundGenerator.playData() + 20 (SoundGenerator.swift:170)
5  NoteKeys          0x1023c0670 EditViewController.playBtnTapped(startIndex:) + 940 (EditViewController.swift:2034)
6  NoteKeys          0x1024497fc specialized Keyboard.playBtnTapped(sender:) + 1904 (Keyboard.swift:1249)
7  NoteKeys          0x10244631c Keyboard.playBtnTapped(sender:) + 4 (<compiler-generated>:0)
8  NoteKeys          0x10244631c @objc Keyboard.playBtnTapped(sender:) + 48
9  UIKitCore         0x1a58739cc -[UIApplication sendAction:to:from:forEvent:] + 100 (UIApplication.m:5816)
10 UIKitCore         0x1a58738a4 -[UIControl sendAction:to:forEvent:] + 112 (UIControl.m:942)
11 UIKitCore         0x1a58736f4 -[UIControl _sendActionsForEvents:withEvent:] + 324 (UIControl.m:1013)
12 UIKitCore         0x1a5fe8d8c -[UIButton _sendActionsForEvents:withEvent:] + 124 (UIButton.m:4198)
13 UIKitCore         0x1a5fea5a0 -[UIControl touchesEnded:withEvent:] + 400 (UIControl.m:692)
14 UIKitCore         0x1a57bb9ac -[UIWindow _sendTouchesForEvent:] + 852 (UIWindow.m:3318)
15 UIKitCore         0x1a57bb3d8 -[UIWindow sendEvent:] + 2964 (UIWindow.m:3641)
16 UIKitCore         0x1a564fb70 -[UIApplication sendEvent:] + 376 (UIApplication.m:12972)
17 UIKitCore         0x1a565009c __dispatchPreprocessedEventFromEventQueue + 1048 (UIEventDispatcher.m:2686)
18 UIKitCore         0x1a5659f3c __processEventQueue + 5696 (UIEventDispatcher.m:3044)
19 UIKitCore         0x1a5552c60 updateCycleEntry + 160 (UIEventDispatcher.m:133)
20 UIKitCore         0x1a55509d8 _UIUpdateSequenceRun + 84 (_UIUpdateSequence.mm:136)
21 UIKitCore         0x1a5550628 schedulerStepScheduledMainSection + 172 (_UIUpdateScheduler.m:1171)
22 UIKitCore         0x1a555159c runloopSourceCallback + 92 (_UIUpdateScheduler.m:1334)
23 CoreFoundation    0x1a2d20328 __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 28 (CFRunLoop.c:1970)
24 CoreFoundation    0x1a2d202bc __CFRunLoopDoSource0 + 176 (CFRunLoop.c:2014)
25 CoreFoundation    0x1a2d1ddc0 __CFRunLoopDoSources0 + 244 (CFRunLoop.c:2051)
26 CoreFoundation    0x1a2d1cfbc __CFRunLoopRun + 840 (CFRunLoop.c:2969)
27 CoreFoundation    0x1a2d1c830 CFRunLoopRunSpecific + 588 (CFRunLoop.c:3434)
28 GraphicsServices  0x1eecfc1c4 GSEventRunModal + 164 (GSEvent.c:2196)
29 UIKitCore         0x1a5882eb0 -[UIApplication _run] + 816 (UIApplication.m:3844)
30 UIKitCore         0x1a59315b4 UIApplicationMain + 340 (UIApplication.m:5496)
31 NoteKeys          0x10254bc10 main + 68 (AppDelegate.swift:15)
32 dyld              0x1c870aec8 start + 2724 (dyldMain.cpp:1334)

Thanks very much for any help :)
1 reply · 0 boosts · 126 views · 1w
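The raising frame is inside -[AVMIDIPlayer play:] itself, so there is little to catch from Swift. As an untested mitigation sketch (an assumption, not a documented fix), preparing the player before the first play call:

import AVFAudio

// Sketch (assumption, not a documented fix): prepare buffers before the
// first play call instead of letting play() do the setup lazily.
func startPlayback(_ player: AVMIDIPlayer) {
    player.prepareToPlay()
    player.play {
        print("MIDI playback finished")
    }
}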
3D scanning and 3D model with texture mapping
I am new here and would appreciate help with the code, or an explanation of what to use in Swift, for an app that can capture LiDAR scans and RGB data from taken pictures, generate a 3D mesh, and create an .OBJ, .MTL, and .JPEG file set for further manipulation of the 3D model. From LiDAR scanning I am able to create the 3D mesh and the .OBJ file, but I cannot generate the .MTL and .JPEG for the texture of the 3D model.
1 reply · 0 boosts · 170 views · 1w
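One possible route for the missing .MTL/.JPEG pair (a sketch under assumptions, not a complete pipeline): Model I/O writes the .mtl sidecar automatically when an .obj is exported with a material whose base color references an image file. The texture JPEG itself has to be produced separately, e.g. by projecting captured RGB frames onto the mesh; mesh and textureURL below are placeholders for those outputs:

import ModelIO

// Sketch: attach a material whose base color references the JPEG, then
// export as .obj; Model I/O emits the matching .mtl next to it.
func exportTexturedOBJ(mesh: MDLMesh, textureURL: URL, to objURL: URL) throws {
    let material = MDLMaterial(name: "scanMaterial",
                               scatteringFunction: MDLScatteringFunction())
    material.setProperty(MDLMaterialProperty(name: "baseColor",
                                             semantic: .baseColor,
                                             url: textureURL))
    for case let submesh as MDLSubmesh in mesh.submeshes ?? [] {
        submesh.material = material
    }
    let asset = MDLAsset()
    asset.add(mesh)
    try asset.export(to: objURL) // e.g. scan.obj → writes scan.obj + scan.mtl
}

This assumes the mesh already has texture coordinates; without UVs there is nothing for the .mtl's texture reference to map onto.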
Issues After Switching to v3
I've been having issues with authorization after switching from v1 to v3, and have tried some of the suggestions provided in a reply to an earlier post of mine. I have tried to reach out to Apple Support a few times as well, though I haven't received any support that has helped me move forward. I have tested my token at https://jwt.io and I get "Signature Verified"; I have tried multiple browsers in private/incognito mode. Now, when I try to sign in to my Apple Account to test my player, I receive an error that states "There is a Problem Connecting. There May be an Issue with Your Network." (which is not the case). I have tried everything I can think of; I'm at a loss and would appreciate any help to get my project moving forward. This is what I am seeing in the browser developer console (using Firefox):

Authorization failed: AUTHORIZATION_ERROR: Unauthorized
MKError               https://js-cdn.music.apple.com/musickit/v3/musickit.js:13
authorize             https://js-cdn.music.apple.com/musickit/v3/musickit.js:28
asyncGeneratorStep$w  https://js-cdn.music.apple.com/musickit/v3/musickit.js:28
_next                 https://js-cdn.music.apple.com/musickit/v3/musickit.js:28
                      media.mydomain.com:398:19
                      https://media.mydomain.com/:398
5 replies · 0 boosts · 308 views · 1w
Unknown photos were added to my iPhone Photos app
Several unknown pictures were added to my iPhone Photos. I'm developing an iOS app in SwiftUI, and I'm sure that my in-development app never asked for permission to save photos to the device, and that these pictures were not taken by me, downloaded from the web, or synced from my iCloud. Why does this happen? Has my account or my iPhone been attacked or hacked? By the way, my device has debug mode turned on; could this setting introduce a vulnerability, or could some debug tool add photos to my albums? The attached pictures are the unknown photos in my Photos app. I've searched for this photo with Google and found that many developers on Stack Overflow use this picture when they run into trouble with image display or other manipulation. Where does this photo come from, and why is it in my photo albums?
1 reply · 0 boosts · 232 views · 1w
macOS: AudioUnit packaged as .appex won't load when host app is sandboxed
Hi, I'm working on an audio mixing app that comes with bundled Audio Units providing some of the app's core functionality. For the next release of that app, we are planning to make two changes:

- make the app sandboxed
- package the bundled Audio Units as .appex bundles instead of .component bundles, so we don't need to take care of installing them at the correct spot in the file system

With this new approach, we run into problems where [[AVAudioUnitEffect alloc] initWithAudioComponentDescription:] crashes when trying to load our Audio Unit, with the exception:

AVAEInternal.h:109 [AUInterface.mm:468:AUInterfaceBaseV3: (AudioComponentInstanceNew(comp, &_auv2)): error -10863

Our Audio Unit has the sandboxSafe flag enabled and loads fine when the host app is not sandboxed, so I'm guessing I got the bundle ID/code-signing requirements for the .appex correct. It seems my .appex isn't even loaded, and the system rejects it because of its metadata. Maybe there is something wrong with the Info.plist generated by Juice?

"BuildMachineOSBuild" => "23H222"
"CFBundleDisplayName" => "elgato_sample_recorder"
"CFBundleExecutable" => "ElgatoSampleRecorder"
"CFBundleIdentifier" => "com.iwascoding.EffectLoader.samplerecorderAUv3"
"CFBundleName" => "elgato_sample_recorder"
"CFBundlePackageType" => "XPC!"
"CFBundleShortVersionString" => "1.0.0.0"
"CFBundleSignature" => "????"
"CFBundleSupportedPlatforms" => [
  0 => "MacOSX"
]
"CFBundleVersion" => "1.0.0.0"
"DTCompiler" => "com.apple.compilers.llvm.clang.1_0"
"DTPlatformBuild" => "24C94"
"DTPlatformName" => "macosx"
"DTPlatformVersion" => "15.2"
"DTSDKBuild" => "24C94"
"DTSDKName" => "macosx15.2"
"DTXcode" => "1620"
"DTXcodeBuild" => "16C5032a"
"LSMinimumSystemVersion" => "10.13"
"NSExtension" => {
  "NSExtensionAttributes" => {
    "AudioComponents" => [
      0 => {
        "description" => "Elgato Sample Recorder"
        "factoryFunction" => "elgato_sample_recorderAUFactoryAUv3"
        "manufacturer" => "Manu"
        "name" => "Elgato: Elgato Sample Recorder"
        "sandboxSafe" => 1
        "subtype" => "Znyk"
        "tags" => [
          0 => "Effects"
        ]
        "type" => "aufx"
        "version" => 65536
      }
    ]
  }
  "NSExtensionPointIdentifier" => "com.apple.AudioUnit-UI"
  "NSExtensionPrincipalClass" => "elgato_sample_recorderAUFactoryAUv3"
}
"NSHighResolutionCapable" => 1

Any ideas what I am missing?
3 replies · 0 boosts · 192 views · 1w
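One detail that may matter here, offered as an assumption: -[AVAudioUnitEffect initWithAudioComponentDescription:] goes through the v2 bridge (AudioComponentInstanceNew, visible in the error), while .appex-packaged AUv3 components are normally instantiated out of process. A sketch using the asynchronous API, with the component codes taken from the plist above:

import AudioToolbox
import AVFAudio

// Sketch: load the AUv3 out of process instead of through the v2 bridge.
// Subtype/manufacturer are the 'Znyk'/'Manu' codes from the plist above.
let desc = AudioComponentDescription(componentType: kAudioUnitType_Effect,
                                     componentSubType: 0x5A6E_796B,      // 'Znyk'
                                     componentManufacturer: 0x4D61_6E75, // 'Manu'
                                     componentFlags: 0,
                                     componentFlagsMask: 0)

AVAudioUnit.instantiate(with: desc, options: .loadOutOfProcess) { unit, error in
    if let unit {
        print("loaded \(unit.name)")
    } else {
        print("instantiation failed: \(String(describing: error))")
    }
}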
Does PHAsset localIdentifier change after device transfer?
Hi, I'm working on a photo backup app. I track each PHAsset's localIdentifier to determine which photos have been backed up and which haven't. Recently, I've noticed that two users seem to have experienced localIdentifier changes after transferring data to a new iPhone using Quick Start. Additionally, others on Stack Overflow have mentioned that the localIdentifier sometimes changes after updating the iOS version: https://stackoverflow.com/questions/40094728/phobject-localidentifier-reliability I'd like to confirm how reliable the localIdentifier is across an iOS version upgrade or a device transfer. Can I continue using these locally stored localIdentifiers, or is there another recommended approach, such as using PHCloudIdentifier?
2 replies · 0 boosts · 190 views · 1w
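On the last question: the cloud identifier APIs (iOS 15 and later) exist because local identifiers are not guaranteed stable across libraries or devices. A minimal sketch of mapping in both directions; error handling is abbreviated:

import Photos

// Sketch: convert local identifiers to stable cloud identifiers for
// storage, and back to local identifiers after a restore/transfer.
func cloudIdentifiers(for localIDs: [String]) -> [PHCloudIdentifier] {
    let mappings = PHPhotoLibrary.shared().cloudIdentifierMappings(forLocalIdentifiers: localIDs)
    return localIDs.compactMap { try? mappings[$0]?.get() }
}

func localIdentifiers(for cloudIDs: [PHCloudIdentifier]) -> [String] {
    let mappings = PHPhotoLibrary.shared().localIdentifierMappings(for: cloudIDs)
    return cloudIDs.compactMap { try? mappings[$0]?.get() }
}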
Photo permission dialog not shown when iOS app runs on Mac
According to the docs: "The first time your app performs an operation that requires [photo library] authorization, the system automatically and asynchronously prompts the user for it." (https://developer.apple.com/documentation/photokit/delivering-an-enhanced-privacy-experience-in-your-photos-app) I.e., it's not necessary for the app to call PHPhotoLibrary.requestAuthorization. This does seem to be what happens when my app runs on an iPhone or iPad: the prompt is shown. But when it runs on a Mac in "Designed for iPad" mode, the permission dialog is not presented; instead, the code continues to see status == .notDetermined. That's today, on macOS 15.3; it may have worked in the past. Is anyone else seeing issues with this? Should I call requestAuthorization explicitly? (Would that actually work?)
0 replies · 0 boosts · 189 views · 2w
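Calling the request explicitly is harmless where the automatic prompt already works, so it is a reasonable experiment for the "Designed for iPad" case; whether it actually fixes it on macOS is untested. A sketch:

import Photos

// Sketch: request authorization explicitly instead of relying on the
// automatic prompt; .readWrite matches typical library access.
func ensurePhotoAccess(_ completion: @escaping (PHAuthorizationStatus) -> Void) {
    let status = PHPhotoLibrary.authorizationStatus(for: .readWrite)
    guard status == .notDetermined else { return completion(status) }
    PHPhotoLibrary.requestAuthorization(for: .readWrite) { newStatus in
        completion(newStatus)
    }
}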
How to write RGB & Depth Frames Without Losing Synchronization
I’m currently working on a project where I capture both depth frames and RGB frames using AVCaptureDataOutputSynchronizer. Depth frames are stored as raw binary data and RGB frames are saved with AVAssetWriter. The issue I’m facing is that AVAssetWriter enforces a fixed framerate, meaning it adds or discards frames to maintain that rate (as I understand it). This causes a desynchronization between the depth and RGB frames, which is a problem because I need each depth frame to be exactly matched with the corresponding RGB frame as they were captured. How can I ensure that the RGB frames are saved without AVAssetWriter modifying the frame count?
0 replies · 0 boosts · 188 views · 2w
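For what it's worth, AVAssetWriter preserves the presentation timestamps of appended sample buffers rather than resampling to a fixed rate, so one approach is to key both streams off the captured timestamps and drop the depth frame whenever its RGB partner cannot be appended. A sketch, assuming buffers come from AVCaptureDataOutputSynchronizer (sessionStarted is caller-held state):

import AVFoundation

// Sketch: append RGB frames with their captured timestamps so each one
// can be matched 1:1 against a depth frame stored with the same PTS.
func appendRGB(_ sampleBuffer: CMSampleBuffer,
               to input: AVAssetWriterInput,
               writer: AVAssetWriter,
               sessionStarted: inout Bool) -> Bool {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    if !sessionStarted {
        writer.startSession(atSourceTime: pts) // anchor session to first frame
        sessionStarted = true
    }
    guard input.isReadyForMoreMediaData else {
        // Dropped by us, not rewritten by the writer: drop the matching
        // depth frame too, so the two streams stay aligned.
        return false
    }
    return input.append(sampleBuffer) // original PTS is preserved
}

The caller can use the Bool result to decide whether to persist or discard the corresponding depth frame.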
Music keeps cutting off
Every time I put my AirPods in and connect them to my phone, my Mac, or my iPad since the iOS 18.3 update, they've been disconnecting without reason, pausing songs I'm in the middle of playing, and only partially reconnecting in one pod, and it's getting really frustrating.
1 reply · 0 boosts · 161 views · 2w
Build a vision capture system for Apple Vision Pro
Hi, I am a newbie here. We have been given a task to build a robotic vision system to capture immersive video in a hazy environment, which will later be played on Apple Vision Pro. I am thinking of starting with 2 or 4 basic CMOS camera sensors, such as the IMX378, AR0144, or VD66GY, and designing an FPGA-based circuit to synchronously capture and store raw frame-by-frame data. Some initial frame processing, such as demosaicing and filtering, can also be done by the FPGA. Then I would use software post-processing to convert the data into a video format compatible with Apple Vision Pro. Will this idea work? I can handle the raw data capture, but I'm unsure whether this approach is feasible and what post-processing software I should use. Thanks a lot for your suggestions! Charlie
0 replies · 0 boosts · 135 views · 2w