Integrate photo, audio, and video content into your apps.

Media Documentation

Posts under Media tag

61 Posts
Post not yet marked as solved
8 Replies
2.6k Views
Hi, I get the error below when I try to export a video from my iPhone into my project using exportAsynchronously of AVAssetExportSession:

MediaPickerError: nsError: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-17507), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x2806f3870 {Error Domain=NSOSStatusErrorDomain Code=-17507 "(null)"}}

Can I get more information about this error, and how can I fix it?
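AVFoundation's -11800 ("unknown error") is a generic wrapper; the actionable detail is usually buried in the NSUnderlyingError chain (here, the OSStatus -17507). A minimal, hypothetical helper for surfacing that chain when logging — the function name is our own, not an AVFoundation API:

```swift
import Foundation

// Walks the NSUnderlyingError chain and returns the deepest (most specific)
// error, e.g. the NSOSStatusErrorDomain -17507 wrapped inside the -11800 above.
func deepestUnderlyingError(of error: NSError) -> NSError {
    var current = error
    while let underlying = current.userInfo[NSUnderlyingErrorKey] as? NSError {
        current = underlying
    }
    return current
}
```

In the export completion handler you would pass the session's error (cast to NSError) through this helper and log both the top-level and the deepest domain/code pair.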
Post not yet marked as solved
6 Replies
1.2k Views
We implemented the new PHPicker and have run into an issue we haven't been able to replicate on our own devices, but which we see a lot of users running into. The problem is an error after getting the PHPickerResults and trying to get the UIImages. Because the user can select several images at once, we take the results and iterate over each itemProvider object. I'm following Apple's guidance and checking itemProvider canLoadObjectOfClass:UIImage.class before executing itemProvider loadObjectOfClass:UIImage.class. However, we are getting hundreds of reports of users where this last method returns an error.

First, this is how we configure our PHPickerViewController:

PHPickerConfiguration *configuration = [[PHPickerConfiguration alloc] init];
configuration.selectionLimit = self.pictureSelectionLimit;
configuration.filter = PHPickerFilter.imagesFilter;
configuration.preferredAssetRepresentationMode = PHPickerConfigurationAssetRepresentationModeCurrent;

PHPickerViewController *pickerViewController = [[PHPickerViewController alloc] initWithConfiguration:configuration];
pickerViewController.delegate = self;
pickerViewController.modalPresentationStyle = UIModalPresentationFullScreen;
[viewController presentViewController:pickerViewController animated:YES completion:nil];

And this is what we do with the PHPickerResult. This method calls a block with an array of NewPicture objects instantiated with the UIImages I should be getting:

NSMutableArray *picArray = [[NSMutableArray alloc] init];
NSArray *itemProviders = [self.results custom_map:^id _Nullable (PHPickerResult *_Nonnull current) {
    return current.itemProvider;
}];

dispatch_group_t dispatchGroup = dispatch_group_create();
for (NSItemProvider *itemProvider in itemProviders) {
    dispatch_group_enter(dispatchGroup);

    // We cannot properly retrieve RAW images with the current authorization
    // status. If the image is RAW, we ignore it.
    if ([itemProvider hasItemConformingToTypeIdentifier:@"public.camera-raw-image"]) {
        NSException *exception = [NSException exceptionWithName:@"ImageIsTypeRaw"
                                                         reason:[NSString stringWithFormat:@"Object is type raw. ItemProvider: %@", itemProvider.description]
                                                       userInfo:nil];
        // Log exception...
        dispatch_group_leave(dispatchGroup);
        continue;
    }

    if ([itemProvider canLoadObjectOfClass:UIImage.class]) {
        [itemProvider loadObjectOfClass:UIImage.class
                      completionHandler:^(__kindof id<NSItemProviderReading> _Nullable object, NSError *_Nullable error) {
            if ([object isKindOfClass:UIImage.class]) {
                NewPicture *picture = [[NewPicture alloc] initWithImage:object];
                // Note: this completion handler can run on an arbitrary queue,
                // so access to picArray should really be serialized.
                [picArray addObject:picture];
            }
            if (error) {
                NSException *exception = [NSException exceptionWithName:@"CouldNotLoadImage"
                                                                 reason:[NSString stringWithFormat:@"Object is nil. UserInfo: %@", error.userInfo]
                                                               userInfo:error.userInfo];
                // Log exception...
            }
            dispatch_group_leave(dispatchGroup);
        }];
    } else {
        // Without this branch the group is never balanced when the provider
        // cannot load a UIImage, and the notify block never runs.
        dispatch_group_leave(dispatchGroup);
    }
}

dispatch_group_notify(dispatchGroup, dispatch_get_main_queue(), ^{
    picturesBlock(picArray);
});

The most common error our users get is:

Object is nil. UserInfo: {
    NSLocalizedDescription = "Cannot load representation of type public.jpeg";
    NSUnderlyingError = "Error Domain=NSCocoaErrorDomain Code=260 \"The file “version=1&uuid=*&mode=current.jpeg” couldn’t be opened because there is no such file.\" UserInfo={NSURL=file:///private/var/mobile/Containers/Shared/AppGroup/*/File%20Provider%20Storage/photospicker/version=1&uuid=*&mode=current.jpeg, NSFilePath=/private/var/mobile/Containers/Shared/AppGroup/*/File Provider Storage/photospicker/version=1&uuid=***&mode=current.jpeg, NSUnderlyingError=0x283822970 {Error Domain=NSPOSIXErrorDomain Code=2 \"No such file or directory\"}}";
}

I'm having a really hard time understanding why this sometimes fails.
I'd really appreciate it if someone could give me a hand with this. I'm attaching the stack trace: stack_trace - https://developer.apple.com/forums/content/attachment/051f7018-05ff-4ad1-a626-29f248d0b497
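The underlying Cocoa error 260 / POSIX 2 ("no such file") suggests the picker's backing file in File Provider Storage was already gone when the provider tried to read it. A hypothetical, Foundation-only helper (the name and the fallback policy are ours, not an Apple API) to recognize that case so it can be handled separately from other load failures — for example by asking the user to pick again instead of logging a generic exception:

```swift
import Foundation

// Returns true if the error, or any error in its NSUnderlyingError chain,
// reports a missing file (Cocoa 260 or POSIX ENOENT), as in the report above.
func isMissingFileError(_ error: NSError) -> Bool {
    var current: NSError? = error
    while let e = current {
        if e.domain == NSCocoaErrorDomain && e.code == 260 { return true }   // NSFileReadNoSuchFileError
        if e.domain == "NSPOSIXErrorDomain" && e.code == 2 { return true }   // ENOENT
        current = e.userInfo[NSUnderlyingErrorKey] as? NSError
    }
    return false
}
```

You would call this on the NSError delivered to the loadObject completion handler, before logging it as an unknown failure.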
Post not yet marked as solved
0 Replies
458 Views
Can someone help me understand why I'm receiving this rejection: "Rejection Binary": 5.2.3 Legal: Intellectual Property - Audio/Video Downloading? In my app there is no way to download any kind of file. The app just consumes an API and has a player button. I really don't know why I'm receiving this message. We are a radio broadcasting station in Brazil, and we are trying to put our station online through the app.
Post marked as solved
2 Replies
839 Views
I have a Catalyst application that uses (as expected) MPNowPlayingInfoCenter to set the now-playing info and MPRemoteCommandCenter to get the media events for play/pause/stop/favorite/etc. The code is shared with iOS, tvOS, and watchOS, and it works correctly there. It does not seem to work on macOS (the app is compiled as a Catalyst application) on Big Sur (or Monterey, fwiw): media keys on the keyboard start the Music app, and the music part of Control Center neither shows now-playing info nor sends the media-control messages to the app. I seem to remember that it used to work on Catalina (at least the media-key part), and it certainly worked in a previous version of the same app that was a UIKit one. Is this a bug (worth a feedback to Apple) or something wrong on my side? Did I forget some magic capability for macOS? The app is sandboxed and uses the hardened runtime, in case that is significant. Thank you for any hint!
Post not yet marked as solved
0 Replies
459 Views
When I use presentLimitedLibraryPicker, or when the user taps "Select Photos…" for the limited library, there is an abnormal condition: the search box is transparent. It only happens on iOS 14; it does not happen on the iOS 15 beta. Where can I handle this?
Post not yet marked as solved
0 Replies
335 Views
I'm setting up an API call to Tenor.com, docs here: https://tenor.com/gifapi/documentation#responseobjects-gif. I'm setting up a struct for the returned JSON, and I'm stuck on the media part. For "media" it says to use [ { GIF_FORMAT : MEDIA_OBJECT } ]. How do I declare the GIF format and media objects? Or is there another way to set this up? Here's what I've got so far:

struct structForAllApiResults: Codable {
    // MARK: - Gif Object
    let created: Float      // a unix timestamp representing when this post was created
    let hasaudio: Bool      // true if this post contains audio (only video formats support audio; the gif image file format cannot contain audio information)
    let id: String          // Tenor result identifier
    let media: [ Dictionary<GIF,>] // An array of dictionaries with GIF_FORMAT as the key and MEDIA_OBJECT as the value
    let tags: [String]      // an array of tags for the post
    let title: String       // the title of the post
    let itemurl: String     // the full URL to view the post on tenor.com
    let hascaption: Bool    // true if this post contains captions
    let url: String         // a short URL to view the post on tenor.com

    // MARK: - Category Object
    let searchterm: String
    let path: String
    let image: String
    let name: String

    // MARK: - Media Object
    let preview: String
    let url: String
    let dims: [Int] // dimensions

    // MARK: - Format Types
    let gif:
}
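One way to model [ { GIF_FORMAT : MEDIA_OBJECT } ] is an array of dictionaries keyed by the format name, with the media object split out into its own Codable type. A sketch decoding a made-up response fragment, not a live Tenor call — the type names and sample values are our choices, only the field names come from the docs:

```swift
import Foundation

// MEDIA_OBJECT from the Tenor docs.
struct MediaObject: Codable {
    let preview: String  // URL of a preview image of the media source
    let url: String      // URL of the media source
    let dims: [Int]      // width and height of the media in pixels
}

// GIF_OBJECT: "media" becomes an array of [format-name: MediaObject] dictionaries,
// so keys like "gif", "mp4", or "tinygif" need no compile-time declaration.
struct GifObject: Codable {
    let created: Double
    let hasaudio: Bool
    let id: String
    let media: [[String: MediaObject]]
    let tags: [String]
    let title: String
    let itemurl: String
    let hascaption: Bool
    let url: String
}

// Decoding a minimal, made-up response fragment:
let sample = """
{"created": 1609459200.1, "hasaudio": false, "id": "123",
 "media": [{"gif": {"preview": "https://example.com/p.gif",
                    "url": "https://example.com/f.gif",
                    "dims": [498, 278]}}],
 "tags": ["cat"], "title": "", "itemurl": "https://tenor.com/view/example",
 "hascaption": false, "url": "https://tenor.com/example"}
"""
let gif = try! JSONDecoder().decode(GifObject.self, from: Data(sample.utf8))
// gif.media[0]["gif"]?.dims is [498, 278]
```

Splitting the GIF, Category, and Media objects into separate structs (rather than one struct for all results) also avoids the duplicate url property, since those objects come back from different endpoints.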
Post not yet marked as solved
0 Replies
634 Views
WebRTC video on iOS/iPadOS Safari goes black after 8 minutes if the SDP has no audio. I have a WebRTC app that can make video calls on iOS/iPadOS Safari, but if audio is disabled in WebRTC (or in the SDP), the video goes black after 8 minutes. After the video goes black, the WebRTC call doesn't end; only the video goes black. After switching tabs and coming back to the tab with the video call, the video works again. It seems iOS/iPadOS Safari blacks out video that has no audio after 8 minutes. Any idea or solution?
Post not yet marked as solved
0 Replies
396 Views
I am playing around with the keystoneCorrection filters. The properties one can change are inputTopLeft, inputTopRight, inputBottomLeft, inputBottomRight, and inputFocalLength. The problem is that I cannot find any documentation on how this filter works, nor any sample code. Would anyone have insights into how this all works?
Post not yet marked as solved
1 Reply
1k Views
We are experiencing audio sync issues during playback of fMP4 HLS live streams (HLS and LL-HLS) on Apple devices only (iOS and macOS), and we're not sure what's causing the problem. The issue does not occur during playback on Windows or Android. During playback in Safari, everything is fine until the sync is suddenly lost, usually 5-10 minutes after playback begins. The extent of the desync varies but is very noticeable when it happens, usually in the 15-30 frame range. Sync is always restored by restarting the player, until it is lost again some minutes later. We capture the streams on iPhone devices, encoding HEVC / AAC-LC at 30 fps locally on the device, and then send them to a media server for further processing. We then transcode the source stream and create multiple variants at different bitrates (HEVC). Because we are streaming from mobile devices in the field, we set a constant 30 fps frame rate during server-side transcoding in case of drops due to network issues. I should add that the issue occurs just as much with H.264 as with HEVC (we've tested many different combinations of input/output formats and protocols). Regardless of whether we play back the source stream, the individual transcoded variants, or the ABR playlist with all variants, the sync problem appears in the same manner. One interesting note: the issue seldom occurs on one of our older devices, an iPhone 6s Plus running a slightly older iOS version (14.4.1). We suspect it has something to do with discontinuities inherent in our input streams that are not being corrected during our normalization/transcoding process; the Apple player is not compensating the way other players do on other platforms. We've run Apple's mediastreamvalidator tool and discovered multiple "must fix" issues, but it's not clear which of these, if any, are causing our problems. See the output attached.
MediaStreamValidator output Also, here is the full HLS report from the validator tool (in PNG format due to file restrictions here): Happy to share more details or run more tests. We've been trying to debug this for weeks now. Thanks for your help.
Post marked as solved
1 Reply
544 Views
My code (the mediaURL.path is obtained from UIImagePickerControllerDelegate):

guard UIVideoEditorController.canEditVideo(atPath: mediaURL.path) else { return }
let editor = UIVideoEditorController()
editor.delegate = self
editor.videoPath = mediaURL.path
editor.videoMaximumDuration = 10
editor.videoQuality = .typeMedium
self.parentViewController.present(editor, animated: true)

Error description on the console:

Video export failed for asset <AVURLAsset: 0x283c71940, URL = file:///private/var/mobile/Containers/Data/PluginKitPlugin/7F7889C8-20DB-4429-9A67-3304C39A0725/tmp/trim.EECE5B69-0EF5-470C-B371-141CE1008F00.MOV>: Error Domain=AVFoundationErrorDomain Code=-11800

It doesn't call func videoEditorController(_ editor: UIVideoEditorController, didFailWithError error: Error). After showing the error on the console, the UIVideoEditorController dismisses itself automatically. Am I doing something wrong, or is it a framework bug? Thank you in advance.
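One plausible cause (an assumption on our part, not documented behavior) is that the asset URL lives in a PluginKit tmp directory that the system can clean up before the editor finishes its export. A sketch of copying the picked file into the app's own temporary directory first — the helper name is ours:

```swift
import Foundation

// Copies the picked file into the app's own temporary directory under a fresh
// name, so the editor works on a file the app controls rather than the
// picker extension's tmp path.
func copyToLocalTemp(_ sourceURL: URL) throws -> URL {
    let destination = FileManager.default.temporaryDirectory
        .appendingPathComponent(UUID().uuidString)
        .appendingPathExtension(sourceURL.pathExtension)
    try FileManager.default.copyItem(at: sourceURL, to: destination)
    return destination
}
```

You would call this on the picked URL in the delegate callback and pass the copy's path to UIVideoEditorController instead of mediaURL.path.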
Post marked as solved
4 Replies
968 Views
How can I find out what the problem is? Every time I start audio, listen while the iPad/iPhone display is turned off, and then wake the display after 10-15 minutes, the app crashes. Here are the first lines of the crash report:

Hardware Model:   iPad8,12
Process:          VOH-App [16336]
Path:             /private/var/containers/Bundle/Application/5B2CF582-D108-4AA2-B30A-81BA510B7FB6/VOH-App.app/VOH-App
Identifier:       com.voiceofhope.VOH
Version:          7 (1.0)
Code Type:        ARM-64 (Native)
Role:             Non UI
Parent Process:   launchd [1]
Coalition:        com.voiceofhope.VOH [740]
Date/Time:        2021-08-18 22:51:24.0770 +0200
Launch Time:      2021-08-18 22:36:50.4081 +0200
OS Version:       iPhone OS 14.7.1 (18G82)
Release Type:     User
Baseband Version: 2.05.01
Report Version:   104

Exception Type:    EXC_BAD_ACCESS (SIGSEGV)
Exception Subtype: KERN_PROTECTION_FAILURE at 0x000000016d2dffb0
VM Region Info: 0x16d2dffb0 is in 0x16d2dc000-0x16d2e0000; bytes after start: 16304, bytes before end: 79
      REGION TYPE     START - END            [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
      CG raster data  11cad0000-11d814000    [ 13.3M] r--/r-- SM=COW
      GAP OF 0x4fac8000 BYTES
--->  STACK GUARD     16d2dc000-16d2e0000    [   16K] ---/rwx SM=NUL ... for thread 0
      Stack           16d2e0000-16d3dc000    [ 1008K] rw-/rwx SM=PRV thread 0
Termination Signal: Segmentation fault: 11
Termination Reason: Namespace SIGNAL, Code 0xb
Terminating Process: exc handler [16336]
Triggered by Thread: 0

Thread 0 name: Dispatch queue: com.apple.main-thread
Thread 0 Crashed:
0  libswiftCore.dylib  0x00000001a8028360 swift::MetadataCacheKey::operator==+ 3773280 (swift::MetadataCacheKey) const + 4
1  libswiftCore.dylib  0x00000001a801ab8c _swift_getGenericMetadata+ 3718028 (swift::MetadataRequest, void const* const*, swift::TargetTypeContextDescriptor<swift::InProcess> const*) + 304
2  libswiftCore.dylib  0x00000001a7ffbd00 __swift_instantiateCanonicalPrespecializedGenericMetadata + 36

Here is the full crash report: VOH-App 16.08.21, 20-22.crash
Post not yet marked as solved
0 Replies
287 Views
Good afternoon! Can you advise: I want to implement photo exposure adjustment by tapping a point on the photo preview. At the moment I use a DragGesture to get a CGPoint and pass it to the capture setup:

let device = self.videoDeviceInput.device
do {
    try device.lockForConfiguration()
    if device.isExposurePointOfInterestSupported { // note: check the exposure capability, not isFocusPointOfInterestSupported
        device.exposurePointOfInterest = focusPoint
        device.exposureMode = .autoExpose
    }
    device.unlockForConfiguration() // unlock even when the point of interest is unsupported
} catch {
    // handle the lockForConfiguration error
}

The values are printed to the terminal, but in the preview it feels like a point closer to the bottom edge is being used. The code for the view is:

.gesture(
    DragGesture(minimumDistance: 0)
        .onChanged { value in
            self.expFactor = value.location
            print(expFactor)
        }
        .onEnded { value in
            model.exp(with: expFactor)
        }
)

Can anyone who has already tried to implement this in SwiftUI tell me how? I want it roughly like the standard Camera app.
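The "point near the bottom edge" symptom usually means the raw view coordinates were passed straight through: exposurePointOfInterest expects a normalized device-space point, where (0,0) is the top-left and (1,1) the bottom-right of the sensor in its native landscape orientation. With an AVCaptureVideoPreviewLayer you would use captureDevicePointConverted(fromLayerPoint:); the hand-rolled version below is only a sketch for a full-screen portrait preview, and the function name and portrait-only assumption are ours:

```swift
import Foundation

// Maps a tap in a full-screen portrait preview to the normalized,
// landscape-oriented device space expected by exposurePointOfInterest:
// the view's vertical axis becomes the device's x axis, and the view's
// horizontal axis becomes the (flipped) device y axis.
func devicePoint(forTap location: CGPoint, inPreviewOfSize size: CGSize) -> CGPoint {
    CGPoint(x: location.y / size.height,
            y: 1.0 - location.x / size.width)
}
```

You would run the DragGesture location through this conversion before assigning it to exposurePointOfInterest.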
Post not yet marked as solved
0 Replies
473 Views
The problem is, I have a video file of about 111 MB with resolution 1216x2160, and I can't save it on my iPhone even though I have plenty of space 😭 I tried to send it via AirDrop: it shows a pop-up with an error and asks if I want to save it in my documents (I tried that as well, and there's no way to save it to my gallery from the app). I tried to send the file via Telegram and got the same error. What should I do? I can't believe that I can shoot in 4K but can't save a video with a higher resolution on my iPhone.
Post not yet marked as solved
1 Reply
503 Views
Hi everyone. When I updated my iPhone XS to iOS 15 beta 6, Live Text stopped working. Before this update it worked excellently. On betas 7 and 8 it still doesn't work. What should I do?
Post not yet marked as solved
2 Replies
755 Views
I am running into a weird bug where videos embedded in WKWebViews do not appear. I get audio, but the screen goes black. A few notes:

- This happens on iOS 15 and the 15.1 beta, but not iOS 14.8.
- I have three WKWebViews in the view hierarchy. If I reduce the number of web views to one, I do not encounter this issue.
- When I encounter this issue with one web view, playing the same video from another web view will work.
- It seems to happen more often in dark mode than in light mode.

This might be an Apple bug, but I have not been successful in building a standalone app that reproduces it. I know this isn't a lot to go on, but if anyone has pointers or can suggest something to try, I would appreciate it. Thanks. John
Post not yet marked as solved
2 Replies
1.4k Views
I've updated my iPhone 12 Pro Max to iOS 15, and all of the websites I developed earlier no longer load HTML5 videos (mp4). This was working fine on iOS 14. Is there anything I should be aware of when writing code? Is there any kind of fix for this? I've seen in other posts that there is a toggle in Settings for GPU video, but I obviously can't force each visitor to go to their settings and toggle something (if it even fixes it at all).