Post not yet marked as solved
I am trying to play videos in an AVSampleBufferDisplayLayer. Everything works well, except that screenshots of the AVSBDPL no longer work when taken programmatically.
I have tried a couple of approaches, and the screenshot is always black in the area covered by the AVSBDPL. Here are the approaches I have tried; none of them works:
1. Get an image from image context with [view drawViewHierarchyInRect:view.bounds afterScreenUpdates:YES]
- (UIImage *)_screenshot:(UIView *)view {
    UIGraphicsBeginImageContextWithOptions(view.frame.size, view.opaque, 0.0);
    [view drawViewHierarchyInRect:view.bounds afterScreenUpdates:YES];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
No matter which view I pass to the function (the screen, the player container view, etc.), the video area is always black. I have also tried different image-context setups and flipping afterScreenUpdates; the result is always the same.
2. Get an image from image context with [view.layer renderInContext:UIGraphicsGetCurrentContext()]
- (UIImage *)_screenshot:(UIView *)view {
    UIGraphicsBeginImageContextWithOptions(view.frame.size, view.opaque, 0.0);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
[layer renderInContext:UIGraphicsGetCurrentContext()] is an older API from before iOS 10. It is very slow and has largely been replaced by drawViewHierarchyInRect:afterScreenUpdates: since iOS 10. Same result here: the screenshot just shows a black screen.
3. Use UIGraphicsImageRenderer
- (UIImage *)_screenshotNew:(UIView *)view {
    UIGraphicsImageRendererFormat *format = [UIGraphicsImageRendererFormat new];
    format.opaque = view.opaque;
    format.scale = 0.0;
    UIGraphicsImageRenderer *renderer = [[UIGraphicsImageRenderer alloc] initWithSize:view.frame.size format:format];
    UIImage *screenshotImage = [renderer imageWithActions:^(UIGraphicsImageRendererContext * _Nonnull rendererContext) {
        [view drawViewHierarchyInRect:view.bounds afterScreenUpdates:YES];
    }];
    return screenshotImage;
}
This is the newest API for taking a screenshot and converting it to a UIImage; it does not work either.
4. Use [view snapshotViewAfterScreenUpdates:YES]
UIView *snapView = [self.view snapshotViewAfterScreenUpdates:YES];
UIView has an API called snapshotViewAfterScreenUpdates:. Surprisingly, the UIView returned by this API can be rendered directly in the UI, and it shows the correct screenshot (woohoo!). However, when I try to convert that UIView to a UIImage, the result is a black screen again.
Some additional configurations that I have tried:
The preventsCapture instance property of AVSBDPL. This is NO by default. When it is set to YES, it prevents the user from taking a screenshot of the layer with the physical buttons on the phone, but it has no effect on screenshots taken programmatically.
The outputObscuredDueToInsufficientExternalProtection instance property of AVSBDPL. This property is always NO for me, so I don't think it is obscuring anything. Also, it is an iOS 14.5+ API, and I see the issue below 14.5 as well.
Searching on Google turns up very few posts; all of them describe the same issue with no solution. I would really appreciate it if anyone could help me with this!
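The only workaround I can think of so far is to stop snapshotting the layer entirely and rasterize the last enqueued frame myself with Core Image. This is a Swift sketch under my own assumptions (the class and property names are mine, and it only works if you control the enqueue path):

```swift
import AVFoundation
import CoreImage
import UIKit

// Sketch: hold on to the last pixel buffer that was enqueued, then render it
// to a UIImage directly instead of snapshotting the AVSampleBufferDisplayLayer.
final class SnapshottingDisplayLayer {
    let displayLayer = AVSampleBufferDisplayLayer()
    private var lastPixelBuffer: CVPixelBuffer?
    private let ciContext = CIContext()

    func enqueue(_ sampleBuffer: CMSampleBuffer) {
        if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            lastPixelBuffer = pixelBuffer  // remember the most recent frame
        }
        displayLayer.enqueue(sampleBuffer)
    }

    // Rasterizes the most recently enqueued frame.
    func snapshot() -> UIImage? {
        guard let pixelBuffer = lastPixelBuffer else { return nil }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return nil }
        return UIImage(cgImage: cgImage)
    }
}
```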
Post not yet marked as solved
Hey there,
I'm not able to get a video element to play again after fullscreen ends, using the webkitendfullscreen event. The event is fired as expected, but calling myVideo.play() seems not to work.
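For reference, my handler is roughly the following sketch (the selector is a placeholder; deferring play() by a tick is an experiment of mine, not a documented fix):

```javascript
// Sketch with a placeholder selector. webkitendfullscreen is a WebKit-specific
// event fired on the <video> element when its native fullscreen presentation ends.
const myVideo = document.querySelector('video');

myVideo.addEventListener('webkitendfullscreen', () => {
  // Defer to the next turn so WebKit can finish tearing down the fullscreen
  // presentation before playback is requested again, and log the rejection
  // reason if play() still fails.
  setTimeout(() => {
    myVideo.play().catch(err => console.log('play() rejected:', err));
  }, 0);
});
```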
Any suggestions?
Regards
RonMen
Post not yet marked as solved
I have created three related Feedback Assistant issues that haven't been replied to, and I have also found many WebKit bugs filed over the past six months that could be related to this issue (see links below).
FB9688897
FB9666426
FB9554184
Replication in iPadOS:
Download - http://files.panomoments.com/bbb_sunflower_2160p_60fps_normal.mp4
Attempt to play the locally stored file on the device, using either the Files app or Safari.
Note that the first few seconds play back with many frame drops and pauses (sound is unaffected).
After the initial playback, try seeking to various places in the timeline and note the frame drops and stuttering.
Note - I am testing on a 1st Gen iPad Pro 13in
Replication in macOS:
Download - http://files.panomoments.com/bbb_sunflower_2160p_60fps_normal.mp4
Note that when opening the file in QuickTime, the first frame is black.
This also happens in the Finder spacebar preview, but it is harder to see.
The reason you don't usually see it in the spacebar preview is likely that the preview's video player has already decoded several frames asynchronously, and you simply miss them due to the loading time of the UI. It's a very fast flicker that is easy to ignore (unlike the frozen black frame in QuickTime Player).
Note - I am testing on a 2018 MacBook Pro i9
Regarding potentially related WebKit issues (it seems there was a ton of video decoding / GPU / WebGL work in iOS 15 and Safari 15), see these links:
https://bugs.webkit.org/show_bug.cgi?id=223740
https://bugs.webkit.org/show_bug.cgi?id=231031
https://bugs.webkit.org/show_bug.cgi?id=216250
https://bugs.webkit.org/show_bug.cgi?id=215908
https://bugs.webkit.org/show_bug.cgi?id=230617
https://bugs.webkit.org/show_bug.cgi?id=231359
https://bugs.webkit.org/show_bug.cgi?id=231424
https://bugs.webkit.org/show_bug.cgi?id=231012
https://bugs.webkit.org/show_bug.cgi?id=227586
https://bugs.webkit.org/show_bug.cgi?id=231354
Post not yet marked as solved
I've updated my iPhone 12 Pro Max to iOS 15, and all of the websites I developed earlier no longer load HTML5 videos (mp4). This worked fine on iOS 14. Is there anything I should be aware of when writing code? Is there any kind of fix for this?
I've seen in other posts that there is a toggle in the settings for GPU video, but I obviously can't force each visitor to go to their settings and toggle something (if it even fixes the issue at all).
Post not yet marked as solved
Hi everyone,
I have a technical problem when developing a video downloader on iPhone.
My app tries to mux an H.265 stream and an AAC stream into a ".mov" video file with the ffmpeg (v2.5.8) muxer.
The problem: the generated mov file is not recognized by QuickTime and can't be played with it, but it can be played by VLC. I've already tried the 'hvc1' tag; it still doesn't work.
Please tell me: how can I mux H.265 into mov properly?
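For context, what I'm attempting is equivalent to this command-line remux (a sketch with placeholder paths, assuming a modern ffmpeg build; my understanding is that QuickTime needs the 'hvc1' sample entry rather than ffmpeg's default 'hev1', though as noted above, hvc1 alone hasn't fixed it for me on v2.5.8):

```shell
# Remux an existing HEVC + AAC stream into .mov without re-encoding.
# QuickTime requires the 'hvc1' sample entry; ffmpeg defaults to 'hev1',
# which QuickTime will not play. Paths are placeholders.
ffmpeg -i input.mp4 -c:v copy -c:a copy -tag:v hvc1 output.mov
```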
Thanks a lot!
Post not yet marked as solved
I am running into a weird bug where videos embedded in WKWebViews do not appear. I get audio, but the screen goes black. A few notes:
This happens on iOS 15 and the 15.1 beta, but not iOS 14.8.
I have three WKWebViews in the view hierarchy. If I reduce the number of WebViews to one, I do not encounter this issue. When I encounter this issue with one WebView, playing the same video from another WebView will work.
It seems to happen more often when in dark mode than in light mode.
This might be an Apple bug, but I have not been successful in building a standalone app that reproduces it. I know this isn't a lot to go on, but if anyone has pointers or can suggest something to try, I would appreciate it.
Thanks.
John
Post marked as Apple Recommended
I want to save images into a UIImage array in the same order in which it is selected using PHPickerViewController. Is this possible?
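A sketch of what I mean (my understanding, to be confirmed, is that PHPicker returns the results array in the order the user tapped the photos; the slot-by-index trick is needed because loadObject completes asynchronously and out of order):

```swift
import PhotosUI
import UIKit

// Sketch: preallocate one slot per result and fill each slot by its index in
// `results`, so asynchronous loads cannot scramble the selection order.
// Note: a real implementation should synchronize access to `orderedImages`.
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true)
    var orderedImages = [UIImage?](repeating: nil, count: results.count)
    let group = DispatchGroup()

    for (index, result) in results.enumerated() {
        guard result.itemProvider.canLoadObject(ofClass: UIImage.self) else { continue }
        group.enter()
        result.itemProvider.loadObject(ofClass: UIImage.self) { object, _ in
            orderedImages[index] = object as? UIImage  // the slot preserves order
            group.leave()
        }
    }

    group.notify(queue: .main) {
        let images = orderedImages.compactMap { $0 }  // in selection order
        // use `images` here
    }
}
```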
Post not yet marked as solved
How can I convert a photo taken with my camera to binary data using Swift?
Currently I convert the image to a UIImage and then try pngData() or jpegData(compressionQuality: 1.0).
This works well with a png or jpeg image stored in my gallery, but if I try it with a camera photo it doesn't work; the result is a black image.
let uiImage: UIImage = image.asUIImage()
let imageData: Data = uiImage.jpegData(compressionQuality: 1.0) ?? Data() // or uiImage.pngData() ?? Data()
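For comparison, pulling the UIImage straight out of the UIImagePickerController callback, instead of round-tripping through my asUIImage() helper (which re-renders a SwiftUI view and may produce an empty bitmap for camera shots), would be a sketch like:

```swift
import UIKit

// Sketch: use the original UIImage the picker hands over, then encode it.
func imagePickerController(_ picker: UIImagePickerController,
                           didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    picker.dismiss(animated: true)
    guard let uiImage = info[.originalImage] as? UIImage else { return }
    let jpegBytes = uiImage.jpegData(compressionQuality: 1.0)  // binary Data
    let pngBytes = uiImage.pngData()                           // alternative encoding
    print("jpeg: \(jpegBytes?.count ?? 0) bytes, png: \(pngBytes?.count ?? 0) bytes")
}
```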
Post not yet marked as solved
My code:
the mediaURL.path is obtained from UIImagePickerControllerDelegate
guard UIVideoEditorController.canEditVideo(atPath: mediaURL.path) else { return }
let editor = UIVideoEditorController()
editor.delegate = self
editor.videoPath = mediaURL.path
editor.videoMaximumDuration = 10
editor.videoQuality = .typeMedium
self.parentViewController.present(editor, animated: true)
Error description on console as below.
Video export failed for asset <AVURLAsset: 0x283c71940, URL = file:///private/var/mobile/Containers/Data/PluginKitPlugin/7F7889C8-20DB-4429-9A67-3304C39A0725/tmp/trim.EECE5B69-0EF5-470C-B371-141CE1008F00.MOV>: Error Domain=AVFoundationErrorDomain Code=-11800
It doesn't call
func videoEditorController(_ editor: UIVideoEditorController, didFailWithError error: Error)
After showing the error on the console, the UIVideoEditorController automatically dismisses itself.
Am I doing something wrong, or is it a bug in Swift?
Thank you in advance.
Post not yet marked as solved
Hi everyone. When I updated my iPhone XS to iOS 15 beta 6, Live Text stopped working. Before this update it worked excellently. On betas 7 and 8 it still doesn't work. What should I do?
Post not yet marked as solved
The problem is, I have a video file of about 111MB with resolution 1216x2160, and I can't save it on my iPhone even though I have plenty of space 😭 I tried to send it via AirDrop, and it shows a pop-up with an error and asks if I want to save it in my documents (I tried that as well, and there's no way to save it to my gallery from the app). I tried to send the file via Telegram and also got the same error. What should I do? I can't believe that I can shoot in 4K but can't save a higher-resolution video on my iPhone.
Post not yet marked as solved
How can I find out what the problem is?
Every time I start audio and listen to it with the iPad/iPhone display turned off, and then wake the device's display after 10-15 minutes, the app crashes.
Here are the first lines of the crash report:
Hardware Model: iPad8,12
Process: VOH-App [16336]
Path: /private/var/containers/Bundle/Application/5B2CF582-D108-4AA2-B30A-81BA510B7FB6/VOH-App.app/VOH-App
Identifier: com.voiceofhope.VOH
Version: 7 (1.0)
Code Type: ARM-64 (Native)
Role: Non UI
Parent Process: launchd [1]
Coalition: com.voiceofhope.VOH [740]
Date/Time: 2021-08-18 22:51:24.0770 +0200
Launch Time: 2021-08-18 22:36:50.4081 +0200
OS Version: iPhone OS 14.7.1 (18G82)
Release Type: User
Baseband Version: 2.05.01
Report Version: 104
Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Subtype: KERN_PROTECTION_FAILURE at 0x000000016d2dffb0
VM Region Info: 0x16d2dffb0 is in 0x16d2dc000-0x16d2e0000; bytes after start: 16304 bytes before end: 79
REGION TYPE START - END [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
CG raster data 11cad0000-11d814000 [ 13.3M] r--/r-- SM=COW
GAP OF 0x4fac8000 BYTES
---> STACK GUARD 16d2dc000-16d2e0000 [ 16K] ---/rwx SM=NUL ... for thread 0
Stack 16d2e0000-16d3dc000 [ 1008K] rw-/rwx SM=PRV thread 0
Termination Signal: Segmentation fault: 11
Termination Reason: Namespace SIGNAL, Code 0xb
Terminating Process: exc handler [16336]
Triggered by Thread: 0
Thread 0 name: Dispatch queue: com.apple.main-thread
Thread 0 Crashed:
0 libswiftCore.dylib 0x00000001a8028360 swift::MetadataCacheKey::operator==+ 3773280 (swift::MetadataCacheKey) const + 4
1 libswiftCore.dylib 0x00000001a801ab8c _swift_getGenericMetadata+ 3718028 (swift::MetadataRequest, void const* const*, swift::TargetTypeContextDescriptor<swift::InProcess> const*) + 304
2 libswiftCore.dylib 0x00000001a7ffbd00 __swift_instantiateCanonicalPrespecializedGenericMetadata + 36
Here is the full crash report:
VOH-App 16.08.21, 20-22.crash
Post not yet marked as solved
Good afternoon!
Can you advise: I want to implement exposure adjustment by tapping a point on the photo preview. At the moment I use a DragGesture to get a CGPoint and pass it to the capture setup.
let device = self.videoDeviceInput.device
do {
    try device.lockForConfiguration()
    // release the lock even when the point of interest isn't supported
    defer { device.unlockForConfiguration() }
    if device.isExposurePointOfInterestSupported {  // check the exposure flag, not the focus flag
        device.exposurePointOfInterest = focusPoint
        device.exposureMode = .autoExpose
    }
} catch {
    print("lockForConfiguration failed: \(error)")
}
The values are printed to the terminal, but in the preview it feels as if a point closer to the bottom edge is being used.
The code for the view is:
.gesture(
    DragGesture(minimumDistance: 0)
        .onChanged({ value in
            self.expFactor = value.location
            print(expFactor)
        })
        .onEnded({ value in
            model.exp(with: expFactor)
        })
)
Can anyone tell me if they have already implemented this kind of tap-to-expose in SwiftUI? I want it to behave roughly like the stock Camera app.
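One thing I plan to try (a sketch; `previewLayer` is assumed to be the session's AVCaptureVideoPreviewLayer) is converting the gesture's point into the device's normalized coordinate space first, since exposurePointOfInterest does not take view coordinates:

```swift
import AVFoundation
import UIKit

// Sketch: exposurePointOfInterest expects normalized capture-device
// coordinates ((0,0) top-left to (1,1) bottom-right, landscape-oriented),
// so the tap location must be converted through the preview layer first.
func setExposure(at layerPoint: CGPoint,
                 device: AVCaptureDevice,
                 previewLayer: AVCaptureVideoPreviewLayer) {
    let devicePoint = previewLayer.captureDevicePointConverted(fromLayerPoint: layerPoint)
    do {
        try device.lockForConfiguration()
        defer { device.unlockForConfiguration() }
        if device.isExposurePointOfInterestSupported {
            device.exposurePointOfInterest = devicePoint
            device.exposureMode = .autoExpose
        }
    } catch {
        print("lockForConfiguration failed: \(error)")
    }
}
```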
Post not yet marked as solved
Hi, I am interested in extracting/accessing timestamp of each frame captured while recording a video via iPhone (HEVC - 4k 60fps). Any links to relevant documentation will be very useful.
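To frame the question: at capture time I can read a per-frame presentation timestamp from the sample buffers, roughly as in this sketch (the class name is mine); what I'm looking for is documentation of how these timestamps relate to the recorded file.

```swift
import AVFoundation

// Sketch: each CMSampleBuffer delivered by AVCaptureVideoDataOutput carries a
// presentation timestamp on the capture session's clock. (For a file that has
// already been recorded, AVAssetReader delivers the same kind of buffers.)
final class FrameTimestampLogger: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        print("frame at \(CMTimeGetSeconds(pts)) s")
    }
}
```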
Post not yet marked as solved
I am playing around with the keystone correction filters. The properties one can change are inputTopLeft, inputTopRight, inputBottomLeft, inputBottomRight, and inputFocalLength. The problem is that I cannot find any documentation on how this filter works, or any sample code. Does anyone have insight into how this all works?
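For reference, here is how I'm currently setting the filter up (a sketch; the parameter semantics in the comments are my guesses, which is exactly what I'd like confirmed):

```swift
import CoreImage

// Sketch: my current understanding (unverified) is that the four corner points
// are CIVectors in image coordinates marking the distorted quadrilateral, and
// that inputFocalLength is a 35mm-equivalent focal length for the projection.
func correctedKeystone(of image: CIImage,
                       topLeft: CGPoint, topRight: CGPoint,
                       bottomLeft: CGPoint, bottomRight: CGPoint,
                       focalLength: Float = 28) -> CIImage? {
    guard let filter = CIFilter(name: "CIKeystoneCorrectionCombined") else { return nil }
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(focalLength, forKey: "inputFocalLength")
    filter.setValue(CIVector(cgPoint: topLeft), forKey: "inputTopLeft")
    filter.setValue(CIVector(cgPoint: topRight), forKey: "inputTopRight")
    filter.setValue(CIVector(cgPoint: bottomLeft), forKey: "inputBottomLeft")
    filter.setValue(CIVector(cgPoint: bottomRight), forKey: "inputBottomRight")
    return filter.outputImage
}
```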
Post not yet marked as solved
WebRTC video on iOS/iPadOS Safari goes black after 8 minutes if the SDP has no audio.
I have a WebRTC app that can make video calls on iOS/iPadOS Safari.
But if audio is disabled in WebRTC (or in the SDP), the video goes black after 8 minutes.
After the video goes black, the WebRTC call doesn't end; only the video goes black. After switching tabs and coming back to the tab with the video call, the video works again.
It seems that iOS/iPadOS Safari has a behavior where, if a video has no audio, the video goes black after 8 minutes.
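One workaround I am considering (an assumption on my part, not an official fix) is attaching a muted audio track so Safari sees the stream as audio+video:

```javascript
// Workaround sketch: synthesize a silent audio track with the Web Audio API
// and add it to the outgoing stream, so the SDP negotiates audio+video.
function silentAudioTrack() {
  const ctx = new (window.AudioContext || window.webkitAudioContext)();
  const oscillator = ctx.createOscillator();
  const gain = ctx.createGain();
  const dst = ctx.createMediaStreamDestination();
  gain.gain.value = 0;                 // fully muted
  oscillator.connect(gain).connect(dst);
  oscillator.start();
  return dst.stream.getAudioTracks()[0];
}

// Add it to the local stream before creating the offer, e.g.:
// localStream.addTrack(silentAudioTrack());
```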
Any idea or any solution?
Post not yet marked as solved
Hi team, I just bought my new 12 mini but experience lagging during any game, video recording, or even just FB surfing. Has anyone else met such an issue? It is very annoying and disappointing!
Post not yet marked as solved
I'm setting up an API call to Tenor.com, docs here https://tenor.com/gifapi/documentation#responseobjects-gif .
I'm setting up a struct for my returning JSON, and I'm stuck on the media part.
For "media" it says to use "[ { GIF_FORMAT : MEDIA_OBJECT } ]".
How do I declare gif format and media objects? Or is there another way to set this up?
Here's what I've got so far.
struct structForAllApiResults: Codable {
    // MARK: - Gif Object
    let created: Float // a unix timestamp representing when this post was created
    let hasaudio: Bool // true if this post contains audio (only video formats support audio; the gif image file format cannot contain audio information)
    let id: String // Tenor result identifier
    let media: [ Dictionary<GIF,>] // An array of dictionaries with GIF_FORMAT as the key and MEDIA_OBJECT as the value
    let tags: [String] // an array of tags for the post
    let title: String // the title of the post
    let itemurl: String // the full URL to view the post on tenor.com
    let hascaption: Bool // true if this post contains captions
    let url: String // a short URL to view the post on tenor.com
    // MARK: - Category Object
    let searchterm: String
    let path: String
    let image: String
    let name: String
    // MARK: - Media Object
    let preview: String
    let url: String
    let dims: [Int] // dimensions
    // MARK: - Format Types
    let gif:
}
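One idea I'm experimenting with for the media field (type names beyond Tenor's JSON keys are mine): since each array element is a dictionary keyed by a format name such as "gif" or "mp4", a [String: MediaObject] dictionary avoids declaring every format up front:

```swift
import Foundation

// Sketch: model `[ { GIF_FORMAT : MEDIA_OBJECT } ]` as an array of
// dictionaries keyed by the format name. Struct names are mine; `size` is
// marked optional in case some formats omit it.
struct GifObject: Codable {
    let created: Float
    let hasaudio: Bool
    let id: String
    let media: [[String: MediaObject]]  // e.g. media[0]["gif"]?.url
    let tags: [String]
    let title: String
    let itemurl: String
    let hascaption: Bool
    let url: String
}

struct MediaObject: Codable {
    let preview: String
    let url: String
    let dims: [Int]  // dimensions
    let size: Int?
}
```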
Post not yet marked as solved
Hi~ I got the error below when I tried to export a video from the iPhone into my project using
exportAsynchronously of AVAssetExportSession.
MediaPickerError: nsError : Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-17507), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x2806f3870 {Error Domain=NSOSStatusErrorDomain Code=-17507 "(null)"}}
Can I get more information about this error?
And how can I fix it? ^_^
Post not yet marked as solved
When I use presentLimitedLibraryPicker, or when the user taps "Select Photos.." in the library settings, there is an abnormal condition: the search box is transparent. It only happens on iOS 14; it does not happen on iOS 15 beta. How can I handle it?