Post not yet marked as solved
How can I convert a photo taken with my camera to binary data using Swift?
Currently I convert the image to a UIImage and then call pngData() or jpegData(compressionQuality: 1.0).
This works well with a PNG or JPEG image stored in my gallery, but with a photo taken by the camera it doesn't work: the result is a black image.
let uiImage: UIImage = image.asUIImage()
let imageData: Data = uiImage.jpegData(compressionQuality: 1.0) ?? Data() // or: uiImage.pngData() ?? Data()
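A variant I am considering, which redraws the image into a fresh bitmap context before encoding (in case the camera capture is not CGImage-backed — I'm not certain that's the cause here; encodedData(from:) is an illustrative helper name):

```swift
import UIKit

// Sketch: redraw the UIImage into a fresh bitmap context before
// encoding, in case the camera capture is not CGImage-backed.
// encodedData(from:) is an illustrative helper name.
func encodedData(from image: UIImage, quality: CGFloat = 1.0) -> Data? {
    let renderer = UIGraphicsImageRenderer(size: image.size)
    let redrawn = renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: image.size))
    }
    return redrawn.jpegData(compressionQuality: quality)
}
```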
Hi everyone,
I have a technical problem when developing a video downloader on iPhone.
My app tries to mux an H.265 stream and an AAC stream into a ".mov" video file with the ffmpeg (v2.5.8) muxer.
The problem: the generated .mov file is not recognized by QuickTime and can't be played with it, although VLC plays it fine. I've already tried the 'hvc1' tag; it still doesn't work.
How can I mux H.265 into .mov properly?
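For reference, the muxing step is equivalent to the following command line (file names are illustrative; we do the same through the libavformat API):

```shell
# Copy the H.265 and AAC streams into a .mov container, tagging the
# video track 'hvc1' (already tried, without success).
# File names are illustrative.
ffmpeg -i video.h265 -i audio.aac -c copy -tag:v hvc1 output.mov
```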
Thanks a lot!
I have created three related Feedback Assistant issues that haven't been replied to, and I also have found many WebKit bugs entered over the past six months that could be related to this issue (see links below).
FB9688897
FB9666426
FB9554184
Replication in iPadOS:
Download - http://files.panomoments.com/bbb_sunflower_2160p_60fps_normal.mp4
Attempt to play the locally stored file on the device, using either the Files app or Safari.
Note the first few seconds playback with many frame drops and pauses (sound is unaffected).
After initial playback, try seeking to various places in the timeline and note frame drops and stuttering.
Note - I am testing on a 1st Gen iPad Pro 13in
Replication in macOS:
Download - http://files.panomoments.com/bbb_sunflower_2160p_60fps_normal.mp4
Note that when opening it in QuickTime, the first frame is black.
This also happens in the Finder spacebar preview function, but is harder to see.
The reason you don't usually see it in the spacebar preview is likely that the preview's video player has already decoded several frames asynchronously, and you miss them during the loading time of the UI. It's a very fast flicker that's easy to overlook (unlike the frozen black frame in QuickTime Player).
Note - I am testing on a 2018 Macbook Pro i9
Regarding potentially related WebKit issues (it seems there was a ton of video decoding / GPU / WebGL work in iOS 15 and Safari 15), see these links:
https://bugs.webkit.org/show_bug.cgi?id=223740
https://bugs.webkit.org/show_bug.cgi?id=231031
https://bugs.webkit.org/show_bug.cgi?id=216250
https://bugs.webkit.org/show_bug.cgi?id=215908
https://bugs.webkit.org/show_bug.cgi?id=230617
https://bugs.webkit.org/show_bug.cgi?id=231359
https://bugs.webkit.org/show_bug.cgi?id=231424
https://bugs.webkit.org/show_bug.cgi?id=231012
https://bugs.webkit.org/show_bug.cgi?id=227586
https://bugs.webkit.org/show_bug.cgi?id=231354
Hey there,
I'm not able to get a video element to play again after fullscreen ends, using the webkitendfullscreen event. The event fires as expected, but calling myVideo.play() seems to have no effect.
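For context, the setup is roughly the following (the function and element id are illustrative):

```javascript
// Sketch of the handler described above: listen for the iOS-specific
// webkitendfullscreen event and try to resume playback.
function attachEndFullscreenHandler(video) {
  video.addEventListener('webkitendfullscreen', () => {
    // This play() call appears to have no effect on iOS:
    video.play();
  });
}

// In the page:
//   attachEndFullscreenHandler(document.getElementById('myVideo'));
```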
Any suggestions?
Regards
RonMen
I am trying to play videos in an AVSampleBufferDisplayLayer (AVSBDPL below). Everything works well, except that screenshots of the AVSBDPL no longer seem to work when taken programmatically.
I have tried a couple of approaches, and the captured area of the AVSBDPL is always black. Here are the approaches I have tried; none of them works:
1. Get an image from an image context with [view drawViewHierarchyInRect:view.bounds afterScreenUpdates:YES]
- (UIImage *)_screenshot:(UIView *)view {
    UIGraphicsBeginImageContextWithOptions(view.frame.size, view.opaque, 0.0);
    [view drawViewHierarchyInRect:view.bounds afterScreenUpdates:YES];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
No matter which view I pass to the function (the screen, the player container view, etc.), the video area is always black in the image. I have tried different setups for the image context and flipping afterScreenUpdates; the result is always the same.
2. Get an image from an image context with [view.layer renderInContext:UIGraphicsGetCurrentContext()]
- (UIImage *)_screenshot:(UIView *)view {
    UIGraphicsBeginImageContextWithOptions(view.frame.size, view.opaque, 0.0);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
[layer renderInContext:] is an older API used before iOS 10; it is slow and was superseded by [view drawViewHierarchyInRect:afterScreenUpdates:]. Same result here: the screenshot is just a black screen.
3. Use UIGraphicsImageRenderer
- (UIImage *)_screenshotNew:(UIView *)view {
    UIGraphicsImageRendererFormat *format = [UIGraphicsImageRendererFormat new];
    format.opaque = view.opaque;
    format.scale = 0.0;
    UIGraphicsImageRenderer *renderer = [[UIGraphicsImageRenderer alloc] initWithSize:view.frame.size format:format];
    UIImage *screenshotImage = [renderer imageWithActions:^(UIGraphicsImageRendererContext *_Nonnull rendererContext) {
        [view drawViewHierarchyInRect:view.bounds afterScreenUpdates:YES];
    }];
    return screenshotImage;
}
This is the newest API for taking a screenshot and converting it to a UIImage; it does not work either.
4. Use [view snapshotViewAfterScreenUpdates:YES]
UIView *snapView = [self.view snapshotViewAfterScreenUpdates:YES];
UIView has an API called snapshotViewAfterScreenUpdates:. Surprisingly, the UIView returned by this API can be rendered directly in the UI, and it shows the right screenshot (woohoo!). However, when I try to convert that UIView to a UIImage, it becomes a black screen again.
Some additional configurations that I have tried
preventsCapture instance property of AVSBDPL. This is NO by default. When set to YES, it prevents the user from taking a screenshot of the layer with the physical buttons on the phone, but it has no effect on programmatic screenshots.
outputObscuredDueToInsufficientExternalProtection instance property of AVSBDPL. This property is always NO for me, so I don't think it obscures anything. Also, this is an iOS 14.5+ API, and I see the issue below 14.5 as well.
Searching on Google turns up only a few posts, all of which describe the same issue without a solution. It would be really appreciated if anyone can help me with this!
We have implemented a check in our app that validates that a user has watched a video in its entirety and hasn't skipped any section.
We were using the video.played TimeRanges object available on an HTML5 video element.
The expected behaviour
For a user who plays a video once and lets it end, you get one TimeRange whose start is zero and whose end is the duration of the video in seconds. The end is not always the exact duration, but it was always within 1 second of it.
The issue
Since iOS 15, the end value is never correct, or even close to the duration, when a video ends (when the "ended" event fires). It is almost always close to the start value.
When pausing the video, though, the end value of the played TimeRanges is correct.
Testing on iOS 15.0.2
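For reference, our completeness check boils down to a pure function over played-style ranges (the function name and tolerance are illustrative):

```javascript
// Illustrative sketch: decide whether a list of [start, end] ranges
// (like those read from video.played) covers the whole duration,
// allowing a small tolerance at the end.
function watchedCompletely(ranges, duration, tolerance = 1.0) {
  // Sort by start time, then walk the ranges looking for gaps.
  const sorted = [...ranges].sort((a, b) => a[0] - b[0]);
  let covered = 0;
  for (const [start, end] of sorted) {
    if (start > covered) return false; // a skipped section
    covered = Math.max(covered, end);
  }
  return duration - covered <= tolerance;
}
```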
I have a web application that uses the MediaRecorder API to record and show video. For short videos, it works fine.
For videos of 1 minute or more on iOS, the web page reloads (with a generic error message flashing right before it loads). There are no errors in the console.
Long videos work fine in Mac Safari, but also fail in Chrome on iOS.
It's clearly some sort of iOS resource issue.
Has anyone successfully used the MediaRecorder API to record and then play longer videos on iOS?
Note that the crash happens shortly after this line of code, which I've seen in countless MediaRecorder examples:
video.src = URL.createObjectURL(new Blob(blobs, { type: mediaRecorder.mimeType }));
Hi everyone,
I have an app that uses the front-facing TrueDepth camera; taking a photo is essential to its functionality.
My problem, as quoted from App Store review team, is as follows:
App crashed when we tapped to take a picture
Review device details:
Device type: iPad
OS version: iOS 15.1
I have tested this app successfully on iPhone 13, 12, and 11. I intended to make this an iPhone-only app (I changed the hardware requirements in Info.plist but didn't change the deployment target info, i.e. iPad is still selected there).
I have spent hours trying to figure out the following: is there any way to restrict an app to iPhone only for testing and publishing?
I am a solo developer who has spent lots of time on trying to make this a reality, and am unfortunately stuck on this issue. All help is appreciated!
The app crashes on first launch after updating to iOS 15.1; after the crash, it works fine. On earlier versions such as 15.0 it worked fine.
While debugging, I found that on first launch the app gets stuck in the code below when requesting the audio/video permission.
if ([AVCaptureDevice respondsToSelector:@selector(requestAccessForMediaType:completionHandler:)]) {
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL granted) {
        if (granted) {
            dispatch_async(dispatch_get_main_queue(), ^{
            });
        } else {
        }
    }];
} else {
}
UIImageWriteToSavedPhotosAlbum is not working reliably for me.
It seems to work about a third of the times I call it.
When I say it does not work, I mean the callback is not called and it silently returns.
When I use the code snippet below, I get this output:
Call UIImageWriteToSavedPhotosAlbum
Called UIImageWriteToSavedPhotosAlbum
If I call it twice in a row, sometimes the first call works, never the second.
Neither of the print statements in the callback is executed.
class ImageSaver: NSObject {
    func writeToPhotoAlbum(image: UIImage) {
        print("Call UIImageWriteToSavedPhotosAlbum")
        UIImageWriteToSavedPhotosAlbum(image, self, #selector(image(_:didFinishSavingWithError:contextInfo:)), nil)
        print("Called UIImageWriteToSavedPhotosAlbum")
    }

    // https://www.hackingwithswift.com/example-code/media/uiimagewritetosavedphotosalbum-how-to-write-to-the-ios-photo-album
    @objc func image(_ image: UIImage, didFinishSavingWithError error: NSError?, contextInfo: UnsafeRawPointer) {
        if let error = error {
            // we got back an error!
            print("Error: \(error.localizedDescription)")
        } else {
            print("Image saved")
        }
    }
}
Just curious about the possibility of malware in images and videos on iPhones. A) are there malware protections in place in the photos app / library to prevent images and videos that contain malware from getting in. B) if not is it possible for my app which uses Picker(s) to pass infected images / videos through to our cloud storage?
I know that if you want background audio from AVPlayer, you need to detach your AVPlayer from either your AVPlayerViewController or your AVPlayerLayer, in addition to having your AVAudioSession configured correctly.
I have that all squared away and background audio is fine until we introduce AVPictureInPictureController or use the PiP behavior baked into AVPlayerViewController.
If you want PiP to behave as expected when you background the app by switching to another app or going to the home screen, you can't perform that detachment, or the PiP display fails.
On an iPad if PiP is active and you lock your device you continue to get background audio playback. However on an iPhone if PiP is active and you lock the device the audio pauses.
However if PiP is inactive and you lock the device the audio will pause and you have to manually tap play on the lockscreen controls. This is the same between iPad and iPhone devices.
My questions are:
Is there a way to keep background-audio playback going when PiP is inactive and the device is locked? (iPhone and iPad)
Is there a way to keep background-audio playback going when PiP is active and the device is locked? (iPhone)
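For context, the setup described at the top of this post is along these lines (a minimal sketch; the category options and property names such as playerLayer are illustrative and may differ from our actual code):

```swift
import AVFoundation

// Minimal sketch of the background-audio setup described above.
// The category/mode here are the usual ones for playback apps;
// our real configuration may differ in details.
func configureBackgroundAudio() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .moviePlayback)
    try session.setActive(true)
}

// On entering the background (when PiP is NOT in use), detach the
// player from its layer so audio keeps playing:
//   playerLayer.player = nil
// and reattach when returning to the foreground:
//   playerLayer.player = player
```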
CVPixelBuffer.h defines
kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange = '420v', /* Bi-Planar Component Y'CbCr 8-bit 4:2:0, video-range (luma=[16,235] chroma=[16,240]). baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange = 'x420', /* 2 plane YCbCr10 4:2:0, each 10 bits in the MSBs of 16bits, video-range (luma=[64,940] chroma=[64,960]) */
But when I set either of the above formats as the camera output, I find the output pixel buffer's values exceed that range: I see [0, 255] for 420YpCbCr8BiPlanarVideoRange and [0, 1023] for 420YpCbCr10BiPlanarVideoRange.
Is this a bug, or is something wrong with the output? If not, how can I choose the correct matrix to convert the YUV data to RGB?
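For reference, the video-range normalization implied by those header comments looks like this for the 8-bit case, with BT.709 coefficients shown as an example (a sketch only; the correct matrix for a given buffer depends on its kCVImageBufferYCbCrMatrixKey attachment, and the function name is illustrative):

```swift
import Foundation

// Sketch: convert one 8-bit video-range Y'CbCr sample to R'G'B'
// using BT.709 coefficients (Kr = 0.2126, Kb = 0.0722).
// For the 10-bit format the divisors scale by 4 (876 and 896).
func bt709VideoRangeToRGB(y: UInt8, cb: UInt8, cr: UInt8) -> (r: Double, g: Double, b: Double) {
    let yf = (Double(y)  - 16.0)  / 219.0   // luma:   nominal [16, 235]
    let pb = (Double(cb) - 128.0) / 224.0   // chroma: nominal [16, 240]
    let pr = (Double(cr) - 128.0) / 224.0
    let r = yf + 1.5748 * pr
    let g = yf - 0.1873 * pb - 0.4681 * pr
    let b = yf + 1.8556 * pb
    return (r, g, b)   // nominally [0, 1]; out-of-range samples map outside it
}
```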
I don't know how to extract the subtitle file.
Dear Sirs and Madams,
Probably I'm in the wrong place here, but maybe you know where I can find pictures of current iPad models. My junior enterprise team would like to conduct a study focusing on the wishes of young employees. This will involve a raffle for an iPad among the participants. The survey will be announced and distributed publicly on social media, and we would also like to show the first prize, i.e. an iPad, in the post. Does Apple provide any product photos for such purposes that we can download and include there? We do not want to violate any copyright regulations.
Your advice would help me a lot.
Best wishes from Germany,
Nathalie
We've been developing a web app that uses WebAudio (specifically, AudioBufferSourceNode) to produce closely synchronised audio on desktop and mobile devices.
In recent testing on Chrome (97) on iPhone (15.2.1), we sometimes see a tab get into a state where it imposes extreme delay (we've seen up to ~1500ms) on audio output. This applies both to buffer-source playback and to trivial OscillatorNode-generated beeps. The audio eventually emerges intact from the speaker, but it's as if a 1500ms delay node has been introduced. It doesn't seem to be a hardware state; playback from Safari at those times is unaffected.
We don't yet have a reliable procedure for reproducing this. It seems to happen more readily when we put an audio-producing tab into the background for a while (say, 15 minutes or more) then wake it up again. We are usually testing with two or more tabs running the same app, so maybe that's part of the trigger too.
Our main question for this forum is: what can we do to diagnose how that extra delay is happening?
Or, alternatively: is there some kind of reset of the audio framework that should be able to force playback back to normal after getting into this state?
We would of course also be very grateful for pointers to anywhere that this situation has already been discussed. It doesn't seem to have cropped up before in these forums.
I noticed users are unable to capture screenshots of Netflix content in Safari. I wonder which WebKit API they use to make screen captures come out as a black screen?
Hello everyone,
I am developing a web application that contains a video presentation, and it seems to have issues only on iPad or mobile devices. I've read that unmuted autoplay no longer works on iOS, while muted playback is allowed, and I've verified this. Is there any way to get the video to play? I have a button that should start the video, but it doesn't; I've also tried various other things, to no avail.
I'm currently using a software platform called Bubble.io with JavaScript, HTML, and DOM commands. Any tips/advice would be greatly appreciated. Here is a link to a test video that shows what is going on:
https://testingvideoplayer.bubbleapps.io/version-test?debug_mode=true
If you have any questions/concerns let me know.
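For reference, the pattern I understand iOS to require — muted, inline, and play() called from a user gesture — can be sketched as follows (the function name is illustrative):

```javascript
// Sketch: settings iOS generally needs before a scripted play()
// will succeed. Takes any video-element-like object; returns it
// for chaining.
function prepareForInlinePlayback(video) {
  video.muted = true;        // unmuted autoplay is blocked on iOS
  video.playsInline = true;  // avoid forced fullscreen on iPhone
  return video;
}

// In the page, from the button's click handler:
//   prepareForInlinePlayback(videoEl).play().catch(console.log);
```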
Hi,
I have written a DAL virtual webcam plugin which works fine with all apps (Zoom, OBS, ...) except Apple QuickTime.
Other 3rd party virtual webcams show up in QuickTime, for instance the OBS virtual cam plugin:
https://github.com/obsproject/obs-studio/tree/dde4d57d726ed6d9e244ffbac093d8ef54e29f44/plugins/mac-virtualcam/src/dal-plugin
My first thought was that it had something to do with code signing, so I removed the signature from the OBS virtual cam plugin, but it kept working in QuickTime.
This is the source code of my plugin's entry function:
#include <CoreFoundation/CoreFoundation.h>
#include "plugininterface.h"
extern "C" void *TestToolCIOPluginMain(CFAllocatorRef allocator, CFUUIDRef requestedTypeUUID)
{
    // This writes to a log file in /tmp/logfile.txt but is NEVER called from QuickTime:
    Logger::write("Called TestToolCIOPluginMain");
    if (!CFEqual(requestedTypeUUID, kCMIOHardwarePlugInTypeID))
        return nullptr;
    return VCam::PluginInterface::create();
}
And the plugin's Info.plist (almost the same as OBS virtual cam's one):
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>CFBundleDevelopmentRegion</key>
    <string>English</string>
    <key>CFBundleExecutable</key>
    <string>TestDriverCIO</string>
    <key>CFBundleIdentifier</key>
    <string>com.test.cmio.DAL.VirtualCamera</string>
    <key>LSMinimumSystemVersion</key>
    <string>10.13</string>
    <key>CFBundleInfoDictionaryVersion</key>
    <string>6.0</string>
    <key>CFBundleName</key>
    <string>TestDriverCIO</string>
    <key>CFBundlePackageType</key>
    <string>BNDL</string>
    <key>CFBundleShortVersionString</key>
    <string>3.0.0</string>
    <key>CFBundleSignature</key>
    <string>????</string>
    <key>CFBundleVersion</key>
    <string>3.0.0</string>
    <key>CFBundleSupportedPlatforms</key>
    <array>
        <string>MacOSX</string>
    </array>
    <key>CFPlugInFactories</key>
    <dict>
        <key>AAAAAAAA-7320-5643-616D-363462697402</key>
        <string>TestToolCIOPluginMain</string>
    </dict>
    <key>CMIOHardwareAssistantServiceNames</key>
    <array>
        <string>com.test.cmio.VCam.Assistant</string>
    </array>
    <key>CFPlugInTypes</key>
    <dict>
        <key>30010C1C-93BF-11D8-8B5B-000A95AF9C6A</key>
        <array>
            <string>AAAAAAAA-7320-5643-616D-363462697402</string>
        </array>
    </dict>
</dict>
</plist>
Interestingly, "TestToolCIOPluginMain" is never called (the logger never writes any output) when QuickTime starts, and the camera is not shown in QuickTime.
Is there something special required to get the DAL plugin to show up in QuickTime? What am I missing here?
Regards,
I have implemented an app with Ionic Vue.
On one page I want to display videos with the HTML5 video control, e.g.
video src="https://media.geeksforgeeks.org/wp-content/uploads/20210314115545/sample-video.mp4" ....
It works on the web and on Android, but it does not work on iOS.
There is only a play button with a line struck through it.
I have tried all of the available option combinations (controls, playsinline, webkit-playsinline, etc.) but it does not help.
Any Ideas?
Thanks a lot
Alex