Post not yet marked as solved
I'm trying to create a scaled-down version of a video selected from the user's photo album. The maximum dimensions of the output will be 720p, so when retrieving the video I'm using .mediumQualityFormat as the deliveryMode.
This causes iOS to retrieve a 720p video from iCloud if neither the original video nor its medium-quality version exists on the user's device.
```swift
let videoRequestOptions = PHVideoRequestOptions()
videoRequestOptions.deliveryMode = .mediumQualityFormat
videoRequestOptions.isNetworkAccessAllowed = true

PHImageManager.default().requestAVAsset(forVideo: asset, options: videoRequestOptions) { (asset, audioMix, info) in
    // Process the asset
}
```
The problem is that when I use AVAssetExportSession to create a scaled-down version of the asset, and the asset is a medium variant rather than the original version, the export process fails immediately with the following error:
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-17507), NSLocalizedDescription=İşlem tamamlanamadı, NSUnderlyingError=0x283bbcf60 {Error Domain=NSOSStatusErrorDomain Code=-17507 "(null)"}}
I couldn't find anything about the meaning of this error anywhere.
When I set the deliveryMode property to .auto or .highQualityFormat, everything is working properly.
When I checked the asset URLs, I noticed that if the video has been retrieved from iCloud, its filename has a ".medium" suffix, as in this example:
file:///var/mobile/Media/PhotoData/Metadata/PhotoData/CPLAssets/group338/191B2348-5E19-4A8E-B15C-A843F9F7B5A3.medium.MP4
The weird thing is, if I use FileManager to copy the video at this URL to another directory, create a new AVAsset from that file, and use that asset when creating the AVAssetExportSession instance, the problem goes away.
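For reference, the copy step of that workaround can be sketched roughly like this (a minimal sketch; `copyToTemporary` is a hypothetical helper name, and the AVAsset re-creation is shown only as a comment):

```swift
import Foundation

/// Copies the file at `sourceURL` into the temporary directory and returns
/// the URL of the copy. Hypothetical helper for the workaround described
/// above; error handling is minimal for brevity.
func copyToTemporary(_ sourceURL: URL) throws -> URL {
    let destinationURL = URL(fileURLWithPath: NSTemporaryDirectory())
        .appendingPathComponent(UUID().uuidString)
        .appendingPathExtension(sourceURL.pathExtension)
    // Remove any stale file at the destination before copying.
    if FileManager.default.fileExists(atPath: destinationURL.path) {
        try FileManager.default.removeItem(at: destinationURL)
    }
    try FileManager.default.copyItem(at: sourceURL, to: destinationURL)
    return destinationURL
}

// Usage, on an Apple platform (AVFoundation part shown as comments):
// let localURL = try copyToTemporary(mediumAssetURL)
// let localAsset = AVURLAsset(url: localURL)
// let exportSession = AVAssetExportSession(asset: localAsset,
//                                          presetName: AVAssetExportPresetHighestQuality)
```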
I'd really appreciate if someone could provide some insight about what the problem could be.
This is how I use AVAssetExportSession to create a scaled down version of the original video.
```swift
// originalVideoURL is the URL of the asset retrieved from requestAVAsset
let outputVideoPath = NSTemporaryDirectory() + "encodedVideo.mp4"
let outputVideoURL = URL(fileURLWithPath: outputVideoPath)

guard
    let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHighestQuality),
    let videoTrack = asset.tracks(withMediaType: .video).first else {
        handleError()
        return
}

let videoComposition = AVMutableVideoComposition()
videoComposition.renderSize = scaledSize
videoComposition.frameDuration = CMTimeMake(value: 1, timescale: 30)

let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
let transform = videoTrack.preferredTransform
layerInstruction.setTransform(transform, at: .zero)

let instruction = AVMutableVideoCompositionInstruction()
instruction.timeRange = CMTimeRangeMake(start: .zero, duration: asset.duration)
instruction.layerInstructions = [layerInstruction]
videoComposition.instructions = [instruction]

exportSession.videoComposition = videoComposition
exportSession.outputURL = outputVideoURL
exportSession.outputFileType = .mp4
exportSession.shouldOptimizeForNetworkUse = true

exportSession.exportAsynchronously { [weak self] in
    guard let self = self else { return }
    if let url = exportSession.outputURL, exportSession.status == .completed {
        // Works for local videos
    } else {
        // Fails with error code -17507 when loading videos with delivery size "Medium"
    }
}
```
Post not yet marked as solved
Hello,
I have a problem with my iOS application, which displays a video stream via MobileVlcKit. This feature has worked in my application for many months, but for a few days I haven't been able to see the video stream in my application!
When I use the Xcode simulator, the video stream is displayed correctly.
But when I launch a local build or a TestFlight build, I get a black screen instead of my video stream!
When I run a local build via the USB cord, I see this message in the debug console:
"Unable to determine our source address: This computer has an invalid IP address: 0.0.0.0"
Can someone please help me?
Post not yet marked as solved
Looking for the best OTT platform provider or solution to launch our own video-on-demand business with customized features, functionality, and revenue models. We are focusing on movie streaming content to broadcast across the web, iOS, Apple TV, and Amazon Fire TV, and to monetize it. Any suggestions regarding this would be appreciated.
Thanks in advance.
Post not yet marked as solved
I am encoding CMAF content, and when I try to stream a DRM'd video in Safari, nothing happens and I don't even get an error code.
It works without any problem in Chrome (Widevine). Our partners can stream the content on their side, but we can't, on MacBook or iPhone...
We have tested CMAF on various iOS versions (including 14 and 14.3) on various device types (phones and tablets). It was working without any problem.
We had an issue on:
• MacBook Pro M1 - Big Sur 11.3.1, Safari 14.1 (16611.1.21.161.6)
• iPhone 12 - iOS 14.5.1
• iPhone 12 - iOS 14.4.2
• MacBook Air Intel 2020 - Big Sur 11.2.3
Sometimes it works when I reload the page... I don't know what happens...
Post not yet marked as solved
I'd like to play with the example project shown in session wwdc21-10187.
Post not yet marked as solved
I'm trying to use the sample code associated to the talk Author fragmented MPEG-4 content with AVAssetWriter which can be found here.
It works well when I run it on macOS, but after adapting it to run on iOS (basically moving the code from the main file into a view controller), it doesn't work. The problem is that the function:
assetWriter(_:didOutputSegmentData:segmentType:segmentReport:)
is never called for the last segment.
On macOS, the last segment is reported after calling AVAssetWriter.finishWriting(completionHandler:), but before the completionHandler block is invoked. On iOS, nothing happens at that point.
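The ordering described above can be summarized in a simplified sketch (the writer setup from the sample is omitted; `assetWriter` is assumed to be an AVAssetWriter configured for fragmented MP4 output with a delegate implementing the segment callback):

```swift
// assetWriter is configured with .mpeg4AppleHLS / .mpeg4 output and a
// delegate implementing
// assetWriter(_:didOutputSegmentData:segmentType:segmentReport:).
assetWriter.finishWriting {
    // macOS: the delegate receives the final segment *before* this
    // completion handler runs.
    // iOS (observed): the final segment callback never arrives at all.
}
```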
Is there anything I could do from my side to fix this problem?
Thanks in advance!
Post not yet marked as solved
WebRTC video in iOS/iPadOS Safari goes black after 8 minutes if the SDP has no audio.
I have a WebRTC app that can make video calls in iOS/iPadOS Safari.
But if audio is disabled in WebRTC (or in the SDP), the video goes black after 8 minutes.
After the video goes black, the WebRTC call doesn't end; only the video goes black. After switching to another tab and then back to the tab with the video call, the video works well again.
It seems that iOS/iPadOS Safari blacks out a video after 8 minutes if it has no audio.
Any ideas or solutions?
Post not yet marked as solved
Hi,
I more or less know how to download an HLS video. But can someone tell me how to download a video that is DRM-protected? How do I provide the key to the player while offline? I can't find sample code anywhere; could you please provide an example, especially of how to provide the key/license to the player for offline playback?
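For context, the usual offline-FairPlay approach is an AVContentKeySession with persistable keys. Below is a rough sketch only; the app certificate fetch, the license-server (KSM) round trip, the content identifier, and the key storage path are all placeholders you must supply yourself:

```swift
import AVFoundation

// Rough sketch of offline FairPlay key handling with AVContentKeySession.
// Placeholders are marked; this is an outline, not a drop-in solution.
final class OfflineKeyLoader: NSObject, AVContentKeySessionDelegate {
    let session = AVContentKeySession(keySystem: .fairPlayStreaming)
    let keyFileURL = URL(fileURLWithPath: NSTemporaryDirectory())
        .appendingPathComponent("content.key")   // hypothetical storage path

    func attach(to asset: AVURLAsset) {
        session.setDelegate(self, queue: DispatchQueue(label: "fairplay.keys"))
        session.addContentKeyRecipient(asset)    // asset now asks us for its keys
    }

    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVContentKeyRequest) {
        if let persistedKey = try? Data(contentsOf: keyFileURL) {
            // Offline: answer with the previously persisted key.
            let response = AVContentKeyResponse(fairPlayStreamingKeyResponseData: persistedKey)
            keyRequest.processContentKeyResponse(response)
        } else {
            // Online: upgrade to a persistable key request.
            try? keyRequest.respondByRequestingPersistableContentKeyRequestAndReturnError()
        }
    }

    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVPersistableContentKeyRequest) {
        let appCertificate = Data()  // placeholder: fetch your FPS certificate
        keyRequest.makeStreamingContentKeyRequestData(
            forApp: appCertificate,
            contentIdentifier: nil,  // placeholder: your content key identifier
            options: nil) { spcData, error in
            guard spcData != nil else { return }
            let ckcData = Data()     // placeholder: send SPC to your KSM, receive CKC
            if let persistableKey = try? keyRequest.persistableContentKey(
                fromKeyVendorResponse: ckcData, options: nil) {
                try? persistableKey.write(to: self.keyFileURL)  // keep for offline use
                let response = AVContentKeyResponse(
                    fairPlayStreamingKeyResponseData: persistableKey)
                keyRequest.processContentKeyResponse(response)
            }
        }
    }
}
```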
Post not yet marked as solved
Trying to download an encrypted HLS stream, we observed the following behaviour:
Setting the requestCachePolicy and/or urlCache properties of the URLSessionConfiguration used to create the AVAssetDownloadURLSession seems to have no effect at all.
This is important for us, since we need the HTTP caching policy to be applied to the .m3u8 manifest file of the stream.
Is this the intended behaviour of the download process, or some kind of issue?
Post not yet marked as solved
Trying to download an encrypted HLS stream, we ran into the following issue:
When we start a new download by calling the resume() function of AVAssetDownloadTask, the download process gets stuck (not every time), and neither the urlSession(_:assetDownloadTask:didFinishDownloadingTo:) nor the urlSession(_:task:didCompleteWithError:) delegate function (AVAssetDownloadDelegate) gets called.
There are cases where not even the urlSession(_:assetDownloadTask:didLoad:totalTimeRangesLoaded:timeRangeExpectedToLoad:) delegate function gets called.
Any suggestions on how to troubleshoot this?
Post not yet marked as solved
We are experiencing audio sync issues during playback on fMP4 HLS live streams (HLS and LL-HLS) on Apple devices only (iOS and macOS) and we're not sure what's causing the problem. The issue does not occur during playback on Windows or Android platforms.
During playback in Safari, everything is fine until the sync is suddenly lost, usually 5-10 minutes after playback begins. The extent of the desync varies but is very noticeable when it happens, usually in the 15-30 frame range. Sync is always restored by restarting the player, until it is lost again some minutes later.
We are capturing the streams on iPhone devices, encoding HEVC / AAC-LC at 30 fps locally on the device, and then sending to a media server for further processing. We then transcode the source stream and create multiple variations at different bitrates (HEVC). Because we are streaming from mobile devices in the field, during our server-side transcoding we set a constant 30 fps frame rate in case of drops due to network issues. I should add that the issue occurs just as much with H.264 as with HEVC (we've tested many different combinations of input/output formats and protocols).
Regardless of whether we playback the source stream, the individual transcoded variations, or the ABR playlist with all variations, the sync problem appears in the same manner.
One interesting note. The issue seldom occurs on one of our older devices, an iPhone 6s Plus running a slightly older iOS version (14.4.1).
We suspect it has something to do with discontinuities inherent in our input streams that are not being corrected during our normalization/transcoding process. The Apple player is not compensating as other players are doing on other platforms.
We've run Apple's mediastreamvalidator tool and discovered multiple "must fix" issues, but it's not clear which of these, if any, are causing our problems. See the output attached.
MediaStreamValidator output
Also, here is the full HLS report from the validator tool (in PNG format due to file restrictions here):
Happy to share more details or run more tests. We've been trying to debug this for weeks now. Thanks for your help.
My code (mediaURL.path is obtained from the UIImagePickerControllerDelegate callback):
```swift
guard UIVideoEditorController.canEditVideo(atPath: mediaURL.path) else { return }

let editor = UIVideoEditorController()
editor.delegate = self
editor.videoPath = mediaURL.path
editor.videoMaximumDuration = 10
editor.videoQuality = .typeMedium
self.parentViewController.present(editor, animated: true)
```
The error description in the console is as follows:
Video export failed for asset <AVURLAsset: 0x283c71940, URL = file:///private/var/mobile/Containers/Data/PluginKitPlugin/7F7889C8-20DB-4429-9A67-3304C39A0725/tmp/trim.EECE5B69-0EF5-470C-B371-141CE1008F00.MOV>: Error Domain=AVFoundationErrorDomain Code=-11800
It doesn't call
func videoEditorController(_ editor: UIVideoEditorController, didFailWithError error: Error)
After showing the error in the console, the UIVideoEditorController automatically dismisses itself.
Am I doing something wrong, or is it a bug?
Thank you in advance.
Post not yet marked as solved
Has the method of importing video files into Reality Composer been updated?
If there is a way to insert a video file into a specific mesh using Xcode, please let me know.
Post not yet marked as solved
Hello Everyone,
We have a feature in our application wherein a user can upload a picture or a video for others to look at.
We would like to add compression logic for both media types at upload time, so we can save storage and our users can upload media quickly without having to wait a long time.
We have tried iOS's native compression; however, it degrades the quality of the photo or video. Can you please help us with the best possible solution that we can integrate without losing media quality?
As an alternative for now, we are restricting users to uploading videos of at most 30 seconds, but if we are able to integrate compression, we would like to allow them to upload videos of up to 3 minutes.
Please let us know if you need any additional information.
Thank you.
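For reference, the preset-based compression path on iOS can be sketched roughly like this (a minimal sketch assuming a local `sourceURL`; the chosen preset is the knob that trades file size against quality):

```swift
import AVFoundation

// Minimal sketch of preset-based video compression. AVAssetExportPreset1280x720
// scales the video down while keeping reasonable quality; higher presets
// (e.g. AVAssetExportPreset1920x1080) trade larger files for more fidelity.
func compressVideo(at sourceURL: URL,
                   completion: @escaping (URL?) -> Void) {
    let asset = AVURLAsset(url: sourceURL)
    guard let session = AVAssetExportSession(
        asset: asset,
        presetName: AVAssetExportPreset1280x720) else {
        completion(nil)
        return
    }
    let outputURL = URL(fileURLWithPath: NSTemporaryDirectory())
        .appendingPathComponent(UUID().uuidString)
        .appendingPathExtension("mp4")
    session.outputURL = outputURL
    session.outputFileType = .mp4
    session.shouldOptimizeForNetworkUse = true
    session.exportAsynchronously {
        // Hand back the compressed file, or nil on failure.
        completion(session.status == .completed ? outputURL : nil)
    }
}
```

For finer control than the fixed presets allow (explicit bitrate, codec), AVAssetWriter with custom compression settings is the usual next step.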
Post not yet marked as solved
I'd like to use the iPhone 12 Pro as a frontend in a project where we will evaluate people's behavior. The processing will be done on an external Ubuntu machine. I would therefore like to transfer body-motion keypoints, IMU data, video frames, and sound to the external machine. We are able to extract body-motion keypoints and IMU data but are struggling to combine this with video and sound streaming. Is there a method in ARKit to extract the video frame that was used for the body-pose estimation? Similarly, is there a way to extract the audio samples in ARKit, or can this be done another way?
I am not an experienced iOS programmer, but I'd like to know whether it is possible to achieve what we want.
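For reference, the camera frame used for body-pose estimation is carried on the ARFrame itself; a minimal sketch of pulling it from the session delegate (the streaming to the Ubuntu machine is left as a comment):

```swift
import ARKit

final class BodyCaptureDelegate: NSObject, ARSessionDelegate {
    // Called once per frame; the same ARFrame carries both the camera image
    // and the body anchors derived from it, so they are already in sync.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let pixelBuffer: CVPixelBuffer = frame.capturedImage  // the camera frame
        let timestamp = frame.timestamp                        // for aligning streams
        // Encode pixelBuffer (e.g. with VideoToolbox) and send it together
        // with the body keypoints for this timestamp to the external machine.
        _ = (pixelBuffer, timestamp)
    }

    // For audio: set providesAudioData = true on the ARConfiguration, and
    // ARKit delivers microphone samples here as CMSampleBuffers.
    func session(_ session: ARSession,
                 didOutputAudioSampleBuffer audioSampleBuffer: CMSampleBuffer) {
        // Forward the audio samples to the external machine.
        _ = audioSampleBuffer
    }
}
```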
Post not yet marked as solved
We have an HLS video stream that fails to play in AVPlayer but plays in ExoPlayer. AVPlayer throws the above error code, for which we cannot find any reference or any help in understanding what exactly AVPlayer is doing (or finds unacceptable about the stream).
We'd like to know what causes this error to be thrown. We see this when HLS segments are authored from an MP4 with some questionable I and P frame data near the end of the container.
The error is thrown on iOS 14.7.1 AVPlayer.
Post not yet marked as solved
Trying to download an encrypted HLS stream we faced the following behaviour:
Setting requestCachePolicy and/or urlCache properties of URLSessionConfiguration that used to create the AVAssetDownloadURLSession seems to have no effect at all.
In our application the user can add multiple encrypted HLS streams to a queue. Before adding them to the queue, we make sure that the manifest gets cached using the shared URLSession, like this:
```swift
URLSession.shared.configuration.urlCache = .shared

let task = URLSession.shared.dataTask(with: media.url) { _, _, _ in
    self.addMediaToQueue(media)
}
task.resume()
```
and we set up our AVAssetDownloadURLSession like this:
```swift
// Create the configuration for the AVAssetDownloadURLSession.
let backgroundConfiguration = URLSessionConfiguration.background(withIdentifier: "AAPL-Identifier")
backgroundConfiguration.urlCache = .shared
backgroundConfiguration.requestCachePolicy = .returnCacheDataElseLoad

// Create the AVAssetDownloadURLSession using the configuration.
assetDownloadURLSession = AVAssetDownloadURLSession(
    configuration: backgroundConfiguration,
    assetDownloadDelegate: self,
    delegateQueue: .main
)
```
Here is an example of the caching headers that we use:
```
Last-Modified: Thu, 11 Mar 2021 02:23:57 GMT
Cache-Control: max-age=604800
```
This is important for us since our manifest url is signed and expires after 12 hours.
Example of manifest URL:
https://example.host.gr/v1/791/888/773923397316/773923397316.ism/.m3u8[…]~hmac=ee37a750b8238745b5c8cf153ebcd0b693dd5d83
If the client followed the HTTP cache policy and didn't request the .m3u8 manifest file over the internet, the download would still start despite the 12-hour limit.
Is this the intended behaviour of the download process or some kind of an issue? Could you suggest a workaround?
Post not yet marked as solved
The problem is, I have a video file that is about 111 MB with a resolution of 1216x2160, and I can't save it on my iPhone even though I have plenty of space 😭 I tried to send it via AirDrop, and it shows a popup with an error asking if I want to save it to my documents instead (I tried that as well, and there's no way to save it to my gallery from there). I tried to send the file via Telegram and got the same error. What should I do? I can't believe that I can shoot in 4K but can't save a higher-resolution video on my iPhone.
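One way around the share-sheet limitation is to save the file directly with the Photos framework; a minimal sketch (assuming the app has photo-library add permission and the video file is already local):

```swift
import Photos

// Creates a new video asset in the user's photo library from a local file.
// The completion handler reports success or the underlying error.
func saveVideoToLibrary(at fileURL: URL,
                        completion: @escaping (Bool, Error?) -> Void) {
    PHPhotoLibrary.shared().performChanges({
        PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: fileURL)
    }, completionHandler: completion)
}
```

If this also fails, the error passed to the completion handler should at least say why the library rejects the file.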
Both standard MP4 files and streaming HLS files are experiencing substantial playback and rendering issues on iOS 15.
This includes:
• Safari immediately crashes
• Video displays only black (occasionally audio can be heard)
• Video is frozen on the first frame despite the time updating
• Substantial load times (10+ seconds) when loading should be immediate
GPU Process: Media has been disabled, yet the issues persist.
Safari immediately crashes with GPU Process: WebGL enabled.
These videos are being rendered via WebGL (three.js).
None of these issues were present on iOS 14.
I'm on an iPad Pro 12.9 (2020).
Hello, how can I resolve this error?
Cannot convert value of type 'Binding<[Video]>.Type' to expected argument type 'Binding<[Video]>'
NavigationLink(destination: SomeView.init(data: Binding<[Video]> ... // error is here
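That diagnostic usually means a *type* was passed where a *value* was expected: `Binding<[Video]>` names the type itself, whereas the argument must be an actual binding, typically the projected value (`$videos`) of a state property. A minimal sketch of the difference (the `Video` model, `SomeView`, and the `videos` property are hypothetical stand-ins for your own code):

```swift
import SwiftUI

struct Video: Identifiable {
    let id = UUID()
}

struct SomeView: View {
    @Binding var data: [Video]  // receives a Binding<[Video]> *value*
    var body: some View {
        Text("\(data.count) videos")
    }
}

struct VideoListLink: View {
    @State private var videos: [Video] = []

    var body: some View {
        // Wrong: SomeView(data: Binding<[Video]>) passes the type itself.
        // Right: pass $videos, the Binding<[Video]> value projected by @State:
        NavigationLink(destination: SomeView(data: $videos)) {
            Text("Videos")
        }
    }
}
```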