Post not yet marked as solved
We have implemented a check in our app that validates that a user has watched a video in its entirety without skipping any section.
We were using the video.played TimeRanges object available on an HTML5 video element.
The expected behaviour
When a user plays a video once and lets it end, you get a single TimeRange whose start is zero and whose end is the duration of the video in seconds. The end is not always the exact duration, but it was always within one second of it.
The issue
Since iOS 15, when a video ends (the "ended" event fires), the end value is never correct, or even close to the duration; it is almost always close to the start value.
When pausing the video, though, the end value of the played TimeRange is correct.
Testing on iOS 15.0.2
Post not yet marked as solved
Our app has been receiving more of these two errors recently, but there is zero documentation or search results about them. Does anyone know what they mean?
CoreMediaErrorDomain: -16934
CoreMediaErrorDomain: 28
Post not yet marked as solved
Hello,
We are trying to set up a web stream server with MPEG-2 (H.262) video in the HLS container.
Is there a way to play MPEG-2 video on iPhone Safari?
Take care
Post not yet marked as solved
I am trying to replace GIFs with MP4s on my website. Currently it's working great in Chrome and Firefox, but the behavior is odd in Safari.
<video autoplay loop muted playsinline defaultmuted preload="auto">
<source src="/path/to/video.mp4" type="video/mp4">
</video>
This video is an H.264 MP4 with no audio track.
Firefox and Chrome on my MacBook: work as expected (autoplays as if it were a GIF)
iOS Safari without Low Power Mode: works as expected
iOS Safari with Low Power Mode: autoplays, but a play button appears on top that disappears when tapped.
macOS Safari: does not autoplay. A play button appears, and the video plays if it's clicked.
I have been following https://developer.apple.com/documentation/webkit/delivering_video_content_for_safari as well as other guides on the internet, and it still isn't working. I'm fairly sure a recent change is responsible, because this used to work in an older version of desktop Safari.
Post not yet marked as solved
How do you get a video, such as an .mp4 or .mov, converted or placed into a USDZ file?
Post not yet marked as solved
I am trying to develop an app that can choose a video from the iPhone and save its URL (or a string representation of the URL), so the user can later select the video from within the app (by choosing an assigned video title) without having to use the video picker again. I have implemented the video picker and can print the URL (e.g. file:///private/var/mobile/Containers/Data/PluginKitPlugin/3BBCBF37-7659-439E-A3D6-4390D751F29D/tmp/trim.D973E0CE-468C-4B5F-B5FC-63FA1F647175.MOV), but I can't find a way to use this string to select the video again. Can someone help me with this? Thanks.
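One possible reason (an assumption, not stated in the post): the printed URL points at a temporary file in the picker's tmp directory, which is not guaranteed to exist later. A common alternative is to persist the asset's localIdentifier from the Photos framework and re-fetch the PHAsset when needed. A minimal sketch, with hypothetical function names:

```swift
import Photos

// Hypothetical sketch: store the asset's localIdentifier instead of the
// temporary file URL, then re-fetch the asset later by that identifier.
func referenceString(for asset: PHAsset) -> String {
    // Persist this string (e.g. in UserDefaults) alongside the video title.
    return asset.localIdentifier
}

func fetchVideo(withIdentifier identifier: String) -> PHAsset? {
    let result = PHAsset.fetchAssets(withLocalIdentifiers: [identifier], options: nil)
    return result.firstObject
}
```

The fetched PHAsset can then be handed to PHImageManager to obtain a playable AVAsset without showing the picker again.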
Post not yet marked as solved
As the title says, CGDisplayCopyAllDisplayModes does not appear to return ALL of the display modes. I've attached a screenshot showing the list of modes returned by CGDisplayCopyAllDisplayModes. Notice that the CGDisplayMode for the currently active mode, ID #13, is not in the list. My second screen, a non-Retina display, behaves as expected.
How do you find 'all' of the CGDisplayModes for an Apple Studio Display?
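One thing worth trying (a suggestion, not confirmed for the Studio Display specifically): CGDisplayCopyAllDisplayModes takes an options dictionary, and passing kCGDisplayShowDuplicateLowResolutionModes reveals the scaled, low-resolution modes that are hidden by default on Retina displays:

```swift
import CoreGraphics

// Request the hidden scaled modes as well as the default list.
let displayID = CGMainDisplayID()
let options = [kCGDisplayShowDuplicateLowResolutionModes: kCFBooleanTrue!] as CFDictionary
if let modes = CGDisplayCopyAllDisplayModes(displayID, options) as? [CGDisplayMode] {
    for mode in modes {
        print(mode.width, mode.height, mode.refreshRate)
    }
}
```

On Retina panels the currently active scaled mode often only appears when this option is set, which may explain why mode ID #13 is missing from the default list.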
Post not yet marked as solved
I'm trying to understand what exactly is made possible by Media Device Discovery Extensions, what responsibility the containing app has, and what exactly is made available to other apps or the system, if anything.
I haven't been able to find any meaningful high level documentation, and WWDC 2022 session 10096 only mentions these new extensions in passing. The most comprehensive body of information I found is the example project:
https://developer.apple.com/documentation/devicediscoveryextension/discovering_a_third-party_media-streaming_device?changes=latest_beta&language=objc
However, I don't think it's working the way it should out of the box:
I've got the Client target app built and running on an iPad Pro running iPadOS 16 Beta 2
I've got the MacServer target running on a Mac Mini with macOS 13 Ventura Beta 2
I've got the Server target running on an iPhone with iOS 15.5. (Non-beta)
If I tap the AirPlay icon on the Client's video player, I can see the two servers, but selecting one just causes a spinner to show up next to its name. This continues for a while; eventually the spinner goes away, but the device-selection tick stays next to 'iPad'. The "Select route" text also doesn't change, which I believe it's supposed to, judging by the code.
I've tried a variety of combinations of settings on the servers - bluetooth only, bonjour only, different protocols, etc., but I'm always getting the same behaviour.
Has anyone had any success in getting the example to work, and how? Is there any high level documentation available that I've missed?
Can someone explain, in more detail than "implementations of custom A/V streaming protocols", what exactly we can build with this? The WWDC session video talks about third-party SDKs, so do these extensions have to be embedded in every app that streams the video, implying that they're not useful for mirroring?
Post not yet marked as solved
Hi!
I have limited mobility, so I want to create voice-command shortcuts to navigate my video settings. How do I create custom shortcuts to:
Take cinematic video
Take time-lapse
Take slow motion
Also, is there a way to set a time limit? I find that Siri, and sometimes even Voice Control, doesn't respond to voice commands while recording, because the microphone is in use for the recording.
So I would like to try a shortcut like this:
"Take time-lapse"
Start the time-lapse immediately
Stop recording after 2 minutes.
thanks!
Post not yet marked as solved
Hi,
I saw the announcement of the beta version of the Advanced Video Quality Tool (AVQT) for Linux at WWDC22. However, I am unable to find the AVQT packages for Linux; the AVQT resource page still points to the .dmg file, which is for macOS. Where can I find the Linux version of AVQT? Thanks
Post not yet marked as solved
This video session is essentially a consumer-facing video; there isn't a single line of code shown.
VideoPlayer(player: player)
doesn't give the shown "new features" by default. An example implementation should be expected of a WWDC session.
Post not yet marked as solved
Hello,
One of the features of AVKit you list is "performance optimized". Could you confirm that you've made performance improvements in the last 12 months to AVKit? If so, could you share in what areas or what metrics improved? Thanks!
Post not yet marked as solved
What is the first version of iOS and tvOS to support HLS Steering? Is the AppleTV+ service using HLS Steering to manage multiple CDNs?
Post not yet marked as solved
I'm trying to build an educational SwiftUI iOS app with course videos. I've tried storing these videos on YouTube as private videos, and also on Vimeo, but both show the video controls, which allows the URL to be extracted — and I don't want that.
Storing the videos as a local resource is a non-starter, since the app would then be several GBs.
I could also store the videos on my own web hosting, but I think they would still be discoverable, and I don't want to go down the route of creating logins and user accounts.
Are there any other solutions for doing this? Is it possible to store the videos in Firebase and have the app access them there?
Post not yet marked as solved
Hello everyone,
Everything was working well until I updated to macOS Monterey. Once I finished the update, I checked the app and it started to consume a lot of memory: the app starts at 35 MB, and three seconds later grows to 345 MB, then 650 MB, then 780 MB, and so on.
I don't know if anyone else is experiencing the same, but your help in solving this would be very appreciated. Thank you.
This is the code for that:
let detector = CIDetector(
    ofType: CIDetectorTypeQRCode,
    context: nil,
    options: [CIDetectorAccuracy: CIDetectorAccuracyLow, CIDetectorTracking: false]
)
guard let features = detector?.features(in: ciimage) else {
    return decode
}
for feature in features as! [CIQRCodeFeature] {
    decode = feature.messageString!
}
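One possible contributor to the growth (an assumption, not a confirmed diagnosis): creating a new CIDetector with a nil context on every call allocates a fresh Core Image context each time, and per-frame autoreleased buffers can pile up. A sketch that reuses one detector and drains autoreleased objects per frame:

```swift
import CoreImage

// Sketch: reuse a single CIContext and CIDetector across frames, and wrap
// per-frame work in autoreleasepool so intermediate buffers are freed promptly.
final class QRScanner {
    private let context = CIContext()
    private lazy var detector = CIDetector(
        ofType: CIDetectorTypeQRCode,
        context: context,
        options: [CIDetectorAccuracy: CIDetectorAccuracyLow]
    )

    func decode(_ image: CIImage) -> String? {
        autoreleasepool {
            let features = detector?.features(in: image) ?? []
            return (features.first as? CIQRCodeFeature)?.messageString
        }
    }
}
```

Whether this fixes the Monterey-specific growth is unverified, but it removes the repeated context allocation as a variable.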
Post not yet marked as solved
Hello
We're implementing Search in our Apple TV App, and want to show Search Suggestions. We've followed the WWDC talk https://developer.apple.com/videos/play/wwdc2020/10634/ and so far so good.
We've got our suggestions showing, but at the start of the suggestions list is a default, built-in suggestion containing whatever has already been typed.
We want to remove this first suggestion, because the results for the typed search term already appear below the suggestions, so offering it again doesn't work for us.
We add an array of two UISearchSuggestionItems to self.searchController.searchSuggestions, but the list seems to be prepended with another suggestion: what the customer has already typed. This is what we want to turn off.
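For reference, a simplified sketch of how the suggestions are assigned (an illustration, not the app's actual code, with hypothetical suggestion strings):

```swift
import UIKit

// Illustrative sketch: assign suggestions from updateSearchResults(for:).
// The suggestion strings here are placeholders.
func updateSearchResults(for searchController: UISearchController) {
    let suggestions = ["Joker", "Jojo Rabbit"].map {
        UISearchSuggestionItem(localizedSuggestion: $0)
    }
    // tvOS prepends a built-in suggestion echoing the typed text;
    // we have not found an API to suppress it.
    searchController.searchSuggestions = suggestions
}
```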
This suggestion isn't present in the Apple TV+ app, so it feels like we must somehow be able to turn this off, but haven't found any way to do so.
This is the Apple TV+ app, which does NOT show a suggestion of "jo".
Please help.
Thanks
Antony
Post not yet marked as solved
I'm trying to create a scaled-down version of a video selected from the user's photo album. The maximum dimension of the output will be 720p, so when retrieving the video I'm using .mediumQualityFormat as the deliveryMode.
This causes iOS to retrieve a 720p video from iCloud if neither the original video nor its medium-quality version exists on the user's device.
let videoRequestOptions = PHVideoRequestOptions()
videoRequestOptions.deliveryMode = .mediumQualityFormat
videoRequestOptions.isNetworkAccessAllowed = true
PHImageManager.default().requestAVAsset(forVideo: asset, options: videoRequestOptions) { (asset, audioMix, info) in
    // Process the asset
}
The problem is, when I use AVAssetExportSession to create a scaled-down version of the asset, if the asset is the medium variant and not the original version, the export fails immediately with the following error:
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-17507), NSLocalizedDescription=İşlem tamamlanamadı, NSUnderlyingError=0x283bbcf60 {Error Domain=NSOSStatusErrorDomain Code=-17507 "(null)"}}
I couldn't find anything about the meaning of this error anywhere.
When I set the deliveryMode property to .auto or .highQualityFormat, everything works properly.
When I checked the asset URLs, I noticed that if the video has been retrieved from iCloud, its filename has a ".medium" suffix, as in this example:
file:///var/mobile/Media/PhotoData/Metadata/PhotoData/CPLAssets/group338/191B2348-5E19-4A8E-B15C-A843F9F7B5A3.medium.MP4
The weird thing is, if I use FileManager to copy the video at this URL to another directory, create a new AVAsset from that file, and use that asset when creating the AVAssetExportSession instance, the problem goes away.
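The copy-first workaround described above can be sketched as follows (a sketch assuming originalVideoURL is the URL obtained from the asset returned by requestAVAsset; the function name is hypothetical):

```swift
import AVFoundation
import Foundation

// Sketch of the workaround: copy the ".medium" file into our own container
// first, then build the AVAsset for export from the copy.
func makeExportableAsset(from originalVideoURL: URL) throws -> AVAsset {
    let copyURL = URL(fileURLWithPath: NSTemporaryDirectory())
        .appendingPathComponent("exportSource.mp4")
    try? FileManager.default.removeItem(at: copyURL)   // clear any stale copy
    try FileManager.default.copyItem(at: originalVideoURL, to: copyURL)
    return AVAsset(url: copyURL)
}
```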
I'd really appreciate if someone could provide some insight about what the problem could be.
This is how I use AVAssetExportSession to create the scaled-down version of the original video:
// `asset` is the AVAsset retrieved from requestAVAsset above.
let outputVideoPath = NSTemporaryDirectory() + "encodedVideo.mp4"
let outputVideoURL = URL(fileURLWithPath: outputVideoPath)
guard
    let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHighestQuality),
    let videoTrack = asset.tracks(withMediaType: .video).first
else {
    handleError()
    return
}
let videoComposition = AVMutableVideoComposition()
videoComposition.renderSize = scaledSize
videoComposition.frameDuration = CMTimeMake(value: 1, timescale: 30)
let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
layerInstruction.setTransform(videoTrack.preferredTransform, at: .zero)
let instruction = AVMutableVideoCompositionInstruction()
instruction.timeRange = CMTimeRangeMake(start: .zero, duration: asset.duration)
instruction.layerInstructions = [layerInstruction]
videoComposition.instructions = [instruction]
exportSession.videoComposition = videoComposition
exportSession.outputURL = outputVideoURL
exportSession.outputFileType = .mp4
exportSession.shouldOptimizeForNetworkUse = true
exportSession.exportAsynchronously { [weak self] in
    guard let self = self else { return }
    if let url = exportSession.outputURL, exportSession.status == .completed {
        // Works for local videos
    } else {
        // Fails with error code -17507 for videos with delivery size "Medium"
    }
}
Post not yet marked as solved
I'm trying to move a video I create from images within my app from a temporary path to the photo library.
I've verified that the movie exists by downloading the app data via Devices in Xcode, and the movie then plays fine on my MacBook.
I've tried:
UISaveVideoAtPathToSavedPhotosAlbum(
    videoPath,
    self,
    #selector(self.video(_:didFinishSavingWithError:contextInfo:)),
    nil)
with Error:
Optional(Error Domain=ALAssetsLibraryErrorDomain Code=-1 "Unknown error" UserInfo={NSLocalizedDescription=Unknown error, NSUnderlyingError=0x283684570 {Error Domain=ALAssetsLibraryErrorDomain Code=-1 "Unknown error" UserInfo={NSLocalizedDescription=Unknown error, NSUnderlyingError=0x283681860 {Error Domain=ALAssetsLibraryErrorDomain Code=-1 "Unknown error" UserInfo={NSLocalizedDescription=Unknown error, NSUnderlyingError=0x28366e490 {Error Domain=com.apple.photos.error Code=42001 "(null)"}}}}}})
and
PHPhotoLibrary.requestAuthorization { status in
    // Return if unauthorized
    guard status == .authorized else {
        print("Error saving video: unauthorized access")
        return
    }
    PHPhotoLibrary.shared().performChanges({
        PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: videoURL as URL)
    }) { success, error in
        if !success {
            print("Error saving video: \(String(describing: error))")
        }
    }
}
with Error:
Domain=ALAssetsLibraryErrorDomain Code=-1 "Unknown error" UserInfo= ...... {Error Domain=com.apple.photos.error Code=42001 "(null)"
Both compile fine and are called, but they end up giving errors that don't help in the slightest.
I have a full help request on Stack Overflow with a link to the project (which the forum doesn't let me post here): https://stackoverflow.com/questions/63575539/swift-ios-save-video-to-library
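One thing worth checking (an assumption, since error code 42001 is undocumented): on iOS 14 and later, write access is requested separately from read access, and the app's Info.plist needs the add-only usage description key. A sketch combining both:

```swift
import Photos

// Sketch: request add-only authorization explicitly (iOS 14+), then save.
// NSPhotoLibraryAddUsageDescription must also be present in Info.plist.
func saveVideoToLibrary(_ videoURL: URL) {
    PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
        guard status == .authorized else { return }
        PHPhotoLibrary.shared().performChanges({
            PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: videoURL)
        }) { success, error in
            print(success ? "Saved" : "Failed: \(String(describing: error))")
        }
    }
}
```

If the usage description key is missing, Photos writes can fail with opaque errors like the ones quoted above rather than a clear permissions message.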
Post not yet marked as solved
A playback glitch is observed on iPhone and iPad devices for our encrypted asset, but playback works fine in the Safari browser with the same asset. I have attached the files listed below for your reference:
Manifest files
Index.m3u8
Level(4417274)
Log snippets with decoder errors (captured on iPhone 8 and iPhone 12):
iPhone12-Log-Snippet
iPhone8-Log-Snippet