It took a bit of experimenting...

import AVKit // macOS & iOS

class ViewController_myShow: NSViewController {

    @IBOutlet var myMOV: AVPlayerView!

    override func viewDidLoad() {
        super.viewDidLoad()

        let playerView = AVPlayerView()
        playerView.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(playerView)

        playerView.leadingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.leadingAnchor).isActive = true
        playerView.trailingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.trailingAnchor).isActive = true
        playerView.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor).isActive = true
        playerView.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor).isActive = true

        playerView.controlsStyle = .floating
        playerView.showsFrameSteppingButtons = true
        playerView.showsFullScreenToggleButton = true

        guard let url = Bundle.main.url(forResource: "myMovie", withExtension: "mov") else { return }

        let player = AVPlayer(url: url)
        playerView.player = player
        playerView.player?.play()
    }
}

The Storyboard > Attributes Inspector settings for AVPlayerView don't work for me... exactly. I had to set Controls Style to None in the Attributes Inspector, then set controlsStyle, showsFrameSteppingButtons, and showsFullScreenToggleButton in code, as above.
The default, Controls Style: Inline, appears on screen but does nothing, i.e., no control.
Setting translatesAutoresizingMaskIntoConstraints to false is also required.
The .mov file can be added via File > Add Files to..., or copied into Assets.xcassets.
see also:
Developer Forum:
"can't get extremely simple macOS video player to work..."
https://developer.apple.com/documentation/avfoundation/media_playback_and_selection/creating_a_basic_video_player_macos
Post not yet marked as solved
I'm using AVPlayerView to display a video whose title I want to override in the system-wide NowPlaying view (e.g. on Big Sur in the Notification Center).
On iOS / tvOS / Catalyst, this can (supposedly) be done by setting the AVPlayerItem's externalMetadata as desired, but this property is unsupported on macOS. What's the supported way of doing this for a "normal" AppKit app?
My simple attempt at manually setting the information via MPNowPlayingInfoCenter didn't work; I assume it's getting overwritten by the "automatic" support from AVPlayer(View) with the faulty (empty) title from the actual video.
Any pointers?
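For reference, a minimal sketch of the manual MPNowPlayingInfoCenter approach mentioned above. The dictionary keys are the standard Media Player ones; the title and duration values are placeholders:

```swift
import MediaPlayer

// Build the Now Playing dictionary separately so it can be inspected.
func nowPlayingInfo(title: String, duration: TimeInterval) -> [String: Any] {
    [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyPlaybackDuration: duration,
        MPNowPlayingInfoPropertyPlaybackRate: 1.0,
    ]
}

// Publish it. This is the value that AVPlayerView appears to
// overwrite with the item's own (empty) metadata.
MPNowPlayingInfoCenter.default().nowPlayingInfo =
    nowPlayingInfo(title: "My Video", duration: 120)
```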
Post not yet marked as solved
Hi,
I'm trying to add custom actions on AVPlayerViewController on iOS. I was able to use transportBarCustomMenuItems for tvOS but I can't find any iOS equivalent.
In Apple's TV app for iOS, they use custom menus like this.
Post not yet marked as solved
I'm trying to enforce a duration limit for picked videos. It doesn't appear that there's a configuration option in PHPickerConfiguration unless I missed it.
I'm able to get the duration from the URL returned by loadFileRepresentation but it appears that this loads the entire file which somewhat defeats the point of limiting video size and can take a long time.
It also looks like I can use the local assetIdentifier but this requires initializing PHPickerConfiguration with a PHPhotoLibrary which requires asking the user for permission and complicates the whole flow when I just want them to pick a single file.
Is there a way to take advantage of PHPickerViewController and let the user choose a video and enforce a video limit without loading the entire file / requiring photo library permissions?
Thanks!
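For context, a sketch of the slow path described above: loadFileRepresentation copies the whole file before the duration can be read, which is exactly the cost in question. The withinLimit helper is hypothetical, added only to make the check explicit:

```swift
import AVFoundation
import PhotosUI
import UniformTypeIdentifiers

// Hypothetical helper so the limit check is explicit and testable.
func withinLimit(_ seconds: Double, limit: Double) -> Bool {
    seconds <= limit
}

// Slow path: the whole file is copied to a temporary URL before we
// can ask AVAsset for its duration.
func validate(provider: NSItemProvider,
              limit: TimeInterval,
              completion: @escaping (Bool) -> Void) {
    provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, _ in
        guard let url = url else { return completion(false) }
        let seconds = CMTimeGetSeconds(AVAsset(url: url).duration)
        completion(withinLimit(seconds, limit: limit))
    }
}
```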
Post not yet marked as solved
I know that if you want background audio from AVPlayer you need to detach your AVPlayer from either your AVPlayerViewController or your AVPlayerLayer, in addition to having your AVAudioSession configured correctly.
I have that all squared away and background audio is fine until we introduce AVPictureInPictureController or use the PiP behavior baked into AVPlayerViewController.
If you want PiP to behave as expected when you put your app into the background (by switching to another app or going to the home screen), you can't perform the detachment operation, otherwise the PiP display fails.
On an iPad, if PiP is active and you lock your device, you continue to get background audio playback. However, on an iPhone, if PiP is active and you lock the device, the audio pauses.
If PiP is inactive and you lock the device, the audio pauses and you have to manually tap play on the lock screen controls. This is the same on iPad and iPhone.
My questions are:
Is there a way to keep background-audio playback going when PiP is inactive and the device is locked? (iPhone and iPad)
Is there a way to keep background-audio playback going when PiP is active and the device is locked? (iPhone)
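For context, a sketch of what "AVAudioSession configured correctly" means here. This assumes the standard .playback category plus the "audio" background mode in Info.plist (not shown):

```swift
import AVFoundation

// Background-audio prerequisite: the .playback category keeps audio
// running when the app leaves the foreground (together with the
// "audio" UIBackgroundModes entry in Info.plist).
func configureAudioSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .moviePlayback)
    try session.setActive(true)
}
```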
Post not yet marked as solved
Hello everyone, I'm seeing a weird crash on Bugsnag.
It's about the player on tvOS, and it happens when I'm exiting the player.
And it involves two system classes.
Can someone help me understand what's going on here?
Unable to activate constraint with anchors <NSLayoutXAxisAnchor:0x2831d5480 "AVFocusProxyView:0x1224b3370.left"> and <NSLayoutXAxisAnchor:0x28356cf40 "AVPlayerLayerView:0x1224bf9b0.left"> because they have no common ancestor. Does the constraint or its anchors reference items in different view hierarchies? That's illegal.
Post not yet marked as solved
I want to make a custom UI that integrates each video feed from my FaceTime group activity participants.
I found that the app Share+ in the App Store integrates the video from each FaceTime participant into its own UI, so I know it's possible.
Can anyone point me to the relevant documentation that shows me how I can get to the video of each FaceTime group member to put in my own AVSampleBufferDisplayLayer or otherwise?
Post not yet marked as solved
The app crashes at first launch after updating the OS to 15.1; after the crash it works fine. On earlier versions, such as 15.0, it worked fine.
While debugging, I found that on first launch the app gets stuck in the code below while requesting audio/video permission.
if ([AVCaptureDevice respondsToSelector:@selector(requestAccessForMediaType:completionHandler:)]) {
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL granted) {
        if (granted) {
            dispatch_async(dispatch_get_main_queue(), ^{
                // permission granted: continue setup on the main queue
            });
        } else {
            // permission denied
        }
    }];
} else {
    // requestAccessForMediaType:completionHandler: not available
}
Post not yet marked as solved
RealityKit ARImageAnchor with VideoMaterial problems
When I move the camera closer, sometimes the image from the ARResources overlaps the video with itself. What could be the problem?
Links:
https://www.dropbox.com/s/b8yaczq4xjk9v1p/IMG_9429.PNG?dl=0
https://www.dropbox.com/s/59dj4ldf6l3yj4u/RPReplay_Final1637392988.mov?dl=0
VideoEntity class
final class VideoEntity {
    var videoPlayer = AVPlayer()

    func videoModelEntity(width: Float?, height: Float?) -> ModelEntity {
        let plane = MeshResource.generatePlane(width: width ?? Float(), height: height ?? Float())
        let videoItem = createVideoItem(with: "Cooperation")
        let videoMaterial = createVideoMaterial(with: videoItem)
        return ModelEntity(mesh: plane, materials: [videoMaterial])
    }

    func placeVideoScreen(videoEntity: ModelEntity, imageAnchor: ARImageAnchor, arView: ARView) {
        let anchorEntity = AnchorEntity(anchor: imageAnchor)
        let rotationAngle = simd_quatf(angle: GLKMathDegreesToRadians(-90), axis: SIMD3<Float>(x: 1, y: 0, z: 0))
        videoEntity.setOrientation(rotationAngle, relativeTo: anchorEntity)
        videoEntity.setPosition(SIMD3<Float>(x: 0, y: 0.015, z: 0), relativeTo: anchorEntity)
        anchorEntity.addChild(videoEntity)
        arView.scene.addAnchor(anchorEntity)
    }

    private func createVideoItem(with filename: String) -> AVPlayerItem {
        guard let url = Bundle.main.url(forResource: filename, withExtension: "mov") else {
            fatalError("Fatal Error: - No file source.")
        }
        return AVPlayerItem(url: url)
    }

    private func createVideoMaterial(with videoItem: AVPlayerItem) -> VideoMaterial {
        let videoMaterial = VideoMaterial(avPlayer: videoPlayer)
        videoPlayer.replaceCurrentItem(with: videoItem)
        videoPlayer.actionAtItemEnd = .none
        videoPlayer.play()
        NotificationCenter.default.addObserver(self, selector: #selector(loopVideo), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: videoPlayer.currentItem)
        return videoMaterial
    }

    @objc
    private func loopVideo(notification: Notification) {
        guard let playerItem = notification.object as? AVPlayerItem else { return }
        playerItem.seek(to: CMTime.zero, completionHandler: nil)
        videoPlayer.play()
    }
}
ViewModel class
func startImageTracking(arView: ARView) {
    guard let arReferenceImage = ARReferenceImage.referenceImages(inGroupNamed: "ARResources", bundle: Bundle.main) else { return }
    let configuration = ARImageTrackingConfiguration().do {
        $0.trackingImages = arReferenceImage
        $0.maximumNumberOfTrackedImages = 1
    }
    let personSegmentation: ARWorldTrackingConfiguration.FrameSemantics = .personSegmentationWithDepth
    if ARWorldTrackingConfiguration.supportsFrameSemantics(personSegmentation) {
        configuration.frameSemantics.insert(personSegmentation)
    }
    arView.session.run(configuration, options: [.resetTracking])
}
ARSessionDelegate protocol
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for anchor in anchors {
        if let imageAnchor = anchor as? ARImageAnchor {
            let videoEntity = viewModel.videoEntity.videoModelEntity(width: Float(imageAnchor.referenceImage.physicalSize.width), height: Float(imageAnchor.referenceImage.physicalSize.height))
            viewModel.videoEntity.placeVideoScreen(videoEntity: videoEntity, imageAnchor: imageAnchor, arView: arView)
        }
    }
}
I am currently working on a macOS app which creates very large video files with up to an hour of content. However, generating the images and adding them to AVAssetWriter leads to VTDecoderXPCService using 16+ GB of memory and kernel_task using 40+ GB (the max I saw was 105 GB).
It seems like the generated video is not streamed onto the disk but rather written to memory for it to be written to disk all at once. How can I force it to stream the data to disk while the encoding is happening?
By the way, my app itself consistently needs around 300 MB of memory, so I don't think I have a memory leak here.
Here is the relevant code:
func analyse() {
    self.videoWriter = try! AVAssetWriter(outputURL: outputVideo, fileType: AVFileType.mp4)
    let writeSettings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: videoSize.width,
        AVVideoHeightKey: videoSize.height,
        AVVideoCompressionPropertiesKey: [
            AVVideoAverageBitRateKey: 10_000_000,
        ]
    ]
    self.videoWriter!.movieFragmentInterval = CMTimeMake(value: 60, timescale: 1)
    self.frameInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: writeSettings)
    self.frameInput?.expectsMediaDataInRealTime = true
    self.videoWriter!.add(self.frameInput!)
    if self.videoWriter!.startWriting() == false {
        print("Could not write file: \(self.videoWriter!.error!)")
        return
    }
}

func writeFrame(frame: Frame) {
    /* some more code to determine the correct frame presentation time stamp */
    let newSampleBuffer = self.setTimeStamp(frame.sampleBuffer, newTime: self.nextFrameStartTime!)
    self.frameInput!.append(newSampleBuffer)
    /* some more code */
}

func setTimeStamp(_ sample: CMSampleBuffer, newTime: CMTime) -> CMSampleBuffer {
    var timing = CMSampleTimingInfo(
        duration: CMTime.zero,
        presentationTimeStamp: newTime,
        decodeTimeStamp: CMTime.zero
    )
    var newSampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateCopyWithNewTiming(
        allocator: kCFAllocatorDefault,
        sampleBuffer: sample,
        sampleTimingEntryCount: 1,
        sampleTimingArray: &timing,
        sampleBufferOut: &newSampleBuffer
    )
    return newSampleBuffer!
}
My specs:
MacBook Pro 2018
32GB Memory
macOS Big Sur 11.1
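A hedged sketch of the pull-model alternative that AVAssetWriterInput offers for non-realtime encoding: with expectsMediaDataInRealTime set to false, and frames appended only while isReadyForMoreMediaData is true, the writer can throttle the producer and flush to disk, keeping the in-memory queue bounded. The `next` closure is a placeholder for whatever produces the sample buffers:

```swift
import AVFoundation

// Sketch: pull frames only when the writer can take them. `next`
// returns the next timed sample buffer, or nil when done.
func drainFrames(into input: AVAssetWriterInput,
                 writer: AVAssetWriter,
                 next: @escaping () -> CMSampleBuffer?) {
    input.requestMediaDataWhenReady(on: DispatchQueue(label: "video.writing")) {
        while input.isReadyForMoreMediaData {
            guard let buffer = next() else {
                input.markAsFinished()
                writer.finishWriting { }
                return
            }
            input.append(buffer)
        }
    }
}
```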
Post not yet marked as solved
Hi,
I have an app attempting to do some video playback and editing, but I'm having an issue where with some videos, some operations like AVQueuePlayer/AVPlayerItem.seek will cause all players to move to .failed and just show black.
The only indication the OS gives the app that this has happened is some logs relating to the haptic engine:
[hcln] AVHapticClient.mm:1309 -[AVHapticClient handleServerConnectionInterruption]: [xpc] Entered (due to connection interruption) for client ID 0x100031e
[hapi] CHHapticEngine.mm:614 -[CHHapticEngine finishInit:]_block_invoke: ERROR: Server connection broke with error 'The operation couldn’t be completed. (com.apple.CoreHaptics error -4811.)
In the device console, the most likely culprit seems to be this (reported by the kernel):
EXC_RESOURCE -> mediaserverd[768] exceeded mem limit: ActiveSoft 2500 MB (non-fatal)
If my assumption is correct, is it possible to mitigate this issue before mediaserverd and all the associated players get killed?
My understanding is that AV memory is separate from the app's memory, so responding to the usual didReceiveMemoryWarningNotification isn't applicable here, and that notification doesn't seem to be sent before these failures.
Thanks
Post not yet marked as solved
Issue
I'm using AVFoundation to implement a Camera that is able to record videos while running special AI processing.
Having an AVCaptureMovieFileOutput (for video recording) and a AVCaptureVideoDataOutput (for processing AI) running at the same time is not supported (see https://stackoverflow.com/q/4944083/5281431), so I have decided to use a single AVCaptureVideoDataOutput which is able to record videos to a file while running the AI processing in the same captureOutput(...) callback.
To my surprise, doing that drastically increases RAM usage from 58 MB to 187 MB (!!!), and CPU from 3-5% to 7-12% while idle. While actually recording, the RAM goes up even more (260 MB!).
I am wondering what I did wrong here, since I disabled all the AI processing and just compared the differences between AVCaptureMovieFileOutput and AVCaptureVideoDataOutput.
My code:
AVCaptureMovieFileOutput
Setup
swift
if let movieOutput = self.movieOutput {
    captureSession.removeOutput(movieOutput)
}
movieOutput = AVCaptureMovieFileOutput()
captureSession.addOutput(movieOutput!)
Delegate
(well there is none, AVCaptureMovieFileOutput handles all that internally)
Benchmark
When idle, so not recording at all:
RAM: 56 MB
CPU: 3-5%
When recording using AVCaptureMovieFileOutput.startRecording:
RAM: 56 MB (how???)
CPU: 20-30%
AVCaptureVideoDataOutput
Setup
swift
// Video
if let videoOutput = self.videoOutput {
captureSession.removeOutput(videoOutput)
self.videoOutput = nil
}
videoOutput = AVCaptureVideoDataOutput()
videoOutput!.setSampleBufferDelegate(self, queue: videoQueue)
videoOutput!.alwaysDiscardsLateVideoFrames = true
captureSession.addOutput(videoOutput!)
// Audio
if let audioOutput = self.audioOutput {
captureSession.removeOutput(audioOutput)
self.audioOutput = nil
}
audioOutput = AVCaptureAudioDataOutput()
audioOutput!.setSampleBufferDelegate(self, queue: audioQueue)
captureSession.addOutput(audioOutput!)
Delegate
swift
extension CameraView: AVCaptureVideoDataOutputSampleBufferDelegate,
                      AVCaptureAudioDataOutputSampleBufferDelegate {
    public final func captureOutput(_ captureOutput: AVCaptureOutput,
                                    didOutput sampleBuffer: CMSampleBuffer,
                                    from _: AVCaptureConnection) {
        // empty
    }

    public final func captureOutput(_ captureOutput: AVCaptureOutput,
                                    didDrop buffer: CMSampleBuffer,
                                    from _: AVCaptureConnection) {
        // empty
    }
}
Yes, they are literally empty methods. My RAM and CPU usage is still that high without doing any work here.
Benchmark
When idle, so not recording at all:
RAM: 151-187 MB
CPU: 7-12%
When recording using a custom AVAssetWriter:
RAM: 260 MB
CPU: 64%
Why is AVCaptureMovieFileOutput so much more efficient than an empty AVCaptureVideoDataOutput? Also, why does its RAM not go up at all when recording, when my AVAssetWriter implementation alone consumes 80 MB?
Here's my custom AVAssetWriter implementation: [RecordingSession.swift](https://github.com/cuvent/react-native-vision-camera/blob/frame-processors/ios/RecordingSession.swift), and here's where I call it - https://github.com/cuvent/react-native-vision-camera/blob/a48ca839e93e6199ad731f348e19427774c92821/ios/CameraView%2BRecordVideo.swift#L16-L86.
Any help appreciated!
Post not yet marked as solved
Is there a way to programmatically check whether the spatial audio setting is activated on an iOS device?
Post not yet marked as solved
When I open a video in HTML, the app crashes when I start playing it. Here is my app's crash report:
0   CoreFoundation     __HALT + 2
1   QuartzCore         CA::Layer::setter(unsigned int, _CAValueType, void const*) + 252
2   QuartzCore         -[CALayer setBackgroundColor:] + 56
3   QuartzCore         -[CAStateSetValue apply:] + 620
4   QuartzCore         -[CAStateController setState:ofLayer:transitionSpeed:] + 1364
5   AVKit              -[AVMicaPackage transitionToStateWithName:onLayer:] + 144
6   AVKit              -[AVMicaPackage transitionToStateWithName:] + 92
7   AVKit              -[AVMicaPackage _setState:] + 108
8   AVKit              -[AVMicaPackage setState:color:] + 116
9   AVKit              -[AVPlaybackControlsRoutePickerView updateButtonAppearance] + 204
10  AVKit              -[AVRoutePickerView layoutSubviews] + 632
11  UIKitCore          -[UIView(CALayerDelegate) layoutSublayersOfLayer:] + 2500
12  QuartzCore         -[CALayer layoutSublayers] + 296
13  QuartzCore         CA::Layer::layout_if_needed(CA::Transaction*) + 524
14  QuartzCore         CA::Layer::layout_and_display_if_needed(CA::Transaction*) + 144
15  QuartzCore         CA::Context::commit_transaction(CA::Transaction*, double, double*) + 416
16  QuartzCore         CA::Transaction::commit() + 732
17  QuartzCore         CA::Transaction::observer_callback(__CFRunLoopObserver*, unsigned long, void*) + 96
18  CoreFoundation     __CFRUNLOOP_IS_CALLING_OUT_TO_AN_OBSERVER_CALLBACK_FUNCTION__ + 36
19  CoreFoundation     __CFRunLoopDoObservers + 576
20  CoreFoundation     __CFRunLoopRun + 1056
21  CoreFoundation     CFRunLoopRunSpecific + 600
22  GraphicsServices   GSEventRunModal + 164
23  UIKitCore          -[UIApplication _run] + 1072
24  UIKitCore          UIApplicationMain + 168
25  AXNews             main + 15 (main.m, line 15)
26  libdyld.dylib      start + 4
Post not yet marked as solved
I'm stuck on these errors
"Cannot declare entity named '$reel'; the '$' prefix is reserved for implicitly-synthesized declarations"
"Generic struct 'ForEach' requires that 'Binding<[Reel]>' conform to 'RandomAccessCollection'"
Both errors are on this line of code
ForEach($reels) { $reel in
    ReelsPlayer(reel: $reel)
}
I have a TabView {} and am trying to make each page a new video, similar to TikTok or Vine. It works when I pass in dummy data, but when I pass in my array of videos I get those errors.
Here's the rest of the code
The View:
import SwiftUI
import AVKit

struct ReelsView: View {
    @State var currentReel = ""

    // extracting AVPlayer from media file
    @State var reels: [Reel] = MediaFileJSON.map { item -> Reel in
        let url = Bundle.main.path(forResource: item.url, ofType: "MP4") ?? ""
        let player = AVPlayer(url: URL(fileURLWithPath: url))
        return Reel(player: player, mediaFile: item)
    }

    var body: some View {
        // setting width and height for rotated view
        GeometryReader { proxy in
            let size = proxy.size
            // vertical page tab view
            TabView(selection: $currentReel) {
                ForEach($reels) { $reel in // ERRORS HERE
                    ReelsPlayer(reel: $reel)
                        // setting width
                        .frame(width: size.width)
                        .padding()
                        // rotate content
                        .rotationEffect(.init(degrees: -90))
                }
            }
            // Rotate View
            .rotationEffect(.init(degrees: 90))
            // setting height as width
            .frame(width: size.height)
            .tabViewStyle(PageTabViewStyle(indexDisplayMode: .never))
            // setting max width
            .frame(width: size.width)
        }
    }
}

struct ReelsView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}

struct ReelsPlayer: View, Identifiable {
    var id: ObjectIdentifier
    @Binding var reel: Reel

    var body: some View {
        ZStack {
        }
    }
}

MediaFile struct:

struct MediaFile: Identifiable {
    var id = UUID().uuidString
    var url: String
    var title: String
    var isExpanded: Bool = false
}

var MediaFileJSON = [
    MediaFile(url: "Reel1", title: "Apple AirTag......."),
    MediaFile(url: "Reel2", title: "this is the second."),
    MediaFile(url: "Reel3", title: "the third"),
    MediaFile(url: "Reel4", title: "wooooooooooooo ya"),
    MediaFile(url: "Reel5", title: "ay ayayayayay ayay lsdfk sl"),
    MediaFile(url: "Reel6", title: "this is the last one....."),
]

Reel struct:

struct Reel: Identifiable {
    var id: String = UUID().uuidString
    var player: AVPlayer?
    var mediaFile: MediaFile
}
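One note for context: I believe both diagnostics are what older toolchains emit because `ForEach` over a collection of bindings (`ForEach($reels)`) requires Swift 5.5 / the iOS 15 SDK (Xcode 13). A hedged fallback sketch for earlier SDKs, indexing into the binding instead; the `Reel` and `ReelsPlayer` stand-ins below are simplified copies of the post's types so the sketch is self-contained:

```swift
import SwiftUI
import AVKit

// Simplified stand-ins mirroring the post's types.
struct Reel: Identifiable {
    var id = UUID().uuidString
    var player: AVPlayer? = nil
}

struct ReelsPlayer: View {
    @Binding var reel: Reel
    var body: some View { ZStack { } }
}

// Fallback for pre-iOS-15 SDKs: index into the binding rather than
// iterating bindings directly.
struct ReelsList: View {
    @Binding var reels: [Reel]
    var body: some View {
        ForEach(reels.indices, id: \.self) { i in
            ReelsPlayer(reel: $reels[i])
        }
    }
}
```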
Post not yet marked as solved
I am building a macOS app with SwiftUI.
I would like the window to resize according to the media's aspect ratio, like VLC does.
But I could not find any function to control the size of the window at runtime.
The frame modifier either produces a fixed-size window or doesn't work at all (maxHeight/maxWidth).
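SwiftUI itself doesn't expose this, but the underlying NSWindow does; a hedged sketch dropping down to AppKit, where contentAspectRatio constrains user resizing (the 640-point base width is an arbitrary choice):

```swift
import AppKit

// Constrain user resizing to the media's aspect ratio and apply an
// initial size matching it.
func lockAspectRatio(of window: NSWindow, to videoSize: CGSize) {
    window.contentAspectRatio = videoSize
    let width: CGFloat = 640 // arbitrary base width
    window.setContentSize(NSSize(width: width,
                                 height: width * videoSize.height / videoSize.width))
}
```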
Post not yet marked as solved
I'm trying to programmatically take a screenshot of a view controller that contains an AVPlayerViewController. The problem is that when taking the screenshot on the simulator the video player appears, but on a real device it appears blank.
Here's the relevant code:
UIGraphicsBeginImageContextWithOptions(imageSize, NO, 0.0f);
CGContextRef context = UIGraphicsGetCurrentContext();
[window drawViewHierarchyInRect:windowFrame afterScreenUpdates:afterScreenUpdates];
UIImage *screenShot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
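For what it's worth, video layers often don't render through drawViewHierarchyInRect: on device. A hedged Swift sketch of one workaround: grab the frame currently displayed via AVAssetImageGenerator and composite it over the blank player area yourself:

```swift
import AVFoundation

// Grab the frame the player is currently showing, with zero time
// tolerance so it matches the on-screen content exactly.
func currentFrame(of player: AVPlayer) -> CGImage? {
    guard let asset = player.currentItem?.asset else { return nil }
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero
    return try? generator.copyCGImage(at: player.currentTime(), actualTime: nil)
}
```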
Post not yet marked as solved
I have a music app that can play in the background, using AVQueuePlayer. I'm in the process of adding support for CloudKit sync of the CoreData store, switching from NSPersistentContainer to NSPersistentCloudKitContainer.
The initial sync can be fairly large (10,000+ records), depending on how much the user has used the app. The issue I'm seeing is this:
✅ When the app is in the foreground, CloudKit sync uses a lot of CPU, nearly 100% for a long time (this is expected during the initial sync).
✅ If I AM NOT playing music, when I put the app in the background, CloudKit sync eventually stops syncing until I bring the app to the foreground again (this is also expected).
❌ If I AM playing music, when I put the app in the background, CloudKit never stops syncing, which leads the system to terminate the app after a certain amount of time due to high CPU usage.
Is there any way to pause the CloudKit sync when the app is in the background or is there any way to mitigate this?
Post not yet marked as solved
I'm using AVPictureInPictureController.isPictureInPictureSupported() to detect whether the PiP feature is supported on the device. It works on iPadOS 13 and 14.
As we know, iOS 14 supports PiP on iPhone. I'm using the same code, but it returns false.
And when I try AVPictureInPictureController(playerLayer: playerLayer).isPictureInPicturePossible, it returns nil.
I tested on iOS 14 beta 1 and beta 2 with an iPhone X in the simulator; it's still the same.
I also see that the PiP button is not shown for Safari HTML5 video playback.
How can I make it work on iOS 14? Or how do I enable it?
Post not yet marked as solved
How can I play video using AVPlayer? I have retrieved the file URL i.e file:///Users/admin/Library/Developer/CoreSimulator/Devices/718B08F8-D4DD-44E6-9DFA-0E81D5EDA78C/data/Containers/Shared/AppGroup/D82C51F4-E1B2-4390-9885-296A185ACF16/File%20Provider%20Storage/photospicker/version=1&uuid=BCC39930-E835-4BBE-A6F1-716B21CA10A0&mode=compatible.mov
How do I play the video from this URL?
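A hedged sketch of the basic wiring, assuming the file URL above is still readable by the app (a picker-provided file may first need to be copied into your own container before playback):

```swift
import AVKit
import UIKit

// Present a player for an already-obtained file URL and start playback.
func play(_ url: URL, from presenter: UIViewController) {
    let controller = AVPlayerViewController()
    controller.player = AVPlayer(url: url)
    presenter.present(controller, animated: true) {
        controller.player?.play()
    }
}
```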