Create view-level services for media playback, complete with user controls, chapter navigation, and support for subtitles and closed captioning using AVKit.

Posts under the AVKit tag

82 Posts
outputVolume of AVAudioSession returning rounded values
Using the hardware volume buttons on the iPhone, you have 16 steps you can adjust your volume to. I want to implement a volume control slider in my app. I am updating the value of the slider using AVAudioSession.sharedInstance().outputVolume. The problem is that this returns values rounded to the nearest 0.05, which makes the slider jump around. (.formatted() is not causing this problem.) You can recreate the problem using the code below.

```swift
import SwiftUI
import AVFoundation

@main
struct VolumeTestApp: App {
    init() {
        try? AVAudioSession.sharedInstance().setActive(true)
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    @State private var volume = Double()
    @State private var difference = Double()

    var body: some View {
        VStack {
            Text("The volume changed by \(difference.formatted())")
            Slider(value: $volume, in: 0...1)
        }
        .onReceive(AVAudioSession.sharedInstance().publisher(for: \.outputVolume)) { value in
            volume = Double(value)
        }
        .onChange(of: volume) { oldValue, newValue in
            // Only used to make the problem more obvious
            if oldValue > newValue {
                difference = oldValue - newValue
            } else {
                difference = newValue - oldValue
            }
        }
    }
}
```

Here is a video of the problem in action: https://share.icloud.com/photos/00fmp7Vq1AkRetxcIP5EXeAZA

What am I doing wrong, or what can I do to avoid this? Thank you.
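One workaround worth considering (a sketch, not from the thread): let the system draw the slider with MPVolumeView from the MediaPlayer framework, which tracks the hardware volume natively instead of mirroring the quantized outputVolume values by hand.

```swift
import SwiftUI
import MediaPlayer

// Minimal sketch: wrap the system volume slider so the in-app control
// and the hardware volume can't disagree. Run on-device; the simulator
// does not expose the hardware volume path.
struct SystemVolumeSlider: UIViewRepresentable {
    func makeUIView(context: Context) -> MPVolumeView {
        MPVolumeView(frame: .zero)
    }

    func updateUIView(_ uiView: MPVolumeView, context: Context) {}
}

// Usage: SystemVolumeSlider().frame(height: 44)
```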
Replies: 1 · Boosts: 0 · Views: 207 · Activity: 4d
Crash in AVPlayerView.setPlaybackControlsViewController
A user of my app, which shows subtitles loaded from a text file above a video loaded from another file, reported that it crashes within minutes of launching it. The user confirmed that the crash only happens when they load a video within the app; if they use it without a video, it doesn't crash. The subtitles are shown in an NSTextView added to AVPlayerView.contentOverlayView. What could cause such a crash? I'm not able to reproduce it. (I tried attaching the full crash report, but I always got a validation error: "This post contains sensitive language. Please revise it in order to continue.")

```
Crashed Thread:      0  Dispatch queue: com.apple.main-thread
Exception Type:      EXC_BREAKPOINT (SIGTRAP)
Exception Codes:     0x0000000000000001, 0x000000019c50cae8
Termination Reason:  Namespace SIGNAL, Code 5 Trace/BPT trap: 5
Terminating Process: exc handler [20614]

Application Specific Backtrace 0:
0   CoreFoundation   0x0000000198a472ec __exceptionPreprocess + 176
1   libobjc.A.dylib  0x000000019852e788 objc_exception_throw + 60
2   Foundation       0x0000000199b2caa0 -[NSKeyValueNestedProperty object:withObservance:didChangeValueForKeyOrKeys:recurse:forwardingValues:] + 664
3   Foundation       0x0000000199af4e08 -[NSKeyValueUnnestedProperty object:withObservance:didChangeValueForKeyOrKeys:recurse:forwardingValues:] + 196
4   Foundation       0x0000000199b8bc54 NSKeyValueDidChange + 200
5   Foundation       0x0000000199acde1c -[NSObject(NSKeyValueObservingPrivate) _changeValueForKeys:count:maybeOldValuesDict:maybeNewValuesDict:usingBlock:] + 684
6   Foundation       0x0000000199af7484 -[NSObject(NSKeyValueObservingPrivate) _changeValueForKey:key:key:usingBlock:] + 64
7   Foundation       0x0000000199b110e8 _NSSetObjectValueAndNotify + 284
8   AVKit            0x00000001bd3ff694 -[AVPlayerControlsViewController setPlayerController:] + 376
9   Foundation       0x0000000199acddd0 -[NSObject(NSKeyValueObservingPrivate) _changeValueForKeys:count:maybeOldValuesDict:maybeNewValuesDict:usingBlock:] + 608
10  Foundation       0x0000000199af7484 -[NSObject(NSKeyValueObservingPrivate) _changeValueForKey:key:key:usingBlock:] + 64
11  Foundation       0x0000000199b110e8 _NSSetObjectValueAndNotify + 284
12  AVKit            0x00000001bd3c6e68 -[AVPlayerView setPlaybackControlsViewController:] + 88
13  Foundation       0x0000000199b11074 _NSSetObjectValueAndNotify + 168
14  AVKit            0x00000001bd40f4b8 -[AVPlayerView _updatePlaybackControlsViewControllerIfNeeded] + 536
15  AVKit            0x00000001bd3cc0b4 -[AVPlayerView viewDidMoveToWindow] + 136
16  AppKit           0x000000019c24456c -[NSView _setWindow:] + 1788
17  AppKit           0x000000019ccce4a0 __21-[NSView _setWindow:]_block_invoke.146 + 268
18  AppKit           0x000000019c244564 -[NSView _setWindow:] + 1780
19  AppKit           0x000000019ccce4a0 __21-[NSView _setWindow:]_block_invoke.146 + 268
20  AppKit           0x000000019c244564 -[NSView _setWindow:] + 1780
...
35  AppKit           0x000000019ccce4a0 __21-[NSView _setWindow:]_block_invoke.146 + 268
36  AppKit           0x000000019c244564 -[NSView _setWindow:] + 1780
37  AppKit           0x000000019c428ec0 -[NSWindow dealloc] + 684
38  Foundation       0x000000019a1fa118 _NSKVOPerformWithDeallocatingObservable + 172
39  Foundation       0x0000000199b17404 NSKVODeallocate + 180
40  Foundation       0x0000000199b122f0 empty + 88
41  Foundation       0x0000000199af77fc dealloc + 60
42  Foundation       0x0000000199af7740 -[NSConcreteMapTable dealloc] + 76
43  AppKit           0x000000019c936e80 ___NSTouchBarFinderSetNeedsUpdateOnMain_block_invoke_2 + 1388
44  AppKit           0x000000019c2d4c4c NSDisplayCycleObserverInvoke + 168
45  AppKit           0x000000019c2d48a8 NSDisplayCycleFlush + 644
46  QuartzCore       0x00000001a0bc3f64 _ZN2CA11Transaction19run_commit_handlersE18CATransactionPhase + 120
47  QuartzCore       0x00000001a0bc2d04 _ZN2CA11Transaction6commitEv + 320
48  AppKit           0x000000019c3589d0 __62+[CATransaction(NSCATransaction) NS_setFlushesWithDisplayLink]_block_invoke + 272
49  AppKit           0x000000019cd18208 ___NSRunLoopObserverCreateWithHandler_block_invoke + 64
50  CoreFoundation   0x00000001989d187c __CFRUNLOOP_IS_CALLING_OUT_TO_AN_OBSERVER_CALLBACK_FUNCTION__ + 36
51  CoreFoundation   0x00000001989d1768 __CFRunLoopDoObservers + 536
52  CoreFoundation   0x00000001989d0d94 __CFRunLoopRun + 776
53  CoreFoundation   0x00000001989d0434 CFRunLoopRunSpecific + 608
54  HIToolbox        0x00000001a317419c RunCurrentEventLoopInMode + 292
55  HIToolbox        0x00000001a3173e2c ReceiveNextEventCommon + 220
...

Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
0   AppKit           0x19c50cae8 -[NSApplication _crashOnException:] + 240
1   AppKit           0x19c358b44 __62+[CATransaction(NSCATransaction) NS_setFlushesWithDisplayLink]_block_invoke + 644
2   AppKit           0x19cd18208 ___NSRunLoopObserverCreateWithHandler_block_invoke + 64
3   CoreFoundation   0x1989d187c __CFRUNLOOP_IS_CALLING_OUT_TO_AN_OBSERVER_CALLBACK_FUNCTION__ + 36
4   CoreFoundation   0x1989d1768 __CFRunLoopDoObservers + 536
5   CoreFoundation   0x1989d0d94 __CFRunLoopRun + 776
6   CoreFoundation   0x1989d0434 CFRunLoopRunSpecific + 608
7   HIToolbox        0x1a317419c RunCurrentEventLoopInMode + 292
8   HIToolbox        0x1a3173e2c ReceiveNextEventCommon + 220
9   HIToolbox        0x1a3173d30 _BlockUntilNextEventMatchingListInModeWithFilter + 76
10  AppKit           0x19c22fd68 _DPSNextEvent + 660
11  AppKit           0x19ca25808 -[NSApplication(NSEventRouting) _nextEventMatchingEventMask:untilDate:inMode:dequeue:] + 700
12  AppKit           0x19c22309c -[NSApplication run] + 476
13  AppKit           0x19c1fa2e0 NSApplicationMain + 880
14  Underword        0x102c099ac 0x102c08000 + 6572
15  dyld             0x19856a0e0 start + 2360
```
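The report alone doesn't pin down the cause, but the backtrace shows AVKit rebuilding its playback controls (via KVO) while the window is deallocating. A defensive sketch under that assumption, not a confirmed fix: detach the player before the window closes, so AVPlayerView has nothing to rebuild during teardown.

```swift
import AppKit
import AVKit

// Sketch only: a hypothetical window controller for the video window.
// The point is the windowWillClose hook, which clears the player before
// -[NSWindow dealloc] walks the view hierarchy.
final class PlayerWindowController: NSWindowController, NSWindowDelegate {
    @IBOutlet var playerView: AVPlayerView!

    func windowWillClose(_ notification: Notification) {
        playerView.player?.pause()
        playerView.player = nil
    }
}
```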
Replies: 0 · Boosts: 0 · Views: 102 · Activity: 1w
CoreMediaErrorDomain Code=-12884
I have an AVPlayerViewController in my app playing video, and my largest source of errors is NSURLErrorDomain Code=-1008. The underlying error it carries is Error Domain=CoreMediaErrorDomain Code=-12884 "(null)". I couldn't find what the error implies or what causes it; osstatus.com also has no reference to it.
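For gathering more context when an opaque CoreMedia code like this fires, one option (a sketch using AVFoundation's standard error log API) is to dump the player item's error log, whose events pair status codes with the failing URI:

```swift
import AVFoundation

// Sketch: print the item's error log events when playback fails,
// pairing CoreMediaErrorDomain codes (e.g. -12884) with the failing URI.
func dumpErrorLog(for item: AVPlayerItem) {
    guard let log = item.errorLog() else { return }
    for event in log.events {
        print("status:", event.errorStatusCode,
              "domain:", event.errorDomain,
              "comment:", event.errorComment ?? "-",
              "uri:", event.uri ?? "-")
    }
}
```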
Replies: 1 · Boosts: 0 · Views: 1.3k · Activity: 1w
Issues with playing 3D videos and adding additional views in visionOS AVPlayerViewController
I want to add additional UI outside of AVPlayerViewController, such as likes and comments, but I found that once AVPlayerViewController is wrapped in other views, it switches to inline mode and can no longer play 3D effects; the video becomes 2D. 3D effects play only when it is full screen. I want to know if there is any way to meet my needs: while still being able to play 3D videos, add additional layout outside the AVPlayerViewController, or use AVPlayerViewController in a way that both plays 3D videos and allows additional layout outside the player. If anyone knows, could you give me some guidance? Thank you.
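No authoritative answer here, but one hedged idea: keep AVPlayerViewController filling its own window (so it is not nested inside other views, which the post says forces inline 2D mode) and attach the extra UI as a visionOS ornament. Whether ornaments preserve the 3D presentation is exactly the open question, so treat this as a sketch; PlayerScreen and the button actions are hypothetical.

```swift
import SwiftUI
import AVKit

// Sketch, assuming visionOS: present the player unwrapped so it keeps its
// full presentation, and hang likes/comments off the window as an ornament.
struct PlayerScreen: UIViewControllerRepresentable {
    let player: AVPlayer

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        return controller
    }

    func updateUIViewController(_ vc: AVPlayerViewController, context: Context) {}
}

struct VideoWindow: View {
    let player: AVPlayer

    var body: some View {
        PlayerScreen(player: player)
            .ignoresSafeArea()
            .ornament(attachmentAnchor: .scene(.bottom)) {
                HStack {
                    Button("Like") { /* hypothetical action */ }
                    Button("Comment") { /* hypothetical action */ }
                }
                .padding()
                .glassBackgroundEffect()
            }
    }
}
```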
Replies: 0 · Boosts: 1 · Views: 187 · Activity: 1w
Controlling spatial video from a floating window
I've created a fully immersive visionOS project and added a spatial video player in the ImmersiveView Swift file. I have a few buttons in a separate VideosView Swift file on a floating window, and I'd like to switch the video playing in ImmersiveView when I click a button in VideosView. The video player works great in ImmersiveView:

```swift
RealityView { content in
    if let videoEntity = try? await Entity(named: "Video", in: realityKitContentBundle) {
        guard let url = Bundle.main.url(forResource: "video1", withExtension: "mov") else {
            fatalError("Video was not found!")
        }
        let asset = AVURLAsset(url: url)
        let playerItem = AVPlayerItem(asset: asset)
        let player = AVPlayer()
        videoEntity.components[VideoPlayerComponent.self] = .init(avPlayer: player)
        content.add(videoEntity)
        player.replaceCurrentItem(with: playerItem)
        player.play()
    } else {
        print("file not found!")
    }
}
```

Buttons in the floating window from VideosView:

```swift
struct VideosView: View {
    var body: some View {
        VStack {
            Button(action: {}) { Text("video 1").font(.title) }
            Button(action: {}) { Text("video 2").font(.title) }
            Button(action: {}) { Text("video 3").font(.title) }
        }
    }
}
```

In general, how do I control the video player across views, and how do I replace the video when each button is selected? Any help/code/links would be greatly appreciated.
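A common SwiftUI pattern for this (a sketch, with hypothetical names like VideoLibrary): keep the AVPlayer in a shared ObservableObject that both the immersive view and the floating window reference, and have the buttons swap the current item on that one player.

```swift
import SwiftUI
import AVFoundation

// Hypothetical shared model: both ImmersiveView and VideosView observe this.
final class VideoLibrary: ObservableObject {
    let player = AVPlayer()

    func play(resource: String) {
        guard let url = Bundle.main.url(forResource: resource, withExtension: "mov") else { return }
        player.replaceCurrentItem(with: AVPlayerItem(url: url))
        player.play()
    }
}

struct VideosView: View {
    @EnvironmentObject var library: VideoLibrary

    var body: some View {
        VStack {
            Button("video 1") { library.play(resource: "video1") }
            Button("video 2") { library.play(resource: "video2") }
            Button("video 3") { library.play(resource: "video3") }
        }
    }
}

// In the immersive view, attach the same player once:
//   videoEntity.components[VideoPlayerComponent.self] = .init(avPlayer: library.player)
// and inject the object at the App level with .environmentObject(VideoLibrary()).
```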
Replies: 1 · Boosts: 0 · Views: 248 · Activity: 3w
AVPlayer and TLS 1.3 compliance for low latency HLS live stream
Hi guys, I'm investigating a failure to play a low-latency live HLS stream, and I'm getting the following error:

```
<AVPlayerItemErrorLog: 0x30367da10>
#Version: 1.0
#Software: AppleCoreMedia/1.0.0.21L227 (Apple TV; U; CPU OS 17_4 like Mac OS X; en_us)
#Date: 2024/05/17 13:11:46.046
#Fields: date time uri cs-guid s-ip status domain comment cs-iftype
2024/05/17 13:11:16.016 https://s2-h21-nlivell01.cdn.xxxxxx.***/..../xxxx.m3u8 -15410 "CoreMediaErrorDomain" "Low Latency: Server must support http2 ECN and SACK" -
2024/05/17 13:11:17.017 -15410 "CoreMediaErrorDomain" "Invalid server blocking reload behavior for low latency" -
2024/05/17 13:11:17.017
```

The stream works when loading from a dev server with TLS 1.3, but fails on CDN servers with TLS 1.2. Regular live streams and VOD streams work normally on those CDN servers. I tried to configure TLSv1.2 in Info.plist, but that didn't help. When running nscurl --ats-diagnostics --verbose, it passes for the server with TLS 1.3 but fails for the CDN servers with TLS 1.2 with error Code=-1005 "The network connection was lost."

Is TLS 1.3 required or just recommended? (Referring to https://developer.apple.com/documentation/http-live-streaming/enabling-low-latency-http-live-streaming-hls and https://datatracker.ietf.org/doc/html/draft-pantos-hls-rfc8216bis) Is it possible to configure AVPlayer to skip ECN and SACK validation? Thanks.
Replies: 0 · Boosts: 0 · Views: 237 · Activity: 4w
We successfully implemented VR180 with MV-HEVC video. However, the results from the URL connection and the AFP connection are too different.
After numerous trials and errors, we finally succeeded in implementing VR180. However, there is a problem. Videos played via a URL (internet) connection experience significant lag. Initially, I thought it was a bitrate issue, but after various tests I began to suspect that the problem might be with the internet connection processing itself. I tested the same video through both file opening (set up as a network drive) and a URL (AWS) connection. Since AWS provides stable speeds, I concluded there is no issue there. The video files are 8K, with a bitrate between 80 and 90 Mbps. The conditions for decoding and implementing 8K are the same. Also, when I mirrored the video, there was significant lag. Both AFP and URL use the same wireless conditions, so I assume the conditions for implementing 8K are the same. When mirroring, the AFP connection had no lag at all. Could it be that visionOS's URL (internet connection) handling is causing a high system load? I noticed that an app called AmazeVR allows videos to be downloaded before playing. Could this be because of the URL issue? If anyone knows, please respond.
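Following the download-before-play idea the post mentions (AmazeVR), a hedged sketch for a flat file such as an 8K .mov on AWS (not HLS): fetch it to disk with URLSession, then play the local file so decoding never competes with streaming.

```swift
import AVFoundation
import Foundation

// Sketch: download the full file once, then play from local storage.
// Assumes a direct file URL, not an HLS playlist.
func downloadThenPlay(remote: URL, completion: @escaping (AVPlayer) -> Void) {
    let task = URLSession.shared.downloadTask(with: remote) { tempURL, _, error in
        guard let tempURL else {
            print(error?.localizedDescription ?? "download failed")
            return
        }
        let dest = FileManager.default.temporaryDirectory
            .appendingPathComponent(remote.lastPathComponent)
        try? FileManager.default.removeItem(at: dest) // replace any stale copy
        do {
            try FileManager.default.moveItem(at: tempURL, to: dest)
            DispatchQueue.main.async { completion(AVPlayer(url: dest)) }
        } catch {
            print("could not persist download: \(error)")
        }
    }
    task.resume()
}
```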
Replies: 2 · Boosts: 0 · Views: 402 · Activity: Apr ’24
SwiftUI - observing AVPlayer playback state
I am learning SwiftUI. I want to observe an AVPlayer's status so I know when a video is paused or not. My current approach is more or less like this: I have a VideosView that holds a list of videos (in ZStack cards). VideosView has a VideosViewModel; in VideosView's onAppear I call VideosViewModel.getItems...

```swift
struct ItemModel: Identifiable, Codable, Hashable, Equatable {
    var id: String
    var author: String     // video owner
    var url: URL?          // url to the video
    var player: AVPlayer?  // AVPlayer created based on self.url...

    mutating func setPlayer(_ avPlayer: AVPlayer) {
        self.player = avPlayer
    }
}

// vm
class FeedViewModel: ObservableObject {
    @Published private(set) var items: [ItemModel] = []

    func getItems() async {
        do {
            // fetch data from the API
            let data = try await dataService.fetchFeeds()
            // download and attach videos
            downloadFeedVideos(data)
        } catch {
            // ....
        }
    }

    private func downloadFeedVideos(_ feeds: [ItemModel]) {
        for index in feeds.indices {
            var item = feeds[index]
            if let videoURL = item.url {
                self.downloadQueue.queueDownloadIfFileNotExists(
                    videoURL,
                    DownloadOperation(
                        session: URLSession.shared,
                        downloadTaskURL: videoURL,
                        completionHandler: { [weak self] (localURL, response, error) in
                            guard let tempUrl = localURL else { return }
                            let saveResult = self?.fileManagerService.saveInTemp(tempUrl, fileName: videoURL.lastPathComponent)
                            switch saveResult {
                            case .success(let savedURL):
                                DispatchQueue.main.async {
                                    // maybe this is a wrong place to have it?
                                    item.setPlayer(AVPlayer(url: savedURL))
                                    self?.items.append(item)
                                    if (self?.items.count ?? 0) > 1 {
                                        // once first video is downloaded, use all device cores to fetch next videos
                                        // all newest iOS devices have 6 cores
                                        self?.downloadQueue.setMaxConcurrentOperationCount(.max)
                                    }
                                }
                            case .none:
                                break
                            case .failure(_):
                                EventTracking().track("Video download fail", [
                                    "id": item.id,
                                    "ulr": videoURL.absoluteString.decodeURL()
                                ])
                            }
                        }),
                    { fileCacheURL in
                        // file already downloaded
                        DispatchQueue.main.async {
                            item.setPlayer(AVPlayer(url: fileCacheURL))
                            self.items.append(item)
                        }
                    })
            }
        }
    }
}
```

I found this article with some pseudo-code on how to track video playback state, but I'm not sure how to implement it in my code: https://developer.apple.com/documentation/avfoundation/media_playback/observing_playback_state
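On the observation question itself, a hedged sketch: AVPlayer.timeControlStatus is key-value observable, so Combine's publisher(for:) can drive SwiftUI state directly (PlayerStateView is a hypothetical name):

```swift
import SwiftUI
import AVFoundation
import Combine

// Sketch: surface an AVPlayer's paused/playing state to SwiftUI.
struct PlayerStateView: View {
    let player: AVPlayer
    @State private var isPlaying = false

    var body: some View {
        Text(isPlaying ? "Playing" : "Paused")
            .onReceive(player.publisher(for: \.timeControlStatus)) { status in
                isPlaying = (status == .playing)
            }
    }
}
```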
Replies: 1 · Boosts: 0 · Views: 1k · Activity: Apr ’24
SwiftUI View nested in UIRepresentable will not update size
I'm using something similar to this example.

```swift
import SwiftUI

struct ContentView: View {
    @State private var toggle = false

    var body: some View {
        CustomParentView {
            Button {
                toggle.toggle()
            } label: {
                Text(toggle.description)
            }
        }
    }
}

struct CustomParentView<Content: View>: UIViewRepresentable {
    let content: Content

    @inlinable init(@ViewBuilder content: () -> Content) {
        self.content = content()
    }

    func makeUIView(context: Context) -> UIView {
        let view = UIView()
        let hostingController = context.coordinator.hostingController
        hostingController.view.frame = view.bounds
        hostingController.view.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(hostingController.view)
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        context.coordinator.hostingController.rootView = self.content
    }

    class Coordinator: NSObject {
        var hostingController: UIHostingController<Content>

        init(hostingController: UIHostingController<Content>) {
            self.hostingController = hostingController
        }
    }

    func makeCoordinator() -> Coordinator {
        return Coordinator(hostingController: UIHostingController(rootView: content))
    }
}
```

The only difference is that I'm using UIScrollView. When I have a @State width and call .frame(width:) on the content, the content stays at its initial width even when width changes. I tried `hostingController.sizingOptions = .intrinsicContentSize`; this time the size changes to the correct size if I pinch-zoom the content, but the initial size that triggers updateUIView is .zero, which prevents me from centering the content. Is there a way to dynamically set the size and get correct rendering, just like any child view of a normal SwiftUI view?
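Not a verified fix, but the usual next step is to force a re-measure when rootView changes. A sketch of a drop-in variant of the example's updateUIView (the UIScrollView branch mirrors the poster's setup and is an assumption, not part of the original example):

```swift
// Sketch: inside CustomParentView, re-measure after swapping rootView
// so a changed .frame(width:) on the content is picked up.
func updateUIView(_ uiView: UIView, context: Context) {
    let hostingController = context.coordinator.hostingController
    hostingController.rootView = self.content
    hostingController.view.invalidateIntrinsicContentSize()

    if let scrollView = uiView as? UIScrollView {
        scrollView.contentSize = hostingController.view.systemLayoutSizeFitting(
            UIView.layoutFittingCompressedSize)
    }
}
```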
Replies: 0 · Boosts: 0 · Views: 336 · Activity: Apr ’24
VideoPlayer SwiftUI change background color
I added a VideoPlayer view to my project, but I noticed that during loading, or with a different aspect ratio, the default color of this view is black. I would like to change it to match my app's background. Unfortunately, modifiers such as .background or .foregroundColor don't seem to change it. Is there a way to customize this color?

```swift
struct PlayerLooperView: View {
    private let queuePlayer: AVQueuePlayer!
    private let playerLooper: AVPlayerLooper!

    init(url: URL) {
        let playerItem = AVPlayerItem(url: url)
        self.queuePlayer = AVQueuePlayer(items: [playerItem])
        self.queuePlayer.isMuted = true
        self.playerLooper = AVPlayerLooper(player: queuePlayer, templateItem: playerItem)
        self.queuePlayer.play()
    }

    var body: some View {
        VideoPlayer(player: queuePlayer)
            .disabled(true)
            .scaledToFit()
    }
}
```
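A hedged workaround sketch, since VideoPlayer itself doesn't expose its letterbox color: wrap AVPlayerLayer directly, where the backing layer's background color is yours to set.

```swift
import SwiftUI
import AVFoundation

// Sketch: a UIViewRepresentable player whose backing layer color you control.
struct CustomPlayerView: UIViewRepresentable {
    let player: AVPlayer

    final class PlayerUIView: UIView {
        override static var layerClass: AnyClass { AVPlayerLayer.self }
        var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }
    }

    func makeUIView(context: Context) -> PlayerUIView {
        let view = PlayerUIView()
        view.playerLayer.player = player
        view.playerLayer.videoGravity = .resizeAspect
        view.playerLayer.backgroundColor = UIColor.clear.cgColor // letterbox color
        return view
    }

    func updateUIView(_ uiView: PlayerUIView, context: Context) {}
}
```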
Replies: 2 · Boosts: 0 · Views: 454 · Activity: Apr ’24
CallKit invokes CXEndCallAction after starting the call, resulting in hangup on Simulator
I am using Xcode Version 15.3 (15E204a) and different versions of Simulator runtimes (17.x, 16.x, 15.0). The app makes outgoing calls and can respond to incoming calls. After starting the call, ~2 s pass before a hangup occurs. In the Console logs I see that CXEndCallAction was invoked by CallKit, and the last suspicious log before CXEndCallAction is invoked is:

```
callservicesd  Disconnecting call because there wont be a UI to host the call: <CSDProviderCall 0x107054300 type=PhoneNumber, value=sdsddsdds, stat=Sending tStat=0, model=<TUCallModel 0x103f661e0 hold=1 grp=1 ungrp=1> ...
```

This used to work before, but since upgrading to Xcode 15 and iOS 17.x it happens constantly on simulator versions 17.x, and sometimes on 16.x; I wasn't able to reproduce it on 15.0. Can someone help me understand why this happens and how to fix it? I have provided some logs below, and I don't see similar logs in cases where the call is okay and CallKit doesn't hang it up. Also, this does not happen on real devices.

From the time CXStartCallAction is invoked until CallKit invokes CXEndCallAction, these are some of the error and warning logs that appear (in the order in which they are logged, though some of them recur):

```
callservicesd  -AVSystemController- +[AVSystemController sharedInstance]: Failed to allocate AVSystemController, numberOfAttempts=3
callservicesd  [WARN] +[AVSystemController sharedAVSystemController] returned nil value
callservicesd  [WARN] Not allowing requested start call action because a call with same UUID already exists callWithUUID: (omitted)
callservicesd  Error while determining process action for callSource: (omitted)
callservicesd  Determined that callSource: <CXXPCCallSource 0x103d060a0, ...>, should process action: <CXStartCallAction 0x107232760 UUID=8D34853F-55DD-4DEC-97A7-551BFD27C924, error: Error Domain=com.apple.CallKit.error.requesttransaction Code=5 "(null)"
callservicesd  [0x103e417a0] invalidated after the last release of the connection object
callservicesd  [WARN] No paired device, so unable to send message UpdateCallContext
callservicesd  FaceTime caller ID (null) is not a valid outgoing relay caller ID
callservicesd  Attempting to find a valid outgoing caller ID in set of available outgoing caller IDs {( )}
callservicesd  Could not automatically select an outgoing caller ID; multiple telephone numbers are listed in the set of available outgoing caller IDs {( )}
callservicesd  Adding call <CSDProviderCall 0x107054300 ...> to dirty calls pool
callservicesd  Entitlement check: ... entitlementCapabilities={( "access-call-providers", "modify-calls", "access-call-capabilities", "access-calls" )}> lacks capability 'access-screen-calls'
callservicesd  [WARN] ... but no dynamic identifier could be found (1) or no handoff user info exists (1). Not broadcasting frontmost call error
com.apple.CallKit.CallDirectory  Unable to initialize CXCallDirectoryStore for reading: Error Domain=NSCocoaErrorDomain Code=513 "You don't have permission to save the file "CallDirectory" in the folder "Library"." ...
{Error Domain=NSPOSIXErrorDomain Code=13 "Permission denied"}}
```

After these logs there is still a message that CXStartCallAction is fulfilled:

```
callservicesd  Start call action fulfilled: <CXStartCallAction 0x107231fe0 UUID=8D34853F-55DD-4DEC-97A7-551BFD27C924 ...>
```

After which the last suspicious log is logged before CXEndCallAction is invoked by CallKit:

```
Disconnecting call because there wont be a UI to host the call: <CSDProviderCall 0x107054300 ...>
```
Replies: 0 · Boosts: 0 · Views: 406 · Activity: Apr ’24
shouldWaitForLoadingOfRequestedResource delegate method not getting called on tvOS app
I'm attempting to integrate DRM into the app. I've developed a prototype, but the delegate method shouldWaitForLoadingOfRequestedResource isn't being triggered on certain devices, although it functions correctly on others. Notably, it's invoked on Apple TV 4K (3rd generation) Wi-Fi (A2737) but not on Apple TV HD (A1625). Are there any specific configurations needed to ensure this method is invoked?

```swift
let url = URL(string: RESOURCE_URL)!

// Create the asset instance and the resource loader because we will be asked
// for the license to play back the DRM protected asset.
let asset = AVURLAsset(url: url)
let queue = DispatchQueue(label: CUSTOM_SERIAL_QUEUE_LABEL)
asset.resourceLoader.setDelegate(self, queue: queue)

// Create the player item and the player to play it back in.
let playerItem = AVPlayerItem(asset: asset)
let player = AVPlayer(playerItem: playerItem)

// Create a new AVPlayerViewController and pass it a reference to the player.
let controller = AVPlayerViewController()
controller.player = player

// Modally present the player and call the player's play() method when complete.
present(controller, animated: true) {
    player.play()
}
```

```swift
// Please note: if your delegate method is not being called, you need to run on a REAL DEVICE.
func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                    shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
    // Getting data for the KSM server. Get the URL from the manifest; we will need it later as it
    // contains the assetId required for the license request.
    guard let url = loadingRequest.request.url else {
        print(#function, "Unable to read URL from loadingRequest")
        loadingRequest.finishLoading(with: NSError(domain: "", code: -1, userInfo: nil))
        return false
    }

    // Link to your certificate on BuyDRM's side.
    // Use the commented section if you want to load the certificate from your bundle, i.e. store locally.
    /*
    guard let certificateURL = Bundle.main.url(forResource: "certificate", withExtension: "der"),
          let certificateData = try? Data(contentsOf: certificateURL) else {
        print("failed...", #function, "Unable to read the certificate data.")
        loadingRequest.finishLoading(with: NSError(domain: "com.domain.error", code: -2, userInfo: nil))
        return false
    }
    */
    guard let certificateData = try? Data(contentsOf: URL(string: CERTIFICATE_URL)!) else {
        print(#function, "Unable to read the certificate data.")
        loadingRequest.finishLoading(with: NSError(domain: "", code: -2, userInfo: nil))
        return false
    }

    // The assetId from the main/variant manifest - skd://***, the *** part. Get the SPC based on the
    // already collected data, i.e. the certificate and the assetId.
    guard let contentId = url.host, let contentIdData = contentId.data(using: String.Encoding.utf8) else {
        loadingRequest.finishLoading(with: NSError(domain: "", code: -3, userInfo: nil))
        print(#function, "Unable to read the SPC data.")
        return false
    }
    guard let spcData = try? loadingRequest.streamingContentKeyRequestData(forApp: certificateData,
                                                                           contentIdentifier: contentIdData,
                                                                           options: nil) else {
        loadingRequest.finishLoading(with: NSError(domain: "", code: -3, userInfo: nil))
        print(#function, "Unable to read the SPC data.")
        return false
    }

    // Prepare to get the license, i.e. the CKC.
    let requestUrl = CKC_URL
    let stringBody = "spc=\(spcData.base64EncodedString())&assetId=\(contentId)"
    let postData = NSData(data: stringBody.data(using: String.Encoding.utf8)!)

    // Make the POST request with customdata set to the authentication XML.
    var request = URLRequest(url: URL(string: requestUrl)!)
    request.httpMethod = "POST"
    request.httpBody = postData as Data
    request.allHTTPHeaderFields = ["customdata": ACCESS_TOKEN]
    let configuration = URLSessionConfiguration.default
    let session = URLSession(configuration: configuration)
    let task = session.dataTask(with: request) { data, response, error in
        if let data = data {
            // The response from the KeyOS MultiKey License server may be an error inside JSON.
            do {
                let parsedData = try JSONSerialization.jsonObject(with: data) as! [String: Any]
                let errorId = parsedData["errorid"] as! String
                let errorMsg = parsedData["errormsg"] as! String
                print(#function, "License request failed with an error: \(errorMsg) [\(errorId)]")
            } catch let error as NSError {
                print(#function, "The response may be a license. Moving on.", error)
            }

            // The response from the KeyOS MultiKey License server is Base64 encoded.
            let dataRequest = loadingRequest.dataRequest!
            // This command sends the CKC to the player.
            dataRequest.respond(with: Data(base64Encoded: data)!)
            loadingRequest.finishLoading()
        } else {
            print(#function, error?.localizedDescription ?? "Error during CKC request.")
        }
    }
    task.resume()

    // Tell the AVPlayer instance to wait. We are working on getting what it wants.
    return true
}
```
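One hedged avenue (not from the thread): on some OS and device combinations, FairPlay skd:// key requests are delivered through AVContentKeySession rather than the resource loader, so the session path may be worth trying when the delegate never fires. A sketch with hypothetical names (KeyDelegate, the queue label, the stream URL):

```swift
import AVFoundation

// Sketch: route FairPlay key requests through AVContentKeySession.
final class KeyDelegate: NSObject, AVContentKeySessionDelegate {
    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVContentKeyRequest) {
        // Build the SPC / fetch the CKC here, as in the resource-loader path,
        // then respond with AVContentKeyResponse(fairPlayStreamingKeyResponseData:).
    }
}

// Keep strong references: the session and delegate must outlive playback.
let keySession = AVContentKeySession(keySystem: .fairPlayStreaming)
let keyDelegate = KeyDelegate()
let keyQueue = DispatchQueue(label: "fairplay.key.queue") // hypothetical label
keySession.setDelegate(keyDelegate, queue: keyQueue)

// Register the asset so its skd:// key requests go to the session.
let asset = AVURLAsset(url: URL(string: "https://example.com/stream.m3u8")!) // hypothetical URL
keySession.addContentKeyRecipient(asset)
```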
Replies: 0 · Boosts: 0 · Views: 413 · Activity: Mar ’24
Capture most recent frame from remote livestream
I'm trying to use AVPlayer to capture frames from a livestream that is remotely playing. Eventually I want to convert these frames to UIImages to be displayed. The code I have right now is not working because pixelBuffer doesn't have an actual value for some reason. When I print itemTime, its value is continuously 0, which I think might be a potential cause of this issue. Would appreciate any help with getting this to work.

```swift
import RealityKit
import RealityKitContent
import AVFoundation
import AVKit

class ViewController: UIViewController {
    let player = AVPlayer(url: URL(string: {webrtc stream link})!)
    let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes:
        [String(kCVPixelBufferPixelFormatTypeKey): NSNumber(value: kCVPixelFormatType_32BGRA)])

    override func viewDidLoad() {
        print("doing viewDidLoad")
        super.viewDidLoad()
        player.currentItem!.add(videoOutput)
        player.play()
        let displayLink = CADisplayLink(target: self, selector: #selector(displayLinkDidRefresh(link:)))
        displayLink.add(to: RunLoop.main, forMode: RunLoop.Mode.common)
    }

    @objc func displayLinkDidRefresh(link: CADisplayLink) {
        let itemTime = videoOutput.itemTime(forHostTime: CACurrentMediaTime())
        if videoOutput.hasNewPixelBuffer(forItemTime: itemTime) {
            if let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil) {
                print("pixelBuffer \(pixelBuffer)") // yay, pixel buffer
                let image = CIImage(cvImageBuffer: pixelBuffer) // or maybe CIImage?
                print("CIImage \(image)")
            }
        }
    }
}

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            if let scene = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                content.add(scene)
            }
            let viewcontroller = ViewController()
            viewcontroller.viewDidLoad()
        }
    }
}
```
Replies: 0 · Boosts: 0 · Views: 315 · Activity: Mar ’24
Error in using AVKit in the Vision Pro for Vedic link app
Dear Apple Developer Forum Community, I hope this message finds you well. I am writing to seek assistance regarding an error I encountered while attempting to create Vedic content in the app from a YouTube link. I have been unsuccessful in resolving it. I am reaching out to the community in the hope that someone might have encountered a similar issue or has expertise in troubleshooting Xcode errors. Any guidance, suggestions, or solutions would be greatly appreciated. Thank you very much for your time and assistance. Sincerely, Zipzy [games]
Replies: 2 · Boosts: 0 · Views: 516 · Activity: Feb ’24
Use of microphone and speaker at the same time without stopping the audio engine
I want to develop an AI assistant iOS application using the Whisper and OpenAI ChatGPT APIs. I am implementing the following steps:

1. Audio engine records the user's voice.
2. Send the audio chunk to Whisper for speech-to-text.
3. Send that text to the OpenAI ChatGPT API to get a response.
4. Send that response to the speech synthesizer to speak it through the built-in speaker.

In this process, I don't want to disable the microphone, because the user should be able to interrupt the speech synthesizer at any time. It should be real time and feel like a continuous call between the user and the AI assistant.

Problem: when the user speaks, the microphone takes the input and appends it to the audio engine's recording file. That chunk is sent to Whisper for transcription, the transcribed text is sent to the ChatGPT API for a response, and the response goes to the speech synthesizer, which produces output on the speaker. The issue is that the microphone then picks up the synthesizer's voice from the speaker, creating a loop. What can I do to stop the microphone from capturing the iPhone speaker's output?

Talking Tom, CallAnnie, and many other iOS applications use the microphone continuously and produce speaker output without overlap or feedback. Please suggest possible approaches. I have tried every combination of audio session category and settings I could: record, playback, playAndRecord, and so on. Nothing keeps the speaker's voice out of my microphone. Technically, as I see it, the microphone should never capture device-generated audio. What could the solution be? If my approach is wrong, I am also open to suggestions and guidance.
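The usual answer to speaker-to-mic feedback is the system's echo cancellation. A hedged sketch (standard AVFoundation APIs, but untested against this exact pipeline): run the session as .playAndRecord with the .voiceChat mode and enable voice processing on the engine's I/O nodes.

```swift
import AVFoundation

// Sketch: echo-cancelled duplex audio so the microphone doesn't
// re-capture the synthesizer's output from the speaker.
func configureDuplexAudio(engine: AVAudioEngine) throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .voiceChat,
                            options: [.defaultToSpeaker, .allowBluetooth])
    try session.setActive(true)

    // Voice processing (iOS 13+) applies system echo cancellation.
    // Apple's docs expect input and output to be toggled together.
    try engine.inputNode.setVoiceProcessingEnabled(true)
    try engine.outputNode.setVoiceProcessingEnabled(true)
}
```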
Replies: 0 · Boosts: 0 · Views: 443 · Activity: Feb ’24
Parental Controls in SwiftUI on tvOS
How do I implement a player in SwiftUI with support for parental controls? Sample code: Working with Overlays and Parental Controls in tvOS. I use AVPlayerViewController in SwiftUI. On cancel (rejecting the authorization request), the screen is black. On channel change (a move command in the up or down direction), I replace the current player item with a new one. The item's status is ready to play, the request succeeds upon replacement, and I set the player to play. The screen is still black.
Replies: 0 · Boosts: 0 · Views: 464 · Activity: Feb ’24