Streaming


Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.


Is there a way to directly go from VideoToolbox to Metal for 10-bit/BT.2020 YCbCr HEVC?
tl;dr: how can I get raw YUV in a Metal fragment shader from a VideoToolbox 10-bit/BT.2020 HEVC stream without any extra/secret format conversions?

With VideoToolbox and 10-bit HEVC, I've found that it defaults to CVPixelBuffers with the formats kCVPixelFormatType_Lossless_420YpCbCr10PackedBiPlanarFullRange or kCVPixelFormatType_Lossy_420YpCbCr10PackedBiPlanarFullRange. To work around this, I've added the following snippet of code to my application:

```swift
// We need our pixels unpacked for 10-bit so that the Metal textures actually work
var pixelFormat: OSType? = nil
let bpc = getBpcForVideoFormat(videoFormat!)
let isFullRange = getIsFullRangeForVideoFormat(videoFormat!)

// TODO: figure out how to check for 422/444, CVImageBufferChromaLocationBottomField?
if bpc == 10 {
    pixelFormat = isFullRange
        ? kCVPixelFormatType_420YpCbCr10BiPlanarFullRange
        : kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange
}

let videoDecoderSpecification: [NSString: AnyObject] = [
    kVTVideoDecoderSpecification_EnableHardwareAcceleratedVideoDecoder: kCFBooleanTrue
]

var destinationImageBufferAttributes: [NSString: AnyObject] = [
    kCVPixelBufferMetalCompatibilityKey: true as NSNumber,
    kCVPixelBufferPoolMinimumBufferCountKey: 3 as NSNumber
]
if pixelFormat != nil {
    destinationImageBufferAttributes[kCVPixelBufferPixelFormatTypeKey] = pixelFormat! as NSNumber
}

var decompressionSession: VTDecompressionSession? = nil
err = VTDecompressionSessionCreate(
    allocator: nil,
    formatDescription: videoFormat!,
    decoderSpecification: videoDecoderSpecification as CFDictionary,
    imageBufferAttributes: destinationImageBufferAttributes as CFDictionary,
    outputCallback: nil,
    decompressionSessionOut: &decompressionSession
)
```

In short, I need kCVPixelFormatType_420YpCbCr10BiPlanar so that I have a straightforward MTLPixelFormat.r16Unorm/MTLPixelFormat.rg16Unorm texture binding for Y/CbCr; Metal seemingly has no direct pixel format for 420YpCbCr10PackedBiPlanar. I'd also rather not use any color conversion in VideoToolbox, to save on processing (and to ensure the color transforms/transfer characteristics match between streamer and client, since I also use a custom transfer characteristic to mitigate blocking in dark scenes).

However, I noticed that in visionOS 2 the CVPixelBuffer I receive is no longer a compressed render target (likely a bug), which caused GPU texture read bandwidth to skyrocket from 2 GiB/s to 30 GiB/s. More importantly, this implies that VideoToolbox may in fact be doing an extra color-conversion step, wasting memory bandwidth.

Does Metal actually have no way to handle 420YpCbCr10PackedBiPlanar? Are there any examples of reading 10-bit HDR HEVC buffers directly with Metal?
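For reference, here is a minimal sketch of the plane binding I'm aiming for, assuming the session above does hand back Metal-compatible kCVPixelFormatType_420YpCbCr10BiPlanar ("x420") buffers; the helper name is illustrative:

```swift
import CoreVideo
import Metal

// Bind plane 0 (Y) as .r16Unorm and plane 1 (CbCr) as .rg16Unorm through a
// CVMetalTextureCache, so the fragment shader sees the raw YCbCr samples.
func makePlaneTexture(cache: CVMetalTextureCache,
                      pixelBuffer: CVPixelBuffer,
                      planeIndex: Int,
                      format: MTLPixelFormat) -> MTLTexture? {
    var cvTexture: CVMetalTexture?
    let status = CVMetalTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault,
        cache,
        pixelBuffer,
        nil,
        format,
        CVPixelBufferGetWidthOfPlane(pixelBuffer, planeIndex),
        CVPixelBufferGetHeightOfPlane(pixelBuffer, planeIndex),
        planeIndex,
        &cvTexture)
    guard status == kCVReturnSuccess, let cvTexture = cvTexture else { return nil }
    // The CVMetalTexture must stay alive for as long as the MTLTexture is in use.
    return CVMetalTextureGetTexture(cvTexture)
}

// Per device, once:
//   var cache: CVMetalTextureCache?
//   CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &cache)
// Per decoded frame:
//   let y    = makePlaneTexture(cache: cache!, pixelBuffer: pb, planeIndex: 0, format: .r16Unorm)
//   let cbcr = makePlaneTexture(cache: cache!, pixelBuffer: pb, planeIndex: 1, format: .rg16Unorm)
```

One caveat, based on my assumption about the x420 layout (worth verifying): the 10-bit samples sit in the high-order bits of each 16-bit word, so a .r16Unorm read tops out at 65472/65535 rather than 1.0, and a scale of 65535.0/65472.0 in the shader restores exact full range before the custom transfer function is applied.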
Replies: 2 · Boosts: 0 · Views: 196 · Activity: 1w
AVPlayer "Server Not Properly Configured" Error in Production
1. Issue found in Native App or Hybrid App: Native
2. OS Version: Any
3. Device: Any
4. Description: We are using AVPlayer for streaming videos in our iOS application. Streaming works fine in the lower (sandbox) environment, but we are encountering a "server not properly configured" error in the production environment.
5. Steps to Reproduce: Configure AVPlayer with a video URL from the production server, then attempt to play the video.
6. Expected Behavior: The video should stream successfully, as it does in the sandbox environment.
7. Actual Behavior: AVPlayer fails to stream the video and reports a "server not properly configured" error.
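One way to get past the opaque message (a debugging sketch, not from the report above): when AVPlayerItem.status flips to .failed, the underlying NSError and the item's error log usually reveal which production response triggered AVError.serverIncorrectlyConfigured (code -11850). The URL below is a placeholder:

```swift
import AVFoundation
import Combine

let productionURL = URL(string: "https://example.com/master.m3u8")! // placeholder
var cancellables = Set<AnyCancellable>()

let item = AVPlayerItem(url: productionURL)
let player = AVPlayer(playerItem: item)

// Observe status via KVO; on .failed, dump the underlying error and error log.
item.publisher(for: \.status)
    .filter { $0 == .failed }
    .sink { _ in
        if let error = item.error as NSError? {
            print("domain: \(error.domain), code: \(error.code)") // expect -11850
            print("underlying: \(String(describing: error.userInfo[NSUnderlyingErrorKey]))")
        }
        // The error log records the failing URI, status code, and server comment.
        for event in item.errorLog()?.events ?? [] {
            print("\(event.errorStatusCode) \(event.uri ?? "?") \(event.errorComment ?? "")")
        }
    }
    .store(in: &cancellables)

player.play()
```

Since the failure appears only in production, comparing the logged URIs against direct requests to the production server typically isolates a serving difference (for example MIME types, byte-range handling, or redirects) that the sandbox environment doesn't have.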
Replies: 0 · Boosts: 0 · Views: 97 · Activity: 1w
Issues with launching a stream in AVPlayer using AppIntent on iOS
I am implementing Siri/Shortcuts support for a radio app on iOS. I have implemented an AppIntent that sends a notification to the app, and the app should then start playing the stream in AVPlayer. The AppIntent sometimes works and sometimes doesn't; so far I couldn't find a pattern for when/why it works and when/why it doesn't. Sometimes it works even if the app is killed or in the background; sometimes it doesn't work when the app is in the background or has been killed.

I have been observing the logs in Console: apparently it sometimes stops when AVPlayer tries to figure out the buffer size (I then get AVPlayerWaitingToMinimizeStallsReason in the console and the AVPlayerItem status stays .unknown). Sometimes it figures it out quickly (for the same stream) and starts playing. Sometimes, when the app has been killed, the AppIntent call launches the app in the background (at least I see it as a process in Console), it receives the notification from the AppIntent, and it starts playing. Sometimes... the app is not launched at all, its process is not visible in Console, so it doesn't receive the notification and doesn't play.

I have set up the audio session correctly (category .playback without any options, and activated), I set AVPlayerItem's preferredForwardBufferDuration to 0 (the default), and AVPlayer's automaticallyWaitsToMinimizeStalling to true. Background processing, Audio, AirPlay, Picture in Picture, and Siri are added in the Signing & Capabilities section of the app project settings.

Here are the code examples.

Play AppIntent (the Stop AppIntent is constructed the same way):

```swift
@available(iOS 16, *)
struct PlayStationIntent: AudioPlaybackIntent {
    static let title: LocalizedStringResource = "Start playing"
    static let description = IntentDescription("Plays currently selected radio")

    @MainActor
    func perform() async throws -> some IntentResult {
        NotificationCenter.default.post(name: IntentsNotifications.siriPlayCurrentStationNotificationName, object: nil)
        return .result()
    }
}
```

AppShortcutsProvider:

```swift
struct RadioTestShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: PlayStationIntent(),
            phrases: [
                "Start station in \(.applicationName)",
            ],
            shortTitle: LocalizedStringResource("Play station"),
            systemImageName: "radio"
        )
    }
}
```

Player object:

```swift
class Player: ObservableObject {
    private let session = AVAudioSession.sharedInstance()
    private let streamURL = URL(string: "http://radio.rockserwis.fm/live")!
    private var player: AVPlayer?
    private var item: AVPlayerItem?
    var cancellables = Set<AnyCancellable>()

    typealias UInfo = [AnyHashable: Any]

    @Published var status: Player.Status = .stopped
    @Published var isPlaying = false

    func setupSession() {
        do {
            try session.setCategory(.playback)
        } catch {
            print("*** Error setting up category audio session: \(error), \(error.localizedDescription)")
        }
        do {
            try session.setActive(true)
        } catch {
            print("*** Error setting audio session active: \(error), \(error.localizedDescription)")
        }
    }

    func setupPlayer() {
        item = AVPlayerItem(url: streamURL)
        item?.preferredForwardBufferDuration = TimeInterval(0)
        player = AVPlayer(playerItem: item)
        player?.automaticallyWaitsToMinimizeStalling = true
        player?.allowsExternalPlayback = false
        let metaDataOutput = AVPlayerItemMetadataOutput(identifiers: nil)
    }

    func play() {
        setupPlayer()
        setupSession()
        handleInterruption()
        player?.play()
        isPlaying = true
        player?.currentItem?.publisher(for: \.status)
            .receive(on: DispatchQueue.main)
            .sink(receiveValue: { status in
                self.handle(status: status)
            })
            .store(in: &self.cancellables)
    }

    func stop() {
        player?.pause()
        player = nil
        isPlaying = false
        status = .stopped
    }

    func handle(status: AVPlayerItem.Status) { ... }

    func handleInterruption() { ... }

    func handle(interruptionType: AVAudioSession.InterruptionType?, userInfo: UInfo?) { ... }
}

extension Player {
    enum Status {
        case waiting, ready, failed, stopped
    }
}

extension Player {
    func setupRemoteTransportControls() { ... }
}
```

Content view:

```swift
struct ContentView: View {
    @EnvironmentObject var player: Player

    var body: some View {
        VStack(spacing: 20) {
            Text("AppIntents Radio Test App")
                .font(.title)
            Button {
                if player.isPlaying {
                    player.stop()
                } else {
                    player.play()
                }
            } label: {
                Image(systemName: player.isPlaying ? "pause.circle" : "play.circle")
                    .font(.system(size: 80))
            }
        }
        .padding()
    }
}

#Preview {
    ContentView()
}
```

Main struct:

```swift
import SwiftUI

@main
struct RadioTestApp: App {
    let player = Player()
    let siriPlayCurrentPub = NotificationCenter.default.publisher(for: IntentsNotifications.siriPlayCurrentStationNotificationName)
    let siriStop = NotificationCenter.default.publisher(for: IntentsNotifications.siriStopRadioNotificationName)

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environmentObject(player)
                .onReceive(siriPlayCurrentPub, perform: { _ in
                    player.play()
                })
                .onReceive(siriStop, perform: { _ in
                    player.stop()
                })
        }
    }
}
```
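One possible explanation for the intermittent behavior (my hypothesis, not confirmed by the post): an AudioPlaybackIntent can launch the app process in the background, but a NotificationCenter post is only delivered to observers that already exist, and on a cold launch perform() may run before the WindowGroup's .onReceive subscriptions are registered. A sketch of driving a player directly from the intent instead, where SharedPlayer is a hypothetical process-wide instance of the Player class above:

```swift
import AppIntents

@available(iOS 16, *)
struct PlayStationDirectIntent: AudioPlaybackIntent {
    static let title: LocalizedStringResource = "Start playing"
    static let description = IntentDescription("Plays the currently selected radio station")

    @MainActor
    func perform() async throws -> some IntentResult {
        // Start playback directly rather than posting a notification, so a
        // cold background launch cannot race the observer setup in the App.
        SharedPlayer.shared.play() // hypothetical shared singleton
        return .result()
    }
}
```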
Replies: 0 · Boosts: 0 · Views: 48 · Activity: 19h