Is it possible to play Fisheye VR180 video directly?

Hi, I know it's possible to play equirectangular VR180 video, either SBS or MV-HEVC. For fisheye video, the only way I know of is to convert it into an AIVU file for playback. Is there any way to play fisheye video directly using AVPlayer? Thanks a lot!

Answered by Vision Pro Engineer in 863856022

Accepted Answer

Yes, you can play fisheye VR180 video directly with AVPlayer, but only if the video file (.mov or .mp4) contains APMP (Apple Projected Media Profile) signaling for ParametricImmersive projection.

What is APMP? It's Apple's metadata standard that embeds lens calibration data directly into video files. This tells AVPlayer how to properly "undistort" and display content captured with specialized cameras (fisheye, 360°, VR180, etc.). Without this metadata, AVPlayer treats fisheye video as regular flat 2D content instead of projected media. If you haven't already, I encourage you to familiarize yourself with APMP. Start with Explore video experiences for visionOS for a high-level overview of video playback in visionOS. Then move on to Learn about the Apple Projected Media Profile. Finally, watch Support immersive video playback in visionOS apps to learn how to use AVPlayer for playback.
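To make "directly with AVPlayer" concrete: once the file carries APMP signaling, playback needs no projection-specific setup. Here's a minimal sketch of one typical presentation path (the full-screen AVPlayerViewController presentation is my assumption of a reasonable setup, not code from the sessions):

import AVKit
import UIKit

// Minimal sketch: fileURL points at a .mov/.mp4 that already carries APMP
// ParametricImmersive signaling, so AVPlayer itself needs no special configuration.
func presentProjectedMedia(from presenter: UIViewController, fileURL: URL) {
    let controller = AVPlayerViewController()
    controller.player = AVPlayer(url: fileURL)
    // Full-screen presentation lets the system offer its immersive viewing
    // experience for projected media on visionOS.
    controller.modalPresentationStyle = .fullScreen
    presenter.present(controller, animated: true) {
        controller.player?.play()
    }
}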

Current Support:

  • Straight-off-the-camera videos from recent action cams like the GoPro HERO13 and Insta360 Ace Pro 2 are automatically converted to wide-FOV APMP (they "just work").
  • Older cameras like the GoPro 11+, Insta360 X4+, and Insta360 Ace Pro+ capture the required vendor lens metadata but need conversion to add APMP signaling (called "uplifting"). Use the "runtime conversion" or avconvert approaches below.
  • Videos without any lens metadata (re-exported footage, custom rigs, etc.) need programmatic APMP embedding using the "custom code" approach below. A quick way to check which bucket a given file falls into is sketched right after this list.
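
Here's a small triage sketch that reuses the ParametricImmersiveAssetInfo type from the runtime-conversion snippet below; the wrapper function is mine, and you'll need to add whichever framework import declares ParametricImmersiveAssetInfo in your SDK (see the APMP sessions above):

import AVFoundation
// Plus the framework that declares ParametricImmersiveAssetInfo (see the
// APMP sessions referenced above).

/// Returns true when the file lacks the vendor lens metadata needed for
/// uplifting, i.e. when the "custom code" path is your only option.
func needsCustomCalibration(url: URL) async -> Bool {
    let asset = AVURLAsset(url: url)
    guard let info = try? await ParametricImmersiveAssetInfo(asset: asset) else {
        return true // the asset couldn't be inspected; assume no usable metadata
    }
    return !info.isConvertible
}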

Conversion Methods:

I personally haven't tried these, but Learn about the Apple Projected Media Profile suggests some approaches:

Runtime conversion:

import AVFoundation

func upliftIntoParametricImmersiveIfPossible(url: URL) async throws -> AVMutableMovie {
    let movie = AVMutableMovie(url: url)

    let assetInfo = try await ParametricImmersiveAssetInfo(asset: movie)
    if assetInfo.isConvertible {
        guard let newDescription = assetInfo.requiredFormatDescription else {
            fatalError("no format description for convertible asset")
        }
        let videoTracks = try await movie.loadTracks(withMediaType: .video)
        guard let videoTrack = videoTracks.first,
              let currentDescription = try await videoTrack.load(.formatDescriptions).first
        else {
            fatalError("missing format description for video track")
        }
        // Presumes the format is already compatible with the intended use case
        // (delivery or production); for delivery, transcode to HEVC first if needed.
        videoTrack.replaceFormatDescription(currentDescription, with: newDescription)
    }
    return movie
}
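
As I understand it, the returned AVMutableMovie is itself an AVAsset, so you can hand it straight to AVPlayer without writing it back to disk (sourceURL below is a placeholder):

// Hypothetical usage of the uplift function above, inside an async throwing context.
let movie = try await upliftIntoParametricImmersiveIfPossible(url: sourceURL)
let player = AVPlayer(playerItem: AVPlayerItem(asset: movie))
player.play()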

Offline using avconvert:

# For video with intact vendor metadata.
avconvert --source wfov_source.mp4 --output wfov_apmp.mov --preset PresetPassthrough
# For video with missing vendor metadata (usually lost in a re-export)
avconvert --source wfov_source_edited.mp4 --output wfov_apmp.mov --preset PresetPassthrough --useAlternateLensCalibration wfov_source.mp4

Offline using a custom-code approach (for manual lens calibration):

For self-calibrated videos (where you determine the lens parameters yourself), you'll need to programmatically embed APMP metadata into your video files and then export them. APMP supports two standard lens distortion models: Brown-Conrady (handles typical camera lens distortion) and Fisheye (the Mei-Rives model from OpenCV contrib). Note that APMP distortion correction is limited to second-order coefficients. For stereo/VR180 videos, you'll need to write separate APMP metadata for each eye, since the left and right channels require individual calibration data. The sessions linked above will help you get started.
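
To make the fisheye option more concrete, here's the Mei-Rives unified projection written out, truncated to the two radial coefficients. This is purely an illustration of the math; the struct and parameter names (xi, k1, k2, fx, fy, cx, cy) follow OpenCV's omnidir conventions and are not APMP field names:

import simd

// Illustration of the Mei-Rives unified ("fisheye") camera model, limited to
// the two radial distortion coefficients. Names follow OpenCV omnidir, not APMP.
struct MeiRivesLens {
    var xi: Double               // unified-model offset of the projection center
    var k1: Double, k2: Double   // radial distortion coefficients
    var fx: Double, fy: Double   // focal lengths in pixels
    var cx: Double, cy: Double   // principal point in pixels

    /// Projects a 3D point in camera space to a pixel coordinate.
    func project(_ point: SIMD3<Double>) -> SIMD2<Double> {
        // 1. Project the point onto the unit sphere.
        let s = simd_normalize(point)
        // 2. Perspective projection from a center displaced by xi along +Z.
        let x = s.x / (s.z + xi)
        let y = s.y / (s.z + xi)
        // 3. Radial distortion: 1 + k1*r^2 + k2*r^4.
        let r2 = x * x + y * y
        let d = 1 + k1 * r2 + k2 * r2 * r2
        // 4. Apply intrinsics to get pixel coordinates.
        return SIMD2(fx * x * d + cx, fy * y * d + cy)
    }
}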

Thanks very much for your reply! But my use case is a bit different. I'm building a real-time VR180 monitoring app, so I process the video frame by frame to reduce latency. Currently I use an STMap to convert the fisheye image to equirectangular format and render it onto a half sphere. The AVPlayer approach is great, but I won't be using AVPlayer... Anyway, thanks very much for your reply!
