Hello everyone,
Is it possible to set AVCaptureMovieFileOutput to record the audio for my HEVC video as 44.1 kHz, 16-bit mono PCM? If yes, how does it work?
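In case it clarifies what I'm after, this is the kind of configuration I have in mind; a minimal sketch, assuming setOutputSettings(_:for:) accepts a Linear PCM dictionary for the audio connection on this platform (I have not confirmed which keys are honored):

import AVFoundation

// Sketch only: assumes the movie file output accepts a Linear PCM dictionary
// for its audio connection; which keys are honored is platform-dependent.
func configurePCMAudio(for movieOutput: AVCaptureMovieFileOutput) {
    guard let audioConnection = movieOutput.connection(with: .audio) else { return }
    let pcmSettings: [String: Any] = [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVSampleRateKey: 44_100,
        AVNumberOfChannelsKey: 1,
        AVLinearPCMBitDepthKey: 16,
        AVLinearPCMIsFloatKey: false,
        AVLinearPCMIsBigEndianKey: false,
        AVLinearPCMIsNonInterleaved: false
    ]
    movieOutput.setOutputSettings(pcmSettings, for: audioConnection)
}

If the movie file output rejects these keys, I assume the fallback would be AVCaptureAudioDataOutput plus AVAssetWriter, which does accept a full Linear PCM settings dictionary.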
Thank you in advance
Video
RSS for tag: Dive into the world of video on Apple platforms, exploring ways to integrate video functionalities within your iOS, iPadOS, macOS, tvOS, visionOS, or watchOS app.
On M-series machines, VideoToolbox GPU compression takes a YUV 4:2:2 input (kCVPixelFormatType_422YpCbCr8BiPlanarVideoRange), yet the compressed JPEG output is YUV 4:2:0. On the Intel series, GPU compression requires a YUV 4:2:0 input (kCVPixelFormatType_420YpCbCr8Planar), and the compressed JPEG output is YUV 4:2:2. In both cases the output format after compression is not consistent with the input format. Does VideoToolbox GPU compression support outputting YUV 4:2:2 or YUV 4:4:4 JPEG images and H.264 streams?
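For context, a minimal sketch of how the input pixel format is declared when the compression session is created (this only declares the input; as far as I can tell there is no attribute here that requests a particular output chroma subsampling, which is exactly my question):

import CoreMedia
import CoreVideo
import VideoToolbox

// Sketch: a JPEG compression session whose declared input is 4:2:2 bi-planar.
// Whether the encoder can emit 4:2:2 or 4:4:4 JPEG output is the open question.
var session: VTCompressionSession?
let sourceAttributes: [CFString: Any] = [
    kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_422YpCbCr8BiPlanarVideoRange,
    kCVPixelBufferWidthKey: 1920,
    kCVPixelBufferHeightKey: 1080
]
let status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 1920,
    height: 1080,
    codecType: kCMVideoCodecType_JPEG,
    encoderSpecification: nil,
    imageBufferAttributes: sourceAttributes as CFDictionary,
    compressedDataAllocator: nil,
    outputCallback: nil,
    refcon: nil,
    compressionSessionOut: &session
)
print("VTCompressionSessionCreate status:", status)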
Hi.
I know that playing videos from Apple Music was not possible with iOS 17; the only workaround was to open the Music app.
My question is whether anybody found a solution for iOS 18 (Beta).
Thanks,
Dirk
OS: visionOS 1.0
Xcode: 15.2
In the application under development, the following reproduces the problem:
Open an ImmersiveSpace
Add a VideoPlayerComponent to an Entity
Play an 8K video
The app crashes
The Apple logo appears and the system returns to the Home view
However, the problem does not occur if I build an app containing only the 8K video playback part.
Error Log
apply fence tx failed (client=0x6fbf0fcc) [0xfffffecc (ipc/mig) server died]
Failed to commit transaction (client=0x58510d43) [0x10000003 (ipc/send) invalid destination port]
nw_read_request_report [C1] Receive failed with error "No message available on STREAM"
nw_protocol_socket_reset_linger [C1:2] setsockopt SO_LINGER failed [22: Invalid argument]
Failed to set override status for bind point component member.
Message from debugger: Terminated due to signal 9
I can't share the entire application, but is anyone else experiencing the same problem?
Is this a memory issue?
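For reference, the playback setup is essentially the following (a simplified sketch; the file name is a placeholder for the real 8K asset):

import AVFoundation
import RealityKit

// Simplified sketch of the setup described above; "video8k" is a placeholder name.
func makeVideoEntity() -> Entity {
    let entity = Entity()
    guard let url = Bundle.main.url(forResource: "video8k", withExtension: "mov") else {
        return entity
    }
    let player = AVPlayer(url: url)
    entity.components.set(VideoPlayerComponent(avPlayer: player))
    player.play()
    return entity
}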
A small number of crashes are being reported on Firebase. When the insertTimeRange:ofAsset:atTime:error: method of AVMutableComposition is called, a crash occurs with the error message -[__NSArrayM insertObject:atIndex:]: object cannot be nil.
Most of the reports come from iOS 17.0 and above.
Here's my code:
- (AVMutableComposition *)createtrimAsset:(AVAsset *)asset
                              andStartTime:(CGFloat)startTime
                                   endTime:(CGFloat)endTime {
    NSError *error = nil;
    int32_t timescale = 1000000;
    AVMutableComposition *mutableComposition = [AVMutableComposition composition];
    // startTime/endTime are fractions (0–1) of the asset duration.
    CMTime sStartTime = CMTimeMakeWithSeconds(CMTimeGetSeconds(asset.duration) * startTime, timescale);
    CMTime eEndTime = CMTimeMakeWithSeconds(CMTimeGetSeconds(asset.duration) * endTime, timescale);
    // Note: error and the return value of the insertion are not checked here.
    [mutableComposition insertTimeRange:CMTimeRangeMake(sStartTime, CMTimeSubtract(eEndTime, sStartTime))
                                ofAsset:asset
                                 atTime:kCMTimeZero
                                  error:&error];
    return mutableComposition;
}
I attempted to reproduce this crash by deliberately setting the timeRange or asset to unusual values, such as asset = nil, asset.duration = 0, or asset.duration = NAN, but all attempts failed to reproduce it.
What could be causing this exception? Any advice would be of great help to me.
I'm working on a streaming tvOS app, and as you know there are mostly two types of video streams: live and VOD. AVPlayerViewController handles these stream types by showing the respective playback controls.
Recently I got a task to implement synchronous VOD playback ("syncVod"), where we need to simulate live playback while actually playing a VOD stream.
To simulate live playback, the following needs to be handled:
1. Disabling scrubbing via the remote. (Done: playerVc.requiresLinearPlayback = true)
2. Disabling the info panel view with its "Play From Beginning" button. (Done: playerVc.playbackControlsIncludeInfoViews = false)
3. Disabling the play/pause button. (Done, though not ideally: in a rate-change observer, if player.rate == 0 && playbackMode == .syncVod { player.play(); return } — see the sketch below. Why it's not ideal: tapping the remote causes a short hiccup in playback, but playback resumes and no actual pause happens.)
4. Hiding the progress bar and time labels. :(
Point 4 is the main problem: we can't hide the progress bar and its related UI elements (the time labels) specifically; we can only hide all playback controls with playerVc.showsPlaybackControls = false. The issue is that I have custom buttons in transportBarCustomMenuItems, so hiding all playback controls is not an option for me.
Implementing a custom playback controls panel is a heavy lift, but as of now it seems like the only proper way to implement syncVod playback.
Has anyone faced a similar issue and resolved it without implementing a custom playback controls panel? Is there a way to hide only the progress bar in tvOS AVPlayerViewController?
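For completeness, the rate-change workaround from point 3 looks roughly like this (a sketch; PlaybackMode is my own app state, not an AVKit API):

import AVKit

// Sketch of point 3: resume playback whenever the user pauses in syncVod mode.
// PlaybackMode is app-specific state, not part of AVKit.
enum PlaybackMode { case normal, syncVod }

final class SyncVodEnforcer {
    private var rateObservation: NSKeyValueObservation?

    func enforce(on player: AVPlayer, mode: PlaybackMode) {
        rateObservation = player.observe(\.rate, options: [.new]) { player, _ in
            if player.rate == 0 && mode == .syncVod {
                player.play()   // short hiccup, but no visible pause
            }
        }
    }
}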
My app needs a specific scenario: play a video when my iPhone is held close to an NFC tag. The app reads data from the NFC tag, and that data tells us which video to play.
I tried writing a URL scheme or Universal Link to the NFC tags, but all of these approaches pop up a system notification instead of launching my app and playing a video directly. How should I design my app?
Please give me some advice, thanks!
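For context, this is the in-app reading flow I have in mind; a sketch using Core NFC, where the idea of storing a plain video-identifier string on the tag is my own assumption:

import CoreNFC

// Sketch: read an NDEF text record in-app and map it to a video identifier.
// Storing a plain text identifier on the tag is an assumption of this sketch.
final class TagVideoReader: NSObject, NFCNDEFReaderSessionDelegate {
    private var session: NFCNDEFReaderSession?
    var onVideoID: ((String) -> Void)?

    func begin() {
        session = NFCNDEFReaderSession(delegate: self, queue: nil, invalidateAfterFirstRead: true)
        session?.alertMessage = "Hold your iPhone near the tag."
        session?.begin()
    }

    func readerSession(_ session: NFCNDEFReaderSession, didDetectNDEFs messages: [NFCNDEFMessage]) {
        for record in messages.flatMap(\.records) {
            let (text, _) = record.wellKnownTypeTextPayload()
            if let videoID = text {
                DispatchQueue.main.async { self.onVideoID?(videoID) }
            }
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession, didInvalidateWithError error: Error) {
        // Session ended or failed; nothing else to do in this sketch.
    }
}

My understanding is that an in-app NFCNDEFReaderSession like this reads the tag directly without the background-tag-reading notification, but it only works while the app is in the foreground.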
I have an MP4 file (around 25 minutes long) that I need to play, but if I open it in QuickTime Player or AVPlayer, it shows as around 55 minutes in both. I don't understand why this happens. The duration shows correctly when opening the file in a browser or with VS Code's built-in video playback.
Here is the link to the file:
https://www.dropbox.com/scl/fi/hbg59uqx8xdpiqbnx5wz8/videoplayback-1.mp4?rlkey=7o8l8m7j8dhq0o6f3zssgv9bd&st=5lt6apug&dl=0
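For what it's worth, this is the check I plan to run to compare the container-level duration with the per-track durations (a sketch):

import AVFoundation

// Sketch: compare the movie-level duration with each track's duration.
// A mismatch would point at the container's duration/edit metadata.
func inspectDurations(of url: URL) async throws {
    let asset = AVURLAsset(url: url)
    let duration = try await asset.load(.duration)
    print("asset duration:", duration.seconds)

    for track in try await asset.load(.tracks) {
        let range = try await track.load(.timeRange)
        print("track \(track.trackID) duration:", range.duration.seconds)
    }
}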
Hi,
I am developing an app that has a WKWebView which can open sites like YouTube. The app is sandboxed, as it is meant to be uploaded to the Mac App Store. It has a Picture in Picture (PiP) feature: we start native PiP by running JavaScript in the WKWebView that triggers PiP.
This works well when running the code from Xcode with the Debug scheme. But when we run a release build (archived, or launched directly from the build folder in Finder), the WKWebView is unable to launch the PiP agent, so the native PiP window never appears, even though the site shows that PiP is active and we can hear the audio playing. PiPAgent does not show up in Activity Monitor either.
Why does this not work in a release build outside Xcode? I would appreciate any technical help with this.
Thanks!
I recently bought an Insta360 Flow gimbal. When recording video with the Insta360 Flow app, I cannot see the location in the Apple Photos app or any other Apple apps. However, I can see the location in the Windows Photos app once I download the videos to my Windows PC. The location is also visible in an Android app once I share the video through my Google account.
With an EXIF app, I can see the location in the EXIF metadata table as well, but it is still not shown as a location.
exiftool on my PC can also see the metadata, including the location, as in the attached screenshot.
Compared to video shot with the built-in Camera app, I cannot find any difference in the location metadata.
What could be wrong? I contacted Insta360 app support, but they do not seem to understand what's going on; they just keep asking very basic questions like whether I enabled GPS location access and whether I am shooting video.
I also contacted Apple Support, and they just say it's a third-party issue and refuse to help further. If it's really a third-party issue, how come the location data is actually embedded as metadata, and a Windows PC and an Android device can see it? By the way, I AirDropped this video to all my Apple devices (an iPhone 15 Ultra, an iPad Air, and a very old iPhone), and none of them can see the location.
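In case it helps, this is how I'm checking for the QuickTime-level location key; my assumption (not confirmed by documentation) is that Photos looks for com.apple.quicktime.location.ISO6709 specifically:

import AVFoundation

// Sketch: look for the QuickTime location item that (I assume) Photos reads.
func printLocationMetadata(of url: URL) async throws {
    let asset = AVURLAsset(url: url)
    let metadata = try await asset.load(.metadata)

    let locationItems = AVMetadataItem.metadataItems(
        from: metadata,
        filteredByIdentifier: .quickTimeMetadataLocationISO6709
    )
    if locationItems.isEmpty {
        print("No com.apple.quicktime.location.ISO6709 item found")
    }
    for item in locationItems {
        let value = try await item.load(.stringValue)
        print("ISO 6709 location:", value ?? "nil")
    }
}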
I am using Metal for rendering. When calling Metal's newCommandQueue, there is a certain probability that I get a nil return value. However, MTLCreateSystemDefaultDevice returns a non-nil value, which means my device supports Metal. What causes newCommandQueue to return nil? Is there any way to avoid this situation? Thank you.
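For reference, the creation path is just the following (a sketch showing where the nil shows up):

import Metal

// Sketch: device creation succeeds, but makeCommandQueue() occasionally returns nil.
func makeQueue() -> MTLCommandQueue? {
    guard let device = MTLCreateSystemDefaultDevice() else {
        print("Metal is not supported on this device")
        return nil
    }
    guard let queue = device.makeCommandQueue() else {
        print("makeCommandQueue() returned nil")   // the case I am asking about
        return nil
    }
    return queue
}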
Which framework should I use to capture the screen of a device connected to the Mac, the way OBS or QuickTime Player do when an iOS device is connected via USB? I tried listing devices with AVFoundation and ScreenCaptureKit, but only the Mac's camera, microphone, and displays are listed.
When you select New Movie Recording in QuickTime Player, you can choose a connected iPad or iPhone and record its screen. Same with OBS.
What is the way to do this in my own macOS app?
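One approach I have seen referenced for this (I have not found it in official documentation, so treat it as an assumption) is to opt in to CoreMediaIO "screen capture" devices, after which a connected iPhone or iPad shows up as an AVCapture device the way QuickTime Player and OBS see it. A sketch:

import AVFoundation
import CoreMediaIO

// Sketch: opt in to iOS "screen capture" capture devices via CoreMediaIO,
// then discover them with AVFoundation. Not an officially documented workflow.
func findConnectedDeviceScreens() -> [AVCaptureDevice] {
    var address = CMIOObjectPropertyAddress(
        mSelector: CMIOObjectPropertySelector(kCMIOHardwarePropertyAllowScreenCaptureDevices),
        mScope: CMIOObjectPropertyScope(kCMIOObjectPropertyScopeGlobal),
        mElement: CMIOObjectPropertyElement(0)   // main (master) element
    )
    var allow: UInt32 = 1
    CMIOObjectSetPropertyData(
        CMIOObjectID(kCMIOObjectSystemObject), &address, 0, nil,
        UInt32(MemoryLayout<UInt32>.size), &allow
    )

    // iOS devices appear as external, muxed capture devices after the opt-in
    // (.external on recent macOS; older systems used .externalUnknown).
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.external],
        mediaType: .muxed,
        position: .unspecified
    )
    return discovery.devices
}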
When I used the exportPresetsCompatibleWithAsset interface on iOS 17, everything worked fine. However, after upgrading my iPhone XR to iOS 18, calling exportPresetsCompatibleWithAsset to transcode 4K assets returns success, but saving to the photo album returns error 3302.
Is it because this interface was already deprecated in iOS 16?
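For reference, this is the per-preset compatibility check I understand is the suggested replacement; a sketch, assuming that migration is in fact what is intended:

import AVFoundation

// Sketch: check one preset against the asset instead of listing all
// compatible presets (assumed replacement for exportPresetsCompatibleWithAsset).
func checkExportCompatibility(of asset: AVAsset) {
    AVAssetExportSession.determineCompatibility(
        ofExportPreset: AVAssetExportPresetHEVCHighestQuality,
        with: asset,
        outputFileType: .mp4
    ) { isCompatible in
        print("HEVC highest-quality preset compatible:", isCompatible)
    }
}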
I'm trying to download M3U8 media on watchOS by this code:
let configuration = URLSessionConfiguration.background(withIdentifier: "com.id")
let session = AVAssetDownloadURLSession(
    configuration: configuration,
    assetDownloadDelegate: M3U8DownloadDelegate.shared,
    delegateQueue: .main
)
let asset = AVURLAsset(url: URL(string: mediaLink)!)
let downloadTask = session.makeAssetDownloadTask(downloadConfiguration: .init(asset: asset, title: ""))
downloadTask.resume()

m3u8DownloadObservation = downloadTask.progress.observe(\.fractionCompleted) { progress, _ in
    print(progress)
}
But downloadTask.progress is always zero, and the observation is never called.
How can I get the progress correctly?
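Should I be relying on the delegate callback instead? My understanding (not verified on watchOS) is that asset downloads report loading progress like this; a sketch of what my delegate would implement:

import AVFoundation

// Sketch: AVAssetDownloadDelegate progress callback. Whether this fires on
// watchOS for the configuration above is exactly what I'm unsure about.
final class M3U8DownloadDelegate: NSObject, AVAssetDownloadDelegate {
    static let shared = M3U8DownloadDelegate()

    func urlSession(_ session: URLSession,
                    assetDownloadTask: AVAssetDownloadTask,
                    didLoad timeRange: CMTimeRange,
                    totalTimeRangesLoaded loadedTimeRanges: [NSValue],
                    timeRangeExpectedToLoad: CMTimeRange) {
        let loaded = loadedTimeRanges
            .map { $0.timeRangeValue.duration.seconds }
            .reduce(0, +)
        let expected = timeRangeExpectedToLoad.duration.seconds
        print("progress:", expected > 0 ? loaded / expected : 0)
    }

    func urlSession(_ session: URLSession,
                    assetDownloadTask: AVAssetDownloadTask,
                    didFinishDownloadingTo location: URL) {
        print("downloaded to:", location)
    }
}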
When I try to run mediafilesegmenter with the following
mediafilesegmenter -iso-fragmented -z frame_index.m3u8 \
    -f 720p -i index_720p \
    -k key.bin -stream_encrypt -P \
    -K <url> \
    -B seg_720p -t 6 source.mp4
I get the following output
ISO fragmented mode, forcing segments to start with I-Frame
Processing file source.mp4
track 2 of source.mp4 contains more than one valid time mapping
Unable to find any valid tracks to segment.
Segmenting failed (-15650).
What specifically should I be looking for in my MP4 that would be causing this?
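Since the error mentions "more than one valid time mapping", here is a sketch of how I plan to inspect the edit segments (time mappings) of each track in the source file:

import AVFoundation

// Sketch: print each track's edit segments; "more than one valid time mapping"
// presumably corresponds to a track with multiple segments.
func printTrackSegments(of url: URL) async throws {
    let asset = AVURLAsset(url: url)
    for track in try await asset.load(.tracks) {
        let segments = try await track.load(.segments)
        print("track \(track.trackID): \(segments.count) segment(s)")
        for segment in segments {
            let mapping = segment.timeMapping
            print("  source start \(mapping.source.start.seconds)s,",
                  "target start \(mapping.target.start.seconds)s,",
                  "duration \(mapping.target.duration.seconds)s,",
                  "empty: \(segment.isEmpty)")
        }
    }
}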
Hi, I'm developing a simple app to display embedded VR180 3D video.
I used a hemisphere and projected the video onto it as its material. The hemisphere sits in the environment at a fixed y value of 1.35, which is fine for a seated person but not ideal for a standing person, because the stereoscopic effect is no longer correct. In the Apple TV+ and Kandao apps, I noticed that the position of the video is anchored to the Apple Vision Pro. I tried using an AnchorEntity on the head with trackingMode .once, but then there is a rotation problem: the hemisphere starts out with the rotation of the head.
Is there a solution, for example, to anchor the hemisphere only to the translation and not to the rotation of the head?
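The direction I'm considering is to drop the AnchorEntity and instead query the device transform with ARKit each frame, applying only its translation to the hemisphere; a sketch (details not yet validated, and it assumes an ImmersiveSpace where world tracking is available):

import ARKit
import QuartzCore
import RealityKit

// Sketch: follow the head's translation but not its rotation.
// Assumes an ImmersiveSpace where ARKitSession/WorldTrackingProvider can run.
@MainActor
final class HeadTranslationFollower {
    private let session = ARKitSession()
    private let worldTracking = WorldTrackingProvider()

    func start() async throws {
        try await session.run([worldTracking])
    }

    /// Call every frame (e.g. from a RealityKit update loop) with the hemisphere entity.
    func update(_ hemisphere: Entity) {
        guard let device = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else {
            return
        }
        let transform = device.originFromAnchorTransform
        // Keep only the translation column of the head transform; ignore rotation.
        hemisphere.position = SIMD3<Float>(transform.columns.3.x,
                                           transform.columns.3.y,
                                           transform.columns.3.z)
    }
}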
Hi!
I am getting AVErrorMediaServicesWereReset (-11819) thrown as an error by AVMutableCompositionTrack.insertTimeRange(_:of:at:) when trying to insert part of an AVAssetTrack into my video track (an AVMutableCompositionTrack) in my AVMutableComposition. This does not happen every time I build an AVMutableComposition, but it happens frequently.
I am also getting this error thrown when trying to export using an AVAssetExportSession.
Is there any insight into what can cause this error in these scenarios?
Thanks!
Is it possible to put a video on loop and have it autoplay on visionOS? We used AVPlayerViewController.
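A minimal sketch of the standard loop-and-autoplay setup, assuming AVPlayerViewController on visionOS accepts an AVQueuePlayer:

import AVKit

// Sketch: loop + autoplay with AVQueuePlayer and AVPlayerLooper.
struct LoopingPlayback {
    let controller = AVPlayerViewController()
    private let queuePlayer = AVQueuePlayer()
    private var looper: AVPlayerLooper?   // must stay alive while looping

    init(url: URL) {
        looper = AVPlayerLooper(player: queuePlayer, templateItem: AVPlayerItem(url: url))
        controller.player = queuePlayer
        queuePlayer.play()   // autoplay
    }
}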
Before iOS 18, the FFmpeg subtitle synthesis log:
[Parsed_subtitles_0 @ 0x301b37650] Using font provider fontconfig
[Parsed_subtitles_0 @ 0x301b37650] fontselect: (PingFangSC-Semibold, 400, 0) -> /System/Library/Fonts/LanguageSupport/PingFang.ttc, 8, PingFangSC-Semibold
On iOS 18, the FFmpeg subtitle synthesis log:
[Parsed_subtitles_0 @ 0x303e825d0] Using font provider fontconfig
[Parsed_subtitles_0 @ 0x303e825d0] fontselect: (PingFangSC-Regular, 400, 0) -> /System/Library/Fonts/Core/HiraginoKakuGothic.ttc, 10, . HiraKakuInterface-W4
[Parsed_subtitles_0 @ 0x303e825d0] Glyph 0x8FD9 not found, selecting one more font for (PingFangSC-Regular, 400, 0)
[Parsed_subtitles_0 @ 0x303e825d0] fontselect: (PingFangSC-Regular, 400, 0) -> /System/Library/Fonts/Core/LastResort.otf, 0, LastResort
With otherwise normal text, some characters come out garbled, such as 0x8FD9 mentioned in the log, which corresponds to "这". The garbled fonts affect Traditional Chinese and Simplified Chinese; Russian, Korean, Japanese, French, English, and German still render with normal fonts.
Comparing the two logs, the selected font has changed on iOS 18: instead of continuing to use PingFang.ttc, fontconfig selects HiraginoKakuGothic.ttc.
Has iOS 18 changed the PingFang font directory, so that FFmpeg can no longer use the system font for subtitle synthesis?
When I use ReplayKit's exportClipToURL function on iOS to capture a 15-second replay, the resulting video quality is poor, with snowy artifacts in the video and distorted audio.
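For reference, the capture path is roughly the following; a sketch where the output URL and the 15-second duration are placeholders matching the scenario above:

import ReplayKit

// Sketch of the clip-buffering path described above.
let recorder = RPScreenRecorder.shared()

func startBuffering() {
    recorder.startClipBuffering { error in
        if let error { print("startClipBuffering failed:", error) }
    }
}

// Called later, once the moment worth keeping has happened.
func exportLastFifteenSeconds() {
    let url = FileManager.default.temporaryDirectory.appendingPathComponent("clip.mp4")
    recorder.exportClip(to: url, duration: 15) { error in
        print(error.map { "export failed: \($0)" } ?? "exported to \(url)")
    }
}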