Posts

Post not yet marked as solved
0 Replies
797 Views
Hello,

I just finished writing a rudimentary H.264 encoder using Video Toolbox on macOS. Apart from some tricks (certain VTSession properties seem to take effect only if set before the VTSession is created), it is a really easy and nice API. But I have one question: I'd like to be able to peek (while encoding) at what my encoded frame looks like. Obviously this is lossy compression, so an encoded frame won't reconstruct to exactly the same image that I fed in. I am interested in how much has changed, maybe computing some metrics, etc. The question is: can I get the last encoded frame's reconstructed image from the encoder?

Of course I can create an H.264 decoder and feed it the outgoing encoded frames, but this is hardly an ideal solution, because I add the cost of decoding. And I know that the H.264 encoder has to "know" what the encoded image will look like (otherwise how could it compute motion estimation?).

Any help appreciated.
Michal
Posted by MikeAlpha. Last updated.
Post not yet marked as solved
6 Replies
2.3k Views
Hello,

I am using ReplayKit to record the iOS screen (system-wide recording, started from Control Center) on an iPhone 7 with iOS 11.1. So far so good: I am getting frames, pushing them into Video Toolbox, then getting H.264 NALs which I send to the network destination. But I have an issue with device rotation. Regardless of device orientation I always get 750x1334 frames (as if the device were held in portrait mode), and the content of the frames matches: if the device is held in portrait, the displayed image is correct; however, in landscape the frames are rotated 90/270 degrees.

I spent some time Googling, and now I know that I can use CMGetAttachment on the video frames sent from ReplayKit and read RPVideoSampleOrientationKey, which has the correct orientation set. This works fine, and now I'd like to rotate the landscape frames properly, for example into 1334x750. And here lies the problem: I searched Video Toolbox (especially the Pixel Transfer stuff) and only found how to set the clean aperture and scaling, but nothing about rotation.

So right now I am developing a solution that uses vImage to take the image from one CVImageBuffer and write it, rotated, into another. But this is hardly an ideal solution. Maybe there is a VT way of doing a simple 90-degree rotation?

Regards
Michal
Posted by MikeAlpha. Last updated.
Post not yet marked as solved
11 Replies
3.9k Views
I am trying to develop a prototype Broadcast Upload Extension. I start screen recording from Control Center with a long press. So far so good: I was able to get video frames, compress them, and write them to a file. I was also able to get mic audio. Alas, I am not able to get ANY app sound. This is what my RPBroadcastSampleHandler subclass's processSampleBuffer looks like:

    - (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer
                       withType:(RPSampleBufferType)sampleBufferType {
        switch (sampleBufferType) {
            case RPSampleBufferTypeVideo:
                [self videoHandler:sampleBuffer];
                break;
            case RPSampleBufferTypeAudioApp:
                [self audioHandler:sampleBuffer output:audio_output];
                break;
            case RPSampleBufferTypeAudioMic:
                [self audioHandler:sampleBuffer output:mic_output];
                break;
            default:
                break;
        }
    }

Now, if only mic recording is allowed, I get mic data in mic_output, so the audioHandler procedure is OK. But I never get anything (zero bytes) in audio_output, so I am thinking that processSampleBuffer simply never gets called with sampleBufferType equal to RPSampleBufferTypeAudioApp.

At first I thought it must be related to my using the YouTube app for sound. So I tried other sound sources (playing a video I recorded in Photos, setting an alarm timer, and so on). Still zero bytes of audio. Do I need to enable audio recording elsewhere? Or maybe I am using the wrong constant? But the documentation lists just Video, AudioApp, and AudioMic, and the other two work. Is it a bug?

I was using an iPhone 7, on both iOS 11.0 and 11.1 (Xcode 9.1), and got the same results.

Please help.
Regards
Michal
Posted by MikeAlpha. Last updated.