Hi, I’ve been experiencing a strange issue with video playback on my iPhone. While watching videos, the image will suddenly shift: it becomes more greyish, sometimes briefly goes black, and then returns to normal bright quality. This can happen multiple times during a single video.
This is not limited to the Photos app. I’ve seen it happen:
In the Photos app when playing videos I recorded myself
In Snapchat when watching videos sent by others
Occasionally in other social media apps as well
Additional details:
HDR Video is turned off
Apple ProRes is turned off
Tried both 4K 60fps and 4K 30fps
Camera format set to “Most Compatible”
Low Power Mode is off
Issue happens whether the phone is cool or warm
Doesn’t seem related to the video file itself; the same file exported to another device looks fine all the way through
These exact same videos play back completely normally on my iPad with no brightness or contrast shifts at all
I’m currently on the iOS 17 public beta, but this issue was happening before I installed the beta as well, so it’s not beta-specific
It almost feels like the display or system is switching between different brightness/contrast profiles mid-playback, regardless of the app.
Has anyone else experienced this, and is there a way to disable this behavior so the brightness and color stay consistent during video playback?
Thanks in advance!
When I try to send a DRM-protected video via AirPlay to an Apple TV, the license request is made twice instead of once, as it normally is on iOS.
We only allow one license request per session for security reasons, so the second request fails and the video won't play.
We've tested DRM-protected videos without token usage limits and it works, but this creates a security hole in our system.
Why is the license requested twice in func contentKeySession(_ session: AVContentKeySession, didProvide keyRequest: AVContentKeyRequest)?
Is there a way to prevent this?
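For reference, this is the direction we're exploring on the app side while we wait for clarification: caching the CKC we already obtained in this session and answering a repeated key request from that cache instead of hitting the license server again. This is only a rough sketch, the cache and the requestLicense(spc:) helper are placeholders for our own code, and whether a cached CKC would even be accepted for the AirPlay-originated request is part of the open question.

import AVFoundation

final class KeyRequestHandler: NSObject, AVContentKeySessionDelegate {
    // Hypothetical cache: CKC data already obtained in this session, keyed by the
    // content key identifier, so a duplicate request can be answered locally
    // instead of triggering a second call to the license server.
    private var cachedKeyData = [String: Data]()
    private let appCertificate = Data() // placeholder for the FairPlay application certificate

    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVContentKeyRequest) {
        let identifier = keyRequest.identifier as? String ?? ""

        // Answer a repeated request for the same key from the cache.
        if let ckc = cachedKeyData[identifier] {
            keyRequest.processContentKeyResponse(
                AVContentKeyResponse(fairPlayStreamingKeyResponseData: ckc))
            return
        }

        // Normal FairPlay flow: build an SPC, fetch the CKC, hand it back.
        keyRequest.makeStreamingContentKeyRequestData(
            forApp: appCertificate,
            contentIdentifier: identifier.data(using: .utf8),
            options: nil
        ) { [weak self] spcData, error in
            guard let spcData = spcData, error == nil else { return }
            self?.requestLicense(spc: spcData) { ckc in
                self?.cachedKeyData[identifier] = ckc
                keyRequest.processContentKeyResponse(
                    AVContentKeyResponse(fairPlayStreamingKeyResponseData: ckc))
            }
        }
    }

    private func requestLicense(spc: Data, completion: @escaping (Data) -> Void) {
        // Placeholder: send the SPC to our key server and return the CKC.
    }
}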
I'm writing some camera functionality that uses AVCaptureVideoDataOutput.
I've set it up so that it calls my AVCaptureVideoDataOutputSampleBufferDelegate on a background thread, by making my own dispatch_queue and configuring the AVCaptureVideoDataOutput.
My question, then, is: if I configure my AVCaptureSession differently, or even stop it altogether, is that guaranteed to flush all pending jobs on my background thread?
I have a more practical example below, showing how the background thread accesses a resource owned by the foreground thread, and I wonder when/how it's safe to clean up that resource.
I have a setup similar to the following:
// Foreground thread logic
dispatch_queue_t queue = dispatch_queue_create("avf_camera_queue", nullptr);
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
setupInputDevice(captureSession); // Connects the AVCaptureDevice...
// Store some arbitrary data to be attached to the frame, stored on the foreground thread
FrameMetaData frameMetaData = ...;
MySampleBufferDelegate *sampleBufferDelegate = [[MySampleBufferDelegate alloc] init];
// Capture frameMetaData by reference in lambda
[sampleBufferDelegate setFrameMetaDataGetter: [&frameMetaData]() { return &frameMetaData; }];
AVCaptureVideoDataOutput *captureVideoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
[captureVideoDataOutput setSampleBufferDelegate:sampleBufferDelegate
queue:queue];
[captureSession addOutput:captureVideoDataOutput];
[captureSession startRunning];
[captureSession stopRunning];
// Is it now safe to destroy frameMetaData, or do we need manual barrier?
And then in MySampleBufferDelegate:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection
{
// Invokes the callback set above
FrameMetaData *frameMetaData = frameMetaDataGetter();
emitSampleBuffer(sampleBuffer, frameMetaData);
}
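For what it's worth, the only approach I've found that seems to give a clear guarantee is to synchronously drain the delegate queue after stopping the session, but I'd like confirmation that this is actually necessary and sufficient. My understanding (an assumption, not something I've found documented) is that stopRunning prevents new buffers from being delivered but does not wait for callbacks already enqueued on the delegate queue. Sketched in Swift for brevity; the same applies to the dispatch_queue_t above via dispatch_sync:

import AVFoundation

// Minimal sketch of the teardown order, assuming a serial delegate queue.
let queue = DispatchQueue(label: "avf_camera_queue")
let captureSession = AVCaptureSession()
// ... inputs/outputs configured, sample buffer delegate set with `queue` ...

captureSession.stopRunning()

// Draining the serial queue with an empty sync block guarantees that every
// previously enqueued captureOutput(_:didOutput:from:) call has returned,
// because a serial queue executes blocks in FIFO order.
queue.sync { }

// Only after the drain is it safe to free state the delegate reads by reference
// (frameMetaData in the Objective-C++ example above).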
Hi,
I have an app that displays tens of short (<1 MB) MP4 videos stored on a remote server in a vertical UICollectionView that has horizontally scrollable sections.
I'm caching all MP4 files on disk after downloading, and I also have an in-memory cache that holds a limited number (around 30) of players. The players I'm using are simple views that wrap an AVPlayerLayer and its AVPlayerItem, along with a few additional UI components.
The scrolling performance was good before iOS 26, but with the release of iOS 26, I noticed significant stuttering during scrolling while creating players with a fileUrl. It happens even if I use the same video file cached on disk for each cell for testing.
I also started getting this kind of log messages after the players are deinitialized:
<<<< PlayerRemoteXPC >>>> signalled err=-12785 at <>:1107
<<<< PlayerRemoteXPC >>>> signalled err=-12785 at <>:1095
<<<< PlayerRemoteXPC >>>> signalled err=-12785 at <>:1095
There's also another log message that I see occasionally, but I don't know what triggers it.
<< FigXPC >> signalled err=-16152 at <>:1683
Is there anyone else that experienced this kind of problem with the latest release?
Also, I'm wondering what's the best way to resolve the issue. I could increase the size of the memory cache to something large like 100, but I'm not sure if it is an acceptable solution because:
1- There will be 100 player instances in memory at all times.
2- There will still be stuttering during the initial loading of the videos from the web.
Any help is appreciated!
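One direction I'm considering (an assumption on my part, not something I've confirmed is what changed in iOS 26) is making sure all asset loading happens off the scroll path, so that cell configuration only attaches an item that is already prepared. A rough sketch; makePlayerItem(for:) and cellPlayerView are placeholders for my own code:

import AVFoundation

// Hypothetical helper: prepare an AVPlayerItem asynchronously so that cell
// configuration does not trigger any synchronous media loading.
func makePlayerItem(for fileURL: URL) async throws -> AVPlayerItem {
    let asset = AVURLAsset(url: fileURL)
    // Load up front the properties AVPlayer would otherwise resolve lazily
    // when the item is first used.
    _ = try await asset.load(.isPlayable, .duration)
    return AVPlayerItem(asset: asset)
}

// Usage from cell configuration (cellPlayerView is the AVPlayerLayer-backed
// view described above):
// Task {
//     let item = try await makePlayerItem(for: fileURL)
//     await MainActor.run { cellPlayerView.player?.replaceCurrentItem(with: item) }
// }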
Hello everyone,
I'm looking for a definitive clarification on how to completely disable all video stabilization, including the hardware OIS, using AVFoundation. The goal is to achieve a completely raw, unstabilized video feed, which is crucial when using external equipment like gimbals to avoid conflicting stabilization motions.
My research points to using the AVCaptureConnection property preferredVideoStabilizationMode and setting it to AVCaptureVideoStabilizationMode.off.
The documentation for the .off case states:
A mode that doesn’t stabilize video capture.
This description is slightly ambiguous. It's unclear whether this only affects software-level stabilization (EIS, EIS+OIS, etc.) or whether it guarantees the complete deactivation of the physical OIS module. For professional video applications, this is a critical distinction.
So, I'd like to ask the community:
Has anyone been able to definitively confirm that setting preferredVideoStabilizationMode to .off also disables the hardware OIS? Are there any known tests or documentation that prove this behavior?
Is there an alternative or more direct method to ensure the OIS module is physically inactive during video capture?
What is the community's best practice for ensuring absolutely no stabilization is applied to the video pipeline?
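For context, this is essentially the configuration in question (a minimal sketch; session and device setup are abbreviated, and whether .off also parks the OIS hardware is exactly what I'm asking):

import AVFoundation

let session = AVCaptureSession()
// ... add the camera's AVCaptureDeviceInput and the desired output here ...
let output = AVCaptureMovieFileOutput()
if session.canAddOutput(output) { session.addOutput(output) }

if let connection = output.connection(with: .video),
   connection.isVideoStabilizationSupported {
    // Requests that no stabilization be applied. Whether this also disables the
    // physical OIS module, or only the software (EIS / cinematic) stages, is the
    // open question above.
    connection.preferredVideoStabilizationMode = .off
}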
Any insights or shared experiences on this topic would be greatly appreciated.
Thank you!
Hi, I downloaded and ran https://developer.apple.com/documentation/realitykit/rendering-stereoscopic-video-with-realitykit
and noticed that memory usage grows linearly.
I replaced the sample video with a different 8K side-by-side video, and the app crashed almost immediately due to a memory leak.
It looks like the culprit is the makeMutablePixelBuffer() function: the allocated pixel buffers are not recycled after being used.
The screenshot is from a physical device.
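As a workaround I'm experimenting with allocating buffers from a CVPixelBufferPool instead of creating a fresh buffer per frame, so Core Video can recycle the backing memory once a buffer is released. This is my own assumption about the cause, not a confirmed fix for the sample; the class below is a rough sketch, not the sample's actual code:

import CoreVideo

// Hypothetical replacement for per-frame allocation: a pool that recycles
// pixel buffers once the renderer releases them.
final class PixelBufferRecycler {
    private var pool: CVPixelBufferPool?

    init(width: Int, height: Int) {
        let attrs: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
            kCVPixelBufferWidthKey as String: width,
            kCVPixelBufferHeightKey as String: height,
            kCVPixelBufferIOSurfacePropertiesKey as String: [:] as [String: Any]
        ]
        CVPixelBufferPoolCreate(kCFAllocatorDefault, nil, attrs as CFDictionary, &pool)
    }

    func makeBuffer() -> CVPixelBuffer? {
        guard let pool = pool else { return nil }
        var buffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &buffer)
        return buffer
    }
}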
Our app plays TS files on an iPhone.
The app fragments the TS files, creates an M3U8 playlist, converts them to HLS (HTTP Live Streaming), and then uses AVPlayer to play the video content.
On a device running iOS 26, after starting playback and seeking, restarting playback causes the video and audio to be out of sync (by about 2-3 seconds depending on the situation).
This also occurs on iPadOS/macOS 26.
This issue was not observed prior to iOS 18.
We are trying to fix this issue on the app side, but we have the following questions:
The behavior of AVPlayer differs between iOS 26 and previous versions. Has there been an intentional change that could explain this, or is it a bug?
We tried pausing before seeking, but it didn’t seem to have any effect. Are there any APIs or workarounds that can improve this?
We would appreciate it if you could tell us any other helpful documents or URLs.
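For reference, this is roughly how we're seeking today; we also tried pausing first with no effect. The zero tolerances are an assumption about what a precise seek should look like, not a confirmed workaround for the iOS 26 behavior:

import AVFoundation

// Sketch of the pause -> precise seek -> resume sequence we've been testing.
func seekPrecisely(_ player: AVPlayer, to time: CMTime) {
    player.pause()
    player.seek(to: time,
                toleranceBefore: .zero,
                toleranceAfter: .zero) { finished in
        // Resume only after the seek has actually completed, to avoid racing
        // the playhead while media data is still being realigned.
        if finished {
            player.play()
        }
    }
}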
Currently, I am using a Broadcast Upload Extension to obtain sample buffer data and transmit the screen-recording data to the app via an App Group and IPC (a local Unix domain socket). However, when the app goes to the background to view videos in the Photos library or other audio/video content, the data transmission stops and the app can no longer receive the screen-recording data. I would like to ask how to solve this problem. I suspect that the system has suspended the extension's screen recording.
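One thing I'm considering on the app side (an assumption on my part, and at best it buys a limited amount of extra runtime rather than preventing suspension) is wrapping the socket-reading work in a background task so the app is not suspended immediately after it leaves the foreground. A rough sketch:

import UIKit

// Sketch: request extra background runtime while the socket is being drained.
// The app is still suspended once the granted time expires.
final class SocketDrainKeeper {
    private var taskID: UIBackgroundTaskIdentifier = .invalid

    func begin() {
        taskID = UIApplication.shared.beginBackgroundTask(withName: "drain-broadcast-socket") { [weak self] in
            self?.end() // Expiration handler: stop cleanly before suspension.
        }
    }

    func end() {
        guard taskID != .invalid else { return }
        UIApplication.shared.endBackgroundTask(taskID)
        taskID = .invalid
    }
}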