Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.


Post · Replies · Boosts · Views · Activity
Disable iOS Screen Mirroring for Apps
Hello Apple, I am concerned about the new iOS Screen Mirroring feature. I have an app that is only meant to be viewed on iPhones (not Macs or other computers) for security reasons. I am assuming that Screen Mirroring uses AirPlay underneath. Is there an API planned or coming that can disable this functionality, or is there a way for my app to opt out of iOS Screen Mirroring? Thanks.
0
0
27
5h
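For the screen-mirroring question above: as far as I know there is no public per-app opt-out, but a common defensive workaround is to observe UIScreen's capture state, which is true while the screen is being recorded, mirrored, or sent over AirPlay, and hide sensitive content while it is set. A minimal sketch, assuming UIKit:

import UIKit

// Sketch of a workaround, not an official opt-out: watch UIScreen.isCaptured
// and let the app hide or blur sensitive views while the screen is captured,
// mirrored, or AirPlayed.
final class CaptureMonitor {
    var onCaptureChange: ((Bool) -> Void)?
    private var token: NSObjectProtocol?

    init() {
        token = NotificationCenter.default.addObserver(
            forName: UIScreen.capturedDidChangeNotification,
            object: nil, queue: .main) { [weak self] _ in
                // Called whenever the capture/mirroring state changes.
                self?.onCaptureChange?(UIScreen.main.isCaptured)
        }
    }

    deinit {
        if let token { NotificationCenter.default.removeObserver(token) }
    }
}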
Thread safety of AudioObject APIs
Are the AudioObject APIs (such as AudioObjectGetPropertyData, AudioObjectSetPropertyData, etc.) thread-safe? That is, for the same AudioObjectID, is it safe to:
- get a property on one thread while setting the same property on another thread
- set the same property on two different threads
- add and remove property listeners on different threads
Put differently, is there any internal synchronization or mutex for this kind of usage, or is the burden on the caller? I was unable to find any documentation either way, which makes me think the APIs are not thread-safe.
0
2
46
10h
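For the thread-safety question above: absent documentation, one defensive approach is to serialize all property access for a given AudioObjectID on the caller's side. A minimal sketch of that idea (the wrapper class and its queue are hypothetical, not part of Core Audio):

import CoreAudio
import Foundation

// Caller-side serialization: every get/set for this object goes through one
// serial queue, so correctness does not depend on any internal HAL locking.
final class SerializedAudioObject {
    private let objectID: AudioObjectID
    private let queue = DispatchQueue(label: "audio-object.\(UUID().uuidString)")

    init(objectID: AudioObjectID) {
        self.objectID = objectID
    }

    func getUInt32(selector: AudioObjectPropertySelector) -> UInt32? {
        queue.sync {
            var address = AudioObjectPropertyAddress(
                mSelector: selector,
                mScope: kAudioObjectPropertyScopeGlobal,
                mElement: kAudioObjectPropertyElementMain)
            var value: UInt32 = 0
            var size = UInt32(MemoryLayout<UInt32>.size)
            let status = AudioObjectGetPropertyData(objectID, &address, 0, nil, &size, &value)
            return status == noErr ? value : nil
        }
    }

    func setUInt32(selector: AudioObjectPropertySelector, value: UInt32) -> OSStatus {
        queue.sync {
            var address = AudioObjectPropertyAddress(
                mSelector: selector,
                mScope: kAudioObjectPropertyScopeGlobal,
                mElement: kAudioObjectPropertyElementMain)
            var newValue = value
            return AudioObjectSetPropertyData(objectID, &address, 0, nil,
                                              UInt32(MemoryLayout<UInt32>.size), &newValue)
        }
    }
}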
Music is not being played through BLE connected headphone on iPhone
I have created a demo iOS app to create a BLE connection with a nearby headphone. I am able to connect to the headphone successfully through my demo app, and I can also see in the iPhone Bluetooth settings that the headphone is connected. But when I play music from Spotify/YouTube, the music is not played through the headphone; it still uses the iPhone speakers. First I scan for surrounding Bluetooth devices through CBCentralManager and then connect to one of the found devices:
cBCenteralManager.scanForPeripherals(withServices: nil, options: nil)
For connecting:
cBCenteralManager.connect(peripheral, options: nil)
Do I need to make any code changes while connecting via BLE? My expectation is that when I connect to the headphone via my demo app, and the same connection is visible in the iPhone Bluetooth settings, then playing music in Spotify/YouTube should route the sound to the headphone and not to the iPhone speakers.
1
0
38
17h
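On the BLE/headphone question above: a GATT connection made with CBCentralManager is separate from the Bluetooth audio (A2DP) link the system uses for playback, so connecting from the app does not by itself change where audio is routed. A small diagnostic sketch (assuming the headset is also paired as an audio device in Settings) that checks the active output route:

import AVFoundation

// Inspect the current audio route to see whether a Bluetooth A2DP output is
// actually in use; if only a GATT link exists, the route will still show the
// built-in speaker.
let route = AVAudioSession.sharedInstance().currentRoute
for output in route.outputs {
    print("Active output: \(output.portName) – \(output.portType.rawValue)")
}
let usesBluetoothAudio = route.outputs.contains { $0.portType == .bluetoothA2DP }
print(usesBluetoothAudio ? "Routed to Bluetooth audio" : "Routed to the built-in speaker")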
When switching ImmersiveSpace, background music played by AVAudioPlayer is not heard
【Procedure】
1. Launch the application.
2. ImmersiveSpace1 is opened and the animation of a 3D object is played.
3. When the animation finishes, ImmersiveSpace1 is dismissed and ImmersiveSpace2 is opened.
【Expected】
When ImmersiveSpace1 is opened, the background music starts playing, and it continues playing after ImmersiveSpace2 is opened.
【Result】
When ImmersiveSpace1 is opened, the BGM plays; when ImmersiveSpace2 is opened, the BGM stops.
【Environment】
- Occurs on an actual device (visionOS 2).
- Does not occur in the simulator.
- Xcode: Version 15.2 (15C500b)
【Log】
Output on the actual device when ImmersiveSpace2 is opened (not output in the simulator):
AVAudioSession_iOS.mm:2223 Server returned an error from destroySession:. Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service with pid 39 named com.apple.audio.AudioSession was invalidated from this process." UserInfo={NSDebugDescription=The connection to service with pid 39 named com.apple.audio.AudioSession was invalidated from this process.}
0
2
69
23h
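For the ImmersiveSpace/AVAudioPlayer issue above, a hedged workaround sketch (an assumption to try, not a confirmed fix): keep a single AVAudioPlayer alive across the space switch and resume playback if the audio session is interrupted or torn down:

import AVFoundation

// Keep one long-lived player for the BGM and re-activate the session and
// resume playback when an interruption ends.
final class BGMPlayer {
    private var player: AVAudioPlayer?
    private var interruptionObserver: NSObjectProtocol?

    func start(url: URL) throws {
        try AVAudioSession.sharedInstance().setCategory(.playback)
        try AVAudioSession.sharedInstance().setActive(true)

        player = try AVAudioPlayer(contentsOf: url)
        player?.numberOfLoops = -1   // loop the background music indefinitely
        player?.play()

        interruptionObserver = NotificationCenter.default.addObserver(
            forName: AVAudioSession.interruptionNotification,
            object: nil, queue: .main) { [weak self] note in
                guard let raw = note.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
                      AVAudioSession.InterruptionType(rawValue: raw) == .ended else { return }
                // Re-activate the session and resume playback after the interruption ends.
                try? AVAudioSession.sharedInstance().setActive(true)
                self?.player?.play()
        }
    }
}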
Frame Discontinuity Reasons
I have an app that uses a MultiCamCaptureSession whose devices are the builtInUltraWideCamera and builtInLiDARDepthCamera cameras. Occasionally, when outside, I get some frame drops due to discontinuity that end in the media services being reset:
[06-24 11:27:13][CameraSession] Capture session runtime error: related decl 'e' for AVError(_nsError: Error Domain=AVFoundationErrorDomain Code=-11819 "Cannot Complete Action" UserInfo={NSLocalizedDescription=Cannot Complete Action, NSLocalizedRecoverySuggestion=Try again later.})
This runtime error notification is always preceded by 4-5 frame drops:
[06-24 11:27:10][CaptureSession] Dropped frame because Discontinuity
Logging the system temperature shows:
[06-24 11:27:10][CaptureSession] Temperature is 'Fair'
I have some suspicion that the frame discontinuity is being caused by the whiteBalanceMode of the capture session; perhaps the algorithm requires 5 recent frames to work. I had a similar problem with the LiDAR depth camera, where with filtering enabled exactly 5 frame drops would make the media services reset. When the whiteBalanceMode is locked I do slightly better, with 10 frame drops before the media services are reset.
Is there any logging utility to determine the actual reason? All of these sample buffers come with no info attachment, only the not-so-useful "Dropped frame because Discontinuity."
Any ideas for solving this would be helpful as well. Maybe tuning the camera to work better with quickly varying lighting conditions?
0
0
65
1d
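On the dropped-frame question above: the only per-buffer information I know of comes from the AVCaptureVideoDataOutputSampleBufferDelegate drop callback, where the reason (and sometimes additional info) is attached to the sample buffer. A minimal sketch of reading both attachments:

import AVFoundation
import CoreMedia

// Log whatever CoreMedia attaches to a dropped buffer. Assumes this object is
// already set as the sample buffer delegate of the AVCaptureVideoDataOutput.
final class DropLogger: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didDrop sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        let reason = CMGetAttachment(sampleBuffer,
                                     key: kCMSampleBufferAttachmentKey_DroppedFrameReason,
                                     attachmentModeOut: nil)
        let info = CMGetAttachment(sampleBuffer,
                                   key: kCMSampleBufferAttachmentKey_DroppedFrameReasonInfo,
                                   attachmentModeOut: nil)
        print("Dropped frame – reason: \(String(describing: reason)), info: \(String(describing: info))")
    }
}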
Integrating Apple Music Subscriptions into a React Native App
Hi everyone, I'm currently developing an iOS app using React Native and recently got accepted into the Apple Music Global Affiliate Program. To fully utilize this opportunity, I need to implement the following functionalities:
- Authorize Apple Music usage
- Play Apple Music within my app
- Identify if a user has an Apple Music subscription
- Initiate and complete Apple Music subscription within my app
I've successfully implemented the first three functionalities using the react-native-apple-music module. Now, I need your help to understand how I can directly trigger the Apple Music subscription process from within my app. Thank you for your help!
0
0
59
1d
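On the subscription question above: on the native side, StoreKit's SKCloudServiceSetupViewController can present Apple's own subscription offer sheet; a React Native bridge module would still be needed around it. A sketch (the affiliate and campaign token values are placeholders):

import StoreKit
import UIKit

// Present the Apple Music subscription offer from the current view controller.
// Tokens below are placeholders for your affiliate/campaign values.
func presentAppleMusicSubscription(from presenter: UIViewController) {
    let controller = SKCloudServiceSetupViewController()
    let options: [SKCloudServiceSetupOptionsKey: Any] = [
        .action: SKCloudServiceSetupAction.subscribe,
        .affiliateToken: "YOUR_AFFILIATE_TOKEN",   // placeholder
        .campaignToken: "YOUR_CAMPAIGN_TOKEN"      // placeholder
    ]
    controller.load(options: options) { didSucceedLoading, error in
        if didSucceedLoading {
            presenter.present(controller, animated: true)
        } else {
            print("Could not load subscription offer: \(String(describing: error))")
        }
    }
}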
HLS Playback Issue with Discontinuity Tag, Low-Bitrate Streams, and Seek on iOS 17
I am writing to report an issue with the playback of HLS (HTTP Live Streaming) streams that I believe is specific to iOS 17. The problem manifests when certain conditions are met during the playback of concatenated HLS segments, particularly those with a low video bitrate. Below I detail the background, symptoms, and steps required to reproduce the issue.

Background: Our business scenario requires concatenating two HLS playlists, referred to as 1.m3u8 and 2.m3u8, into a single playlist 12.m3u8. An example of such a playlist is as follows:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-ALLOW-CACHE:YES
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:2.0,
1.1.ts
#EXTINF:2.0,
1.2.ts
#EXTINF:2.0,
1.3.ts
#EXT-X-DISCONTINUITY
#EXTINF:2.0,
2.1.ts
#EXTINF:2.0,
2.2.ts
#EXT-X-ENDLIST

Problem symptoms: On PC web browsers, Android devices, and iOS 13 and 15, the following is observed:
- Natural playback completion occurs without any issues.
- Seeking to different points within the stream (e.g., from 3 seconds to 9 seconds) works as expected.
However, on iOS 17 there is a significant issue:
- Natural playback completion is unaffected.
- When seeking to various points within the first playlist (1.m3u8) after playing for 1, 2, or 3 seconds, the audio for the last 3 seconds of 1.m3u8 is lost.

Conditions for replication: The issue only arises when all of the following conditions are satisfied:
- The video content is generated from a single image and an audio track, ensuring sound is present in the final 3 seconds.
- The video stream bitrate is below 500 Kbps. (Tested with a 1393 Kbps bitrate, which did not trigger the issue.)
- The HLS streams are concatenated using the #EXT-X-DISCONTINUITY tag to form the combined 12.m3u8 playlist. (No issues occur when streams are not concatenated.)
- Seek operations are performed during playback. (No issues occur without seek operations.)
- The issue is exclusive to iOS 17. (No issues reported on iOS 13 and 15.)
Disrupting any one of these conditions results in normal playback behavior.

Steps to reproduce:
1. Using FFmpeg, generate a video from a single image and an audio track, with a suggested duration of 10 to 20 seconds for testing convenience. If the video's bitrate exceeds 1000 Kbps, consider transcoding it to 500 Kbps or lower to avoid potential edge-case issues.
2. Convert the 1.mp4 file into 1.m3u8 using FFmpeg. The segment duration can be set to between 1 and 5 seconds (tested with both 2-second and 5-second durations).
3. Duplicate 1.m3u8 as 2.m3u8, then concatenate 1.m3u8 and 2.m3u8 into 12.m3u8 as shown below:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-ALLOW-CACHE:YES
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:2.0,
1.1.ts
#EXTINF:2.0,
1.2.ts
#EXT-X-DISCONTINUITY
#EXTINF:2.0,
1.1.ts
#EXTINF:2.0,
1.2.ts
#EXT-X-ENDLIST
4. On an iOS 17 device, play 12.m3u8 for 1, 2, or 3 seconds, then seek to any point between 7 and 9 seconds (within the duration of 1.m3u8). This results in the loss of audio for the last 3 seconds of 1.m3u8.
0
0
25
1d
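A minimal playback-and-seek sketch matching step 4 of the reproduction above, using AVPlayer (the URL is a placeholder for the concatenated 12.m3u8):

import AVFoundation

// Start playback, then seek into the tail of the first playlist after ~2
// seconds, which is where the audio loss is reported on iOS 17.
let player = AVPlayer(url: URL(string: "https://example.com/12.m3u8")!)
player.play()

DispatchQueue.main.asyncAfter(deadline: .now() + 2) {
    let target = CMTime(seconds: 8, preferredTimescale: 600)
    player.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero) { _ in
        // Listen to the last ~3 seconds of 1.m3u8; on iOS 17 this range is reported silent.
    }
}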
Crash with AVMutableComposition insertTimeRange Method
A small number of crashes are being reported on Firebase. When attempting to use the insertTimeRange:ofAsset:atTime:error: method of AVMutableComposition, a crash occurs with the error message -[__NSArrayM insertObject:atIndex:]: object cannot be nil. Most of them appear on iOS 17.0 and above. Here's my code:

- (AVMutableComposition *)createtrimAsset:(AVAsset *)asset andStartTime:(CGFloat)startTime endTime:(CGFloat)endTime {
    NSError *error = nil;
    CGFloat timescale = 1000000;
    AVMutableComposition *mutableComposition = [AVMutableComposition composition];
    CMTime sStartTime = CMTimeMakeWithSeconds(CMTimeGetSeconds(asset.duration) * startTime, timescale);
    CMTime eEndTime = CMTimeMakeWithSeconds(CMTimeGetSeconds(asset.duration) * endTime, timescale);
    [mutableComposition insertTimeRange:CMTimeRangeMake(sStartTime, CMTimeSubtract(eEndTime, sStartTime)) ofAsset:asset atTime:kCMTimeZero error:&error];
    return mutableComposition;
}

I attempted to reproduce this crash by deliberately setting the timeRange or asset to unusual values, such as asset = nil, asset.duration = 0, or asset.duration = NAN, but all attempts failed. What could be causing this exception? Any advice would be of great help to me.
0
0
27
1d
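Not a confirmed diagnosis for the crash above, but a defensive variant (sketched in Swift here rather than the Objective-C of the original) that validates the asset and clamps the requested range before inserting may at least narrow things down in the field:

import AVFoundation

// Hypothetical helper, not the original method: check the asset's duration,
// clamp the range to it, and surface the thrown error instead of ignoring it.
func trimmedComposition(from asset: AVAsset,
                        startFraction: Double,
                        endFraction: Double) throws -> AVMutableComposition {
    let duration = asset.duration
    guard duration.isValid, !duration.isIndefinite, duration.seconds > 0,
          endFraction > startFraction else {
        throw NSError(domain: "Trim", code: -1,
                      userInfo: [NSLocalizedDescriptionKey: "Asset has no usable duration"])
    }

    let timescale: CMTimeScale = 1_000_000
    let start = CMTime(seconds: duration.seconds * startFraction, preferredTimescale: timescale)
    let end = min(CMTime(seconds: duration.seconds * endFraction, preferredTimescale: timescale), duration)

    let composition = AVMutableComposition()
    try composition.insertTimeRange(CMTimeRange(start: start, end: end), of: asset, at: .zero)
    return composition
}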
When adding a VideoPlayerComponent to an Entity placed in ImmersiveSpace and attempting to play an 8K video, the application crashes.
OS: visionOS 1.0
Xcode: 15.2
In the application under development, doing the following causes the app to crash:
1. Open an ImmersiveSpace.
2. Add a VideoPlayerComponent to an Entity.
3. Play an 8K video.
The Apple logo appears and the system returns to the Home view. The problem does not occur if an application is created containing only the 8K-video playback part.
Error log:
apply fence tx failed (client=0x6fbf0fcc) [0xfffffecc (ipc/mig) server died]
Failed to commit transaction (client=0x58510d43) [0x10000003 (ipc/send) invalid destination port]
nw_read_request_report [C1] Receive failed with error "No message available on STREAM"
nw_protocol_socket_reset_linger [C1:2] setsockopt SO_LINGER failed [22: Invalid argument]
Failed to set override status for bind point component member.
Message from debugger: Terminated due to signal 9
I can't share the entire application, but is anyone else experiencing the same problem? Is this a memory issue?
0
0
65
1d
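For reference, a minimal version of the setup described above (the file path is a placeholder for the 8K asset), which may help isolate whether the crash reproduces outside the full app:

import AVFoundation
import RealityKit
import SwiftUI

// visionOS sketch: an Entity with a VideoPlayerComponent inside an immersive
// RealityView, playing a local file.
struct VideoImmersiveView: View {
    var body: some View {
        RealityView { content in
            let url = URL(fileURLWithPath: "/path/to/8k-video.mov")   // placeholder
            let player = AVPlayer(url: url)
            let entity = Entity()
            entity.components.set(VideoPlayerComponent(avPlayer: player))
            content.add(entity)
            player.play()
        }
    }
}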
When VideoToolbox compresses JPEG and H.264, the output format (YUV 4:2:0) differs from the input format
On M-series machines, VideoToolbox hardware compression accepts a YUV 4:2:2 input (kCVPixelFormatType_422YpCbCr8BiPlanarVideoRange), but the compressed JPEG output is still YUV 4:2:0. On Intel-series GPUs, compression requires a YUV 4:2:0 input (kCVPixelFormatType_420YpCbCr8Planar), and the compressed JPEG output is YUV 4:2:2. In both cases the output format after compression is not consistent with the input format. Does VideoToolbox hardware compression support outputting YUV 4:2:2 or YUV 4:4:4 JPEG images and H.264 streams?
0
1
127
3d
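I don't know of a VideoToolbox key that selects output chroma subsampling, but one way to probe what a given encoder exposes is to create a compression session and dump the properties it reports as supported. An exploratory sketch (no particular key is assumed to exist):

import VideoToolbox
import CoreVideo

// Create a JPEG compression session with a 4:2:2 source format and print the
// supported-property dictionary to see what, if anything, relates to output format.
var session: VTCompressionSession?
let attrs = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_422YpCbCr8BiPlanarVideoRange
] as CFDictionary

let status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 1920,
    height: 1080,
    codecType: kCMVideoCodecType_JPEG,
    encoderSpecification: nil,
    imageBufferAttributes: attrs,
    compressedDataAllocator: nil,
    outputCallback: nil,
    refcon: nil,
    compressionSessionOut: &session)

if status == noErr, let session {
    var supported: CFDictionary?
    VTSessionCopySupportedPropertyDictionary(session, supportedPropertyDictionaryOut: &supported)
    print(supported ?? "no supported-property dictionary")
}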
Focus Peaking/Contrast Detection as seen in Final Cut Camera App
Hello everyone, with the release of Apple's new Final Cut Camera app, we see the possibility of overlaying a focus-peaking indicator over the camera feed, showing the areas that are in focus. We have had a contrast-based autofocus system for some time via AVCaptureDevice.Format.AutoFocusSystem.contrastDetection, but I haven't found a way to actually present the high-contrast areas to the user. Given that Apple now has such an algorithm natively in the Final Cut Camera app, I wonder whether we devs also get access to it. If not, does anybody know of focus-peaking implementations out there? Thanks and best regards
0
0
121
4d
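I'm not aware of a public focus-peaking API either; as a DIY starting point (not Apple's algorithm), a rough overlay can be built by edge-detecting each frame with Core Image and tinting the result before compositing it over the preview. A sketch:

import CoreImage
import CoreImage.CIFilterBuiltins

// Edge-detect a camera frame and push the edge energy into the red channel so
// it can be composited over the live preview as a crude peaking overlay.
func peakingOverlay(for frame: CIImage, intensity: Float = 4.0) -> CIImage? {
    let edges = CIFilter.edges()
    edges.inputImage = frame
    edges.intensity = intensity
    guard let edgeImage = edges.outputImage else { return nil }

    let tint = CIFilter.colorMatrix()
    tint.inputImage = edgeImage
    tint.rVector = CIVector(x: 1, y: 1, z: 1, w: 0)   // sum edge energy into red
    tint.gVector = CIVector(x: 0, y: 0, z: 0, w: 0)
    tint.bVector = CIVector(x: 0, y: 0, z: 0, w: 0)
    return tint.outputImage
}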
WWDC Lab feedback
I am writing to follow up on my lab at WWDC24. I had a 1:1 lab with Mr. Kavin; we had a good 30-minute session, and for follow-up questions Kavin asked me to post them via feedback. Here is my question: We have screen share in our application and are trying to use CFMessagePort to pass a CVPixelBufferRef from the broadcast extension to the application. Questions:
- How can we copy the planes of an IOSurface-backed CVPixelBufferRef onto another one without using memcpy; is there a zero-copy method?
- How can we get notified when the data of an IOSurface-backed CVPixelBufferRef is changed by another process?
- How can we send an IOSurface-backed CVPixelBufferRef from the broadcast extension to the application?
- How can we pass an unowned IOSurfaceRef from the broadcast extension to the application?
0
1
42
4d
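On the first question in the list above (zero-copy): wrapping an existing IOSurface in a CVPixelBuffer avoids copying the pixel data at all. A minimal sketch (assumes the IOSurface has already been received in the target process):

import CoreVideo
import IOSurface

// Wrap an existing IOSurface in a CVPixelBuffer without copying its planes.
func pixelBuffer(wrapping surface: IOSurfaceRef) -> CVPixelBuffer? {
    var pixelBuffer: Unmanaged<CVPixelBuffer>?
    let status = CVPixelBufferCreateWithIOSurface(
        kCFAllocatorDefault,
        surface,
        nil,                 // no extra pixel buffer attributes
        &pixelBuffer)
    guard status == kCVReturnSuccess else { return nil }
    return pixelBuffer?.takeRetainedValue()
}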
Audio transition using MPMusicPlayerApplicationController
Hi. I saw that in the iOS 18 beta there is a "transition" property on MusicKit's ApplicationMusicPlayer. However, in my app I am using MPMusicPlayerApplicationController because I want to play Apple Music songs, local songs, and podcasts, and I didn't find an analogous property on MPMusicPlayerApplicationController to specify transitions between songs. Am I missing something? Thanks, Dirk
0
0
97
5d
In the Photos app, show in each photo's info the name of the user-created album it belongs to
I collect a lot of memes from the Internet and save them on my iPhone, and I name and classify them into albums. But when I open a photo in "All Photos", its info does not show which album I added it to, which is very frustrating. With this function, I could easily manage the memes that I did not correctly add to the corresponding album.
1
0
115
6d
ProRes 4444 blocky compression artifacts
I’m creating an Objective-C command-line utility to encode RAW image sequences to ProRes 4444, but I’m encountering blocky compression artifacts in the ProRes 4444 video output. To test the integrity of the image data before encoding, I added a snippet to my encoding function that saves a 16-bit PNG before encoding to ProRes, and the PNG looks perfect; I can see all the detail in every part of the image’s dynamic range. Here’s a comparison between the 16-bit PNG (on the right) and the ProRes 4444 output (on the left). As a further test, I re-encoded the test PNG to ProRes 4444 using DaVinci Resolve, and the ProRes 4444 output from Resolve doesn’t have any blocky compression artifacts; it looks identical.

In short, this is what the utility does:
1. Unpacks the 12-bit raw data into 16-bit values.
2. Debayers the raw data to convert it into a standard color image format (BGR) using OpenCV.
3. Scales the debayered pixel values from their original 12-bit depth to fit into a 16-bit range. Up to this point everything is fine, as confirmed by saving 16-bit PNGs.
4. Encodes the images to ProRes 4444 using the AVFoundation framework. The pixel buffers are created and managed using a pixel buffer attributes dictionary with kCVPixelFormatType_64RGBALE.

I need help figuring this out; I’m a real novice when it comes to AVFoundation/encoding to ProRes. See the relevant parts of my encodeToProRes function:

void encodeToProRes(const std::string &outputPath, const std::vector<std::string> &rawPaths, const std::string &proResFlavor) {
    NSError *error = nil;
    NSURL *url = [NSURL fileURLWithPath:[NSString stringWithUTF8String:outputPath.c_str()]];
    AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeQuickTimeMovie error:&error];
    if (error) {
        std::cerr << "Error creating AVAssetWriter: " << error.localizedDescription.UTF8String << std::endl;
        return;
    }

    // Load the first image to get the dimensions
    std::cout << "Debayering the first image to get dimensions..." << std::endl;
    Mat firstImage;
    int width = 5320;
    int height = 3900;
    if (!debayer_image(rawPaths[0], firstImage, width, height)) {
        std::cerr << "Error debayering the first image" << std::endl;
        return;
    }
    width = firstImage.cols;
    height = firstImage.rows;

    // Save the first frame as a PNG 16-bit image for validation
    std::string pngFilePath = outputPath + "_frame1.png";
    if (!imwrite(pngFilePath, firstImage)) {
        std::cerr << "Error: Failed to save the first frame as a PNG image" << std::endl;
    } else {
        std::cout << "First frame saved as PNG: " << pngFilePath << std::endl;
    }

    NSString *codecKey = nil;
    if (proResFlavor == "4444") {
        codecKey = AVVideoCodecTypeAppleProRes4444;
    } else if (proResFlavor == "422HQ") {
        codecKey = AVVideoCodecTypeAppleProRes422HQ;
    } else if (proResFlavor == "422") {
        codecKey = AVVideoCodecTypeAppleProRes422;
    } else if (proResFlavor == "LT") {
        codecKey = AVVideoCodecTypeAppleProRes422LT;
    } else {
        std::cerr << "Error: Invalid ProRes flavor specified: " << proResFlavor << std::endl;
        return;
    }

    NSDictionary *outputSettings = @{
        AVVideoCodecKey: codecKey,
        AVVideoWidthKey: @(width),
        AVVideoHeightKey: @(height)
    };

    AVAssetWriterInput *videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
    videoInput.expectsMediaDataInRealTime = YES;

    NSDictionary *pixelBufferAttributes = @{
        (id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_64RGBALE),
        (id)kCVPixelBufferWidthKey: @(width),
        (id)kCVPixelBufferHeightKey: @(height)
    };

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoInput sourcePixelBufferAttributes:pixelBufferAttributes];

    ...

    [assetWriter startSessionAtSourceTime:kCMTimeZero];
    CMTime frameDuration = CMTimeMake(1, 24); // Frame rate of 24 fps
    int numFrames = static_cast<int>(rawPaths.size());

    ...

    // Encoding thread
    std::thread encoderThread([&]() {
        int frameIndex = 0;
        std::vector<CVPixelBufferRef> pixelBufferBuffer;
        while (frameIndex < numFrames) {
            std::unique_lock<std::mutex> lock(queueMutex);
            queueCondVar.wait(lock, [&]() { return !frameQueue.empty() || debayeringFinished; });
            if (!frameQueue.empty()) {
                auto [index, debayeredImage] = frameQueue.front();
                frameQueue.pop();
                lock.unlock();
                if (index == frameIndex) {
                    cv::Mat rgbaImage;
                    cv::cvtColor(debayeredImage, rgbaImage, cv::COLOR_BGR2RGBA);
                    CVPixelBufferRef pixelBuffer = NULL;
                    CVReturn result = CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &pixelBuffer);
                    if (result != kCVReturnSuccess) {
                        std::cerr << "Error: Could not create pixel buffer" << std::endl;
                        dispatch_group_leave(dispatchGroup);
                        return;
                    }
                    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
                    void *pxdata = CVPixelBufferGetBaseAddress(pixelBuffer);
                    for (int row = 0; row < height; ++row) {
                        memcpy(static_cast<uint8_t*>(pxdata) + row * CVPixelBufferGetBytesPerRow(pixelBuffer),
                               rgbaImage.ptr(row),
                               width * 8);
                    }
                    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
                    pixelBufferBuffer.push_back(pixelBuffer);
    ...

Thanks very much!
1
0
118
6d