The official iOS Photos app can display some EXIF-related metadata (e.g. camera and lens info, ISO, shutter speed, F-number) even when photos are offloaded to iCloud and the device has no internet connection (e.g. airplane mode).
With the Photos framework, however, we apparently need to download the photos to retrieve that metadata (which means it does not work in airplane mode).
I tried the following methods, but none of them worked when photos were offloaded to iCloud and the device was in airplane mode:
Requesting data with PHImageManager.default().requestImageDataAndOrientation
Result: it does not return Data if the photo is not stored locally on the device, even with options.deliveryMode = .fastFormat (see the sketch after this list)
Converting PHAsset#localIdentifier to an AssetsLibrary.framework URL (assets-library://asset/...)
(I am aware that AssetsLibrary.framework is deprecated, but this was just a test.)
Result: if PHImageManager does not return Data, ALAsset#defaultRepresentation().metadata() returns an empty NSDictionary
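For reference, a minimal sketch (mine, not from the original post) of the first method; it disables network access to mimic airplane mode and checks PHImageResultIsInCloudKey to confirm the original is offloaded:
import Photos

func fetchImageData(for asset: PHAsset) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .fastFormat
    options.isNetworkAccessAllowed = false // behaves like airplane mode

    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, info in
        // For offloaded originals, data comes back nil and the info
        // dictionary reports PHImageResultIsInCloudKey = true.
        let isInCloud = (info?[PHImageResultIsInCloudKey] as? NSNumber)?.boolValue ?? false
        print("bytes: \(data?.count ?? 0), in iCloud: \(isInCloud)")
    }
}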
---
                      Hello,
Using ShazamKit, based on a Shazam catalog result, would it be possible to detect the FPS (speed) at which audio was recorded?
I'm thinking that a Shazam catalog created from an audio file could be used to compare the speed of a live audio recording.
Thank you!
---
While building and using a customized image picker, we found that metadata is not reflected correctly, and we are reporting it here.
The situation is as follows:
1. The time or time zone of an image is changed in the Photos app (e.g. changing the time zone of an image with an actual capture date of 2024:11:08 08:27:44 → 2024:11:07 17:27:44).
2. Image data is extracted from the PHAsset using PHImageManager.
3. The metadata is obtained from this image data.
4. The Exif tag information does not reflect the time or time zone that was changed in the Photos app.
let asset: PHAsset = ...
....
let options = PHImageRequestOptions()
options.isSynchronous = true
options.version = .current
options.deliveryMode = .highQualityFormat
options.resizeMode = .none
options.normalizedCropRect = .zero
options.isNetworkAccessAllowed = true
options.progressHandler = { progress, error, _, _ in }

PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { imageData, uti, orientation, info in
    // Read the Exif dictionary from the returned image data.
    guard let imageData,
          let cgImageSource = CGImageSourceCreateWithData(imageData as CFData, nil),
          let properties = CGImageSourceCopyPropertiesAtIndex(cgImageSource, 0, nil) as? [String: Any],
          let exif = properties["{Exif}"] as? [String: Any]
    else { return }
    // exif still contains the original capture time here, not the
    // time/zone edited in the Photos app.
}
(Screenshot: Metadata Check)
In this case the change is reflected in the PHAsset's creationDate, so it can be partially compensated for by forcibly replacing the metadata (see the sketch below).
However, because PHAsset does not include time zone information, when the time zone is changed as well, it is impossible to calculate the correct time for that time zone.
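A sketch of that compensation (hypothetical; exifDictionary stands for the Exif dictionary read above): format PHAsset.creationDate into the Exif date format and overwrite DateTimeOriginal. The time zone must be guessed, which is exactly the limitation just described:
import Foundation
import ImageIO

let formatter = DateFormatter()
formatter.dateFormat = "yyyy:MM:dd HH:mm:ss" // Exif date format
formatter.timeZone = .current // assumption: PHAsset carries no zone info

if let creationDate = asset.creationDate {
    var patchedExif = exifDictionary // hypothetical: the Exif dict read above
    patchedExif[kCGImagePropertyExifDateTimeOriginal as String] =
        formatter.string(from: creationDate)
    // patchedExif can then be written back with CGImageDestination.
}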
PHPicker
This issue is resolved when using the PHPickerResult provided by PHPicker.
extension PhotosPickerViewController: PHPickerViewControllerDelegate {
    public func picker(_ picker: PHPickerViewController,
                       didFinishPicking results: [PHPickerResult]) {
        .....
        for result in results {
            let identifier = UTType.image.identifier
            if result.itemProvider.hasItemConformingToTypeIdentifier(identifier) {
                result.itemProvider.loadDataRepresentation(forTypeIdentifier: identifier) { data, error in
                    guard let data,
                          let cgImageSource = CGImageSourceCreateWithData(data as CFData, nil),
                          let properties = CGImageSourceCopyPropertiesAtIndex(cgImageSource, 0, nil) as? [String: Any],
                          let exif = properties["{Exif}"] as? [String: Any]
                    else {
                        return
                    }
                    // Here exif does reflect the time/zone edited in Photos.
                }
            }
        }
    }
}
(Screenshot: Metadata Check)
Question
I wonder why this happens, and whether this is normal behavior.
Also, when using a custom picker instead of the system picker Apple provides, is there any way to compensate for this situation?
---
I'm trying to apply a Core Image filter to a UIImage. For that I want to get the CIImage representation of the UIImage.
I'm trying to obtain the CIImage of the UIImage as shown below.
if let inputImage = self.orginalImageView.image {
    if let ciImage = CIImage(image: inputImage) {
        print(ciImage)
        print(self.orginalImageView.image?.ciImage)
    }
}
This method works. But one thing I noticed is that there is already a ciImage property inside UIImage, and it is always nil.
According to the documentation:
ciImage
The underlying Core Image data.
var ciImage: CIImage? { get }
Discussion
If the UIImage object was initialized using a CGImage, the value of the property is nil.
Does the image property of UIImage come from a CGImage, so that the ciImage property is nil?
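A small standalone sketch (mine, not from the question) of the documented behavior: a rendered or asset-loaded image is CGImage-backed, so ciImage is nil; only a UIImage created with init(ciImage:) has a non-nil ciImage:
import UIKit
import CoreImage

// A CGImage-backed image, as produced by rendering (or loading an asset).
let renderer = UIGraphicsImageRenderer(size: CGSize(width: 1, height: 1))
let cgBacked = renderer.image { ctx in
    UIColor.red.setFill()
    ctx.fill(CGRect(x: 0, y: 0, width: 1, height: 1))
}
print(cgBacked.ciImage as Any) // nil: not initialized from a CIImage

// A CIImage-backed image.
let red = CIImage(color: .red).cropped(to: CGRect(x: 0, y: 0, width: 1, height: 1))
let ciBacked = UIImage(ciImage: red)
print(ciBacked.ciImage as Any) // non-nil: initialized from a CIImage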
---
                      I want to apply a SCNTechnique pipeline to the camera feed. To achieve this, I want to bring the camera input into the SceneKit world.
The perfect API for this seems to be:
let captureDevice = …
scnScene.background.contents = captureDevice
This is demonstrated in "SceneKit: What's New" (WWDC17) (at 44m19s) and is mentioned in the documentation of SCNMaterialProperty's contents.
Instead of showing the camera feed, it crashes with this message:
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVCaptureVideoDataOutput setVideoSettings:] Unsupported pixel format type - use -availableVideoCVPixelFormatTypes'
*** First throw call stack:
(0x18993c7cc <REDACTED> 0x211e18488)
libc++abi: terminating due to uncaught exception of type NSException
Please advise.
STEPS TO REPRODUCE
Create a new Xcode project, starting from the SceneKit game template.
Add Info.plist entry for NSCameraUsageDescription.
Add a capture device property to GameViewController:
class GameViewController: UIViewController {
let captureDevice = AVCaptureDevice.default(for: .video)
Set the background contents:
scene.background.contents = captureDevice
Run the app on device.
PLATFORM AND VERSION
iOS
Development environment: Xcode 16.1, macOS 15.0.1. Run-time configuration: iOS 18.1
---
Hi Apple engineers,
Hoping that you can reply to this one.
We're developing a text-to-speech app. Everything went well until iOS was upgraded to 18.
AVSpeechSynthesisVoice(language: "zh-CN") works well under iOS 16 and iOS 17: it speaks Mandarin correctly.
On iOS 18, we noticed that Siri's language setting interferes with AVSpeechSynthesisVoice: it plays Cantonese instead of Mandarin.
Siri language settings that trigger the bug and affect AVSpeechSynthesisVoice:
Chinese (Cantonese - China mainland)
Chinese (Cantonese - Hong Kong)
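A possible workaround sketch (untested, my assumption): enumerate the installed voices and pick an explicit zh-CN voice rather than relying on the language-code lookup that Siri's setting appears to override:
import AVFoundation

// Assumption: picking an explicit zh-CN voice avoids the Siri-driven
// Cantonese substitution. Falls back to the language-code lookup.
let mandarin = AVSpeechSynthesisVoice.speechVoices()
    .first { $0.language == "zh-CN" }
    ?? AVSpeechSynthesisVoice(language: "zh-CN")

let utterance = AVSpeechUtterance(string: "你好，世界")
utterance.voice = mandarin
let synthesizer = AVSpeechSynthesizer() // keep a strong reference in real code
synthesizer.speak(utterance)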
---
I'm building a streaming app on visionOS that plays sound from audio buffers delivered every frame. The source audio buffer has 2 channels in Float32 interleaved format.
However, when the AVAudioFormat is set up with interleaved set to true, the app crashes with a memory issue:
AURemoteIO::IOThread (35): EXC_BAD_ACCESS (code=1, address=0x3)
But if I set AVAudioFormat with interleaved to false, and manually set up the AVAudioPCMBuffer, it can play audio as expected.
Could you please help me fix it? Below is the code snippet.
@Observable
final class MyAudioPlayer {
    private var audioEngine: AVAudioEngine = .init()
    private var audioPlayerNode: AVAudioPlayerNode = .init()
    private var audioFormat: AVAudioFormat?
    
    init() {
        audioEngine.attach(audioPlayerNode)
        audioEngine.connect(audioPlayerNode, to: audioEngine.mainMixerNode, format: nil)
        try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
        try? AVAudioSession.sharedInstance().setActive(true)
        audioEngine.prepare()
        try? audioEngine.start()
        audioPlayerNode.play()
    }
    // more code...
    /// This crashes
    private func audioFrameCallback_Interleaved(buf: UnsafeMutablePointer<Float>?, samples: Int) {
        guard let buf,
              let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 480000, channels: 2, interleaved: true),
              let audioBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: AVAudioFrameCount(samples))
        else { return }
        audioBuffer.frameLength = AVAudioFrameCount(samples)
        // For an interleaved buffer, floatChannelData?[0] points at the
        // single interleaved plane, so all channels are copied in one call.
        if let data = audioBuffer.floatChannelData?[0] {
            data.update(from: buf, count: samples * Int(format.channelCount))
        }
        // Note: the node was connected with format: nil (the mixer's
        // de-interleaved format), so this buffer's format does not match
        // the connection format (see the note after this snippet).
        audioPlayerNode.scheduleBuffer(audioBuffer)
    }
    /// This works
    private func audioFrameCallback_Non_Interleaved(buf: UnsafeMutablePointer<Float>?, samples: Int) {
        guard let buf,
              let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 480000, channels: 2, interleaved: false),
              let audioBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: AVAudioFrameCount(samples))
        else { return }
        audioBuffer.frameLength = AVAudioFrameCount(samples)
        // Manually de-interleave the source into per-channel planes.
        if let data = audioBuffer.floatChannelData {
            for channel in 0 ..< Int(format.channelCount) {
                for frame in 0 ..< Int(audioBuffer.frameLength) {
                    data[channel][frame] = buf[frame * Int(format.channelCount) + channel]
                }
            }
        }
        audioPlayerNode.scheduleBuffer(audioBuffer)
    }
}
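An educated guess rather than a confirmed diagnosis: the node is connected with format: nil, which gives it the mixer's de-interleaved Float32 format, and AVAudioPlayerNode expects scheduled buffers to match the connection format, so the interleaved buffer mismatches it. A sketch of making the connection format explicit (the 48 kHz rate is my assumption):
// Sketch: connect the player node with an explicit de-interleaved format
// and schedule buffers created with this same format.
if let nodeFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                  sampleRate: 48_000, // assumed actual rate
                                  channels: 2,
                                  interleaved: false) {
    audioEngine.connect(audioPlayerNode, to: audioEngine.mainMixerNode, format: nodeFormat)
}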
---
I'm building a streaming app on visionOS that plays sound from audio buffers delivered every frame. The audio format has a sample rate of 48000, and each buffer has 480 samples.
I noticed when calling
audioPlayerNode.scheduleBuffer(audioBuffer)
Memory keeps increasing at about 0.1 MB per second, and at around 4 minutes the node seems to be full of buffers and does a hard reset, at which point the audio stops temporarily and memory drops (see attached screenshot).
However, if I call
 audioPlayerNode.scheduleBuffer(audioBuffer, at: nil, options: .interrupts)
The memory leak issue is gone, but the audio is broken (it sounds shortened).
Below is the full code snippet; does anyone know how to fix it?
@Observable
final class MyAudioPlayer {
    private var audioEngine: AVAudioEngine = .init()
    private var audioPlayerNode: AVAudioPlayerNode = .init()
    private var audioFormat: AVAudioFormat?
    
    init() {
        audioEngine.attach(audioPlayerNode)
        audioEngine.connect(audioPlayerNode, to: audioEngine.mainMixerNode, format: nil)
        try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
        try? AVAudioSession.sharedInstance().setActive(true)
        audioEngine.prepare()
        try? audioEngine.start()
        audioPlayerNode.play()
    }
    // more code...
    /// Callback every frame
    private func audioFrameCallback_Non_Interleaved(buf: UnsafeMutablePointer<Float>?, samples: Int) {
        guard let buf,
              let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 48000, channels: 2, interleaved: false),
              let audioBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: AVAudioFrameCount(samples))
        else { return }
        audioBuffer.frameLength = AVAudioFrameCount(samples)
        // De-interleave the source into per-channel planes.
        if let data = audioBuffer.floatChannelData {
            for channel in 0 ..< Int(format.channelCount) {
                for frame in 0 ..< Int(audioBuffer.frameLength) {
                    data[channel][frame] = buf[frame * Int(format.channelCount) + channel]
                }
            }
        }
        // memory leak here
        audioPlayerNode.scheduleBuffer(audioBuffer)
    }
}
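One pattern that may bound the queue (a sketch under assumptions, not a verified fix): use the completion-handler variant of scheduleBuffer with a semaphore so only a fixed number of buffers are in flight at once:
// Sketch, inside MyAudioPlayer: cap scheduled-but-unplayed buffers.
// The limit of 8 (~80 ms at 480 frames / 48 kHz) is an arbitrary guess.
private let inFlight = DispatchSemaphore(value: 8)

private func scheduleWithBackpressure(_ audioBuffer: AVAudioPCMBuffer) {
    let semaphore = inFlight
    semaphore.wait() // blocks the producer if 8 buffers are already queued
    audioPlayerNode.scheduleBuffer(audioBuffer) {
        semaphore.signal() // the node has consumed this buffer
    }
}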
---
Hi, I'm facing an issue with AudioWorklet in Safari. The issue is clearly an iOS bug (it doesn't occur on iPad or Mac).
Here's the minimal reproduction:
Go to https://googlechromelabs.github.io/web-audio-samples/audio-worklet/basic/hello-audio-worklet/
Press start
Audio will not be playing
Open YouTube on another tab and start any video
Audio from the worklet will start playing
Is this a known issue? Any plans to address that? Any workaround available?
---
We have had the same video player in our app for at least 5 years with few issues, but the iOS 18 update now causes videos that our users have downloaded for offline viewing to play at 2x speed.
---
I donate some INPlayMediaIntents to the system and can find them in Control Center. When I tap one of them to play media in the background, the handler doesn't execute the resolve method. I want to resolve some mediaItems for a suggested playlist (see the sketch below).
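For reference, a minimal sketch (mine, assuming an INPlayMediaIntentHandling handler class) of what the resolve step would look like; whether the system actually calls it from Control Center is exactly what this question is about:
import Intents

class PlayMediaHandler: NSObject, INPlayMediaIntentHandling {
    // Resolution step: map the intent to concrete media items.
    func resolveMediaItems(for intent: INPlayMediaIntent,
                           with completion: @escaping ([INPlayMediaMediaItemResolutionResult]) -> Void) {
        let item = INMediaItem(identifier: "playlist-1", // hypothetical ID
                               title: "Suggested Playlist",
                               type: .playlist,
                               artwork: nil)
        completion(INPlayMediaMediaItemResolutionResult.successes(with: [item]))
    }

    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // Start playback here, then report success.
        completion(INPlayMediaIntentResponse(code: .success, userActivity: nil))
    }
}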
---
Hey. I am trying to create a presentation view with a bunch of media (images/videos). Right now I am using a ZStack to render each media item and change its opacity based on the index selected via a ScrollView. The issue is that sometimes videos don't load in the main slide: the slide is created since the video exists, and the player shows controls, but nothing plays.
The present view ZStack:
ZStack {
    ForEach(presentation.slides.indices, id: \.self) { index in
        if let media = mediaCacheManager.mediaCache[index] {
            if let player = media as? AVPlayer {
                PlayerView(player: player)
                    .aspectRatio(16/10, contentMode: .fit)
                    .frame(width: UIScreen.main.bounds.width * 0.8)
                    .background(Color.gray.opacity(0.2))
                    .clipShape(RoundedRectangle(cornerRadius: 40))
                    .overlay(
                        RoundedRectangle(cornerRadius: 40)
                            .stroke(Color.gray.opacity(0.5), lineWidth: 1)
                    )
                    .onDisappear {
                        player.pause()
                    }
                    .opacity(appModel.currentSlide == index ? 1 : 0)
            } else if let image = media as? Image {
                image
                    .resizable()
                    .scaledToFit()
                    .frame(width: UIScreen.main.bounds.width * 0.8)
                    .background(Color.gray.opacity(0.2))
                    .clipShape(RoundedRectangle(cornerRadius: 40))
                    .overlay(
                        RoundedRectangle(cornerRadius: 40)
                            .stroke(Color.gray.opacity(0.5), lineWidth: 1)
                    )
                    .padding(.vertical, 10)
                    .opacity(appModel.currentSlide == index ? 1 : 0)
            }
        }
    }
}
The PlayerView
public class PlayerUIView: UIView {
    let playerVC = AVPlayerViewController()
    let gravity: AVLayerVideoGravity
    let manageAudio: Bool

    override init(frame: CGRect) {
        self.gravity = .resizeAspectFill
        self.manageAudio = true
        super.init(frame: frame)
    }

    deinit {
        if manageAudio {
            try? AVAudioSession.sharedInstance().setActive(false)
        }
    }

    init(player: AVPlayer?, gravity: AVLayerVideoGravity, manageAudio: Bool = true) {
        self.gravity = gravity
        self.manageAudio = manageAudio
        super.init(frame: .zero)
        guard let player = player else { return }
        self.playerSetup(player: player)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    public override func layoutSubviews() {
        super.layoutSubviews()
        playerVC.view.frame = bounds
        playerVC.view.backgroundColor = .clear
        playerVC.allowsVideoFrameAnalysis = false
    }

    private func playerSetup(player: AVPlayer) {
        playerVC.updatesNowPlayingInfoCenter = true
        playerVC.player = player
        playerVC.showsPlaybackControls = true
        playerVC.view.backgroundColor = .clear
        playerVC.exitsFullScreenWhenPlaybackEnds = true
        playerVC.videoGravity = gravity
        self.addSubview(playerVC.view)
    }
}
---
                      My app reports a lot of crashes from 18.2 users.
I have been able to narrow down the issue to this line of code:
CGImageDestinationFinalize(imageDestination)
The error is Thread 93: EXC_BAD_ACCESS (code=1, address=0x146318000)
But I have no idea why this suddenly started to crash.
Here is the code of the function:
private func estimateSizeUsingThumbnailMethod(fromImageURL url: URL, imageSettings: ImageSettings) -> (Int, Int) {
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions),
          let imageProperties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let imageWidth = imageProperties[kCGImagePropertyPixelWidth] as? CGFloat,
          let imageHeight = imageProperties[kCGImagePropertyPixelHeight] as? CGFloat else {
        return (0, 0)
    }
    
    let maxImageSize = max(imageWidth, imageHeight)
    let thumbMaxSize = min(2400, maxImageSize) // Use original size if possible, but not if larger than 2400, in this case we'll extrapolate from thumbnail
    
    let downsampleOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: thumbMaxSize as CFNumber,
    ] as CFDictionary
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, downsampleOptions) else {
        DLog("CGImage thumb creation error")
        return (0, 0)
    }
    
    let data = NSMutableData()
            
    guard let imageDestination = CGImageDestinationCreateWithData(data, UTType.jpeg.identifier as CFString, 1, nil) else {
        DLog("CGImage destination creation error")
        return (0, 0)
    }
    let destinationProperties = [
        kCGImageDestinationLossyCompressionQuality: imageSettings.quality.compressionRatio() // Set jpeg compression ratio
    ] as CFDictionary
    CGImageDestinationAddImage(imageDestination, cgImage, destinationProperties)
    CGImageDestinationFinalize(imageDestination)   // <----- CRASHES HERE with EXC_BAD_ACCESS
	
	...
}
So far, I'm stuck. Any idea that could help would be greatly appreciated, as I'm worried that this crash will propagate to the official release of 18.2.
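Not a root cause, but a defensive sketch: CGImageDestinationFinalize returns a Bool that the function above ignores, and scoping the encode in an autoreleasepool keeps intermediate buffers from accumulating. Whether this avoids the 18.2 crash is an open question:
import Foundation
import ImageIO
import UniformTypeIdentifiers

// Sketch: check the finalize result instead of ignoring it; cgImage and
// destinationProperties are the values built in the function above.
let estimatedBytes: Int = autoreleasepool {
    let data = NSMutableData()
    guard let destination = CGImageDestinationCreateWithData(
        data, UTType.jpeg.identifier as CFString, 1, nil) else { return 0 }
    CGImageDestinationAddImage(destination, cgImage, destinationProperties)
    guard CGImageDestinationFinalize(destination) else {
        DLog("CGImage finalize error") // the post's logging helper
        return 0
    }
    return data.length
}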
---
Looking to output DV video to my JVC SR-VS30 video deck. I used to be able to do this, but with most FireWire support now deprecated, I'm not sure how to go about it. I found this old developer sample code that seems to do exactly what I'd like; surely this could be ported or updated for current macOS?
https://developer.apple.com/library/archive/samplecode/SimpleVideoOut/Introduction/Intro.html#//apple_ref/doc/uid/DTS10000809-Intro-DontLinkElementID_2
---
                      I have an audio app that can play audio on an AirPlay device.
On non-Apple TV devices, the AirPlay app (on Roku, Samsung, etc.) shows the now playing metadata: title, artist, and album art.
However, on tvOS 18.1 no metadata is shown: the Apple TV plays the audio, but there is no now-playing information, nor any other indicator.
Other media apps show the "Now Playing" controls on the upper right of the tvOS home screen.
Can someone point me in the right direction? I think I am missing something in the tvOS metadata implementation (for comparison, see the sketch below).
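I can't tell from the post what is missing, but for comparison, the standard tvOS pieces are MPNowPlayingInfoCenter for the metadata and MPRemoteCommandCenter for the controls; a minimal sketch (titles and values are placeholders):
import MediaPlayer

// Sketch: publish now-playing metadata and enable at least one remote
// command, which in my experience the system uses to decide whether an
// app counts as "now playing".
func publishNowPlaying(title: String, artist: String, duration: TimeInterval) {
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyArtist: artist,
        MPMediaItemPropertyPlaybackDuration: duration,
        MPNowPlayingInfoPropertyElapsedPlaybackTime: 0.0,
        MPNowPlayingInfoPropertyPlaybackRate: 1.0,
    ]
    _ = MPRemoteCommandCenter.shared().playCommand.addTarget { _ in .success }
}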
---
                      Hey,
There's this darkish line on my iPhone and iPad when I open the Photos app. It scared the ding dong out of me the first time I saw it, but then I realized it was a software issue when it disappeared as I swiped up to close the app. It's really weird because it's extremely faint, and I can't seem to catch it in screenshots. I know for a fact this is a software issue because it doesn't show up in any other apps. It also changes from horizontal to vertical depending on how I turn my iPhone. Can everyone please just check your own iPhone or iPad to make sure I'm not the only one? I'm on the 18.2 developer beta, by the way.
Thanks!
---
                      I’ve encountered an issue when trying to transcribe audio during a SharePlay session in VisionOS. Specifically, the AVAudioSession appears to fail when sharing audio, preventing successful transcription. The problem seems related to AVAudioSession.sharedInstance() and using the .mixWithOthers option, which is supposed to enable multiple audio sources to coexist without interference.
Here’s the relevant code snippet that throws the error:
private static func prepareEngine() throws -> (AVAudioEngine, SFSpeechAudioBufferRecognitionRequest) {
    let audioEngine = AVAudioEngine()
    let request = SFSpeechAudioBufferRecognitionRequest()
    request.shouldReportPartialResults = true
    
    let audioSession = AVAudioSession.sharedInstance()
    try audioSession.setCategory(.playAndRecord, mode: .default, options: [.mixWithOthers, .allowBluetooth])
    try audioSession.setActive(true, options: .notifyOthersOnDeactivation)
    
    let inputNode = audioEngine.inputNode
    let recordingFormat = inputNode.outputFormat(forBus: 0)
    
    inputNode.installTap(onBus: 0, bufferSize: 1024, format: recordingFormat) { (buffer: AVAudioPCMBuffer, when: AVAudioTime) in
        request.append(buffer)
    }
    
    audioEngine.prepare()
    try audioEngine.start()
    
    return (audioEngine, request)
}
The setup is designed to initialize an AVAudioEngine and an SFSpeechAudioBufferRecognitionRequest for real-time transcription, but it fails within the SharePlay context. Notably, while .mixWithOthers is intended to let concurrent audio sessions coexist, it doesn't appear to work as expected during SharePlay. The audioSession.setActive(true) line is where the setup typically fails, with no clear way to proceed.
Has anyone else faced similar issues with AVAudioSession and SharePlay in VisionOS? Any insights on how to manage audio sharing or transcription during a SharePlay session would be greatly appreciated!
The specific error is:
The operation couldn't be completed. (com.apple.coreaudio.avfaudio error 561145187.)
---
                      I’m working on a memo app that records audio from the iPhone’s microphone (and other devices like MacBook or iPad) and processes it in 10-second chunks at a target sample rate of 16 kHz. However, I’ve encountered limitations with installTap in AVAudioEngine, which doesn’t natively support configuring a target sample rate on the mic input (the default being 44.1 kHz).
To address this, I tried using AVAudioMixerNode to downsample the mic input directly. Although everything seems correctly configured, no audio is recorded—just a flat signal with zero levels. There are no errors, and all permissions are granted, so it seems like an issue with downsampling rather than the mic setup itself.
To make progress, I implemented a workaround by tapping and resampling each chunk tapped using installTap (every 50ms in my case) with AVAudioConverter. While this works, it can introduce artifacts at the beginning and end of each chunk, likely due to separate processing instead of continuous downsampling.
Here are the key issues and questions I have:
1.	Can we change the mic input sample rate directly using AVAudioSession or another native AVAudio API? Setting the desired sample rate up front would be ideal for my use case (see the sketch at the end of this post).
2.	Are there alternatives to installTap for recording audio at a different sample rate or for continuously downsampling the live input without chunk-based artifacts?
This issue seems longstanding, as noted in a 2018 forum post:
https://forums.developer.apple.com/forums/thread/111726
Any guidance on configuring or processing mic input at a lower sample rate in real-time would be greatly appreciated. Thank you!
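On question 1, the closest native knob I know of is AVAudioSession.setPreferredSampleRate, with the caveat that it is only a preference the hardware may not honor; a sketch:
import AVFoundation

// Sketch: request 16 kHz input, then check what the hardware actually gives.
func configureMicInput() throws -> AVAudioFormat {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.record, mode: .measurement)
    try session.setPreferredSampleRate(16_000) // a preference, not a guarantee
    try session.setActive(true)

    let engine = AVAudioEngine()
    let hwFormat = engine.inputNode.outputFormat(forBus: 0)
    // On iPhone the hardware commonly stays at 44.1/48 kHz; if so, one
    // long-lived AVAudioConverter (reused across taps rather than created
    // per chunk) avoids the per-chunk edge artifacts described above.
    print("actual input sample rate: \(hwFormat.sampleRate)")
    return hwFormat
}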
---
I have an iPad app, written in Objective-C and distributed through an Enterprise developer account, as it is not for public use but specific to some large companies.
The app has a local database and works offline.
For some functions of the app I need to display images (not edit or cut them, just display them).
Right now the MWPhotoBrowser viewer is integrated, which has not been maintained for almost 10 years, so in addition to compilation warnings I have to fight some historical bugs, especially on high-resolution images. https://github.com/mwaterfall/MWPhotoBrowser
Do you know of a modern and maintained OFFLINE photo viewer? I am evaluating both free and paid options (maybe an SDK). My needs are very basic.
I have found https://github.com/TimOliver/TOCropViewController, but I would need to disable its photo-editing features, and I would especially lose the useful ability to display multiple images (MWPhotoBrowser showed a gallery for multiple images).
---
I am using ImageCaptureCore to access and (sometimes) download media files from a digital camera connected via USB (either to a Mac or to an iOS device with an Apple Lightning to USB 3 camera adapter).
This works very well in general, but what puzzles me is that for the ICCameraFile's EXIF creation/modification date, it always returns nil.
I can access the ICCameraItem's creation/modification date instead, which, as the documentation says, is "usually the same as its EXIF creation date", but not always. Generally the EXIF tags are more reliable than the file dates; the modification date in particular is easily messed up when copying files.
As for my cameras, they show the stable EXIF date on their display, so for consistency I would prefer to use the same in my app. Is there a way to get it without downloading the image from the camera and reading it from the file?
Does it possibly depend on the brand of camera (I mostly have Canon) whether ICCameraFile.exifCreationDate is ever populated or always nil?
For a thumb drive with DCIM folder, which is treated just like a camera, it is also nil.