Hi,

It seems like it's pretty easy to consume HTTP Live Streaming content in an iOS app. Unfortunately, I need to consume media from an RTSP server. It seems to me that this is a very similar thing, and that all of the underpinnings for doing it ought to be present in iOS, but I'm having a devil of a time figuring out how to make it work without doing a lot of programming.

For starters, I know that there are web-based services that can consume an RTSP stream and rebroadcast it as an HTTP Live Stream that can be easily consumed by the media players in iOS. This won't work for me because my application needs to function in an environment where there is no internet access (it's on a private Wi-Fi network where the only other thing on the network is the device that is serving the RTSP stream).

Having read everything I can get my hands on and explored third-party and open-source solutions, I've compiled the following list of ideas:

1. Using an iOS build of the open-source ffmpeg library, which supports RTSP, I've come up with a test app that can receive the RTSP packets, decode them, create UIImages out of the frames, and display those frames on-screen. This provides a crude player, but performance is poor, most likely because ffmpeg can't take advantage of any hardware acceleration. It also doesn't provide me with any way to integrate the video stream into AVFoundation, so I'm on my own as far as saving the stream to a file, transcoding it, etc.

2. I know that the AVURLAsset class doesn't directly support the RTSP scheme. Since I have access to the undecoded RTSP packets via ffmpeg, I've thought it should be possible to implement RTSP support myself via a custom NSURLProtocol, essentially fooling AVFoundation into reading those packets as if they originated in a file. I'm not sure if this would work, since the raw packets coming from the RTSP server might lack the headers that would otherwise be present in data being read from a file. I'm not even sure if AVFoundation would recognize my custom protocol.

3. If a protocol doesn't work, I've considered that I might be able to implement my own local HTTP Live Streaming server that converts the RTSP packets into an HTTP stream that the media players can read. This sounds like a terribly convoluted solution to the problem, at best, and very difficult at worst.

4. Going back to solution (1), if I could speed up the decoding by using some iOS CoreVideo function instead of ffmpeg, this solution might be okay. However, I can't find any documentation for CoreVideo on iOS (Apple only documents it for OS X).

5. I'm certainly willing to license a third-party solution if it works well and provides good performance. Unfortunately, everything I've found so far is pretty crummy and mostly just leverages ffmpeg and/or VLC.

What is most disappointing to me is that nobody seems to be able or willing to provide a solution that neatly integrates with AVFoundation. I really want to make my RTSP stream available as an AVAsset so I can use it with AVFoundation players and other classes -- I don't want to build an app that relies on custom third-party code for everything.

Any ideas, tips, advice would be greatly appreciated.

Thanks,
Frank
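P.S. Here's a minimal sketch of what I mean by idea (2): a hypothetical NSURLProtocol for the rtsp scheme. Whether AVFoundation would even consult a custom protocol like this is exactly my doubt.

import Foundation

// Hypothetical: a custom URL protocol that would depacketize RTSP and hand
// the media to the URL loading system as if it were file data.
class RTSPURLProtocol: URLProtocol {
    override class func canInit(with request: URLRequest) -> Bool {
        return request.url?.scheme == "rtsp"
    }

    override class func canonicalRequest(for request: URLRequest) -> URLRequest {
        return request
    }

    override func startLoading() {
        // Hypothetical: pull depacketized frames from an ffmpeg-backed session
        // and forward them to the loading client, e.g.:
        // client?.urlProtocol(self, didLoad: mediaData)
    }

    override func stopLoading() {
        // Tear down the RTSP session here.
    }
}

// Registration, so the URL loading system considers the class:
// URLProtocol.registerClass(RTSPURLProtocol.self)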
Opening this question after discussing the issue in the AVCapture lab, in the hope that we can track it down here.
We've been noticing some crashes in App Store Connect caused by layoutSublayers being called on a background thread.
After debugging the issue a bit we found that all calls which modified the AVCaptureSession or preview layer were indeed done on the main thread. It would be useful to see what results in AVCaptureVideoPreviewLayer.updateFormatDescription being called.
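For context, our session setup follows this pattern (a simplified sketch of the real code):

import AVFoundation
import UIKit

// Simplified sketch: every mutation of the session and preview layer is on main.
final class PreviewViewController: UIViewController {
    let session = AVCaptureSession()
    lazy var previewLayer = AVCaptureVideoPreviewLayer(session: session)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.layer.addSublayer(previewLayer)
        session.beginConfiguration()
        // ... add inputs and outputs here ...
        session.commitConfiguration()
        session.startRunning() // blocking call, but all layer mutations stay on main
    }
}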
I've attached the crashlog below.
Crash log.ips - https://developer.apple.com/forums/content/attachment/800b0dba-3477-4c5a-b56c-f4cc393b384f
I'm playing library items (MPMediaItem) and apple music tracks (Track) in MPMusicPlayerApplicationController.applicationQueuePlayer, but I can't use the actual Queue functionality because I can't figure out how to get both media types into the same queue. If there's a way to get both types in a single queue, that would solve my problem, but I've given up on that one.
Because I can't use a queue, I have to be able to detect when a song ends so that I can put the next song in the queue and play it. The only way I can figure out to detect when a song ends is by watching the playbackState, and I've actually got that pretty much working, but it's really ugly, because you get a playbackState of paused when a song ends, but also when a Bluetooth speaker disconnects, etc.
The only answer I've been able to find on the internet is to watch MPMusicPlayerControllerNowPlayingItemDidChange, and when that fires and the nowPlayingItem is nil, a song has ended... but that's not the case. When a song ends, the nowPlayingItem remains the same. There's got to be an answer to this problem, right?
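For what it's worth, the ugly heuristic I have working looks roughly like this (a sketch; the one-second threshold is arbitrary):

import MediaPlayer

// Treat a .paused state as "song ended" only when playback stopped at
// (or very near) the end of the item; otherwise it's a pause or route change.
func handlePlaybackStateChange(of player: MPMusicPlayerController) {
    guard player.playbackState == .paused,
          let item = player.nowPlayingItem else { return }
    let remaining = item.playbackDuration - player.currentPlaybackTime
    if remaining < 1.0 {
        // The track ran to the end: enqueue and play the next song here.
    }
}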
Our app has hit a weird problem in the release version: more and more users get error 561145187 when this code is called:
AudioQueueNewInput(&self->_recordFormat, inputBufferHandler, (__bridge void *)(self), NULL, NULL, 0, &self->_audioQueue)
I've searched for several weeks, but nothing has helped.
Summing up all affected devices, we found some similarities:
it only happens on iPadOS 14.0+
it occurs when the app starts or wakes from the background (we call the code when the app receives UIApplicationDidBecomeActiveNotification)
Any idea why this happens?
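One note from our digging: 561145187 is the FourCC '!rec' (recording could not start), so we're now making sure the audio session is active and record-capable before creating the queue, along these lines (a sketch, not a confirmed fix):

import AVFoundation

// Activate a record-capable session first; '!rec' suggests the session
// couldn't start recording, e.g. when activated while still in the background.
func prepareRecordingSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default)
    try session.setActive(true)
    // Only after this succeeds do we call AudioQueueNewInput(...).
}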
Has anyone been able to successfully use MusicCatalogSearchRequest in a playgrounds app?
I have configured my playground similarly to a regular app: an app ID with automatic music token generation turned on, and music access authorized within the app itself, but whenever I run a MusicCatalogSearchRequest I get an error thrown with .developerTokenRequestFailed.
Considering MusicKit is restricted in the simulator, it wouldn't surprise me if the same applies to Playgrounds, but it would be super helpful if I could prototype with MusicKit in Playgrounds 4!
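For reference, here's a minimal sketch of the failing request (the search term is arbitrary):

import MusicKit

// Runs fine on device in a regular app; throws .developerTokenRequestFailed in Playgrounds.
func searchCatalog() async throws {
    var request = MusicCatalogSearchRequest(term: "Daft Punk", types: [Song.self])
    request.limit = 5
    let response = try await request.response()
    print(response.songs)
}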
I'm using an AVAudioConverter object to decode an OPUS stream for VoIP. The decoding itself works well; however, whenever the stream stalls (no more audio packets are available to decode because of network instability), this can be heard as crackling or an abrupt stop in the decoded audio. The Opus C library can mitigate this: you indicate packet loss by passing a null data pointer to
int opus_decode_float (OpusDecoder * st, const unsigned char * data, opus_int32 len, float * pcm, int frame_size, int decode_fec), see https://opus-codec.org/docs/opus_api-1.2/group__opus__decoder.html#ga9c554b8c0214e24733a299fe53bb3bd2.
However, with AVAudioConverter using Swift I'm constructing an AVAudioCompressedBuffer like so:
let compressedBuffer = AVAudioCompressedBuffer(
    format: VoiceEncoder.Constants.networkFormat,
    packetCapacity: 1,
    maximumPacketSize: data.count
)
// Describe the single Opus packet we are about to copy in.
compressedBuffer.byteLength = UInt32(data.count)
compressedBuffer.packetCount = 1
compressedBuffer.packetDescriptions!.pointee.mDataByteSize = UInt32(data.count)
// Copy the raw Opus frame into the buffer's payload.
data.copyBytes(
    to: compressedBuffer.data.assumingMemoryBound(to: UInt8.self),
    count: data.count
)
where data: Data contains the raw OPUS frame to be decoded.
How can I specify data loss in this context and cause the AVAudioConverter to output PCM data whenever no more input data is available?
More context:
I'm specifying the audio format like this:
static let frameSize: UInt32 = 960
static let sampleRate: Float64 = 48000.0
static var networkFormatStreamDescription = AudioStreamBasicDescription(
    mSampleRate: sampleRate,
    mFormatID: kAudioFormatOpus,
    mFormatFlags: 0,
    mBytesPerPacket: 0,
    mFramesPerPacket: frameSize,
    mBytesPerFrame: 0,
    mChannelsPerFrame: 1,
    mBitsPerChannel: 0,
    mReserved: 0
)
static let networkFormat = AVAudioFormat(streamDescription: &networkFormatStreamDescription)!
I've tried (1) setting byteLength and packetCount to zero and (2) returning nil while still setting .haveData in the AVAudioConverterInputBlock I'm using, with no success.
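For completeness, here's a sketch of the input block I'm experimenting with; returning .noDataNow on a stall is an assumption on my part, not a confirmed way to trigger packet-loss concealment. nextOpusFrame() and makeCompressedBuffer(for:) are hypothetical helpers; the latter wraps the AVAudioCompressedBuffer construction shown above.

// Hypothetical stand-in for the network/jitter-buffer read.
func nextOpusFrame() -> Data? { return nil }

let inputBlock: AVAudioConverterInputBlock = { _, outStatus in
    if let data = nextOpusFrame() {
        outStatus.pointee = .haveData
        return makeCompressedBuffer(for: data) // built exactly as shown above
    } else {
        outStatus.pointee = .noDataNow // stream stalled; no packet right now
        return nil
    }
}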
Hi -
Of course I may be doing something wrong, but I'm getting exactly the opposite of what I would expect from
ApplicationMusicPlayer.shared.state.playbackStatus
It returns .playing when the music is paused and .paused when the music is playing.
Am I holding it wrong?
Thanks,
Daniel
The sample code in the Apple documentation for PHCloudIdentifier does not compile in Xcode 13.2.1.
Could the interface for identifier conversion be clarified so that the answer values are more accessible/readable? The values are 'hidden' inside a Result enum.
It was difficult (for me) to rewrite the sample code because I made the mistake of interpreting the Result type as a tuple; Result is really an enum.
Using the Result type as the return from library.cloudIdentifierMappings(forLocalIdentifiers:) and .localIdentifierMappings(for:) puts the actual mapped identifiers inside the enum, where they need additional access via a .stringValue message or an evaluation of an element of the Result enum.
For others finding the same compile issue, here is a working version of the sample code. This compiles in Xcode 13.2.1.
func localId2CloudId(localIdentifiers: [String]) -> [String] {
    var mappedIdentifiers = [String]()
    let library = PHPhotoLibrary.shared()
    let iCloudIDs = library.cloudIdentifierMappings(forLocalIdentifiers: localIdentifiers)
    for aCloudID in iCloudIDs {
        // The value is a Result enum, not a tuple.
        let cloudResult: Result<PHCloudIdentifier, Error> = aCloudID.value
        switch cloudResult {
        case .success(let cloudIdentifier):
            mappedIdentifiers.append(cloudIdentifier.stringValue)
        case .failure(let error):
            // Notify the user of the error.
            print("Failed to map local identifier \(aCloudID.key): \(error)")
        }
    }
    return mappedIdentifiers
}
func cloudId2LocalId(assetCloudIdentifiers: [PHCloudIdentifier]) -> [String] {
// patterned error handling per documentation
var localIDs = [String]()
let localIdentifiers: [PHCloudIdentifier: Result<String, Error>] = PHPhotoLibrary.shared().localIdentifierMappings(for: assetCloudIdentifiers)
for cloudIdentifier in assetCloudIdentifiers {
guard let identifierMapping = localIdentifiers[cloudIdentifier] else {
print("Failed to find a mapping for \(cloudIdentifier).")
continue
}
switch identifierMapping {
case .success(let success):
localIDs.append(success)
case .failure(let failure):
let thisError = failure as? PHPhotosError
switch thisError?.code {
case .identifierNotFound:
// Skip the missing or deleted assets.
print("Failed to find the local identifier for \(cloudIdentifier). \(String(describing: thisError?.localizedDescription)))")
case .multipleIdentifiersFound:
// Prompt the user to resolve the cloud identifier that matched multiple assets.
print("Found multiple local identifiers for \(cloudIdentifier). \(String(describing: thisError?.localizedDescription))")
// if let selectedLocalIdentifier = promptUserForPotentialReplacement(with: thisError.userInfo[PHLocalIdentifiersErrorKey]) {
//     localIDs.append(selectedLocalIdentifier)
// }
default:
print("Encountered an unexpected error looking up the local identifier for \(cloudIdentifier). \(String(describing: thisError?.localizedDescription))")
}
}
}
return localIDs
}
Just wondering if anyone else is having issues with currentPlaybackRate in release version of iOS 15.4? In my particular case this is using MPMusicPlayerController.applicationQueuePlayer.
I've always had issues controlling this property reliably but from what I can see it is now completely non-operational in 15.4.
I've isolated this behavior in a trivial project, and will file a radar, but hoping others may have some insight first.
FWIW- This is my trivial test case:
class ViewController: UIViewController {
    lazy var player: MPMusicPlayerApplicationController = {
        let player = MPMusicPlayerController.applicationQueuePlayer
        player.repeatMode = .none
        player.shuffleMode = .off
        player.beginGeneratingPlaybackNotifications()
        return player
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        NotificationCenter.default.addObserver(forName: .MPMusicPlayerControllerPlaybackStateDidChange, object: nil, queue: .main) { [weak self] notification in
            guard let notificationPlayer = notification.object as? MPMusicPlayerApplicationController,
                  notificationPlayer === self?.player else {
                return
            }
            debugPrint("Player state now: \(notificationPlayer.playbackState)")
        }
    }

    @IBAction func goAction(_ sender: Any) {
        guard let item = MPMediaQuery.songs().items?.randomElement() else {
            debugPrint("Unable to access media items")
            return
        }
        debugPrint("Now playing item: \(item.title ?? "")")
        player.setQueue(with: [item.playbackStoreID])
        player.prepareToPlay() { error in
            guard error == nil else {
                debugPrint("Player error: \(error!.localizedDescription)")
                return
            }
            DispatchQueue.main.async { [weak self] in
                self?.player.play()
            }
        }
    }

    @IBAction func slowAction(_ sender: Any) {
        debugPrint("Setting currentPlaybackRate to 0.5")
        player.currentPlaybackRate = 0.5
        checkPlaybackRate()
    }

    @IBAction func fastAction(_ sender: Any) {
        debugPrint("Setting currentPlaybackRate to 1.5")
        player.currentPlaybackRate = 1.5
        checkPlaybackRate()
    }

    func checkPlaybackRate(afterSeconds delay: TimeInterval = 1.0) {
        DispatchQueue.main.asyncAfter(deadline: .now() + delay) {
            debugPrint("After \(delay) seconds currentPlaybackRate now: \(self.player.currentPlaybackRate)")
        }
    }
}
Typical console output:
"Now playing item: I Know You Know"
"Player state now: MPMusicPlaybackState(rawValue: 2)"
"Player state now: MPMusicPlaybackState(rawValue: 1)"
"Setting currentPlaybackRate to 1.5"
"After 1.0 seconds currentPlaybackRate now: 1.0"
"Setting currentPlaybackRate to 0.5"
"After 1.0 seconds currentPlaybackRate now: 1.0"
When playing several short HLS clips using AVPlayer connected to a TV via Apple's Lightning-to-HDMI adapter (A1438), playback often fails with these unknown errors:
CoreMediaErrorDomain -12034
and
CoreMediaErrorDomain -12158
Does anyone have a clue what these errors mean?
Environment:
iPhone 8
iOS 15.4
Lightning-to-HDMI adapter (A1438)
I'm very excited about the new MusicLibrary API, but after a couple of days of playing around with it, I have to say that I find the implementation of filtering MusicLibraryRequests a little confusing. MPMediaQuery has a fairly extensive list of predicates that can be applied, including string and persistentID comparisons for artist, album artist, genre, and more. It also lets you filter on an item’s title. MusicLibraryRequests let you filter on the item’s ID, or on its MusicKit Artist and Genre relationships. To me, this seems like it adds an extra step.
With an MPMediaQuery, if I wanted to fetch every album by a given artist, I’d apply an MPMediaPropertyPredicate looking at MPMediaItemPropertyAlbumArtist and compare the string. It was also easy to change the MPMediaPredicateComparison to .contains to match more widely. If I wanted to surface albums by “Aesop Rock” or “Aesop Rock & Blockhead,” I could use that.
In the MusicLibraryRequest implementation, it looks like I need to perform a MusicLibraryRequest<Artist> first in order to get the Artist objects. There’s no filter for the name property, so if I don’t have their IDs, I’ve got to use filter(text:). From there, I can take the results of that request and apply them to my MusicLibraryRequest<Album> using the filter(matching:memberOf) function.
I could use filter(text:) on the MusicLibraryRequest<Album>, but that filters across multiple properties (title and artistName?) and is less precise than defining the actual property I want to match against.
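For illustration, the two-step flow I described looks roughly like this (a sketch; the exact key path accepted by filter(matching:memberOf:) is my assumption):

import MusicKit

func albums(byArtistNamed name: String) async throws -> MusicItemCollection<Album> {
    // Step 1: find the Artist objects, since name isn't directly filterable.
    var artistRequest = MusicLibraryRequest<Artist>()
    artistRequest.filter(text: name) // matches across several text properties
    let artists = try await artistRequest.response().items

    // Step 2: fetch albums whose artists relationship contains those artists.
    var albumRequest = MusicLibraryRequest<Album>()
    albumRequest.filter(matching: \.artists, memberOf: Array(artists)) // assumed key path
    return try await albumRequest.response().items
}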
I think my ideal version of the MusicLibraryRequest API would offer something like filter(matching:equalTo:) or filter(matching:contains:) that worked off of KeyPaths rather than relationships. That seems more intuitive to me. I’m not saying we need every property from every filterable MPMediaItemProperty key, but I’d love to be able to do it on title, artistName, and other common metadata. That might look something like:
filter(matching: \.title, contains: “Abbey Road”)
filter(matching: \.artistName, equalTo: “Between The Buried And Me”)
I noticed that filter(text:) is case insensitive, which is awesome, and something I’ve wanted for a long time in MPMediaPropertyPredicate. As a bonus, it would be great if a KeyPath based filter API supported a case sensitivity flag. This is less of a problem when dealing with Apple Music catalog content, but users’ libraries are a harsh environment, and you might have an artist “Between The Buried And Me” and one called “Between the Buried and Me.” It would be great to get albums from both with something like:
filter(matching: \.artistName, equalTo: “Between The Buried And Me”, caseSensitive: false)
I've submitted the above as FB10185685. I also submitted another feedback this morning regarding filter(text:) and repeating text as FB10184823.
My last wishlist item for this API (for the time being!) is exposing the MPMediaItemPropertyAlbumPersistentID as an available filter attribute. I know, I know… hear me out. If you take a look at the other thread I made today, you’ll see that due to missing metadata in MusicKit, I still have some use cases where I need to be able to reference an MPMediaItem and might need to fetch its containing MPMediaItemCollection to get at other tracks on the album. It would be nice to seamlessly be able to fetch the MPMediaItemCollection or the library Album using a shared identifier, especially when it comes to being able to play the album in MusicKit’s player rather than Media Player’s.
I've submitted that last bit as FB10185789.
Thanks for bearing with my walls of text today. Keep up the great work!
I've just begun to dip my toes into the iOS16 waters.
One of the first things that I've attempted is to edit a library playlist using:
try await MusicLibrary.shared.edit(targetPlaylist, items: tracksToAdd)
Where targetPlaylist is of type MusicItemCollection<MusicKit.Playlist>.Element and tracksToAdd is of type [Track]
The targetPlaylist was created, using new iOS16 way, here:
let newPlaylist = try await MusicLibrary.shared.createPlaylist(name: name, description: description)
tracksToAdd is derived by performing a MusicLibraryRequest on a specific playlist ID, and then doing something like this:
if let tracksToAdd = try await playlist.with(.tracks).tracks {
// add tracks to target playlist
}
My problem is that when I attempt the edit, I am faced with a rather sad-looking crash.
libdispatch.dylib`dispatch_group_leave.cold.1:
0x10b43d62c <+0>: mov x8, #0x0
0x10b43d630 <+4>: stp x20, x21, [sp, #-0x10]!
0x10b43d634 <+8>: adrp x20, 6
0x10b43d638 <+12>: add x20, x20, #0xfbf ; "BUG IN CLIENT OF LIBDISPATCH: Unbalanced call to dispatch_group_leave()"
0x10b43d63c <+16>: adrp x21, 40
0x10b43d640 <+20>: add x21, x21, #0x260 ; gCRAnnotations
0x10b43d644 <+24>: str x20, [x21, #0x8]
0x10b43d648 <+28>: str x8, [x21, #0x38]
0x10b43d64c <+32>: ldp x20, x21, [sp], #0x10
-> 0x10b43d650 <+36>: brk #0x1
I assume that I must be doing something wrong, but I frankly have no idea how to troubleshoot this.
Any help would be most appreciated. Thanks. @david-apple?
Hello,
I'm new to the Swift MusicKit API and am starting with the implementation in iOS 16.
I'm getting stuck on an issue where there is no background or text color associated with the Artwork object. Is this something you have to make an additional property request for, and if so, how do you do that?
var catalogSearch = MusicCatalogResourceRequest<Album>(matching: \.id, equalTo: item.id)
let catalogResponse = try await catalogSearch.response()
guard let firstItem = catalogResponse.items.first else {
    return
}
In this example, firstItem.artwork only contains the url and what look like incorrect max width/height values.
here's a printout of firstItem.artwork
Optional(Artwork(
urlFormat: "musicKit://artwork/library/5F37858D-F46B-4F12-BA67-40FA8DD63D87/{w}x{h}?at=item&fat=&id=7718670444435992305&lid=5F37858D-F46B-4F12-BA67-40FA8DD63D87&mt=music&aat=Music122/v4/37/25/f5/3725f515-249f-7b91-77bb-f479cd48201c/22UMGIM32254.rgb.jpg",
maximumWidth: 0,
maximumHeight: 0
))
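For reference, this is how I'm reading the artwork fields (a sketch using the url(width:height:) and backgroundColor accessors; the 300-point size is arbitrary):

if let artwork = firstItem.artwork {
    // Both of these come back empty/zero for the item above.
    let thumbnailURL = artwork.url(width: 300, height: 300)
    print(thumbnailURL?.absoluteString ?? "no url")
    print(artwork.backgroundColor as Any) // nil here, even after the catalog fetch
}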
Hello!
I am having trouble setting start times for songs when using the ApplicationMusicPlayer.
When I initialize a new MusicPlayer.Queue.Entry using the following constructor, I am seeing strange results:
init(
_ playableMusicItem: PlayableMusicItem,
startTime: TimeInterval? = nil,
endTime: TimeInterval? = nil
)
It appears that any value I provide for startTime is also applied to the endTime. For example:
MusicPlayer.Queue.Entry(playable, startTime: TimeInterval(30), endTime: TimeInterval(183))
provides the following console output:
MusicPlayer.Queue.Entry(id: "3D6A3DA3-595E-4657-8DBA-DDD245BBB7EF", transientItem: PlayableMusicItem, startTime: 30.0, endTime: 30.0)
I have also tried setting the endTime to nil with the same result. Does anyone have any experience setting start times for songs using the MusicKit ApplicationMusicPlayer?
Any feedback is greatly appreciated!
Hi all,
Apple dropping ongoing development for FireWire devices that were supported with the Core Audio driver standard is a catastrophe for a lot of struggling musicians, who need to both keep up to date on the security updates that come with new OS releases and continue to utilise their hard-earned investments in very expensive and still-pristine audio devices that have been reduced to e-waste by Apple's seemingly tone-deaf ignorance of the cries for ongoing support.
I have one of said audio devices, and I'd like to keep using it while keeping my 2019 Intel Mac Book Pro up to date with the latest security updates and OS features.
Probably not the first time you gurus have had someone make the logical leap leading to a request like this, but I was wondering if it might somehow be possible to shoehorn the code that allowed previous versions of macOS to speak with the audio features of such devices into the Ventura version of the OS.
Would it be possible? Would it involve a lot of work? I don't think I'd be the only person willing to pay for a third-party application or utility that restored this functionality.
There have to be hundreds of thousands of people who would be happy to spare some cash to stop their multi-thousand-dollar investment in gear from being so thoughtlessly consigned to the scrap heap.
Any comments or layman-friendly explanations as to why this couldn’t happen would be gratefully received!
Thanks,
em
The two ScreenCaptureKit WWDC22 sessions show how to capture with the new framework but the retina factor is hardcoded to 2 in SCStreamConfiguration.
When used on a non-Retina display, the screen capture floats in the upper-left corner of the image buffer.
There does not seem to be a simple way to retrieve the retina factor from the SCShareableContent data (when configuring the capture).
When processing the streaming output, the SCStreamFrameInfo attachment is supposed to have a scaleFactor property, but .scaleFactor does not return a value.
I have found that the attachment dictionary contains SCStreamUpdateFrameDisplayResolution. This entry gives me the retina factor, but it is not an official SCStreamFrameInfo key; I had to list the dictionary's keys to find it.
What is the proper way to handle retina factors with ScreenCaptureKit?
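In the meantime, this is the workaround sketch I'm using; it assumes the display's backingScaleFactor is the factor we want, which may not be the official answer:

import ScreenCaptureKit
import AppKit

// Match the SCDisplay to an NSScreen to read its backing scale factor,
// then size the capture in pixels instead of hardcoding a factor of 2.
func makeConfiguration(for display: SCDisplay) -> SCStreamConfiguration {
    let screen = NSScreen.screens.first {
        ($0.deviceDescription[NSDeviceDescriptionKey("NSScreenNumber")] as? CGDirectDisplayID) == display.displayID
    }
    let scale = screen?.backingScaleFactor ?? 1.0

    let config = SCStreamConfiguration()
    config.width = Int(CGFloat(display.width) * scale)  // SCDisplay reports points
    config.height = Int(CGFloat(display.height) * scale)
    return config
}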
I added an RPSystemBroadcastPickerView to the app, but after tapping it, no SampleHandler method is triggered.
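For reference, this is roughly how the picker is added (a sketch; the extension bundle identifier is hypothetical):

import ReplayKit

// Inside a view controller: preferredExtension must match the broadcast
// upload extension's bundle identifier, or the picker silently does nothing.
let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
picker.preferredExtension = "com.example.app.BroadcastExtension" // hypothetical ID
picker.showsMicrophoneButton = false
view.addSubview(picker)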
We observed that the PHPicker is unable to load RAW images captured on an iPhone in some scenarios. And it is also somehow related to iCloud.
Here is the setup:
The PHPickerViewController is configured with preferredAssetRepresentationMode = .current to avoid transcoding.
The image is loaded from the item provider like this:
if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeImage as String) {
    itemProvider.loadFileRepresentation(forTypeIdentifier: kUTTypeImage as String) { url, error in
        // work
    }
}
This usually works, also for RAW images. However, when trying to load a RAW image that has just been captured with the iPhone, the loading fails with the following errors on the console:
[claims] 43A5D3B2-84CD-488D-B9E4-19F9ED5F39EB grantAccessClaim reply is an error: Error Domain=NSCocoaErrorDomain Code=4097 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x2804a8e70 {Error Domain=NSCocoaErrorDomain Code=4097 "connection from pid 19420 on anonymousListener or serviceListener" UserInfo={NSDebugDescription=connection from pid 19420 on anonymousListener or serviceListener}}}
Error copying file type public.image. Error: Error Domain=NSItemProviderErrorDomain Code=-1000 "Cannot load representation of type public.image" UserInfo={NSLocalizedDescription=Cannot load representation of type public.image, NSUnderlyingError=0x280480540 {Error Domain=NSCocoaErrorDomain Code=4097 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x2804a8e70 {Error Domain=NSCocoaErrorDomain Code=4097 "connection from pid 19420 on anonymousListener or serviceListener" UserInfo={NSDebugDescription=connection from pid 19420 on anonymousListener or serviceListener}}}}}
We observed that on some devices, loading the image will actually work after a short time (~30 sec), but on others it will always fail.
We think it is related to iCloud Photos: On the device that has iCloud Photos sync enabled, the picker is able to load the image right after it was synced to the cloud. On devices that don't sync the image, loading always fails. It seems that the sync process is doing some processing (?) of the image that will later enable the picker to load it successfully, but that's just guessing.
Additional observations:
This seems to only occur for images that were taken with the stock Camera app. When using Halide to capture RAW (either ProRAW or RAW), the Picker is able to load the image.
When trying to load the image as kUTTypeRawImage instead of kUTTypeImage, it also fails.
The picker also can't load RAW images that were AirDroped from another device, unless it synced to iCloud first.
This is reproducible using the Selecting Photos and Videos in iOS sample code project.
We observed this happening in other apps that use the PHPicker, not just ours.
Is this a bug, or is there something that we are missing?
I have a custom app running on a Mac Studio with Ventura that grabs a snapshot image from a network camera. It then adds some extra information into the EXIF MakerNote field. However, the metadata cannot be read back out of the image when running Ventura; it can be read out of the same image file on a Mac that is not running Ventura.
It would appear Apple has removed support for reading MakerNote in Ventura but still supports writing MakerNote in Ventura.
This code is about 7 years old and written in ObjC and has worked with no issue until Ventura came along.
Calls used
CGImageDestinationAddImageFromSource(); // used to write the image to disk with the extra metadata - Works on Ventura
CGImageSourceCopyPropertiesAtIndex(); // used to read the metadata from an image - does not return MakerNote data
Is there a new way to read EXIF MakerNote data from image files that was introduced with Ventura?
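For anyone wanting to reproduce, here's a minimal sketch of the read side in Swift (the file path is hypothetical):

import Foundation
import ImageIO

// Dump whatever metadata ImageIO returns; on pre-Ventura systems the EXIF
// dictionary contained the MakerNote entry, on Ventura it is reportedly absent.
let url = URL(fileURLWithPath: "/tmp/snapshot.jpg") as CFURL
if let source = CGImageSourceCreateWithURL(url, nil),
   let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any] {
    let exif = properties[kCGImagePropertyExifDictionary] as? [CFString: Any]
    print(exif ?? "no EXIF dictionary")
}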
I cannot find any documentation re: isPrivacySensitiveAlbum. I've granted my app access to all photos. Not sure what else to try.
Code that triggers the crash:
let options = PHFetchOptions()
options.fetchLimit = 1
let assetColl = PHAssetCollection.fetchAssetCollections(withLocalIdentifiers: [localId], options: options)
if assetColl.count > 0 {
if let asset = PHAsset.fetchKeyAssets(in: assetColl.firstObject!, options: options) { ... }
Stack trace from here on:
2023-04-15 06:34:41.628537-0700 DPF[33615:6484880] -[PHCollectionList isPrivacySensitiveAlbum]: unrecognized selector sent to instance 0x7ff09232aec0
2023-04-15 06:34:41.632378-0700 DPF[33615:6484880] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[PHCollectionList isPrivacySensitiveAlbum]: unrecognized selector sent to instance 0x7ff09232aec0'
*** First throw call stack:
(
0 CoreFoundation 0x00007ff80045478b __exceptionPreprocess + 242
1 libobjc.A.dylib 0x00007ff80004db73 objc_exception_throw + 48
2 CoreFoundation 0x00007ff8004638c4 +[NSObject(NSObject) instanceMethodSignatureForSelector:] + 0
3 CoreFoundation 0x00007ff800458c66 ___forwarding___ + 1443
4 CoreFoundation 0x00007ff80045ae08 _CF_forwarding_prep_0 + 120
5 Photos 0x00007ff80b8480e1 +[PHAsset fetchKeyAssetsInAssetCollection:options:] + 86
6 DPF 0x0000000100791029 $s3DPF16AlbumListFetcherV22loadKeyImageForLocalIdySo7UIImageCSgSSYaFTY0_ + 569