Hey there, I'm trying to display all of a user's albums using the MediaPlayer library. Many albums return nil artwork, but I know the artwork exists because it shows up in the default Music app. There doesn't seem to be much rhyme or reason to what shows up and what doesn't: all downloaded albums display artwork, and some cloud albums display artwork as well, but others don't. Here's the code I'm using to debug this.
// `albums` is declared here so the snippet is self-contained; in the app it is a stored property.
var albums = [MPMediaItemCollection]()

let query = MPMediaQuery.albums()
if let albumCollections = query.collections {
    albums = albumCollections
}

for album in albums {
    let artwork = album.representativeItem?.artwork
    print(artwork, artwork?.image(at: CGSize(width: 100, height: 100)))
}
Any help would be greatly appreciated. Thanks!
I am trying to use the speech synthesizer to speak the pronunciation of a word in British English, rather than playing a local audio file as I did before. However, I keep getting this in the debugger:
#FactoryInstall Unable to query results, error: 5 Unable to list voice folder Unable to list voice folder Unable to list voice folder IPCAUClient.cpp:129 IPCAUClient: bundle display name is nil Unable to list voice folder
Here is my code. Any suggestions?
func playSampleAudio() {
    let speechSynthesizer = AVSpeechSynthesizer()
    let speechUtterance = AVSpeechUtterance(string: currentWord)

    // Search for a voice with a British English accent.
    let voices = AVSpeechSynthesisVoice.speechVoices()
    var foundBritishVoice = false
    for voice in voices {
        if voice.language == "en-GB" {
            speechUtterance.voice = voice
            foundBritishVoice = true
            break
        }
    }
    if !foundBritishVoice {
        print("British English voice not found. Using default voice.")
    }

    // Configure the utterance's properties as needed.
    speechUtterance.rate = AVSpeechUtteranceDefaultSpeechRate
    speechUtterance.pitchMultiplier = 1.0
    speechUtterance.volume = 1.0

    // Speak the word.
    speechSynthesizer.speak(speechUtterance)
}
Running in a Mac (Catalyst) target or Apple Silicon (designed for iPad).
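One possible factor worth ruling out (an assumption on my part, not something the log confirms): the synthesizer above is a local constant, so it can be deallocated as soon as playSampleAudio() returns, which may cut speech off. A minimal sketch that keeps the synthesizer alive and asks for an en-GB voice directly:

import AVFoundation

final class WordSpeaker {
    // Keep a strong reference so the synthesizer isn't deallocated mid-utterance.
    private let speechSynthesizer = AVSpeechSynthesizer()

    func speak(_ word: String) {
        let utterance = AVSpeechUtterance(string: word)
        // AVSpeechSynthesisVoice(language:) returns a British English voice if one
        // is installed, or nil, in which case the system default voice is used.
        utterance.voice = AVSpeechSynthesisVoice(language: "en-GB")
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        speechSynthesizer.speak(utterance)
    }
}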
Just accessing the playbackStoreID from the MPMediaItem shows this error in the console:
-[ITMediaItem valueForMPMediaEntityProperty:]: Unhandled MPMediaEntityProperty subscriptionStoreItemAdamID.
The value returned is always “”.
This works as expected on iOS and iPadOS, returning a valid playbackStoreID.
import SwiftUI
import MediaPlayer

@main
struct PSIDDemoApp: App {
    var body: some Scene {
        WindowGroup {
            Text("playbackStoreID demo")
                .task {
                    let authResult = await MPMediaLibrary.requestAuthorization()
                    if authResult == .authorized {
                        if let item = MPMediaQuery.songs().items?.first {
                            let persistentID = item.persistentID
                            let playbackStoreID = item.playbackStoreID // <--- Here
                            print("Item \(persistentID), \(playbackStoreID)")
                        }
                    }
                }
        }
    }
}
Xcode 15.1, also tested with Xcode 15.3 beta 2.
macOS Sonoma 14.3.1
FB13607631
Hello, we are embedding a PHPickerViewController with UIKit in our app using compact mode (adding the view controller as a child, embedding its view, and calling didMove(toParent:)). We are disabling the following capabilities: .collectionNavigation, .selectionActions, and .search.
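For reference, the embedding looks roughly like this (a sketch with illustrative names; only the configuration values are taken from the description above):

import PhotosUI
import UIKit

@available(iOS 17.0, *)
func embedCompactPicker(in parent: UIViewController) {
    var configuration = PHPickerConfiguration()
    configuration.filter = .images
    configuration.mode = .compact
    configuration.disabledCapabilities = [.collectionNavigation, .selectionActions, .search]

    let picker = PHPickerViewController(configuration: configuration)
    // Standard child view controller containment.
    parent.addChild(picker)
    picker.view.frame = parent.view.bounds
    parent.view.addSubview(picker.view)
    picker.didMove(toParent: parent)
}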
One of our users, on iOS 17.2.1 with an iPhone 12, encountered a crash with the following stack trace:
Crashed: com.apple.main-thread
0 libsystem_kernel.dylib 0x9fbc __pthread_kill + 8
1 libsystem_pthread.dylib 0x5680 pthread_kill + 268
2 libsystem_c.dylib 0x75b90 abort + 180
3 PhotoFoundation 0x33b0 -[PFAssertionPolicyCrashReport notifyAssertion:] + 66
4 PhotoFoundation 0x3198 -[PFAssertionPolicyComposite notifyAssertion:] + 160
5 PhotoFoundation 0x374c -[PFAssertionPolicyUnique notifyAssertion:] + 176
6 PhotoFoundation 0x2924 -[PFAssertionHandler handleFailureInFunction:file:lineNumber:description:arguments:] + 140
7 PhotoFoundation 0x3da4 _PFAssertFailHandler + 148
8 PhotosUI 0x22050 -[PHPickerViewController _handleRemoteViewControllerConnection:extension:extensionRequestIdentifier:error:completionHandler:] + 1356
9 PhotosUI 0x22b74 __66-[PHPickerViewController _setupExtension:error:completionHandler:]_block_invoke_3 + 52
10 libdispatch.dylib 0x26a8 _dispatch_call_block_and_release + 32
11 libdispatch.dylib 0x4300 _dispatch_client_callout + 20
12 libdispatch.dylib 0x12998 _dispatch_main_queue_drain + 984
13 libdispatch.dylib 0x125b0 _dispatch_main_queue_callback_4CF + 44
14 CoreFoundation 0x3701c __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 16
15 CoreFoundation 0x33d28 __CFRunLoopRun + 1996
16 CoreFoundation 0x33478 CFRunLoopRunSpecific + 608
17 GraphicsServices 0x34f8 GSEventRunModal + 164
18 UIKitCore 0x22c62c -[UIApplication _run] + 888
19 UIKitCore 0x22bc68 UIApplicationMain + 340
20 WorkAngel 0x8060 main + 20 (main.m:20)
21 ??? 0x1bd62adcc (Missing)
Please share if you have any ideas as to what might have caused this, or what to look at in such a case. Unfortunately, I haven't been able to reproduce it myself.
Hello everyone,
I was playing a livestream when I received the error -16831 / "START-TIME is too close to live" from an AVPlayerItemNewErrorLogEntry notification. I don't know why this error is returned. Can you explain the reason for it?
When I play a livestream I get the error -12888, "Playlist File unchanged for longer than 1.5 * target duration". I also read about error -12888 on page 170 of this document: https://docs.huihoo.com/apple/wwdc/2018/502_measuring_and_optimizing_hls_performance.pdf, but I still don't understand the cause. Please explain the reason for this error.
I am developing an app that uses a data cable to link a camera. When I enter the page for the first time, I can detect the camera device, but when I exit the page and enter it again, I can no longer detect the connected camera.
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    self.view.backgroundColor = [UIColor whiteColor];
    [self addImageCaptureCore];
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1.0 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
        [self checkCameraConnection];
    });
}

- (void)checkCameraConnection {
    if (@available(iOS 13.0, *)) {
        NSArray<ICDevice *> *connectedDevices = self.browser.devices;
        if (connectedDevices.count > 0) {
            NSLog(@"Camera is connected");
        } else {
            NSLog(@"Camera is not connected");
        }
    } else {
        // Fallback on earlier versions
    }
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    if (@available(iOS 13.0, *)) {
        if (self.cameraDevice) {
            if (self.cameraDevice.hasOpenSession) {
                [self.cameraDevice requestCloseSession];
                dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.5 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
                    [self.browser stop];
                    self.browser.delegate = nil;
                    self.browser = nil;
                });
            } else {
                dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.5 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
                    [self.browser stop];
                    self.browser.delegate = nil;
                    self.browser = nil;
                });
            }
        }
    } else {
        // Fallback on earlier versions
    }
}

- (void)addImageCaptureCore {
    if (@available(iOS 13.0, *)) {
        ICDeviceBrowser *browser = [[ICDeviceBrowser alloc] init];
        browser.delegate = self;
        [browser start];
        self.browser = browser;
    } else {
        // Fallback on earlier versions
    }
}

#pragma mark - ICDeviceBrowserDelegate

- (void)deviceBrowser:(ICDeviceBrowser *)browser didAddDevice:(ICDevice *)device moreComing:(BOOL)moreComing API_AVAILABLE(ios(13.0)) {
    NSLog(@"Device name = %@", device.name);
    if ([device isKindOfClass:[ICCameraDevice class]]) {
        if ([device.capabilities containsObject:ICCameraDeviceCanAcceptPTPCommands]) {
            ICCameraDevice *cameraDevice = (ICCameraDevice *)device;
            cameraDevice.delegate = self;
            [cameraDevice requestOpenSession];
            self.cameraDevice = cameraDevice;
        }
    }
}

- (void)deviceBrowser:(ICDeviceBrowser *)browser didRemoveDevice:(ICDevice *)device moreGoing:(BOOL)moreGoing API_AVAILABLE(ios(13.0)) {
    if (self.cameraDevice) {
        if (self.cameraDevice.hasOpenSession) {
            [self.cameraDevice requestCloseSession];
            self.cameraDevice.delegate = nil;
            self.cameraDevice = nil;
        } else {
            self.cameraDevice.delegate = nil;
            self.cameraDevice = nil;
        }
    }
}

#pragma mark - ICCameraDeviceDelegate

- (void)cameraDevice:(ICCameraDevice *)camera didAddItems:(NSArray<ICCameraItem *> *)items API_AVAILABLE(ios(13.0)) {
    if (items.count > 0) {
        ICCameraItem *latestItem = items.lastObject;
        NSLog(@"name = %@", latestItem.name);
    }
}

#pragma mark - ICDeviceDelegate

- (void)device:(ICDevice *)device didOpenSessionWithError:(NSError * _Nullable)error API_AVAILABLE(ios(13.0)) {
    if (error) {
        NSLog(@"Failed to open session %@", error.localizedDescription);
    } else {
        NSLog(@"open session success");
    }
}

- (void)device:(ICDevice *)device didCloseSessionWithError:(NSError * _Nullable)error API_AVAILABLE(ios(13.0)) {
    if (error) {
        NSLog(@"close session error = %@", error.localizedDescription);
    } else {
        NSLog(@"didCloseSession");
    }
}

- (void)didRemoveDevice:(ICDevice *)device {
}
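For comparison, here is a Swift sketch of an alternative lifecycle (an assumption about the cause, not a confirmed fix): since viewWillDisappear tears the browser down but addImageCaptureCore only runs from viewDidLoad, recreating the browser every time the view appears keeps detection working on re-entry.

import ImageCaptureCore
import UIKit

@available(iOS 13.0, *)
class CameraViewController: UIViewController, ICDeviceBrowserDelegate {
    private var browser: ICDeviceBrowser?

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Recreate the browser on every appearance, mirroring the teardown below.
        let browser = ICDeviceBrowser()
        browser.delegate = self
        browser.start()
        self.browser = browser
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        browser?.stop()
        browser?.delegate = nil
        browser = nil
    }

    // MARK: - ICDeviceBrowserDelegate
    func deviceBrowser(_ browser: ICDeviceBrowser, didAdd device: ICDevice, moreComing: Bool) {
        print("Device name = \(device.name ?? "unknown")")
    }

    func deviceBrowser(_ browser: ICDeviceBrowser, didRemove device: ICDevice, moreGoing: Bool) {
    }
}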
I often find that doing basic actions in MusicKit is incredibly slow compared to Apple's Music app. I've tried different OS versions, devices, networks, and Apple's sample code throughout the last several years, and it is always the same. Does anyone else have this issue?
Hi everyone! Are there any plans or existing alternatives to include the date a track was added to a playlist in Apple Music's API[1]? This functionality exists on Spotify[2] (their "added_at" attribute), and it would be helpful for ordering tracks retrieved from playlists. Thanks in advance for any help!
[1]https://developer.apple.com/documentation/applemusicapi/get_a_catalog_playlist_s_relationship_directly_by_name
[2]https://developer.spotify.com/documentation/web-api/reference/get-playlists-tracks
Under Sonoma 14.4 the compression option doesn't work with PNG images; it works for JPG/HEIF. Preview can export a PNG file to HEIC with a compression option, so what am I missing? This worked previously. I am trying 0.01 and 0.9 as the compression quality, and the resulting file size is the same for a PNG source.
Is Preview using some trick to convert the image, such as ciContext.createCGImage? (See the sketch after the code below.)
PS: A compression option of 1.0 was broken under the 14.4 RC, and Preview created an empty file.
func heifImageDataUsingDestination(at url: URL, compressionQuality: CGFloat) -> Data? {
    guard let imageSource = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }
    guard let cgImage = CGImageSourceCreateImageAtIndex(imageSource, 0, nil) else { return nil }

    let mutableData = NSMutableData()
    guard let imageDestination = CGImageDestinationCreateWithData(mutableData, "public.heic" as CFString, 1, nil) else { return nil }

    let options = [kCGImageDestinationLossyCompressionQuality: compressionQuality] as CFDictionary
    CGImageDestinationAddImage(imageDestination, cgImage, options)
    let success = CGImageDestinationFinalize(imageDestination)
    if success {
        return mutableData as Data
    }
    return nil
}

func heifImageDataUsingCIContext(at url: URL, compressionQuality: CGFloat) -> Data? {
    guard let ciImage = CIImage(contentsOf: url) else { return nil }
    let context = CIContext()
    let colorspace = ciImage.colorSpace ?? CGColorSpaceCreateDeviceRGB()
    let options = [CIImageRepresentationOption(rawValue: kCGImageDestinationLossyCompressionQuality as String): compressionQuality]
    return context.heifRepresentation(of: ciImage, format: .RGBA8, colorSpace: colorspace, options: options)
}
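A variant of the CIContext path worth trying, based on the Preview speculation above (a sketch, not a confirmed workaround): render the CIImage to a CGImage first with createCGImage and hand that to an image destination, to see whether the compression option is honoured on the re-rendered image.

func heifImageDataViaCGImage(at url: URL, compressionQuality: CGFloat) -> Data? {
    guard let ciImage = CIImage(contentsOf: url) else { return nil }
    let context = CIContext()
    // Re-render through Core Image before encoding to HEIC.
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }

    let mutableData = NSMutableData()
    guard let destination = CGImageDestinationCreateWithData(mutableData, "public.heic" as CFString, 1, nil) else { return nil }
    let options = [kCGImageDestinationLossyCompressionQuality: compressionQuality] as CFDictionary
    CGImageDestinationAddImage(destination, cgImage, options)
    return CGImageDestinationFinalize(destination) ? mutableData as Data : nil
}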
I'm working on an application that aims to deliver a DJ-like experience by overlapping songs and implementing volume fading. During development, I've encountered a roadblock due to the MusicPlayerController singleton behavior in iOS, which seems to only support playback of one audio stream at a time and doesn't support overlapping of songs or fading volume between tracks.
I understand that Apple Music content is protected and that playback through Apple MusicKit must respect the DRM and licensing agreements. However, I've noticed an application called "Mixonset" that seems to be able to stream songs from Apple Music, use the music player, and create an overlapping effect of songs for users.
Could anyone share insights on how Mixonset might be achieving this within the constraints of the Apple MusicKit? Is there any approach that I could explore to implement similar functionality, such as overlapping songs or crossfading while adhering to Apple's guidelines and without violating any terms of service?
Any advice or direction towards documentation and API capabilities that could support these features would be greatly appreciated.
Thank you for your assistance.
Hello everyone, my name is Joshua Osagie. For 2 months now I have been trying to build my own music application, but I can't, because I assumed I would get an API from Apple Music or Spotify granting access to over 100 million songs. Even for a paid API, my research suggests this is essentially impossible. So please, if anyone has an idea of what I can do to bring this application to life, or of how to get access to millions of songs in my app, I would really appreciate it.
After thoroughly scouring the internet numerous times, I came to the realization that Apple has not documented their User Authentication flow. Instead, developers are often directed to use their JavaScript solution, which proves to be insufficient and impractical for many projects.
Could we please request a comprehensive documentation outlining the process of generating a Music User Token from the Developer Token? This would greatly benefit developers seeking to integrate Apple Music functionality into their projects.
I was about to finally sign up for the Apple Music affiliate program, since one of my apps provides the MusicKit subscription offer. However, it looks like the program was renamed and now restricts usage to artists and record labels. (https://partners.applemediaservices.com)
Is this correct? Can I, as an indie developer, not earn commission on Apple Music subscriptions that I drive from my apps?
I'm trying to find a library item by title and artist, but the request returns 0 items. Below is an example for an existing track in my library.
With the title filter commented out, it successfully gives me all library items for that artistName.
If I add the title filter, or use only the title filter, I get 0 items. Why is that?
var request = MusicLibraryRequest<MusicKit.Track>()
// request.filter(matching: \.title, equalTo: "Crises (Remastered 2013)")
request.filter(matching: \.artistName, equalTo: "Mike Oldfield")
let response = try await request.response()
I can find the track by requesting by artist and filtering the returned tracks, but I feel this might not be an ideal approach if I have a bunch of tracks to find, possibly by different artists.
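For example, the in-memory workaround looks roughly like this (a sketch; the helper name is illustrative):

import MusicKit

// Request by artist (which works), then match the title in memory.
func findLibraryTrack(title: String, artistName: String) async throws -> MusicKit.Track? {
    var request = MusicLibraryRequest<MusicKit.Track>()
    request.filter(matching: \.artistName, equalTo: artistName)
    let response = try await request.response()
    return response.items.first { $0.title == title }
}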
The reason I'm not querying by id is that I'm planning to do this sort of query for non-Apple Music items, and if I'm not mistaken there is no cross-device id for those (even with Sync Library on). If I have the app on multiple devices with the same Apple ID looking at the same library, I want device 2 to find the track you interacted with on device 1. If there are better ways to solve this, any ideas are welcome.
Appreciate any help.
I filed FB13689023 for this since it is clearly unexpected.
Non-Apple Music library tracks are not returned on macOS using MusicLibraryRequest. I tried querying by artist, where I only get Apple Music tracks, and by id, where I don't get the track at all.
On iPhone and iPad I am able to get both Apple Music and non-Apple Music tracks; that's how I got a valid id to try on macOS. Sync Library is on.
The build targets My Mac (Designed for iPad).
I'm using Xcode 15.3 and I'm on Sonoma (14.4).
Any help is appreciated...
I am using ReplayKit's RPScreenRecorder to record my app. When I use it in a mixed immersive space, nothing is actually recorded. The video is entirely blank.
Is this a feature or a bug? I am trying to record everything the user sees, including passthrough. Is there another way to do this?
Access to fetch at 'https://play.itunes.apple.com/WebObjects/MZPlay.woa/wa/webPlayback' from origin 'http://localhost:5173' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
I would like to save the depth map from ARDepthData as a .tiff, but I notice the distances in the output TIFF are incorrect. Objects that are close are reported as slightly farther away, and walls that are around 4 meters away from me have a recorded value of 2 meters. I am using this code to write the TIFF:
import CoreImage
import UIKit

// Save method
extension CVPixelBuffer {
func saveDepthMapToTIFF(to path: URL) {
let ciImage = CIImage(cvPixelBuffer: self)
let context = CIContext()
do {
try context.writeTIFFRepresentation(
of: ciImage,
to: path,
format: .Lf,
colorSpace: CGColorSpaceCreateDeviceGray()
)
} catch {
print("Failed to write TIFF: \(error)")
}
}
}
// Calling the save
arFrame.sceneDepth?.depthMap.saveDepthMapToTIFF(to: depthMapPath)
I am reading the file like this in Python
import tifffile
import matplotlib.pyplot as plt

depth_map = tifffile.imread("test.tiff")
plt.imshow(depth_map)
plt.colorbar()
which produces an image of the depth map. The farthest parts of the room should be around 4 meters, not 2, and the dark blue spot on the lower right is closer than half a meter away.
Notably, the depth map contains the distance from the camera plane to each region, not the distance from the camera sensor to the region. Even correcting for this, though, the depth map remains about the same.
Is there an issue with how I am saving the depth image? Is there a scale factor or format error?
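One way to rule out a conversion or scale issue is to compare a few raw Float32 values straight from the pixel buffer with what comes back from the TIFF (a sketch; it assumes the buffer is kCVPixelFormatType_DepthFloat32, which is what ARKit's sceneDepth provides):

import CoreVideo

/// Prints a few raw Float32 depth samples from the centre row of the buffer
/// so they can be compared with the values read back from the exported TIFF.
func dumpDepthSamples(from pixelBuffer: CVPixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)

    // Sample along the centre row of the buffer.
    let row = height / 2
    let rowPointer = base.advanced(by: row * bytesPerRow)
        .assumingMemoryBound(to: Float32.self)
    for column in stride(from: 0, to: width, by: max(1, width / 8)) {
        print("depth[\(row), \(column)] = \(rowPointer[column]) m")
    }
}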
if balloon == yellow1_balloon {
    soundFile = "Sounds/newblop.wav"
    playSound()
    balloon.isHidden = true

    poppedImages.isHidden = false
    poppedImages.animationImages = ["popyellow-1", "popyellow-2", "popyellow-3", "popyellow-4", "popyellow-5", "popyellow-6", "popyellow-7"]
        .compactMap { name in
            UIImage(named: name)
        }

    let x: CGFloat = yellow1_balloon.frame.origin.x
    let y: CGFloat = yellow1_balloon.frame.origin.y
    poppedImages.frame.origin.x = x
    poppedImages.frame.origin.y = y
    poppedImages.animationDuration = 1.0
    poppedImages.animationRepeatCount = 1
    poppedImages.startAnimating()

    score = score + 10
    scoreLbl.text = String(score)
    return
}
The x, y coordinates are always the same as when yellow1_balloon is first created, not where it ends up after being touched.
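If the balloon is being moved with a UIView or Core Animation animation, that would explain it: the view's model frame keeps its original value while the animation runs, and only the presentation layer reflects what is on screen. A small sketch of reading the on-screen position instead (assuming the balloon is animated; otherwise this changes nothing):

// Read the balloon's current on-screen frame from the presentation layer,
// falling back to the model frame if no animation is in flight.
let onScreenFrame = yellow1_balloon.layer.presentation()?.frame ?? yellow1_balloon.frame
poppedImages.frame.origin = onScreenFrame.origin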