I’m running HomePod OS 26 on two HomePod minis and OS 18.6 on my main HomePod (original).
I’ve enabled Crossfade in the Home app.
I’m playing Apple Music directly on the HomePod mini.
Crossfade just doesn’t work on any HomePod.
I can understand it not working on the original HomePod - but why isn’t it working on the minis running OS 26?
I’ve tried disabling and re-enabling Crossfade, rebooting the HomePods, etc., but nothing works.
Hi there,
I want to set the iPhone camera to "S mode", or Shutter Priority in camera terminology, which is a semi-automatic exposure mode where the shutter speed is fixed or set manually.
However, setting the shutter speed manually disables auto exposure.
So is there a way to keep auto exposure on while restricting the shutter speed?
Also, I would like to keep a low frame rate, e.g. 30 fps. Would I be able to set the shutter speed independently of the frame rate?
Here's the code for setting up the camera, reduced to a minimal sketch of the relevant part:
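(Device choice and the 1/100 s duration below are just example values; preview and output wiring are omitted.)

import AVFoundation

func configureFixedShutter(session: AVCaptureSession) throws {
    guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else { return }
    let input = try AVCaptureDeviceInput(device: camera)
    if session.canAddInput(input) { session.addInput(input) }

    try camera.lockForConfiguration()
    // Fix the shutter speed to 1/100 s. Passing AVCaptureDevice.currentISO keeps the
    // ISO at its current value, but the device is now in .custom exposure mode, so
    // ISO is no longer adjusted automatically - which is exactly the behaviour I
    // want to avoid.
    camera.setExposureModeCustom(duration: CMTime(value: 1, timescale: 100),
                                 iso: AVCaptureDevice.currentISO,
                                 completionHandler: nil)
    camera.unlockForConfiguration()
}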
Best,
I am using https://developer.apple.com/documentation/applemusicapi/add-tracks-to-a-library-playlist
to add tracks to playlists. This endpoint works fine for all playlists except for collaborative playlists.
For collaborative playlists I get the following 500 error as a response:
{
  "errors": [
    {
      "id": "<some id>",
      "title": "Upstream Service Error",
      "detail": "Unable to update tracks",
      "status": "500",
      "code": "50001"
    }
  ]
}
Steps to reproduce:
Create a playlist in your library.
Use the API to add a song.
Confirm that it works.
Make that same playlist collaborative.
Update the playlist ID in your API request (making a playlist collaborative changes its ID).
Confirm that you get the 500 error.
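For reference, this is essentially the request I'm sending (the playlist and song IDs below are placeholders and the tokens are redacted):

import Foundation

var request = URLRequest(url: URL(string:
    "https://api.music.apple.com/v1/me/library/playlists/<library playlist id>/tracks")!)
request.httpMethod = "POST"
request.setValue("Bearer <developer token>", forHTTPHeaderField: "Authorization")
request.setValue("<music user token>", forHTTPHeaderField: "Music-User-Token")
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
request.httpBody = #"{ "data": [ { "id": "<catalog song id>", "type": "songs" } ] }"#.data(using: .utf8)

URLSession.shared.dataTask(with: request) { _, response, _ in
    // Non-collaborative playlist: a 2xx status.
    // Collaborative version of the same playlist: the 500 "Upstream Service Error" shown above.
    print((response as? HTTPURLResponse)?.statusCode ?? -1)
}.resume()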
How can media resources in my app be recommended in the system's media controls in Control Center, just like TikTok in the picture?
When changing a camera's exposure, AVFoundation provides a callback which offers the timestamp of the first frame captured with the new exposure duration: AVCaptureDevice.setExposureModeCustom(duration:, iso:, completionHandler:).
I want to get a similar callback when changing frame duration.
After setting AVCaptureDevice.activeVideoMinFrameDuration or AVCaptureDevice.activeVideoMaxFrameDuration to a new value, how can I compute the index or the timestamp of the first camera frame which was captured using the newly set frame duration?
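The best workaround I've come up with so far is to infer it from the sample buffer timestamps in the video data output delegate, which feels fragile. A rough sketch (the 2 ms tolerance is arbitrary):

import AVFoundation

final class FrameDurationObserver: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private var lastPTS: CMTime?
    private var reported = false
    var targetFrameDuration = CMTime(value: 1, timescale: 30)   // the duration I just set
    var onFirstFrameWithNewDuration: ((CMTime) -> Void)?

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        defer { lastPTS = pts }
        guard let last = lastPTS, !reported else { return }
        // Treat the first frame whose spacing from the previous frame matches the
        // new frame duration (within a small tolerance) as the "first new" frame.
        let delta = CMTimeSubtract(pts, last)
        if abs(CMTimeGetSeconds(delta) - CMTimeGetSeconds(targetFrameDuration)) < 0.002 {
            reported = true
            onFirstFrameWithNewDuration?(pts)
        }
    }
}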
Hello,
Does anyone have a recipe on how to raycast VNFaceLandmarkRegion2D points obtained from a frame's capturedImage?
More specifically, how to construct the "from" parameter of the frame's raycastQuery from a VNFaceLandmarkRegion2D point?
Do the points need to be flipped vertically? Is there any other transformation that needs to be performed on the points prior to passing them to raycastQuery?
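What I'm doing at the moment, with my assumptions spelled out (in particular that raycastQuery(from:) wants normalized image coordinates with a top-left origin, which is exactly the part I'm unsure about):

import ARKit
import Vision

func raycast(landmarkPoint: CGPoint,   // from VNFaceLandmarkRegion2D.pointsInImage(imageSize:)
             imageSize: CGSize,        // size of frame.capturedImage
             frame: ARFrame,
             session: ARSession) -> ARRaycastResult? {
    // pointsInImage(imageSize:) gives pixel coordinates with a bottom-left origin,
    // so I normalize and flip Y to get a top-left-origin normalized point.
    let normalized = CGPoint(x: landmarkPoint.x / imageSize.width,
                             y: 1.0 - landmarkPoint.y / imageSize.height)
    let query = frame.raycastQuery(from: normalized,
                                   allowing: .estimatedPlane,
                                   alignment: .any)
    return session.raycast(query).first
}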
Hi everyone,
We are working on a prototype app for Apple Vision Pro that is similar in functionality to Omegle or Chatroulette, but exclusively for Vision Pro owners.
The core idea is:
– a matching system where one user connects to another through a virtual persona;
– real-time video and audio transmission;
– time limits for sessions with the ability to extend them;
– users can skip a match and move on to the next one.
We have explored WebRTC and Twilio, but unfortunately, they don’t fit our use case.
Question:
What alternative services or SDKs are available for implementing real-time video/audio communication on Vision Pro that would work with this scenario?
Has anyone encountered a similar challenge and can recommend which technologies or tools to use?
Thanks in advance!
I donate an INPlayMediaIntent to the system (the donation succeeds), but it does not show up in Control Center.
My code is as follows:
let mediaItems = mediaItems.map { $0.inMediaItem }

let intent = if #available(iOS 13.0, *) {
    INPlayMediaIntent(mediaItems: mediaItems,
                      mediaContainer: nil,
                      playShuffled: false,
                      playbackRepeatMode: .none,
                      resumePlayback: true,
                      playbackQueueLocation: .now,
                      playbackSpeed: nil,
                      mediaSearch: nil)
} else {
    INPlayMediaIntent(mediaItems: mediaItems,
                      mediaContainer: nil,
                      playShuffled: false,
                      playbackRepeatMode: .none,
                      resumePlayback: true)
}

intent.suggestedInvocationPhrase = "播放音乐" // "Play music"

let interaction = INInteraction(intent: intent, response: nil)
interaction.donate { error in
    if let error = error {
        print("Intent donation failed: \(error.localizedDescription)")
    } else {
        print("Intent donation succeeded ✅")
    }
}
I'm working on a photo app and I want to allow the user to display, edit and delete photos. I can fetch all photos using PHAsset.fetchAssets(with: options). This works as intended.
However, I can't seem to find a way to prevent the user from seeing photos from a Shared Library. PHAssetSourceType only offers typeCloudShared, which covers items from shared albums, not the shared library itself.
How can I filter by iCloud Shared Library?
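What I've been experimenting with is limiting the fetch to the user library source type; I'm not sure this actually excludes iCloud Shared Library items, which is exactly what I'm trying to confirm:

import Photos

func fetchPersonalLibraryAssets() -> PHFetchResult<PHAsset> {
    let options = PHFetchOptions()
    // Exclude .typeCloudShared and .typeiTunesSynced; does this also exclude Shared Library items?
    options.includeAssetSourceTypes = [.typeUserLibrary]
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    return PHAsset.fetchAssets(with: options)
}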
Hello,
I'm evaluating the Apple Music Feed dataset and I noticed that the total number of songs available in the feed is too small. As of today, the number of objects returned in each feed is:
51,198,712 albums
23,093,698 artists
173,235,315 songs
This gives an average of 3.38 songs per album, which is quite low. Also, iterating over the data, I see that there are albums referencing songs that don't exist in the songs feed. I would like to know:
Is the feed data incomplete?
If so, in what situations might an object be missing from the feed?
Thank you in advance!
Session player regions populate blank, with no sound media when tracks or regions are created.
When making a call to https://api.music.apple.com/v1/me/library/artists to get a user's library artists, it returns the following (as an example):
[
  {
    id: 'r.FCwruQb',
    type: 'library-artists',
    href: '/v1/me/library/artists/r.FCwruQb?l=en-US',
    attributes: { name: 'A Great Big World' }
  },
  {
    id: 'r.7VSWOgj',
    type: 'library-artists',
    href: '/v1/me/library/artists/r.7VSWOgj?l=en-US',
    attributes: { name: 'Aaliyah' }
  },
  ...
]
If I try to use an artist id from that returned data to look up additional information about the artist by calling https://api.music.apple.com/v1/catalog/us/artists/{id}, it fails.
User Library Artists don't seem to equal Catalog Artists.
It'd be great if there was a way to use these interchangeably. Am I missing something?
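For what it's worth, this is what I'd like to be able to do. The catalog relationship here is an assumption on my part; I haven't been able to confirm it is supported for library-artists:

import Foundation

let libraryArtistID = "r.FCwruQb"
var request = URLRequest(url: URL(string:
    "https://api.music.apple.com/v1/me/library/artists/\(libraryArtistID)/catalog")!)
request.setValue("Bearer <developer token>", forHTTPHeaderField: "Authorization")
request.setValue("<music user token>", forHTTPHeaderField: "Music-User-Token")

URLSession.shared.dataTask(with: request) { data, response, _ in
    // Hoping this returns the corresponding catalog artist, whose id would then
    // work with /v1/catalog/us/artists/{id}.
    print((response as? HTTPURLResponse)?.statusCode ?? -1,
          String(data: data ?? Data(), encoding: .utf8) ?? "")
}.resume()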
I have a feature requirement: switch the writer used for file recording every 5 minutes, and then quickly merge the last two files. How can I ensure the merged file plays back seamlessly and that the audio and video stay in sync? Currently the merged video has glitches and the audio drifts out of sync. I'd be extremely grateful if anyone with experience in this area could suggest a solution.
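Here is a rough sketch of how I'm merging the last two segment files at the moment (simplified: no error handling, and the passthrough preset assumes both segments share the same codecs and settings):

import AVFoundation

func mergeSegments(_ fileA: URL, _ fileB: URL, into output: URL) async throws {
    let composition = AVMutableComposition()
    var cursor = CMTime.zero
    for url in [fileA, fileB] {
        let asset = AVURLAsset(url: url)
        let duration = try await asset.load(.duration)
        // Appends both audio and video tracks back to back; any gap or overlap at
        // this boundary shows up as the glitch / audio drift I'm seeing.
        try composition.insertTimeRange(CMTimeRange(start: .zero, duration: duration),
                                        of: asset,
                                        at: cursor)
        cursor = CMTimeAdd(cursor, duration)
    }
    guard let export = AVAssetExportSession(asset: composition,
                                            presetName: AVAssetExportPresetPassthrough) else { return }
    export.outputURL = output
    export.outputFileType = .mov
    await withCheckedContinuation { (continuation: CheckedContinuation<Void, Never>) in
        export.exportAsynchronously { continuation.resume() }
    }
}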
Hi,
I've had a new deck installed in my car for about 1.5 weeks.
I'm having compatibility issues with my 15PM.
It happens both wired and wirelessly; I get the error "Accessory not supported by this device". It used to happen all the time, now it's 50/50. Sometimes it works.
I've removed and re-paired Bluetooth multiple times on both the phone and the deck. I bought a Belkin USB-C to USB-A cable today and it seemed to fix it, but the problem comes back.
I've changed the setting under Face ID & Passcode > Allow Access When Locked > Accessories.
The car stereo installer reckons it's definitely an issue with the phone, not the deck; I'm inclined to believe him since the error says "by this device".
Any advice appreciated.
I'm creating Live Photos programmatically in my app using the Photos and AVFoundation frameworks. While the Live Photos work perfectly in the Photos app (long press shows motion), users cannot set them as motion wallpapers. The system shows "Motion not available" message.
Here's my approach for creating Live Photos:
// 1. Create video with required metadata
let writer = try AVAssetWriter(outputURL: videoURL, fileType: .mov)
let contentIdentifier = AVMutableMetadataItem()
contentIdentifier.identifier = .quickTimeMetadataContentIdentifier
contentIdentifier.value = assetIdentifier as NSString
writer.metadata = [contentIdentifier]
// Video settings: 882x1920, H.264, 30 fps, 2 seconds
// Added still-image-time metadata at the middle frame

// 2. Create HEIC image with the asset identifier
var makerAppleDict: [String: Any] = [:]
makerAppleDict["17"] = assetIdentifier // Required key for Live Photo pairing
metadata[kCGImagePropertyMakerAppleDictionary as String] = makerAppleDict

// 3. Generate the Live Photo
PHLivePhoto.request(
    withResourceFileURLs: [photoURL, videoURL],
    placeholderImage: nil,
    targetSize: .zero,
    contentMode: .aspectFit
) { livePhoto, info in
    // Success - Live Photo created
}

// 4. Save to the Photos library (both resources on the same creation request)
PHPhotoLibrary.shared().performChanges {
    let request = PHAssetCreationRequest.forAsset()
    request.addResource(with: .photo, fileURL: photoURL, options: nil)
    request.addResource(with: .pairedVideo, fileURL: videoURL, options: nil)
}
What I've Tried
Matching exact video specifications from Camera app (882x1920, H.264, 30fps)
Adding all documented metadata (content identifier, still-image-time)
Testing various video durations (1.5s, 2s, 3s)
Different image formats (HEIC, JPEG)
Comparing with exiftool against working Live Photos
Expected Behavior
Live Photos created programmatically should be eligible for motion wallpapers, just like those from the Camera app.
Actual Behavior
System shows "Motion not available" and only allows setting as static wallpaper.
Any insights or workarounds would be greatly appreciated. This is affecting our users who want to use their created content as wallpapers.
Questions
Are there additional undocumented requirements for Live Photos to be wallpaper-eligible?
Is this a deliberate restriction for third-party apps, or a bug?
Has anyone successfully created Live Photos that work as motion wallpapers?
Environment
iOS 17.0 - 18.1
Xcode 16.0
Tested on iPhone 16 Pro
Hi,
I have just implemented an Audio Unit v3 host.
AgsAudioUnitPlugin *audio_unit_plugin;
AVAudioUnitComponentManager *audio_unit_component_manager;
NSArray<AVAudioUnitComponent *> *av_component_arr;
AudioComponentDescription description;

guint i, i_stop;

if(!AGS_AUDIO_UNIT_MANAGER(audio_unit_manager)){
  return;
}

audio_unit_component_manager = [AVAudioUnitComponentManager sharedAudioUnitComponentManager];

/* effects */
description = (AudioComponentDescription) {0,};
description.componentType = kAudioUnitType_Effect;

av_component_arr = [audio_unit_component_manager componentsMatchingDescription:description];

i_stop = [av_component_arr count];

for(i = 0; i < i_stop; i++){
  ags_audio_unit_manager_load_component(audio_unit_manager,
                                        (gpointer) av_component_arr[i]);
}

/* instruments */
description = (AudioComponentDescription) {0,};
description.componentType = kAudioUnitType_MusicDevice;

av_component_arr = [audio_unit_component_manager componentsMatchingDescription:description];

i_stop = [av_component_arr count];

for(i = 0; i < i_stop; i++){
  ags_audio_unit_manager_load_component(audio_unit_manager,
                                        (gpointer) av_component_arr[i]);
}
But this doesn't show me any Audio Unit v2 plugins. Why?
regards, Joël
I am trying to get MIDI output from the AU Host demo app using the recent MIDI processor example. The processor works correctly in Logic Pro, but I cannot send MIDI from the AUv3 extension in standalone mode using the default host app to another program (e.g., Ableton).
The MIDI manager, which is part of the standalone host app, works fine, and I can send MIDI using it directly—Ableton receives it without issues. I have already set the midiOutputNames in the extension, and the midiOutBlock is mapped. However, the MIDI data from the AUv3 extension does not reach Ableton in standalone mode. I suspect the issue is that midiOutBlock might never be called in the plugin, or perhaps an input to the plugin is missing, which prevents it from sending MIDI. I am currently using the default routing.
I have modified the MIDI manager such that it works well as described above. Here is a part of my code for SimplePlayEngine.swift and my MIDIManager.swift for reference:
@MainActor
@Observable
public class SimplePlayEngine {
    private let midiOutBlock: AUMIDIOutputEventBlock = { sampleTime, cable, length, data in return noErr }
    var scheduleMIDIEventListBlock: AUMIDIEventListBlock? = nil

    public init() {
        engine.attach(player)
        engine.prepare()
        setupMIDI()
    }

    private func setupMIDI() {
        if !MIDIManager.shared.setupPort(midiProtocol: MIDIProtocolID._2_0, receiveBlock: { [weak self] eventList, _ in
            if let scheduleMIDIEventListBlock = self?.scheduleMIDIEventListBlock {
                _ = scheduleMIDIEventListBlock(AUEventSampleTimeImmediate, 0, eventList)
            }
        }) {
            fatalError("Failed to setup Core MIDI")
        }
    }

    func initComponent(type: String, subType: String, manufacturer: String) async -> ViewController? {
        reset()
        guard let component = AVAudioUnit.findComponent(type: type, subType: subType, manufacturer: manufacturer) else {
            fatalError("Failed to find component with type: \(type), subtype: \(subType), manufacturer: \(manufacturer))")
        }
        do {
            let audioUnit = try await AVAudioUnit.instantiate(
                with: component.audioComponentDescription, options: AudioComponentInstantiationOptions.loadOutOfProcess)
            self.avAudioUnit = audioUnit
            self.connect(avAudioUnit: audioUnit)
            return await audioUnit.loadAudioUnitViewController()
        } catch {
            return nil
        }
    }

    private func startPlayingInternal() {
        guard let avAudioUnit = self.avAudioUnit else { return }
        setSessionActive(true)
        if avAudioUnit.wantsAudioInput { scheduleEffectLoop() }
        let hardwareFormat = engine.outputNode.outputFormat(forBus: 0)
        engine.connect(engine.mainMixerNode, to: engine.outputNode, format: hardwareFormat)
        do { try engine.start() } catch {
            isPlaying = false
            fatalError("Could not start engine. error: \(error).")
        }
        if avAudioUnit.wantsAudioInput { player.play() }
        isPlaying = true
    }

    private func resetAudioLoop() {
        guard let avAudioUnit = self.avAudioUnit else { return }
        if avAudioUnit.wantsAudioInput {
            guard let format = file?.processingFormat else { fatalError("No AVAudioFile defined.") }
            engine.connect(player, to: engine.mainMixerNode, format: format)
        }
    }

    public func connect(avAudioUnit: AVAudioUnit?, completion: @escaping (() -> Void) = {}) {
        guard let avAudioUnit = self.avAudioUnit else { return }
        engine.disconnectNodeInput(engine.mainMixerNode)
        resetAudioLoop()
        engine.detach(avAudioUnit)

        func rewiringComplete() {
            scheduleMIDIEventListBlock = auAudioUnit.scheduleMIDIEventListBlock
            if isPlaying { player.play() }
            completion()
        }

        let hardwareFormat = engine.outputNode.outputFormat(forBus: 0)
        engine.connect(engine.mainMixerNode, to: engine.outputNode, format: hardwareFormat)
        if isPlaying { player.pause() }

        let auAudioUnit = avAudioUnit.auAudioUnit
        if !auAudioUnit.midiOutputNames.isEmpty { auAudioUnit.midiOutputEventBlock = midiOutBlock }

        engine.attach(avAudioUnit)
        if avAudioUnit.wantsAudioInput {
            engine.disconnectNodeInput(engine.mainMixerNode)
            if let format = file?.processingFormat {
                engine.connect(player, to: avAudioUnit, format: format)
                engine.connect(avAudioUnit, to: engine.mainMixerNode, format: format)
            }
        } else {
            let stereoFormat = AVAudioFormat(standardFormatWithSampleRate: hardwareFormat.sampleRate, channels: 2)
            engine.connect(avAudioUnit, to: engine.mainMixerNode, format: stereoFormat)
        }
        rewiringComplete()
    }
}
and my MIDIManager:
@MainActor
class MIDIManager: Identifiable, ObservableObject {
    func setupPort(midiProtocol: MIDIProtocolID,
                   receiveBlock: @escaping @Sendable MIDIReceiveBlock) -> Bool {
        guard setupClient() else { return false }
        if MIDIInputPortCreateWithProtocol(client, portName, midiProtocol, &port, receiveBlock) != noErr {
            return false
        }
        for source in self.sources {
            if MIDIPortConnectSource(port, source, nil) != noErr {
                print("Failed to connect to source \(source)")
                return false
            }
        }
        setupVirtualMIDIOutput()
        return true
    }

    private func setupVirtualMIDIOutput() {
        let virtualStatus = MIDISourceCreate(client, virtualSourceName, &virtualSource)
        if virtualStatus != noErr {
            print("❌ Failed to create virtual MIDI source: \(virtualStatus)")
        } else {
            print("✅ Created virtual MIDI source: \(virtualSourceName)")
        }
    }

    func sendMIDIData(_ data: [UInt8]) {
        print("hey")
        var packetList = MIDIPacketList()
        withUnsafeMutablePointer(to: &packetList) { ptr in
            let pkt = MIDIPacketListInit(ptr)
            _ = MIDIPacketListAdd(ptr, 1024, pkt, 0, data.count, data)
            if virtualSource != 0 {
                let status = MIDIReceived(virtualSource, ptr)
                if status != noErr {
                    print("❌ Failed to send MIDI data: \(status)")
                } else {
                    print("✅ Sent MIDI data: \(data)")
                }
            }
        }
    }
}
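For completeness, this is the kind of forwarding I expected to need in midiOutBlock instead of the no-op in SimplePlayEngine above, assuming the block ever gets called. Hopping to the main actor for MIDIManager is my own assumption, not something from the sample project:

// Inside SimplePlayEngine, replacing the no-op property above.
private let midiOutBlock: AUMIDIOutputEventBlock = { _, _, length, bytes in
    // Copy the raw MIDI 1.0 bytes out of the render-thread buffer...
    let midiBytes = [UInt8](UnsafeBufferPointer(start: bytes, count: length))
    // ...and hand them to the main-actor MIDIManager off the audio thread.
    Task { @MainActor in
        MIDIManager.shared.sendMIDIData(midiBytes)
    }
    return noErr
}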
Hi, when using ApplicationMusicPlayer from MusicKit my app automatically gets the media controls on the lock screen: Play/Pause, skip buttons, playback position, etc.
I would like to customize these. I've tried a bunch of things, e.g. using MPRemoteCommandCenter, but so far I haven't had any success.
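For reference, this is roughly what I tried with MPRemoteCommandCenter (simplified; the 15-second skip is just an example):

import MediaPlayer

func customizeRemoteCommands() {
    let center = MPRemoteCommandCenter.shared()
    // Attempt to replace the skip-track buttons with +/- 15 s skips.
    center.nextTrackCommand.isEnabled = false
    center.previousTrackCommand.isEnabled = false
    center.skipForwardCommand.isEnabled = true
    center.skipForwardCommand.preferredIntervals = [15]
    _ = center.skipForwardCommand.addTarget { _ in
        // This never seems to take effect while ApplicationMusicPlayer is playing;
        // the lock screen keeps showing the default controls.
        return .success
    }
}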
Does anyone know how I can customize the media controls of ApplicationMusicPlayer?
Thank you.
In iOS 26 (Developer Beta), the AVCaptureMetadataOutputObjectsDelegate no longer receives callbacks when metadataOutput.metadataObjectTypes = [.face] is set. On earlier iOS versions the issue does not occur. Interestingly, face detection works if I set the sessionPreset to .medium, but not with .high — except on the iPhone 16 Pro Max, where it works regardless.
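A minimal setup that reproduces it for me looks roughly like this (simplified, no error handling; in the real app startRunning is called on a background queue):

import AVFoundation

final class FaceDetector: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    let session = AVCaptureSession()

    func start() {
        session.sessionPreset = .high   // callbacks arrive with .medium, but not with .high on iOS 26 beta
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureMetadataOutput()
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        output.setMetadataObjectsDelegate(self, queue: .main)
        output.metadataObjectTypes = [.face]   // set after addOutput so .face is available

        session.startRunning()
    }

    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        // Never called on iOS 26 beta with the .high preset.
        print("faces:", metadataObjects.count)
    }
}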
A bit of a novice to app development here, but I have a paid developer account and I have registered the identifier for MusicKit on the developer website (using the bundle identifier I've selected in Xcode), yet the option to add MusicKit as a capability is not available in Xcode.
I've manually updated the certificates, closed the app and reopened it, started a new project, and tried a different demo project.
Apologies if I am missing something obvious, but could someone help me get this capability added?