Post not yet marked as solved
Hi All,
We tried connecting a USB UAC1.0 device to a computer and to an Android phone, and it works well. We then tried connecting it to an iPhone and iPad using the "Lightning to USB adapter": the device is found and audio plays, but the audio stream is not continuous.
Can you tell me whether iPhone and iPad support USB UAC1.0 or UAC2.0 devices?
Thank you!
Isaac
Post not yet marked as solved
Hi All,
I am using Azure Media Services and a CDN, and I have set up rules on the CDN side to allow traffic only from my domain. This works well on a Windows PC and a MacBook, but not on iPhone or iPad.
For further investigation, I captured a network trace and found that the request did not include the 'Origin' header, so the request is denied by the CDN rules.
This can be observed with both Chrome and Safari on iPhone and iPad.
Has anyone seen this issue before? Is it related to the iOS system or to AVPlayer?
Is there any workaround for such an issue?
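One workaround that is sometimes suggested, sketched below under the assumption that the CDN only checks the Origin header, is to inject the header via AVURLAsset. Note that "AVURLAssetHTTPHeaderFieldsKey" is an undocumented options key, and the URLs here are placeholders:

```swift
import AVFoundation

// Assumption: the stream URL and Origin value below are placeholders.
// "AVURLAssetHTTPHeaderFieldsKey" is an undocumented options key that many
// apps use to attach HTTP headers to AVURLAsset requests; it is not part of
// the public API contract and may not cover every request AVPlayer makes
// (e.g. key or subtitle fetches).
let url = URL(string: "https://example-endpoint.streaming.media.azure.net/manifest.m3u8")!
let headers = ["Origin": "https://my-domain.example"]
let asset = AVURLAsset(url: url, options: ["AVURLAssetHTTPHeaderFieldsKey": headers])
let item = AVPlayerItem(asset: asset)
let player = AVPlayer(playerItem: item)
```

Because the key is unofficial, it is worth verifying in a network trace that the header actually appears on the segment requests before relying on it.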
Post not yet marked as solved
MusicPlayer.State supports shuffleMode and repeatMode, but there is no way to enable Autoplay.
In the case of SystemMusicPlayer, a user can enable Autoplay from the iOS Music app, but for ApplicationMusicPlayer there is no way to offer Autoplay support.
Additionally, when a new queue is set via ApplicationMusicPlayer, iOS disables Autoplay mode even if it was enabled before, whereas when using the iOS Music app itself, Autoplay stays enabled. I can understand that this may be intentional for some reason, but it doesn't seem user-friendly. Because of this, Autoplay also gets disabled when playing from the Shortcuts app, even though in most cases it shouldn't be.
Post not yet marked as solved
Are other developers noticing that timed metadata is not working on iOS 16 beta 1 and 2? Neither of the usual approaches works:
observing the "timedMetadata" key via observeValue(forKeyPath:of:change:context:), and
AVPlayerItemMetadataOutputPushDelegate.
The code I produced works fine on iOS 16 in the Simulator, but not on a real device (iPhone XR).
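For reference, a minimal sketch of the AVPlayerItemMetadataOutput approach mentioned above, assuming the stream URL is a placeholder:

```swift
import AVFoundation

// Sketch: attach an AVPlayerItemMetadataOutput to an AVPlayerItem and
// receive timed metadata via the push delegate. The URL is supplied by
// the caller; nothing here is specific to the failing stream.
final class MetadataReader: NSObject, AVPlayerItemMetadataOutputPushDelegate {
    let player: AVPlayer

    init(url: URL) {
        let item = AVPlayerItem(url: url)
        player = AVPlayer(playerItem: item)
        super.init()
        let output = AVPlayerItemMetadataOutput(identifiers: nil)
        output.setDelegate(self, queue: .main)
        item.add(output)
    }

    func metadataOutput(_ output: AVPlayerItemMetadataOutput,
                        didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
                        from track: AVPlayerItemTrack?) {
        for item in groups.flatMap(\.items) {
            print(item.identifier ?? "unknown identifier", item.value ?? "nil")
        }
    }
}
```

If this delegate method never fires on a device while it does in the Simulator, that narrows the problem to the device-side pipeline rather than the setup code.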
Post not yet marked as solved
I understand why there isn’t an API to remove items from the library or playlist, but I propose a way to safely remove items from the library or playlist which would show a system alert to confirm the removal. This is exactly how the deletion is allowed for the Photos Library via third-party apps.
Post not yet marked as solved
We are playing an HLS stream with AVPlayer and trying to read the HLS manifest. We are able to detect the majority of the tags; however, the player is not detecting the EXT-X-DATERANGE tag that carries only an ID and a DURATION attribute, i.e.
#EXT-X-DATERANGE:ID="aba74c45-e963-45bf-8171-1f910c33f64a",DURATION=32.44
whereas the other EXT-X-DATERANGE tag is detected at the beginning of the manifest:
#EXT-X-DATERANGE:ID="aba74c45-e963-45bf-8171-1f910c33f64a",START-DATE="2022-03-10T13:18:15.179Z",PLANNED-DURATION=15,X-AD-ID="9858"
#EXT-X-DISCONTINUITY
We are using AVPlayer's metadata collector delegate method to detect the metadata:
func metadataCollector(_ metadataCollector: AVPlayerItemMetadataCollector,
                       didCollect metadataGroups: [AVDateRangeMetadataGroup],
                       indexesOfNewGroups: IndexSet,
                       indexesOfModifiedGroups: IndexSet) {}
We are not able to detect the EXT-X-DATERANGE tag with only DURATION using the delegate above.
Any help appreciated.
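For reference, a minimal sketch of wiring up the collector (the URL is a placeholder). One thing worth checking: per the HLS specification (RFC 8216, section 4.3.2.7), every EXT-X-DATERANGE tag must carry a START-DATE attribute, so a tag with only ID and DURATION may simply be discarded as invalid before it ever reaches the collector.

```swift
import AVFoundation

// Sketch: attach an AVPlayerItemMetadataCollector to an AVPlayerItem so
// that EXT-X-DATERANGE groups are delivered to the push delegate.
final class DateRangeObserver: NSObject, AVPlayerItemMetadataCollectorPushDelegate {
    let player: AVPlayer
    private let collector = AVPlayerItemMetadataCollector()

    init(url: URL) {
        let item = AVPlayerItem(url: url)
        player = AVPlayer(playerItem: item)
        super.init()
        collector.setDelegate(self, queue: .main)
        item.add(collector)
    }

    func metadataCollector(_ metadataCollector: AVPlayerItemMetadataCollector,
                           didCollect metadataGroups: [AVDateRangeMetadataGroup],
                           indexesOfNewGroups: IndexSet,
                           indexesOfModifiedGroups: IndexSet) {
        for group in metadataGroups {
            print(group.startDate, group.items.count)
        }
    }
}
```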
Post not yet marked as solved
I only have the Simulator for testing at the moment, so perhaps it's a quirk of the Simulator.
I have an audio app. I want to keep the list view in sync with the state of my player, which involves setting the isPlaying property on the CPListItems in the list. When I change its value, the list scrolls to the top. I would rather have the list scroll to reveal the playing item, or not scroll at all.
Post not yet marked as solved
Does anyone know if it is possible to keep the iOS 15 playback button layout instead of the new iOS 16 layout? (see image attached)
I mean, keeping the compact playback buttons at the bottom of the screen in iOS 16?
Post not yet marked as solved
Hi there!
I have been trying to play the music videos we get from Apple Music API and have been unsuccessful.
Here's my code:
var video: MusicVideo

var body: some View {
    VideoPlayer(player: AVPlayer(url: video.url!))
}
I know the URL from the MusicVideo is not a playable media stream but just the URL of the video's page in the Apple Music catalog.
How do I go about playing it without using something like MPMusicPlayerController.systemMusicPlayer.openToPlay(queueDescriptor), so I can provide an in-app experience rather than taking the user to the Apple Music app?
Hi,
I have a use case where I need to add multiple songs to Apple Music playlists. I am using the MPMediaPlaylist.addItem(withProductID:) function to add a single song to the playlist. Is there a way to convert a song to an MPMediaItem so that I can use MPMediaPlaylist.add([MPMediaItem]) to add multiple songs to the playlist?
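In the meantime, one way to add several songs with the API mentioned above is to chain the addItem(withProductID:) calls through their completion handlers. A sketch, where the playlist and the product ID array are assumed to come from the caller:

```swift
import MediaPlayer

// Sketch: add catalog songs one by one via their product IDs,
// waiting for each add to complete before starting the next.
// `playlist` and `productIDs` are placeholders supplied by the caller.
func addSongs(_ productIDs: [String],
              to playlist: MPMediaPlaylist,
              completion: @escaping (Error?) -> Void) {
    guard let next = productIDs.first else {
        completion(nil)  // all songs were added
        return
    }
    playlist.addItem(withProductID: next) { error in
        if let error = error {
            completion(error)
            return
        }
        addSongs(Array(productIDs.dropFirst()), to: playlist, completion: completion)
    }
}
```

Serializing the calls avoids racing concurrent mutations of the same playlist, at the cost of one round trip per song.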
Regards
Given an MPMediaItem the user selected from MPMediaPickerController or from MPMusicPlayerController.systemMusicPlayer.nowPlayingItem, is it possible to find out if this song is lossless and if it supports Spatial Audio? Thanks!
Hey there Apple Music team! I'm excited to dig into the sessions coming up this week, and what I've seen so far from the developer documentation diffs looks great: audio quality, artist images, and a way to interface with a user's music library in MusicKit. Love it!
The thing at the very top of my WWDC wishlist this year was macOS/Mac Catalyst support for the ApplicationMusicPlayer class. I just got finished installing Ventura and Xcode 14, and sadly it looks like the support story is the same as on Big Sur. No API availability on macOS, and an available Mac Catalyst API that ultimately results in the same error from a feedback I submitted on Big Sur: FB9851840
The connection to service named com.apple.Music.MPMusicPlayerApplicationControllerInternal was invalidated: failed at lookup with error 3 - No such process.
Is that the end of the story on Ventura, or is there a chance support might be added in a later beta? Is there any additional detail at all that can be shared? I field several requests a week asking if/when my app is coming to the Mac, and I would really love to be able to make that happen. If there is anything at all I can do to test and help overcome the engineering challenges alluded to in the past, I am ready, willing, and able!
In any case, thanks for the great work, and I'm looking forward to spending time with the new stuff this summer.
Post not yet marked as solved
I use AVPlayerItemMetadataOutput for a live HLS audio stream; each segment is a 0.96-second AAC segment containing ID3 metadata.
In all previous versions of iOS, the AVPlayerItemMetadataOutput delegate method for this stream was called approximately every 0.96 seconds.
This behaviour changed in iOS 15.4.1: the delegate method is now called exactly every 1 second, resulting in a delay in reading the metadata for each segment.
Example:
time(sec)----|0___________1___________2___________3______
segments-----|[segment_1][segment_2][segment_3][segment_4]
             |^----------^----------^----------^---------
iOS 15.2-----|call_1     call_2     call_3     call_4
             |^-----------^-----------^-----------^------
iOS 15.4.1---|call_1      call_2      call_3      call_4
As can be seen, call_4 arrives much later than the start of segment_4. In all previous versions of iOS, it was called simultaneously with the start of segment_4 playback.
The AVMetadataItem.time property also shows the wrong time (see attached pictures).
I tried delivering the delegate callbacks on both the main queue and a background queue, with no success. Changing advanceIntervalForDelegateInvocation did not change this behavior either.
Post not yet marked as solved
I have an issue with updating MPNowPlayingInfoCenter: when trying to read nowPlayingInfo, I don't get a response, which blocks the current thread indefinitely.
I'm updating MPNowPlayingInfoCenter on the main thread, which results in an app freeze.
func staticUpdate() {
    logger.log(.debug, "start static update")
    infoCenter.nowPlayingInfo = nowPlayingInfo
    logger.log(.debug, "end static update")
}

func dynamicUpdate() {
    logger.log(.debug, "start update - read")
    var mpInfo = infoCenter.nowPlayingInfo ?? [String: Any]()
    logger.log(.debug, "start update - write")
    ...
    infoCenter.nowPlayingInfo = mpInfo
    logger.log(.debug, "end update")
}
/*
2022-04-25 09:28:19.051435+0200 [Debug] [main] [NowPlayingInfoCenterController.swift:128] start static update
2022-04-25 09:28:19.051834+0200 [Debug] [main] [NowPlayingInfoCenterController.swift:130] end static update
2022-04-25 09:28:19.052251+0200 [Debug] [main] [NowPlayingInfoCenterController.swift:186] start update - read
*/
I overwrite nowPlayingInfo when the media changes, then update it on any status change (progress, status, ...).
(See the timestamps: we read ~1 ms after the write, but never reach infoCenter.nowPlayingInfo = mpInfo.)
Questions:
Shouldn't infoCenter.nowPlayingInfo always be readable?
Can I update infoCenter from any queue? (That would only avoid the app freeze, though.)
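One way to sidestep the blocking read entirely is to never read nowPlayingInfo back from the system. A sketch, under the assumption that every update goes through this wrapper so the local cache stays authoritative:

```swift
import MediaPlayer

// Sketch: keep a local cache as the source of truth and only ever
// write to the info center. A serial queue prevents interleaved writes.
final class NowPlayingUpdater {
    private var cached: [String: Any] = [:]
    private let queue = DispatchQueue(label: "now-playing-updates")

    // Merge incremental changes (progress, rate, ...) into the cache
    // and push the full dictionary to the system.
    func update(_ changes: [String: Any]) {
        queue.async {
            self.cached.merge(changes) { _, new in new }
            MPNowPlayingInfoCenter.default().nowPlayingInfo = self.cached
        }
    }

    // Replace everything when the media item changes.
    func reset(with info: [String: Any]) {
        queue.async {
            self.cached = info
            MPNowPlayingInfoCenter.default().nowPlayingInfo = info
        }
    }
}
```

This doesn't explain why the read hangs, but it removes the dependency on reading the property at all.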
Post not yet marked as solved
Just wondering if anyone else is having issues with currentPlaybackRate in the release version of iOS 15.4? In my particular case this is with MPMusicPlayerController.applicationQueuePlayer.
I've always had issues controlling this property reliably but from what I can see it is now completely non-operational in 15.4.
I've isolated this behavior in a trivial project, and will file a radar, but hoping others may have some insight first.
FWIW- This is my trivial test case:
class ViewController: UIViewController {
    lazy var player: MPMusicPlayerApplicationController = {
        let player = MPMusicPlayerController.applicationQueuePlayer
        player.repeatMode = .none
        player.shuffleMode = .off
        player.beginGeneratingPlaybackNotifications()
        return player
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        NotificationCenter.default.addObserver(forName: .MPMusicPlayerControllerPlaybackStateDidChange, object: nil, queue: .main) { [weak self] notification in
            guard let notificationPlayer = notification.object as? MPMusicPlayerApplicationController,
                  notificationPlayer === self?.player else {
                return
            }
            debugPrint("Player state now: \(notificationPlayer.playbackState)")
        }
    }

    @IBAction func goAction(_ sender: Any) {
        guard let item = MPMediaQuery.songs().items?.randomElement() else {
            debugPrint("Unable to access media items")
            return
        }
        debugPrint("Now playing item: \(item.title ?? "")")
        player.setQueue(with: [item.playbackStoreID])
        player.prepareToPlay() { error in
            guard error == nil else {
                debugPrint("Player error: \(error!.localizedDescription)")
                return
            }
            DispatchQueue.main.async { [weak self] in
                self?.player.play()
            }
        }
    }

    @IBAction func slowAction(_ sender: Any) {
        debugPrint("Setting currentPlaybackRate to 0.5")
        player.currentPlaybackRate = 0.5
        checkPlaybackRate()
    }

    @IBAction func fastAction(_ sender: Any) {
        debugPrint("Setting currentPlaybackRate to 1.5")
        player.currentPlaybackRate = 1.5
        checkPlaybackRate()
    }

    func checkPlaybackRate(afterSeconds delay: TimeInterval = 1.0) {
        DispatchQueue.main.asyncAfter(deadline: .now() + delay) {
            debugPrint("After \(delay) seconds currentPlaybackRate now: \(self.player.currentPlaybackRate)")
        }
    }
}
Typical console output:
"Now playing item: I Know You Know"
"Player state now: MPMusicPlaybackState(rawValue: 2)"
"Player state now: MPMusicPlaybackState(rawValue: 1)"
"Setting currentPlaybackRate to 1.5"
"After 1.0 seconds currentPlaybackRate now: 1.0"
"Setting currentPlaybackRate to 0.5"
"After 1.0 seconds currentPlaybackRate now: 1.0"
Post not yet marked as solved
Everything was working perfectly before iOS 15.4.1, but a few days ago I updated my iPhone to 15.4.1 and MPRemoteCommandCenter no longer shows up. It only shows up when I toggle play/pause on my AVPlayer while it is playing. Has anyone experienced this?
Post not yet marked as solved
AVPlayer.seek(to:) is not working on iOS 15.4.1.
It worked well before I upgraded to the latest iOS version.
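When reproducing this, it can help to seek with explicit zero tolerances, which rules out the player snapping to a nearby keyframe and reports whether the seek completed. A sketch; the target time is arbitrary:

```swift
import AVFoundation
import CoreMedia

// Sketch: seek to an exact position and log whether the seek finished.
// `player` and the target seconds are supplied by the caller.
func seekExactly(_ player: AVPlayer, toSeconds seconds: Double) {
    let target = CMTime(seconds: seconds, preferredTimescale: 600)
    player.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero) { finished in
        // `finished` is false when the seek was interrupted,
        // e.g. by another seek request issued before it completed.
        print("seek finished:", finished)
    }
}
```

If `finished` consistently comes back false, something else is cancelling the seek; if it comes back true but playback position doesn't change, that points at the player itself.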
Post not yet marked as solved
How do you add an audio player to a Swift Playgrounds app project? How would you bundle local audio files in the project and hook them up to an interactive audio player in the UI? Thanks!
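One common approach is AVAudioPlayer with a file added to the app's resources. A minimal sketch, where "song.mp3" is an assumed resource name:

```swift
import SwiftUI
import AVFoundation

// Sketch: a tiny play/pause view backed by AVAudioPlayer for a bundled
// audio file. "song.mp3" is a placeholder; add your own file to the
// project's resources and match the name here.
struct AudioPlayerView: View {
    @State private var player: AVAudioPlayer?
    @State private var isPlaying = false

    var body: some View {
        Button(isPlaying ? "Pause" : "Play") {
            if player == nil,
               let url = Bundle.main.url(forResource: "song", withExtension: "mp3") {
                player = try? AVAudioPlayer(contentsOf: url)
            }
            if isPlaying {
                player?.pause()
            } else {
                player?.play()
            }
            isPlaying.toggle()
        }
    }
}
```

The same view works in a Swift Playgrounds app project because resources added to the project are accessible through Bundle.main.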
Post not yet marked as solved
Is there a way to disable the default video controls (play/pause/scrubber/etc.) on the new SwiftUI VideoPlayer in iOS 14, so I can create custom ones?
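VideoPlayer does not appear to expose a controls toggle. A common workaround is to wrap a bare AVPlayerLayer, which draws no controls at all, and overlay custom SwiftUI controls on top. A sketch:

```swift
import SwiftUI
import AVFoundation

// Sketch: a chrome-free video surface. AVPlayerLayer renders video only,
// so any controls must be built in SwiftUI and overlaid on this view.
struct PlainVideoView: UIViewRepresentable {
    let player: AVPlayer

    final class PlayerUIView: UIView {
        // Back the view with an AVPlayerLayer so it resizes automatically.
        override static var layerClass: AnyClass { AVPlayerLayer.self }
        var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }
    }

    func makeUIView(context: Context) -> PlayerUIView {
        let view = PlayerUIView()
        view.playerLayer.player = player
        view.playerLayer.videoGravity = .resizeAspect
        return view
    }

    func updateUIView(_ uiView: PlayerUIView, context: Context) {
        uiView.playerLayer.player = player
    }
}
```

Usage is then e.g. `PlainVideoView(player: myPlayer).overlay(MyControls())`, with `MyControls` being whatever custom control view you build.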
Post not yet marked as solved
Hi all.
I'm trying to create custom controls for AVPlayer and have managed to do it.
The last thing I want to achieve is making the player go full screen on device rotation while it is playing inline inside a UICollectionViewCell.
I'm using this code to present the player in full screen:
override func viewWillTransition(to size: CGSize, with coordinator: UIViewControllerTransitionCoordinator) {
    super.viewWillTransition(to: size, with: coordinator)
    collectionView?.collectionViewLayout.invalidateLayout()
    let sourceVC = self
    let destinationVC = LandscapePlayerViewController()
    // LandscapePlayerViewController is also in the view hierarchy, within a UICollectionViewCell
    destinationVC.modalPresentationStyle = .fullScreen
    sourceVC.present(destinationVC, animated: true)
}
This does the trick, but for some reason the player presents empty: the currently playing video is not handed over to the presented full-screen player.
Any suggestions?
Thank you!
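A likely cause, assuming the freshly constructed LandscapePlayerViewController() creates its own player, is that the presented controller never receives the existing AVPlayer instance. A hypothetical sketch of handing it over (the class body below is illustrative, not the poster's actual implementation):

```swift
import AVFoundation
import UIKit

// Sketch: a full-screen controller that reuses an existing AVPlayer so
// playback continues seamlessly. The player is injected, not re-created.
final class LandscapePlayerViewController: UIViewController {
    let player: AVPlayer
    private let playerLayer = AVPlayerLayer()

    init(player: AVPlayer) {
        self.player = player
        super.init(nibName: nil, bundle: nil)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .black
        playerLayer.player = player   // reuse the in-line cell's player
        playerLayer.videoGravity = .resizeAspect
        view.layer.addSublayer(playerLayer)
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        playerLayer.frame = view.bounds
    }
}
```

The presentation site would then become `LandscapePlayerViewController(player: cellPlayer)`, where `cellPlayer` is the AVPlayer already driving the in-line cell.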