Post not yet marked as solved
I found that the player has added support for the ambient viewing environment SEI message; this leads to inconsistent display brightness between HDR streams processed in different ways.
I'd like to know which system versions, software versions, and hardware support this SEI. Also, are other SEI message types supported?
Post not yet marked as solved
Hi!
In the Becoming a Now Playable App sample project (https://developer.apple.com/documentation/mediaplayer/becoming_a_now_playable_app), on the iOS target, after changing the playback rate to 2.0 (or any other value greater than 1.0), some Bluetooth headphones (for example the Motorola Pulse Escape) fail to pause playback.
The same issue shows up in other apps that allow playback-speed changes, but Apple's Podcasts app works fine.
I tried creating a custom UIWindow subclass to log all events the app receives, but iOS 13 doesn't deliver the pause command from the headset at all (while playback is active and rate > 1.0); however, skip backward/previous track commands still work until you change the rate back to 2.
Does anyone have an idea how to make pause work in this case?
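For reference, one thing worth ruling out is the handler logic itself: a pause handler that compares the rate against exactly 1.0 will return .commandFailed while playing at 2.0. A minimal rate-agnostic sketch (`player` stands in for the app's AVPlayer):

```swift
import AVFoundation
import MediaPlayer

// Sketch: a pause handler that works at any playback rate. Checking
// `rate == 1.0` instead of `rate != 0` would fail while playing at 2.0.
func registerPauseCommand(for player: AVPlayer) {
    let pauseCommand = MPRemoteCommandCenter.shared().pauseCommand
    pauseCommand.isEnabled = true
    pauseCommand.addTarget { _ in
        guard player.rate != 0 else { return .commandFailed }
        player.pause()
        return .success
    }
}
```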
Thanks!
Post not yet marked as solved
I have a MUSIC website that uses the BetterAudioPlaylist front-end HTML5 player (https://github.com/NelsWebDev/BetterAudioPlaylist).
For over 2 years it worked flawlessly on my iPhone; however, since I updated to iOS 12.1 it no longer advances to the next audio file when the screen is locked.
I have gone through the Safari settings and used my troubleshooting skills to verify it has nothing to do with my settings.
Is this Apple pushing out the little guy in favor of iTunes?
Has anyone else had this problem? I contacted Apple's developer support email, but they never responded.
Please help!
Post not yet marked as solved
How can I know the width and height of a video?
For example, the video is playing and the screen is rotated.
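A sketch of one way to read the rendered dimensions from the asset's video track, taking the rotation metadata into account (names are illustrative):

```swift
import AVFoundation

// Sketch: derive the display size of a video, accounting for the rotation
// stored in preferredTransform (e.g. portrait video recorded on an iPhone).
func displaySize(of asset: AVAsset) -> CGSize? {
    guard let track = asset.tracks(withMediaType: .video).first else { return nil }
    // Apply the track's transform so a 90°-rotated video reports its
    // width/height the way it is actually rendered.
    let size = track.naturalSize.applying(track.preferredTransform)
    return CGSize(width: abs(size.width), height: abs(size.height))
}
```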
Post not yet marked as solved
let time1cmt = CMTimeGetSeconds(playerOK.currentTime())
time1 = Double(time1cmt)
let time2cmt = CMTimeGetSeconds(playerOK.currentTime())
time2 = Double(time2cmt)
videoDif = (time2 - time1)
Right now I'm using .currentTime() to get the start point and end point, but they don't seem to be accurate. Is there another way to get the time more precisely?
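If the goal is to catch exact start and end points, a boundary time observer may be more accurate than polling currentTime(): the player calls back when playback crosses the specified media times. A sketch (the URL and boundary times are illustrative):

```swift
import AVFoundation

// Sketch: let the player call back at exact media times instead of polling.
let player = AVPlayer(url: URL(string: "https://example.com/video.mp4")!)
let boundaries = [CMTime(seconds: 5.0, preferredTimescale: 600),
                  CMTime(seconds: 12.5, preferredTimescale: 600)]

// Keep a strong reference to the token for as long as callbacks are needed.
let token = player.addBoundaryTimeObserver(
    forTimes: boundaries.map { NSValue(time: $0) },
    queue: .main
) {
    // Fires when playback crosses each boundary; currentTime() read here is
    // much closer to the requested media time than ad-hoc sampling.
    print("crossed a boundary at \(player.currentTime().seconds)s")
}
```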
Post not yet marked as solved
AVPlayer gets the list of segment URLs from the m3u8 file. I need to append a query string to the end of each URL.
Is there any option in the AVPlayer to do this?
Example:
HLS URL : http://example.com/hls.m3u8
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.417000,
1.ts
#EXTINF:10.417000,
2.ts
#EXTINF:9.470000,
3.ts
#EXTINF:10.417000,
4.ts
#EXTINF:9.470000,
5.ts
#EXTINF:10.417000,
6.ts
#EXTINF:9.470000,
7.ts
#EXTINF:3.840611,
8.ts
#EXT-X-ENDLIST
AVPlayer is trying to download http://example.com/1.ts
But I want the AVPlayer to add
"?st=2020-09-01T13%3A59%3A03Z&se=2020-09-02T13%3A59%3A03Z&sp=rl&sv=2018-03-28&sr=b&sig=Pua9sv8mgvPF6gNwuBSghdEq%2BefMFmwBuyUdjCetmw4%3D"
So AVPlayer will try
http://example.com/1.ts?st=2020-09-01T13%3A59%3A03Z&se=2020-09-02T13%3A59%3A03Z&sp=rl&sv=2018-03-28&sr=b&sig=Pua9sv8mgvPF6gNwuBSghdEq%2BefMFmwBuyUdjCetmw4%3D instead of http://example.com/1.ts
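There is no built-in AVPlayer option for this, but a common workaround is an AVAssetResourceLoaderDelegate: give the playlist URL a custom scheme so AVPlayer routes it to your delegate, then rewrite each segment URI to carry the query string before handing the playlist back. A sketch under those assumptions (the scheme name and token are placeholders, and you may also need to fill in contentInformationRequest):

```swift
import AVFoundation

// Sketch: intercept the playlist via a custom scheme and append a token
// to every segment line, so segments load directly with the token attached.
final class TokenAppendingLoader: NSObject, AVAssetResourceLoaderDelegate {
    let token: String // e.g. "st=...&se=...&sig=..."

    init(token: String) { self.token = token }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource
                        loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let customURL = loadingRequest.request.url,
              var components = URLComponents(url: customURL, resolvingAgainstBaseURL: false)
        else { return false }
        components.scheme = "http" // restore the real scheme
        guard let realURL = components.url else { return false }

        URLSession.shared.dataTask(with: realURL) { data, _, error in
            guard let data = data, let playlist = String(data: data, encoding: .utf8) else {
                loadingRequest.finishLoading(with: error)
                return
            }
            // Append the token to every segment line (lines not starting with '#').
            let rewritten = playlist
                .components(separatedBy: "\n")
                .map { $0.hasPrefix("#") || $0.isEmpty ? $0 : $0 + "?" + self.token }
                .joined(separator: "\n")
            loadingRequest.dataRequest?.respond(with: Data(rewritten.utf8))
            loadingRequest.finishLoading()
        }.resume()
        return true
    }
}

// Usage: swap the scheme before creating the asset, e.g.
// let asset = AVURLAsset(url: URL(string: "custom-hls://example.com/hls.m3u8")!)
// asset.resourceLoader.setDelegate(loader, queue: DispatchQueue(label: "loader"))
```

Only the playlist needs the custom scheme here; once the segment URIs in the rewritten playlist carry plain http URLs with the token, AVPlayer downloads them directly.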
Post not yet marked as solved
I am playing a video with AVPlayer, how can I get the float value of the FPS of the video?
I read that it's possible with AVAssetTrack, but I don't know how to implement it.
On Swift/SwiftUI
Thanks!
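A minimal sketch using AVAssetTrack's nominalFrameRate (which is already a Float):

```swift
import AVFoundation

// Sketch: read the frame rate from the first video track of an asset.
// nominalFrameRate is a Float, e.g. 29.97 for NTSC-style content.
func frameRate(of asset: AVAsset) -> Float? {
    guard let track = asset.tracks(withMediaType: .video).first else { return nil }
    return track.nominalFrameRate
}

// Usage with an AVPlayer (the player and its currentItem are assumed):
// if let item = player.currentItem, let fps = frameRate(of: item.asset) { print(fps) }
```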
Post not yet marked as solved
I'm using AVPlayer for playing m3u8.
I want more control over the downloaded audio chunks, for example re-downloading a chunk after a failed attempt.
If some chunks are unavailable, the player still plays without returning an error.
Can you point me to some articles or examples?
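AVPlayer doesn't expose per-segment retry hooks as far as I know, but it does publish an error log you can observe to at least detect failed segment downloads. A sketch (`item` stands for an AVPlayerItem owned elsewhere):

```swift
import AVFoundation

// Sketch: observe the player item's error log to detect failed chunks.
func observeSegmentErrors(on item: AVPlayerItem) -> NSObjectProtocol {
    return NotificationCenter.default.addObserver(
        forName: .AVPlayerItemNewErrorLogEntry,
        object: item,
        queue: .main
    ) { _ in
        guard let event = item.errorLog()?.events.last else { return }
        // errorStatusCode and uri identify which chunk failed and why.
        print("segment error \(event.errorStatusCode) for \(event.uri ?? "?")")
    }
}
```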
Post not yet marked as solved
Hello,
I'm using systemMusicPlayer to play an Apple Music live radio station obtained from the Apple Music API, but it doesn't work. How can I do that?
Error:
Test[46751:13235249] [SDKPlayback] Failed to prepareToPlay error: Error Domain=MPMusicPlayerControllerErrorDomain Code=6 "Failed to prepare to play" UserInfo={NSDebugDescription=Failed to prepare to play}
My implementation:
let musicPlayerController = MPMusicPlayerController.systemMusicPlayer
musicPlayerController.beginGeneratingPlaybackNotifications()
musicPlayerController.setQueue(with: "ra.978194965")
musicPlayerController.play()
API response:
{
  "id": "ra.978194965",
  "type": "stations",
  "href": "/v1/catalog/us/stations/ra.978194965",
  "attributes": {
    "artwork": {
      "width": 4320,
      "url": "https://is2-ssl.mzstatic.com/image/thumb/Features114/v4/e5/10/76/e5107683-9e51-ebc5-3901-d8fbd65f2c2a/source/{w}x{h}sr.jpeg",
      "height": 1080,
      "textColor3": "332628",
      "textColor2": "120509",
      "textColor4": "33272a",
      "textColor1": "000000",
      "bgColor": "f4f4f4",
      "hasP3": false
    },
    "url": "https://music.apple.com/us/station/apple-music-1/ra.978194965",
    "mediaKind": "audio",
    "supportedDrms": [
      "fairplay",
      "playready",
      "widevine"
    ],
    "requiresSubscription": false,
    "name": "Apple Music 1",
    "kind": "streaming",
    "radioUrl": "itsradio://music.apple.com/us/station/ra.978194965",
    "playParams": {
      "id": "ra.978194965",
      "kind": "radioStation",
      "format": "stream",
      "stationHash": "CgkIBRoFlaS40gMQBA",
      "mediaType": 0
    },
    "editorialNotes": {
      "name": "Apple Music 1",
      "short": "The new music that matters.",
      "tagline": "The new music that matters."
    },
    "isLive": true
  }
}
Thank you!
Best regards,
MichaelNg
Post not yet marked as solved
I'm using a picker to pick and play a video, and that part works. To show the video I use this line:
PhotoPickerResultView(result: photoPickerService.results[0])
and this part works fine too; it comes from this:
struct PhotoPickerResultView: View {
    var result: PHPickerResult

    enum MediaType {
        case loading, error, video
    }

    @State private var loaded = false
    @State private var url: URL?
    @State private var mediaType: MediaType = .loading
    @State private var latestErrorDescription = ""

    var body: some View {
        Group {
            switch mediaType {
            case .loading:
                ProgressView()
            case .error:
                VStack {
                    Image(systemName: "exclamationmark.triangle.fill")
                    Text(latestErrorDescription).font(.caption)
                }
                .foregroundColor(.gray)
            case .video:
                if url != nil {
                    VideoPlayer(player: AVPlayer(url: url!))
                    ...
My question is: how can I use or implement custom buttons for the AVPlayer? Like this:
@State private var player1 = AVPlayer(url: URL(string: "https...mp4")!)

VideoPlayer(player: player1)
Button {
    player1.play()
} label: {
    Text(" PLAY ")
}
Button {
    player1.pause()
} label: {
    Text(" PAUSE ")
}
from this line:
PhotoPickerResultView(result: photoPickerService.results[0])
???
Or what do I have to change so I can use those custom AVPlayer buttons?
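One approach that should make the custom buttons work with a picked video: create the AVPlayer once in @State when the URL is known, and hand the same instance to both VideoPlayer and the buttons. A sketch (the view name and URL handling are illustrative, not the picker's actual API):

```swift
import SwiftUI
import AVKit

// Sketch: share one AVPlayer between VideoPlayer and the buttons.
// Creating the player inline (AVPlayer(url:)) each time makes a fresh
// instance the buttons can't reach.
struct PickedVideoView: View {
    let url: URL
    @State private var player: AVPlayer?

    var body: some View {
        VStack {
            VideoPlayer(player: player)
            HStack {
                Button("PLAY") { player?.play() }
                Button("PAUSE") { player?.pause() }
            }
        }
        .onAppear {
            // Create once so the buttons control the visible player.
            if player == nil { player = AVPlayer(url: url) }
        }
    }
}
```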
Thanks
Post not yet marked as solved
Hello,
I'm the developer of an Apple Music app called Soor, I've been recently working on adding Catalyst support to the app.
However, I've noticed some severe bugs while setting the queue for playing non-library items on macOS 12.2.
Both MPMusicPlayerPlayParametersQueueDescriptor and MPMusicPlayerStoreQueueDescriptor fail to play items using valid playback store identifiers.
The console logs the following errors:
[SDKPlayback] systemMusicPlayer _establishConnectionIfNeeded timeout [ping did not pong]
[SDKPlayback] Failed to prepareToPlay error: Error Domain=NSOSStatusErrorDomain Code=9205 "(null)"
I have filed radars for this along with sample projects showcasing the issue. FB9890270 and FB9890331.
Here's a gist of the sample code, for which the player either completely fails to set the queue or the now-playing item stays nil.
/// These are valid playback store ids retrieved from Apple Music API.
/// You may replace them with any valid playback store IDs of your choice.
let playbackStoreIDs = ["1588418743", "1604815955", "1596475453", "1562346959", "1596475469", "1596475460", "1580955750", "1591442362", "1607324602", "1531596345"]
var playParams = [MPMusicPlayerPlayParameters]()
for playbackStoreID in playbackStoreIDs {
    let param = MPMusicPlayerPlayParameters(dictionary: ["id": playbackStoreID, "kind": "song"])!
    playParams.append(param)
}
let queueDesc = MPMusicPlayerPlayParametersQueueDescriptor(playParametersQueue: playParams)
queueDesc.startItemPlayParameters = playParams[3]
player.setQueue(with: queueDesc)
player.play()
Has anyone managed to playback music correctly using only playback store ids on Catalyst?
Post not yet marked as solved
I use encrypted content in my app. Before playing, I request a decryption key, which lets me play the content online. I want to play content offline too, so I download the content. But how can I store the decryption keys?
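For FairPlay, the usual route is AVContentKeySession with a persistable key request: exchange the SPC with your key server as normal, then convert the CKC into a persistable key blob you can write to disk and replay offline. A rough sketch (the key-server round trip, application certificate, content identifier, and storage path are all placeholders for your own setup):

```swift
import AVFoundation

// Sketch of the FairPlay offline key flow with AVContentKeySession.
final class OfflineKeyDelegate: NSObject, AVContentKeySessionDelegate {
    // Placeholders: your own key-server call and storage location.
    func requestCKC(spc: Data) -> Data { Data() }
    let keyFileURL = URL(fileURLWithPath: "offline-content.key")

    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVContentKeyRequest) {
        // Upgrade the online request to a persistable one.
        try? keyRequest.respondByRequestingPersistableContentKeyRequest()
    }

    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVPersistableContentKeyRequest) {
        let appCert = Data() // your FairPlay application certificate
        let contentID = Data("skd://content-id".utf8) // placeholder identifier
        keyRequest.makeStreamingContentKeyRequestData(
            forApp: appCert, contentIdentifier: contentID, options: nil
        ) { spc, error in
            guard let spc = spc else { return }
            let ckc = self.requestCKC(spc: spc) // key-server round trip
            if let keyBlob = try? keyRequest.persistableContentKey(
                fromKeyVendorResponse: ckc, options: nil) {
                // Store the blob securely and feed it back to a later key
                // request (via AVContentKeyResponse) for offline playback.
                try? keyBlob.write(to: self.keyFileURL, options: .atomic)
            }
        }
    }
}
```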
Hello,
I have an app that uses the MediaPlayer framework (applicationQueuePlayer specifically) and since the introduction of iOS 14.6 and Lossless audio, some users have been reporting that sometimes they cannot play a song past 15 seconds without playback either pausing or the app making a screeching noise. As far as I knew, this was supposed to be fixed in 14.7, but users running 14.7.1 are reporting it to me whether Lossless is on or off. The problem seems to be intermittent -- there for one user one day and gone the next, though sometimes it sticks around. I am pretty much never able to reproduce it for myself.
My code to load and play the player is pretty simple and has not changed since well before this issue started cropping up. I run setQueue and set the queue with either a descriptor of storeIDs for Apple Music Items or an MPMediaItemCollection for library items, then call Play.
Is there anything I can do about this issue? Are you guys still working on this server side? I need to continue supporting users on older OSes so I am not yet able to use the new MusicKit for Swift player. I don't know if it's fixed there or not.
Thanks! (Tagging in @JoeKun)
Post not yet marked as solved
So I am trying to add two buttons to the view of my SwiftUI app that will allow a user to fast forward and rewind the audio clip.
I already have it working on the Lock Screen and notification bar, but in the app I can't figure out how to link two buttons to these actions.
I am wondering if anyone is able to assist.
In the MusicCore.swift I have the following
func setupRemoteTransportControls() {
    // Get the shared MPRemoteCommandCenter
    let commandCenter = MPRemoteCommandCenter.shared()

    let changePlaybackPositionCommand = commandCenter.changePlaybackPositionCommand
    changePlaybackPositionCommand.isEnabled = true
    changePlaybackPositionCommand.addTarget { event in
        let seconds = (event as? MPChangePlaybackPositionCommandEvent)?.positionTime ?? 0
        let time = CMTime(seconds: seconds, preferredTimescale: 1)
        self.player?.seek(to: time)
        return .success
    }

    let skipBackwardCommand = commandCenter.skipBackwardCommand
    if MusicPlayer.mediatype == "podcast" {
        skipBackwardCommand.isEnabled = true
        skipBackwardCommand.preferredIntervals = [NSNumber(value: 10)]
        skipBackwardCommand.addTarget(handler: skipBackward)
    } else {
        skipBackwardCommand.isEnabled = false
    }

    let skipForwardCommand = commandCenter.skipForwardCommand
    if MusicPlayer.mediatype == "podcast" {
        skipForwardCommand.isEnabled = true
        skipForwardCommand.preferredIntervals = [NSNumber(value: 30)]
    } else {
        skipForwardCommand.isEnabled = false
    }
    skipForwardCommand.addTarget(handler: skipForward)

    // Add handler for Play Command
    commandCenter.playCommand.addTarget { [unowned self] event in
        if self.player?.rate == 0.0 {
            self.player?.play()
            return .success
        }
        return .commandFailed
    }

    // Add handler for Pause Command
    commandCenter.pauseCommand.addTarget { [unowned self] event in
        if self.player?.rate == 1.0 {
            self.player?.pause()
            MPNowPlayingInfoCenter.default().nowPlayingInfo?[MPNowPlayingInfoPropertyElapsedPlaybackTime] = Int(Double((self.player?.currentTime().seconds)!))
            return .success
        }
        return .commandFailed
    }
}

func skipBackward(_ event: MPRemoteCommandEvent) -> MPRemoteCommandHandlerStatus {
    //self.player?.seek(to: CMTimeMakeWithSeconds(CMTimeGetSeconds((self.player?.currentTime())!).advanced(by: -30), preferredTimescale: 1))
    //print(CMTimeGetSeconds((self.player?.currentTime())!)) //Output: 42
    //print(event.interval)
    let currentTime = self.player?.currentTime()
    self.player?.seek(to: CMTime(seconds: currentTime!.seconds - 10, preferredTimescale: 1), completionHandler: { isCompleted in
        if isCompleted {
            MPNowPlayingInfoCenter.default().nowPlayingInfo?[MPNowPlayingInfoPropertyElapsedPlaybackTime] = Int(Double((self.player?.currentTime().seconds)!))
        }
    })
    return .success
}

func skipForward(_ event: MPRemoteCommandEvent) -> MPRemoteCommandHandlerStatus {
    //self.player?.seek(to: CMTimeMakeWithSeconds(CMTimeGetSeconds((self.player?.currentTime())!).advanced(by: 30), preferredTimescale: 1))
    let currentTime = self.player?.currentTime()
    self.player?.seek(to: CMTime(seconds: currentTime!.seconds + 30, preferredTimescale: 1), completionHandler: { isCompleted in
        if isCompleted {
            MPNowPlayingInfoCenter.default().nowPlayingInfo?[MPNowPlayingInfoPropertyElapsedPlaybackTime] = Int(Double((self.player?.currentTime().seconds)!))
        }
    })
    return .success
}
But how can I call that from inside a View?
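One way is to expose the same seek logic as plain methods on an observable model and call those from the buttons. A sketch (the class name, `player` property, and intervals are stand-ins for your own types):

```swift
import SwiftUI
import AVFoundation

// Sketch: share skip logic between remote commands and in-app buttons.
final class MusicCore: ObservableObject {
    let player = AVPlayer()

    func skip(by seconds: Double) {
        let current = player.currentTime().seconds
        // A fine timescale avoids the whole-second rounding that
        // preferredTimescale: 1 would introduce.
        player.seek(to: CMTime(seconds: current + seconds, preferredTimescale: 600))
    }
}

struct TransportButtons: View {
    @ObservedObject var core: MusicCore

    var body: some View {
        HStack {
            Button("Back 10") { core.skip(by: -10) }
            Button("Forward 30") { core.skip(by: 30) }
        }
    }
}
```

The remote-command handlers can then call the same `skip(by:)` method, so the Lock Screen and in-app buttons stay in sync.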
Post not yet marked as solved
Hi,
We have a video player app which seems to crash randomly when the player is closed.
Crash Info:
crash_info_entry_0 : BUG IN CLIENT OF LIBDISPATCH: dispatch_sync called on queue already owned by current thread
Stack trace:
0 libdispatch.dylib 0x12dd0 __DISPATCH_WAIT_FOR_QUEUE__ + 484
1 libdispatch.dylib 0x12900 _dispatch_sync_f_slow + 144
2 MediaToolbox 0x1f6374 FigCaptionRendererSessionSetPlayer + 68
3 MediaToolbox 0x2a7b30 setPlayerDo + 184
4 libdispatch.dylib 0x3950 _dispatch_client_callout + 20
5 libdispatch.dylib 0x12a70 _dispatch_lane_barrier_sync_invoke_and_complete + 56
6 MediaToolbox 0x2a7a6c -[FigSubtitleCALayer setPlayer:] + 64
7 AVFCore 0x6e2c0 -[AVPlayer _removeLayer:videoLayer:closedCaptionLayer:subtitleLayer:interstitialLayer:] + 572
8 AVFCore 0x30dc8 -[AVPlayerLayer dealloc] + 324
9 Foundation 0x2cf78 NSKVODeallocate + 216
10 QuartzCore 0x68c54 CA::Layer::free_transaction(CA::Transaction*) + 404
11 QuartzCore 0x4f284 CA::Transaction::commit() + 952
12 MediaToolbox 0x3375bc setBounds + 376
13 MediaToolbox 0x200160 UpdateLayoutContext + 892
14 MediaToolbox 0x1feda8 onCaptionInputDo + 212
15 libdispatch.dylib 0x3950 _dispatch_client_callout + 20
16 libdispatch.dylib 0xb0ac _dispatch_lane_serial_drain + 664
17 libdispatch.dylib 0xbc10 _dispatch_lane_invoke + 392
18 libdispatch.dylib 0x16318 _dispatch_workloop_worker_thread + 656
19 libsystem_pthread.dylib 0x11b0 _pthread_wqthread + 288
20 libsystem_pthread.dylib 0xf50 start_wqthread + 8
Post not yet marked as solved
The iTunesMetadataTrackSubTitle value shows up wrapped in double quotes in the tvOS player's info tab. Can we avoid these double quotes?
Post not yet marked as solved
Hello,
https://stackoverflow.com/questions/51797343/setting-http-header-fields-for-avurlassets
Is it okay to use the method using "AVURLAssetHTTPHeaderFieldsKey" described on this page?
Post not yet marked as solved
Is there any way of discovering the name of the device when the user starts casting to it? The only thing I found was this:
let route = AVAudioSession.sharedInstance().currentRoute
for output in route.outputs where output.portType == .airPlay {
    infoDict["deviceName"] = output.portName
    infoDict["portType"] = output.portType.rawValue
}
but the output.portName returns the portType instead of the portName.
Post not yet marked as solved
Attempting to pull a podcast via its ID leads to a CORS error, because the response carries the header:
access-control-allow-origin: http://localhost:3000
This means that locally I can pull the required podcast episode list, but as soon as I am in staging or production, the whole thing falls over.
Has anyone come across this issue, and does anyone have any workarounds?
Post not yet marked as solved
The EarPods toggle-button event used to be received on my iPhone 7.
But it hasn't been received since yesterday, and the source code has never changed.
The source code is roughly like this:
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
MPRemoteCommandCenter *commandCenter = [MPRemoteCommandCenter sharedCommandCenter];
[commandCenter.togglePlayPauseCommand addTarget:self action:@selector(onTogglePlayPause:)];
commandCenter.togglePlayPauseCommand.enabled = YES;

// The handler receives an MPRemoteCommandEvent, not a handler status.
- (MPRemoteCommandHandlerStatus)onTogglePlayPause:(MPRemoteCommandEvent *)event {
    NSLog(@"toggle event! - never called....");
    return MPRemoteCommandHandlerStatusSuccess;
}
Environment: iPhone 7 (iOS 15.2), Xcode 13.1
How do I get this event?