I am using https://developer.apple.com/documentation/applemusicapi/add-tracks-to-a-library-playlist
to add tracks to playlists. This endpoint works fine for all playlists except for collaborative playlists.
For collaborative playlists I get the following 500 error as a response:
{
    "errors": [
        {
            "id": "<some id>",
            "title": "Upstream Service Error",
            "detail": "Unable to update tracks",
            "status": "500",
            "code": "50001"
        }
    ]
}
Steps to reproduce:
Create a playlist in your library.
Use the API to add a song.
Confirm that it works.
Make that same playlist collaborative.
Update the playlist ID in your API request (making a playlist collaborative changes its ID).
Confirm that you get the 500 error.
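For reference, a minimal sketch of the failing request against the documented endpoint; the tokens, playlist ID, and song ID below are placeholders:
import Foundation

let playlistID = "p.XXXXXXXXX"            // library playlist ID (changes when the playlist becomes collaborative)
let developerToken = "<developer token>"
let musicUserToken = "<music user token>"

var request = URLRequest(url: URL(string:
    "https://api.music.apple.com/v1/me/library/playlists/\(playlistID)/tracks")!)
request.httpMethod = "POST"
request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
request.setValue(musicUserToken, forHTTPHeaderField: "Music-User-Token")
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
request.httpBody = try? JSONSerialization.data(withJSONObject: [
    "data": [["id": "<catalog song id>", "type": "songs"]]
])

URLSession.shared.dataTask(with: request) { data, response, _ in
    // For a collaborative playlist this comes back as HTTP 500 / code 50001.
    if let http = response as? HTTPURLResponse { print("status:", http.statusCode) }
    if let data = data { print(String(data: data, encoding: .utf8) ?? "") }
}.resume()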
                    
                  
I've been wondering if there is a way to modify or even disable the tones that indicate channel states. The behaviour of these tones seems like a black box with little documentation.
During our migration to Apple's Push to Talk framework we've noticed a few scenarios where a tone is played that doesn't match certain certification requirements. For example, moving from one channel to another produces a tone that would fail a test case. I understand the reasoning fully, as it marks that the channel is ready to transmit or receive, but it doesn't mirror the behaviour of TETRA, which is what is wanted in this case.
I'm also wondering whether there is any way to communicate feedback about the PT Framework directly?
                    
                  
                
                    
Hi, when using ApplicationMusicPlayer from MusicKit, my app automatically gets the media controls on the lock screen: play/pause, skip buttons, playback position, etc.
I would like to customize these. I've tried a bunch of things, e.g. using MPRemoteCommandCenter, but so far without success.
Does anyone know how I can customize the media controls of ApplicationMusicPlayer?
Thank you.
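For context, this is roughly the kind of MPRemoteCommandCenter code I tried (a sketch of one attempt, not a working solution):
import MediaPlayer

// Disable or override individual remote commands; ApplicationMusicPlayer seems
// to keep providing its own lock screen controls regardless.
let commandCenter = MPRemoteCommandCenter.shared()

commandCenter.nextTrackCommand.isEnabled = false
commandCenter.previousTrackCommand.isEnabled = false

commandCenter.playCommand.addTarget { _ in
    // custom play handling would go here
    return .success
}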
                    
                  
                
                    
                      I'm working on a project to support spatial audio editing, using this sample project as a reference: https://developer.apple.com/documentation/Cinematic/editing-spatial-audio-with-an-audio-mix
This sample works well on an unedited capture, but does not work for a capture that has already been edited.
The failure is occurring at "let audioInfo = try await CNAssetSpatialAudioInfo(asset: myAsset)", which is throwing "no eligible audio tracks in asset".
I also find that for already edited captures, CNAssetSpatialAudioInfo.assetContainsSpatialAudio returns false.
By "already edited" I mean that if I take a spatial capture with my iPhone 16, edit that capture in the Photos app using the Cinematic effect, and then save the edited output (e.g. edited_capture.mov), I can't import that edited_capture.mov into my project as a spatial audio asset.
Is this intentional behavior or a bug?
If it's intentional, can you describe why?
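For completeness, a condensed sketch of the failing check; the call shapes are taken from the sample project, so the exact Cinematic API signatures may differ:
import AVFoundation
import Cinematic

// `editedCaptureURL` is a placeholder for the .mov exported from Photos after
// applying the Cinematic effect to a spatial capture.
func inspectSpatialAudio(at editedCaptureURL: URL) async {
    let myAsset = AVURLAsset(url: editedCaptureURL)
    do {
        // Throws "no eligible audio tracks in asset" for the edited capture,
        // while the unedited original works fine.
        let audioInfo = try await CNAssetSpatialAudioInfo(asset: myAsset)
        print("spatial audio info:", audioInfo)
    } catch {
        // CNAssetSpatialAudioInfo.assetContainsSpatialAudio also reports false
        // for the same edited asset.
        print("failed:", error)
    }
}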
                    
                  
                
              
                
              
              
                
Topic: Media Technologies
SubTopic: Audio
                    
                      Hi Apple Team,
We integrated FairPlay Streaming Server SDK v3 into our MDRM platform in 2017; the system has been stable and untouched since. As you know, both Widevine and PlayReady require their server SDKs to be upgraded regularly. We would like to know whether Apple imposes similar requirements for upgrading the FPS SDK, or whether we may continue using the old one without any updates.
Thanks for your support!
                    
                  
                
                    
                      (This only started happening as of Xcode 26.)
I know macOS and watchOS don't support this property, but all other platforms do (did?) up until I upgraded Xcode. Now when I compile I get this:
Value of type 'AVPlayerItem' has no member 'externalMetadata'
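For reference, the usage is essentially this reduced sketch; the platform guard reflects my understanding that only macOS and watchOS lack the property:
import AVFoundation

func applyExternalMetadata(_ items: [AVMetadataItem], to playerItem: AVPlayerItem) {
    #if !os(macOS) && !os(watchOS)
    // Built fine for iOS/tvOS/visionOS before Xcode 26; now fails with
    // "Value of type 'AVPlayerItem' has no member 'externalMetadata'".
    playerItem.externalMetadata = items
    #endif
}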
                    
                  
                
                    
                      Hello everyone,
I'm looking for a definitive clarification on how to completely disable all video stabilization, including the hardware OIS, using AVFoundation. The goal is to achieve a completely raw, unstabilized video feed, which is crucial when using external equipment like gimbals to avoid conflicting stabilization motions.
My research points to using the AVCaptureConnection property preferredVideoStabilizationMode and setting it to AVCaptureVideoStabilizationMode.off.
The documentation for the .off case states:
A mode that doesn’t stabilize video capture.
This description is slightly ambiguous: it's unclear whether it only covers software-level stabilization (EIS, EIS+OIS, etc.) or whether it guarantees the complete deactivation of the physical OIS module. For professional video applications, this is a critical distinction.
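For concreteness, this is the configuration in question (a minimal sketch; the output is a placeholder for whichever capture output the connection comes from):
import AVFoundation

// Request no stabilization on the video connection. Whether this also guarantees
// that the hardware OIS module is physically inactive is exactly the open question.
func disableStabilization(on output: AVCaptureVideoDataOutput) {
    guard let connection = output.connection(with: .video),
          connection.isVideoStabilizationSupported else { return }
    connection.preferredVideoStabilizationMode = .off
    // Once the session is running, the mode actually in effect can be inspected:
    print("active stabilization mode:", connection.activeVideoStabilizationMode.rawValue)
}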
So, I'd like to ask the community:
Has anyone been able to definitively confirm that setting preferredVideoStabilizationMode to .off also disables the hardware OIS? Are there any known tests or documentation that prove this behavior?
Is there an alternative or more direct method to ensure the OIS module is physically inactive during video capture?
What is the community's best practice for ensuring absolutely no stabilization is applied to the video pipeline?
Any insights or shared experiences on this topic would be greatly appreciated.
Thank you!
                    
                  
                
                    
                      I'm creating Live Photos programmatically in my app using the Photos and AVFoundation frameworks. While the Live Photos work perfectly in the Photos app (long press shows motion), users cannot set them as motion wallpapers. The system shows "Motion not available" message.
Here's my approach for creating Live Photos:
import AVFoundation
import Photos

// 1. Create video with required metadata
let writer = try AVAssetWriter(outputURL: videoURL, fileType: .mov)
let contentIdentifier = AVMutableMetadataItem()
contentIdentifier.identifier = .quickTimeMetadataContentIdentifier
contentIdentifier.value = assetIdentifier as NSString
writer.metadata = [contentIdentifier]
// Video settings: 882x1920, H.264, 30fps, 2 seconds
// Added still-image-time metadata at middle frame
// 2. Create HEIC image with asset identifier
var makerAppleDict: [String: Any] = [:]
makerAppleDict["17"] = assetIdentifier  // Required maker-note key for Live Photo pairing
var metadata: [String: Any] = [:]       // properties dictionary written to the HEIC destination
metadata[kCGImagePropertyMakerAppleDictionary as String] = makerAppleDict
// 3. Generate Live Photo
PHLivePhoto.request(
    withResourceFileURLs: [photoURL, videoURL],
    placeholderImage: nil,
    targetSize: .zero,
    contentMode: .aspectFit
) { livePhoto, info in
    // Success - Live Photo created
}
// 4. Save to Photos library (both resources on the same creation request)
try await PHPhotoLibrary.shared().performChanges {
    let creationRequest = PHAssetCreationRequest.forAsset()
    creationRequest.addResource(with: .photo, fileURL: photoURL, options: nil)
    creationRequest.addResource(with: .pairedVideo, fileURL: videoURL, options: nil)
}
What I've Tried
Matching exact video specifications from Camera app (882x1920, H.264, 30fps)
Adding all documented metadata (content identifier, still-image-time)
Testing various video durations (1.5s, 2s, 3s)
Different image formats (HEIC, JPEG)
Comparing with exiftool against working Live Photos
Expected Behavior
Live Photos created programmatically should be eligible for motion wallpapers, just like those from the Camera app.
Actual Behavior
System shows "Motion not available" and only allows setting as static wallpaper.
Any insights or workarounds would be greatly appreciated. This is affecting our users who want to use their created content as wallpapers.
Questions
Are there additional undocumented requirements for Live Photos to be wallpaper-eligible?
Is this a deliberate restriction for third-party apps, or a bug?
Has anyone successfully created Live Photos that work as motion wallpapers?
Environment
iOS 17.0 - 18.1
Xcode 16.0
Tested on iPhone 16 Pro
                    
                  
                
              
                
              
              
                
Topic: Media Technologies
SubTopic: Photos & Camera
Tags: LivePhotosKit JS, PhotoKit, Core Image, AVFoundation
                    
                      In iOS 26 (Developer Beta), the AVCaptureMetadataOutputObjectsDelegate no longer receives callbacks when metadataOutput.metadataObjectTypes = [.face] is set. On earlier iOS versions the issue does not occur. Interestingly, face detection works if I set the sessionPreset to .medium, but not with .high — except on the iPhone 16 Pro Max, where it works regardless.
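For reference, a condensed sketch of the setup that stops delivering callbacks; the preset line is where the behavior changes (class and device choice are placeholders):
import AVFoundation

final class FaceObserver: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    let session = AVCaptureSession()

    func configure() throws {
        session.sessionPreset = .high   // callbacks stop on iOS 26; .medium still works

        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .front) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let metadataOutput = AVCaptureMetadataOutput()
        session.addOutput(metadataOutput)
        metadataOutput.setMetadataObjectsDelegate(self, queue: .main)
        metadataOutput.metadataObjectTypes = [.face]

        session.startRunning()
    }

    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        print("faces:", metadataObjects.count)   // never called with .high on iOS 26 beta
    }
}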
                    
                  
                
                    
                      I am developing an iOS app that needs to play spoken audio on demand from a server, while ducking the audio of background music from another app (e.g., SoundtrackYourBrand or Apple Music). This must work even when the app is in the background, and the server dictates when and what audio is played. Ideally, the message should be played within a minute of the server requesting it.
Current Attempt & Observations
I initially tried using Firebase Cloud Messaging (FCM) silent notifications to send a URL to an audio file, which the app would then play using AVPlayer.
This works consistently when the app is active, but in the background, it only works about 60% of the time.
In cases where it fails, iOS ducks the background music (e.g., from SoundtrackYourBrand) but never plays the spoken audio.
Interestingly, when I play the audio without enabling audio ducking, it seems to work 100% of the time from my limited testing, even in the background.
The app has background modes enabled for Audio, Background Fetch, and Remote Notifications.
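For reference, the playback path in the silent-notification handler looks roughly like this (a simplified sketch; the session options shown are my current configuration assumptions):
import AVFoundation

final class AnnouncementPlayer {
    private var player: AVPlayer?

    // Called with the URL delivered by the server via the silent push.
    func play(url: URL) throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, options: [.duckOthers])
        try session.setActive(true)

        player = AVPlayer(url: url)   // keep a strong reference for the whole playback
        player?.play()
        // On completion, deactivate with .notifyOthersOnDeactivation so the
        // other app's music returns to full volume.
    }
}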
Best Approach to Achieve This?
I’d like guidance on the best Apple-compliant approach to reliably play audio on command from the server, even when the app is in the background. Some possible paths:
Ensuring the app remains active in the background – Are there recommended ways to prevent the app from getting suspended, such as background tasks, a special background mode, or a persistent connection to the server?
Alternative triggering mechanisms – Would something like VoIP, Push-to-Talk, or another background service be better suited for this use case?
Built-in iOS speech synthesis (AVSpeechSynthesizer) – If playing external audio is unreliable, would generating speech dynamically from text be a more robust approach?
Streaming audio instead of sending a URL – Could continuous streaming from the server keep the app active and allow playback at the right moment?
I want to ensure the solution is reliable and works 100% of the time when needed. Any recommendations on the best approach for this would be greatly appreciated.
Thank you for your time and guidance.
                    
                  
                
                    
                      Hi,
I'm developing a MusicKit integration in my iOS app, and I want to let the user select songs from their recently played items (done). The problem is that the queue is not auto-generated, so the user has to select another song if they want playback to continue.
Is there any method to ask for similar or recommended songs based on a song the user has already selected?
It would be really great :)
Also, if you know: is there any publisher for the playback duration, or do I need to use a timer? Thanks.
David.
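For reference, a minimal sketch of one direction that looks related, MusicKit's personal recommendations request; whether it can be seeded from a specific song is part of what I'm asking:
import MusicKit

// Sketch: fetch the user's personal recommendations and inspect them.
func loadRecommendations() async throws {
    let request = MusicPersonalRecommendationsRequest()
    let response = try await request.response()
    for recommendation in response.recommendations {
        print(recommendation)   // groups of suggested albums, playlists, stations, …
    }
}

// For the duration question: ApplicationMusicPlayer.shared.playbackTime can be
// polled with a Timer; I haven't found a publisher for it.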
                    
                  
                
                    
                      Please include the line below in follow-up emails for this request.
Case-ID: 11089799
When using AVSpeechUtterance and setting it to play in Mandarin, if Siri is set to Cantonese on iOS 18, it will be played in Cantonese. There is no such issue on iOS 17 and 16.
1. Create the utterance with a Mandarin voice and speak it:
let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: textView.text)
let voice = AVSpeechSynthesisVoice(language: "zh-CN")
utterance.voice = voice
synthesizer.speak(utterance)
2. In the phone settings, Siri is set to Cantonese.
                    
                  
                
                    
                      We are encountering a critical, intermittently occurring crash issue when accessing photo data using PHAssetResourceManager.writeDataForAssetResource on iOS 18. The problem does not arise on iOS 17 or earlier versions.
We have been unable to identify a consistent reproduction path. Based on user feedback, the issue seems to involve Live Photo and Raw image files.
Our investigation has revealed that the crash occurs in the +[PISchema identifier] method of the PhotoImaging Framework. When called manually, this method causes a crash on iOS 18 but works without issues on iOS 17.
Reproduction Steps:
1. Fetch the PHAsset.
2. Get the PHAssetResource via [PHAssetResource assetResourcesForAsset:].
3. Call [PHAssetResourceManager writeDataForAssetResource:toFile:options:completionHandler:].
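In Swift, these three steps correspond roughly to the following sketch (the local identifier and destination URL are placeholders):
import Photos

// The abort happens inside writeData(for:toFile:options:completionHandler:)
// on iOS 18 for some Live Photo / RAW assets.
func writeFirstResource(of localIdentifier: String, to destinationURL: URL) {
    guard let asset = PHAsset.fetchAssets(withLocalIdentifiers: [localIdentifier],
                                          options: nil).firstObject,
          let resource = PHAssetResource.assetResources(for: asset).first else { return }

    PHAssetResourceManager.default().writeData(for: resource,
                                               toFile: destinationURL,
                                               options: nil) { error in
        print("finished, error:", error?.localizedDescription ?? "none")
    }
}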
Crash Log:
Incident Identifier: CFD60092-FDB1-43B4-BA42-3F507F7B8B96
CrashReporter Key:   260b4780989083a54e0cb451930fe9a3bed64862
Hardware Model:      iPhone13,4
AppStoreTools:       16C5031b
AppVariant:          1:iPhone13,4:18
Code Type:           ARM-64 (Native)
Role:                Foreground
Parent Process:      launchd [1]
Date/Time:           2025-02-15 19:07:57.7054 +0800
Launch Time:         2025-02-15 19:07:55.4106 +0800
OS Version:          iPhone OS 18.3.1 (22D72)
Release Type:        User
Baseband Version:    5.20.03
Report Version:      104
Exception Type:  EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: SIGNAL 6 Abort trap: 6
Terminating Process: mCloud_iPhone [11109]
Triggered by Thread:  11
Application Specific Information:
abort() called
Thread 11 name:   Dispatch queue: com.apple.NSXPCConnection.m-user.com.apple.photos.service
Thread 11 Crashed:
0   libsystem_kernel.dylib        	       0x1e850b2d4 __pthread_kill + 8
1   libsystem_pthread.dylib       	       0x221b4959c pthread_kill + 268
2   libsystem_c.dylib             	       0x19ec24b08 abort + 128
3   NeutrinoCore                  	       0x1bdcdbdec -[NUAssertionPolicyAbort notifyAssertion:] + 68
4   NeutrinoCore                  	       0x1bdcdbbf4 -[NUAssertionPolicyComposite notifyAssertion:] + 160
5   NeutrinoCore                  	       0x1bdcdc098 -[NUAssertionPolicyUnique notifyAssertion:] + 176
6   NeutrinoCore                  	       0x1bdcdb524 -[NUAssertionHandler handleFailureInFunction:file:lineNumber:currentlyExecutingJobName:description:arguments:] + 156
7   NeutrinoCore                  	       0x1bdcdc4bc _NUAssertFailHandler + 176
8   NeutrinoCore                  	       0x1bdc8ea98 -[NUIdentifier initWithNamespace:name:version:] + 2352
9   NeutrinoCore                  	       0x1bdc8eba8 -[NUIdentifier initWithName:version:] + 84
10  NeutrinoCore                  	       0x1bdc8ec10 -[NUIdentifier initWithName:] + 68
11  PhotoImaging                  	       0x1bda54ce4 +[PISchema identifier] + 36
12  PhotoImaging                  	       0x1bda550fc +[PISchema registeredPhotosSchemaIdentifier] + 32
13  PhotoImaging                  	       0x1bd9d7128 +[PIPhotoEditHelper newComposition] + 28
14  PhotoImaging                  	       0x1bd940798 +[PICompositionSerializer deserializeCompositionFromAdjustments:metadata:formatIdentifier:formatVersion:sidecarData:error:] + 160
15  PhotoImaging                  	       0x1bd9412ec +[PICompositionSerializer deserializeCompositionFromData:formatIdentifier:formatVersion:sidecarData:error:] + 224
16  PhotoLibraryServices          	       0x1afabf75c -[PLPhotoEditPersistenceManager loadCompositionFrom:formatIdentifier:formatVersion:sidecarData:error:] + 1856
17  PhotoLibraryServices          	       0x1afabffe4 +[PLPhotoEditPersistenceManager validateAdjustmentData:formatIdentifier:formatVersion:error:] + 108
18  Photos                        	       0x1af4ac360 __167+[PHContentEditingInputRequestContext contentEditingInputRequestContextForAsset:requestID:managerID:networkAccessAllowed:downloadIntent:progressHandler:resultHandler:]_block_invoke + 260
19  Photos                        	       0x1af4ac67c -[PHAdjustmentData(ContentEditingInput) _contentEditing_readableByClientWithVerificationBlock:] + 136
20  Photos                        	       0x1af4ac4b0 -[PHAdjustmentData(ContentEditingInput) _contentEditing_requiredBaseVersionReadableByClient:verificationBlock:] + 88
21  Photos                        	       0x1af4abb8c -[PHContentEditingInputRequestContext _adjustmentBaseVersionFromResult:request:canHandleAdjustmentData:] + 404
22  Photos                        	       0x1af4a911c -[PHContentEditingInputRequestContext produceChildRequestsForRequest:reportingIsLocallyAvailable:isDegraded:result:] + 624
23  Photos                        	       0x1af2c1d10 -[PHMediaRequestContext _produceChildRequestsForRequest:withResult:] + 88
24  Photos                        	       0x1af2c11e8 -[PHMediaRequestContext mediaRequest:didFinishWithResult:] + 88
25  Photos                        	       0x1af505184 -[PHAdjustmentDataRequest _finishFromAsynchronousCallback] + 124
26  Photos                        	       0x1af5050a0 __39-[PHAdjustmentDataRequest startRequest]_block_invoke + 584
27  PhotoLibraryServicesCore      	       0x1b001be8c __106-[PLAssetsdResourceClient adjustmentDataForAsset:networkAccessAllowed:trackCPLDownload:completionHandler:]_block_invoke.86 + 864
28  CoreFoundation                	       0x196dd8e34 __invoking___ + 148
29  CoreFoundation                	       0x196dd7e7c -[NSInvocation invoke] + 428
30  Foundation                    	       0x195a64ae0 __NSXPCCONNECTION_IS_CALLING_OUT_TO_EXPORTED_OBJECT__ + 16
31  Foundation                    	       0x195a63514 -[NSXPCConnection _decodeAndInvokeReplyBlockWithEvent:sequence:replyInfo:] + 532
32  Foundation                    	       0x195a6653c __88-[NSXPCConnection _sendInvocation:orArguments:count:methodSignature:selector:withProxy:]_block_invoke_5 + 188
33  libxpc.dylib                  	       0x221babb80 _xpc_connection_reply_callout + 116
34  libxpc.dylib                  	       0x221b9e2d0 _xpc_connection_call_reply_async + 80
35  libdispatch.dylib             	       0x19eb6b028 _dispatch_client_callout3 + 20
36  libdispatch.dylib             	       0x19eb88b64 _dispatch_mach_msg_async_reply_invoke + 340
37  libdispatch.dylib             	       0x19eb7242c _dispatch_lane_serial_drain + 352
38  libdispatch.dylib             	       0x19eb73158 _dispatch_lane_invoke + 432
39  libdispatch.dylib             	       0x19eb7e38c _dispatch_root_queue_drain_deferred_wlh + 288
40  libdispatch.dylib             	       0x19eb7dbd8 _dispatch_workloop_worker_thread + 540
41  libsystem_pthread.dylib       	       0x221b44680 _pthread_wqthread + 288
42  libsystem_pthread.dylib       	       0x221b42474 start_wqthread + 8
                    
                  
                
              
                
              
              
                
Topic: Media Technologies
SubTopic: Photos & Camera
                    
                      Hi fellow iOS developers! 👋
I've written Swift code that converts a video (from a URL) into a Live Photo after downloading it. The conversion process seems fine, but when I try to set the generated Live Photo as a wallpaper on iOS 17+, it shows the message 'Motion not Available.'
Has anyone else experienced this issue or know why this might be happening? Could it be related to changes in iOS 17 Live Photo handling or the generated file structure? Any help or suggestions would be greatly appreciated! 🙏
                    
                  
                
              
                
              
              
                
Topic: Media Technologies
SubTopic: Photos & Camera
Tags: Photos and Imaging, LivePhotosKit JS, PhotoKit
                    
I am writing an iOS app to present a slide show of assets in a Photos album, in a random order, including videos and Live Photos. I have it all working quite nicely, but for a Live Photo I need to know which effect is selected (Live, Loop, Bounce, Long Exposure, Live Off) to display the image correctly. I can't find any mention of getting this information in the documentation. Does anyone know how to do this? Thanks in advance.
Adrian.
(Xcode 16.1 iOS 18.0)
                    
                  
                
              
                
              
              
                
Topic: Media Technologies
SubTopic: Photos & Camera
                    
                      Hi,
I’m an iOS developer building an app with an use case that needs advanced playback on Apple Music subscription streams, specifically:
•	Real-time tempo change (BPM) during playback — i.e., time-stretch with key-lock, not just crossfade.
•	Beat-matched transitions between tracks.
From what I can tell, this capability seems to exist only for approved partners and isn’t available through public MusicKit.
Question: What’s the official request path to be evaluated for that restricted partner entitlement (application form, questionnaire, NDA, or internal team/BD contact)? If the entitlement identifier is internal, how can I get my account routed to the right Apple Music team?
For reference, publicly announced partners include Algoriddim djay, Serato DJ Pro, rekordbox (AlphaTheta), and Engine DJ—all of which appear to implement mixing features that imply advanced playback (tempo/beat-matching) on Apple Music content. I’d prefer not to share product details publicly for the moment and can provide specifics privately if needed.
Thanks in advance!
                    
                  
                
              
                
              
              
                
Topic: Media Technologies
SubTopic: Audio
Tags: Apple Music API, FairPlay Streaming, MusicKit, AVFoundation
                    
                      I'm working on a photo app and I want to allow the user to display, edit and delete photos. I can fetch all photos using PHAsset.fetchAssets(with: options). This works as intended.
However, I can't seem to find a way to prevent the user from seeing photos from a Shared Library. PHAssetSourceType only offers typeCloudShared, which covers items from a specific shared album, not the Shared Library.
How can I filter by iCloud Shared Library?
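For context, the closest thing I found is includeAssetSourceTypes on PHFetchOptions, but I'm not sure it actually distinguishes the iCloud Shared Library from shared albums; a sketch:
import Photos

// Restrict the fetch to non-cloud-shared sources; whether this excludes the
// iCloud Shared Library (rather than just shared albums) is the open question.
let options = PHFetchOptions()
options.includeAssetSourceTypes = [.typeUserLibrary, .typeiTunesSynced]
let assets = PHAsset.fetchAssets(with: options)
print("fetched \(assets.count) assets")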
                    
                  
                
                    
                      In my app I use AVAssetReaderTrackOutput to extract PCM audio from a user-provided video or audio file and display it as a waveform.
Recently a user reported that the waveform is not in sync with his video, and after receiving the video I noticed that the waveform is in fact twice as long as the video duration, i.e. it shows the audio in slow motion, so to speak.
Until now I was using
CMFormatDescription.audioStreamBasicDescription.mSampleRate
which for this particular user video returns 22'050. But in this case that value seems to be wrong, because the audio track's format list contains two entries with different sample rates, as returned by
CMFormatDescription.audioFormatList.map({ $0.mASBD.mSampleRate })
The first entry has a sample rate of 44'100, the second one 22'050. If I use the first sample rate, the waveform is perfectly in sync with the video.
The problem is that the ratio between the audio data length and the sample rate multiplied by the audio duration is 8, double the ratio for the first audio file (4). In the code below this ratio is given by
Double(length) / (sampleRate * asset.duration.seconds)
When commenting out the line with the sampleRate variable definition in the code below and uncommenting the following line, the ratios for both audio files are 4, which is the expected result. I would expect audioStreamBasicDescription to return the correct sample rate, i.e. the one used by AVAssetReaderTrackOutput, which (I think) somehow merges the stereo tracks. The documentation is sparse, and in particular it’s not documented whether the lower or higher sample rate is used; in this case, it seems like the higher one is used, but audioStreamBasicDescription for some reason returns the lower one.
Does anybody know why this is the case or how I should extract the sample rate of the produced PCM audio data? Should I always take the higher one?
I created FB19620455.
let openPanel = NSOpenPanel()
openPanel.allowedContentTypes = [.audiovisualContent]
openPanel.runModal()
let url = openPanel.urls[0]
let asset = AVURLAsset(url: url)
let assetTrack = asset.tracks(withMediaType: .audio)[0]
let assetReader = try! AVAssetReader(asset: asset)
let readerOutput = AVAssetReaderTrackOutput(track: assetTrack, outputSettings: [AVFormatIDKey: Int(kAudioFormatLinearPCM), AVLinearPCMBitDepthKey: 16, AVLinearPCMIsBigEndianKey: false, AVLinearPCMIsFloatKey: false, AVLinearPCMIsNonInterleaved: false])
readerOutput.alwaysCopiesSampleData = false
assetReader.add(readerOutput)
let formatDescriptions = assetTrack.formatDescriptions as! [CMFormatDescription]
let sampleRate = formatDescriptions[0].audioStreamBasicDescription!.mSampleRate
//let sampleRate = formatDescriptions[0].audioFormatList.map({ $0.mASBD.mSampleRate }).max()!
print(formatDescriptions[0].audioStreamBasicDescription!.mSampleRate)
print(formatDescriptions[0].audioFormatList.map({ $0.mASBD.mSampleRate }))
if !assetReader.startReading() {
    preconditionFailure()
}
var length = 0
while assetReader.status == .reading {
    guard let sampleBuffer = readerOutput.copyNextSampleBuffer(), let blockBuffer = sampleBuffer.dataBuffer else {
        break
    }
    length += blockBuffer.dataLength
}
print(Double(length) / (sampleRate * asset.duration.seconds))
                    
                  
                
                    
                      Hi,
I just generated an HDR10 MVHEVC file; the mediainfo output is below:
Color range                              : Limited
Color primaries                          : BT.2020
Transfer characteristics                 : PQ
Matrix coefficients                      : BT.2020 non-constant
Codec configuration box                  : hvcC+lhvC
Then I generated the segment files with the command below:
mediafilesegmenter --iso-fragmented -t 4  -f av_1 av_new_1.mov
and uploaded the segment files and prog_index.m3u8 to a web server.
I found that I cannot play the HLS stream in Safari; the URL is http://ip/vod/prog_index.m3u8.
I also checked that if I remove the Transfer characteristics: PQ tag when generating the MVHEVC file, run the same mediafilesegmenter command, and upload the files to the web server, the new version of the HLS stream does play in Safari.
Is there any way to play HLS PQ video in Safari? Thanks.
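One thing I have not tried yet is serving a multivariant (master) playlist that declares the PQ transfer function instead of loading prog_index.m3u8 directly. A sketch of such a playlist; the BANDWIDTH, RESOLUTION, and CODECS values are placeholders that would have to match the actual encode:
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-STREAM-INF:BANDWIDTH=25000000,RESOLUTION=1920x1080,CODECS="hvc1.2.4.L153.B0",VIDEO-RANGE=PQ
prog_index.m3u8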
                    
                  
                
                    
We are experiencing thousands of crashes in our application when attempting to present the camera through a web view. The app crashes during this process, and the crash logs point to WebCore::AVVideoCaptureSource::create and WebCore::RealtimeMediaSourceCenter::getUserMediaDevices.
This issue has only been observed in iOS 18.2 beta versions (beta 1 - 22C5109p, beta 2 - 22C5125e, beta 3 - 22C5131e).
In iOS versions below 18.2 the functionality works, and we haven't identified any correlation with specific device models. The problem seems to stem from WebCore changes introduced in these 18.2 beta releases.
We kindly request a review and fix for this issue in upcoming beta releases to restore functionality. Let us know if there are any workarounds or adjustments we can implement in the interim.
Thank you for your attention to this matter.