I've been generating new Audio Unit Extension apps with Xcode 16 (and newer), and although they generally work at first, it's easy (though I haven't found a way to trigger it reliably) to get the app into a state where it can no longer instantiate the audio unit. Generally the call to AVAudioUnit.findComponent fails and SimplePlayEngine hits the fatalError("Failed to find component with type...").
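If it helps to picture it, the failing lookup is (as far as I can tell) just the template's component search against AVAudioUnitComponentManager; a rough sketch of what I understand it to be doing (names approximate, not the exact template code):

```swift
import AVFoundation

// Approximate sketch of the generated SimplePlayEngine's component lookup.
// `componentDescription` is the AudioComponentDescription registered in the
// extension's Info.plist.
func findComponent(matching componentDescription: AudioComponentDescription) -> AVAudioUnitComponent {
    let components = AVAudioUnitComponentManager.shared()
        .components(matching: componentDescription)
    guard let component = components.first else {
        // This is the path I keep hitting once the project goes off the rails.
        fatalError("Failed to find component with type \(componentDescription.componentType)")
    }
    return component
}
```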
In the most recent project, merely adding files to the extension (without making any use of them) caused it to go off the rails.
If I "Archive" the app+plugin, there is no audio unit extension in the bundle.
If I switch to the audio unit extension and build it, it's fine. If I look at the build folder in Library/Developer/Xcode/project_folder, the extension_name.appex is there.
Any ideas? If I can coax an unmodified audio unit extension project to exhibit this behavior I'll attach it here. Right now what I have has code I don't want to share.
I'm building a UIKit app that reads the user's Apple Music library and displays it. MusicKit has the Artwork structure, which I need to use to display artwork images in the app. Since I'm not using SwiftUI, I cannot use the ArtworkImage view that is the recommended way of displaying those images, but the Artwork structure has a method that returns a URL for the image, which can be used to load it.
The way I have it setup is really simple:
```swift
extension MusicKit.Song {

    func imageURL(for cgSize: CGSize) -> URL? {
        return artwork?.url(
            width: Int(cgSize.width),
            height: Int(cgSize.height)
        )
    }

    func localImage(for cgSize: CGSize) -> UIImage? {
        guard let url = imageURL(for: cgSize),
              url.scheme == "musicKit",
              let data = try? Data(contentsOf: url) else {
            return nil
        }
        return .init(data: data)
    }
}
```
Now, every time I access the .artwork property (which is a lot of times) the main thread gets blocked and the console output gets bombarded with messages like these:
```
2023-07-26 11:49:47.317195+0200 Plum[998:297199] [Artwork] Failed to create color analysis for artwork: <MPMediaLibraryArtwork: 0x289591590> with error; Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.mediaartworkd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction." UserInfo={NSDebugDescription=The connection to service named com.apple.mediaartworkd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction.}
2023-07-26 11:49:47.317262+0200 Plum[998:297199] [Artwork] Failed to create color analysis for artwork: file:///var/mobile/Media/iTunes_Control/iTunes/Artwork/Originals/4b/48d7b8d349d2de858413ae4561b6ba1b294dc7
2023-07-26 11:49:47.323099+0200 Plum[998:297013] [Plum] IIOImageWriteSession:121: cannot create: '/var/mobile/Media/iTunes_Control/iTunes/Artwork/Caches/320x320/4b/48d7b8d349d2de858413ae4561b6ba1b294dc7.sb-f9c7943d-6ciLNp'error = 1 (Operation not permitted)
```
My guess is that the most performance-heavy task here is the color analysis performed for each artwork, but in my opinion backgroundColor should not be a stored property if that's the case. I am not planning to use it anywhere, and if it has to exist it should be a computed async property so it doesn't block the caller.
I know I can move the call to a background thread, and that fixes the main-thread blocking, but the loading time for each artwork is still terribly slow, and that hurts the UX.
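For completeness, the background-thread workaround I mean looks roughly like this (a sketch; loadImageAsync is a made-up helper name, and it only moves the existing loading code off the main thread):

```swift
import MusicKit
import UIKit

extension MusicKit.Song {
    /// Hypothetical helper: same loading code as above, just pushed off the main thread.
    func loadImageAsync(for size: CGSize, completion: @escaping (UIImage?) -> Void) {
        // Accessing .artwork / building the URL still happens here (main thread),
        // which is where the color-analysis log spam seems to originate.
        let url = imageURL(for: size)
        DispatchQueue.global(qos: .userInitiated).async {
            guard let url, url.scheme == "musicKit",
                  let data = try? Data(contentsOf: url) else {
                DispatchQueue.main.async { completion(nil) }
                return
            }
            DispatchQueue.main.async { completion(UIImage(data: data)) }
        }
    }
}
```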
SwiftUI's ArtworkImage loads the artworks much quicker and without the errors so there must be a better way to do it.
I'm creating Live Photos programmatically in my app using the Photos and AVFoundation frameworks. While the Live Photos work perfectly in the Photos app (long press shows motion), users cannot set them as motion wallpapers. The system shows "Motion not available" message.
Here's my approach for creating Live Photos:
```swift
// 1. Create video with required metadata
let writer = try AVAssetWriter(outputURL: videoURL, fileType: .mov)
let contentIdentifier = AVMutableMetadataItem()
contentIdentifier.identifier = .quickTimeMetadataContentIdentifier
contentIdentifier.value = assetIdentifier as NSString
writer.metadata = [contentIdentifier]
// Video settings: 882x1920, H.264, 30fps, 2 seconds
// Added still-image-time metadata at middle frame

// 2. Create HEIC image with asset identifier
var makerAppleDict: [String: Any] = [:]
makerAppleDict["17"] = assetIdentifier // Required key for Live Photo
metadata[kCGImagePropertyMakerAppleDictionary as String] = makerAppleDict

// 3. Generate Live Photo
PHLivePhoto.request(
    withResourceFileURLs: [photoURL, videoURL],
    placeholderImage: nil,
    targetSize: .zero,
    contentMode: .aspectFit
) { livePhoto, info in
    // Success - Live Photo created
}

// 4. Save to Photos library
PHAssetCreationRequest.forAsset().addResource(with: .photo, fileURL: photoURL, options: nil)
PHAssetCreationRequest.forAsset().addResource(with: .pairedVideo, fileURL: videoURL, options: nil)
```
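For reference, the save in step 4 runs inside a photo library change block; a condensed sketch of that step, assuming (as in my real code) that both resources go onto a single creation request so Photos pairs them:

```swift
import Photos

// Condensed sketch of step 4. photoURL / videoURL are the same files as above.
PHPhotoLibrary.shared().performChanges({
    let request = PHAssetCreationRequest.forAsset()
    request.addResource(with: .photo, fileURL: photoURL, options: nil)
    request.addResource(with: .pairedVideo, fileURL: videoURL, options: nil)
}) { success, error in
    print("Saved Live Photo: \(success), error: \(String(describing: error))")
}
```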
What I've Tried
Matching exact video specifications from Camera app (882x1920, H.264, 30fps)
Adding all documented metadata (content identifier, still-image-time)
Testing various video durations (1.5s, 2s, 3s)
Different image formats (HEIC, JPEG)
Comparing with exiftool against working Live Photos
Expected Behavior
Live Photos created programmatically should be eligible for motion wallpapers, just like those from the Camera app.
Actual Behavior
System shows "Motion not available" and only allows setting as static wallpaper.
Any insights or workarounds would be greatly appreciated. This is affecting our users who want to use their created content as wallpapers.
Questions
Are there additional undocumented requirements for Live Photos to be wallpaper-eligible?
Is this a deliberate restriction for third-party apps, or a bug?
Has anyone successfully created Live Photos that work as motion wallpapers?
Environment
iOS 17.0 - 18.1
Xcode 16.0
Tested on iPhone 16 Pro
Topic: Media Technologies
SubTopic: Photos & Camera
Tags: LivePhotosKit JS, PhotoKit, Core Image, AVFoundation
Since iOS 12 it has become difficult to detect the end of playback using the system music player.
In earlier iOS versions, the now playing item would be set to nil and you would receive a notification that the player had stopped.
In iOS 12 and later, nowPlayingItem still contains the current song, and the only notification you get is MPMusicPlayerControllerPlaybackStateDidChangeNotification with the playbackState set to MPMusicPlaybackStatePaused.
Pressing pause in my car (or from any other remote accessory) produces the same conditions, making it difficult to tell the two apart.
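For context, the detection I'm relying on today is roughly this (a sketch; the paused-state check is exactly the part that can't distinguish a real end of playback from a remote pause):

```swift
import MediaPlayer

// Sketch of the current approach: watch playback state changes on the system player.
let player = MPMusicPlayerController.systemMusicPlayer
player.beginGeneratingPlaybackNotifications()

NotificationCenter.default.addObserver(
    forName: .MPMusicPlayerControllerPlaybackStateDidChange,
    object: player,
    queue: .main
) { _ in
    if player.playbackState == .paused, player.nowPlayingItem != nil {
        // Could be end-of-song... or just a pause from the car / a Bluetooth accessory.
    }
}
```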
It would be nice if they added a notification that playback was done (similar to the other players).
Any suggestions?
Hello,
My company has an in-store app with FPS SDK 4.x (1024) keys. We've handed those keys over to a trusted third-party and we do not have them. We've been in-store for several years.
The person in our organization who created the keys mistakenly stored them encrypted to our third party's PGP keys, so we cannot decrypt them, and the third party has no mechanism to return the keys to us even though they are present in its runtime environment. They only have secure mechanisms for us to upload keys onto their servers.
We are trying to migrate to a different third-party DRM provider, and would like to obtain new keys. Unfortunately, the developer portal won't let me create new keys, saying that we have exceeded the number of keys allowed, which I assume is one.
Additionally, the new DRM provider can only support SDK 4.x keys, and it appears that we can only request SDK 5.x keys on the Apple Developer portal, as the SDK 4.0 option is grayed out. Regardless, it seems that we are not able to request any keys.
We've submitted a request to the support e-mail address and received an automated e-mail that the response should take a few days, but may take longer on occasion. It's now been a month. The e-mail says that the reply address is not monitored. Is there any way we can accelerate this?
Thank you,
Carlos
It's 2025, and I see that trends in video storage and streaming have changed significantly. Nowadays, CDN combined with domain-based video protection is the most popular solution.
Does anyone have more insights into this technology or real-world experience with it?
Topic: Media Technologies
SubTopic: Video
I am using https://developer.apple.com/documentation/applemusicapi/add-tracks-to-a-library-playlist
to add tracks to playlists. This endpoint works fine for all playlists except for collaborative playlists.
For collaborative playlists I get the following 500 error as a response:
"errors": [
{
"id": "<some id>",
"title": "Upstream Service Error",
"detail": "Unable to update tracks",
"status": "500",
"code": "50001"
}
]
}
Steps to reproduce:
1. Create a playlist in your library.
2. Use the API to add a song.
3. Confirm that it works.
4. Make that same playlist collaborative.
5. Update the playlist ID in your API request (making a playlist collaborative changes its ID).
6. Confirm that you get the 500 error.
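For reference, the request itself is just the documented "Add Tracks to a Library Playlist" call; roughly what I'm sending (a sketch with placeholder tokens and IDs):

```swift
import Foundation

// Sketch of the request. Tokens and IDs are placeholders.
func addTrack(playlistID: String, songID: String,
              developerToken: String, musicUserToken: String) async throws {
    var request = URLRequest(url: URL(string:
        "https://api.music.apple.com/v1/me/library/playlists/\(playlistID)/tracks")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
    request.setValue(musicUserToken, forHTTPHeaderField: "Music-User-Token")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(["data": [["id": songID, "type": "songs"]]])

    let (_, response) = try await URLSession.shared.data(for: request)
    // For regular library playlists this comes back with a 2xx status;
    // for the collaborative playlist it returns the 500 shown above.
    print((response as? HTTPURLResponse)?.statusCode ?? -1)
}
```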
I am developing an app that plays HLS audio.
When using AVPlayerItem with AVURLAsset, can AVAssetResourceLoaderDelegate correctly handle HLS segments?
My goal is to use AVAssetResourceLoaderDelegate to add authentication HTTP headers when accessing HLS .m3u8 and .ts files.
I can successfully download the files, but playback fails with errors.
Specifically, I am observing the following cases:
A. AVAssetResourceLoaderDelegate is canceled, and CoreMediaErrorDomain -12881 occurs
1. In NSURLConnectionDataDelegate's didReceiveResponse, set contentInformationRequest.
2. In didReceiveData, call dataRequest respondWithData.
3. resourceLoader didCancelLoadingRequest is called.
4. CoreMediaErrorDomain -12881 occurs.
B. CoreMediaErrorDomain -12881 occurs
1. In NSURLConnectionDataDelegate's didReceiveResponse, set contentInformationRequest.
2. In connection didReceiveData, buffer all received data until the end.
3. In connectionDidFinishLoading, pass the buffered data to respondWithData.
4. Call loadingRequest finishLoading.
5. CoreMediaErrorDomain -12881 occurs.
In both cases, dataRequest.requestsAllDataToEndOfResource is YES.
For this use case, I am not using AVURLAssetHTTPHeaderFieldsKey because I need to apply the most up-to-date authentication data at the moment each file is accessed.
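For context, my delegate setup is roughly the following (a sketch using URLSession instead of NSURLConnection, with a custom scheme so the loader is actually consulted; the scheme swap and header value are placeholders for my real auth logic):

```swift
import AVFoundation
import UniformTypeIdentifiers

// Sketch of case B: buffer the whole response, then respond and finish.
final class AuthResourceLoader: NSObject, AVAssetResourceLoaderDelegate {

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let customURL = loadingRequest.request.url,
              var components = URLComponents(url: customURL, resolvingAgainstBaseURL: false) else {
            return false
        }
        components.scheme = "https"                          // asset URL uses a custom scheme
        var request = URLRequest(url: components.url!)
        request.setValue("Bearer <token>", forHTTPHeaderField: "Authorization") // fresh auth each time

        URLSession.shared.dataTask(with: request) { data, response, error in
            if let error = error {
                loadingRequest.finishLoading(with: error)
                return
            }
            if let http = response as? HTTPURLResponse,
               let info = loadingRequest.contentInformationRequest {
                info.contentType = http.mimeType.flatMap { UTType(mimeType: $0)?.identifier }
                info.contentLength = http.expectedContentLength
                info.isByteRangeAccessSupported = true
            }
            if let data = data {
                loadingRequest.dataRequest?.respond(with: data)
            }
            loadingRequest.finishLoading()
        }.resume()

        return true
    }
}
```

The delegate is installed with asset.resourceLoader.setDelegate(_:queue:) on an AVURLAsset whose URL uses the custom scheme.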
I would appreciate any advice or suggestions you might have. Thank you in advance!
In my app I use AVAssetReaderTrackOutput to extract PCM audio from a user-provided video or audio file and display it as a waveform.
Recently a user reported that the waveform is not in sync with his video, and after receiving the video I noticed that the waveform is in fact double as long as the video duration, i.e. it shows the audio in slow-motion, so to speak.
Until now I was using
CMFormatDescription.audioStreamBasicDescription.mSampleRate
which for this particular user video returns 22'050. But in this case it seems that this value is wrong... because the audio file has two audio channels with different sample rates, as returned by
CMFormatDescription.audioFormatList.map({ $0.mASBD.mSampleRate })
The first channel has a sample rate of 44'100, the second one 22'050. If I use the first sample rate, the waveform is perfectly in sync with the video.
The problem is that the ratio of the audio data length to (sample rate × audio duration) is 8, double the ratio (4) I get for the first audio file. In the code below this ratio is given by
Double(length) / (sampleRate * asset.duration.seconds)
When commenting out the line with the sampleRate variable definition in the code below and uncommenting the following line, the ratios for both audio files are 4, which is the expected result. I would expect audioStreamBasicDescription to return the correct sample rate, i.e. the one used by AVAssetReaderTrackOutput, which (I think) somehow merges the stereo tracks. The documentation is sparse, and in particular it’s not documented whether the lower or higher sample rate is used; in this case, it seems like the higher one is used, but audioStreamBasicDescription for some reason returns the lower one.
Does anybody know why this is the case or how I should extract the sample rate of the produced PCM audio data? Should I always take the higher one?
I created FB19620455.
```swift
let openPanel = NSOpenPanel()
openPanel.allowedContentTypes = [.audiovisualContent]
openPanel.runModal()
let url = openPanel.urls[0]
let asset = AVURLAsset(url: url)
let assetTrack = asset.tracks(withMediaType: .audio)[0]
let assetReader = try! AVAssetReader(asset: asset)
let readerOutput = AVAssetReaderTrackOutput(track: assetTrack, outputSettings: [AVFormatIDKey: Int(kAudioFormatLinearPCM), AVLinearPCMBitDepthKey: 16, AVLinearPCMIsBigEndianKey: false, AVLinearPCMIsFloatKey: false, AVLinearPCMIsNonInterleaved: false])
readerOutput.alwaysCopiesSampleData = false
assetReader.add(readerOutput)

let formatDescriptions = assetTrack.formatDescriptions as! [CMFormatDescription]
let sampleRate = formatDescriptions[0].audioStreamBasicDescription!.mSampleRate
//let sampleRate = formatDescriptions[0].audioFormatList.map({ $0.mASBD.mSampleRate }).max()!
print(formatDescriptions[0].audioStreamBasicDescription!.mSampleRate)
print(formatDescriptions[0].audioFormatList.map({ $0.mASBD.mSampleRate }))

if !assetReader.startReading() {
    preconditionFailure()
}

var length = 0
while assetReader.status == .reading {
    guard let sampleBuffer = readerOutput.copyNextSampleBuffer(), let blockBuffer = sampleBuffer.dataBuffer else {
        break
    }
    length += blockBuffer.dataLength
}

print(Double(length) / (sampleRate * asset.duration.seconds))
```
I am working on an iOS application using SwiftUI where I want to convert a JPG and a MOV file into a Live Photo. I am using the LivePhoto class from GitHub for this. The JPG and MOV files are displayed correctly in my WallpaperDetailView, but I am facing issues when trying to generate the Live Photo and save it to the gallery.
Here is the relevant code and the errors I am encountering:
Console prints:
```
Play button should be visible
Image URL fetched and set: Optional("https://firebasestorage.googleapis.com/...")
Video is ready to play
Video downloaded to: file:///var/mobile/Containers/Data/Application/.../tmp/CFNetworkDownload_7rW5ny.tmp
Failed to generate Live Photo
```
I have verified that the app has the necessary permissions to access the Photo Library.
The JPEG and MOV files are successfully downloaded and can be displayed in the app.
The issue seems to occur when generating the Live Photo from the downloaded files.
```swift
struct WallpaperDetailView: View {
    var wallpaper: Wallpaper
    @State private var isLoading = false
    @State private var isImageSaved = false
    @State private var imageURL: URL?
    @State private var livePhotoVideoURL: URL?
    @State private var player: AVPlayer?
    @State private var playerViewController: AVPlayerViewController?
    @State private var isVideoReady = false
    @State private var showBuffering = false

    var body: some View {
        ZStack {
            if let imageURL = imageURL {
                GeometryReader { geometry in
                    KFImage(imageURL)
                        .resizable()
                        ...
                }
            }
            if let playerViewController = playerViewController {
                VideoPlayerViewController(playerViewController: playerViewController)
                    .frame(maxWidth: .infinity, maxHeight: .infinity)
                    .clipped()
                    .edgesIgnoringSafeArea(.all)
            }
        }
        .onAppear {
            PHPhotoLibrary.requestAuthorization { status in
                if status == .authorized {
                    loadImage()
                } else {
                    print("User denied access to photo library")
                }
            }
        }
    }

    private func loadImage() {
        isLoading = true
        if let imageURLString = wallpaper.imageURL, let imageURL = URL(string: imageURLString) {
            self.imageURL = imageURL
            if imageURL.scheme == "file" {
                self.isLoading = false
                print("Local image URL set: \(imageURL)")
            } else {
                fetchDownloadURL(from: imageURLString) { url in
                    self.imageURL = url
                    self.isLoading = false
                    print("Image URL fetched and set: \(String(describing: url))")
                }
            }
        }
        if let livePhotoVideoURLString = wallpaper.livePhotoVideoURL, let livePhotoVideoURL = URL(string: livePhotoVideoURLString) {
            self.livePhotoVideoURL = livePhotoVideoURL
            preloadAndPlayVideo(from: livePhotoVideoURL)
        } else {
            self.isLoading = false
            print("No valid image or video URL")
        }
    }

    private func preloadAndPlayVideo(from url: URL) {
        self.player = AVPlayer(url: url)
        let playerViewController = AVPlayerViewController()
        playerViewController.player = self.player
        self.playerViewController = playerViewController
        let playerItem = AVPlayerItem(url: url)
        playerItem.preferredForwardBufferDuration = 1.0
        self.player?.replaceCurrentItem(with: playerItem)
        ...
        print("Live Photo Video URL set: \(url)")
    }

    private func saveWallpaperToPhotos() {
        if let imageURL = imageURL, let livePhotoVideoURL = livePhotoVideoURL {
            saveLivePhotoToPhotos(imageURL: imageURL, videoURL: livePhotoVideoURL)
        } else if let imageURL = imageURL {
            saveImageToPhotos(url: imageURL)
        }
    }

    private func saveImageToPhotos(url: URL) {
        ...
    }

    private func saveLivePhotoToPhotos(imageURL: URL, videoURL: URL) {
        isLoading = true
        downloadVideo(from: videoURL) { localVideoURL in
            guard let localVideoURL = localVideoURL else {
                print("Failed to download video for Live Photo")
                DispatchQueue.main.async {
                    self.isLoading = false
                }
                return
            }
            print("Video downloaded to: \(localVideoURL)")
            self.generateAndSaveLivePhoto(imageURL: imageURL, videoURL: localVideoURL)
        }
    }

    private func generateAndSaveLivePhoto(imageURL: URL, videoURL: URL) {
        LivePhoto.generate(from: imageURL, videoURL: videoURL, progress: { percent in
            print("Progress: \(percent)")
        }, completion: { livePhoto, resources in
            guard let resources = resources else {
                print("Failed to generate Live Photo")
                DispatchQueue.main.async {
                    self.isLoading = false
                }
                return
            }
            print("Live Photo generated with resources: \(resources)")
            self.saveLivePhotoToLibrary(resources: resources)
        })
    }

    private func saveLivePhotoToLibrary(resources: LivePhoto.LivePhotoResources) {
        LivePhoto.saveToLibrary(resources) { success in
            DispatchQueue.main.async {
                if success {
                    self.isImageSaved = true
                    print("Live Photo saved successfully")
                } else {
                    print("Failed to save Live Photo")
                }
                self.isLoading = false
            }
        }
    }

    private func fetchDownloadURL(from gsURL: String, completion: @escaping (URL?) -> Void) {
        let storageRef = Storage.storage().reference(forURL: gsURL)
        storageRef.downloadURL { url, error in
            if let error = error {
                print("Failed to fetch image URL: \(error)")
                completion(nil)
            } else {
                completion(url)
            }
        }
    }

    private func downloadVideo(from url: URL, completion: @escaping (URL?) -> Void) {
        let task = URLSession.shared.downloadTask(with: url) { localURL, response, error in
            guard let localURL = localURL, error == nil else {
                print("Failed to download video: \(String(describing: error))")
                completion(nil)
                return
            }
            completion(localURL)
        }
        task.resume()
    }
}
```
Topic: Media Technologies
SubTopic: Photos & Camera
Tags: Files and Storage, Swift, SwiftUI, Photos and Imaging
Hello everyone,
I'm looking for a definitive clarification on how to completely disable all video stabilization, including the hardware OIS, using AVFoundation. The goal is to achieve a completely raw, unstabilized video feed, which is crucial when using external equipment like gimbals to avoid conflicting stabilization motions.
My research points to using the AVCaptureConnection property preferredVideoStabilizationMode and setting it to AVCaptureVideoStabilizationMode.off.
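In code, that looks roughly like this (a sketch; assumes the connection comes from an already-configured capture session's movie file output):

```swift
import AVFoundation

// Sketch: request "no stabilization" on the video connection, if the format allows changing it.
func disableStabilization(on output: AVCaptureMovieFileOutput) {
    guard let connection = output.connection(with: .video) else { return }
    if connection.isVideoStabilizationSupported {
        connection.preferredVideoStabilizationMode = .off
    }
    // The open question: does .off also guarantee the physical OIS module is inactive,
    // or does it only disable the software (EIS / cinematic) stages?
    print(connection.activeVideoStabilizationMode.rawValue)
}
```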
The documentation for the .off case states:
A mode that doesn’t stabilize video capture.
This description is slightly ambiguous: it's unclear whether this only affects software-level stabilization (EIS, EIS+OIS, etc.) or whether it guarantees complete deactivation of the physical OIS module. For professional video applications, this is a critical distinction.
So, I'd like to ask the community:
Has anyone been able to definitively confirm that setting preferredVideoStabilizationMode to .off also disables the hardware OIS? Are there any known tests or documentation that prove this behavior?
Is there an alternative or more direct method to ensure the OIS module is physically inactive during video capture?
What is the community's best practice for ensuring absolutely no stabilization is applied to the video pipeline?
Any insights or shared experiences on this topic would be greatly appreciated.
Thank you!
Hello,
I'm developing an app that displays a photo library using UICollectionView and PHCachingImageManager. I'd like to achieve a user experience similar to the native iOS Photos app, where low-quality images are shown quickly while scrolling, and higher-quality images are loaded for visible cells once scrolling stops.
I'm currently using the following approach:
While Scrolling: I'm using the UICollectionViewDataSourcePrefetching protocol. In the prefetchItemsAt method, I call startCachingImages with low-quality options to cache images in advance.
After Scrolling Stops: In the scrollViewDidEndDecelerating method, I intend to load high-quality images for the currently visible cells.
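Concretely, the two phases look roughly like this (a sketch; PhotoGridViewController, fetchResult, and thumbnailSize are placeholders for my own data source, not PhotoKit API):

```swift
import Photos
import UIKit

final class PhotoGridViewController: UIViewController, UICollectionViewDataSourcePrefetching, UIScrollViewDelegate {
    let cachingManager = PHCachingImageManager()
    var collectionView: UICollectionView!
    var fetchResult: PHFetchResult<PHAsset>!
    let thumbnailSize = CGSize(width: 200, height: 200)

    private func assets(at indexPaths: [IndexPath]) -> [PHAsset] {
        indexPaths.map { fetchResult.object(at: $0.item) }
    }

    // Phase 1: low-quality caching ahead of the scroll position.
    func collectionView(_ collectionView: UICollectionView, prefetchItemsAt indexPaths: [IndexPath]) {
        let options = PHImageRequestOptions()
        options.deliveryMode = .fastFormat
        cachingManager.startCachingImages(for: assets(at: indexPaths),
                                          targetSize: thumbnailSize,
                                          contentMode: .aspectFill,
                                          options: options)
    }

    // Phase 2: upgrade only the visible cells once scrolling stops.
    func scrollViewDidEndDecelerating(_ scrollView: UIScrollView) {
        let options = PHImageRequestOptions()
        options.deliveryMode = .highQualityFormat
        for asset in assets(at: collectionView.indexPathsForVisibleItems) {
            cachingManager.requestImage(for: asset,
                                        targetSize: thumbnailSize,
                                        contentMode: .aspectFill,
                                        options: options) { image, _ in
                // Hand the image to the matching cell if it is still on screen.
            }
        }
    }
}
```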
I have a few questions regarding this approach:
What is the best practice for managing both low-quality and high-quality images efficiently with PHCachingImageManager? Is it correct to call startCachingImages with fastFormat options and then call it again with highQualityFormat in scrollViewDidEndDecelerating?
How can I minimize the delay when a low-quality image is replaced by a high-quality one? Are there any additional strategies to help pre-load high-quality images more effectively?
Topic: Media Technologies
SubTopic: Photos & Camera
I noticed that AVSampleBufferDisplayLayerContentLayer is not released when the AVSampleBufferDisplayLayer is removed and released.
The issue can be reproduced with the following simple code:
```swift
import AVFoundation
import UIKit

class ViewController: UIViewController {
    var displayBufferLayer: AVSampleBufferDisplayLayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        let displayBufferLayer = AVSampleBufferDisplayLayer()
        displayBufferLayer.videoGravity = .resizeAspectFill
        displayBufferLayer.frame = view.bounds
        view.layer.insertSublayer(displayBufferLayer, at: 0)
        self.displayBufferLayer = displayBufferLayer

        DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
            self.displayBufferLayer?.flush()
            self.displayBufferLayer?.removeFromSuperlayer()
            self.displayBufferLayer = nil
        }
    }
}
```
In my real project I have multiple AVSampleBufferDisplayLayer instances created and removed in different view controllers; this is problematic because the number of leaked AVSampleBufferDisplayLayerContentLayer instances keeps increasing.
I wonder whether I should use a pool of AVSampleBufferDisplayLayer instances and reuse them, but I'm slightly afraid that this could also lead to strange bugs.
Edit: This doesn't leak on an iOS 18 device, but it does leak on an iPad Pro running iOS 17.5.1.
Hello everyone,
I have a SwiftUI app using WKWebView to load a website that includes a form with a file input. The issue is:
📌 When a user taps “Browse” and selects “Take Photo” (camera option), the app crashes before the camera opens.
Setup Details:
• The app uses SwiftUI with WKWebView
• The crash occurs only when selecting “Take Photo”, but selecting an image from the library works fine.
📌 Full Code (WKWebView in SwiftUI)
```swift
import SwiftUI
import WebKit

struct WebViewRepresentable: UIViewRepresentable {
    var urlString: String

    func makeUIView(context: Context) -> WKWebView {
        let webView = WKWebView()
        webView.configuration.allowsInlineMediaPlayback = true
        webView.configuration.mediaTypesRequiringUserActionForPlayback = []
        loadURL(in: webView)
        return webView
    }

    func updateUIView(_ uiView: WKWebView, context: Context) {
        loadURL(in: uiView)
    }

    private func loadURL(in webView: WKWebView) {
        if let url = URL(string: urlString) {
            webView.load(URLRequest(url: url))
        }
    }
}

struct ContentView: View {
    @State private var currentURL: String = "https://fv-wohlensee.ch"

    var body: some View {
        VStack(spacing: 0) {
            // Top area in green
            Color(red: 0, green: 0.4, blue: 0)
                .frame(height: 50)

            // WebView with white background
            WebViewRepresentable(urlString: currentURL)
                .background(Color.white)

            Divider()

            // Navigation buttons
            HStack(spacing: 10) {
                Button {
                    currentURL = "https://fv-wohlensee.ch/vereinshaus-eymatt/"
                } label: {
                    VStack {
                        Image(systemName: "house")
                            .font(.system(size: 18))
                        Text("Klubhaus")
                            .font(.system(size: 12))
                            .minimumScaleFactor(0.7)
                            .lineLimit(1)
                    }
                    .padding(8)
                }
                .foregroundColor(.white)
                .frame(maxWidth: .infinity)

                Button {
                    currentURL = "https://fv-wohlensee.ch/vereinsboot/"
                } label: {
                    VStack {
                        Image(systemName: "ferry.fill")
                            .font(.system(size: 18))
                        Text("Boot")
                            .font(.system(size: 12))
                            .minimumScaleFactor(0.7)
                            .lineLimit(1)
                    }
                    .padding(8)
                }
                .foregroundColor(.white)
                .frame(maxWidth: .infinity)

                Button {
                    currentURL = "https://fv-wohlensee.ch/aktivitaeten/"
                } label: {
                    VStack {
                        Image(systemName: "calendar")
                            .font(.system(size: 18))
                        Text("Aktivitäten")
                            .font(.system(size: 12))
                            .minimumScaleFactor(0.7)
                            .lineLimit(1)
                    }
                    .padding(8)
                }
                .foregroundColor(.white)
                .frame(maxWidth: .infinity)

                Button {
                    currentURL = "https://fv-wohlensee.ch/mitglied-werden/"
                } label: {
                    VStack {
                        Image(systemName: "person.badge.plus")
                            .font(.system(size: 18))
                        Text("Mitglied")
                            .font(.system(size: 12))
                            .minimumScaleFactor(0.7)
                            .lineLimit(1)
                    }
                    .padding(8)
                }
                .foregroundColor(.white)
                .frame(maxWidth: .infinity)
            }
            .padding(.horizontal, 15)
            .padding(.vertical, 10)
            .background(Color(red: 0, green: 0.4, blue: 0))
        }
        .frame(maxWidth: .infinity, maxHeight: .infinity)
        .background(Color(red: 0, green: 0.4, blue: 0))
        .ignoresSafeArea()
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
```
What I’ve Tried:
1️⃣ Checked Info.plist: Added permissions for camera and photo library:
```xml
<key>NSCameraUsageDescription</key>
<string>This app requires access to the camera to upload photos.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>This app requires access to your photo library.</string>
```
2️⃣ Enabled Media Capture in WKWebView:
```swift
webView.configuration.allowsInlineMediaPlayback = true
webView.configuration.mediaTypesRequiringUserActionForPlayback = []
```
3️⃣ Tested in Safari: The same form works fine when opened in Safari.
Questions:
❓ Does WKWebView need additional permissions to open the camera?
❓ Do I need to implement a delegate to handle file uploads in SwiftUI?
❓ Has anyone faced this issue and found a fix?
Any guidance would be greatly appreciated! 🚀
Thanks in advance! 😊
Topic: Media Technologies
SubTopic: Photos & Camera
I'm playing library items (MPMediaItem) and Apple Music tracks (Track) in MPMusicPlayerApplicationController.applicationQueuePlayer, but I can't use the actual queue functionality because I can't figure out how to get both media types into the same queue. If there's a way to get both types in a single queue, that would solve my problem, but I've given up on that one.
Because I can't use a queue, I have to be able to detect when a song ends so that I can put the next song in the queue and play it. The only way I've found to detect when a song ends is by watching the playbackState, and I've actually got that pretty much working, but it's really ugly, because you get a playbackState of paused when a song ends, when a Bluetooth speaker disconnects, and so on.
The only answer I've been able to find on the internet is to watch MPMusicPlayerControllerNowPlayingItemDidChange, and when that fires with nowPlayingItem set to nil, the song has ended... but that's not the case: when a song ends, nowPlayingItem remains the same. There's got to be an answer to this problem, right?
I'm experiencing audio issues while developing for visionOS when playing PCM data through AVAudioPlayerNode.
Issue Description:
Occasionally, the speaker produces loud popping sounds or distorted noise
This occurs during PCM audio playback using AVAudioPlayerNode
The issue is intermittent and doesn't happen every time
Technical Details:
Platform: visionOS
Device: Vision Pro / Simulator
Audio Framework: AVFoundation
Audio Node: AVAudioPlayerNode
Audio Format: PCM
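For reference, the playback path is essentially the standard engine setup, something like the following (a sketch; the buffer contents come from my own PCM source, which isn't shown):

```swift
import AVFoundation

// Sketch of the playback path: engine -> player node -> main mixer, scheduling PCM buffers.
let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
let format = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 2)!

engine.attach(playerNode)
engine.connect(playerNode, to: engine.mainMixerNode, format: format)
do {
    try engine.start()
} catch {
    print("Engine failed to start: \(error)")
}

// `pcmBuffer` would be filled from my own PCM data source.
let pcmBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: 4_800)!
pcmBuffer.frameLength = pcmBuffer.frameCapacity
playerNode.scheduleBuffer(pcmBuffer, completionHandler: nil)
playerNode.play()
```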
I would appreciate any insights on:
Common causes of audio distortion with AVAudioPlayerNode
Recommended best practices for handling PCM playback in visionOS
Potential configuration issues that might cause this behavior
Has anyone encountered similar issues or found solutions? Any guidance would be greatly helpful.
Thank you in advance!
AVAudioFormat has no Swift concurrency annotations but the documentation states "Instances of this class are immutable."
This always made me assume it was safe to pass AVAudioFormat instances around. Is that the case? If so, can it be marked as Sendable? Am I missing something?
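A minimal example of the pattern I mean (a sketch; with strict concurrency checking this kind of capture gets flagged as non-Sendable even though the instance is documented as immutable):

```swift
import AVFoundation

let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)!

Task.detached {
    // Handing `format` to another isolation domain: strict concurrency checking
    // flags this because AVAudioFormat carries no Sendable annotation,
    // despite the documentation describing instances as immutable.
    print(format.sampleRate)
}
```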
When using AVSampleBufferDisplayLayer to play uncompressed H.264 and H.265 video with more than 7 B-frames, frame drops occur. The more B-frames there are, the more noticeable the drops become, for example with 15 B-frames.
Use FFmpeg to transcode a video file with visible timestamps and frame numbers (x264 or x265):
```
ffmpeg -i test.mp4 -vf "drawtext=fontsize=45:text=%{pts} %{n}:y=400" -c:v libx264 -x264-params "bframes=15:b-adapt=0" -crf 30 -y x264_bf15.mp4
ffmpeg -i test.mp4 -vf "drawtext=fontsize=45:text=%{pts} %{n}:y=400" -c:v libx265 -x265-params "bframes=15:b-adapt=0" -crf 30 -y x265_bf15.mp4
```
Use the demo player from this repository to reproduce the issue: https://github.com/msfrms/CustomPlayer
Frame drops can be observed, and the following log can be found in the device's console:
```
mediaserverd <<<< IQ-CA >>>> piqca_gmstats_dump: FIQCA(0x1266f4000) recent frames: enqueued: 184, displayed: 138, dropped: 42, flushed: 0, evicted: 3, >16ms late: 2
```
P.S. I was using an iPhone 11 on iOS 14.6 to reproduce this issue.
May I ask why frame drops occur in this case?
Is there any configuration or API usage change that could help fix the frame drop issue?
Many thanks!
It's been well over a year since Apple added favoriting of artists back to Apple Music (the little star icon on an artist page), yet I still haven't seen a way to get this data for an authenticated user from the Apple Music API. I was expecting to hear something about this during WWDC, but there have been no announcements that I've caught.
Has anyone else heard anything? People assume when they provide access to their Apple Music account that we can actually get to the data in their Apple Music account, and we end up looking a little dumb not being able to get this core data.
Using iOS 26.2 with AirPods 4:
Long press the stem to launch Siri.
Say "Record Voice Memo" -> recording starts.
While the recording is in progress...
Long press the stem to launch Siri -> nothing happens.
To stop the recording I need to use the phone.
Is this intended behaviour?
I would like to be able to stop the recording with Siri.
I am able to launch Siri from the phone while recording, but the point is to keep the phone in my pocket and start/stop recordings only via the AirPods.