Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.


newCommandQueue got nil
I am using Metal for rendering, and calls to Metal's newCommandQueue method occasionally return nil. However, MTLCreateSystemDefaultDevice returns a non-nil device, so the device does support Metal. What can cause newCommandQueue to return nil, and is there a way to avoid this situation? Thank you.
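For reference, a minimal sketch of the calls in question (Swift spells the Objective-C newCommandQueue method as makeCommandQueue()); the fallback handling is my own illustration, not something from the post:

```swift
import Metal

// Minimal sketch: create the default device and a command queue,
// treating the (rare) nil result as a recoverable condition.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("This device does not support Metal")
}

// makeCommandQueue() (newCommandQueue in Objective-C) returns an optional,
// so avoid force-unwrapping it.
if let commandQueue = device.makeCommandQueue() {
    // Encode and submit work with commandQueue...
    print("Created command queue on \(device.name)")
} else {
    // Assumption for illustration: report the failure; a real app might
    // retry later or surface an error instead of crashing.
    print("makeCommandQueue() returned nil")
}
```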
1 reply · 0 boosts · 100 views · 3d
AVAssetExportSession failed to export audio
I am trying to use AVAssetExportSession to export audio from a video, but every time I try it, it fails and I don't know why. This is the code:

```swift
import AVFoundation

protocol AudioExtractionProtocol {
    func extractAudio(from fileUrl: URL, to outputUrl: URL)
}

final class AudioExtraction {
    private var avAsset: AVAsset?
    private var avAssetExportSession: AVAssetExportSession?

    init() {}
}

//MARK: - AudioExtraction conforms to AudioExtractionProtocol
extension AudioExtraction: AudioExtractionProtocol {
    func extractAudio(from fileUrl: URL, to outputUrl: URL) {
        createAVAsset(for: fileUrl)
        createAVAssetExportSession(for: outputUrl)
        exportAudio()
    }
}

//MARK: - Private Methods
extension AudioExtraction {
    private func createAVAsset(for fileUrl: URL) {
        avAsset = AVAsset(url: fileUrl)
    }

    private func createAVAssetExportSession(for outputUrl: URL) {
        guard let avAsset else { return }
        avAssetExportSession = AVAssetExportSession(asset: avAsset, presetName: AVAssetExportPresetAppleM4A)
        avAssetExportSession?.outputURL = outputUrl
    }

    private func exportAudio() {
        guard let avAssetExportSession else { return }
        print("I am here \n")
        avAssetExportSession.exportAsynchronously {
            if avAssetExportSession.status == .failed {
                print("\(avAssetExportSession.status)\n")
            }
        }
    }
}

func test_AudioExtraction_extractAudioAndWriteItToFile() {
    let videoUrl = URL(string: "https://storage.googleapis.com/gtv-videos-bucket/sample/ForBiggerMeltdowns.mp4")!
    let audioExtraction: AudioExtractionProtocol = AudioExtraction()
    audioExtraction.extractAudio(from: videoUrl, to: FileMangerTest.audioFile)
    FileMangerTest.tearDown()
}

class FileMangerTest {
    private static let fileManger = FileManager.default

    private static var directoryUrl: URL {
        fileManger.urls(for: .cachesDirectory, in: .userDomainMask).first!
    }

    static var audioFile: URL {
        directoryUrl.appendingPathComponent("audio", conformingTo: .mpeg4Audio)
    }

    static func tearDown() {
        try? fileManger.removeItem(at: audioFile)
    }

    static func contant(at url: URL) -> Data? {
        return fileManger.contents(atPath: url.absoluteString)
    }
}
```
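Not from the original post: a common way to diagnose this is to read the session's error property once status is .failed, and to set outputFileType explicitly. A minimal sketch, assuming an asset and output URL like the ones above:

```swift
import AVFoundation

// Diagnostic sketch: run an export and log the session's `error` if it fails,
// which usually explains why (remote/unreadable source, existing file at the
// output URL, missing output file type, and so on).
func exportAudio(from asset: AVAsset, to outputUrl: URL) {
    guard let session = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A) else {
        print("Could not create AVAssetExportSession")
        return
    }
    session.outputURL = outputUrl
    session.outputFileType = .m4a   // be explicit about the container

    session.exportAsynchronously {
        switch session.status {
        case .completed:
            print("Export finished: \(outputUrl.path)")
        case .failed, .cancelled:
            // The underlying error is the useful part.
            print("Export failed: \(String(describing: session.error))")
        default:
            break
        }
    }
}
```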
0 replies · 0 boosts · 89 views · 3d
How to migrate PHPhotoLibraryChangeObserver to Swift 6?
I have the following code:

```swift
extension AssetGridViewController: PHPhotoLibraryChangeObserver {
    func photoLibraryDidChange(_ changeInstance: PHChange) {
        Task { @MainActor in
            guard let changes = changeInstance.changeDetails(for: fetchResult) else { return }
            fetchResult = changes.fetchResultAfterChanges
        }
    }
}
```

With Swift 6, this generates a compilation error: "Main actor-isolated instance method 'photoLibraryDidChange' cannot be used to satisfy nonisolated protocol requirement." The error includes two fix-it suggestions:

1. Adding nonisolated to the function (nonisolated func photoLibraryDidChange(_ changeInstance: PHChange))
2. Adding @preconcurrency to the protocol conformance (extension AssetGridViewController: @preconcurrency PHPhotoLibraryChangeObserver {)

Both options generate a runtime error: EXC_BREAKPOINT (code=1, subcode=0x105b7c400). For context, AssetGridViewController is a regular UIViewController. Any ideas on how to fix this?
2 replies · 0 boosts · 168 views · 2w
API to get AVSpeechSynthesisVoice that are available to download (via Settings.app) but not yet downloaded to a device
Here is the use case: I have a language-learning app that uses AVSpeechSynthesizer ❤️. When a user listens to a phrase with AVSpeechSynthesizer using an AVSpeechSynthesisVoice whose AVSpeechSynthesisVoiceQuality is default, it sounds much, much worse than voices with enhanced or premium quality, which really hurts usability. There appears to be no API for the app to know whether enhanced or premium voices are available to download (via Settings.app) but not yet downloaded to the device. The only API I could find is AVSpeechSynthesisVoice.speechVoices(), which returns all voices available on the device, but not the full list of voices available for download. So the app cannot know whether it should tell the user: "Hey, the voice you're listening to is much lower quality than the enhanced or premium version; go to Settings and download it." Any ideas? Do I need to send in an enhancement request via Feedback Assistant? Thank you for helping my users' ears and helping them turn speech synthesis voice quality up to 11 when it's available to them with just a couple of extra taps! (I suppose the best workaround is to display a warning every time the user is using a default-quality voice; I wonder what percentage of voices have enhanced or premium versions...)
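As a rough sketch of the workaround mentioned at the end (my own illustration, not code from the post), the app can at least detect whether an enhanced or premium voice is already installed for a given language and show a hint otherwise:

```swift
import AVFoundation

// Sketch of the workaround: AVSpeechSynthesisVoice.speechVoices() only lists
// voices installed on the device, but that is enough to warn the user when no
// enhanced/premium voice for the given language has been downloaded yet.
func hasHighQualityVoiceInstalled(forLanguage language: String) -> Bool {
    AVSpeechSynthesisVoice.speechVoices().contains { voice in
        voice.language == language && (voice.quality == .enhanced || voice.quality == .premium)
    }
}

// Usage (hypothetical language code):
if !hasHighQualityVoiceInstalled(forLanguage: "en-US") {
    print("Show a hint: a better voice may be available in Settings > Accessibility > Spoken Content > Voices.")
}
```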
1 reply · 0 boosts · 185 views · 1w
Issue setting a queue with library and non-library items at the same time (plus a couple more MusicKit issues)
As the summer continues, I have been diving deeper and deeper into MusicKit, largely with great results. A few issues have arisen that I've outlined here; feedbacks are already filed and the FB numbers are included below. All of this happens on the latest developer beta and latest Xcode beta. Thanks!

FB10967343 - Setting the queue with library and non-library items at the same time doesn't work correctly

In my app, I am working on a feature that lets a user shuffle songs from a collection of albums that may or may not be in their library. However, I've discovered an issue where the queue does not seem to work correctly when mixing these types. I've attempted to load ApplicationMusicPlayer by creating a Queue, and to load applicationQueuePlayer using an MPMusicPlayerPlayParametersQueueDescriptor, but the same issue occurs each time. The queue is able to play songs from the same source, but if it's been playing a library song and tries to move to a non-library song, the queue stops.

The first thing I do is pick random songs from each album, using a MusicLibraryRequest or a MusicCatalogResourceRequest as appropriate, then taking a randomElement() from the ensuing MusicItemCollection for the album. I append each track to an array, which I then cast to MusicItemCollection, so I've now got a MusicItemCollection consisting of the tracks I want. If I'm in MusicKit land, I simply set the queue as follows:

```swift
player.queue = ApplicationMusicPlayer.Queue(for: tracks)
```

It takes a bit more doing in MediaPlayer, but in theory this should also work, right?

```swift
do {
    let paramObjects = tracks.compactMap {
        $0.playParameters
    }
    let params = try paramObjects.map { try JSONEncoder().encode($0) }
    let dicts = try params.compactMap {
        try JSONSerialization.jsonObject(with: $0, options: []) as? [String: Any]
    }
    let finalParams = dicts.compactMap {
        MPMusicPlayerPlayParameters(dictionary: $0)
    }
    let descriptor = MPMusicPlayerPlayParametersQueueDescriptor(playParametersQueue: finalParams)
    mediaPlayer.setQueue(with: descriptor)
} catch {
    print(error)
}
```

In either case, the following issue occurs: say that I end up with a queue made up of one library song, then one non-library song. The player will play just the first song, then it acts as if the queue has ended. Say that it has two non-library songs, then one library song. Just the two non-library songs play. Indeed, printing queue.entries shows just the number of items that were from the same source type.

FB10967076 - Publishing changes from background thread error when inserting queue items

When using the .insert method on ApplicationMusicPlayer.Queue on the latest iOS 16 and Xcode betas, it returns a "Publishing changes from background thread" error, even though the function I'm calling it in is marked @MainActor and the stack trace indicates it was on the main thread.

FB10967277 - song.with([.albums], preferredSource: .library) generates thousands of lines of EntityQueries in the console

I've noticed that using preferredSource: .library when requesting additional properties on a library item creates ~6,000 "EntityQuery" entries in the console, all in the span of a second. This doesn't seem to be leading to any major performance issues, but it sure seems like something isn't right.

```swift
let request = MusicLibraryRequest<Song>.init()
do {
    let response = try await request.response()
    guard let song = response.items.first else { return }
    let songWithAlbums = try await song.with([.albums], preferredSource: .library)
} catch {
    print(error)
}
```

generates the following output (except... 6,000 of them):

```
2022-07-31 13:02:07.729003-0400 MusicKitFutzing[9405:2192606] [EntityQuery] Finished fetching results in 0s
2022-07-31 13:02:07.729047-0400 MusicKitFutzing[9405:2192605] [EntityQuery] Finished executing query in 0.00100017s
2022-07-31 13:02:07.729202-0400 MusicKitFutzing[9405:2192611] [EntityQuery] Finished executing query in 0s
2022-07-31 13:02:07.729240-0400 MusicKitFutzing[9405:2192605] [EntityQuery] Finished fetching results in 0s
```
1 reply · 1 boost · 1.1k views · Jul ’22
Rich Notification
Has anyone implemented rich notifications recently? I tried to implement them, but it's not working. I have gone through the blogs and tutorials available on the internet, but they did not work for me. Please let me know if someone can help me with this.
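For reference, rich (media-bearing) notifications usually go through a Notification Service Extension. Below is a minimal, hedged sketch; the "image-url" payload key is an assumption for illustration only, and the push payload must also carry "mutable-content": 1 for the extension to run at all:

```swift
import UserNotifications

// Minimal Notification Service Extension sketch: download an image referenced
// in the push payload and attach it so the notification renders richly.
class NotificationService: UNNotificationServiceExtension {
    override func didReceive(_ request: UNNotificationRequest,
                             withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
        let content = (request.content.mutableCopy() as? UNMutableNotificationContent) ?? UNMutableNotificationContent()

        // Assumption: the APNs payload carries the media URL under "image-url".
        guard let urlString = content.userInfo["image-url"] as? String,
              let url = URL(string: urlString) else {
            contentHandler(content)
            return
        }

        URLSession.shared.downloadTask(with: url) { tempURL, _, _ in
            defer { contentHandler(content) }
            guard let tempURL else { return }
            // Give the downloaded file an image extension so UNNotificationAttachment accepts it.
            let localURL = tempURL.deletingPathExtension().appendingPathExtension("jpg")
            try? FileManager.default.moveItem(at: tempURL, to: localURL)
            if let attachment = try? UNNotificationAttachment(identifier: "image", url: localURL, options: nil) {
                content.attachments = [attachment]
            }
        }.resume()
    }
}
```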
1 reply · 0 boosts · 127 views · 6d
Issue with iPhone QR Code Contact Preview Displaying Email Instead of Organization
Hi everyone, I'm encountering an issue with how the iPhone displays contact information from a vCard QR code in the contact preview. When I scan the QR code with my iPhone camera, the contact preview shows the email address between the name and the contact image, instead of displaying the organization name. Here's the structure of the vCard I'm using:

```
BEGIN:VCARD
VERSION:3.0
FN:Ahmad Rana
N:Rana;Ahmad;;;
ORG:Company 3
TEL;TYPE=voice,msg:+1234567890
EMAIL:a(at the rate)gmail.com
URL:https://example.com
IMPP:facebook:fb
END:VCARD
```

What I expect: when I scan it with the camera, the contact preview shown before the contact is created should display the organization name between the name and the image. Instead, I get the email address there. If only the organization is passed, it displays correctly, but as soon as I also pass an email, the email is displayed in between.

Steps I've taken:

- Verified the vCard structure to ensure it follows the standard format.
- Reordered the fields in the vCard to prioritize the organization name and job title.
- Tested with a simplified vCard containing only the name, organization, and email.

Despite these efforts, the email address continues to be displayed in the contact preview between the name and the contact image, while the organization name is not shown as expected.

Question: How can I ensure that the organization name is displayed correctly in the contact preview on iPhone when scanning a QR code? Are there specific rules or best practices for field prioritization in vCards that I might be missing? I would appreciate any insights or suggestions on how to resolve this issue. Thank you!
1 reply · 0 boosts · 174 views · 1w
Is there a way to directly go from VideoToolbox to Metal for 10-bit/BT.2020 YCbCr HEVC?
tl;dr: how can I get raw YUV in a Metal fragment shader from a VideoToolbox 10-bit/BT.2020 HEVC stream without any extra/secret format conversions?

With VideoToolbox and 10-bit HEVC, I've found that it defaults to CVPixelBuffers with the formats kCVPixelFormatType_Lossless_420YpCbCr10PackedBiPlanarFullRange or kCVPixelFormatType_Lossy_420YpCbCr10PackedBiPlanarFullRange. To mitigate this, I have added the following snippet of code to my application:

```swift
// We need our pixels unpacked for 10-bit so that the Metal textures actually work
var pixelFormat: OSType? = nil
let bpc = getBpcForVideoFormat(videoFormat!)
let isFullRange = getIsFullRangeForVideoFormat(videoFormat!)

// TODO: figure out how to check for 422/444, CVImageBufferChromaLocationBottomField?
if bpc == 10 {
    pixelFormat = isFullRange ? kCVPixelFormatType_420YpCbCr10BiPlanarFullRange : kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange
}

let videoDecoderSpecification: [NSString: AnyObject] = [kVTVideoDecoderSpecification_EnableHardwareAcceleratedVideoDecoder: kCFBooleanTrue]

var destinationImageBufferAttributes: [NSString: AnyObject] = [kCVPixelBufferMetalCompatibilityKey: true as NSNumber,
                                                               kCVPixelBufferPoolMinimumBufferCountKey: 3 as NSNumber]
if pixelFormat != nil {
    destinationImageBufferAttributes[kCVPixelBufferPixelFormatTypeKey] = pixelFormat! as NSNumber
}

var decompressionSession: VTDecompressionSession? = nil
err = VTDecompressionSessionCreate(allocator: nil,
                                   formatDescription: videoFormat!,
                                   decoderSpecification: videoDecoderSpecification as CFDictionary,
                                   imageBufferAttributes: destinationImageBufferAttributes as CFDictionary,
                                   outputCallback: nil,
                                   decompressionSessionOut: &decompressionSession)
```

In short, I need kCVPixelFormatType_420YpCbCr10BiPlanar so that I have a straightforward MTLPixelFormat.r16Unorm/MTLPixelFormat.rg16Unorm texture binding for Y/CbCr. Metal, seemingly, has no direct pixel format for 420YpCbCr10PackedBiPlanar. I'd also rather not use any color conversion in VideoToolbox, in order to save on processing (and to ensure that the color transforms/transfer characteristics match between streamer and client, since I also have a custom transfer characteristic to mitigate blocking in dark scenes).

However, I noticed that in visionOS 2, the CVPixelBuffer I receive is no longer a compressed render target (likely a bug), which caused GPU texture read bandwidth to skyrocket from 2 GiB/s to 30 GiB/s. More importantly, this implies that VideoToolbox may in fact be doing an extra color conversion step, wasting memory bandwidth.

Does Metal actually have no way to handle 420YpCbCr10PackedBiPlanar? Are there any examples of reading 10-bit HDR HEVC buffers directly with Metal?
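For what it's worth, here is a minimal sketch (my own, assuming the decoder has been coaxed into one of the 420YpCbCr10BiPlanar formats as above) of binding the luma and chroma planes as .r16Unorm/.rg16Unorm textures through a CVMetalTextureCache created beforehand with CVMetalTextureCacheCreate from the MTLDevice:

```swift
import CoreVideo
import Metal

// Sketch: wrap the two planes of a 10-bit bi-planar CVPixelBuffer as Metal textures.
// Plane 0 is 16-bit luma, plane 1 is interleaved 16-bit CbCr; the 10 significant
// bits sit in the most significant bits of each 16-bit sample.
func makeLumaChromaTextures(from pixelBuffer: CVPixelBuffer,
                            cache: CVMetalTextureCache) -> (luma: MTLTexture, chroma: MTLTexture)? {
    func plane(_ index: Int, _ format: MTLPixelFormat) -> MTLTexture? {
        let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, index)
        let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, index)
        var cvTexture: CVMetalTexture?
        let status = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, cache, pixelBuffer,
                                                               nil, format, width, height, index, &cvTexture)
        guard status == kCVReturnSuccess, let cvTexture else { return nil }
        return CVMetalTextureGetTexture(cvTexture)
    }
    guard let y = plane(0, .r16Unorm), let cbcr = plane(1, .rg16Unorm) else { return nil }
    return (y, cbcr)
}
```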
1 reply · 0 boosts · 147 views · 6d
Location not visible in video recorded in third party app
I recently bought an Insta360 Flow gimbal. When recording video with the Insta360 app, I cannot see the location in the Apple Photos app or any other Apple apps. However, I can see the location in the Windows Photos app once I download the videos to my Windows PC. The location is also visible in an Android app once I share the video through my Google account. With an EXIF app, I can see the location in the EXIF metadata table as well, but it is still not shown as a location. exiftool on my PC can also see the metadata, including the location, as in the attached screenshot. Compared to video shot with the built-in Camera app, I cannot find any difference in terms of location metadata. What could be wrong? I contacted Insta360 app support; they do not seem to understand what's going on and just keep asking very basic questions like "Did you enable GPS location access?" and "Are you shooting video?" I also contacted Apple support; they just say it's a third-party issue and refuse to help further. If it's really a third-party issue, how come the location data is actually embedded as metadata, and a Windows PC and an Android device can see it? BTW, I AirDropped this video to all my Apple devices (iPhone 15 Ultra, iPad Air, and a very old iPhone), and none of them can see the location.
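Not part of the original post, but one way to narrow this down: as far as I know, Apple's apps read a video's location from the QuickTime metadata keyspace (com.apple.quicktime.location.ISO6709) rather than from EXIF-style GPS tags, so a hedged sketch like the following can show which location keys the third-party app actually wrote:

```swift
import AVFoundation

// Hypothetical helper: print any location-related metadata items in the asset,
// to compare where the third-party app stored GPS data versus the built-in Camera app.
func printLocationMetadata(for url: URL) async throws {
    let asset = AVURLAsset(url: url)
    let metadata = try await asset.load(.metadata)
    for item in metadata {
        let id = item.identifier?.rawValue ?? "unknown"
        if id.lowercased().contains("location") || item.commonKey == .commonKeyLocation {
            let value = try await item.load(.value)
            print("\(id): \(String(describing: value))")
        }
    }
}
```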
1 reply · 0 boosts · 174 views · 6d
Generating Live Photo from JPG and MOV fails
I am working on an iOS application using SwiftUI where I want to convert a JPG and a MOV file to a Live Photo. I am utilizing the LivePhoto class from GitHub for this. The JPG and MOV files are displayed correctly in my WallpaperDetailView, but I am facing issues when trying to save the Live Photo to the gallery and generate the Live Photo. Here is the relevant code and the errors I am encountering.

Console prints:

```
Play button should be visible
Image URL fetched and set: Optional("https://firebasestorage.googleapis.com/...")
Video is ready to play
Video downloaded to: file:///var/mobile/Containers/Data/Application/.../tmp/CFNetworkDownload_7rW5ny.tmp
Failed to generate Live Photo
```

I have verified that the app has the necessary permissions to access the photo library. The JPEG and MOV files are successfully downloaded and can be displayed in the app. The issue seems to occur when generating the Live Photo from the downloaded files.

```swift
struct WallpaperDetailView: View {
    var wallpaper: Wallpaper

    @State private var isLoading = false
    @State private var isImageSaved = false
    @State private var imageURL: URL?
    @State private var livePhotoVideoURL: URL?
    @State private var player: AVPlayer?
    @State private var playerViewController: AVPlayerViewController?
    @State private var isVideoReady = false
    @State private var showBuffering = false

    var body: some View {
        ZStack {
            if let imageURL = imageURL {
                GeometryReader { geometry in
                    KFImage(imageURL)
                        .resizable()
                        ...
                }
            }
            if let playerViewController = playerViewController {
                VideoPlayerViewController(playerViewController: playerViewController)
                    .frame(maxWidth: .infinity, maxHeight: .infinity)
                    .clipped()
                    .edgesIgnoringSafeArea(.all)
            }
        }
        .onAppear {
            PHPhotoLibrary.requestAuthorization { status in
                if status == .authorized {
                    loadImage()
                } else {
                    print("User denied access to photo library")
                }
            }
        }
    }

    private func loadImage() {
        isLoading = true
        if let imageURLString = wallpaper.imageURL, let imageURL = URL(string: imageURLString) {
            self.imageURL = imageURL
            if imageURL.scheme == "file" {
                self.isLoading = false
                print("Local image URL set: \(imageURL)")
            } else {
                fetchDownloadURL(from: imageURLString) { url in
                    self.imageURL = url
                    self.isLoading = false
                    print("Image URL fetched and set: \(String(describing: url))")
                }
            }
        }
        if let livePhotoVideoURLString = wallpaper.livePhotoVideoURL, let livePhotoVideoURL = URL(string: livePhotoVideoURLString) {
            self.livePhotoVideoURL = livePhotoVideoURL
            preloadAndPlayVideo(from: livePhotoVideoURL)
        } else {
            self.isLoading = false
            print("No valid image or video URL")
        }
    }

    private func preloadAndPlayVideo(from url: URL) {
        self.player = AVPlayer(url: url)
        let playerViewController = AVPlayerViewController()
        playerViewController.player = self.player
        self.playerViewController = playerViewController
        let playerItem = AVPlayerItem(url: url)
        playerItem.preferredForwardBufferDuration = 1.0
        self.player?.replaceCurrentItem(with: playerItem)
        ...
        print("Live Photo Video URL set: \(url)")
    }

    private func saveWallpaperToPhotos() {
        if let imageURL = imageURL, let livePhotoVideoURL = livePhotoVideoURL {
            saveLivePhotoToPhotos(imageURL: imageURL, videoURL: livePhotoVideoURL)
        } else if let imageURL = imageURL {
            saveImageToPhotos(url: imageURL)
        }
    }

    private func saveImageToPhotos(url: URL) {
        ...
    }

    private func saveLivePhotoToPhotos(imageURL: URL, videoURL: URL) {
        isLoading = true
        downloadVideo(from: videoURL) { localVideoURL in
            guard let localVideoURL = localVideoURL else {
                print("Failed to download video for Live Photo")
                DispatchQueue.main.async {
                    self.isLoading = false
                }
                return
            }
            print("Video downloaded to: \(localVideoURL)")
            self.generateAndSaveLivePhoto(imageURL: imageURL, videoURL: localVideoURL)
        }
    }

    private func generateAndSaveLivePhoto(imageURL: URL, videoURL: URL) {
        LivePhoto.generate(from: imageURL, videoURL: videoURL, progress: { percent in
            print("Progress: \(percent)")
        }, completion: { livePhoto, resources in
            guard let resources = resources else {
                print("Failed to generate Live Photo")
                DispatchQueue.main.async {
                    self.isLoading = false
                }
                return
            }
            print("Live Photo generated with resources: \(resources)")
            self.saveLivePhotoToLibrary(resources: resources)
        })
    }

    private func saveLivePhotoToLibrary(resources: LivePhoto.LivePhotoResources) {
        LivePhoto.saveToLibrary(resources) { success in
            DispatchQueue.main.async {
                if success {
                    self.isImageSaved = true
                    print("Live Photo saved successfully")
                } else {
                    print("Failed to save Live Photo")
                }
                self.isLoading = false
            }
        }
    }

    private func fetchDownloadURL(from gsURL: String, completion: @escaping (URL?) -> Void) {
        let storageRef = Storage.storage().reference(forURL: gsURL)
        storageRef.downloadURL { url, error in
            if let error = error {
                print("Failed to fetch image URL: \(error)")
                completion(nil)
            } else {
                completion(url)
            }
        }
    }

    private func downloadVideo(from url: URL, completion: @escaping (URL?) -> Void) {
        let task = URLSession.shared.downloadTask(with: url) { localURL, response, error in
            guard let localURL = localURL, error == nil else {
                print("Failed to download video: \(String(describing: error))")
                completion(nil)
                return
            }
            completion(localURL)
        }
        task.resume()
    }
}
```
0 replies · 0 boosts · 96 views · 1w
iPhone 14 & 15 Pro model camera focusing error / Unity app
Hi, I would like to ask for your advice about our iOS app, which has a problem with the iPhone 14 Pro and iPhone 15 Pro camera focusing on close objects. The game is made with Unity, and the idea is that kids can scan random QR codes and barcodes to catch monsters. On non-Pro iPhone models the camera focuses and reads the barcodes well, but on the new models with the triple back camera system our app does not focus up close, which makes barcode reading hard. It seems that the iPhone Pro models do not switch to the close-focusing camera lens in a third-party app the way they do in the built-in Camera app. Do you have any advice on how to solve the focusing problem? I appreciate your help! A native-side sketch of the usual focus settings is included below for reference.
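This is not from the original post, and Unity's own camera path may not expose it, but if the capture pipeline can be reached from a native plugin, the usual AVFoundation-side levers look roughly like this sketch (device choice and availability are assumptions):

```swift
import AVFoundation

// Sketch: prefer a virtual device so the system can switch lenses for close focus,
// and restrict autofocus to the near range, the usual hint for barcode scanning.
func configureCloseFocusDevice() -> AVCaptureDevice? {
    // A virtual dual-wide device lets the system pick the ultra-wide lens when the
    // subject is very close; fall back to the plain wide-angle camera otherwise.
    let device = AVCaptureDevice.default(.builtInDualWideCamera, for: .video, position: .back)
        ?? AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)
    guard let device else { return nil }

    do {
        try device.lockForConfiguration()
        if device.isAutoFocusRangeRestrictionSupported {
            device.autoFocusRangeRestriction = .near   // bias focus toward nearby barcodes
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not configure capture device: \(error)")
    }
    return device
}
```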
0 replies · 0 boosts · 146 views · 1w
Voice recorder app recording in dual mono instead of stereo
Hi y'all, after getting mono recording working, I want to differentiate my app from the standard Voice Memos app by allowing stereo recording. I followed this tutorial (https://developer.apple.com/documentation/avfaudio/capturing_stereo_audio_from_built-in_microphones) to get my voice recorder to record stereo audio. However, when I look at the waveform in Audacity, both channels are the same. If I look at the file info after sharing it, it says the file is in stereo. I don't know exactly what's going on here. What I suspect is happening is that the recorder is only using one microphone. Here is the relevant part of my recorder:

```swift
// MARK: - Initialization
override init() {
    super.init()
    do {
        try configureAudioSession()
        try enableBuiltInMicrophone()
        try setupAudioRecorder()
    } catch {
        // If any errors occur during initialization,
        // terminate the app with a fatalError.
        fatalError("Error: \(error)")
    }
}

// MARK: - Audio Session and Recorder Configuration
private func enableBuiltInMicrophone() throws {
    let audioSession = AVAudioSession.sharedInstance()
    let availableInputs = audioSession.availableInputs

    guard let builtInMicInput = availableInputs?.first(where: { $0.portType == .builtInMic }) else {
        throw Errors.NoBuiltInMic
    }

    do {
        try audioSession.setPreferredInput(builtInMicInput)
    } catch {
        throw Errors.UnableToSetBuiltInMicrophone
    }
}

private func configureAudioSession() throws {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(.record, mode: .default, options: [.allowBluetooth])
        try audioSession.setActive(true)
    } catch {
        throw Errors.FailedToInitSessionError
    }
}

private func setupAudioRecorder() throws {
    let date = Date()
    let dateFormatter = DateFormatter()
    dateFormatter.locale = Locale(identifier: "en_US_POSIX")
    dateFormatter.dateFormat = "yyyy-MM-dd, HH:mm:ss"
    let timestamp = dateFormatter.string(from: date)

    self.recording = Recording(name: timestamp)

    guard let fileURL = recording?.returnURL() else {
        fatalError("Failed to create file URL")
    }
    self.currentURL = fileURL
    print("Recording URL: \(fileURL)")

    do {
        let audioSettings: [String: Any] = [
            AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
            AVLinearPCMIsNonInterleaved: false,
            AVSampleRateKey: 44_100.0,
            AVNumberOfChannelsKey: isStereoSupported ? 2 : 1,
            AVLinearPCMBitDepthKey: 16,
            AVEncoderAudioQualityKey: AVAudioQuality.max.rawValue
        ]
        audioRecorder = try AVAudioRecorder(url: fileURL, settings: audioSettings)
    } catch {
        throw Errors.UnableToCreateAudioRecorder
    }

    audioRecorder.delegate = self
    audioRecorder.prepareToRecord()
}

//MARK: update orientation
public func updateOrientation(withDataSourceOrientation orientation: AVAudioSession.Orientation = .front,
                              interfaceOrientation: UIInterfaceOrientation) async throws {
    let session = AVAudioSession.sharedInstance()
    guard let preferredInput = session.preferredInput,
          let dataSources = preferredInput.dataSources,
          let newDataSource = dataSources.first(where: { $0.orientation == orientation }),
          let supportedPolarPatterns = newDataSource.supportedPolarPatterns else {
        return
    }

    isStereoSupported = supportedPolarPatterns.contains(.stereo)
    if isStereoSupported {
        try newDataSource.setPreferredPolarPattern(.stereo)
    }

    try preferredInput.setPreferredDataSource(newDataSource)
    try session.setPreferredInputOrientation(interfaceOrientation.inputOrientation)
}
```

Here is the relevant part of my SwiftUI view:

```swift
RecordView()
    .onAppear {
        Task {
            if await AVAudioApplication.requestRecordPermission() {
                // The user grants access. Present recording interface.
                print("Permission granted")
            } else {
                // The user denies access. Present a message that indicates
                // that they can change their permission settings in the
                // Privacy & Security section of the Settings app.
                model.showAlert.toggle()
            }
            try await recorder.updateOrientation(interfaceOrientation: deviceOrientation)
        }
    }
    .onReceive(NotificationCenter.default.publisher(for: UIDevice.orientationDidChangeNotification)) { _ in
        if let windowScene = UIApplication.shared.connectedScenes.first as? UIWindowScene,
           let orientation = windowScene.windows.first?.windowScene?.interfaceOrientation {
            deviceOrientation = orientation
            Task {
                do {
                    try await recorder.updateOrientation(interfaceOrientation: deviceOrientation)
                } catch {
                    throw Errors.UnableToUpdateOrientation
                }
            }
        }
    }
```

Here is the full repo: https://github.com/aabagdi/MemoMan/tree/MemoManStereo

Thanks for any leads!
1 reply · 0 boosts · 171 views · 1w
PiP not launching from a WKWebview Sandboxed app
Hi, I am developing an app that has a WKWebView which can open sites like YouTube. The app is sandboxed, as it is meant to be uploaded to the Mac App Store. It has a Picture in Picture feature where we start the native PiP by running JavaScript in the browser context, telling the WKWebView to fire PiP. It works well when we run the code from Xcode with the Debug scheme. When we run the app in Release mode, either by archiving it or by launching the build directly from the build folder in Finder, the WKWebView is not able to launch the PiP agent, so the native PiP window is not visible, even though the site shows that PiP is open and we can hear the sound being played. I cannot see PiPAgent in Activity Monitor. Why does this not work in a release build run outside Xcode? Requesting technical help with this. Thanks!
0 replies · 0 boosts · 99 views · 1w
Input location of AVAudioSession is different between iPhone models
The reported position of the AVAudioSession input differs between devices when I use the speaker.

```swift
try session.setCategory(.playAndRecord, mode: .voiceChat, options: [])
try session.overrideOutputAudioPort(.speaker)
try session.setActive(true)

let route = session.currentRoute
route.inputs.forEach { input in
    print(input.selectedDataSource?.location)
}
```

On an iPhone 11 (iOS 17.5.1): AVAudioSessionLocation: Lower
On an iPhone 7 Plus (iOS 15.8.2): AVAudioSessionLocation: Upper

What causes this difference in behavior?
0 replies · 0 boosts · 145 views · 1w
AVAudioSessionErrorCodeCannotInterruptOthers
We check AVAudioSessionInterruptionOptionShouldResume to decide whether to resume audio playback after an interruption. This has been in production for a long time, and playback has normally resumed correctly. Recently, however, we have received a lot of user feedback asking why audio does not resume playing. Investigating this feedback, we found that some apps hold the audio session even when they are not playing audio. For example, when a user was using the WeChat app, after sending a voice message we received the notification to resume audio playback even though WeChat was not playing audio; when we then tried to resume playback, we got the error AVAudioSessionErrorCodeCannotInterruptOthers. We reported this to the WeChat team and that case was fixed. But some users still report the problem, and we do not know which app is holding the audio session, so we do not know where to start troubleshooting. We pay close attention to user feedback and hope someone can help us solve this user-experience problem.
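For context, here is a minimal sketch of the interruption-handling pattern being described (my own illustration, not the app's actual code); the setActive(true) call is where AVAudioSessionErrorCodeCannotInterruptOthers surfaces when another non-mixable app still holds the audio session:

```swift
import AVFoundation

// Sketch: observe interruption notifications and only try to reactivate the
// session when the system says resuming is appropriate (.shouldResume).
final class PlaybackInterruptionHandler {
    private var observer: NSObjectProtocol?

    init() {
        observer = NotificationCenter.default.addObserver(
            forName: AVAudioSession.interruptionNotification,
            object: AVAudioSession.sharedInstance(),
            queue: .main
        ) { notification in
            guard let info = notification.userInfo,
                  let typeValue = info[AVAudioSessionInterruptionTypeKey] as? UInt,
                  let type = AVAudioSession.InterruptionType(rawValue: typeValue) else { return }

            switch type {
            case .began:
                // Pause playback here.
                break
            case .ended:
                let optionsValue = info[AVAudioSessionInterruptionOptionKey] as? UInt ?? 0
                let options = AVAudioSession.InterruptionOptions(rawValue: optionsValue)
                if options.contains(.shouldResume) {
                    do {
                        // This call can fail with AVAudioSessionErrorCodeCannotInterruptOthers
                        // if another (non-mixable) app still owns the audio session.
                        try AVAudioSession.sharedInstance().setActive(true)
                        // Resume playback here.
                    } catch {
                        print("Could not reactivate session: \(error)")
                    }
                }
            @unknown default:
                break
            }
        }
    }

    deinit {
        if let observer { NotificationCenter.default.removeObserver(observer) }
    }
}
```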
0 replies · 0 boosts · 112 views · 1w