AVFoundation


Work with audiovisual assets, control device cameras, process audio, and configure system audio interactions using AVFoundation.

AVFoundation Documentation

Posts under AVFoundation tag

431 Posts
Post not yet marked as solved
3 Replies
8.5k Views
I am trying to get AVPlayer to play an m3u8 playlist that is a local file. I have narrowed this down to a simple test case using one of Apple's sample playlists: https://tungsten.aaplimg.com/VOD/bipbop_adv_fmp4_example/master.m3u8

If I play this playlist from the remote URL, AVPlayer plays it fine. However, if I download the playlist to a local file and then hand AVPlayer the local file URL, AVPlayer will not play it. It just shows the crossed-out play symbol.

Interestingly enough, this can be duplicated with Safari as well: Safari will play the remote playlist but not the local file. Also of note is that this behavior of AVPlayer is identical on iOS 10.2 as well as macOS 10.12.1. Playing an mp4 media file directly (not wrapped in a playlist) does not seem to have this issue, as it plays both from a remote URL and from a local file.

Inspecting AVPlayerItem.error does not lead to anything useful either: "An unknown error occurred (-12865). The operation could not be completed."

Is anyone aware of any limitation that would prevent AVPlayer from playing a local playlist? Thank you.
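A minimal sketch of the failing case described above (the local file path is hypothetical), observing the item's status to surface the -12865 error:

```swift
import AVFoundation

// Hypothetical local copy of Apple's sample master playlist.
let localURL = URL(fileURLWithPath: "/tmp/master.m3u8")
let item = AVPlayerItem(url: localURL)
let player = AVPlayer(playerItem: item)

// Watch the item's status; with a local playlist this is where
// the -12865 failure described above shows up.
let observation = item.observe(\.status, options: [.new]) { item, _ in
    switch item.status {
    case .failed:
        print("Failed: \(item.error?.localizedDescription ?? "unknown")")
    case .readyToPlay:
        player.play()
    default:
        break
    }
}
```

One thing worth checking: a master playlist references its variant playlists and media segments by relative URL, so a locally saved master whose children were not also downloaded (with the same relative layout) cannot be resolved, which is one plausible reason local playback fails while remote playback works.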
Posted
by
Post not yet marked as solved
1 Reply
376 Views
Currently, I'm recording audio on the Apple Watch using presentAudioRecorderController and sending it to the corresponding iPhone app. The audio recorded on the Apple Watch seems to use two channels by default, and there doesn't seem to be a way to change that.

For various reasons, I need the audio to be only one channel. I was hoping there would be something on iOS to convert the audio file from stereo (two channels) to mono (one channel). Right now, I'm investigating whether this is possible using AVAudioEngine and various related classes. I'm not sure if it's possible, and it seems very complicated at the moment.
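For file-to-file conversion, AVAudioConverter (rather than AVAudioEngine) may be enough; it performs channel-count conversion when the input and output formats differ. A sketch, with hypothetical paths and an arbitrary buffer size:

```swift
import AVFoundation

// Sketch: convert a recorded stereo file to a mono file with AVAudioConverter.
func convertToMono(input inputURL: URL, output outputURL: URL) throws {
    let inputFile = try AVAudioFile(forReading: inputURL)
    let monoFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                   sampleRate: inputFile.processingFormat.sampleRate,
                                   channels: 1, interleaved: false)!
    let converter = AVAudioConverter(from: inputFile.processingFormat, to: monoFormat)!
    let outputFile = try AVAudioFile(forWriting: outputURL, settings: monoFormat.settings)

    let inBuffer = AVAudioPCMBuffer(pcmFormat: inputFile.processingFormat, frameCapacity: 4096)!
    let outBuffer = AVAudioPCMBuffer(pcmFormat: monoFormat, frameCapacity: 4096)!

    while inputFile.framePosition < inputFile.length {
        try inputFile.read(into: inBuffer)
        var consumed = false
        var error: NSError?
        // The input block may be called more than once per convert() call,
        // so hand over each chunk exactly once.
        converter.convert(to: outBuffer, error: &error) { _, outStatus in
            if consumed { outStatus.pointee = .noDataNow; return nil }
            consumed = true
            outStatus.pointee = .haveData
            return inBuffer
        }
        if let error = error { throw error }
        try outputFile.write(from: outBuffer)
    }
}
```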
Posted
by
Post not yet marked as solved
1 Reply
1.1k Views
I use AVPlayerViewController to play short videos in my app. If another app is playing audio in the background before the user plays a video in my app, I want that app's audio to resume after my video player is dismissed. I currently use AVAudioSession.setActive(false, with: .notifyOthersOnDeactivation) to do that.

Apple's Music app and Podcasts app do resume playing after I call AVAudioSession.setActive(false, with: .notifyOthersOnDeactivation) — they won't resume without the call, so the call does have an effect — but none of the third-party music or podcast apps that I tested (Spotify, SoundCloud, Amazon Music, Overcast) do.

I doubt it's because none of these popular third-party apps support resuming background audio. There must be something missing in my code:

```swift
class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // I know it's not the best place to call setActive, nor does it cover
        // all the cases. It's just a convenient place to put the code to test
        // its effect after dismissing the video player.
        do {
            try AVAudioSession.sharedInstance().setActive(false, with: .notifyOthersOnDeactivation)
            try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient)
            try AVAudioSession.sharedInstance().setActive(true)
        } catch {
            print(error)
        }
    }

    @IBAction func play(_ sender: Any) {
        do {
            try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
            try AVAudioSession.sharedInstance().setActive(true)
        } catch {
            print(error)
        }
        let playerController = AVPlayerViewController()
        playerController.player = AVPlayer(url: URL(string: "http://gslb.miaopai.com/stream/UkjiD45ddxZFQ79I2bLaGg__.mp4")!)
        playerController.player?.play()
        present(playerController, animated: true)
    }
}
```

Even though someone on Stack Overflow thinks this code is fine and it's these third-party apps' fault, I still believe something can be done to make them resume audio playback, because the Twitter app can. But I don't know what Twitter does to achieve that; does anyone know?

PS: here is the complete project so anyone interested can try it.
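One pattern worth trying — a sketch, not a confirmed fix: pause the player and deactivate the session in the dismissal completion handler, so that the deactivation (and the end-interruption signal it sends to other apps) happens only after playback has actually stopped. The helper below is hypothetical:

```swift
import UIKit
import AVKit
import AVFoundation

// Hypothetical helper: dismiss the player, then deactivate the audio session.
func dismissPlayer(_ playerController: AVPlayerViewController,
                   from presenter: UIViewController) {
    playerController.player?.pause()
    presenter.dismiss(animated: true) {
        // Deactivating while the session is still busy with I/O can fail;
        // doing it after dismissal gives other apps a clean resume signal.
        do {
            try AVAudioSession.sharedInstance().setActive(false, with: .notifyOthersOnDeactivation)
        } catch {
            print("Deactivation error: \(error)")
        }
    }
}
```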
Posted
by
an0
Post not yet marked as solved
1 Reply
902 Views
I'd like to use AVAudioConverter to convert audio captured from the microphone to μLaw. Unfortunately, when I try to create an output buffer to convert into, I get an exception. I've tried both AVAudioPCMBuffer and AVAudioCompressedBuffer, and neither works for me. Is this supposed to work? Thanks!

```swift
let format = AVAudioFormat(settings: [AVFormatIDKey: NSNumber(value: kAudioFormatULaw),
                                      AVSampleRateKey: 8000,
                                      AVNumberOfChannelsKey: 1])

let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: 1000)
// required condition is false: isPCMFormat

let buffer = AVAudioCompressedBuffer(format: format, packetCapacity: 1000)
// required condition is false: !(fmt.IsPCM() || fmt.mFormatID == kAudioFormatALaw || fmt.mFormatID == kAudioFormatULaw)
```
Posted
by
Post not yet marked as solved
1 Reply
658 Views
Hi,

I am writing an app for people with variable hearing issues. I need to be able to change the left-right volume of music being played (Apple Music, Spotify, etc.) dynamically as the environment changes. Being able to set the Accessibility option (Settings > General > Accessibility > Left-Right Audio Balance) from code would be perfect.

Does anyone know how I can programmatically set and adjust this left-right volume balance from my Swift app? I've Googled this to death and can't find anything.

Andy
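For what it's worth, a sandboxed app cannot change the system Accessibility balance or affect audio that other apps (Apple Music, Spotify) are playing. For audio your own app plays, a per-player balance does exist — a minimal sketch, with a hypothetical file path:

```swift
import AVFoundation

// Sketch: left-right balance for audio your OWN app plays; this does not
// touch the system-wide Accessibility balance or other apps' output.
do {
    let player = try AVAudioPlayer(contentsOf: URL(fileURLWithPath: "/tmp/music.m4a"))
    player.pan = -0.5   // -1.0 = full left, 0.0 = center, 1.0 = full right
    player.play()
} catch {
    print("Could not load audio: \(error)")
}
```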
Post not yet marked as solved
3 Replies
1.3k Views
I am developing a video playback application which can play ads along with the content. I can successfully start playback with an ad on a device (iPad/iPhone), but it never plays back in AirPlay mode. I get the error notification below:

AVPlayerItemFailedToPlayToEndTimeErrorKey = "Error Domain=AVFoundationErrorDomain Code=-11800 \"The operation could not be completed\" UserInfo={NSUnderlyingError=0x17005eff0 {Error Domain=NSOSStatusErrorDomain Code=-12926 \"(null)\"}, NSLocalizedFailureReason=An unknown error occurred (-12926), NSLocalizedDescription=The operation could not be completed"

Any idea what could be wrong?
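To capture that error programmatically rather than scraping it from logs, the failure notification can be observed on the item; `playerItem` below stands in for the item created elsewhere in the app:

```swift
import AVFoundation

// Sketch: capture the underlying error when an item fails during (AirPlay) playback.
func observeFailures(of playerItem: AVPlayerItem) {
    NotificationCenter.default.addObserver(
        forName: .AVPlayerItemFailedToPlayToEndTime,
        object: playerItem,
        queue: .main
    ) { note in
        if let error = note.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? NSError {
            // The NSUnderlyingError (here, OSStatus -12926) is usually more
            // specific than the top-level AVFoundation -11800 error.
            print("Failed: \(error)")
            print("Underlying: \(String(describing: error.userInfo[NSUnderlyingErrorKey]))")
        }
    }
}
```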
Posted
by
Post not yet marked as solved
1 Reply
348 Views
Hi,

I am trying to play different local videos on a button press inside a collection view cell.

My problem: I start the first video by pressing the button and presenting the AVPlayer with playerViewController.player?.play(). I can now rotate my device and the video plays in landscape mode. After I press Done, it goes back to the collection view and I can choose the second video. The second video plays, but when I go into landscape mode the first video shows up, and there is no Done or back button. If I stay in portrait mode everything is fine.

Any ideas? Thanks a lot.
Posted
by
Post not yet marked as solved
4 Replies
1.7k Views
I'm using AVAudioSession in my VoIP app (which also uses CallKit). I understand that the session can get interrupted by a number of things, for example by a second incoming call.

Apple states towards the bottom of this page, https://developer.apple.com/library/content/documentation/Audio/Conceptual/AudioSessionProgrammingGuide/HandlingAudioInterruptions/HandlingAudioInterruptions.html#//apple_ref/doc/uid/TP40007875-CH4-SW5:

"Note: There is no guarantee that a begin interruption will have a corresponding end interruption. Your app needs to be aware of a switch to a foreground running state or the user pressing a Play button. In either case, determine whether your app should reactivate its audio session."

On that page there is an example where the user ignores the incoming call, in which case AVAudioSessionInterruptionTypeEnded is sent. But what should I do in every other case, when I don't get an AVAudioSessionInterruptionTypeEnded? (E.g. when the user answers the second call, puts me on hold, and later ends the second call?) Thanks!
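A sketch of the pattern the quoted note suggests: handle both halves of the interruption notification, and additionally treat "app returned to the foreground" as a cue to reactivate, covering interruptions whose end event never arrives (the exact reactivation policy is app-specific):

```swift
import UIKit
import AVFoundation

// Sketch: interruption handling with a foreground-activation fallback.
final class InterruptionHandler {
    init() {
        let center = NotificationCenter.default
        center.addObserver(forName: .AVAudioSessionInterruption,
                           object: nil, queue: .main) { note in
            guard let raw = note.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
                  let type = AVAudioSessionInterruptionType(rawValue: raw) else { return }
            switch type {
            case .began:
                // Pause the audio pipeline here.
                break
            case .ended:
                let raw = note.userInfo?[AVAudioSessionInterruptionOptionKey] as? UInt ?? 0
                if AVAudioSessionInterruptionOptions(rawValue: raw).contains(.shouldResume) {
                    try? AVAudioSession.sharedInstance().setActive(true)
                }
            }
        }
        // Fallback for interruptions that never deliver an end event
        // (e.g. the second call was answered and later hung up).
        center.addObserver(forName: .UIApplicationDidBecomeActive,
                           object: nil, queue: .main) { _ in
            try? AVAudioSession.sharedInstance().setActive(true)
        }
    }
}
```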
Posted
by
Post not yet marked as solved
2 Replies
2.4k Views
Hi,

I'm researching possible ways to stream external video and audio into an iOS device. Up until now I have come up with the following "solutions" (more like hacks):

Developing an MFi camera - I couldn't find any existing external camera that works using MFi, so I guess it's reasonable to assume Apple is not allowing MFi cameras; I would appreciate an official word on the matter.

Wifi camera - There are several problems with this. Connectivity-wise, the iOS device needs to be on the same network as the wifi camera; if the camera provides its own network, internet connectivity won't work (Wifi + Cellular can't work together, right?). Latency is also an issue.

MultipeerConnectivity - This is not really a solution since it only allows connecting to other iOS devices, right?

Lightning to USB 3 Adapter - this allows connecting a PTP/mass storage device only, to iPads AFAIK.

Will appreciate any other ideas on how to connect to an external camera stream!
Posted
by
Post marked as solved
3 Replies
2.6k Views
I'm trying to record video, with audio input coming from a connected Bluetooth headset. However, I haven't been able to get an AVCaptureDevice that represents the headset, so I can't add it to my AVCaptureSession.

I tried to find it using a discovery session, but it never finds the headset — only the built-in microphone. That may not be surprising since I'm asking for .builtInMicrophone, but the enumeration has no members for external audio inputs. I tried leaving the device type array empty, but that finds no devices.

```swift
let discoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInMicrophone],
                                                        mediaType: AVMediaType.audio,
                                                        position: .unspecified)
print("Found \(discoverySession.devices.count) devices")
for device in discoverySession.devices {
    print("Device: \(device)")
}
```

After doing some searching I tried setting up the AVAudioSession to specifically allow Bluetooth. However, this had no effect on the above.

```swift
private let session = AVCaptureSession()

// later...
session.usesApplicationAudioSession = true
session.automaticallyConfiguresApplicationAudioSession = false
do {
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, with: [.allowBluetooth])
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print("Error messing with audio session: \(error)")
}
```

For completeness I also tried the deprecated AVCaptureDevice.devices() method, but it doesn't find the Bluetooth headset either.

I know that the headset is available because AVAudioSession can see it. However, I haven't been able to get from the availableInputs array to something I can use in an AVCaptureSession. The following code finds the headset, but what would I do with the result? It's not an AVCaptureDevice, nor can I construct one from the entries in the array. Setting the "preferred" input doesn't have any effect that I know how to use.

```swift
if let availableInputs = AVAudioSession.sharedInstance().availableInputs {
    print("Found \(availableInputs.count) inputs")
    for input in availableInputs {
        print("Input: \(input)")
        if input.portType == AVAudioSessionPortBluetoothHFP {
            print("Setting preferred input")
            do {
                try AVAudioSession.sharedInstance().setPreferredInput(input)
            } catch {
                print("Error setting preferred input: \(error)")
            }
        }
    }
}
```

Given that the Bluetooth headset is connected and available, how do I set it as the audio input for my capture session?
Posted
by
Post not yet marked as solved
5 Replies
1.6k Views
I'm trying to obtain the intrinsic matrix for each video frame of an AVCaptureSession (the same intrinsic matrix that ARKit provides); however, the isCameraIntrinsicMatrixDeliverySupported property of AVCaptureConnection is false in my use case. The documentation of the property says: "This property's value is true only if both the connection's input device format and output class support delivery of camera intrinsics."

How do I know which device formats support delivery of the intrinsic matrix? What do I need to do to be able to enable intrinsic matrix delivery?

Simple code to illustrate my problem:

```swift
import UIKit
import AVFoundation

class ViewController: UIViewController {
    var sess: AVCaptureSession!
    var sessOut: AVCaptureVideoDataOutput!
    var prevLayer: AVCaptureVideoPreviewLayer!

    override func viewDidLoad() {
        super.viewDidLoad()
        sess = AVCaptureSession()
        let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: AVMediaType.video, position: .back)
        let input = try! AVCaptureDeviceInput(device: device!)
        sess.addInput(input)
        sessOut = AVCaptureVideoDataOutput()
        sess.addOutput(sessOut)
        sessOut.connections.first?.videoOrientation = .landscapeRight
        sessOut.connections.first?.preferredVideoStabilizationMode = .cinematic
        print(sessOut.connections.first?.isCameraIntrinsicMatrixDeliverySupported) // <-- false - why?
        prevLayer = AVCaptureVideoPreviewLayer(session: sess)
        prevLayer.frame = self.view.frame
        prevLayer.videoGravity = .resizeAspectFill
        prevLayer.connection?.videoOrientation = .landscapeRight
        self.view.layer.addSublayer(prevLayer)
        sess.startRunning()
    }
}
```
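A sketch of the two pieces typically needed, offered under the assumption that the device format supports delivery at all: opt in on the connection when it reports support, then read the 3x3 matrix out of each sample buffer's attachment in the data-output delegate.

```swift
import AVFoundation
import simd

// 1. After adding the output to the session, opt in when supported.
func enableIntrinsics(on output: AVCaptureVideoDataOutput) {
    if let connection = output.connections.first,
       connection.isCameraIntrinsicMatrixDeliverySupported {
        connection.isCameraIntrinsicMatrixDeliveryEnabled = true
    }
}

// 2. In captureOutput(_:didOutput:from:), pull the matrix out of the buffer.
func intrinsicMatrix(from sampleBuffer: CMSampleBuffer) -> matrix_float3x3? {
    guard let data = CMGetAttachment(sampleBuffer,
                                     kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix,
                                     nil) as? Data else { return nil }
    return data.withUnsafeBytes { (ptr: UnsafePointer<matrix_float3x3>) in ptr.pointee }
}
```

Also worth noting: the snippet in the question sets preferredVideoStabilizationMode to .cinematic, and intrinsic-matrix delivery is reportedly unavailable while video stabilization is enabled, so trying .off first may flip isCameraIntrinsicMatrixDeliverySupported to true.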
Posted
by
Post not yet marked as solved
1 Reply
804 Views
Device: iPad Pro 10.5-inch
iOS: 10.3.2, 10.3.3

I use the following function to take a snapshot in the app: UIView (UISnapshotting) - (BOOL)drawViewHierarchyInRect:(CGRect)rect afterScreenUpdates:(BOOL)afterUpdates;

```objc
CGRect rect = self.view.bounds;
UIGraphicsBeginImageContextWithOptions(rect.size, NO, [[UIScreen mainScreen] scale]);
[self.view drawViewHierarchyInRect:rect afterScreenUpdates:YES];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
```

However, every time I use this feature, the camera preview rendered by AVFoundation, used in another part of the application, becomes slow.

```objc
self.session = [[AVCaptureSession alloc] init];
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
self.input = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
[self.session addInput:self.input];
self.output = [[AVCapturePhotoOutput alloc] init];
[self.session addOutput:self.output];
self.prevLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
self.prevLayer.frame = self.previewView.layer.bounds;
self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
switch ([[UIApplication sharedApplication] statusBarOrientation]) {
    case UIInterfaceOrientationLandscapeLeft: {
        AVCaptureConnection *con = self.prevLayer.connection;
        con.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
    } break;
    case UIInterfaceOrientationLandscapeRight: {
        AVCaptureConnection *con = self.prevLayer.connection;
        con.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
    } break;
    case UIInterfaceOrientationPortrait: {
        AVCaptureConnection *con = self.prevLayer.connection;
        con.videoOrientation = AVCaptureVideoOrientationPortrait;
    } break;
    case UIInterfaceOrientationPortraitUpsideDown: {
        AVCaptureConnection *con = self.prevLayer.connection;
        con.videoOrientation = AVCaptureVideoOrientationPortraitUpsideDown;
    } break;
    default:
        break;
}
[self.previewView.layer addSublayer:self.prevLayer];
[self.session startRunning];
```

Currently we are seeing this problem only on the above device and OS environment. Can anyone solve this? I need information that will lead to a resolution.
Posted
by
Post not yet marked as solved
1 Reply
394 Views
Hi,

According to the documentation for the AVVideoCompositing protocol: "When creating instances of custom video compositors, AV Foundation initializes them by calling init and then makes them available as the value of the customVideoCompositor property of the object to which it was assigned. You then can do any additional setup or configuration to the custom compositor."

AVMutableVideoComposition has a customVideoCompositorClass property, but does not have a customVideoCompositor property. Am I misunderstanding this? I need to access the instance to set some properties on it, but I cannot.

Thanks,
Frank
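For playback at least, the instance seems to be exposed on the player item rather than on the composition: AVPlayerItem has a read-only customVideoCompositor property (iOS 11+ / macOS 10.13+). A sketch, assuming the composition's customVideoCompositorClass was set before the composition was assigned to the item:

```swift
import AVFoundation

// Sketch: reach the compositor instance AVFoundation created for an item.
func configureCompositor(for item: AVPlayerItem) {
    // AVFoundation instantiates customVideoCompositorClass after the video
    // composition is assigned to the item; the instance is then reachable here.
    if let compositor = item.customVideoCompositor {
        // Cast `compositor` to your concrete class to set properties on it.
        print("Compositor instance: \(compositor)")
    }
}
```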
Posted
by
Post not yet marked as solved
2 Replies
2.1k Views
Hi,

I have submitted an application to the App Review Board. They rejected the app with the following reason: "Your app declares support for audio in the UIBackgroundModes key in your Info.plist but did not include features that require persistent audio."

My app uses audio recording in the background and asks the user for microphone permission when they first use the application. I have also appealed the decision, but they say that the way we are using audio in the background is not acceptable, and still rejected the app. I am happy to make any changes that may be needed, but I want clarification on what the problem is, because according to the guidelines for background execution, what I am doing is acceptable. Does anyone know how I can talk to an Apple engineer so that she/he can explain to me what needs to be done? I am stuck, and we have spent over a year developing this app.

Best,
Feras A.
Posted
by
Post not yet marked as solved
4 Replies
684 Views
I am working on a console app (actually a library, but designed to be used from a console) where I want to be able to enumerate the AVCaptureDevices at runtime. Attempting to do this, however, I am noticing that the AVCaptureDevice devices property never updates. I have some minimal code to reproduce this. Run it with a USB camera plugged in, and as you unplug and replug the camera the output never changes: it says there are devices that don't actually exist. Note I am not running this in a UI application, so there is no standard loop; it's just meant to be run from a standard "C" main.

```objc
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>

int main(int argc, const char * argv[]) {
    @autoreleasepool {
        while (true) {
            NSArray *captureDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
            for (id device in captureDevices) {
                NSString *uniqueIdentifier = [(AVCaptureDevice *)device uniqueID];
                NSLog(@"%@", uniqueIdentifier);
            }
            usleep(1000000);
        }
    }
    return 0;
}
```

Thanks for any help. I would be OK firing up a message loop in a background thread if I needed to, but I can't guarantee that the user application has a message loop (sorry if the terminology is different; I come from Windows).
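Device connect/disconnect events are delivered through the run loop, which a plain C-style main that only sleeps never services. One way around this, sketched below as a Swift command-line tool (the one-second polling interval is arbitrary), is to pump the main run loop between enumerations so AVFoundation can process device-change notifications:

```swift
import Foundation
import AVFoundation

// Sketch: service the run loop so the device list can update between polls.
while true {
    for device in AVCaptureDevice.devices(for: .video) {
        print(device.uniqueID)
    }
    // Running the loop (instead of usleep) lets AVFoundation deliver
    // device-connected / device-disconnected notifications.
    RunLoop.current.run(until: Date(timeIntervalSinceNow: 1.0))
}
```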
Posted
by
Post not yet marked as solved
1 Reply
795 Views
Hi all,

In short: does the iOS 11 simulator support HEVC rendering with AVSampleBufferDisplayLayer?

I have a working app which renders live video streams using AVSampleBufferDisplayLayer. For H.264 I use CMVideoFormatDescriptionCreateFromH264ParameterSets, and everything has been running OK for years. I made an upgrade to HEVC (which is now possible in iOS 11), so I am using CMVideoFormatDescriptionCreateFromHEVCParameterSets instead (and pass the VPS, SPS and PPS). Everything seems to work: I get *no* error when creating the description, and when I check the status property of the display layer it says "rendering" even after several minutes of feeding it HEVC video samples, so no failure there either. There is one small detail, though: no video is rendered 🙂.

Since I am testing in the iOS 11 simulator, I suspect that this might be the problem. Does anybody know if the iOS simulator can decode/render HEVC with AVSampleBufferDisplayLayer? If yes, what might be the problem, when the entire pipeline works and no error is reported anywhere? Just FYI, H.264 renders OK even in the simulator.

Thank you very much.
Alex
Posted
by
Post not yet marked as solved
9 Replies
5.3k Views
When I am playing video I get this crash. Crashlytics reports:

```
Crashed: com.apple.main-thread
0  libobjc.A.dylib    0x1845a67e8 object_isClass + 16
1  Foundation         0x185d103e8 KVO_IS_RETAINING_ALL_OBSERVERS_OF_THIS_OBJECT_IF_IT_CRASHES_AN_OBSERVER_WAS_OVERRELEASED_OR_SMASHED + 68
2  Foundation         0x185d0e8ec NSKeyValueWillChangeWithPerThreadPendingNotifications + 300
3  AVFoundation       0x18abb60f8 -[AVPlayerItem willChangeValueForKey:] + 96
4  AVFoundation       0x18abc7ff0 -[AVPlayerItem _updatePropertyCacheAndTriggerKVOForPlayer:usingKeys:] + 200
5  AVFoundation       0x18abb959c __60-[AVPlayerItem _informObserversAboutAvailabilityOfDuration:]_block_invoke_2 + 184
6  libdispatch.dylib  0x184cdd088 _dispatch_call_block_and_release + 24
7  libdispatch.dylib  0x184cdd048 _dispatch_client_callout + 16
8  libdispatch.dylib  0x184d1ddfc _dispatch_main_queue_callback_4CF$VARIANT$armv81 + 968
9  CoreFoundation     0x185301eb0 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 12
10 CoreFoundation     0x1852ffa8c __CFRunLoopRun + 2012
11 CoreFoundation     0x18521ffb8 CFRunLoopRunSpecific + 436
12 GraphicsServices   0x1870b7f84 GSEventRunModal + 100
13 UIKit              0x18e7f42f4 UIApplicationMain + 208
14 VZFiOSMobile       0x10475b340 main (main.m:17)
15 libdyld.dylib      0x184d4256c start + 4
```

Why does this happen, and what can I do to avoid it?
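The KVO_IS_RETAINING_ALL_OBSERVERS... frame suggests an observer of the AVPlayerItem was deallocated (or over-released) without being removed before the item's asynchronous KVO (here, the duration update) fired. A sketch of the usual defensive pattern, using a hypothetical observer of the item's duration:

```swift
import AVFoundation

// Sketch: tie the KVO observer's registration to this object's lifetime.
final class PlayerObserver: NSObject {
    private let item: AVPlayerItem
    private var observing = false

    init(item: AVPlayerItem) {
        self.item = item
        super.init()
        item.addObserver(self, forKeyPath: #keyPath(AVPlayerItem.duration),
                         options: [.new], context: nil)
        observing = true
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?,
                               change: [NSKeyValueChangeKey: Any]?,
                               context: UnsafeMutableRawPointer?) {
        // React to duration changes here.
    }

    deinit {
        // Removing the observer before this object dies prevents AVPlayerItem's
        // deferred KVO notifications from messaging a freed observer.
        if observing {
            item.removeObserver(self, forKeyPath: #keyPath(AVPlayerItem.duration))
        }
    }
}
```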
Posted
by
Post not yet marked as solved
5 Replies
1.3k Views
Hi all,

I just filed bug report 39600781, regarding 'aumi' (kAudioUnitType_MIDIProcessor) audio units and AVAudioEngine. In short: on iOS, when attaching 'aumi' audio units to an AVAudioEngine, the AVAudioEngine will never call their renderBlock / internalRenderBlock, rendering them (pun 😝) useless.

I write this post because I know that 'aumi' audio units on iOS are undocumented, so some quirks are to be expected 🙂. But since some developers are already implementing them (I am beta testing their apps), I think it is important to raise awareness that 'aumi' audio units do not work with hosts that use AVAudioEngine. Note that hosts using AUGraph work, because they need to explicitly call the renderBlock of their audio units. Yet since AUGraph is going to be deprecated... well, if 'aumi's are to be officially introduced in the next iOS version, it is important to raise awareness of this particular issue. That's all. Thanks for reading 😉.
Posted
by
Post not yet marked as solved
3 Replies
10k Views
Hi,

I have an app that includes a very simple one-button interface for recording a video. Our customer wants to be able to switch between the front and rear cameras while the video is being recorded, without any interruption in the video stream. I notice that even the iOS built-in camera app doesn't do this, but I've heard that some third-party apps do.

Is this possible in a practical way? If so, how would I go about it?

Thanks,
Frank
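A sketch of the input swap itself, with one important caveat: with AVCaptureMovieFileOutput, changing inputs interrupts the file, so apps that switch seamlessly typically record via AVCaptureVideoDataOutput feeding an AVAssetWriter, which keeps writing frames while the session's input changes underneath it.

```swift
import AVFoundation

// Sketch: swap the session's camera input between front and back mid-session.
func switchCamera(on session: AVCaptureSession,
                  currentInput: AVCaptureDeviceInput) -> AVCaptureDeviceInput {
    let newPosition: AVCaptureDevice.Position =
        currentInput.device.position == .back ? .front : .back
    guard let newDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                  for: .video, position: newPosition),
          let newInput = try? AVCaptureDeviceInput(device: newDevice) else {
        return currentInput
    }
    session.beginConfiguration()
    session.removeInput(currentInput)
    var active = currentInput
    if session.canAddInput(newInput) {
        session.addInput(newInput)
        active = newInput
    } else {
        session.addInput(currentInput)   // roll back on failure
    }
    session.commitConfiguration()
    return active
}
```

There may be a visible glitch of a frame or two at the switch even with the asset-writer approach; smoothing that over (e.g. holding the last frame) is up to the app.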
Posted
by
Post marked as solved
4 Replies
2.2k Views
Hi,

I'm using Swift 4 and the AVCam example code from Apple. In my own app I have an issue where the captured image orientation is wrong. My requirement is to save the photos to my application's folder as opposed to the Photos library. I've spent a day trying to correct my own code and gotten nowhere.

As a test I modified the AVCam code to write the photo data to a static variable in a struct and then display that image on the camera view in AVCam by tapping a button, as follows:

```swift
// In the camera view controller:
@IBOutlet weak var pv: UIImageView! // for previewing test

@IBAction func btn(_ sender: UIButton) { // for setting preview test image
    let dataProvider = CGDataProvider(data: photoTest.photo! as CFData)
    let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!, decode: nil,
                                       shouldInterpolate: true, intent: .defaultIntent)
    let photo = UIImage(cgImage: cgImageRef)
    pv.contentMode = .scaleAspectFill
    pv.image = photo
}
```

The orientation issue is replicated in AVCam with the above code. In my own app the image is previewed before saving to the app's folder, and the image's orientation is always wrong in the preview as well as when reloading the image from disk: if the iPhone is in portrait orientation, the preview image is rotated 90 degrees anti-clockwise; in landscape the same happens, so the preview image appears to be the correct way up.

I'm assuming I've completely missed the point somewhere along the line, as it seems to me that regardless of device orientation the top of a photo should always be the top. For example, in the Photos app, photos are shown the correct way up regardless of orientation.

Any help would be much appreciated; this is driving me nuts ;o)

Cheers
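One likely culprit, offered as a guess from the snippet above: building the image via CGImage(jpegDataProviderSource:...) ignores the EXIF orientation tag that the capture pipeline stores in the JPEG, whereas UIImage(data:) reads that tag and sets imageOrientation accordingly, so UIImageView draws the photo the right way up. A minimal sketch, where `jpegData` stands in for the captured photo data (photoTest.photo in the post):

```swift
import UIKit

// Sketch: decode captured JPEG data in a way that honors EXIF orientation.
func decodePreservingOrientation(_ jpegData: Data) -> UIImage? {
    // Unlike the CGImage route, UIImage(data:) applies the embedded
    // orientation metadata rather than discarding it.
    return UIImage(data: jpegData)
}
```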
Posted
by