Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under the Media Technologies topic. Each post below is followed by its reply count, boost count, view count, and most recent activity date.

HLS streaming from a Uniview IP camera
Hi, I'm trying to stream an H.264 video feed coming from a Uniview IP camera in a browser, but the stream just isn't displaying: I either get a single frame or a black screen. I get the same issue in Safari on the Mac and in any browser on an iPhone. However, the video stream works just fine using hls.js on Windows or Android. We are grabbing the RTSP stream from the camera and using nginx to serve the .m3u8 URL. Even if we save the stream to a file and try to play it on the iPhone, it has the same issue (unless we use a separate media player like VLC). I know that if we use ffmpeg to re-encode as H.264, rather than copy the stream, then it will play. My guess is there's an incompatibility between how Uniview encodes the video and what Apple can accept. I've asked Uniview and they are not sure what the problem is either. Is there a way to get more debug information on why a particular HLS stream is failing in Safari on Mac or iPhone?
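A hedged sketch of one way to get more diagnostics: Safari itself exposes little, but loading the same playlist into an AVPlayer on iOS surfaces an error log that often points at the same incompatibility, and Apple's HTTP Live Streaming Tools include a mediastreamvalidator command-line tool that reports spec violations. The URL below is a placeholder:

import AVFoundation

// Load the same .m3u8 the camera serves and let playback fail,
// then read the item's error log for HLS-specific error details.
let item = AVPlayerItem(url: URL(string: "https://camera.example/stream.m3u8")!)
let player = AVPlayer(playerItem: item)
player.play()

// After playback stalls or fails, inspect the error log events:
if let log = item.errorLog() {
    for event in log.events {
        print(event.errorStatusCode, event.errorDomain, event.errorComment ?? "no comment")
    }
}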
1 reply · 0 boosts · 575 views · Sep ’24
iOS 18: AVPlayerViewController freezes the interface
- (AVPlayerViewController *)avPlayerVC {
    if (!_avPlayerVC) {
        _avPlayerVC = [[AVPlayerViewController alloc] init];
        _avPlayerVC.videoGravity = AVLayerVideoGravityResizeAspectFill;
        _avPlayerVC.showsPlaybackControls = NO;
        [self addSubview:_avPlayerVC.view];
        [_avPlayerVC.view mas_makeConstraints:^(MASConstraintMaker *make) {
            make.edges.mas_equalTo(0);
        }];
        [self sendSubviewToBack:_avPlayerVC.view];
    }
    return _avPlayerVC;
}

I add this in a cell, and the interface becomes completely unresponsive. This only happens on iOS 18.
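One thing the snippet above never does is view controller containment, which UIKit requires when embedding another controller's view. A hedged Swift sketch of that step, assuming the cell can reach its owning view controller (the parent parameter is a hypothetical stand-in); this is a possible cause to rule out, not a confirmed fix for the iOS 18 hang:

import AVKit
import UIKit

// Hypothetical helper: embed an AVPlayerViewController with full
// containment (addChild / didMove) rather than only adding its view.
func embedPlayer(_ playerVC: AVPlayerViewController,
                 in containerView: UIView,
                 parent: UIViewController) {
    parent.addChild(playerVC)
    playerVC.view.frame = containerView.bounds
    playerVC.view.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    containerView.addSubview(playerVC.view)
    containerView.sendSubviewToBack(playerVC.view)
    playerVC.didMove(toParent: parent)
}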
2 replies · 1 boost · 715 views · Sep ’24
How to fetch all albums after performing a library request
Hi, in my app I am using MusicLibraryRequest<Artist> to fetch all of the artists in someone's library collection. With this response I then fetch each artist's albums: artist.with([.albums]). The response from this only gives albums in the user's library collection. I would like to augment it with all of the albums for an artist from the full catalogue. I'm using MusicKit and targeting iOS 18 and visionOS 2. Could someone please point me towards the best way to approach this?
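A minimal sketch of one way to bridge from a library artist to catalog albums, assuming the artist's name is a good enough lookup key (library and catalog items carry different identifiers, so some lookup is needed). The helper is hypothetical, an illustration rather than the definitive approach:

import MusicKit

// Hypothetical helper: find the catalog counterpart of a library artist
// by name, then load that catalog artist's full album list.
func catalogAlbums(for libraryArtist: Artist) async throws -> MusicItemCollection<Album>? {
    var request = MusicCatalogSearchRequest(term: libraryArtist.name, types: [Artist.self])
    request.limit = 1
    let response = try await request.response()
    guard let catalogArtist = response.artists.first else { return nil }
    let detailed = try await catalogArtist.with([.albums])
    return detailed.albums
}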
1 reply · 0 boosts · 588 views · Sep ’24
How to generate thumbnails for protected content using AVAssetImageGenerator?
I have a FairPlay-encrypted HLS stream playing in an AVPlayer, and I want to generate scrubbing thumbnails using AVAssetImageGenerator. I am able to generate thumbnails for clear streams, but I get errors for protected content. How can I generate thumbnails for protected content?

func getImageThumbnail(forTime: CMTime) {
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    generator.cancelAllCGImageGeneration()
    generator.generateCGImagesAsynchronously(forTimes: [NSValue(time: forTime)]) { [weak self] requestedTime, image, actualTime, result, error in
        if let error = error {
            print("Error generate: \(error.localizedDescription)")
            return
        }
        if let image = image {
            DispatchQueue.main.async {
                let image = UIImage(cgImage: image).jpegData(compressionQuality: 1.0)
                self?.playerImg.image = UIImage(data: image!)
            }
        }
    }
}
1 reply · 0 boosts · 759 views · Sep ’24
Sound quality
I'm a musician/DJ. I jumped from a 14 Pro Max on iOS 17 to a 16 Pro Max on iOS 18.1 beta 4. For each audio source (Music app, YT Music, etc.) I compared the same track, EQ, and volume side by side. With the new device the sound is overall a bit muffled, and the music is damped when highs and lows are mixed. It's most noticeable when listening to high vocals and acoustic instruments. Drum and bass sound much like they do on an old Nokia. On the 14 Pro it's nothing like that. Thank you
1 reply · 0 boosts · 476 views · Sep ’24
Progress tracking does not work after the download has been paused and resumed
Hello, I recently started integrating HLS downloads into my application using AVAssetDownloadTask and AVAssetDownloadConfiguration. I took an example from the documentation as a basis, with only one small difference: the minimum target for my application is iOS 16, so I replaced urlSession(_:assetDownloadTask:willDownloadTo:) with urlSession(_:assetDownloadTask:didFinishDownloadingTo:). And I encountered the following issue: after pausing a download and resuming it later, the progress no longer functions as expected. Could you please help me with this? What are the right approaches to implementing pause and progress tracking? Some details:

- I used devices with iOS 16.0.2 and 17.6.1 for testing.
- There was no code in the example that pauses the download and resumes it, so I used the suspend and resume methods to do this.
- I have tried to track download progress using two different approaches (see the sketch below):
  1. Using task.progress.observe(\.fractionCompleted) { ... }, which was presented in the example. In this scenario, after a pause, the observation callback is only called once, when the download has completed, despite the fact that data is being successfully downloaded over the network.
  2. Using urlSession(_:assetDownloadTask:didLoad:totalTimeRangesLoaded:timeRangeExpectedToLoad:) and calculating progress as totalTimeRangesLoaded.reduce(0.0) { $0 + CMTimeGetSeconds($1.timeRangeValue.duration) / CMTimeGetSeconds(timeRangeExpectedToLoad.duration) }. In this scenario, the result of the calculation does not always increase; sometimes there are outliers. Example of logs: 68%, 69%, 70%, 72%, 63%, 65%, 66%, 69%, 70%, 71%, 72%. These fluctuations are easiest to reproduce when I resume the download after a pause, but sometimes they occur spontaneously. It's worth mentioning that this method is marked as deprecated, perhaps for that reason.

In both cases the download is successful; the problem is with progress reporting only. The full version of the code can be found here.
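A minimal sketch of a workaround, under two assumptions that match the symptoms above: that the Progress observation goes stale across suspend()/resume() and must be re-attached, and that the time-range-based figure needs clamping to stay monotonic. The helper class is hypothetical, not a confirmed fix:

import AVFoundation
import Foundation

final class DownloadProgressTracker {
    private var observation: NSKeyValueObservation?
    private(set) var bestProgress: Double = 0

    // Attach (or re-attach) KVO to the task's Progress object.
    func observe(_ task: AVAssetDownloadTask) {
        observation = task.progress.observe(\.fractionCompleted) { [weak self] progress, _ in
            guard let self else { return }
            // Clamp so transient dips in the reported value never
            // move the displayed progress backwards.
            self.bestProgress = max(self.bestProgress, progress.fractionCompleted)
        }
    }

    func resume(_ task: AVAssetDownloadTask) {
        task.resume()
        observe(task) // assumption: the old observation is stale after suspend()
    }
}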
0 replies · 0 boosts · 319 views · Sep ’24
AVPlayer.replaceCurrentItem(with:) "Incorrect actor executor assumption" runtime crash when building for iOS 18
Hi there, I have some code that's been working fine for the last few versions of iOS, macOS, and all the others, and now causes a runtime crash on iOS 18/macOS 15 etc. I have an actor called Player which is basically a big wrapper around an AVPlayer. It all gets compiled down to a framework, and my clients use it by dropping it into their video player app code. It handles everything needed for them to be able to talk to our media infrastructure and handles telemetry. It has its own property called avplayer, which is an AVPlayer and gets created at init(). It has a function called load(_ avPlayerItem: AVPlayerItem) which the clients use to load a new video into the player. The offending code (which used to work!) looks like this:

Task { @MainActor in
    avplayer.replaceCurrentItem(with: avPlayerItem)
}

No warnings in Xcode. When you run it, it crashes on iOS 18 and macOS 15 with this error in the debugger: "Incorrect actor executor assumption". I thought, "Okay, well maybe replaceCurrentItem has changed and doesn't need to be on the main actor anymore," so I tried saying this outside of a Main Actor-scoped task:

avplayer.replaceCurrentItem(with: avPlayerItem)

...but it still crashes the exact same way. Does anyone have any ideas? I'm under some heavy pressure here to get this working and I don't even know where to start with this. Big thanks in advance.
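A hedged sketch of one thing worth trying, on the assumption that the crash comes from touching the actor-isolated avplayer property from inside the @MainActor closure: read the property into a local constant on the actor first, then hop to the main actor with only that local. This is an assumption about the cause, not a confirmed fix:

import AVFoundation

actor Player {
    private let avplayer = AVPlayer()

    func load(_ avPlayerItem: AVPlayerItem) {
        let player = avplayer // isolated read happens on the actor itself
        Task { @MainActor in
            // Only the local reference crosses into the main actor.
            player.replaceCurrentItem(with: avPlayerItem)
        }
    }
}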
4 replies · 0 boosts · 850 views · Sep ’24
Launching an app with Camera Control
I've just received my iPhone 16 Pro to develop some of the Camera Control features. I am trying to set up my app to be launched from a button press, and from my research in the documentation this is only possible if I develop a LockedCameraCaptureExtension. Is this correct? My app is written in React Native, so building an extension would require me to re-create the entire UI in Swift, which just isn't possible with my resources. Ideally I could build a simple extension that requires authentication to open the app, but I'm not sure that will work: "The app extension terminates shortly after launch if it doesn’t have an active camera view that uses AVCaptureEventInteraction to handle events from the hardware buttons, or if access to the camera hasn’t been requested." This is a bit frustrating for something as simple as opening an app. Thanks, Alex
1 reply · 0 boosts · 1.1k views · Sep ’24
While AirPlaying, MPVolumeView snaps to a different volume when I play my app's media
I have an app that plays sound files stored locally. I'm using a single SwiftUI view with an MPVolumeView so the user can control system volume from the player in my app. When I'm playing the sound file on the iPhone, my volume slider operates as expected. When I AirPlay to my Apple TV, the volume slider still works to control the volume, but when I hit play in my app, the slider snaps to a different value while the actual sound volume doesn't change. Control still works. Flipping to Control Center, I see a volume mismatch between the system volume and the MPVolumeView. Here's the code I use to put the slider in my app:

struct VolumeSlider: UIViewRepresentable {
    func makeUIView(context: Context) -> MPVolumeView {
        let vv = MPVolumeView(frame: .zero)
        vv.showsVolumeSlider = true
        vv.setVolumeThumbImage(UIImage(), for: UIControl.State.normal)
        return vv
    }

    func updateUIView(_ uiView: MPVolumeView, context: Context) {
        // No need to update the view in this case
    }
}

I'm using AVFoundation and AVAudioPlayer to play back the sound file. I'm using MediaPlayer to tell MPNowPlayingInfoCenter the track info and album art. Audio control via Control Center works perfectly. It does the same if I target iOS 16 or 17. Is this a bug with MPVolumeView or with the way I added it to the app?
3 replies · 0 boosts · 737 views · Sep ’24
Has anyone had issues with Voice Control and high-quality visuals on the Vision Pro?
We are working on an app for the Vision Pro which has a high polygon count and lots of high-resolution textures. Everything looks smooth and in general very good. The issue is that the moment we turn on Voice Control, even if it is not being used, the visuals at the center start to stutter left to right. Has anyone seen this? It must be a bug. Any workaround? Thanks, Guillermo
1 reply · 0 boosts · 500 views · Sep ’24
ExtAudioFileRead throwing AVAudioSessionErrorCodeResourceNotAvailable error on iOS and iPadOS 18
Calls to ExtAudioFileRead are throwing OSStatus 561145203 (AVAudioSessionErrorCodeResourceNotAvailable) on iOS and iPadOS 18 -- earlier versions of iOS have not exhibited this behavior. This is a longstanding code path that has seen a spike of these error codes since iOS 18's release. The following is also printed to the Xcode 16 console:
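For reference, OSStatus 561145203 decodes to the four-character code '!res' (AVAudioSessionErrorCodeResourceNotAvailable). A minimal sketch of the kind of call site affected; the helper name and buffer setup are illustrative assumptions, not the poster's actual code:

import AudioToolbox

// Illustrative read loop body: ExtAudioFileRead fills bufferList and
// reports how many frames it actually read via ioFrames.
func readFrames(from file: ExtAudioFileRef,
                into bufferList: UnsafeMutablePointer<AudioBufferList>,
                frameCount: UInt32) -> OSStatus {
    var ioFrames = frameCount
    let status = ExtAudioFileRead(file, &ioFrames, bufferList)
    if status != noErr {
        // On iOS 18 this reportedly returns 561145203 ('!res').
        print("ExtAudioFileRead failed with OSStatus \(status)")
    }
    return status
}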
2 replies · 1 boost · 710 views · Sep ’24
Capture Extension Icon
Hello, I seem to be having an issue assigning my Capture Extension an icon. It works fine using a system icon, for example:

Image(systemName: "star")

But it fails when I use my custom icon, such as:

Image(uiImage: UIImage(named: "widget-icon")!)

The "widget-icon" is located in both my Assets collection and the widget folder for good measure, and yet my widget always has a "?" icon. I am able to use "widget-icon" just fine for other Lock Screen widgets, but it is not working for the Camera Extension widget. Any thoughts? Thank you for your help!
2 replies · 0 boosts · 486 views · Sep ’24
Using non-modular libraries in an audio-unit
I've got a bunch of Audio Units I've been developing over the last few months in Xcode 14, based on the Audio Unit template that ships in Xcode. The DSP heavy lifting is all done in Rust libraries, which I build for Intel and Apple Silicon, merge using lipo, and build XCFrameworks from. There are several of these: one with the DSP code, and several others used from the UI, a mix of SwiftUI and Cocoa. Getting the integration of Rust, Objective-C, C++, and Swift working and automated took a few weeks (my first foray into C++ since the 1990s), but it has been solid, reliable, and working well - everything loads in Logic Pro and GarageBand and works.

But now I'm attempting the (woefully underdocumented) process of making 13 audio unit projects able to be loaded in-process, which means moving all of the actual code into a framework. And therein lies the problem: none of the xcframeworks are modular, which leads to the dreaded "Include of non-modular header inside framework module". Imported directly into the app extension project with "Allow Non-modular Includes in Framework Modules", everything works fine - but in a framework this seems to be verboten.

So, the obvious solution would be to add a module map to each xcframework, and then, poof, they're modular. BUT... due to a peculiar limitation of Xcode's build system I've spent days searching for a workaround for, it is only possible to have ONE xcframework containing a module.modulemap file in any project. More than that and xcodebuild will try to clobber them with each other and the build will fail. And there appears to be NO WAY to name the file anything other than module.modulemap inside an xcframework and have it be detected. So I cannot modularize my frameworks, because there is more than one of them.

How the heck does one work around this? A custom module map file somewhere (that the build should find and understand applies to four xcframeworks - how?)? Something else? I've seen one dreadful workaround - https://medium.com/@florentmorin/integrating-swift-framework-with-non-modular-libraries-d18098049e18 - given that I'm generating a lot of the C and Objective-C code for the audio in Rust, I suppose I could write a tool that parses the header files and generates Objective-C code that imports each framework and declares one method for every single Rust call. But it seems to me there has to be a correct way to do this without jumping through such hoops. Thoughts?
2 replies · 0 boosts · 662 views · Sep ’24
DockKit tracking becomes erratic with increased zoom factor in iOS app
I'm developing an iOS app using DockKit to control a motorized stand. I've noticed that as the zoom factor of the AVCaptureDevice increases, the stand's movement becomes increasingly erratic up and down, almost like a pendulum motion. I'm not sure why this is happening or how to fix it. Here's a simplified version of my tracking logic:

func trackObject(_ boundingBox: CGRect, _ dockAccessory: DockAccessory) async throws {
    guard let device = AVCaptureDevice.default(for: .video),
          let input = try? AVCaptureDeviceInput(device: device) else {
        fatalError("Camera not available")
    }

    let currentZoomFactor = device.videoZoomFactor
    let dimensions = device.activeFormat.formatDescription.dimensions
    let referenceDimensions = CGSize(width: CGFloat(dimensions.width),
                                     height: CGFloat(dimensions.height))
    let intrinsics = calculateIntrinsics(for: device, currentZoom: Double(currentZoomFactor))

    let deviceOrientation = UIDevice.current.orientation
    let cameraOrientation: DockAccessory.CameraOrientation = {
        switch deviceOrientation {
        case .landscapeLeft: return .landscapeLeft
        case .landscapeRight: return .landscapeRight
        case .portrait: return .portrait
        case .portraitUpsideDown: return .portraitUpsideDown
        default: return .unknown
        }
    }()

    let cameraInfo = DockAccessory.CameraInformation(
        captureDevice: input.device.deviceType,
        cameraPosition: input.device.position,
        orientation: cameraOrientation,
        cameraIntrinsics: useIntrinsics ? intrinsics : nil,
        referenceDimensions: referenceDimensions
    )

    let observation = DockAccessory.Observation(
        identifier: 0,
        type: .object,
        rect: boundingBox
    )
    let observations = [observation]

    try await dockAccessory.track(observations, cameraInformation: cameraInfo)
}

func calculateIntrinsics(for device: AVCaptureDevice, currentZoom: Double) -> matrix_float3x3 {
    let dimensions = CMVideoFormatDescriptionGetDimensions(device.activeFormat.formatDescription)
    let width = Float(dimensions.width)
    let height = Float(dimensions.height)

    let diagonalPixels = sqrt(width * width + height * height)
    let estimatedFocalLength = diagonalPixels * 0.8

    let fx = Float(estimatedFocalLength) * Float(currentZoom)
    let fy = fx
    let cx = width / 2.0
    let cy = height / 2.0

    return matrix_float3x3(
        SIMD3<Float>(fx, 0, cx),
        SIMD3<Float>(0, fy, cy),
        SIMD3<Float>(0, 0, 1)
    )
}

I'm calling this function regularly (10-30 times per second) with updated bounding box information. The erratic movement seems to worsen as the zoom factor increases. Questions:

1. Why might increasing the zoom factor cause this erratic movement?
2. I'm currently calculating camera intrinsics based on the current zoom factor. Is this approach correct, or should I be doing something differently?
3. Are there any other factors I should consider when using DockKit with a variable zoom?
4. Could the frequency of calls to trackObject (10-30 times per second) be contributing to the erratic movement? If so, what would be an optimal frequency?

Any insights or suggestions would be greatly appreciated. Thanks!
8 replies · 0 boosts · 752 views · Sep ’24
AVCaptureSystemZoomSlider has a factor that I can't get anywhere.
As you can see, the value shown in the AVCaptureSystemZoomSlider is not the same as the raw camera zoom factor. I tried to calculate this value, and it seems to be 0.8: (5 - 1) * 0.8 = 4.2 - 1 in this image. It seems this factor only applies to the default wide-angle camera, and I can't get this value from anywhere. (It's not displayVideoZoomFactorMultiplier, by the way; I checked that.) What is it?
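Reading a relationship off the numbers in the post (an inference from this single example, not documented behavior), the slider value looks like an affine rescaling of the raw zoom factor:

\text{slider} \;=\; 1 + (\text{videoZoomFactor} - 1) \times 0.8,
\qquad \text{e.g. } 1 + (5 - 1) \times 0.8 = 4.2 .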
1 reply · 1 boost · 606 views · Sep ’24
Implementing Enhanced Dialogue on tvOS 18
How does a third-party developer go about supporting the new Enhanced Dialogue option for video apps in tvOS 18? If an app is using the standard AVPlayerViewController, I had assumed it would be a simple-ish matter of building against the tvOS 18 SDK, but apparently not: the options don't appear, not even greyed out.
1 reply · 1 boost · 562 views · Sep ’24
How to Play Audio Through Headphones on watchOS 11?
I have an app that plays audio, and its behaviour has changed in watchOS 11: I can no longer figure out how to play the audio through headphones. To play audio I do the following:

let session = AVAudioSession.sharedInstance()
try session.setCategory(.playback, mode: .default, policy: .longFormAudio, options: [])
let activated = try await session.activate()
if activated {
    // play audio
}

In previous versions, try await session.activate() would bring up a route picker where the user could select their headphones. Now on watchOS 11 it just plays the audio out of the speaker. Maybe that's what some people want, but if they do want it to play out of the headphones, I can't see how I can give them that option now: there's no AVRoutePickerView available on watchOS for selecting a route. I've tried setting the category to .multiRoute instead of .playback, and that does bring up the picker, but selecting the speaker results in an error code, and selecting the headphones results in it saying it cannot find my headphones (which shouldn't be the case, since Apple Music on watchOS finds them). I also tried overriding the output with try session.overrideOutputAudioPort(.speaker), but the compiler complains that .speaker isn't available on watchOS, which is strange, as if I understand correctly it's now possible to play through the speaker on at least some Apple Watches. So is this a bug, or is there some way I've not found of playing audio through the headphones?
1 reply · 1 boost · 551 views · Sep ’24