In an m3u8 manifest, audio EXT-X-MEDIA tags usually contain a CHANNELS attribute with the audio channel count, like so:
#EXT-X-MEDIA:TYPE=AUDIO,URI="audio_clear_eng_stereo.m3u8",GROUP-ID="default-audio-group",LANGUAGE="en",NAME="stream_5",AUTOSELECT=YES,CHANNELS="2"
Is it possible to get this info from AVPlayer, AVMediaSelectionOption or some related API?
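One avenue I've been exploring, though I haven't confirmed it works (an untested sketch, assuming iOS 15+ where AVAssetVariant became available), is to match each audio media selection option against the asset's variants, whose audio attributes appear to carry the parsed CHANNELS value:

import AVFoundation

// Untested sketch: AVURLAsset exposes the multivariant playlist's variants;
// each variant's audio attributes can report rendition-specific values such
// as the channel count parsed from the CHANNELS attribute.
func logAudioChannelCounts(for url: URL) async throws {
    let asset = AVURLAsset(url: url)
    let variants = try await asset.load(.variants)
    guard let group = try await asset.loadMediaSelectionGroup(for: .audible) else { return }

    for option in group.options {
        for variant in variants {
            if let channels = variant.audioAttributes?
                .renditionSpecificAttributes(for: option)?
                .channelCount {
                print("\(option.displayName): \(channels) channel(s)")
            }
        }
    }
}

If someone can confirm whether renditionSpecificAttributes(for:) actually surfaces the CHANNELS value for HLS, that would be great.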
Hello. I am rewriting the way our app stores data into Core Data so that it can be done concurrently.
The solution I opted for is to have a singleton actor that takes an API model, and maps it to a Core Data object and saves it.
For example, to store an API order model, I have something like this:
func store(
    order apiOrder: APIOrder,
    currentContext: NSManagedObjectContext?
) -> NSManagedObjectID? {
    let context = currentContext ?? self.persistentContainer.newBackgroundContext()
    // …
}
In the arguments, there is a context you can pass, in case you need to create additional models and relate them to each other. I am not sure this is how you're supposed to do it, but it seemed to work.
From what I've understood of Core Data and using multiple contexts, the appropriate way to use them is with context.perform or context.performAndWait.
However, since my storage helper is an actor, @globalActor actor Storage2 { … }, my storage's methods are actor-isolated.
This gives me warnings/errors in Swift 6 when I try to pass the context to another of my actor's methods.
let context = …
return context.performAndWait {
    // …
    if let apiBooking = apiOrder.booking {
        self.store(booking: apiBooking, context: context)
        /* causes warning:
           Sending 'context' risks causing data races; this is an error in the Swift 6 language mode
           'self'-isolated 'context' is captured by a actor-isolated closure. actor-isolated uses in closure may race against later nonisolated uses
           Access can happen concurrently
         */
    }
    // …
}
From what I understand, this is because my methods are actor-isolated, but the closure of performAndWait does not execute in a thread-safe environment.
With all this, what are my options? I've extracted store(departure:context:) into its own method to avoid duplicated code, but since I can't call it from within performAndWait, I am not sure what to do.
Can I ditch performAndWait? Removing it makes the warning "go away", but I don't feel confident enough with Core Data to know the answer.
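For concreteness, the direction I'm currently considering looks like this (an untested sketch using the API models from above; entity names are placeholders): keep every Core Data touch inside one context.perform block and make the helpers nonisolated, so the context never crosses an isolation boundary.

import CoreData

@globalActor actor Storage2 {
    static let shared = Storage2()
    private let persistentContainer = NSPersistentContainer(name: "Model")

    func store(order apiOrder: APIOrder) async throws -> NSManagedObjectID {
        let context = persistentContainer.newBackgroundContext()
        return try await context.perform {
            // All object creation happens inside this single perform block.
            let order = NSEntityDescription.insertNewObject(forEntityName: "Order", into: context)
            // ... map apiOrder's fields onto order ...
            if let apiBooking = apiOrder.booking {
                Self.store(booking: apiBooking, in: context) // nonisolated helper, safe to call here
            }
            try context.save()
            return order.objectID
        }
    }

    // Nonisolated, so calling it from inside perform doesn't cross back into the actor.
    private nonisolated static func store(booking: APIBooking, in context: NSManagedObjectContext) {
        let booking = NSEntityDescription.insertNewObject(forEntityName: "Booking", into: context)
        // ... map apiBooking's fields onto booking ...
        _ = booking
    }
}

Would that be considered correct use of a background context, or am I losing the safety that perform is supposed to give me?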
I would love to get any feedback on this, hoping to learn!
We use MLModel in our app, with two model file formats: .mlmodel and .mlpackage. We find that when a model is released, models in the .mlmodel format crash with a certain probability, and the vast majority of these crashes (over 85%) occur on iOS 16.x. Here is the crash stack:
Exception Type: SIGTRAP
Exception Codes: TRAP_BRKPT at 0x1b48e855c
Crashed Thread: 5
Thread 5 Crashed:
0 libdispatch.dylib 0x00000001b48e855c _dispatch_semaphore_dispose.cold.1 + 40
1 libdispatch.dylib 0x00000001b48b2b28 _dispatch_semaphore_signal_slow
2 libdispatch.dylib 0x00000001b48b0e58 _dispatch_dispose + 208
3 AppleNeuralEngine 0x00000001ef07b51c -[_ANEProgramForEvaluation .cxx_destruct] + 32
4 libobjc.A.dylib 0x00000001a67ed4a4 object_cxxDestructFromClass(objc_object*, objc_class*) + 116
5 libobjc.A.dylib 0x00000001a67f221c objc_destructInstance + 80
6 libobjc.A.dylib 0x00000001a67fb9d0 _objc_rootDealloc + 80
7 AppleNeuralEngine 0x00000001ef079e04 -[_ANEProgramForEvaluation dealloc] + 72
8 AppleNeuralEngine 0x00000001ef07ca70 -[_ANEModel .cxx_destruct] + 44
9 libobjc.A.dylib 0x00000001a67ed4a4 object_cxxDestructFromClass(objc_object*, objc_class*) + 116
10 libobjc.A.dylib 0x00000001a67f221c objc_destructInstance + 80
11 libobjc.A.dylib 0x00000001a67fb9d0 _objc_rootDealloc + 80
12 AppleNeuralEngine 0x00000001ef07bd7c -[_ANEModel dealloc] + 136
13 CoreFoundation 0x00000001ad4563cc cow_cleanup + 168
14 CoreFoundation 0x00000001ad49044c -[__NSDictionaryM dealloc] + 148
15 Espresso 0x00000001bb19c7a4 Espresso::ANERuntimeEngine::compiler::reset() + 1340
16 Espresso 0x00000001bb19cac8 Espresso::ANERuntimeEngine::compiler::~compiler() + 108
17 Espresso 0x00000001bacd69e4 std::__1::__shared_weak_count::__release_shared() + 84
18 Espresso 0x00000001ba944d00 std::__1::__hash_table<std::__1::__hash_value_type<Espresso::platform, std::__1::shared_ptr<Espresso::net_compiler>>, std::__1::__unordered_map_hasher<Espresso::platform, std::__1::__hash_value_type<Espresso::platform, std::__1::shared_ptr<Espresso::net_compiler>>, std::__1::hash<Espresso::platform>, std::__1::equal_to<Espresso::platform>, true>, std::__1::__unordered_map_equal<Espresso::platform, std::__1::__hash_value_type<Espresso::platform, std::__1::shared_ptr<Espresso::net_compiler>>, std::__1::equal_to<Espresso::platform>, std::__1::hash<Espresso::platform>, true>, std::__1::allocator<std::__1::__hash_value_type<Espresso::platform, std::__1::shared_ptr<Espresso::net_compiler>>>>::__deallocate_node(std::__1::__hash_node_base<std::__1::__hash_node<std::__1::__hash_value_type<Espresso::platform, std::__1::shared_ptr<Espresso::net_compiler>>, void*>*>*) + 40
19 Espresso 0x00000001ba8ea640 std::__1::__hash_table<std::__1::__hash_value_type<Espresso::platform, std::__1::shared_ptr<Espresso::net_compiler>>, std::__1::__unordered_map_hasher<Espresso::platform, std::__1::__hash_value_type<Espresso::platform, std::__1::shared_ptr<Espresso::net_compiler>>, std::__1::hash<Espresso::platform>, std::__1::equal_to<Espresso::platform>, true>, std::__1::__unordered_map_equal<Espresso::platform, std::__1::__hash_value_type<Espresso::platform, std::__1::shared_ptr<Espresso::net_compiler>>, std::__1::equal_to<Espresso::platform>, std::__1::hash<Espresso::platform>, true>, std::__1::allocator<std::__1::__hash_value_type<Espresso::platform, std::__1::shared_ptr<Espresso::net_compiler>>>>::~__hash_table() + 28
20 Espresso 0x00000001ba8e5750 Espresso::net::~net() + 396
21 Espresso 0x00000001bacd69e4 std::__1::__shared_weak_count::__release_shared() + 84
22 Espresso 0x00000001bad750e4 std::__1::__vector_base<std::__1::shared_ptr<Espresso::net>, std::__1::allocator<std::__1::shared_ptr<Espresso::net>>>::clear() + 52
23 Espresso 0x00000001ba902448 std::__1::__vector_base<std::__1::shared_ptr<Espresso::net>, std::__1::allocator<std::__1::shared_ptr<Espresso::net>>>::~__vector_base() + 36
24 Espresso 0x00000001ba8ed99c std::__1::unique_ptr<EspressoLight::espresso_plan::priv_t, std::__1::default_delete<EspressoLight::espresso_plan::priv_t>>::reset(EspressoLight::espresso_plan::priv_t*) + 188
25 Espresso 0x00000001ba95b7fc EspressoLight::espresso_plan::~espresso_plan() + 72
26 Espresso 0x00000001ba902078 EspressoLight::espresso_plan::~espresso_plan() + 16
27 Espresso 0x00000001ba8e690c espresso_plan_destroy + 372
28 CoreML 0x00000001c48c45cc -[MLNeuralNetworkEngine _deallocContextAndPlan] + 40
29 CoreML 0x00000001c48c43bc -[MLNeuralNetworkEngine dealloc] + 40
30 libobjc.A.dylib 0x00000001a67ed4a4 object_cxxDestructFromClass(objc_object*, objc_class*) + 116
31 libobjc.A.dylib 0x00000001a67f221c objc_destructInstance + 80
32 libobjc.A.dylib 0x00000001a67fb9d0 _objc_rootDealloc + 80
~~~~ Our code that releases the MLModel object ~~~~
Moreover, we use a synchronization mechanism to ensure that releasing the MLModel and running predictions with it (by calling [model predictionFromFeatures:error:]) never happen simultaneously. What could be the possible causes of this problem, and how can we prevent it? Any advice would be appreciated.
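For reference, our synchronization is roughly equivalent to this simplified sketch (hypothetical wrapper names, not our actual code):

import CoreML
import Dispatch

// Both prediction and the final release are confined to one serial queue,
// so the last reference is dropped on that queue and dealloc cannot race
// an in-flight prediction.
final class ModelBox {
    private let queue = DispatchQueue(label: "model.serial")
    private var model: MLModel?

    init(model: MLModel) { self.model = model }

    func predict(_ input: MLFeatureProvider) throws -> MLFeatureProvider? {
        try queue.sync { try model?.prediction(from: input) }
    }

    func release() {
        queue.sync { model = nil } // the MLModel deallocates here, on the queue
    }
}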
When I try to install packages through SPM it doesn't find the Package.swift file, and with CocoaPods I have problems with Foundation. Trying to install Firebase shows that many pods have issues with double quotes. I don't know if it's my machine (everything is up to date) or what, but I'm having a lot of problems and I haven't been able to complete my first application. I don't want to give up, and I would like some advice; since you have managed it, I would appreciate some help. I have followed tutorials and looked at cases of people with similar problems, but it always throws some error.
My machine is an M1 Mac.
Hi,
Could someone possibly tell me if it is accurate that Xcode Version 16.1 (16B40) is NOT compatible with iOS 18.1.1 and if so, when it will be?
Many TIA.
Hi, a little context: by environment variables I mean things like $HOME on Linux. I know that there are some standard and some app-specific environment variables on the platforms mentioned above.
Is it possible for the user to set environment variables? If so, in what ways can users set them on these platforms, either system-wide or per app, as they would on macOS/Linux/Windows?
And is it possible for the developers of an app to do the same? If so, how?
What I have found so far is that it cannot be done by users, and there is one place in Xcode where I, as a developer, can set environment variables for my app from the scheme. These are not applied when I ship an installation binary to the end user; they are available only when the app is run from Xcode. I just need validation that my understanding is correct and that there are no other ways for the user or developer to do this without jailbreaking.
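For what it's worth, the environment can at least be read from inside the app; this small sketch (MY_DEBUG_FLAG is a hypothetical name) shows where scheme-set variables surface:

import Foundation

// Variables set in Xcode (Edit Scheme > Arguments > Environment Variables)
// show up here when running from Xcode, but are absent in TestFlight or
// App Store builds.
let env = ProcessInfo.processInfo.environment
print(env["HOME"] ?? "HOME not set")
print(env["MY_DEBUG_FLAG"] ?? "MY_DEBUG_FLAG not set")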
On iOS 18, we have found that certain pages will reliably crash when displayed. Puzzlingly, if I modify unrelated code, the crash no longer occurs on iOS 18. This has had a great impact on my development, and I don't know whether I can keep using this API in the future. Can you help me out of this dilemma? Thanks.
When my iPhone 15 Pro Max upgraded to iOS 18.1.1, it could no longer connect to the hotspot of my IoT device (Android 5.1), while my iPhone 12 (iOS 18.1.1) has no issues.
Both the 15 Pro Max and the iPhone 12 work well with another device (Android 10.0).
Already tried:
1. Forget Network (and re-add the desired Wi-Fi network)
2. Reset Network Settings (under Settings > General > Transfer or Reset iPhone)
3. Turn Airplane Mode on, then off after a few seconds
4. Restart the iPhone
5. Reset All Settings
6. Disable VPN
7. Turn off Private Wi-Fi Address (the setting that rotates the Wi-Fi address)
Did anyone have similar issues?
The post detail is almost the same as the title: I can't enable the option. Has Apple already made the always-on display option available for the regular iPhone 16 series?
Hi everyone,
I am working on a 3D reconstruction project.
Recently I have been able to retrieve the intrinsics from the two cameras on the back of my iPhone.
One consideration is that I want this app to run even when there is no LiDAR, as long as there are at least two cameras on the back. If there is a LiDAR, that is something I plan to incorporate later in the project.
I am using a AVCaptureSession with the two cameras AVCaptureDevice:
builtInWideAngleCamera
builtInUltraWideCamera
The intrinsic matrices seem to be correct. However, when I retrieve the extrinsics, e.g., builtInWideAngleCamera w.r.t. builtInUltraWideCamera, the matrix I get looks like this:
Extrinsic Matrix (Ultra-Wide to Wide):
[0.9999968, 0.0008149305, -0.0023960583, 0.0]
[-0.0008256607, 0.9999896, -0.0044807075, 0.0]
[0.002392382, 0.0044826716, 0.99998707, 0.0]
[-14.277955, -8.135408e-10, -0.3359985, 0.0]
The extrinsic matrix should have the form [R | t]. The rotation part seems correct, but the translation vector is ALL ZEROS, which would suggest the cameras are physically co-located; the last element is also not 1, as it should be in homogeneous coordinates.
Has anyone encountered this 'issue' before?
Is there a flaw in my reasoning or something I might be missing?
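For reference, I retrieve the extrinsics roughly like this (simplified sketch); one thing I keep in mind is that simd matrices are stored column-major, so a row-by-row printout can be misleading about where the translation actually lives:

import AVFoundation
import simd

// The returned Data wraps a matrix_float4x3: rotation in columns 0-2,
// translation in column 3 (column-major storage).
guard
    let wide = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
    let ultraWide = AVCaptureDevice.default(.builtInUltraWideCamera, for: .video, position: .back),
    let data = AVCaptureDevice.extrinsicMatrix(from: ultraWide, to: wide)
else { fatalError("devices or extrinsics unavailable") }

let extrinsics = data.withUnsafeBytes { $0.loadUnaligned(as: matrix_float4x3.self) }
print(extrinsics)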
Any comments are very much appreciated.
Hi everyone,
I’m having trouble getting the correct horizontal slide transitions when navigating between multiple screens in my SwiftUI app. I have three screens (enum cases, each with an Int index assigned according to navigation order): .checkClient (index 0), .login(document: String) (index 1), and .register (index 2).
My goal is:
When moving forward (e.g., from .checkClient to .login, or from .login to any other with a greater index), the new screen should enter from the right (trailing) and the old one should exit to the left (leading).
When going backward (from .register back to .checkClient, for example), the new screen should enter from the left (leading) and the old one should exit to the right (trailing).
I’ve been using a state property isAdvancing to determine the direction of transitions (I use TCA, so my logic is in a Reducer body, my properties in a State and my views are normal SwiftUI Views):
case .updateCurrentScreen(let newScreen):
    state.isAdvancing = newScreen.index > state.currentScreen.index
    state.currentScreen = newScreen
    return .none
I tried applying .transition directly inside each case:
.transition(
    .asymmetric(
        insertion: .move(edge: store.isAdvancing ? .trailing : .leading),
        removal: .move(edge: store.isAdvancing ? .leading : .trailing)
    )
)
This works correctly the first time I navigate forward. However, when I go to the .register screen and then hit back, the directions become inconsistent. Sometimes the removal happens in the wrong direction, and after returning to .checkClient, forward navigations stop working as intended.
Then, I tried placing the transition at a higher level, wrapping the switch in a ZStack and using a single .transition(...) outside:
ZStack {
    switch store.currentScreen {
    case .checkClient:
        StartView(...)
    case .login:
        LoginView(...)
    case .register:
        RegisterView(...)
    }
}
.transition(
    .asymmetric(
        insertion: .move(edge: store.isAdvancing ? .trailing : .leading),
        removal: .move(edge: store.isAdvancing ? .leading : .trailing)
    )
)
.animation(.easeInOut, value: store.currentScreen)
But doing this results in some transitions reverting to a fade instead of a horizontal slide.
I’ve also tried ensuring that isAdvancing updates before changing the currentScreen. Unfortunately, I still encounter inconsistent transitions when navigating back and forth between these screens.
Here is my complete view logic (even though it is not finished or polished yet):
var body: some View {
    WithPerceptionTracking {
        ZStack {
            AdaptiveSheetView(
                backgroundImage: Asset.AuthorizationDomain.background,
                hasBackButton: store.showsBackButton,
                isFullScreen: store.isFullScreen,
                backAction: {
                    store.send(.goBack)
                }
            ) {
                if store.showsIATILogo {
                    Image(asset: Asset.iatiLogo)
                        .padding(spacing: .medium)
                }
                ZStack {
                    switch store.currentScreen {
                    case .checkClient:
                        StartView(store: store.scope(state: \.startState, action: \.startAction))
                    case .login:
                        if let newStore = store.scope(state: \.loginState, action: \.loginAction) {
                            LoginView(store: newStore)
                        }
                    case .register:
                        if let newStore = store.scope(state: \.registerState, action: \.registerAction) {
                            RegisterView(store: newStore)
                        }
                    }
                }
                .transition(
                    .asymmetric(
                        insertion: .move(edge: store.isAdvancing ? .trailing : .leading),
                        removal: .move(edge: store.isAdvancing ? .leading : .trailing)
                    )
                )
            }
            .animation(.easeInOut, value: store.currentScreen)
            if store.startState.checkUser == .loading { LoadingSpinner() }
            PreloadView(store: store.scope(state: \.preloadState, action: \.preloadAction))
                .opacity(store.preloadViewShown ? 1.0 : 0.0)
                .animation(.linear(duration: 0.5), value: store.preloadViewShown)
                .onChange(of: store.preloadViewShown) { shown in
                    if !shown { store.send(._checkPreviousSessions) }
                }
        }
    }
}
Has anyone experienced similar issues or found a reliable pattern for achieving these “push/pop” style transitions in SwiftUI? Any guidance would be greatly appreciated!
My minimum target is iOS 16, so AFAIK I can't use TabView with the page style for this.
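For anyone who wants to try to reproduce, here is a stripped-down sketch of the pattern I'm after, without TCA (untested in isolation); the idea is to flip the direction and the screen inside the same withAnimation transaction:

import SwiftUI

struct PushPopDemo: View {
    enum Screen: Int { case checkClient = 0, login, register }

    @State private var screen: Screen = .checkClient
    @State private var isAdvancing = true

    var body: some View {
        VStack {
            ZStack {
                switch screen {
                case .checkClient: Text("Start")
                case .login:       Text("Login")
                case .register:    Text("Register")
                }
            }
            // Changing the id replaces the whole container, so insertion and
            // removal are always paired and re-read the current direction.
            .id(screen)
            .transition(.asymmetric(
                insertion: .move(edge: isAdvancing ? .trailing : .leading),
                removal: .move(edge: isAdvancing ? .leading : .trailing)))

            Button("Next") { go(to: Screen(rawValue: min(screen.rawValue + 1, 2)) ?? screen) }
            Button("Back") { go(to: Screen(rawValue: max(screen.rawValue - 1, 0)) ?? screen) }
        }
    }

    private func go(to newScreen: Screen) {
        // Direction and screen flip inside one transaction, so the outgoing
        // view's removal sees the updated direction.
        withAnimation(.easeInOut) {
            isAdvancing = newScreen.rawValue > screen.rawValue
            screen = newScreen
        }
    }
}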
Thanks in advance for any time and attention you dedicate to me 🙏🏼
I have a private certificate authority: Root > Intermediate > Leaf.
When I install the Root Certificate, it shows in Settings > General > About > Certificate Trust Settings in iOS 18.1.1
However, when I install the Intermediate Certificate (including the CA Bundle), the Intermediate CA Certificate is not shown in the Certificate Trust Settings.
All my leaf certificates are issued by the Intermediate CA. Is this a bug? If not, how can this be solved? TIA!
Hi,
We have configured WKAppBoundDomains and are using limitsNavigationsToAppBoundDomains to enable Service Workers, which works perfectly. However, we are now unable to load a WKWebView with any domain that is not included in the app-bound domains.
For these other WKWebView instances, we have explicitly set config.limitsNavigationsToAppBoundDomains = false, but it doesn’t seem to have any effect. We don’t require access to the restricted APIs enabled by app-bound domains for these instances—we simply want them to load and perform basic website functionality.
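For reference, a minimal sketch of the two configurations described above (simplified):

import WebKit

// The app-bound instance: opted in so Service Workers work.
let boundConfig = WKWebViewConfiguration()
boundConfig.limitsNavigationsToAppBoundDomains = true
let boundWebView = WKWebView(frame: .zero, configuration: boundConfig)

// The general-purpose instance: explicitly opted out, yet it still refuses
// to load domains outside the WKAppBoundDomains list.
let openConfig = WKWebViewConfiguration()
openConfig.limitsNavigationsToAppBoundDomains = false
let openWebView = WKWebView(frame: .zero, configuration: openConfig)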
Is there a way to enable Service Workers selectively in some WKWebView instances while allowing others to remain unaffected by the App-Bound Domains restriction?
Thank you for your help!
Best regards,
Rose Ding
Hi,
I'd like to develop an app which keeps running speech recognition even after going into the background. I know I can accomplish this using the audio background mode and then processing the audio, but I am not sure this workaround would be accepted into the App Store because of the processing limitations while in the background.
How can I accomplish this while still being compliant with Apple's privacy policy and other restrictions?
Thanks,
Marek
I've started building an app that uses NFC to communicate with a device.
The initial communication is triggered as intended, but when the last step of authentication is sent, the connection seems to drop with the message:
-[_NFCardSession validateReceivedAPDU:]:236 Invalid ISO7816 APDU detected, (null)
The same response has been validated on an Android device, where the process works fine; on iOS, .readerDeselected is triggered instead of .received with a cardAPDU.
Is there any way to see which part of the validation fails?
Best regards
The documentation for a text filter extension states that receiverISOCountryCode is a field the extension receives
https://developer.apple.com/documentation/sms_and_call_reporting/ilmessagefilterqueryrequest/3979257-receiverisocountrycode
"The ISO Country Code of the receiving phone number"
However, if the extension defers to its text server, then the payload sent to the server doesn't contain the ISO country code:
POST /server-endpoint HTTP/1.1
Accept: */*
Content-Type: application/json; charset=utf-8
Content-Length: 148

{
    "_version": 1,
    "query": {
        "sender": "14085550001",
        "message": {
            "text": "This is a message"
        }
    },
    "app": {
        "version": "1.1"
    }
}
from: https://developer.apple.com/documentation/sms_and_call_reporting/ilmessagefilterextensioncontext/2880240-deferqueryrequesttonetwork
Why does the payload sent to the text server not contain the country code?
If an app has a text filtering extension and associated server that the iPhone OS communicates with, then how can that communication be authenticated?
In other words, how can the server verify that the request is valid and coming from the iPhone and not from some spoofer?
If somebody reverse-engineers the associated domain URLs out of the app's Info.plist or entitlements files and calls the server URL directly, then how can the server detect that this has occurred and that the request is not coming from the iPhone OS of a handset on which the app is installed?
I am developing a RichTextEditor library for SwiftUI, and I am facing issues implementing NSParagraphStyle-related features like nested bullet lists and text alignment. I have searched a lot and feel the documentation on this topic is thin, so I want to discuss how to achieve nested lists with UI/NSTextView and the natively available NSTextList in NSParagraphStyle.textLists. The problem is that I don't understand how to use the text list and how to manage adding and removing lists in my editor.
I have seen code that works by adding attributes to each string and then merging them, but I don't want that. I want to add/update/remove attributes on the selected text, and if no text is selected, manage the typing attributes so the applied attributes carry over to the current position.
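To make the question concrete, here is the kind of thing I'm attempting (an untested sketch, assuming iOS 16+, where NSTextList became available in UIKit):

import UIKit

// Build a (possibly nested) bullet list and apply it via
// NSParagraphStyle.textLists to whole paragraphs of the selection;
// with no selection, set the typing attributes instead so the list
// applies at the caret.
func applyBulletList(to textView: UITextView, nested: Bool) {
    let outer = NSTextList(markerFormat: .disc, options: 0)
    let inner = NSTextList(markerFormat: .circle, options: 0)

    let style = NSMutableParagraphStyle()
    style.textLists = nested ? [outer, inner] : [outer]

    let range = textView.selectedRange
    if range.length > 0 {
        let paragraphRange = (textView.text as NSString).paragraphRange(for: range)
        textView.textStorage.addAttribute(.paragraphStyle, value: style, range: paragraphRange)
    } else {
        textView.typingAttributes[.paragraphStyle] = style
    }
}

Removing the list would presumably be the inverse (setting textLists = []), but I'm unsure whether this is the intended way to manage nesting levels.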
In one of my apps I want to have a "tip" option for users, so they can support the developer if they like the app. As a small thank you, "tipping" will unlock a small bonus feature (nothing essential, but nevertheless nice to have). According to the guidelines this is perfectly fine and explicitly allowed.
So I created a few IAP options with different prices as non-consumable one-time purchases, and the same number of consumable one-time purchases at the same prices (so no subscriptions at all).
The idea is that users should be able to tip multiple times if they want to (actually, many users of the old app version have asked for this).
Non-consumable IAP items can only be purchased once, but these can be restored (for example when switching to a new device).
Consumable IAP items can not be restored, but these can be purchased multiple times.
The idea is to combine both, so it is possible to tip multiple times (via consumable IAPs) and to allow the restore of previous purchases (to unlock the bonus feature). To do this, the App would at first only offer the non-consumable IAP items in its "tipping" screen. Only after these were purchased, the App would then only offer the consumable IAP items. Users tipping only once would always purchase the non-consumable item that can be restored. Users who have tipped multiple times would also have purchased the non-consumable IAP once and in addition the consumable item, so they all can restore and unlock the bonus feature. And no one is confused about any internal differences (consumable vs non-consumable), because the users will always see only one tipping option.
I created these IAP items and sent them to Apple for review together with this explanation. All the non-consumable items were approved; all the consumable items were rejected with the following bizarre and unhelpful note from the review board:
3.1.1 - new IAP type
New type: Consumable
Previous type: Non-Consumable
Recommend: Download
The binary of the app is new (more than 14 days).
What exactly does this mean? I've read section 3.1.1 of the guidelines and nothing there indicates any grounds for a rejection. And what does "the binary of the app is new" mean? Sure, the app is new (the new version is currently being tested via TestFlight), but what does this have to do with the IAP?
With normal App Store rejections there was always a link provided that allowed one to contact the review team. But there's absolutely nothing available here, just this bizarre note which doesn't give any clue about what's wrong.
Does anyone have an idea what's wrong, or what I need to do to get this approved? How can I contact the review team that is responsible for this IAP review?
BTW: I've tried to get this approved again after changing the localized descriptions/names a bit (because these are the only things marked in red), but the same strange note came back.
And the localized title and description cannot have long text, so there aren't that many possible variations available which still describe the IAP correctly.
I am building a video conferencing app using LiveKit in Flutter and want to implement Picture-in-Picture (PiP) mode on iOS. My goal is to display a view showing the speaker's initials or avatar during PiP mode. I successfully implemented this functionality on Android but am struggling to achieve it on iOS.
I am using a MethodChannel to communicate with the native iOS code. Here's the Flutter-side code:
import 'package:flutter/foundation.dart';
import 'package:flutter/services.dart';

class PipController {
  static const _channel = MethodChannel('pip_channel');

  static Future<void> startPiP() async {
    try {
      await _channel.invokeMethod('enterPiP');
    } catch (e) {
      if (kDebugMode) {
        print("Error starting PiP: $e");
      }
    }
  }

  static Future<void> stopPiP() async {
    try {
      await _channel.invokeMethod('exitPiP');
    } catch (e) {
      if (kDebugMode) {
        print("Error stopping PiP: $e");
      }
    }
  }
}
On the iOS side, I am using AVPictureInPictureController. Since it requires an AVPlayerLayer, I had to include a dummy video URL to initialize the AVPlayer. However, this results in the dummy video’s audio playing in the background, but no view is displayed in PiP mode.
Here’s my iOS code:
import Flutter
import UIKit
import AVKit

@main
@objc class AppDelegate: FlutterAppDelegate {
    var pipController: AVPictureInPictureController?
    var playerLayer: AVPlayerLayer?

    override func application(
        _ application: UIApplication,
        didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?
    ) -> Bool {
        let controller: FlutterViewController = window?.rootViewController as! FlutterViewController
        let pipChannel = FlutterMethodChannel(name: "pip_channel", binaryMessenger: controller.binaryMessenger)
        pipChannel.setMethodCallHandler { [weak self] (call: FlutterMethodCall, result: @escaping FlutterResult) in
            if call.method == "enterPiP" {
                self?.startPictureInPicture(result: result)
            } else if call.method == "exitPiP" {
                self?.stopPictureInPicture(result: result)
            } else {
                result(FlutterMethodNotImplemented)
            }
        }
        GeneratedPluginRegistrant.register(with: self)
        return super.application(application, didFinishLaunchingWithOptions: launchOptions)
    }

    private func startPictureInPicture(result: @escaping FlutterResult) {
        guard AVPictureInPictureController.isPictureInPictureSupported() else {
            result(FlutterError(code: "UNSUPPORTED", message: "PiP is not supported on this device.", details: nil))
            return
        }

        // Set up the AVPlayer
        let player = AVPlayer(url: URL(string: "http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4")!)
        let playerLayer = AVPlayerLayer(player: player)
        self.playerLayer = playerLayer

        // Create a dummy view
        let dummyView = UIView(frame: CGRect(x: 0, y: 0, width: 1, height: 1))
        dummyView.isHidden = true
        window?.rootViewController?.view.addSubview(dummyView)
        dummyView.layer.addSublayer(playerLayer)
        playerLayer.frame = dummyView.bounds

        // Initialize PiP Controller
        pipController = AVPictureInPictureController(playerLayer: playerLayer)
        pipController?.delegate = self

        // Start playback and PiP
        player.play()
        pipController?.startPictureInPicture()
        print("Picture-in-Picture started")
        result(nil)
    }

    private func stopPictureInPicture(result: @escaping FlutterResult) {
        guard let pipController = pipController, pipController.isPictureInPictureActive else {
            result(FlutterError(code: "NOT_ACTIVE", message: "PiP is not currently active.", details: nil))
            return
        }
        pipController.stopPictureInPicture()
        playerLayer = nil
        self.pipController = nil
        result(nil)
    }
}

extension AppDelegate: AVPictureInPictureControllerDelegate {
    func pictureInPictureControllerDidStartPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
        print("PiP started")
    }

    func pictureInPictureControllerDidStopPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
        print("PiP stopped")
    }
}
Questions:
How can I implement PiP mode on iOS without using a video URL (or AVPlayerLayer)? (See the sketch after these questions.)
Is there a way to display a custom UIView (like a speaker’s initials or an avatar) in PiP mode instead of requiring a video?
Why does PiP not display any view, even though the dummy video URL is playing in the background?
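While researching question 1, I came across AVPictureInPictureVideoCallViewController (iOS 15+, intended for video-call apps and, as I understand it, gated behind the appropriate background mode/entitlement). An untested sketch of what I mean:

import AVKit
import UIKit

// PiP content here is a custom view hierarchy, not an AVPlayerLayer, so no
// dummy video is needed. The label stands in for a real initials/avatar view.
func makeCallPiPController(sourceView: UIView) -> AVPictureInPictureController? {
    guard AVPictureInPictureController.isPictureInPictureSupported() else { return nil }

    let callViewController = AVPictureInPictureVideoCallViewController()
    callViewController.preferredContentSize = CGSize(width: 120, height: 90)

    let avatarLabel = UILabel()
    avatarLabel.text = "AB" // speaker initials
    avatarLabel.textAlignment = .center
    avatarLabel.frame = callViewController.view.bounds
    avatarLabel.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    callViewController.view.addSubview(avatarLabel)

    let contentSource = AVPictureInPictureController.ContentSource(
        activeVideoCallSourceView: sourceView,
        contentViewController: callViewController)
    return AVPictureInPictureController(contentSource: contentSource)
}

If this is the right direction for a LiveKit/Flutter app, I'd appreciate confirmation.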
I am new to iOS development and would greatly appreciate any guidance or alternative approaches to achieve this functionality. Thank you!