Hey Devs,
Is there any way to hide apps, or move them to the Hidden folder (in iOS 18), programmatically?
Hi there,
We build and operate apps for several PBS stations, and we're considering adding in-app donations with Apple Pay:
https://developer.apple.com/apple-pay/nonprofits/
I'm curious if there are any examples of this functionality in a live app. We'd love to take a look and get a better idea of the use cases before we tackle that project. (It seems like it would be a very nice upgrade.)
Cheers,
Kevin
I am building a video conferencing app using LiveKit in Flutter and want to implement Picture-in-Picture (PiP) mode on iOS. My goal is to display a view showing the speaker's initials or avatar during PiP mode. I successfully implemented this functionality on Android but am struggling to achieve it on iOS.
I am using a MethodChannel to communicate with the native iOS code. Here's the Flutter-side code:
import 'package:flutter/foundation.dart';
import 'package:flutter/services.dart';

class PipController {
  static const _channel = MethodChannel('pip_channel');

  static Future<void> startPiP() async {
    try {
      await _channel.invokeMethod('enterPiP');
    } catch (e) {
      if (kDebugMode) {
        print("Error starting PiP: $e");
      }
    }
  }

  static Future<void> stopPiP() async {
    try {
      await _channel.invokeMethod('exitPiP');
    } catch (e) {
      if (kDebugMode) {
        print("Error stopping PiP: $e");
      }
    }
  }
}
On the iOS side, I am using AVPictureInPictureController. Since it requires an AVPlayerLayer, I had to include a dummy video URL to initialize the AVPlayer. However, this results in the dummy video’s audio playing in the background, but no view is displayed in PiP mode.
Here’s my iOS code:
import Flutter
import UIKit
import AVKit

@main
@objc class AppDelegate: FlutterAppDelegate {
    var pipController: AVPictureInPictureController?
    var playerLayer: AVPlayerLayer?

    override func application(
        _ application: UIApplication,
        didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?
    ) -> Bool {
        let controller: FlutterViewController = window?.rootViewController as! FlutterViewController
        let pipChannel = FlutterMethodChannel(name: "pip_channel", binaryMessenger: controller.binaryMessenger)
        pipChannel.setMethodCallHandler { [weak self] (call: FlutterMethodCall, result: @escaping FlutterResult) in
            if call.method == "enterPiP" {
                self?.startPictureInPicture(result: result)
            } else if call.method == "exitPiP" {
                self?.stopPictureInPicture(result: result)
            } else {
                result(FlutterMethodNotImplemented)
            }
        }
        GeneratedPluginRegistrant.register(with: self)
        return super.application(application, didFinishLaunchingWithOptions: launchOptions)
    }

    private func startPictureInPicture(result: @escaping FlutterResult) {
        guard AVPictureInPictureController.isPictureInPictureSupported() else {
            result(FlutterError(code: "UNSUPPORTED", message: "PiP is not supported on this device.", details: nil))
            return
        }

        // Set up the AVPlayer
        let player = AVPlayer(url: URL(string: "http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4")!)
        let playerLayer = AVPlayerLayer(player: player)
        self.playerLayer = playerLayer

        // Create a dummy view
        let dummyView = UIView(frame: CGRect(x: 0, y: 0, width: 1, height: 1))
        dummyView.isHidden = true
        window?.rootViewController?.view.addSubview(dummyView)
        dummyView.layer.addSublayer(playerLayer)
        playerLayer.frame = dummyView.bounds

        // Initialize PiP Controller
        pipController = AVPictureInPictureController(playerLayer: playerLayer)
        pipController?.delegate = self

        // Start playback and PiP
        player.play()
        pipController?.startPictureInPicture()
        print("Picture-in-Picture started")
        result(nil)
    }

    private func stopPictureInPicture(result: @escaping FlutterResult) {
        guard let pipController = pipController, pipController.isPictureInPictureActive else {
            result(FlutterError(code: "NOT_ACTIVE", message: "PiP is not currently active.", details: nil))
            return
        }
        pipController.stopPictureInPicture()
        playerLayer = nil
        self.pipController = nil
        result(nil)
    }
}

extension AppDelegate: AVPictureInPictureControllerDelegate {
    func pictureInPictureControllerDidStartPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
        print("PiP started")
    }

    func pictureInPictureControllerDidStopPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
        print("PiP stopped")
    }
}
Questions:
How can I implement PiP mode on iOS without using a video URL (or AVPlayerLayer)?
Is there a way to display a custom UIView (like a speaker’s initials or an avatar) in PiP mode instead of requiring a video?
Why does PiP not display any view, even though the dummy video URL is playing in the background?
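Regarding the custom-view question, I came across AVPictureInPictureVideoCallViewController (iOS 15+), which lets AVPictureInPictureController host a custom view via a content source instead of an AVPlayerLayer. Below is a minimal sketch of what I understand so far (untested from Flutter; the avatar label and sourceView are placeholders, and I believe this path also requires an active audio session and the appropriate background mode):

import AVKit
import UIKit

// Hosts a custom view (here, a label with the speaker's initials)
// inside the system PiP window.
final class AvatarPiPViewController: AVPictureInPictureVideoCallViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        preferredContentSize = CGSize(width: 120, height: 90)
        let initials = UILabel(frame: view.bounds)
        initials.text = "AB" // placeholder for the speaker's initials
        initials.textAlignment = .center
        initials.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(initials)
    }
}

// sourceView is any view in the app's hierarchy that represents the call UI.
func makePiPController(sourceView: UIView) -> AVPictureInPictureController? {
    guard AVPictureInPictureController.isPictureInPictureSupported() else { return nil }
    let contentSource = AVPictureInPictureController.ContentSource(
        activeVideoCallSourceView: sourceView,
        contentViewController: AvatarPiPViewController()
    )
    let controller = AVPictureInPictureController(contentSource: contentSource)
    controller.canStartPictureInPictureAutomaticallyFromInline = true
    return controller
}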
I am new to iOS development and would greatly appreciate any guidance or alternative approaches to achieve this functionality. Thank you!
Dear Apple Engineer,
We have a problem in our banking application after updating iOS to 18, 18.1, or 18.1.1: no notifications appear on the lock screen, or even in Notification Center. On lower versions, push notifications appear correctly.
What we have checked so far:
certificates
profiles
app with push notifications extension and without it
usage of setBadgeCount(_:withCompletionHandler:) instead of applicationIconBadgeNumber
Our sample payload with encrypted data:
{
    "aps": {
        "alert": "Message from Bank",
        "badge": 0,
        "sound": "default",
        "mutable-content": 1,
        "category": ""
    },
    "Type": "",
    "MessageId": "",
    "Id": "8ebf0c13-83cf-4029-ac13-91d026c3770a",
    "Media-url": "",
    "alternativeTitle": "New message",
    "priority": 5,
    "EncryptedData": "eyJ0eXAiOiJibTplbmMtdjEiLCJhbGciOiJibTppb3MtZWNkaCIsImVuYyI6ImJtOkExMjhHQ00tSVYxNiIsImVuY19raWQiOiI5OUIyN0E4NC1CQzRFLTRGMzQtQjBGNC0yMTcyMEYxQTFEN0EifQ...BDdxycY-ZWPC7BgI_07efVSgjKyGyGVKlcNtZSslWJePrwJkJyIxFBr07XtayB0I2jv6Vc8AdUpdvMJ-daVzkPYMZ7pQA_X0Pg8RPRS2GnPkhyhK3XNkLRMsjG6CkSafYaqSeLMEpdF2Q-QkajvO3ojnRl1C-Bp9FpNbeaCwJXwqjEMKKhggRsKH8zdk7XcYhZX5_hARbBkIFLrCX1Xzyypp_PfZ23v9Pbd8aHmAf7FQdYN6xbfyoL5XEaDrCjGi-up2n1nlcTeEfkXHBunitUzQulmrjo86GJS0ldhF0mEMZ3_t6ObbjeKijYExMeYHxeCe89Yg10TvZI6kP4xizpJijG9cz75X3VI3I4SgeR8BuZRcb5eTQKWWzGW7u6LD1QtV3PWFCtv942CSz62kPPo-dD0248Fqm5HwxZejQSrZKjYQQ87dkzB0q7p2Q_M0z2Y-bRfNRXJl8VaF5X6-2KwLq47zwrQYUIcEHdag3J05X0SzBiImAdbh2zQz074QqEEpoU1F6C89LHKFxAw",
    "IsSigned": false
}
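For reference, a reduced sketch of the notification service extension we tested with (simplified; decryptBody here is a placeholder for our actual decryption of EncryptedData):

import UserNotifications

class NotificationService: UNNotificationServiceExtension {
    var contentHandler: ((UNNotificationContent) -> Void)?
    var bestAttempt: UNMutableNotificationContent?

    override func didReceive(
        _ request: UNNotificationRequest,
        withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void
    ) {
        self.contentHandler = contentHandler
        bestAttempt = request.content.mutableCopy() as? UNMutableNotificationContent
        guard let content = bestAttempt else { return }
        if let encrypted = request.content.userInfo["EncryptedData"] as? String {
            content.body = decryptBody(encrypted) // placeholder for our decryption
        }
        contentHandler(content)
    }

    override func serviceExtensionTimeWillExpire() {
        // Deliver the best attempt before the system terminates the extension;
        // otherwise the notification may not be presented at all.
        if let contentHandler, let bestAttempt {
            contentHandler(bestAttempt)
        }
    }

    private func decryptBody(_ payload: String) -> String {
        // Placeholder: the real implementation decrypts EncryptedData.
        payload
    }
}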
What do you need from us to analyze the problem? Identifiers, a sample application?
Best regards,
Michał, iOS Developer
On iOS 18, we have found that some pages consistently crash when they are displayed. Puzzlingly, if I modify unrelated code, the crash no longer occurs on iOS 18. This has had a great impact on my development, and I don't know whether I can keep using this API in the future. Can you help me out of this dilemma? Thanks.
Just updated to the newest beta of 18.2. The update corrects several major email issues, but now I am only able to pull some of my emails when I open Mail.
After opening Mail on my 18.2 iPhone 15 Pro Max, I am able to pull only 1 message from Xfinity. When I check on my PC using Outlook 2016, I am able to pull 18 additional messages. Going back to my iPhone after half an hour, the additional mail messages still do not show up, even after doing a hard reset of the phone.
Apologies, but these Mail issues need to be resolved. It is bothersome that iOS 18 was even released to the public at all with these types of issues.
Dear Experts,
I created a SwiftUI view (in a UIKit-based project) whose objective is to display a short animation in front of the user's screen. I developed it 8 months ago in Xcode 15 for iOS 17 with no problems.
But since iOS 18, I can observe huge lags in my animation, only in the Release build. The problem is not present in the Debug build.
I don't understand why this problem occurs; the animation is quite simple, it's just an offset displacement.
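For reference, the animation boils down to something like this (a reduced stand-in, not the actual code; the full configuration is in the repository linked below):

import SwiftUI

// Reduced stand-in for the animated overlay: a view that slides in
// by animating an offset, shown over the UIKit hierarchy.
struct SlideBanner: View {
    @State private var shown = false

    var body: some View {
        Text("Hello!")
            .padding()
            .background(.blue, in: Capsule())
            .offset(y: shown ? 40 : -120) // simple offset displacement
            .animation(.easeInOut(duration: 0.4), value: shown)
            .onAppear { shown = true }
    }
}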
I tried many things, like:
Showing the animation with a UINavigationController
Showing the animation with a UIWindow
Moving the view with .position
Removing the GeometryReader
All other kinds of animations
withAnimation and .animation
Task and DispatchQueue
Etc.
I found that the laggy animation occurs when I set the Optimization Level for the Swift Compiler - Code Generation to Optimize for Speed [-O]. That's very strange because we had this option on Release for iOS 17 and we had no lags...
I can share a sample repository with the configuration we have: https://github.com/Thibma/sample-animation-swiftui
For now, the only option I have is to develop this feature in UIKit, but it's a shame to pass up the SwiftUI opportunity. :/
If you have any ideas to resolve this, I'll take them!
Thank you !
Hi, so a little context: by environment variables I mean things like $HOME on Linux. I know that there are some standard and some app-specific environment variables on the above-mentioned platforms.
Is it possible for the user to set environment variables? If so, in what ways can users set environment variables on the platforms mentioned, system-wide or for a single app, as they would on macOS/Linux/Windows?
And is it possible for developers of the app to do the same? If so, how?
What I have found so far is that it cannot be done by users, and there is one place in Xcode where I, as a developer, can set environment variables for my app, from the scheme. This isn't available when I ship just the installation binary to the end user; it only applies when the app is run from Xcode. I just need validation that my understanding is correct and that there aren't other ways to do this as a user or developer without jailbreaking.
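For reference, the only programmatic access I've found is read-only, via ProcessInfo (a minimal sketch):

import Foundation

// Apps can read their environment, but there is no public API for the
// app to set variables for a shipped build, nor for a user to do so.
let env = ProcessInfo.processInfo.environment
print(env["HOME"] ?? "HOME not set")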
I'm experiencing the same situation as this post from several months ago, which never received an answer to its follow-up question.
https://forums.developer.apple.com/forums/thread/759255
The documentation for adding a message filter extension says "you must set up shared credentials as described in Shared Web Credentials"
(https://developer.apple.com/documentation/sms_and_call_reporting/sms_and_mms_message_filtering/creating_a_message_filter_app_extension)
However, credentials are not forwarded to the server, and calling SecAddSharedWebCredential from within the extension isn't possible.
So I don't understand why the documentation states that Shared Web Credentials must be set up.
After setting them up, what is expected to happen with them, and what are you supposed to do with them next? The documentation just says to set them up; it doesn't say how or whether they are used, or how to use them in the specific context of a message filter extension.
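For what it's worth, my understanding is that the credential would have to be saved from the containing app, something like the sketch below (example.com and the account values are placeholders); but even then, the docs don't explain how the filter extension or the server is supposed to consume it:

import Security

// Saving a shared web credential from the containing app (not possible
// from the message filter extension itself).
SecAddSharedWebCredential(
    "example.com" as CFString,
    "username" as CFString,
    "password" as CFString
) { error in
    print(error ?? "credential saved")
}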
We use MLModel in our app with two file formats: mlmodel and mlpackage. We find that when a model is released, models using the mlmodel format have a certain probability of crashing, and these crashes account for the majority (over 85%) on iOS 16.x. Here is the crash stack:
Exception Type: SIGTRAP
Exception Codes: TRAP_BRKPT at 0x1b48e855c
Crashed Thread: 5
Thread 5 Crashed:
0 libdispatch.dylib 0x00000001b48e855c _dispatch_semaphore_dispose.cold.1 + 40
1 libdispatch.dylib 0x00000001b48b2b28 _dispatch_semaphore_signal_slow
2 libdispatch.dylib 0x00000001b48b0e58 _dispatch_dispose + 208
3 AppleNeuralEngine 0x00000001ef07b51c -[_ANEProgramForEvaluation .cxx_destruct] + 32
4 libobjc.A.dylib 0x00000001a67ed4a4 object_cxxDestructFromClass(objc_object*, objc_class*) + 116
5 libobjc.A.dylib 0x00000001a67f221c objc_destructInstance + 80
6 libobjc.A.dylib 0x00000001a67fb9d0 _objc_rootDealloc + 80
7 AppleNeuralEngine 0x00000001ef079e04 -[_ANEProgramForEvaluation dealloc] + 72
8 AppleNeuralEngine 0x00000001ef07ca70 -[_ANEModel .cxx_destruct] + 44
9 libobjc.A.dylib 0x00000001a67ed4a4 object_cxxDestructFromClass(objc_object*, objc_class*) + 116
10 libobjc.A.dylib 0x00000001a67f221c objc_destructInstance + 80
11 libobjc.A.dylib 0x00000001a67fb9d0 _objc_rootDealloc + 80
12 AppleNeuralEngine 0x00000001ef07bd7c -[_ANEModel dealloc] + 136
13 CoreFoundation 0x00000001ad4563cc cow_cleanup + 168
14 CoreFoundation 0x00000001ad49044c -[__NSDictionaryM dealloc] + 148
15 Espresso 0x00000001bb19c7a4 Espresso::ANERuntimeEngine::compiler::reset() + 1340
16 Espresso 0x00000001bb19cac8 Espresso::ANERuntimeEngine::compiler::~compiler() + 108
17 Espresso 0x00000001bacd69e4 std::__1::__shared_weak_count::__release_shared() + 84
18 Espresso 0x00000001ba944d00 std::__1::__hash_table<std::__1::__hash_value_type<Espresso::platform, std::__1::shared_ptr<Espresso::net_compiler>>, std::__1::__unordered_map_hasher<Espresso::platform, std::__1::__hash_value_type<Espresso::platform, std::__1::shared_ptr<Espresso::net_compiler>>, std::__1::hash<Espresso::platform>, std::__1::equal_to<Espresso::platform>, true>, std::__1::__unordered_map_equal<Espresso::platform, std::__1::__hash_value_type<Espresso::platform, std::__1::shared_ptr<Espresso::net_compiler>>, std::__1::equal_to<Espresso::platform>, std::__1::hash<Espresso::platform>, true>, std::__1::allocator<std::__1::__hash_value_type<Espresso::platform, std::__1::shared_ptr<Espresso::net_compiler>>>>::__deallocate_node(std::__1::__hash_node_base<std::__1::__hash_node<std::__1::__hash_value_type<Espresso::platform, std::__1::shared_ptr<Espresso::net_compiler>>, void*>*>*) + 40
19 Espresso 0x00000001ba8ea640 std::__1::__hash_table<std::__1::__hash_value_type<Espresso::platform, std::__1::shared_ptr<Espresso::net_compiler>>, std::__1::__unordered_map_hasher<Espresso::platform, std::__1::__hash_value_type<Espresso::platform, std::__1::shared_ptr<Espresso::net_compiler>>, std::__1::hash<Espresso::platform>, std::__1::equal_to<Espresso::platform>, true>, std::__1::__unordered_map_equal<Espresso::platform, std::__1::__hash_value_type<Espresso::platform, std::__1::shared_ptr<Espresso::net_compiler>>, std::__1::equal_to<Espresso::platform>, std::__1::hash<Espresso::platform>, true>, std::__1::allocator<std::__1::__hash_value_type<Espresso::platform, std::__1::shared_ptr<Espresso::net_compiler>>>>::~__hash_table() + 28
20 Espresso 0x00000001ba8e5750 Espresso::net::~net() + 396
21 Espresso 0x00000001bacd69e4 std::__1::__shared_weak_count::__release_shared() + 84
22 Espresso 0x00000001bad750e4 std::__1::__vector_base<std::__1::shared_ptr<Espresso::net>, std::__1::allocator<std::__1::shared_ptr<Espresso::net>>>::clear() + 52
23 Espresso 0x00000001ba902448 std::__1::__vector_base<std::__1::shared_ptr<Espresso::net>, std::__1::allocator<std::__1::shared_ptr<Espresso::net>>>::~__vector_base() + 36
24 Espresso 0x00000001ba8ed99c std::__1::unique_ptr<EspressoLight::espresso_plan::priv_t, std::__1::default_delete<EspressoLight::espresso_plan::priv_t>>::reset(EspressoLight::espresso_plan::priv_t*) + 188
25 Espresso 0x00000001ba95b7fc EspressoLight::espresso_plan::~espresso_plan() + 72
26 Espresso 0x00000001ba902078 EspressoLight::espresso_plan::~espresso_plan() + 16
27 Espresso 0x00000001ba8e690c espresso_plan_destroy + 372
28 CoreML 0x00000001c48c45cc -[MLNeuralNetworkEngine _deallocContextAndPlan] + 40
29 CoreML 0x00000001c48c43bc -[MLNeuralNetworkEngine dealloc] + 40
30 libobjc.A.dylib 0x00000001a67ed4a4 object_cxxDestructFromClass(objc_object*, objc_class*) + 116
31 libobjc.A.dylib 0x00000001a67f221c objc_destructInstance + 80
32 libobjc.A.dylib 0x00000001a67fb9d0 _objc_rootDealloc + 80
~~~~ Our code that releases the MLModel object ~~~~
Moreover, we use a synchronization mechanism to ensure that the release of the MLModel and the data processing of the model (by calling [model predictionFromFeatures]) do not occur simultaneously. What could be the possible causes of the problem, and how can we prevent it from happening? Any advice would be appreciated.
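For reference, our synchronization is conceptually like the following sketch (Swift for brevity; our code is Objective-C, and ModelBox is a stand-in name): every prediction and the final release are funneled through one serial queue, so deallocation should never overlap an in-flight prediction.

import CoreML
import Foundation

// Sketch: serialize all use and release of the model on one queue.
final class ModelBox {
    private let queue = DispatchQueue(label: "com.example.mlmodel.serial")
    private var model: MLModel?

    init(contentsOf url: URL) throws {
        model = try MLModel(contentsOf: url)
    }

    func predict(_ input: MLFeatureProvider) throws -> MLFeatureProvider? {
        try queue.sync { try model?.prediction(from: input) }
    }

    func release() {
        queue.sync { model = nil } // last reference dropped on the queue
    }
}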
Hello. I am re-writing our way of storing data into Core Data in our app, so it can be done concurrently.
The solution I opted for is to have a singleton actor that takes an API model, maps it to a Core Data object, and saves it.
For example, to store an API order model, I have something like this:
func store(
    order apiOrder: APIOrder,
    currentContext: NSManagedObjectContext?
) -> NSManagedObjectID? {
    let context = currentContext ?? self.persistentContainer.newBackgroundContext()
    // …
}
In the arguments, there is a context you can pass, in case you need to create additional models and relate them to each other. I am not sure this is how you're supposed to do it, but it seemed to work.
From what I've understood of Core Data and using multiple contexts, the appropriate way to use them is with context.perform or context.performAndWait.
However, since my storage helper is an actor, @globalActor actor Storage2 { … }, my storage's methods are actor-isolated.
This gives me warnings / errors in Swift 6 when I try to pass the context to another of my actor's methods.
let context = …
return context.performAndWait {
    // …
    if let apiBooking = apiOrder.booking {
        self.store(booking: apiBooking, context: context)
        /* causes warning:
           Sending 'context' risks causing data races; this is an error in the Swift 6 language mode
           'self'-isolated 'context' is captured by a actor-isolated closure. actor-isolated uses in closure may race against later nonisolated uses
           Access can happen concurrently
        */
    }
    // …
}
From what I understand, this is because my methods are actor-isolated, but the closure of performAndWait does not execute in a thread-safe environment.
With all this, what are my options? I've extracted the store(departure:context:) into its own method to avoid duplicated code, but since I can't call it from within performAndWait I am not sure what to do.
Can I ditch the performAndWait? Removing that makes the warning "go away", but I don't feel confident enough with Core Data to know the answer.
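One direction I'm considering, sketched below under my own assumptions (Order is a stand-in entity, APIOrder is assumed Sendable; the async perform is iOS 15+): do all the mapping inside a single perform block and let only the Sendable NSManagedObjectID cross the actor boundary.

import CoreData

// Sketch: everything that touches the context stays inside one perform
// block; only the NSManagedObjectID crosses actor boundaries.
func store(order apiOrder: APIOrder) async throws -> NSManagedObjectID {
    let context = persistentContainer.newBackgroundContext()
    return try await context.perform {
        let order = Order(context: context) // stand-in entity
        // … map apiOrder's fields, create related objects inline here …
        try context.save()
        return order.objectID
    }
}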
I would love to get any feedback on this, hoping to learn!
Hi,
Could someone possibly tell me if it is accurate that Xcode Version 16.1 (16B40) is NOT compatible with iOS 18.1.1 and if so, when it will be?
Many TIA.
When I try to install packages through SPM, it doesn't find the Package.swift file, and with pods I have problems with Foundation. Trying to install Firebase shows that many pods have issues with double quotes. I don't know if it's my machine, since everything is up to date, but I'm having a lot of problems and I haven't been able to build my first application. I don't want to give up, and I would like some advice; since you have managed it, I would appreciate some help. I have followed tutorials and cases of people with similar problems, but it always throws some error.
My machine is an M1 Mac.
When my iPhone 15 Pro Max upgraded to iOS 18.1.1, it could no longer connect to the hotspot of my IoT device (Android 5.1), while my iPhone 12 (iOS 18.1.1) has no issues.
Both the 15 Pro Max and the iPhone 12 work well with another device (Android 10.0).
I had tried:
1. Forget Network (and re-add the desired Wi-Fi network)
2. Reset Network Settings (under Settings/General/Transfer or Reset iPhone)
3. Turn Airplane Mode on, then off after a few seconds
4. Restart the iPhone
5. Reset all settings
6. Disable VPN
7. Turn off the setting that rotates my Wi-Fi address (Private Wi-Fi Address)
Did anyone have similar issues?
The post detail is almost the same as the title: I can't enable the option. Is Apple already planning to provide the always-on display option for the regular iPhone 16 series?
Hi everyone,
I am working on a 3D reconstruction project.
Recently I have been able to retrieve the intrinsics from the two cameras on the back of my iPhone.
One consideration is that I want this app to run even if there is no LiDAR, as long as there are at least two cameras on the back. If there is a LiDAR, that is something I have considered supporting later in the course of the project.
I am using an AVCaptureSession with the two cameras as AVCaptureDevices:
builtInWideAngleCamera
builtInUltraWideCamera
The intrinsic matrices seem to be correct. However, when I retrieve the extrinsics, e.g., builtInWideAngleCamera w.r.t. builtInUltraWideCamera, the matrix I get looks like this:
Extrinsic Matrix (Ultra-Wide to Wide):
[0.9999968, 0.0008149305, -0.0023960583, 0.0]
[-0.0008256607, 0.9999896, -0.0044807075, 0.0]
[0.002392382, 0.0044826716, 0.99998707, 0.0]
[-14.277955, -8.135408e-10, -0.3359985, 0.0]
The extrinsic matrix should have the form [R | t]. The rotational part seems correct, but the translation vector is ALL ZEROS, which would suggest that the cameras physically overlap; the last element is also not 1 (homogeneous coordinates).
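For reference, this is roughly how I retrieve the matrix (a sketch; one assumption worth double-checking is the layout: the returned Data is documented as a matrix_float4x3, and simd stores matrices column-major, so the translation column can print as the last row, which may be what appears above):

import AVFoundation
import simd

// Sketch: relative extrinsics between the two back cameras. The Data is
// documented to wrap a matrix_float4x3; since simd is column-major, the
// translation (fourth column) prints as the last "row".
func relativeExtrinsics() -> matrix_float4x3? {
    guard
        let wide = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
        let ultra = AVCaptureDevice.default(.builtInUltraWideCamera, for: .video, position: .back),
        let data = AVCaptureDevice.extrinsicMatrix(from: ultra, to: wide)
    else { return nil }
    return data.withUnsafeBytes { $0.load(as: matrix_float4x3.self) }
}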
Has anyone encountered this 'issue' before?
Is there a flaw in my reasoning or something I might be missing?
Any comments are very much appreciated.
Hello!
I am having trouble setting start times for songs when using the ApplicationMusicPlayer.
When I initialize a new MusicPlayer.Queue.Entry using the following constructor, I am seeing strange results:
init(
    _ playableMusicItem: PlayableMusicItem,
    startTime: TimeInterval? = nil,
    endTime: TimeInterval? = nil
)
It appears that any value I provide for startTime is also applied to the endTime. For example:
MusicPlayer.Queue.Entry(playable, startTime: TimeInterval(30), endTime: TimeInterval(183))
provides the following console output:
MusicPlayer.Queue.Entry(id: "3D6A3DA3-595E-4657-8DBA-DDD245BBB7EF", transientItem: PlayableMusicItem, startTime: 30.0, endTime: 30.0)
I have also tried setting the endTime to nil with the same result. Does anyone have any experience setting start times for songs using the MusicKit ApplicationMusicPlayer?
Any feedback is greatly appreciated!
Hi everyone,
I’m having trouble getting the correct horizontal slide transitions when navigating between multiple screens in my SwiftUI app. I have three screens (enum cases with an int assigned as index depending on the navigation order): .checkClient (index 0), .login(document: String) (index 1), and .register (index 2).
My goal is:
When moving forward (e.g., from .checkClient to .login, or from .login to any other with a greater index), the new screen should enter from the right (trailing) and the old one should exit to the left (leading).
When going backward (from .register back to .checkClient, for example), the new screen should enter from the left (leading) and the old one should exit to the right (trailing).
I’ve been using a state property isAdvancing to determine the direction of transitions (I use TCA, so my logic is in a Reducer body, my properties in a State and my views are normal SwiftUI Views):
case .updateCurrentScreen(let newScreen):
    state.isAdvancing = newScreen.index > state.currentScreen.index
    state.currentScreen = newScreen
    return .none
I tried applying .transition directly inside each case:
.transition(
    .asymmetric(
        insertion: .move(edge: store.isAdvancing ? .trailing : .leading),
        removal: .move(edge: store.isAdvancing ? .leading : .trailing)
    )
)
This works correctly the first time I navigate forward. However, when I go to the .register screen and then hit back, the directions become inconsistent. Sometimes the removal happens in the wrong direction, and after returning to .checkClient, forward navigations stop working as intended.
Then, I tried placing the transition at a higher level, wrapping the switch in a ZStack and using a single .transition(...) outside:
ZStack {
    switch store.currentScreen {
    case .checkClient:
        StartView(...)
    case .login:
        LoginView(...)
    case .register:
        RegisterView(...)
    }
}
.transition(
    .asymmetric(
        insertion: .move(edge: store.isAdvancing ? .trailing : .leading),
        removal: .move(edge: store.isAdvancing ? .leading : .trailing)
    )
)
.animation(.easeInOut, value: store.currentScreen)
But doing this results in some transitions reverting to a fade instead of a horizontal slide.
I’ve also tried ensuring that isAdvancing updates before changing the currentScreen. Unfortunately, I still encounter inconsistent transitions when navigating back and forth between these screens.
Here is my complete view logic (even though it is not finished or polished yet):
var body: some View {
    WithPerceptionTracking {
        ZStack {
            AdaptiveSheetView(
                backgroundImage: Asset.AuthorizationDomain.background,
                hasBackButton: store.showsBackButton,
                isFullScreen: store.isFullScreen,
                backAction: {
                    store.send(.goBack)
                }
            ) {
                if store.showsIATILogo {
                    Image(asset: Asset.iatiLogo)
                        .padding(spacing: .medium)
                }
                ZStack {
                    switch store.currentScreen {
                    case .checkClient:
                        StartView(store: store.scope(state: \.startState, action: \.startAction))
                    case .login:
                        if let newStore = store.scope(state: \.loginState, action: \.loginAction) {
                            LoginView(store: newStore)
                        }
                    case .register:
                        if let newStore = store.scope(state: \.registerState, action: \.registerAction) {
                            RegisterView(store: newStore)
                        }
                    }
                }
                .transition(
                    .asymmetric(
                        insertion: .move(edge: store.isAdvancing ? .trailing : .leading),
                        removal: .move(edge: store.isAdvancing ? .leading : .trailing)
                    )
                )
            }
            .animation(.easeInOut, value: store.currentScreen)
            if store.startState.checkUser == .loading { LoadingSpinner() }
            PreloadView(store: store.scope(state: \.preloadState, action: \.preloadAction))
                .opacity(store.preloadViewShown ? 1.0 : 0.0)
                .animation(.linear(duration: 0.5), value: store.preloadViewShown)
                .onChange(of: store.preloadViewShown) { shown in
                    if !shown { store.send(._checkPreviousSessions) }
                }
        }
    }
}
Has anyone experienced similar issues or found a reliable pattern for achieving these “push/pop” style transitions in SwiftUI? Any guidance would be greatly appreciated!
My minimum target is iOS 16, so I cannot make use of TabView with the paginated style for this, AFAIK.
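To make the problem easier to poke at, here is a reduced, TCA-free sketch of the pattern I'm aiming for (Screen, navigate(to:), and the colored placeholder views are simplified stand-ins; the idea is that isAdvancing is updated first, and the screen change itself is wrapped in withAnimation so insertion and removal resolve in the same transaction):

import SwiftUI

// Reduced stand-in: direction is decided before the screen changes.
struct DirectionalFlow: View {
    enum Screen: Int { case checkClient, login, register }

    @State private var screen: Screen = .checkClient
    @State private var isAdvancing = true

    var body: some View {
        VStack {
            ZStack {
                switch screen {
                case .checkClient: Color.red.overlay(Text("Start"))
                case .login:       Color.green.overlay(Text("Login"))
                case .register:    Color.blue.overlay(Text("Register"))
                }
            }
            .id(screen) // give each screen its own identity so it is inserted/removed
            .transition(
                .asymmetric(
                    insertion: .move(edge: isAdvancing ? .trailing : .leading),
                    removal: .move(edge: isAdvancing ? .leading : .trailing)
                )
            )
            .clipped()

            HStack {
                Button("Back") { navigate(to: Screen(rawValue: screen.rawValue - 1)) }
                Button("Next") { navigate(to: Screen(rawValue: screen.rawValue + 1)) }
            }
        }
    }

    private func navigate(to newScreen: Screen?) {
        guard let newScreen else { return }
        isAdvancing = newScreen.rawValue > screen.rawValue
        withAnimation(.easeInOut) { screen = newScreen }
    }
}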
Thanks in advance for any time and attention you dedicate to me 🙏🏼
Hello, if an associated domain is specified for an app (for example, the URL of a server servicing an app extension for text spam filtering), then what is in place to stop somebody with malicious intentions from obtaining that URL from the app's .plist/.entitlements file and doing something with it, such as a denial-of-service attack?
After the "Save Password to iCloud Keychain" prompt appears, if I immediately send the app to the background and then return to the foreground, the search field can no longer bring up the keyboard.