When I build my app from Xcode 26 with UIDesignRequiresCompatibility enabled, targeting iPadOS 26 or 18.5 as well as iOS 16.5, the app crashes while loading its main view controller, a subclassed UITabBarController that is loaded programmatically from a Storyboard by another SplashScreen view controller.
On i(Pad)OS 18.5 I get this error:
Thread 1: "Could not instantiate class named _TtGC5UIKit17UICoreHostingViewVCS_21ToolbarVisualProvider8RootView_ because no class named _TtGC5UIKit17UICoreHostingViewVCS_21ToolbarVisualProvider8RootView_ was found; the class needs to be defined in source code or linked in from a library (ensure the class is part of the correct target)"
On iPadOS 26 I get this error:
UIKitCore/UICoreHostingView.swift:54: Fatal error: init(coder:) has not been implemented
There is no issue building from Xcode 16.4, regardless of targeted i(Pad)OS.
Support for menus in Storyboards is yanked without ever being deprecated (to my knowledge)? Really? WTF? This is a major step backwards, Apple.
So nice to have to spend a month rewriting my app after WWDC each year. Re-creating a complex menu hierarchy in code is exactly what I wanted to do. Ugh.
In the WWDC25 session "Meet Liquid Glass", two Liquid Glass variants are mentioned: "regular" and "clear". "Regular" seems to be the default for UIGlassEffect, but I was not able to find an option for "clear".
Is there a native element that uses clear? Is it coming in a later beta of iOS 26?
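For reference, this is how I am applying the effect now. This reflects my understanding of the current beta API; the plain UIGlassEffect initializer and its default configuration are assumptions on my part and may change in later betas:
import UIKit

// "Regular" appears to be what a plain UIGlassEffect gives you; I could not
// find a corresponding "clear" configuration.
let glassEffect = UIGlassEffect()
let glassView = UIVisualEffectView(effect: glassEffect)
glassView.frame = CGRect(x: 20, y: 100, width: 200, height: 80)
// glassView would then be added to a view hierarchy as usual.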
The Swift compiler reports a syntax error, as follows.
How should I make this compatible?
Hello Apple Developer Community,
I'm developing an application for iPadOS 26 on an 11th generation iPad, using Objective-C. With the recent update to iPadOS 26, I've noticed a significant change in how app windows are presented. Specifically, the new minimize and close buttons, similar to those found on macOS, now appear in the top-left corner of app windows.
The issue I'm encountering is that these newly introduced system buttons overlap with custom buttons I've programmatically added to the left side of my app's navigation bar. This overlap affects nearly all screens in my application, making some of my essential UI elements inaccessible or difficult to interact with.
I'm looking for guidance on whether there's an official way to opt out of displaying these minimize and close buttons, or perhaps a method to adjust their position or visibility to prevent them from interfering with existing UI elements. My aim is to maintain the functionality and user experience of my application without having to redesign a substantial portion of its interface.
Any insights or suggestions from the community would be greatly appreciated. Thank you in advance for your help!
I noticed on the Find My app in the new iOS 26 beta that the TabView and the sheet seem to be part of the same view. When you collapse the sheet, the TabView is still visible, and you can swipe up to view the sheet again. Is there a way to recreate this effect? Preferably in SwiftUI, but UIKit works too.
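One way to approximate this in SwiftUI is a non-dismissable sheet with detents and background interaction enabled. This is only a sketch of an approximation (the sheet sits on top of the tab bar rather than merging with it the way the iOS 26 Find My app does), and the detent values are arbitrary:
import SwiftUI

struct ContentView: View {
    @State private var showSheet = true

    var body: some View {
        TabView {
            Text("Map")
                .tabItem { Label("Items", systemImage: "mappin.and.ellipse") }
            Text("Me")
                .tabItem { Label("Me", systemImage: "person") }
        }
        .sheet(isPresented: $showSheet) {
            Text("Sheet content")
                // Collapsed, half, and full heights, similar to Find My.
                .presentationDetents([.height(80), .medium, .large])
                // Keep the TabView tappable while the sheet is collapsed or half open.
                .presentationBackgroundInteraction(.enabled(upThrough: .medium))
                // Prevent the sheet from being swiped away entirely.
                .interactiveDismissDisabled()
        }
    }
}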
With iPadOS 26, if I create a UITabBar and use it to switch between views, the selected state never updates. I created this simple UIViewController to demonstrate the issue:
class SimpleTabBarController: UIViewController, UITabBarDelegate {
    let tabBar = UITabBar()
    let redItem = UITabBarItem(title: "Red", image: nil, tag: 0)
    let blueItem = UITabBarItem(title: "Blue", image: nil, tag: 1)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .white

        tabBar.items = [redItem, blueItem]
        tabBar.selectedItem = redItem
        tabBar.delegate = self
        tabBar.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(tabBar)

        NSLayoutConstraint.activate([
            tabBar.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            tabBar.trailingAnchor.constraint(equalTo: view.trailingAnchor),
            tabBar.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor)
        ])

        updateBackground(for: redItem)
    }

    func tabBar(_ tabBar: UITabBar, didSelect item: UITabBarItem) {
        updateBackground(for: item)
    }

    private func updateBackground(for item: UITabBarItem) {
        switch item.tag {
        case 0: view.backgroundColor = .systemRed
        case 1: view.backgroundColor = .systemBlue
        default: view.backgroundColor = .white
        }
    }
}
The tabBar(_:didSelect:) method is called, and the background color is updated as expected, but the selected state of the UITabBar stays the same.
I filed a feedback for a related issue: FB17841678
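One thing that might be worth trying in the meantime (an untested assumption on my part, not a confirmed fix) is re-assigning selectedItem inside the delegate callback so the visual selection is forced to update:
func tabBar(_ tabBar: UITabBar, didSelect item: UITabBarItem) {
    // Setting selectedItem programmatically does not call the delegate again,
    // so this does not loop; it just re-asserts the selection UIKit reported.
    tabBar.selectedItem = item
    updateBackground(for: item)
}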
Just testing an existing app with Xcode 26.
I notice that the content of UIBarButtonItem (either text or image) disappears when tapped (and reappears on release).
Those are custom, bordered buttons.
Attribute inspector:
Buttons in Xcode:
When selected in Xcode, we see a rectangle inside the rounded rect of iOS 26
In simulator:
When tapped in simulator:
I have edited code from
backButton.setTitleTextAttributes([
    .font: boldFont,
    .foregroundColor: UIColor.systemBlue,
], for: .normal)
to
backButton.setTitleTextAttributes([
    .font: boldFont,
    .foregroundColor: UIColor.systemBlue,
], for: [.normal, .focused, .selected, .highlighted])
to no avail.
What am I missing?
I ran into an unexpected issue when presenting a UIView-based toast inside a separate UIWindow in a SwiftUI app. Specifically, when animations are applied to the toast view (UIToastView), the tap gesture no longer works.
To help identify the root cause, I created a minimal reproducible example (MRE) with under 500 lines of code, demonstrating the behavior:
Demo GIF: Screen Recording
Code Repo: ToastDemo
What I Tried:
Using a separate UIWindow to present the toast overlay.
Adding a tap gesture directly to the UIView.
Referencing related solutions:
A Blog Post explaining UIWindow usage in SwiftUI - https://www.fivestars.blog/articles/swiftui-windows (Sorry, Apple Dev Forum will not allow a link to this)
A Stack Overflow thread on handling touch events in multiple windows.
Problem Summary:
When animations are involved (fade in, slide up), taps on the toast are not recognized.
Without animations, taps work as expected.
UIWindow setup seems correct, so I’m wondering if animation effects are interfering with event propagation.
I could potentially work around this by restructuring the touch handling, but I'd love insight from the community on why this happens, or if there’s a cleaner fix.
Edit: Well, this is embarrassing. It looks like I didn't research this thoroughly enough: animations block UIView tap events. I found a solution using DispatchQueue.
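For anyone hitting the same thing: besides the DispatchQueue workaround, passing .allowUserInteraction to the animation options is another way to keep taps working, since UIView animations disable touch delivery to the animated view by default. A rough sketch (toastView here stands in for the real UIToastView instance):
UIView.animate(withDuration: 0.3,
               delay: 0,
               options: [.curveEaseOut, .allowUserInteraction],
               animations: {
                   // Fade/slide the toast in while leaving it tappable.
                   toastView.alpha = 1
                   toastView.transform = .identity
               },
               completion: nil)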
I'm currently working on implementing a character limit for Korean text input using UITextField, but I've encountered two key issues.
1. How can I determine if Korean input is complete?
I understand that markedTextRange represents provisional (composing) text during multistage text input systems (such as Korean, Japanese, Chinese).
While testing with Korean input, I expected markedTextRange to reflect the composing state.
However, it seems that markedTextRange remains nil throughout the composition process.
2. Problems limiting character count for Korean input
I’ve tried two methods to enforce a character limit. Both lead to incorrect behavior due to how Korean characters are composed.
Method 1 – Before replacement:
func textField(_ textField: UITextField, shouldChangeCharactersIn range: NSRange, replacementString string: String) -> Bool {
    guard let text = textField.text else { return true }
    return text.count <= 5
}
This checks the text length before applying the replacementString.
The issue is that when the user enters a character that is meant to combine with the previous one to form a composed character, the input should result in a single, combined character.
However, because the character limit check is based on the state before the replacement is applied, the second character does not get composed as expected.
Method 2 – After change:
textField.addTarget(self, action: #selector(editingChanged), for: .editingChanged)

@objc private func editingChanged(_ sender: UITextField) {
    guard var text = sender.text else { return }
    if text.count > limitCount {
        text.removeLast()
        sender.text = text
    }
}
This removes the last character if the count exceeds the limit after the change.
But when a user keeps typing past the limit, the last character is overwritten by new input.
I suspect this happens because the .editingChanged event occurs before the multistage input is finalized,
and the final composed character is applied after that event.
My understanding of the input flow:
Standard input:
shouldChangeCharactersIn is called
replacementString is applied
.editingChanged is triggered
With multistage input (Korean, etc.):
shouldChangeCharactersIn is called
replacementString is applied
.editingChanged is triggered
Final composed character is inserted (after all the above)
Conclusion
Because both approaches lead to incorrect character count behavior with Korean input,
I believe I need a new strategy.
Is there an officially recommended way to handle multistage input properly with UITextField in this context?
Any advice or clarification would be greatly appreciated.
macOS 15.5 (24F74)
Xcode 16.4 (16F6)
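In case it helps frame answers: the pattern I have seen suggested most often in the community (not an official recommendation, and it may not fully solve the composition behavior described above) is to do the trimming in .editingChanged, skip it while marked text is present, and trim by prefix rather than removeLast so later keystrokes cannot silently replace the last character:
@objc private func editingChanged(_ sender: UITextField) {
    // Skip while a multistage (marked) composition is still in progress,
    // if the text input system reports one for this keyboard.
    guard sender.markedTextRange == nil else { return }
    guard let text = sender.text, text.count > limitCount else { return }
    // Trim to the limit instead of removing only the last character.
    sender.text = String(text.prefix(limitCount))
}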
Hi,
I have an iPhone app with one UIWindowScene and two UIWindows (mainWindow and alertWindow). The mainWindow hosts the whole app and is allowed to rotate. The alertWindow shows alerts to the user at the top of the screen, and I do not want its content to rotate.
I thought I could do:
override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
    return .portrait
}
and
override var shouldAutorotate: Bool {
    return false
}
In the root view controller of alertWindow, but after making those changes the root view controller of mainWindow does not rotate until I perform some navigation.
I have thought about using two UIWindowScenes (one per UIWindow), but as far as I know an iPhone app only supports one UIWindowScene.
So, how can I avoid rotation in the view controller of alertWindow without losing rotation in the root view controller of mainWindow?
My view controller is a UIHostingController, so I also tried to prevent rotation from my SwiftUI view, but I did not find a solution there either.
Thank you in advance
I have a problem with the URL schemes under iOS 18. Data is being sent from one app to another app. The amount of data varies. It can sometimes be more than 5 MB.
With iOS 18, errors often occur when sending large amounts of data. The error message is: "Failed to open URL asynchronously".
If I send the data once again in this case, it works.
To reproduce the error quickly, I wrote two small apps.
AppA sends data to AppB. AppB calls AppA and AppA sends data to AppB again. The whole thing runs in an endless loop.
Code snippet:
// AppA
// The file to which fileUrl points contains a 4 MB string.
// The string consists of only one letter: "AAAAAA..."
let dataStr = try String(contentsOf: fileUrl, encoding: .utf8)
if let url = URL(string: "appb://receive?data=\(dataStr)") {
    UIApplication.shared.open(url, options: [:]) { (result) in
        if !result {
            os_log("can't open url", type: .error)
        }
    }
}

// AppB
DispatchQueue.main.asyncAfter(deadline: .now() + 2.0) {
    if let returnUrl = URL(string: "appa://return") {
        UIApplication.shared.open(returnUrl)
    }
}
If the test is started, the error occurs approximately 15-20 times per hour.
The first error occurs very quickly if the device is restarted prior to this.
As soon as the error occurs, we end up in
os_log("can't open url", type: .error)
I know about the possibility of exchanging the data via App Groups, but we cannot use it in our case.
Tested with following devices:
// The error occurs:
iPhone 11 with iOS 18.4.1
iPhone SE with iOS 18.5
// The error does not occur
iPhone 8 with iOS 16.7.10
iPhone 16 simulator on a M1 MacBook (macOS 15.4.1)
Unfortunately, there is no other error message in the Console app except "Failed to open URL asynchronously".
There were no problems at this point between iOS 12 and iOS 17.
My question now is: are there new limitations on URL schemes under iOS 18, or is this a bug?
Hello!
I wanted to see if someone with more UIKit experience than me can point me in the right direction for conditionally adding and deleting a row in a UITableView.
What I Want to Accomplish
I have a tip slider with percentages (0%-20%) and a custom option at the end. When the custom option is tapped, I want a row with a UITextField to appear immediately below it. When another option, say 10%, is tapped, I want the text field row to go away.
Can someone explain how this would work and, if possible, provide an example?
Thank you!
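Not an authoritative answer, but here is a minimal sketch of the usual approach: keep a flag for whether the custom row is visible, make numberOfRowsInSection reflect that flag, and insert or delete the row inside performBatchUpdates when the selection changes. The class and cell identifier names below are made up for illustration, and the cells are assumed to be registered in the storyboard:
import UIKit

final class TipTableViewController: UITableViewController {
    private var isCustomSelected = false

    override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        // Row 0: the tip selector. Row 1 (only when "Custom" is chosen): the text field.
        return isCustomSelected ? 2 : 1
    }

    override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        // Identifiers are illustrative; actual cell setup is omitted.
        let identifier = indexPath.row == 0 ? "SelectorCell" : "CustomAmountCell"
        return tableView.dequeueReusableCell(withIdentifier: identifier, for: indexPath)
    }

    // Call this from the tip control's action when the user picks an option.
    func customOptionChanged(isCustom: Bool) {
        guard isCustom != isCustomSelected else { return }
        isCustomSelected = isCustom
        let textFieldRow = IndexPath(row: 1, section: 0)
        tableView.performBatchUpdates {
            if isCustom {
                tableView.insertRows(at: [textFieldRow], with: .automatic)
            } else {
                tableView.deleteRows(at: [textFieldRow], with: .automatic)
            }
        }
    }
}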
I have a UIPageViewController embedded in a UIScrollView, and each page has a drawing view with a UIPanGestureRecognizer for free-drawing. With this setup, the first time I attempt to draw, the pan gesture is ignored. It works the second time I perform the gesture.
In my case I need to wrap the UIPageViewController in a UIScrollView to get a pull-to-refresh mechanism (by setting the scrollView.refreshControl).
I’ve tried every combination of UIGestureRecognizerDelegate methods (shouldRecognizeSimultaneously…, require(toFail:), etc.) with no luck.
This is my view hierarchy:
ScrollView
|- UIPageViewController
|- Page 1
| |- DrawingView with UIPanGestureRecognizer
|- Page 2
|- DrawingView with UIPanGestureRecognizer
Is this a known limitation when a UIPageViewController is nested inside another scroll view?
Reproduction steps (tested on iOS 18.4 / Xcode 16.3, iPhone 16 Pro)
Launch the app; the first page shows a white canvas in the bottom part.
Try to draw immediately → nothing happens.
Lift your finger and draw again → works.
Here is a link for the sample project with the reproducible code: https://github.com/marcod-storyteller/page-controller-sample
P.S: If the UIPageViewController has a .pageCurl transition style instead, the problem disappears.
Hello,
I'm currently working on my first SceneKit game and have encountered an issue related to moving an SCNNode using a UIPanGestureRecognizer.
When I deploy the game to my iPhone via Xcode in debug mode, all interactions are smooth. However, when I stop the debugging session and run the game directly from the device (outside of Xcode), the SCNNode movement behaves inconsistently: sometimes it is smooth and sometimes the interaction becomes choppy. The SCNNode movement is controlled using a UIPanGestureRecognizer.
Do you have any ideas what might be causing the issue?
When I create a tab group for the sidebar on iPad, the title and disclosure triangle act like a single control. Every time I tap the section title, the disclosure triangle for that section activates and hides or exposes that section's children and actions.
I want the section title to behave like Photos, where tapping a section title just displays its view controller, and the disclosure triangle is a separate control that must be tapped to hide and show children and actions.
I did not see any delegate methods that would let me control this behavior. Is this supported?
I have used the following code for years to add a right bar button item to the navigation bar, but for some unknown reason, this no longer works. It stopped working when I updated my app to have Scene support. I don't understand what is preventing this code from working.
@interface ViewController ()

@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    NSLog(@"viewDidLoad");

    // Add a Share Button
    UIBarButtonItem *shareButton;
    shareButton = [[UIBarButtonItem alloc] initWithBarButtonSystemItem:UIBarButtonSystemItemAction target:self action:@selector(editProject:)];
    self.navigationItem.rightBarButtonItem = shareButton;
    self.navigationItem.rightBarButtonItem.tintColor = [UIColor blueColor];
}

- (void)editProject:(id)sender {
}

@end
To test this, I created a brand new test app that does nothing except for attempting to add this button. The autogenerated code gives you the following project and I simply modified the ViewController class as shown above:
What do I need to do differently to make the right bar button item display? I know that I can add buttons using the storyboard that can be controlled via IBOutlets, but I just want to know if it's still possible to do this programmatically.
When I minimize an app on an extended display and then unplug the extended display, the app crashes.
These simple steps make every iOS app running on a Mac crash. Please fix it.
I first applied a snapshot on the main thread like this:
var snapshot = NSDiffableDataSourceSnapshot<Section, MessageViewModel>()
snapshot.appendSections([.main])
snapshot.appendItems([], toSection: .main)
dataSource.applySnapshotUsingReloadData(snapshot)
After loading data, I applied the snapshot again using:
Task { @MainActor in
    await dataSource.applySnapshotUsingReloadData(snapshot)
}
On an iPhone 13 mini, I received the following warning:
Warning: applying updates in a non-thread confined manner is dangerous and can lead to deadlocks. Please always submit updates either always on the main queue or always off the main queue
However, this warning did not appear when I ran the same code on an iPhone 16 Pro simulator.
Can anyone explain it to me? Thank you
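My reading of that warning (an interpretation, not an official statement) is exactly what its text says: every apply must be submitted consistently from one queue, either always the main queue or always a background queue, and the iPhone 16 Pro simulator run probably just did not hit the mismatch. A sketch of funneling every apply through a single main-actor helper (the helper name is mine) so updates always come from the main queue:
@MainActor
private func applySnapshot(_ snapshot: NSDiffableDataSourceSnapshot<Section, MessageViewModel>) {
    // Both the initial, empty apply and the post-load apply go through here,
    // so every update is submitted from the main queue.
    dataSource.applySnapshotUsingReloadData(snapshot)
}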
I found the following statement on the site https://developer.apple.com/documentation/technotes/tn3187-migrating-to-the-uikit-scene-based-life-cycle:
"Soon, all UIKit based apps will be required to adopt the scene-based life-cycle, after which your app won’t launch if you don’t. While supporting multiple scenes is encouraged, only adoption of scene life-cycle is required."
Could you please clarify when exactly apps will no longer be able to launch if they do not adopt the scene-based life-cycle? I would like to confirm the deadline as the impact of this change is significant.