I am making a Swift app that supports multiple languages, showing the proper UI language according to the user's phone language setting. I want to launch a different screen (showing a different image, boot-en.jpg or boot-ja.jpg) depending on the language. I created two launch screen files, LaunchScreen-en.storyboard and LaunchScreen-ja.storyboard, localized them, and added a different UIImage to each.
Then I created two InfoPlist.strings files, configured with
"UILaunchStoryboardName" = "LaunchScreen_en";
"UILaunchStoryboardName" = "LaunchScreen_ja";
and then configured Info.plist with
UILaunchStoryboardName
LaunchScreen
After all these steps I build and run, hoping to see the launch screen show boot-ja.jpg when the phone's language is Japanese and boot-en.jpg when it is English, but it shows a black screen. How can I fix this problem? Thank you.
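For reference, the intended per-language configuration would look roughly like this. Note that the storyboard files are named with hyphens (LaunchScreen-en.storyboard) while the strings values use underscores (LaunchScreen_en); the value of UILaunchStoryboardName must match the storyboard's file name (minus the extension), so the names need to be made consistent for the override to resolve:

```
// en.lproj/InfoPlist.strings
"UILaunchStoryboardName" = "LaunchScreen-en";

// ja.lproj/InfoPlist.strings
"UILaunchStoryboardName" = "LaunchScreen-ja";

// Info.plist (base fallback)
<key>UILaunchStoryboardName</key>
<string>LaunchScreen</string>
```

Also keep in mind that iOS caches the rendered launch screen per language, so a stale (or black) screen can persist until the app is deleted and reinstalled.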
Hi,
WWDC24 videos have a lot of references to an "Image Playground" API, and the "What's New in AppKit" session even shows it in action, with a "ImagePlaygroundViewController". However, there doesn't seem to be any access to the new API, even with Xcode 16.2 beta. Am I missing something, or is that 'coming later'?
Hello. I am having trouble trying to create a control in the Control Center view that opens a specific view in my app, using AppIntents (iOS 18) and WidgetKit. Can someone help me, please? I feel like the original videos and documentation are incomplete.
When a custom tool item set is formed in PKToolPicker, with each inking item having a non-nil identifier created using PKToolPickerInkingItem(type:, color:, width:, identifier:), changing the color or width of an inking item (pen, pencil, etc.) causes an instant crash.
I believe it is a bug in PencilKit.
It seems that when you change the color or width of an inking item that has an identifier in the squeeze tool palette, it tries to find a tool item without an identifier (the default tool picker has items without identifiers) in the tool item set. I guess it cannot find one, so the find function returns either -1 or the highest integer (2^63 - 1), and it uses this number as an index without bounds checking. That's why we observe [__NSArrayM replaceObjectAtIndex:withObject:]: index 9223372036854775807 beyond bounds [0 .. 9].
I filed a report on Feedback Assistant with ID FB15519801 too.
The corresponding part in crash report is as follows:
0 CoreFoundation 0x183e0908c __exceptionPreprocess + 164 (NSException.m:249)
1 libobjc.A.dylib 0x18110b2e4 objc_exception_throw + 88 (objc-exception.mm:356)
2 CoreFoundation 0x183de4048 -[__NSArrayM replaceObjectAtIndex:withObject:] + 1020 (NSArrayM.m:180)
3 PencilKit 0x1c44f73c8 -[PKToolPicker _setSelectedTool:saveState:updateUI:updateLastSelectedTool:] + 800
(PKToolPicker.m:587)
4 PencilKit 0x1c45a5684 -[PKPencilSqueezeControllerPaletteViewDelegateProxy paletteView:didSelectTool:atIndex:] + 200 (PKPencilSqueezeControllerPaletteViewDelegateProxy.m:227)
5 PencilKit 0x1c460906c -[PKSqueezePaletteView _didSelectTool:atIndex:] + 196 (PKSqueezePaletteView.m:441)
6 PencilKit 0x1c462203c -[PKSqueezePaletteViewExpandedInkingToolLayout _didTapStrokeWeightButton:] + 336
(PKSqueezePaletteViewExpandedInkingToolLayout.m:224)
7 UIKitCore 0x18691edd8 -[UIApplication sendAction:to:from:forEvent:] + 100 (UIApplication.m:5797)
8 UIKitCore 0x18691ecb0 -[UIControl sendAction:to:forEvent:] + 112 (UIControl.m:942)
9 UIKitCore 0x18691eb00 -[UIControl _sendActionsForEvents:withEvent:] + 324 (UIControl.m:1013)
10 UIKitCore 0x187080568 -[UIButton _sendActionsForEvents:withEvent:] + 124 (UIButton.m:4192)
11 UIKitCore 0x187081d7c -[UIControl touchesEnded:withEvent:] + 400 (UIControl.m:692)
12 UIKitCore 0x1868675b0 -[UIWindow _sendTouchesForEvent:] + 852 (UIWindow.m:3313)
and the exception reason is
*** -[__NSArrayM replaceObjectAtIndex:withObject:]: index 9223372036854775807 beyond bounds [0 .. 9]
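Until the bug is fixed, one hedged workaround sketch consistent with the diagnosis above is to build the custom tool set without explicit identifiers, so the squeeze palette's internal lookup (which appears to expect identifier-less items) can locate the tool:

```swift
import PencilKit

// A sketch, assuming the crash is tied to the explicit identifier:
// create the inking items without the identifier: parameter and let
// PencilKit manage identity itself. Function name is hypothetical.
func makeCustomToolPicker() -> PKToolPicker {
    let pen = PKToolPickerInkingItem(type: .pen, color: .black, width: 5)
    let pencil = PKToolPickerInkingItem(type: .pencil, color: .blue, width: 3)
    // PKToolPicker(toolItems:) is the iOS 18 initializer for a custom item set.
    return PKToolPicker(toolItems: [pen, pencil])
}
```

The trade-off is losing your own stable identifiers for state restoration, but it may avoid the out-of-bounds lookup until the framework fix lands.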
Hi, I have a problem that I can't solve, and I hope you can help me:
When I switch tabs or scroll down, the background color of the tab bar changes automatically.
This happened to me with Xcode 16.0.
Can you help me, please?
I have two questions --
1) How can I prevent a modal from being dismissed when the app enters the background?
2) I have a modal I'm presenting that gets dismissed, seemingly at random, if it's displayed within the first several seconds of app launch, but it stays displayed indefinitely otherwise. No other code is calling dismiss, and none of the UIAdaptivePresentationControllerDelegate dismissal methods get called. What else could cause a modal presentation to be dismissed like that?
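For what it's worth, a minimal sketch of the public hooks around modal dismissal, which blocks interactive (swipe-down) dismissal and logs attempts; the class name DetailViewController is hypothetical:

```swift
import UIKit

// A sketch: isModalInPresentation prevents interactive dismissal,
// but not programmatic dismiss() calls or system-initiated teardown.
final class DetailViewController: UIViewController, UIAdaptivePresentationControllerDelegate {
    override func viewDidLoad() {
        super.viewDidLoad()
        isModalInPresentation = true
        presentationController?.delegate = self
    }

    // Fires when the user swipes down while isModalInPresentation == true.
    func presentationControllerDidAttemptToDismiss(_ presentationController: UIPresentationController) {
        print("User attempted to dismiss the modal")
    }

    // Fires after a completed interactive/programmatic dismissal.
    func presentationControllerDidDismiss(_ presentationController: UIPresentationController) {
        print("Modal was dismissed")
    }
}
```

If none of these callbacks fire yet the modal disappears, that points away from the presentation-controller path (e.g. a window or view-controller hierarchy change early in launch) rather than an ordinary dismiss.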
Hello! I hope you are all doing well. I'm asking about Share Sheet behavior: I'm seeing a double Share Sheet, where on iOS 14.6, when I trigger the download of 2 files, 2 share pop-ups open (2 Share Sheets), but on iOS 17.6 the sheet only opens once per file. Do you know if this changed at some point between iOS versions, and why?
I leave an example image of the behavior in iOS 14.6:
How can I test biometrics in UI Tests in Swift / iOS 18? This code is not working.
+ (void)successfulAuthentication {
notify_post("com.apple.BiometricKit_Sim.fingerTouch.match");
notify_post("com.apple.BiometricKit_Sim.pearl.match");
}
+ (void)unsuccessfulAuthentication {
notify_post("com.apple.BiometricKit_Sim.fingerTouch.nomatch");
notify_post("com.apple.BiometricKit_Sim.pearl.nomatch");
}
I'm using React Native to create a mobile application. When I tap a button in my app, I need to programmatically take a screenshot of the current page of my application together with the iPhone status bar that shows the time, cellular provider, and battery level. However, the page is captured without the status bar.
My 'screenshot taken' function is written in Objective-C.
Is this happening because of any privacy-related concerns?
Would you kindly assist me with this?
Attaching the screenshot code,
#import <UIKit/UIKit.h>
#import <React/RCTBridgeModule.h>
#import <React/RCTLog.h>
@interface ScreenshotModule : NSObject
@end
@implementation ScreenshotModule
RCT_EXPORT_MODULE();
RCT_REMAP_METHOD(takeStatusBarScreenshot, resolver:(RCTPromiseResolveBlock)resolve rejecter:(RCTPromiseRejectBlock)reject)
{
dispatch_async(dispatch_get_main_queue(), ^{
@try {
// Get the status bar window
UIWindow *statusBarWindow = [UIApplication sharedApplication].windows.firstObject;
UIScene *scene = [UIApplication sharedApplication].connectedScenes.allObjects.firstObject;
if ([scene isKindOfClass:[UIWindowScene class]]) {
UIWindowScene *windowScene = (UIWindowScene *)scene;
BOOL statusBarHidden = windowScene.statusBarManager.isStatusBarHidden;
if (statusBarHidden) {
NSLog(@"Status bar is hidden, app is in full-screen mode.");
} else {
NSLog(@"Status bar is visible.");
}
} else {
NSLog(@"The scene is not a UIWindowScene.");
}
// Check if the statusBarWindow is valid
if (!statusBarWindow) {
reject(@"screenshot_failed", @"Status bar window not found", nil);
return;
}
// Get the window scene and status bar frame
UIWindowScene *windowScene = statusBarWindow.windowScene;
CGRect statusBarFrame = windowScene.statusBarManager.statusBarFrame;
// Log the status bar frame for debugging
RCTLogInfo(@"Status Bar Frame: %@", NSStringFromCGRect(statusBarFrame));
// Check if the status bar frame is valid
if (CGRectIsEmpty(statusBarFrame)) {
reject(@"screenshot_failed", @"Status bar frame is empty", nil);
return;
}
// Start capturing the status bar
UIGraphicsBeginImageContextWithOptions(statusBarFrame.size, NO, [UIScreen mainScreen].scale);
CGContextRef context = UIGraphicsGetCurrentContext();
if (!context) {
UIGraphicsEndImageContext();
reject(@"screenshot_failed", @"Could not get graphics context", nil);
return;
}
// Render the window's layer. Note: -renderInContext: only captures the app's
// own window content; the system status bar is drawn by a separate process
// and is not part of the app's window hierarchy, which is why it is missing
// from the capture.
[statusBarWindow.layer renderInContext:context];
// Create an image from the current context
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
if (!image) {
reject(@"screenshot_failed", @"Failed to capture screenshot", nil);
return;
}
// Convert the image to PNG format and then to a base64 string
NSData *imageData = UIImagePNGRepresentation(image);
if (imageData == nil) {
reject(@"screenshot_failed", @"Image data is nil", nil);
return;
}
NSString *base64String = [imageData base64EncodedStringWithOptions:0];
// Log base64 string length for debugging
RCTLogInfo(@"Base64 Image Length: %lu", (unsigned long)[base64String length]);
// Optionally, save the image to a file (for debugging purposes)
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"statusbar_screenshot.png"];
[UIImagePNGRepresentation(image) writeToFile:path atomically:YES];
RCTLogInfo(@"Status bar screenshot saved to: %@", path);
// Resolve with the base64 image
resolve(base64String);
}
@catch (NSException *exception) {
reject(@"screenshot_error", @"Error while capturing status bar screenshot", nil);
}
});
}
@end
I'm following the video tutorial below, using the exact examples, but was not able to semantically match the results:
https://developer.apple.com/videos/play/wwdc2024/10131
https://developer.apple.com/documentation/corespotlight/building-a-search-interface-for-your-app
In iOS 18 and macOS 15 and later, Spotlight also supports semantic searches of your content, in addition to lexical matching of a search term.
I'm on macOS 15.1, so I'd expect it to work now. Or does this depend on Apple Intelligence for some reason?
Specifically I've indexed the following:
Keyword: "windsurfing carmel"
Literal match:
the best windsurfing carmel county
windsurfing lessons
Semantic match:
sailboarding lessons
the best windsurfing carmel county
windsurfing lessons
Expected: find semantic match.
Actual: only literal matches were returned.
Because CSUserQuery.prepare is only supported on macOS 15, my switch from CSSearchQuery makes no sense without the semantic-search benefits.
Did I miss something? I also added the Core Spotlight delegate extension as directed but was not able to hit the breakpoint shown in the video. I wish there were sample code for this, but I couldn't find any.
I am trying to convert a string field to an integer field in our database schema. However, the custom migration that I write doesn't seem to run.
My Model
//Run 1
//typealias Book = BookSchemaV1.Book
//Run 2
typealias Book = BookSchemaV2.Book
// MARK: - Migration Plan
enum BookModelMigrationPlan: SchemaMigrationPlan {
static var schemas: [any VersionedSchema.Type] = [
BookSchemaV1.self,
BookSchemaV2.self
]
static var stages: [MigrationStage] = [migrateV1toV2]
static var oldBooks: [BookSchemaV1.Book] = []
static let migrateV1toV2 = MigrationStage.custom(
fromVersion: BookSchemaV1.self,
toVersion: BookSchemaV2.self,
willMigrate: nil,
didMigrate: { context in
oldBooks = try context.fetch(FetchDescriptor<BookSchemaV1.Book>())
oldBooks.forEach { oldBook in
do {
let newBook = BookSchemaV2.Book(
bookID: String(oldBook.bookID),
title: oldBook.title
)
context.insert(newBook)
context.delete(oldBook)
try context.save()
} catch {
print("New model not saved")
}
}
}
)
}
// MARK: - Schema Versions
enum BookSchemaV1: VersionedSchema {
static var models: [any PersistentModel.Type] = [Book.self]
static var versionIdentifier = Schema.Version(1, 0, 0)
@Model
final class Book {
@Attribute(.unique) var bookID: Int
var title: String
init(
bookID: Int,
title: String
) {
self.bookID = bookID
self.title = title
}
}
}
enum BookSchemaV2: VersionedSchema {
static var models: [any PersistentModel.Type] = [Book.self]
static var versionIdentifier = Schema.Version(2, 0, 0)
@Model
class Book {
@Attribute(.unique) var bookID: String
var title: String
init(
bookID: String,
title: String
) {
self.bookID = bookID
self.title = title
}
}
}
@MainActor
class AppDataContainer {
static let shared = AppDataContainer()
let container: ModelContainer
private init() {
do {
let schema = Schema([Book.self])
let config = ModelConfiguration(schema: schema)
container = try ModelContainer(for: schema, migrationPlan: BookModelMigrationPlan.self, configurations: [config])
} catch {
fatalError("Could not create ModelContainer: \(error)")
}
}
}
A lot of apps just produce a black image (sometimes with a logo) when you take a screenshot inside them. It appears the UITextField trick most of them used no longer works in iOS 18. How can you achieve this?
In visionOS, I have been trying to implement this view as a background for an information view, but I cannot find any information about it anywhere. Does anyone know what this is called, or any workaround to achieve this look?
It appears that starting with macOS Sequoia, Quick Look preview extensions no longer load MapKit maps correctly. Map tiles do not appear, leaving users with a beige background.
Users report that polylines do render correctly, but annotations appear black.
This was previously working fine in prior macOS versions including Sonoma.
STEPS TO REPRODUCE
Create a macOS app project, with an associated document.
Ensure project has a Quick Look preview extension, with necessary basic setups.
Ensure that the extension mentioned in (2) has an MKMapView. No other cosmetic changes need to be implemented to observe the base issue. Do note that, in addition to the map tiles not loading, annotations reportedly don't render correctly either.
I have a status bar app in Swift. Under Sonoma the app works fine;
after upgrading to Sequoia, I can't get keyboard input into the NSTextField, even though the NSTextField shows the focus ring.
The keyboard input goes every time to another window, like Xcode, Safari, or the Desktop,
depending on what was last active before I selected a menu item in the status bar app's menu, whether it shows a view or an NSAlert with an NSTextField.
I have spent several hours in debug sessions; no KeyDown or KeyUp event from the keyboard arrives.
As I said, in Sonoma this was not a problem and the app worked as expected.
To me this looks like a bug in Sequoia.
But maybe someone here has an idea on this topic.
Somebody help me, please.
I'm trying to set a specific time for a notification, which works nicely, but difficulties appear when you need a little more functionality. I'd like to make it repeat, for example every hour. I know that UNCalendarNotificationTrigger has a repeats value, but when you set repeats to true it remembers the date components, e.g. .minute, and then just repeats the notification every time that minute comes around!
I'm looking for a solution to set a notification at a specific time (e.g. 5:00 pm) and then repeat it every hour (6, 7, 8, 9 pm).
Maybe it's easy, but I feel stuck 😕
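One way to get the behavior described, sketched under the assumption that a fixed set of hours is acceptable: register one calendar trigger per hour (17 through 21), each matching hour and minute so it repeats daily at that time. The function and identifier names are hypothetical:

```swift
import UserNotifications

// A sketch: a single repeating UNCalendarNotificationTrigger keyed only on
// .minute would fire every hour of every day at that minute, which is the
// problem described above. Registering one trigger per hour avoids it.
func scheduleHourlyNotifications(startHour: Int = 17, count: Int = 5) {
    let center = UNUserNotificationCenter.current()
    for offset in 0..<count {
        var components = DateComponents()
        components.hour = startHour + offset   // 17, 18, 19, 20, 21
        components.minute = 0

        let content = UNMutableNotificationContent()
        content.title = "Reminder"

        // repeats: true with hour + minute set means "every day at this time".
        let trigger = UNCalendarNotificationTrigger(dateMatching: components,
                                                    repeats: true)
        let request = UNNotificationRequest(identifier: "hourly-\(offset)",
                                            content: content,
                                            trigger: trigger)
        center.add(request)
    }
}
```

Note the 64-pending-request limit per app; for a handful of fixed hours this approach stays well within it.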
I want to simulate the pressing of Fn (Globe) + Control + arrow keys combo, but I’m encountering an issue where the Fn modifier seems to be ignored, and the system behaves as if only Control + arrow keys are pressed.
In macOS, the combination of Fn + Right Arrow (key code 124) is treated as End (key code 119), and even that didn't behave as expected. Instead of moving the window to the right edge of the screen on Sequoia, it switches to the next space, which is the default behavior for Control + Right Arrow.
Demo:
(I included Fn + Control + C to show that centering the window works for example.)
import SwiftUI
@main
struct LittleKeypressDemo: App {
var body: some Scene {
Window("Keypress Demo", id: "keypress-demo") {
ContentView()
}
.windowStyle(.hiddenTitleBar)
.windowResizability(.contentSize)
.windowBackgroundDragBehavior(.enabled)
}
}
struct ContentView: View {
var body: some View {
VStack(spacing: 20) {
KeyPressButton(icon: "arrowtriangle.right.fill", keyCode: 124)
KeyPressButton(icon: "arrow.down.to.line", keyCode: 119)
KeyPressButton(label: "C", keyCode: 8)
}
.padding()
}
}
struct KeyPressButton: View {
let icon: String?
let label: String?
let keyCode: CGKeyCode
init(icon: String? = nil, label: String? = nil, keyCode: CGKeyCode) {
self.icon = icon
self.label = label
self.keyCode = keyCode
}
var body: some View {
Button(action: { simulateKeyPress(keyCode) }) {
HStack {
Image(systemName: "globe")
Image(systemName: "control")
if let icon = icon {
Image(systemName: icon)
} else if let label = label {
Text(label)
}
}
}
.buttonStyle(.bordered)
.controlSize(.large)
}
}
func simulateKeyPress(_ keyCode: CGKeyCode) {
let fnKey = VirtualKey(keyCode: 63, flags: .maskSecondaryFn)
let controlKey = VirtualKey(keyCode: 59, flags: [.maskControl, .maskSecondaryFn])
let targetKey = VirtualKey(keyCode: keyCode, flags: [.maskControl, .maskSecondaryFn])
[fnKey, controlKey, targetKey].forEach { $0.pressAndRelease() }
}
struct VirtualKey {
let keyCode: CGKeyCode
let flags: CGEventFlags
func pressAndRelease() {
postKeyEvent(keyDown: true)
postKeyEvent(keyDown: false)
}
private func postKeyEvent(keyDown: Bool) {
guard let event = CGEvent(keyboardEventSource: nil, virtualKey: keyCode, keyDown: keyDown) else { return }
event.flags = flags
event.post(tap: .cghidEventTap)
}
}
Expected behavior:
Simulating the key combo Fn + Control + Right Arrow on macOS Sequoia should move the current window to the right half of the screen, instead of switching to the next desktop space.
Questions:
Is CGEventFlags.maskSecondaryFn enough to simulate the Fn key in combination with Control and the arrow keys? Are there alternative approaches or workarounds to correctly simulate this behavior? What's the obvious thing I'm missing?
(Btw., window management is completely irrelevant here. I’m specifically asking about simulating these key presses.)
Any insights or suggestions would be greatly appreciated.
Thank you.
On testing my app with tvOS 18, I have noticed the Siri Remote back button no longer provides system-provided behavior when interacting with tab bar controller pages. Instead of moving focus back to the tab bar when pressed, the back button will close the app, as if the Home button was pressed. This occurs both on device and in the Simulator.
Create tvOS project with a tab bar controller.
Create pages/tabs which contain focusable items (ie. buttons)
Scroll down to any focusable item (ie. a button or UICollectionView cell)
Hit the Siri Remote back button and compare against the expected behavior below.
Expected behavior: System-provided behavior should move focus back to the tab bar at the top of the screen.
Actual results: App is closed and user is taken back to the Home Screen.
Has anyone else noticed this behavior?
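If the system behavior really has regressed, one hedged workaround sketch (class and member names here are hypothetical) is to intercept the Menu press on a tab's content controller and redirect focus to the tab bar manually instead of letting the press fall through as an exit:

```swift
import UIKit

// A sketch: catch the Siri Remote Menu press with a gesture recognizer,
// then point preferredFocusEnvironments at the tab bar for one focus update.
// Caveat: intercepting Menu here also suppresses the default exit behavior
// for this controller, so scope it carefully.
final class PageViewController: UIViewController {
    private var shouldFocusTabBar = false

    override var preferredFocusEnvironments: [UIFocusEnvironment] {
        if shouldFocusTabBar, let tabBar = tabBarController?.tabBar {
            return [tabBar]
        }
        return super.preferredFocusEnvironments
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        let menuTap = UITapGestureRecognizer(target: self, action: #selector(menuPressed))
        menuTap.allowedPressTypes = [NSNumber(value: UIPress.PressType.menu.rawValue)]
        view.addGestureRecognizer(menuTap)
    }

    @objc private func menuPressed() {
        shouldFocusTabBar = true
        setNeedsFocusUpdate()
        updateFocusIfNeeded()
        shouldFocusTabBar = false
    }
}
```

This is a stopgap; if the tvOS 18 behavior is a framework bug, a Feedback Assistant report is still the right long-term move.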
Even on iOS 18 (iOS 16 and 17 also repro) we are seeing a crash in the FamilyActivityPicker when users tap "Other", or just at random times. We see the following debug message, but no other way of identifying the issue in code:
[u 3C8AF272-DC4E-55C4-B8C6-34826D2BEB5B:m (null)] [com.apple.FamilyControls.ActivityPickerExtension(1150.1)] Connection to plugin invalidated while in use.
Even with the most basic implementation of FamilyActivityPicker (example below) we can reproduce this crash consistently.
Big applications (think Opal) see the same issue, but they seem to see it less, and they are able to intercept the disconnect in order to show an error to the user. My two questions are:
How can we intercept this crash/disconnect in order to alert our user and restart the experience?
Is this EVER gonna get fixed properly?
Usage Example:
var body: some View {
NavigationView {
ZStack {
familyPickerErrorView
.opacity(isHidden ? 0 : 1)
.onAppear {
DispatchQueue.main.asyncAfter(deadline: .now() + 2) {
withAnimation {
isHidden = false
}
}
}
VStack {
Color.clear
.frame(height: 1)
.background(Color(UIColor.systemBackground))
FamilyActivityPicker(
headerText: "Select Apps To Be Blocked (Maximum of 50)",
footerText: "Want to block Safari? Check our FAQs",
selection: $familySelection)
.ignoresSafeArea(.all)
}
}
}
.toolbar {
ToolbarItem(placement: .navigationBarLeading) {
Button(action: {
isPresented = false
}) {
Text("Cancel")
.foregroundColor(.black)
}
}
ToolbarItem(placement: .navigationBarTrailing) {
Button(action: {
isPresented = false
}) {
Text("Done")
}
}
}
.navigationBarTitleDisplayMode(.inline)
.alert(isPresented: $showAlert) {
Alert(title: Text("Family Activity Picker Issue"), message: Text(alertMessage), dismissButton: .default(Text("OK")))
}
.onAppear {
isPresented = true
}
}
I'd like to share an app's screen in two modes. First in a standard mirroring mode and second in an "additional content" mode (very likely with a session role windowExternalDisplayNonInteractive).
I found that the Keynote app on iOS does a very nice example of what I want to achieve when sharing an iPhone using AirPlay to an AppleTV.
sharing a screen results in mirroring the screen on the TV
tapping the play button in Keynote switches to "additional content" where iPhone and TV show different content
leaving the additional content mode returns to "mirroring" where TV and iPhone show the same content
Is there an example for implementing such a feature?
I am able to successfully use the external display (windowExternalDisplayNonInteractive) and show additional content there.
How can I programmatically "detach" the additional content from the external display and activate mirroring mode?
Searching the Developer Forums for windowExternalDisplayNonInteractive reveals some discussions, which include valuable information, however, returning to mirror mode does not seem to be covered.
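One plausible route back to mirroring, sketched under the assumption that the external screen's content lives in its own UIScene session with the .windowExternalDisplayNonInteractive role: ask the system to destroy that scene session, after which the external display should fall back to the default mirroring behavior. `externalSession` is a hypothetical reference you would keep from the scene's connection delegate callback:

```swift
import UIKit

// A sketch: tearing down the non-interactive external scene session.
// Once no app scene claims the external display, the system reverts
// it to mirroring.
func detachAdditionalContent(externalSession: UISceneSession) {
    let options = UIWindowSceneDestructionRequestOptions()
    UIApplication.shared.requestSceneSessionDestruction(externalSession,
                                                        options: options,
                                                        errorHandler: { error in
        print("Failed to destroy external scene: \(error)")
    })
}
```

Re-entering "additional content" mode would then be a matter of activating a new session for the external UIWindowScene when the play action fires, mirroring Keynote's apparent behavior.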