The .onReceive construction isn't working with the button's long-press gesture, though tap works and increases the count by 1. The construction should increase the count every 0.5 seconds while the button is pressed.
struct Test: View {
    @State private var count = 0
    @State private var isLongPressed = false

    var body: some View {
        VStack {
            Text("Count: \(count)")
            Button("Increase") {
                count += 1
                print("Button tapped")
            }
            .onLongPressGesture {
                isLongPressed = true
                print("Long press started")
            }
            .onLongPressGesture(minimumDuration: 0) {
                isLongPressed = false
                print("Long press ended")
            }
        }
        .onReceive(Timer.publish(every: 0.5, on: .main, in: .common).autoconnect()) { _ in
            if isLongPressed {
                count += 1
                print("Count increased: \(count)")
            }
        }
    }
}
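For reference, one common restructuring (a sketch, not from the original post; assumes iOS 16+ for the onPressingChanged variant) is to track the pressed state with a single long-press gesture's onPressingChanged closure instead of two competing long-press gestures:

import SwiftUI

struct AutoRepeatCounter: View {
    @State private var count = 0
    @State private var isPressing = false

    var body: some View {
        VStack {
            Text("Count: \(count)")
            // A plain view avoids Button's own tap handling competing
            // with the long-press gesture.
            Text("Increase")
                .padding()
                .onTapGesture { count += 1 }
                .onLongPressGesture(minimumDuration: 0.3) {
                    // Long press recognized; repeating is driven by the timer below.
                } onPressingChanged: { pressing in
                    isPressing = pressing
                }
        }
        .onReceive(Timer.publish(every: 0.5, on: .main, in: .common).autoconnect()) { _ in
            if isPressing { count += 1 }
        }
    }
}

onPressingChanged reports true on touch-down and false on release, so the timer-driven increment starts and stops with the press.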
Before iOS 18, when a two-finger pinch was reduced to a single finger, the printed scale value stopped changing; when a second finger was placed back down, pinch-to-zoom resumed and the printed scale changed again. After iOS 18, performing the same operation (two fingers, lift one, then switch back to two fingers), the printed scale no longer changes.
Code here:
- (void)pinchGesture:(UIPinchGestureRecognizer *)recognizer {
    // Note: _activeEvents is a private key, read via KVC.
    NSSet<UIEvent *> *events = [recognizer valueForKey:@"_activeEvents"];
    UIEvent *event = [events anyObject];
    if (recognizer.state == UIGestureRecognizerStateBegan) {
        NSLog(@"---- begin finger count: %lu scale: %f", (unsigned long)event.allTouches.count, recognizer.scale);
        recognizer.scale = 1.0;
    } else if (recognizer.state == UIGestureRecognizerStateChanged) {
        NSLog(@"---- change finger count: %lu scale: %f", (unsigned long)event.allTouches.count, recognizer.scale);
        // recognizer.scale = 1.0;
    }
}
[Log image for iOS 17.7]
[Log image for iOS 18.0.2]
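As an aside, the finger count is also available without reading the private _activeEvents key; a Swift sketch (not from the original post) using the public numberOfTouches property of the recognizer:

@objc func handlePinch(_ recognizer: UIPinchGestureRecognizer) {
    switch recognizer.state {
    case .began:
        // numberOfTouches is public API; no KVC needed.
        print("begin finger count: \(recognizer.numberOfTouches) scale: \(recognizer.scale)")
        recognizer.scale = 1.0
    case .changed:
        print("change finger count: \(recognizer.numberOfTouches) scale: \(recognizer.scale)")
    default:
        break
    }
}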
Using a button that is placed in the bottom ornament to set focus on a text field will not display the keyboard properly, while a button embedded in the view will behave as expected.
To demonstrate the issue, simply run the attached project on Vision Pro with visionOS 1.1 and tap the Toggle 2 button in the bottom ornament. You'll see that the field does have focus, but the keyboard is not visible.
Run the same test with Toggle 1 and the field will get focus and the keyboard will show as expected.
import SwiftUI
import RealityKit
import RealityKitContent

struct ContentView: View {
    @State private var text = ""
    @State private var showKeyboard = false
    @FocusState private var focusedField: FocusField?

    private enum FocusField: Hashable {
        case username
        case password
    }

    var body: some View {
        VStack {
            TextField("Test", text: $text)
                .focused($focusedField, equals: .username)
            Text("Entered Text: \(text)")
                .padding()
            Button("Toggle 1") { // This button will work and show the keyboard
                if focusedField != nil {
                    focusedField = nil
                } else {
                    focusedField = .username
                }
            }
            Spacer()
        }
        .padding()
        .toolbar {
            ToolbarItem(placement: .bottomOrnament) {
                Button("Toggle 2") { // This button will set focus properly but not show the keyboard
                    if focusedField != nil {
                        focusedField = nil
                    } else {
                        focusedField = .username
                    }
                }
            }
        }
    }
}
Is there a way to work around this?
FB13641609
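One workaround worth trying (an untested sketch, not from the original post) is to defer the focus change until after the ornament's button interaction has completed:

Button("Toggle 2") {
    if focusedField != nil {
        focusedField = nil
    } else {
        // Untested assumption: deferring the focus change to the next
        // run-loop pass may give the system a chance to show the keyboard.
        DispatchQueue.main.async {
            focusedField = .username
        }
    }
}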
I'm testing Group Activities and having no trouble iOS-to-iOS, or starting an activity on macOS and joining via iOS. However, when I start an activity and then try to join it from another macOS client, the starting side joins the session just fine, but the receiving side acts like I don't have the required app, even when it is already running.
I see the active SharePlay icon in the menu bar, and the Current Activity is shown, but instead of an "Open" button there is a "MyApp Required" string and a "View" button that goes to the App Store. (Where the app is not available yet, as expected, since I'm still working on it.) There is no GroupSession started on that Mac yet, obviously.
I'm looking for any hints to help debug what is going on. How does Group Activities find the app for the activity on macOS and how can I figure out why it isn't finding mine?
Thanks!
class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .white
        do {
            let shapeLayer = CAShapeLayer()
            shapeLayer.frame = CGRect(x: 50, y: 100, width: 200, height: 108)
            let path = UIBezierPath(roundedRect: shapeLayer.bounds, cornerRadius: 36)
            shapeLayer.path = path.cgPath
            shapeLayer.fillColor = UIColor.orange.cgColor
            view.layer.addSublayer(shapeLayer)
        }
        do {
            let layer = CALayer()
            layer.backgroundColor = UIColor.blue.cgColor
            layer.cornerRadius = 36
            layer.frame = CGRect(x: 50, y: 300, width: 200, height: 108)
            view.layer.addSublayer(layer)
        }
    }
}
The corner radius is set to 36 through the CAShapeLayer path, but the rendered radius looks larger than 36, close to half of the height. Setting cornerRadius directly on a CALayer renders as expected.
Can anyone explain this to me? Thank you.
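For comparison, UIBezierPath(roundedRect:cornerRadius:) draws continuous (squircle-style) corners whose curves extend farther along the edges than a circular arc of the same radius, which is likely why the result looks larger. A sketch (not from the original post) that builds the path with CGPath instead, which uses plain circular arcs:

let circularLayer = CAShapeLayer()
circularLayer.frame = CGRect(x: 50, y: 500, width: 200, height: 108)
// CGPath's roundedRect uses true circular arcs for the corners,
// so the rendered radius matches the requested 36 points.
circularLayer.path = CGPath(roundedRect: circularLayer.bounds,
                            cornerWidth: 36, cornerHeight: 36,
                            transform: nil)
circularLayer.fillColor = UIColor.purple.cgColor
view.layer.addSublayer(circularLayer)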
I'm not quite sure where the problem is, but I will describe what I am doing to recreate the issue, and am happy to provide whatever information I can to be more useful.
I am changing the ActivationPolicy for my app in order to make it unobtrusive when in the background (e.g. hiding it from the dock and using only a menu bar status item). When the user activates the app with a hotkey, it changes from NSApplicationActivationPolicyAccessory back to NSApplicationActivationPolicyRegular. This allows normal usage (dock icon, menu bar, etc.)
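For context, the toggle amounts to something like this (a minimal Swift sketch of the policy switch described above, not the author's actual code):

import AppKit

func enterAccessoryMode() {
    NSApp.setActivationPolicy(.accessory)
    NSApp.hide(nil)
}

func enterRegularMode() {
    NSApp.setActivationPolicy(.regular)
    NSApp.unhide(nil)
    NSApp.activate(ignoringOtherApps: true)
}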
This works fine, except in a rare situation which I finally just tracked down. If there is a window open in the app and I use the hotkey to convert back to an accessory, and then disconnect and reconnect the display on which the app was previously displayed, when I convert the app back to "regular mode", the menu bar has disappeared (and I am left with an empty space at the top of the screen). I can also trigger this bug by having the display in question briefly mirror the other display (effectively "orphaning" the hidden app), and then restoring the original side-by-side configuration before activating the app again.
The app otherwise works, but the menu bar is missing. Switching back and forth with other apps does not fix the problem. Quitting and restarting the app resolves the issue. As does disabling the accessory only mode and forcing the app to always remain in "regular mode" with a dock icon (there is a preference for this in my app). Once fixed, I can then re-enable the "accessory mode" and all is well until the bug is triggered again.
The bug would normally occur quite sporadically, presumably requiring a particular combination of changing Spaces or displays, or having the computer go to sleep while this app was in accessory mode. Thus far, the above is the only way I have found that can replicate this issue on demand.
If I close all windows before hiding the app, then it works fine when I revert to "regular mode". It only happens if there is a window open at the time.
Using applicationDidChangeScreenParameters: on my AppDelegate indicates that there is a change in screen, and logging window.screen.frame for each open window in [NSApp orderedWindows] shows that the size changes from e.g. 1920x1080 to 0x0 and back while the display is disconnected or mirrored.
There is also an error in the console in Xcode when this happens -- invalid display identifier <some UUID>.
I have tried various options for window collectionBehavior, as well as various settings for Spaces (which I normally use). None of these changes has fixed the behavior thus far.
I use [NSApp hide:self]; from my AppDelegate to hide the app, and [[NSRunningApplication currentApplication] activateWithOptions:NSApplicationActivateAllWindows];[NSApp unhide:self]; to bring it back to the front.
I welcome any ideas for things to chase down, or requests for more specific information that would be useful.
Thank you!
Fletcher
We are currently trying to adopt the newly introduced find bar in our app.
The app:
The app is a text editor with a main text view. However, it includes nested views (for text like footnotes) that are presented as modal sheets: you tap on a footnote within the main text, and a form sheet is presented with the contents of the footnote ready to be edited. We have an existing search implementation, but we are eager to move to the system-provided UI. Connecting the find bar to our existing implementation through a custom UIFindSession works without any issues.
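(For reference, the wiring looks roughly like this; a sketch with illustrative names, where MyCustomFindSession is a hypothetical UIFindSession subclass backed by our search implementation:)

import UIKit

class EditorViewController: UIViewController, UIFindInteractionDelegate {
    let textView = UITextView()
    lazy var findInteraction = UIFindInteraction(sessionDelegate: self)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(textView)
        textView.addInteraction(findInteraction)
    }

    func findInteraction(_ interaction: UIFindInteraction,
                         sessionFor view: UIView) -> UIFindSession? {
        // MyCustomFindSession (hypothetical) bridges the system find bar
        // to the app's existing search implementation.
        return MyCustomFindSession()
    }
}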
The Problem:
Searching for text should work not only in the main text view but also in nested text (like footnotes). Let's say I have a text containing the word "iPhone" both in the main text and in a footnote. In our existing implementation, stepping from one search match to the next would open the modal and highlight the match in the nested text. The keyboard would stay open.
With the new UIFindInteraction this does not work, however. As soon as a modal form sheet is presented, the find interaction closes. Looking at the stack trace, I can see a private class called UIInputWindowController that cleans up input accessory views after the modal is presented. I believe it is causing the find panel to give up its first-responder state. I noticed that opening popovers appears to work fine.
Is there a way to alter the presentation of the nested text so that the view is either not modal or able to preserve the current find session? Or is this unsupported behavior, and should we look for a different approach?
The thing that really confuses me is that this appears to work without issue in Notes.app, where the find bar is implemented as well. Multiple views can be presented there while the find bar is open; Move Note is one of them. That view appears as a modal sheet, yet it keeps the find bar open and active, though its tint color matches the deactivated tint of the main Notes view. The find bar remains functional, with the text field active and the overlay updating in the background. This behavior appears to be a bug in the Notes app, but it is exactly what we want for our use case.
I attached some images: two are from the Notes app, two from a test project demonstrating the problem. Opening a modal view closes the find bar there.
I'm writing Swift code to change the app icon using setAlternateIconName, invoked from Flutter via a MethodChannel.
UIApplication.shared.setAlternateIconName(iconName) { error in
    if let error = error {
        print("Error setting alternate icon: \(error.localizedDescription)")
        // Send the error back to Flutter
        result(FlutterError(code: "ICON_CHANGE_ERROR", message: error.localizedDescription, details: nil))
    } else {
        print("App icon changed successfully!")
        result(nil) // Success!
    }
}
But on iOS 17+ I get the error "The requested operation couldn't be completed because the feature is not supported."
So, is setAlternateIconName still available?
PS: In Xcode, code completion shows that setAlternateIconName is not deprecated.
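For what it's worth, UIKit exposes a capability check that reports exactly this condition; a sketch (assuming the same result callback as above):

guard UIApplication.shared.supportsAlternateIcons else {
    result(FlutterError(code: "ICON_CHANGE_UNSUPPORTED",
                        message: "Alternate icons are not supported in this configuration",
                        details: nil))
    return
}
UIApplication.shared.setAlternateIconName(iconName) { error in
    // handle the result as before
}

One common cause of the check failing is the alternate icons not being declared under CFBundleIcons / CFBundleAlternateIcons in the Info.plist.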
Hi Team,
We are an AAER for Apple and have received multiple requests from customers with BYOD iOS devices who want to block screenshots and screen sharing for specific work apps, such as Outlook, on those devices. Is there a specific AppConfig plist for Outlook, or an .ipa file, that would allow us to achieve this?
I am trying to support dragging a 'file' object out of my app into Finder on macOS. My object conforms to Transferable, and the files are saved locally on disk, so I just want to pass the URL. This works fine when dragging out to other apps, like Notes or Mail, but not into Finder. I set up a ProxyRepresentation as well, as suggested by another thread, but it doesn't seem to help. Is there any other setup I need to do in the Xcode project for this to work, or is there something else I'm missing?
@available(iOSApplicationExtension 17.0, macOSApplicationExtension 14.0, *)
extension FileAttachments: Transferable {
    public static var transferRepresentation: some TransferRepresentation {
        FileRepresentation(exportedContentType: UTType.content) { content in
            SentTransferredFile(content.fullFileURL(), allowAccessingOriginalFile: false)
        }
        .exportingCondition { file in
            if let fileUTI = UTType(filenameExtension: file.fullFileURL().pathExtension), let fileURL = file.fullFileURL() {
                print("FileAttachments: FileRepresentation exportingCondition fileUTI: \(fileUTI) for file: \(fileURL)")
                return fileUTI.conforms(to: UTType.content)
            }
            return false
        }
        .suggestedFileName { $0.fileRenamedName }

        ProxyRepresentation { file in
            if let fileURL = file.fullFileURL() {
                print("FileAttachments: ProxyRepresentation returning file")
                return fileURL
            }
            return file.fullFileURL()!
        }
    }
}
I need to convert user input to HID key codes. While for English this is pretty easily done, for other languages, not so much. On the Mac there is the UCKeyTranslate function, but on iOS I couldn't find anything like it.
Is there a good way to achieve this?
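In case it helps frame the question: without a UCKeyTranslate equivalent, the only fallback I'm aware of is a per-layout lookup table; a rough Swift sketch (assumption: US layout only, names illustrative, requires iOS 13.4+ for UIKeyboardHIDUsage):

import UIKit

struct HIDKeyStroke {
    let usage: UIKeyboardHIDUsage
    let shift: Bool
}

// Manual per-layout table; every supported layout needs its own.
let usLayout: [Character: HIDKeyStroke] = [
    "a": HIDKeyStroke(usage: .keyboardA, shift: false),
    "A": HIDKeyStroke(usage: .keyboardA, shift: true),
    "1": HIDKeyStroke(usage: .keyboard1, shift: false),
    "!": HIDKeyStroke(usage: .keyboard1, shift: true),
    // ... extend for the rest of the layout
]

func hidStrokes(for text: String) -> [HIDKeyStroke] {
    text.compactMap { usLayout[$0] }
}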
Below is my sample code. On the Home page, when I tap "show sheet", the sheet is presented and the StateObject inside the sheet is initialized once. However, when I tap "show fullscreen" and then tap "show sheet" inside the full-screen page, the sheet gets initialized twice.
However, if I remove navigationDestination, this issue does not occur.
This problem causes the network request in the sheet page to be triggered multiple times.
Can someone tell me the reason?
enum TestRouter: String, Hashable {
    case test

    var targetView: some View {
        Text("test")
    }

    var title: String {
        return "test title"
    }
}

@MainActor
struct NavigationInnerView<Content>: View where Content: View {
    var contentView: () -> Content

    @MainActor public init(@ViewBuilder contentView: @escaping () -> Content) {
        self.contentView = contentView
    }

    var body: some View {
        NavigationStack {
            contentView()
                .navigationDestination(for: TestRouter.self) { route in
                    route.targetView
                }
        }
        .navigationViewStyle(StackNavigationViewStyle())
    }
}

struct ContentView: View {
    @State var showFullScreen: Bool = false
    @State var showSheet: Bool = false

    var contentView: some View {
        VStack {
            VStack {
                Text("Home")
                Button {
                    showFullScreen = true
                } label: {
                    Text("show fullscreen")
                }
                Button {
                    showSheet = true
                } label: {
                    Text("show sheet")
                }
            }
        }
    }

    var body: some View {
        NavigationInnerView {
            contentView
                .fullScreenCover(isPresented: $showFullScreen) {
                    NavigationInnerView {
                        FullScreenContentView()
                    }
                }
                .sheet(isPresented: $showSheet) {
                    NavigationInnerView {
                        SheetContentView()
                    }
                }
        }
    }
}

class FullScreenViewModel: ObservableObject {
    @Published var content: Bool = false

    init() {
        print("Full Screen ViewModel init")
    }
}

struct FullScreenContentView: View {
    @Environment(\.dismiss) var dismiss
    @State var showSheet: Bool = false
    @StateObject var viewModel: FullScreenViewModel = .init()

    init() {
        print("Full screen view init")
    }

    var body: some View {
        VStack {
            Text("FullScreen")
            Button {
                dismiss()
            } label: {
                Text("dismiss")
            }
            Button {
                showSheet = true
            } label: {
                Text("show sheet")
            }
        }
        .sheet(isPresented: $showSheet) {
            NavigationInnerView {
                SheetContentView()
            }
        }
    }
}

class SheetViewModel: ObservableObject {
    @Published var content: Bool = false

    init() {
        print("SheetViewModel init")
    }
}

struct SheetContentView: View {
    @Environment(\.dismiss) var dismiss
    @StateObject var viewModel = SheetViewModel()

    init() {
        print("sheet view init")
    }

    var body: some View {
        Text("Sheet")
        Button {
            dismiss()
        } label: {
            Text("dismiss")
        }
    }
}

#Preview {
    ContentView()
}
We have an iOS app and we use the same app for Mac Catalyst.
On iOS we are able to detect when the user takes a screenshot using UIApplicationUserDidTakeScreenshotNotification, but in Mac Catalyst it is not working.
Per the docs, UIApplicationUserDidTakeScreenshotNotification and UIScreenCapturedDidChangeNotification are both supported on Mac Catalyst 13.1+, but I am not getting screenshot notifications from either.
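For reference, this is the kind of observer involved (a minimal Swift sketch, not the app's actual code):

let observer = NotificationCenter.default.addObserver(
    forName: UIApplication.userDidTakeScreenshotNotification,
    object: nil,
    queue: .main
) { _ in
    // Fires on iOS when the user takes a screenshot.
    print("Screenshot taken")
}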
Hello!
I have a navigation destination that is a TabView, where each tab item is a ScrollView. When the scrolling content of any tab item is underneath the navigation bar, the bar's background is always hidden, but at the same time the tab bar's background toggles correctly depending on the scrolling content's position.
I expected it to work with TabView the same as with any other view.
Is it supposed to work like that?
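For concreteness, a minimal reproduction of the structure described (a sketch, not from the original post):

import SwiftUI

struct TabsDestination: View {
    var body: some View {
        // Pushed as a navigation destination; each tab hosts a ScrollView.
        TabView {
            ScrollView {
                LazyVStack {
                    ForEach(0..<50) { i in Text("Row \(i)") }
                }
            }
            .tabItem { Label("First", systemImage: "1.circle") }

            ScrollView {
                LazyVStack {
                    ForEach(0..<50) { i in Text("Row \(i)") }
                }
            }
            .tabItem { Label("Second", systemImage: "2.circle") }
        }
        .navigationTitle("Tabs")
    }
}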
The aim is to save the data of a program in two different formats of choice, say type1 (the default) and type2.
No problem when + (BOOL)autosavesInPlace is NO: you can Save As… and get a choice.
No problem when + (BOOL)autosavesInPlace is YES and you created a new document: you can choose when saving.
But you do not get a choice when you created the new file by duplicating an existing file; it takes the type of the latter.
(Using dataOfType:error:, but I did not find a solution using writeToURL:ofType:error:, duplicateDocument:, etc. either.)
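One avenue that might be worth probing (an untested Swift sketch with hypothetical type identifiers): NSDocument asks writableTypes(for:) which types the save panel should offer, so forcing both types there might restore the choice for duplicated documents:

override func writableTypes(for saveOperation: NSDocument.SaveOperationType) -> [String] {
    // Hypothetical UTIs: always offer both formats, regardless of how
    // the document was created.
    return ["com.example.type1", "com.example.type2"]
}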
I'm having this problem with this code:
let docPicker = UIDocumentPickerViewController(forOpeningContentTypes: [
    .pdf
])
docPicker.delegate = self
docPicker.modalPresentationStyle = .currentContext
view.window?.rootViewController?.present(docPicker, animated: true, completion: nil)
but then when I run the app in the Simulator and tap the button that calls the method containing this code, I cannot pick a PDF document.
Testing on BrowserStack with real devices works, but it's a very slow process. Why can't I get this working in the Simulator?
Could an Apple employee who works on SwiftUI please explain the update() func in the DynamicProperty protocol? The docs have ambiguous information, e.g.:
"Updates the underlying value of the stored value."
and
"SwiftUI calls this function before rendering a view’s body to ensure the view has the most recent value."
From: https://developer.apple.com/documentation/swiftui/dynamicproperty/update()
How can it both set the underlying value and get the most recent value? What does "underlying value" mean? What does "stored value" mean?
E.g., is the code below correct?
struct MyProperty: DynamicProperty {
    var x = 0

    mutating func update() {
        // get x from external storage
        x = storage.loadX()
    }
}
Or should it be:
struct MyProperty: DynamicProperty {
    let x: Int

    init(x: Int) {
        self.x = x
    }

    func update() {
        // set x on external storage
        storage.save(x: x)
    }
}
This has always been a mystery to me because of the ambiguous docs, so I thought it was time to post a question.
Hello.
I have a scenario where a hover effect is shown for a button that is disabled. Usually this doesn't happen, but when you wrap the button in a Menu it doesn't work properly.
Here is some example code:
struct ContentView: View {
    var body: some View {
        NavigationStack {
            Color.green
                .toolbar {
                    ToolbarItem(placement: .topBarTrailing) {
                        Menu("Menu") {
                            Button("Disabled Button") {}
                                .disabled(true)
                                .hoverEffectDisabled() // This doesn't work.
                            Button("Enabled Button") {}
                        }
                    }
                }
        }
    }
}
And here is what it looks like:
This looks like a SwiftUI bug. Any help is appreciated, thank you!
I am wondering how to change the measurement units shown on screen in my CPMapTemplate. In my screenshot below the distance is in miles, but how can I change that to kilometers?
Does this need to come from my route data?
I am not seeing this anywhere in the CarPlay programming guide or in the documentation.
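If it does come from the route data, the relevant piece would be the travel estimates; a sketch (assuming an existing mapTemplate and trip, and that CarPlay displays the unit supplied in the Measurement):

import CarPlay

let distance = Measurement(value: 12.3, unit: UnitLength.kilometers)
let estimates = CPTravelEstimates(distanceRemaining: distance,
                                  timeRemaining: 15 * 60)
// mapTemplate and trip are assumed to exist already.
mapTemplate.update(estimates, for: trip, with: .default)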
Starting from the macOS Sequoia 15.2 release, apps crash with the following call stack when adding static text controls. The first call to [NSTextField setStringValue:] causes the crash:
0 libobjc.A.dylib 0x19f2f5820 objc_msgSend + 32
1 AppKit 0x1a3355460 -[NSCell _objectValue:forString:errorDescription:] + 144
2 AppKit 0x1a3355348 -[NSCell setStringValue:] + 48
3 AppKit 0x1a33af9fc -[NSControl setStringValue:] + 104
4 AppKit 0x1a3d1f190 -[NSTextField setStringValue:] + 52
It happens on specific MacBook Pro models (16-inch MacBook Pro).
Crash analysis found that the isa pointer of the NSCell object was corrupted. Enabling Zombies while debugging did not surface any issue, and we also tried Address Sanitizer. Since the issue started with a recent release of macOS, could a change in AppKit in the recent releases be triggering the crash? Any help or suggestions would be appreciated.