I have an URGENT assignment with details as below:
I have an application, and it has an Extension UI.
When I select an item in a popup list in the standard UI of the application, how can I catch that selection event in the standard UI and update the Extension UI accordingly?
Is there any relationship between the Standard UI and the Extension UI on macOS?
Please share your experience with this technique.
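A minimal sketch of one common approach, assuming the popup is an NSPopUpButton and the standard UI and extension UI run in the same process: catch the selection with target-action and broadcast it via NotificationCenter so the extension UI can react. All names here (StandardUIController, ExtensionUIController, popupSelectionDidChange) are hypothetical. If the extension UI runs in a separate process, DistributedNotificationCenter or another IPC mechanism would be needed instead.

import AppKit

// Hypothetical notification name used to relay the selection to the extension UI.
extension Notification.Name {
    static let popupSelectionDidChange = Notification.Name("PopupSelectionDidChange")
}

final class StandardUIController: NSViewController {
    @IBOutlet weak var popupButton: NSPopUpButton!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Catch the selection event in the standard UI with target-action.
        popupButton.target = self
        popupButton.action = #selector(popupSelectionChanged(_:))
    }

    @objc private func popupSelectionChanged(_ sender: NSPopUpButton) {
        // Broadcast the selected title; the extension UI observes this notification.
        NotificationCenter.default.post(
            name: .popupSelectionDidChange,
            object: nil,
            userInfo: ["selectedTitle": sender.titleOfSelectedItem ?? ""]
        )
    }
}

final class ExtensionUIController: NSViewController {
    private var observation: NSObjectProtocol?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Update the extension UI whenever the standard UI reports a new selection.
        observation = NotificationCenter.default.addObserver(
            forName: .popupSelectionDidChange,
            object: nil,
            queue: .main
        ) { notification in
            let title = notification.userInfo?["selectedTitle"] as? String ?? ""
            print("Extension UI should now reflect the selection: \(title)")
        }
    }
}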
In our app, we let the user type in credit card numbers, and we want to enable autofill of credit card numbers saved in Safari.
We use a webview to open a link that leads to a PCI-compliant third-party component.
Specifically, we use WKWebView. If we used SFSafariViewController instead, the link would open in real Safari and the user would be able to autofill from Safari, but we use WKWebView.
My question is:
Is it possible to enable the user to autofill credit card numbers saved in Safari when using WKWebView?
Thanks!
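For reference, a minimal sketch of the SFSafariViewController fallback mentioned above, which does pick up Safari's saved-card autofill because the page is rendered by Safari itself; the URL here is a placeholder:

import SafariServices
import UIKit

final class PaymentViewController: UIViewController {
    // Placeholder URL standing in for the PCI-compliant third-party payment page.
    private let paymentURL = URL(string: "https://payments.example.com/checkout")!

    func presentPaymentPage() {
        // SFSafariViewController renders the page with Safari itself, so Safari's
        // saved credit cards and passwords are available for autofill.
        let safariViewController = SFSafariViewController(url: paymentURL)
        present(safariViewController, animated: true)
    }
}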
Has anyone been able to create a Control Center widget that opens a snippet view? There are stock Control Center widgets that do this, but I haven't been able to get it to work.
Here's what I tried:
struct SnippetButton: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(
            kind: "***.***.snippetWidget"
        ) {
            ControlWidgetButton(action: SnippetIntent()) {
                Label("Show Snippet", systemImage: "map.fill")
            }
        }
        .displayName(LocalizedStringResource("Show Snippet"))
        .description("Show a snippet.")
    }
}

struct SnippetIntent: ControlConfigurationIntent {
    static var title: LocalizedStringResource = "Show a snippet"
    static var description = IntentDescription("Show a snippet with some text.")

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog & ShowsSnippetView {
        return .result(dialog: IntentDialog("Hello!"), view: SnippetView())
    }
}

struct SnippetView: View {
    var body: some View {
        Text("Hello!")
    }
}
Apple's documentation pretty much only says this about ObservableObject: "A type of object with a publisher that emits before the object has changed. By default an ObservableObject synthesizes an objectWillChange publisher that emits the changed value before any of its @Published properties changes."
And this sample seems to behave the same way whether or not Contact conforms to the protocol:
import UIKit
import Combine

class ViewController: UIViewController {
    let john = Contact(name: "John Appleseed", age: 24)
    private var cancellables: Set<AnyCancellable> = []

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
        john.$age.sink { age in
            print("View controller's john's age is now \(age)")
        }
        .store(in: &cancellables)
        print(john.haveBirthday())
    }
}

class Contact {
    @Published var name: String
    @Published var age: Int

    init(name: String, age: Int) {
        self.name = name
        self.age = age
    }

    func haveBirthday() -> Int {
        age += 1
        return age
    }
}
Can I therefore omit conformance to ObservableObject every time I don't need the objectWillChange publisher?
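For comparison, a minimal sketch of what the conformance actually adds: @Published works on its own as a per-property publisher (as the sample above shows), while conforming to ObservableObject additionally synthesizes the objectWillChange publisher, which is what SwiftUI's @ObservedObject and @StateObject rely on.

import Combine

final class ObservableContact: ObservableObject {
    @Published var name: String
    @Published var age: Int

    init(name: String, age: Int) {
        self.name = name
        self.age = age
    }
}

let contact = ObservableContact(name: "John Appleseed", age: 24)
var cancellables: Set<AnyCancellable> = []

// Synthesized only because of the ObservableObject conformance:
// fires before either @Published property changes.
contact.objectWillChange.sink { _ in
    print("contact is about to change")
}
.store(in: &cancellables)

contact.age += 1 // prints "contact is about to change" first

In the sample above, nothing uses objectWillChange, which is why the behavior looks identical with or without the conformance.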
I went to update to the Apple Intelligence beta in macOS Sequoia 15.1 yesterday when it dropped, and I'm still waitlisted. The weird thing is, I had it on my Mac within a few minutes of downloading the beta, but then I restarted the Mac and it now says "joined waitlist," and it still does today.
My app needs to recognize screen gestures the hard way. I have working code for flick recognition (a combination of distance, direction, and velocity), but it's not as reliable as whatever Apple uses. Does anyone know exactly what defines a flick in iOS?
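Apple doesn't document its internal flick heuristics, but for comparison, a minimal sketch of the usual approach: use a UIPanGestureRecognizer and treat the gesture as a flick when the release velocity and travelled distance exceed thresholds. The 500 pt/s and 30 pt values are assumptions to tune, not Apple's numbers.

import UIKit

final class FlickDetectingView: UIView {
    // Tunable assumptions, not Apple's internal values.
    private let minimumSpeed: CGFloat = 500    // points per second at release
    private let minimumDistance: CGFloat = 30  // points travelled

    override init(frame: CGRect) {
        super.init(frame: frame)
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        addGestureRecognizer(pan)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    @objc private func handlePan(_ recognizer: UIPanGestureRecognizer) {
        guard recognizer.state == .ended else { return }
        let translation = recognizer.translation(in: self)
        let velocity = recognizer.velocity(in: self)
        let distance = hypot(translation.x, translation.y)
        let speed = hypot(velocity.x, velocity.y)

        if distance >= minimumDistance, speed >= minimumSpeed {
            // Direction is taken from the velocity vector at release.
            let direction = atan2(velocity.y, velocity.x)
            print("Flick detected; direction \(direction) rad, speed \(speed) pt/s")
        }
    }
}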
I have been using UIDocumentInteractionController and presentOptionsMenuFromRect for sharing PDFs for years.
Now I get these errors.
Only support loading options for CKShare and SWY types.
[ERROR] failed to get service endpoint creating for for item at URL
Collaboration: error loading metadata for documentURL:file:
I believe the files I create are fine. I have tried sharing them from a temporary folder and also from a subfolder on the iPad, but I still get these errors. Any help appreciated.
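Those log lines may be benign system noise, but if the options menu itself misbehaves, a sanity check worth trying is sharing through UIActivityViewController instead; a minimal sketch, with pdfURL standing in for the generated file:

import UIKit

extension UIViewController {
    // Minimal sketch: share a PDF with UIActivityViewController instead of
    // UIDocumentInteractionController. pdfURL is a placeholder for the generated file.
    func sharePDF(at pdfURL: URL, from sourceRect: CGRect, in sourceView: UIView) {
        let activityViewController = UIActivityViewController(
            activityItems: [pdfURL],
            applicationActivities: nil
        )
        // Required on iPad so the popover has an anchor.
        activityViewController.popoverPresentationController?.sourceView = sourceView
        activityViewController.popoverPresentationController?.sourceRect = sourceRect
        present(activityViewController, animated: true)
    }
}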
I have apps that request a route between two locations, then search for, filter, and display facilities near the route. The apps first request the route on a background thread and then, based on the route, search for facilities near certain locations on or near it. There may be multiple searches on the same route, each for a different location. Suitable results are then displayed on the map. The apps also do live updates.
However, since I switched to using NSURLSession to fetch the route on a background thread, not all suitable results/pins are displayed. Certain pins only show up upon the next didUpdateToLocation call.
So my question is: what is the best practice for syncing the results to the UI? Why do only some of the results show up on the UI while others don't?
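A common cause of pins appearing only on the next didUpdateToLocation call is updating MapKit from the URLSession completion queue, which is a background queue by default. A minimal sketch of the usual pattern, where annotations(for:) is a hypothetical parsing helper:

import MapKit

// Sketch of the search-and-display step. annotations(for:) is a hypothetical
// parser that turns the response payload into map annotations.
func searchFacilities(along route: MKRoute,
                      requestURL: URL,
                      session: URLSession,
                      mapView: MKMapView) {
    // URLSession completion handlers run on a background queue by default.
    let task = session.dataTask(with: requestURL) { data, _, error in
        guard let data = data, error == nil else { return }
        let newAnnotations = annotations(for: data)

        // All UIKit/MapKit updates must happen on the main queue; otherwise pins
        // may not appear until something else (like the next location update)
        // forces a redraw.
        DispatchQueue.main.async {
            mapView.addAnnotations(newAnnotations)
        }
    }
    task.resume()
}

func annotations(for data: Data) -> [MKPointAnnotation] {
    // Parsing omitted in this sketch.
    return []
}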
In my document-based app for macOS, I am using a storyboard. I create a custom menu and menu items in the storyboard. The menu is there, but the menu items remain inactive (grey).
In order to activate the menu, one has to use NSWindowDelegate and walk the menu tree to enable the menu items.
import Foundation
import Cocoa

class DocumentController: NSViewController, NSTextFieldDelegate {
    @IBOutlet var myOutlet: NSView!
    // ... code here ...
}

extension DocumentController: NSWindowDelegate {
    func windowDidBecomeMain(_ notification: Notification) {
        NSLog("windowDidBecomeMain - enable menu")
        if let customMenu = NSApp.mainMenu?.items[2].submenu {
            customMenu.item(at: 1)?.isEnabled = true
        }
    }

    func windowDidResignMain(_ notification: Notification) {
        NSLog("windowDidResignMain - disable menu")
    }
}
I connected the outlet (myOutlet) in the storyboard, but windowDidBecomeMain() never gets called. Why not?
Thanks for any help.
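Implementing NSWindowDelegate methods has no effect unless the controller is actually assigned as the window's delegate, which the storyboard connection alone does not do. A minimal sketch of the missing wiring, written as a compressed, hypothetical variant of DocumentController:

import Cocoa

// Compressed, hypothetical variant of DocumentController showing the missing wiring.
class WiredDocumentController: NSViewController, NSWindowDelegate {

    override func viewDidAppear() {
        super.viewDidAppear()
        // The storyboard does not make the view controller the window's delegate;
        // without this assignment, windowDidBecomeMain(_:) is never called.
        view.window?.delegate = self
    }

    func windowDidBecomeMain(_ notification: Notification) {
        NSLog("windowDidBecomeMain - enable menu")
        if let customMenu = NSApp.mainMenu?.items[2].submenu {
            customMenu.item(at: 1)?.isEnabled = true
        }
    }

    func windowDidResignMain(_ notification: Notification) {
        NSLog("windowDidResignMain - disable menu")
    }
}

An alternative that avoids walking the menu tree is to give the menu item a target/action that resolves to the controller and implement validateMenuItem(_:) from NSMenuItemValidation; note that while the menu's "Auto Enables Items" setting is on, manually set isEnabled values can be overridden by that automatic validation pass.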
I'm trying to build a Live Activity extension. I can successfully start my Live Activity via push notification. The problem is that when I start the Live Activity from my app, I can get pushTokenUpdates, since I control everything and run the for loop that reads pushTokenUpdates. But that code isn't called when I start the Live Activity with a push notification, since the system starts it automatically (maybe it is called, but I don't know when and where).
Where am I supposed to read pushTokenUpdates when I start a Live Activity via push notification, so that I can send them to my server?
The relevant code is below:
for await data in activity.pushTokenUpdates {
    let token = data.map { String(format: "%02x", $0) }.joined()
    Logger().log("Activity token: \(token)")
    // send token to the server
}
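One option, assuming the goal is simply to have the token loop running no matter who started the activity: observe Activity's static activityUpdates sequence, which delivers every activity for a given attributes type as it starts, including ones the system starts from a push-to-start notification, and attach a pushTokenUpdates loop to each one. A minimal sketch; MyWidgetAttributes is a placeholder for your attributes type:

import ActivityKit
import OSLog

@available(iOS 17.2, *)
func observeAllActivities() {
    Task {
        // Delivers every activity for MyWidgetAttributes as it starts, including
        // activities the system starts from a push-to-start notification.
        for await activity in Activity<MyWidgetAttributes>.activityUpdates {
            Task {
                // Each activity has its own stream of update tokens.
                for await tokenData in activity.pushTokenUpdates {
                    let token = tokenData.map { String(format: "%02x", $0) }.joined()
                    Logger().log("Activity \(activity.id) token: \(token)")
                    // send token to the server
                }
            }
        }
    }
}

Calling something like this early in the app's lifecycle (for example at launch) keeps the loop in place so the token is picked up regardless of who started the activity.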
I'm trying to develop a Live Activity Extension. The problem is, I can't get pushToStartToken. I'm able to get it when I start a Live Activity, but I can't when I don't start a Live Activity.
This function successfully generates the token:
private func startNewLiveActivity() async {
    guard #available(iOS 16.2, *) else { return }
    let attributes = MyWidgetAttributes(
        homeTeam: "Badger",
        awayTeam: "Lion",
        date: "12/09/2023"
    )
    let initialContentState = ActivityContent(
        state: MyWidgetAttributes.ContentState(
            homeTeamScore: 0,
            awayTeamScore: 0,
            lastEvent: "Match Start"
        ),
        staleDate: nil
    )
    guard let activity = try? Activity.request(
        attributes: attributes,
        content: initialContentState,
        pushType: .token
    ) else { return }

    if #available(iOS 17.2, *) {
        Task {
            for await data in Activity<MyWidgetAttributes>.pushToStartTokenUpdates {
                let token = data.map { String(format: "%02x", $0) }.joined()
                // THE DESIRED pushToStartToken IS GENERATED HERE
            }
        }
    }

    for await data in activity.pushTokenUpdates {
        let token = data.map { String(format: "%02x", $0) }.joined()
        // I send token to server here.
    }
}
But when I try to get pushToStartToken separately, without creating a live activity, it doesn't return any value:
private func getPushToStartToken() async {
    guard #available(iOS 17.2, *) else { return }
    Task {
        for await data in Activity<MyWidgetAttributes>.pushToStartTokenUpdates {
            let token = data.map { String(format: "%02x", $0) }.joined()
            // THIS DOESN'T GENERATE ANY TOKENS SINCE THE ACTIVITY IS NOT CREATED
        }
    }
}
It's the same as the title: when I checked the log, there were hundreds to thousands of lines with the following content.
The higher the page index of the PDF page from which letters are selected, the more log lines are recorded.
.notdef: no mapping.
.notdef: no mapping.
.notdef: no mapping.
.notdef: no mapping.
.notdef: no mapping.
.notdef: no mapping.
.
.
.
can't create CMap `Adobe-KR1-UCS2'. can't create CMap `Adobe-KR1-UCS2'.
can't create CMap `Adobe-KR1-UCS2'. can't create CMap `Adobe-KR1-UCS2'.
can't create CMap `Adobe-KR1-UCS2'. can't create CMap `Adobe-Identity-UCS2'.
can't create CMap `Adobe-KR1-UCS2'. can't create CMap `Adobe-KR1-UCS2'.
can't create CMap `Adobe-Identity-UCS2'. can't create CMap `Adobe-Identity-UCS2'.
can't create CMap `Adobe-Identity-UCS2'.
I don't know why this is happening. Is there a solution?
How can we performantly scroll to a target location using TextKit 2?
Hi everyone,
I'm building a custom text editor using TextKit 2 and would like to scroll to a target location efficiently. For instance, I would like to move to the end of a document seamlessly, similar to how users can do in standard text editors by using CMD + Down.
Background:
NSTextView and TextEdit on macOS can navigate to the end of large documents in milliseconds. However, after reading the documentation and experimenting with various ideas using TextKit 2's APIs, it's not clear how third-party developers are supposed to achieve this.
My Code:
Here's the code I use to move the selection to the end of the document and scroll the viewport to reveal the selection.
override func moveToEndOfDocument(_ sender: Any?) {
    textLayoutManager.ensureLayout(for: textLayoutManager.documentRange)
    let targetLocation = textLayoutManager.documentRange.endLocation
    let beforeTargetLocation = textLayoutManager.location(targetLocation, offsetBy: -1)!
    textLayoutManager.textViewportLayoutController.layoutViewport()
    guard let textLayoutFragment = textLayoutManager.textLayoutFragment(for: beforeTargetLocation) else {
        return
    }
    guard let textLineFragment = textLayoutFragment.textLineFragment(for: targetLocation, isUpstreamAffinity: true) else {
        return
    }
    let lineFrame = textLayoutFragment.layoutFragmentFrame
    let lineFragmentFrame = textLineFragment.typographicBounds.offsetBy(dx: 0, dy: lineFrame.minY)
    scrollToVisible(lineFragmentFrame)
}
While this code works as intended, it is very inefficient because ensureLayout(_:) is incredibly expensive and can take seconds for large documents.
Issues Encountered:
In my attempts, I have come across the following two issues.
Estimated Frames: The frames of NSTextLayoutFragment and NSTextLineFragment are approximate and not precise enough for scrolling unless the text layout fragment has been fully laid out.
Laying out all text is expensive: The frames become accurate once NSTextLayoutManager's ensureLayout(for:) method has been called with a range covering the entire document. However, ensureLayout(for:) is resource-intensive and can take seconds for large documents. NSTextView, on the other hand, accomplishes the same scrolling to the end of a document in milliseconds.
I've tried using NSTextViewportLayoutController's relocateViewport(to:) without success. It's unclear to me whether this function is intended for a use case like mine. If it is, I would appreciate some guidance on its proper usage.
Configuration:
I'm testing on macOS Sonoma 14.5 (23F79), Swift (AppKit), Xcode 15.4 (15F31d).
I'm working on a multi-platform project written in AppKit and UIKit, so I'm looking for either a single solution that works in both AppKit and UIKit or two solutions, one for each UI framework.
Question:
How can third-party developers scroll to a target location, specifically the end of a document, performantly using TextKit 2?
Steps to Reproduce:
The issue can be reproduced using the example project (download from link below) by following these steps:
Open the example project.
Run the example app on a Mac. The example app shows an uneditable text view in a scroll view. The text view displays a long text.
Press the "Move to End of Document" toolbar item.
Notice that the text view has scrolled to the bottom, but this took several seconds (~3 seconds on my MacBook Pro 16-inch, 2021). The duration will be shown in Xcode's log.
You can open the ExampleTextView.swift file and find the implementation of moveToEndOfDocument(_:). Comment out line 84 where the ensureLayout(_:) is called, rerun the app, and then select "Move to End of Document" again. This time, you will notice that the text view moves fast but does not end up at the bottom of the document.
You may also open the large-file.json in the project, the same file that the example app displays, in TextEdit, and press CMD+Down to move to the end of the document. Notice that TextEdit does this in mere milliseconds.
Example Project:
The example project is located on GitHub:
https://github.com/simonbs/apple-developer-forums/tree/main/how-can-we-performantly-scroll-to-a-target-location-using-textkit-2
Any advice or guidance on how to achieve this with TextKit 2 would be greatly appreciated.
Thanks in advance!
Best regards,
Simon
Hello,
I've noticed a few rare crashes with the following stacktrace reported on AppStore connect:
Hardware Model: iPhone16,2
AppStoreTools: 15F31e
AppVariant: 1:iPhone16,2:17.4
OS Version: iPhone OS 17.5.1 (21F90)
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: SIGNAL 6 Abort trap: 6
Terminating Process: <App> [15575]
Triggered by Thread: 0
Last Exception Backtrace:
0 CoreFoundation 0x1a3d38f20 __exceptionPreprocess + 164 (NSException.m:249)
1 libobjc.A.dylib 0x19bbbe018 objc_exception_throw + 60 (objc-exception.mm:356)
2 Foundation 0x1a323f868 -[NSAssertionHandler handleFailureInMethod:object:file:lineNumber:description:] + 188 (NSException.m:252)
3 CoreAutoLayout 0x1c4eabcc8 -[NSLayoutConstraint _setSymbolicConstant:constant:symbolicConstantMultiplier:] + 552 (NSLayoutConstraint.m:669)
4 CoreAutoLayout 0x1c4eab674 -[NSLayoutConstraint setConstant:] + 96 (NSLayoutConstraint.m:750)
5 <App> 0x10486d578 closure #1 in BaseChatTableViewCell.requestPreview(for:with:) + 540 (BaseChatTableViewCell+File.swift:162)
6 <App> 0x10486d73c thunk for @escaping @callee_guaranteed (@in_guaranteed URLRequest, @guaranteed NSHTTPURLResponse?, @guaranteed UIImage) -> () + 164 (<compiler-generated>:0)
7 <App> 0x104c4f814 __85-[UIImageView(AFNetworking) setImageWithURLRequest:placeholderImage:success:failure:]_block_invoke + 176 (UIImageView+AFNetworking.m:118)
8 <App> 0x104c3cc74 __78-[AFImageDownloader downloadImageForURLRequest:withReceiptID:success:failure:]_block_invoke.88 + 52 (AFImageDownloader.m:276)
9 libdispatch.dylib 0x1abbdc13c _dispatch_call_block_and_release + 32 (init.c:1530)
10 libdispatch.dylib 0x1abbdddd4 _dispatch_client_callout + 20 (object.m:576)
11 libdispatch.dylib 0x1abbec5a4 _dispatch_main_queue_drain + 988 (queue.c:7898)
12 libdispatch.dylib 0x1abbec1b8 _dispatch_main_queue_callback_4CF + 44 (queue.c:8058)
13 CoreFoundation 0x1a3d0b710 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 16 (CFRunLoop.c:1780)
14 CoreFoundation 0x1a3d08914 __CFRunLoopRun + 1996 (CFRunLoop.c:3149)
15 CoreFoundation 0x1a3d07cd8 CFRunLoopRunSpecific + 608 (CFRunLoop.c:3420)
16 GraphicsServices 0x1e8bb81a8 GSEventRunModal + 164 (GSEvent.c:2196)
17 UIKitCore 0x1a634090c -[UIApplication _run] + 888 (UIApplication.m:3713)
18 UIKitCore 0x1a63f49d0 UIApplicationMain + 340 (UIApplication.m:5303)
19 <App> 0x1047c6c20 main + 80 (main.m:11)
20 dyld 0x1c73b9e4c start + 2240 (dyldMain.cpp:1298)
It crashes on the last line of
let previewSize = BaseChatTableViewCell.getPreviewSize(from: imageSize, isMediaFile)
self.filePreviewImageViewHeightConstraint?.constant = previewSize.height
self.filePreviewImageViewWidthConstraint?.constant = previewSize.width
where previewSize is a CGSize. I am unable to reproduce the crash, nor am I able to understand why it crashes there. Anyone got an idea what could cause a crash on setting a constant?
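One plausible cause, given that the trace dies in an NSAssertionHandler inside -[NSLayoutConstraint _setSymbolicConstant:constant:...]: the constant being set is NaN or infinite, for example if getPreviewSize ever divides by a zero dimension. This is an assumption, not a confirmed diagnosis, but a defensive sketch like the following would turn the crash into an assertion you can log:

import UIKit

// Defensive sketch: apply a size to width/height constraints only when both
// dimensions are finite, non-negative numbers. A NaN or infinite constant trips
// an NSLayoutConstraint assertion and aborts just like the trace above.
func applyPreviewSizeSafely(_ previewSize: CGSize,
                            heightConstraint: NSLayoutConstraint?,
                            widthConstraint: NSLayoutConstraint?) {
    guard previewSize.width.isFinite, previewSize.height.isFinite,
          previewSize.width >= 0, previewSize.height >= 0 else {
        assertionFailure("Unexpected preview size: \(previewSize)")
        return
    }
    heightConstraint?.constant = previewSize.height
    widthConstraint?.constant = previewSize.width
}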
My app’s WidgetKit widgets are all crashing on iOS 18 beta 5. They were working just fine on earlier betas. This is happening across both Home and Lock Screen widgets. It's an EXC_BAD_ACCESS crash that seems to be happening deep within WidgetKit.
I've seen other developers posting about this on social media, so it's not just me. Wanted to get this flagged ASAP as it's very late in the beta cycle now...
Filed as FB14684253.
Third-party WidgetKit complications on watchOS 11 beta 5 are not appearing in the list of available complications. They have also disappeared from watch faces where they were installed. The exact same complications were working fine on earlier betas. This is happening on device, but not in simulator.
This issue may be related to FB14684253, which was fixed with the release of Xcode beta 5. However, Xcode beta 5 does not fix the issue on Apple Watch.
As a sanity check, I also tried with the Backyard Birds sample project, and the complications for that app aren't appearing on device either.
Filed as FB14689021.
I am writing to inquire if there is any way to programmatically check whether a user has enabled the “Large App Icon” mode in iOS 18. Our development team is working on optimizing our app’s user interface, and it would be beneficial to adapt the design based on this setting.
Any guidance on how to access this information, or if it’s even possible within the current iOS APIs, would be greatly appreciated.
Thank you for your time and assistance.
In a music streaming app, when using Activity.request to activate the Dynamic Island, the system’s Now Playing interface appears correctly. However, the app's live activities, lock screen, and other related features fail to display properly.
During debugging, the following code is used:
do {
    activity = try Activity.request(attributes: attributes, contentState: contentState, pushType: .token)
    if !NMABTestManager.default().is(inTest: "FH-NewLiveActivityPush") {
        // Listen to push token updates
        if activity != nil {
            tokenUpdatesTask?.cancel()
            tokenUpdatesTask = Task.detached {
                for await tokenData in self.activity!.pushTokenUpdates {
                    let mytoken = tokenData.map { String(format: "%02x", $0) }.joined().uppercased()
                    // pushToken is Data and needs to be converted to a String as above before being passed to the server
                    self.pushToken = mytoken
                }
            }
        }
    }
} catch let error {
    print("Error Starting Live Activity: \(error.localizedDescription)")
}
In this scenario, the push token is returned correctly, and no errors are triggered.
This issue did not occur in iOS 17 but appears sporadically in iOS 18. Once it occurs, it cannot be resolved through restarting or other means.
Feedback ID: FB14763873; I have uploaded my sysdiagnose.
My app uses Core Data and has enabled App Groups for data sharing between the App and Widget.
In my app, there's a Core Data entity called Task. As per the documentation's suggestion, I've separately implemented a TaskData struct that conforms to AppEntity. I've also implemented TaskDataQuery: EntityQuery, which includes a method called suggestedEntities. This method fetches all Tasks from the main context and uses Task.toTaskData.
Following the documentation, I've implemented the corresponding WidgetConfigurationIntent, which holds:
@Parameter(title: "Task")
var task: TaskData
as well as the corresponding AppIntentTimelineProvider to implement the provider.
I haven't encountered any retrieval issues on the simulator; everything works perfectly.
However, the problem arises when I deploy to a physical device. Users report that their widgets can't retrieve any data. Specifically, when users long-press the widget to set up a task, it remains in a Loading state, unable to fetch any Core Data.
I've looked through some resources, and it seems this might be a common issue with iOS 17?
How can I resolve this issue? Has anyone encountered this or can offer any suggestions? This has been troubling me for several days now, and it's causing my product to continually lose users. I'm really upset about it.
Any advice is welcome.
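One frequent culprit when widgets work in the simulator but stay stuck loading on device (an assumption here, since the store setup isn't shown): the widget process loads its own default store instead of the one in the App Group container, so suggestedEntities() fetches from an empty database. A minimal sketch that points both the app and the widget at the same store; the group identifier and model name are placeholders:

import CoreData

// Shared between the app target and the widget extension target.
// The App Group identifier and model name are placeholders.
final class SharedPersistence {
    static let shared = SharedPersistence()

    private static let appGroupID = "group.com.example.myapp"
    private static let modelName = "Model"

    let container: NSPersistentContainer

    private init() {
        container = NSPersistentContainer(name: Self.modelName)

        // Point the store at the App Group container so the app and the widget
        // read and write the same SQLite file.
        guard let groupURL = FileManager.default
            .containerURL(forSecurityApplicationGroupIdentifier: Self.appGroupID) else {
            fatalError("App Group \(Self.appGroupID) is not configured for this target")
        }
        let storeURL = groupURL.appendingPathComponent("\(Self.modelName).sqlite")
        container.persistentStoreDescriptions = [NSPersistentStoreDescription(url: storeURL)]

        container.loadPersistentStores { _, error in
            if let error = error {
                fatalError("Failed to load shared store: \(error)")
            }
        }
    }
}

TaskDataQuery's suggestedEntities() would then fetch from SharedPersistence.shared.container.viewContext in both targets, so the app and the widget see the same Tasks.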
I am developing an SDK and swizzling viewDidAppear.
I have a customer who implements custom tab bar navigation where the view controllers are all added to the hierarchy on first load and, from then on, only the opacity of the currently displayed tab is changed. So the next time the user sees a tab, viewDidAppear isn't called, which means my code isn't called either.
I'm attaching a sample project that reproduces this.
Is there any way to trigger viewDidAppear intentionally? (See the sketch after the sample code below.)
If yes, what are the possible side effects of doing that?
Do I have any other alternative in this case?
import SwiftUI

@main
struct DemoCustomTabViewApp: App {
    @UIApplicationDelegateAdaptor(AppDelegate.self) var appDelegate

    var body: some Scene {
        WindowGroup {
            TabBarRouterView()
        }
    }
}
import UIKit
import SwiftUI

// MARK: - TabBarItem (unchanged)
enum TabBarItem: Identifiable, CaseIterable {
    case home, search, profile

    var id: Self { self }

    var title: String {
        switch self {
        case .home: return "Home"
        case .search: return "Search"
        case .profile: return "Profile"
        }
    }

    var icon: String {
        switch self {
        case .home: return "house"
        case .search: return "magnifyingglass"
        case .profile: return "person"
        }
    }
}

// MARK: - NavigationControllerView
struct NavigationControllerView: UIViewControllerRepresentable {
    var rootViewController: UIViewController

    func makeUIViewController(context: Context) -> UINavigationController {
        let navigationController = UINavigationController(rootViewController: rootViewController)
        return navigationController
    }

    func updateUIViewController(_ uiViewController: UINavigationController, context: Context) {}
}

// MARK: - TabBarRouterViewModel
class TabBarRouterViewModel: ObservableObject {
    @Published var currentTab: TabBarItem = .home
    @Published var cachedViews: [TabBarItem: AnyView] = [:]

    let tabs: [TabBarItem] = TabBarItem.allCases

    func switchTab(to tab: TabBarItem) {
        currentTab = tab
    }

    func createView(for tab: TabBarItem) -> AnyView {
        if let cachedView = cachedViews[tab] {
            return cachedView
        }
        let rootViewController: UIViewController
        switch tab {
        case .home:
            rootViewController = UIHostingController(rootView: Text("Home View"))
        case .search:
            rootViewController = UIHostingController(rootView: Text("Search View"))
        case .profile:
            rootViewController = UIHostingController(rootView: Text("Profile View"))
        }
        let navigationView = NavigationControllerView(rootViewController: rootViewController)
        let anyView = AnyView(navigationView)
        cachedViews[tab] = anyView
        return anyView
    }
}

// MARK: - CustomTabBarView (unchanged)
struct CustomTabBarView: View {
    let tabs: [TabBarItem]
    @Binding var selectedTab: TabBarItem
    let onTap: (TabBarItem) -> Void

    var body: some View {
        HStack {
            ForEach(tabs) { tab in
                Spacer()
                VStack {
                    Image(systemName: tab.icon)
                        .font(.system(size: 24))
                    Text(tab.title)
                        .font(.caption)
                }
                .foregroundColor(selectedTab == tab ? .blue : .gray)
                .onTapGesture {
                    onTap(tab)
                }
                Spacer()
            }
        }
        .frame(height: 60)
        .background(Color.white)
        .shadow(radius: 2)
    }
}

// MARK: - TabBarRouterView
struct TabBarRouterView: View {
    @StateObject private var viewModel = TabBarRouterViewModel()

    var body: some View {
        VStack(spacing: .zero) {
            contentView
            CustomTabBarView(
                tabs: viewModel.tabs,
                selectedTab: $viewModel.currentTab,
                onTap: viewModel.switchTab
            )
        }
        .edgesIgnoringSafeArea(.bottom)
    }

    private var contentView: some View {
        ZStack {
            ForEach(viewModel.tabs) { tab in
                viewModel.createView(for: tab)
                    .opacity(viewModel.currentTab == tab ? 1.0 : 0.0)
            }
        }
    }
}
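On the question of triggering viewDidAppear intentionally: one documented mechanism is to drive the appearance callbacks manually with beginAppearanceTransition(_:animated:) followed by endAppearanceTransition(), which UIKit provides for containers that show and hide child view controllers without adding or removing them. A minimal sketch, assuming the container keeps the hosted UIViewControllers around (as the cached views above do); the class and property names are hypothetical:

import UIKit

// Hypothetical container mirroring the sample's behavior of keeping all tab
// view controllers installed and only toggling opacity.
final class OpacityTabContainer {
    private var controllers: [TabBarItem: UIViewController]
    private var currentTab: TabBarItem = .home

    init(controllers: [TabBarItem: UIViewController]) {
        self.controllers = controllers
    }

    func switchTab(to newTab: TabBarItem) {
        guard newTab != currentTab,
              let appearing = controllers[newTab],
              let disappearing = controllers[currentTab] else { return }

        // Manually drive the appearance callbacks that opacity toggling skips.
        disappearing.beginAppearanceTransition(false, animated: false) // -> viewWillDisappear
        appearing.beginAppearanceTransition(true, animated: false)     // -> viewWillAppear

        // ... toggle the opacity here, as the sample does ...

        disappearing.endAppearanceTransition() // -> viewDidDisappear
        appearing.endAppearanceTransition()    // -> viewDidAppear (and any swizzled code)

        currentTab = newTab
    }
}

The main side effect to watch for is "unbalanced calls to begin/end appearance transitions" warnings if UIKit is also driving these callbacks for the same controllers, so this should only run for transitions the container fully owns.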