Hi everyone,
We have a user experiencing a display issue; here's a screenshot they shared with us. Unlike in the simulator, where we see three icons, their display shows two buttons abbreviated with ellipses. The device is an iPhone 12 mini running iOS 17.6.1.
The user isn't using any accessibility settings or large text size. Does anyone know what setting might be causing this? Any advice would be appreciated!
Thanks!
I'd like an effect similar to the iOS 18 Siri one, with a border/stroke extending around the view that follows the corner radius of the screen.
This is challenging because, "logically," the view is rectangular and those radii don't really exist.
I've seen some posts around the web about _displayCornerRadius, which appears to be what would work, but it is private and, as far as I can tell, never mentioned anywhere else.
Does anyone know of a way to achieve a border around a view that correctly wraps around the corners of the rounded screen?
Currently using SwiftUI but open to any solutions.
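For now, the closest I've gotten is a plain SwiftUI overlay with a hand-tuned radius, roughly like the sketch below. The 55 pt value is just a guess and varies per device, which is exactly why I'm hoping there's a proper API for this.

import SwiftUI

// Sketch: a concentric rounded border with a manually supplied corner radius.
// The 55 pt default is an assumption; the real display corner radius differs per device.
struct ScreenBorder: ViewModifier {
    var color: Color = .blue
    var lineWidth: CGFloat = 4
    var screenCornerRadius: CGFloat = 55

    func body(content: Content) -> some View {
        content.overlay(
            RoundedRectangle(cornerRadius: screenCornerRadius, style: .continuous)
                .strokeBorder(color, lineWidth: lineWidth)
                .ignoresSafeArea()
        )
    }
}

// Usage: MyFullScreenView().modifier(ScreenBorder())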
My app exports a custom file type identifier for a file that it shares between users of the same app. However, in the UIActivityViewController list of apps, my app appears far off screen. How can I make my app, which owns the file type, appear first in the list, instead of the many irrelevant apps that can't actually open my file?
Plist snippets:
<key>CFBundleDocumentTypes</key>
<array>
    <dict>
        <key>CFBundleTypeName</key>
        <string>My Custom App File</string>
        <key>LSHandlerRank</key>
        <string>Owner</string>
        <key>LSItemContentTypes</key>
        <array>
            <string>abc.myapp.myextension</string>
        </array>
    </dict>
</array>
<key>UTExportedTypeDeclarations</key>
<array>
    <dict>
        <key>UTTypeConformsTo</key>
        <array>
            <string>public.content</string>
            <string>public.data</string>
        </array>
        <key>UTTypeDescription</key>
        <string>My Custom App File</string>
        <key>UTTypeIconFiles</key>
        <array/>
        <key>UTTypeIdentifier</key>
        <string>abc.myapp.myextension</string>
        <key>UTTypeTagSpecification</key>
        <dict>
            <key>public.filename-extension</key>
            <array>
                <string>myextension</string>
            </array>
            <key>public.mime-type</key>
            <array>
                <string>application/octet-stream</string>
            </array>
        </dict>
    </dict>
</array>
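For reference, here's a quick runtime sanity check I can add (a sketch; it doesn't change the share sheet ordering) to confirm the exported type from the plist above is actually registered as declared:

import UniformTypeIdentifiers

// Verify the exported UTI resolves to the expected extension and conformance.
if let type = UTType("abc.myapp.myextension") {
    print(type.preferredFilenameExtension ?? "none")   // expect "myextension"
    print(type.conforms(to: .data))                    // expect true
} else {
    print("abc.myapp.myextension is not registered")
}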
I've noticed the tab bar in tvOS 18 (beta) is positioned lower on the TV screen than in previous versions. Is this a bug? I see no documentation on this important UI change...
If this is not a bug, is there any way to adjust the y coordinate of the tab bar in tvOS 18? I would really like to restore the previous position for my app and avoid having to do OS-conditional constraints for all my views/pages.
My app correctly applies the ShieldConfiguration template that I chose (icon, text, CTA, etc.), but some users have reported seeing the default shield instead, with text that reads, "You cannot use AppName because it is restricted."
Any idea why?
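For context, my data source looks roughly like the sketch below (simplified, with hypothetical names); a custom configuration is returned for every shielded application, so I don't understand why some users get the stock shield instead:

import ManagedSettings
import ManagedSettingsUI
import UIKit

// Simplified shield data source; the real icon, colors, and copy differ.
class MyShieldConfigurationExtension: ShieldConfigurationDataSource {
    override func configuration(shielding application: Application) -> ShieldConfiguration {
        ShieldConfiguration(
            backgroundBlurStyle: .systemMaterial,
            icon: UIImage(systemName: "hourglass"),
            title: ShieldConfiguration.Label(text: "Take a break", color: .white),
            primaryButtonLabel: ShieldConfiguration.Label(text: "OK", color: .black)
        )
    }
}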
I created a Radar for this, FB14766095, but thought I would add it here for extra visibility, in case anyone else has thoughts on the issue.
Basic Information
Please provide a descriptive title for your feedback:
iOS 18 hit testing functionality differs from iOS 17
What type of feedback are you reporting?
Incorrect/Unexpected Behavior
Description:
Please describe the issue and what steps we can take to reproduce it:
We have an issue in iOS 18 Beta 6 where hit testing functionality differs from the expected functionality in iOS 17.5.1 and previous versions of iOS.
iOS 17: When a sheet is presented, the hit-testing logic considers subviews of the root view, meaning the rootView itself is rarely the hit view.
iOS 18: When a sheet is presented, the hit-testing logic changes, sometimes considering the rootView itself as the hit view.
Code:
import SwiftUI

struct ContentView: View {
    @State var isPresentingView: Bool = false

    var body: some View {
        VStack {
            Text("View One")
            Button {
                isPresentingView.toggle()
            } label: {
                Text("Present View Two")
            }
        }
        .padding()
        .sheet(isPresented: $isPresentingView) {
            ContentViewTwo()
        }
    }
}

#Preview {
    ContentView()
}

struct ContentViewTwo: View {
    @State var isPresentingView: Bool = false

    var body: some View {
        VStack {
            Text("View Two")
        }
        .padding()
    }
}
extension UIWindow {
    public override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        /// Get view from superclass.
        guard let hitView = super.hitTest(point, with: event) else { return nil }

        print("RPTEST rootViewController = ", rootViewController.hashValue)
        print("RPTEST rootViewController?.view = ", rootViewController?.view.hashValue)
        print("RPTEST hitView = ", hitView.hashValue)

        if let rootView = rootViewController?.view {
            print("RPTEST rootViewController's view memory address: \(Unmanaged.passUnretained(rootView).toOpaque())")
            print("RPTEST hitView memory address: \(Unmanaged.passUnretained(hitView).toOpaque())")
            print("RPTEST Are they equal? \(rootView == hitView)")
        }

        /// If the returned view is the `UIHostingController`'s view, ignore.
        print("MTEST: hitTest rootViewController?.view == hitView", rootViewController?.view == hitView)
        print("MTEST: -")

        return hitView
    }
}
Looking at the print statements from the provided sample project:
iOS 17 presenting a sheet from a button tap on the ContentView():
RPTEST rootViewController's view memory address: 0x0000000120009200
RPTEST hitView memory address: 0x000000011fd25000
RPTEST Are they equal? false
MTEST: hitTest rootViewController?.view == hitView false
RPTEST rootViewController's view memory address: 0x0000000120009200
RPTEST hitView memory address: 0x000000011fd25000
RPTEST Are they equal? false
MTEST: hitTest rootViewController?.view == hitView false
iOS 17 dismiss from presented view:
RPTEST rootViewController's view memory address: 0x0000000120009200
RPTEST hitView memory address: 0x000000011fe04080
RPTEST Are they equal? false
MTEST: hitTest rootViewController?.view == hitView false
RPTEST rootViewController's view memory address: 0x0000000120009200
RPTEST hitView memory address: 0x000000011fe04080
RPTEST Are they equal? false
MTEST: hitTest rootViewController?.view == hitView false
iOS 18 presenting a sheet from a button tap on the ContentView():
RPTEST rootViewController's view memory address: 0x000000010333e3c0
RPTEST hitView memory address: 0x0000000103342080
RPTEST Are they equal? false
MTEST: hitTest rootViewController?.view == hitView false
RPTEST rootViewController's view memory address: 0x000000010333e3c0
RPTEST hitView memory address: 0x000000010333e3c0
RPTEST Are they equal? true
MTEST: hitTest rootViewController?.view == hitView true
You can see here ☝️ that in iOS 18 the views have the same memory address on the second call and are evaluated to be the same. This differs from iOS 17.
iOS 18 dismiss
RPTEST rootViewController's view memory address: 0x000000010333e3c0
RPTEST hitView memory address: 0x0000000103e80000
RPTEST Are they equal? false
MTEST: hitTest rootViewController?.view == hitView false
RPTEST rootViewController's view memory address: 0x000000010333e3c0
RPTEST hitView memory address: 0x0000000103e80000
RPTEST Are they equal? false
MTEST: hitTest rootViewController?.view == hitView false
The questions I want to ask:
Is this an intended change, meaning the current functionality in iOS 18 is expected?
Or is this a bug that needs to be fixed?
As a user, I would expect that the hit testing functionality would remain the same from iOS 17 to iOS 18.
Thank you for your time.
On testing my app with tvOS 18, I have noticed the Siri Remote back button no longer provides system-provided behavior when interacting with tab bar controller pages. Instead of moving focus back to the tab bar when pressed, the back button will close the app, as if the Home button was pressed. This occurs both on device and in the Simulator.
1. Create a tvOS project with a tab bar controller.
2. Create pages/tabs that contain focusable items (e.g., buttons).
3. Scroll down to any focusable item (e.g., a button or UICollectionView cell).
4. Press the Siri Remote back button. See expected behavior below.
Expected behavior: System-provided behavior should move focus back to the tab bar at the top of the screen.
Actual results: The app is closed and the user is taken back to the Home Screen.
Has anyone else noticed this behavior?
It seems that when an entity has an ordered to-many relationship to the same entity, inserting an object into the ordered set causes other objects in the set to turn into faults during the next save of the managed object context.
I verified it with several applications.
For the sake of example, the entity will be called Folder and the ordered to-many relationship subfolders (an NSOrderedSet), with a cascade delete rule. The reciprocal to-one relationship is called parent.
Assuming you have a Folder object with two subfolders, removing the last subfolder from the set (setting its parent to nil) and reinserting it at index 0 with -insertObject:inSubfoldersAtIndex: will turn the other subfolder into a fault at the next save.
Now assuming that other folder has a name attribute (NSString) that is bound to a text field in your UI, the name of that subfolder will disappear when the context saves, since it becomes nil while the subfolder is turned into a fault.
Is this expected behavior?
Note: I'm using Objective-C, Xcode 15, and macOS Sonoma, but I've seen this issue occur on previous macOS versions.
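In Swift terms (my actual code is Objective-C), the steps look roughly like this sketch; the entity and relationship names are the ones described above, and the helper itself is hypothetical:

import CoreData

// Reproduces the described scenario: remove the last subfolder, reinsert it at index 0, save.
func reproduce(parentFolder: NSManagedObject, context: NSManagedObjectContext) throws {
    let subfolders = parentFolder.mutableOrderedSetValue(forKey: "subfolders")
    guard let last = subfolders.lastObject as? NSManagedObject else { return }

    last.setValue(nil, forKey: "parent")   // remove the last subfolder from the set
    subfolders.insert(last, at: 0)         // reinsert it at index 0

    try context.save()                     // the other subfolder turns into a fault here
}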
I am developing an SDK and swizzling viewDidAppear.
I have a customer who implements a custom tab bar navigation where the view controllers are added to the hierarchy on first load, and then the opacity of the currently displayed tab is changed, so the next time the user sees a tab, viewDidAppear isn't called and my code isn't called either.
I'm attaching a sample project which reproduces that.
Is there any way to trigger viewDidAppear intentionally?
If yes, what could the side effects of doing that be?
Do I have any other alternative in this case?
import SwiftUI

@main
struct DemoCustomTabViewApp: App {
    @UIApplicationDelegateAdaptor(AppDelegate.self) var appDelegate

    var body: some Scene {
        WindowGroup {
            TabBarRouterView()
        }
    }
}

import SwiftUI
import UIKit

// MARK: - TabBarItem (unchanged)
enum TabBarItem: Identifiable, CaseIterable {
    case home, search, profile

    var id: Self { self }

    var title: String {
        switch self {
        case .home: return "Home"
        case .search: return "Search"
        case .profile: return "Profile"
        }
    }

    var icon: String {
        switch self {
        case .home: return "house"
        case .search: return "magnifyingglass"
        case .profile: return "person"
        }
    }
}

// MARK: - NavigationControllerView
struct NavigationControllerView: UIViewControllerRepresentable {
    var rootViewController: UIViewController

    func makeUIViewController(context: Context) -> UINavigationController {
        let navigationController = UINavigationController(rootViewController: rootViewController)
        return navigationController
    }

    func updateUIViewController(_ uiViewController: UINavigationController, context: Context) {}
}

// MARK: - TabBarRouterViewModel
class TabBarRouterViewModel: ObservableObject {
    @Published var currentTab: TabBarItem = .home
    @Published var cachedViews: [TabBarItem: AnyView] = [:]

    let tabs: [TabBarItem] = TabBarItem.allCases

    func switchTab(to tab: TabBarItem) {
        currentTab = tab
    }

    func createView(for tab: TabBarItem) -> AnyView {
        if let cachedView = cachedViews[tab] {
            return cachedView
        }
        let rootViewController: UIViewController
        switch tab {
        case .home:
            rootViewController = UIHostingController(rootView: Text("Home View"))
        case .search:
            rootViewController = UIHostingController(rootView: Text("Search View"))
        case .profile:
            rootViewController = UIHostingController(rootView: Text("Profile View"))
        }
        let navigationView = NavigationControllerView(rootViewController: rootViewController)
        let anyView = AnyView(navigationView)
        cachedViews[tab] = anyView
        return anyView
    }
}

// MARK: - CustomTabBarView (unchanged)
struct CustomTabBarView: View {
    let tabs: [TabBarItem]
    @Binding var selectedTab: TabBarItem
    let onTap: (TabBarItem) -> Void

    var body: some View {
        HStack {
            ForEach(tabs) { tab in
                Spacer()
                VStack {
                    Image(systemName: tab.icon)
                        .font(.system(size: 24))
                    Text(tab.title)
                        .font(.caption)
                }
                .foregroundColor(selectedTab == tab ? .blue : .gray)
                .onTapGesture {
                    onTap(tab)
                }
                Spacer()
            }
        }
        .frame(height: 60)
        .background(Color.white)
        .shadow(radius: 2)
    }
}

// MARK: - TabBarRouterView
struct TabBarRouterView: View {
    @StateObject private var viewModel = TabBarRouterViewModel()

    var body: some View {
        VStack(spacing: .zero) {
            contentView
            CustomTabBarView(
                tabs: viewModel.tabs,
                selectedTab: $viewModel.currentTab,
                onTap: viewModel.switchTab
            )
        }
        .edgesIgnoringSafeArea(.bottom)
    }

    private var contentView: some View {
        ZStack {
            ForEach(viewModel.tabs) { tab in
                viewModel.createView(for: tab)
                    .opacity(viewModel.currentTab == tab ? 1.0 : 0.0)
            }
        }
    }
}
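The only way I've found so far to drive the appearance callbacks manually is UIKit's appearance-transition API, roughly as sketched below. I'm unsure whether calling it on a controller whose appearance UIKit already manages is safe, which is part of my question about side effects:

import UIKit

// Manually drive the appearance callbacks for a view controller that is kept in the
// hierarchy and only toggled via opacity. Unbalanced calls can trigger UIKit warnings.
func simulateAppearance(of viewController: UIViewController) {
    viewController.beginAppearanceTransition(true, animated: false)
    viewController.endAppearanceTransition()   // this ends up calling viewDidAppear(_:)
}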
My app uses Core Data and has enabled App Groups for data sharing between the App and Widget.
In my app, there's a Core Data entity called Task. As the documentation suggests, I've implemented a separate TaskData struct that conforms to AppEntity. I've also implemented TaskDataQuery: EntityQuery, whose suggestedEntities method fetches all Tasks from the main context and maps them with Task.toTaskData.
Following the documentation, I've implemented the corresponding WidgetConfigurationIntent, which holds:
@Parameter(title: "Task")
var task: TaskData
as well as the corresponding AppIntentTimelineProvider to implement the provider.
I haven't encountered any retrieval issues on the simulator; everything works perfectly.
However, the problem arises when I deploy to a physical device. Users report that their widgets can't retrieve any data. Specifically, when users long-press the widget to set up a task, it remains in a Loading state, unable to fetch any Core Data.
I've looked through some resources, and it seems this might be a common issue with iOS 17?
How can I resolve this issue? Has anyone encountered this or can anyone offer suggestions? It has been troubling me for several days now, and it's causing my product to continually lose users. I'm really upset about it.
Any advice is welcome.
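In case it matters, the shared store is loaded roughly like the sketch below in both the app and the widget (simplified; the App Group identifier and model name here are placeholders):

import CoreData

// Both targets must point Core Data at the shared App Group container,
// otherwise the widget reads its own empty store on a real device.
func makeSharedContainer() -> NSPersistentContainer {
    let container = NSPersistentContainer(name: "Model")
    let storeURL = FileManager.default
        .containerURL(forSecurityApplicationGroupIdentifier: "group.com.example.app")!
        .appendingPathComponent("Model.sqlite")
    container.persistentStoreDescriptions = [NSPersistentStoreDescription(url: storeURL)]
    container.loadPersistentStores { _, error in
        if let error {
            fatalError("Failed to load shared store: \(error)")
        }
    }
    return container
}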
In a music streaming app, when using Activity.request to activate the Dynamic Island, the system's Now Playing interface appears correctly. However, the app's Live Activities, Lock Screen presentation, and other related features fail to display properly.
During debugging, the following code is used:
do {
    activity = try Activity.request(attributes: attributes, contentState: contentState, pushType: .token)
    if !NMABTestManager.default().is(inTest: "FH-NewLiveActivityPush") {
        // Listen to push token updates
        if activity != nil {
            tokenUpdatesTask?.cancel()
            tokenUpdatesTask = Task.detached {
                for await tokenData in self.activity!.pushTokenUpdates {
                    let mytoken = tokenData.map { String(format: "%02x", $0) }.joined().uppercased()
                    // pushToken is Data and needs to be converted to a String as above before being passed to the server
                    self.pushToken = mytoken
                }
            }
        }
    }
} catch (let error) {
    print("Error Starting Live Activity: \(error.localizedDescription)")
}
In this scenario, the push token is returned correctly, and no errors are triggered.
This issue did not occur in iOS 17 but appears sporadically in iOS 18. Once it occurs, it cannot be resolved through restarting or other means.
Feedback ID: FB14763873; I have uploaded my sysdiagnose.
I am writing to inquire if there is any way to programmatically check whether a user has enabled the “Large App Icon” mode in iOS 18. Our development team is working on optimizing our app’s user interface, and it would be beneficial to adapt the design based on this setting.
Any guidance on how to access this information, or if it’s even possible within the current iOS APIs, would be greatly appreciated.
Thank you for your time and assistance.
Third-party WidgetKit complications on watchOS 11 beta 5 are not appearing in the list of available complications. They have also disappeared from watch faces where they were installed. The exact same complications were working fine on earlier betas. This is happening on device, but not in simulator.
This issue may be related to FB14684253, which was fixed with the release of Xcode beta 5. However, Xcode beta 5 does not fix the issue on Apple Watch.
As a sanity check, I also tried with the Backyard Birds sample project, and the complications for that app aren't appearing on device either.
Filed as FB14689021.
My app’s WidgetKit widgets are all crashing on iOS 18 beta 5. They were working just fine on earlier betas. This is happening across both Home and Lock Screen widgets. It's an EXC_BAD_ACCESS crash that seems to be happening deep within WidgetKit.
I've seen other developers posting about this on social media, so it's not just me. Wanted to get this flagged ASAP as it's very late in the beta cycle now...
Filed as FB14684253.
Hello,
I've noticed a few rare crashes with the following stacktrace reported on AppStore connect:
Hardware Model: iPhone16,2
AppStoreTools: 15F31e
AppVariant: 1:iPhone16,2:17.4
OS Version: iPhone OS 17.5.1 (21F90)
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: SIGNAL 6 Abort trap: 6
Terminating Process: <App> [15575]
Triggered by Thread: 0
Last Exception Backtrace:
0 CoreFoundation 0x1a3d38f20 __exceptionPreprocess + 164 (NSException.m:249)
1 libobjc.A.dylib 0x19bbbe018 objc_exception_throw + 60 (objc-exception.mm:356)
2 Foundation 0x1a323f868 -[NSAssertionHandler handleFailureInMethod:object:file:lineNumber:description:] + 188 (NSException.m:252)
3 CoreAutoLayout 0x1c4eabcc8 -[NSLayoutConstraint _setSymbolicConstant:constant:symbolicConstantMultiplier:] + 552 (NSLayoutConstraint.m:669)
4 CoreAutoLayout 0x1c4eab674 -[NSLayoutConstraint setConstant:] + 96 (NSLayoutConstraint.m:750)
5 <App> 0x10486d578 closure #1 in BaseChatTableViewCell.requestPreview(for:with:) + 540 (BaseChatTableViewCell+File.swift:162)
6 <App> 0x10486d73c thunk for @escaping @callee_guaranteed (@in_guaranteed URLRequest, @guaranteed NSHTTPURLResponse?, @guaranteed UIImage) -> () + 164 (<compiler-generated>:0)
7 <App> 0x104c4f814 __85-[UIImageView(AFNetworking) setImageWithURLRequest:placeholderImage:success:failure:]_block_invoke + 176 (UIImageView+AFNetworking.m:118)
8 <App> 0x104c3cc74 __78-[AFImageDownloader downloadImageForURLRequest:withReceiptID:success:failure:]_block_invoke.88 + 52 (AFImageDownloader.m:276)
9 libdispatch.dylib 0x1abbdc13c _dispatch_call_block_and_release + 32 (init.c:1530)
10 libdispatch.dylib 0x1abbdddd4 _dispatch_client_callout + 20 (object.m:576)
11 libdispatch.dylib 0x1abbec5a4 _dispatch_main_queue_drain + 988 (queue.c:7898)
12 libdispatch.dylib 0x1abbec1b8 _dispatch_main_queue_callback_4CF + 44 (queue.c:8058)
13 CoreFoundation 0x1a3d0b710 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 16 (CFRunLoop.c:1780)
14 CoreFoundation 0x1a3d08914 __CFRunLoopRun + 1996 (CFRunLoop.c:3149)
15 CoreFoundation 0x1a3d07cd8 CFRunLoopRunSpecific + 608 (CFRunLoop.c:3420)
16 GraphicsServices 0x1e8bb81a8 GSEventRunModal + 164 (GSEvent.c:2196)
17 UIKitCore 0x1a634090c -[UIApplication _run] + 888 (UIApplication.m:3713)
18 UIKitCore 0x1a63f49d0 UIApplicationMain + 340 (UIApplication.m:5303)
19 <App> 0x1047c6c20 main + 80 (main.m:11)
20 dyld 0x1c73b9e4c start + 2240 (dyldMain.cpp:1298)
It crashes on the last line of
let previewSize = BaseChatTableViewCell.getPreviewSize(from: imageSize, isMediaFile)
self.filePreviewImageViewHeightConstraint?.constant = previewSize.height
self.filePreviewImageViewWidthConstraint?.constant = previewSize.width
where previewSize is a CGSize. I am unable to reproduce the crash, nor am I able to understand why it crashes there. Anyone got an idea what could cause a crash on setting a constant?
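One thing I'm considering as a defensive measure (just a guess at the cause: the assertion in _setSymbolicConstant:constant: suggests the constant may not be a finite number, e.g. if the aspect-ratio math ever produces NaN) is guarding the assignment:

// Sketch: skip the constraint update if the computed size is not finite.
let previewSize = BaseChatTableViewCell.getPreviewSize(from: imageSize, isMediaFile)
guard previewSize.height.isFinite, previewSize.width.isFinite else { return }
self.filePreviewImageViewHeightConstraint?.constant = previewSize.height
self.filePreviewImageViewWidthConstraint?.constant = previewSize.width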
How can we performantly scroll to a target location using TextKit 2?
Hi everyone,
I'm building a custom text editor using TextKit 2 and would like to scroll to a target location efficiently. For instance, I would like to move to the end of a document seamlessly, similar to how users can do in standard text editors by using CMD + Down.
Background:
NSTextView and TextEdit on macOS can navigate to the end of large documents in milliseconds. However, after reading the documentation and experimenting with various ideas using TextKit 2's APIs, it's not clear how third-party developers are supposed to achieve this.
My Code:
Here's the code I use to move the selection to the end of the document and scroll the viewport to reveal the selection.
override func moveToEndOfDocument(_ sender: Any?) {
    textLayoutManager.ensureLayout(for: textLayoutManager.documentRange)
    let targetLocation = textLayoutManager.documentRange.endLocation
    let beforeTargetLocation = textLayoutManager.location(targetLocation, offsetBy: -1)!
    textLayoutManager.textViewportLayoutController.layoutViewport()
    guard let textLayoutFragment = textLayoutManager.textLayoutFragment(for: beforeTargetLocation) else {
        return
    }
    guard let textLineFragment = textLayoutFragment.textLineFragment(for: targetLocation, isUpstreamAffinity: true) else {
        return
    }
    let lineFrame = textLayoutFragment.layoutFragmentFrame
    let lineFragmentFrame = textLineFragment.typographicBounds.offsetBy(dx: 0, dy: lineFrame.minY)
    scrollToVisible(lineFragmentFrame)
}
While this code works as intended, it is very inefficient because ensureLayout(_:) is incredibly expensive and can take seconds for large documents.
Issues Encountered:
In my attempts, I have come across the following two issues.
Estimated Frames: The frames of NSTextLayoutFragment and NSTextLineFragment are approximate and not precise enough for scrolling unless the text layout fragment has been fully laid out.
Laying out all text is expensive: The frames become accurate once NSTextLayoutManager's ensureLayout(for:) method has been called with a range covering the entire document. However, ensureLayout(for:) is resource-intensive and can take seconds for large documents. NSTextView, on the other hand, accomplishes the same scrolling to the end of a document in milliseconds.
I've tried using NSTextViewportLayoutController's relocateViewport(to:) without success. It's unclear to me whether this function is intended for a use case like mine. If it is, I would appreciate some guidance on its proper usage.
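For reference, the call shape I tried is roughly the sketch below; whether relocateViewport(to:) is meant to provide the fast path that NSTextView uses is exactly what I'd like to understand:

// Sketch: jump the viewport to the end without ensuring layout for the whole document,
// then let the viewport layout controller lay out only the visible fragments.
let end = textLayoutManager.documentRange.endLocation
textLayoutManager.textViewportLayoutController.relocateViewport(to: end)
textLayoutManager.textViewportLayoutController.layoutViewport()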
Configuration:
I'm testing on macOS Sonoma 14.5 (23F79), Swift (AppKit), Xcode 15.4 (15F31d).
I'm working on a multi-platform project written in AppKit and UIKit, so I'm looking for either a single solution that works in both AppKit and UIKit or two solutions, one for each UI framework.
Question:
How can third-party developers scroll to a target location, specifically the end of a document, performantly using TextKit 2?
Steps to Reproduce:
The issue can be reproduced using the example project (download from link below) by following these steps:
1. Open the example project.
2. Run the example app on a Mac. The example app shows an uneditable text view in a scroll view. The text view displays a long text.
3. Press the "Move to End of Document" toolbar item.
4. Notice that the text view has scrolled to the bottom, but that this took several seconds (~3 seconds on my MacBook Pro 16-inch, 2021). The duration will be shown in Xcode's log.
You can open the ExampleTextView.swift file and find the implementation of moveToEndOfDocument(_:). Comment out line 84, where ensureLayout(_:) is called, rerun the app, and select "Move to End of Document" again. This time, you will notice that the text view moves quickly but does not end up at the bottom of the document.
You may also open large-file.json from the project (the same file the example app displays) in TextEdit and press CMD+Down to move to the end of the document. Notice that TextEdit does this in mere milliseconds.
Example Project:
The example project is located on GitHub:
https://github.com/simonbs/apple-developer-forums/tree/main/how-can-we-performantly-scroll-to-a-target-location-using-textkit-2
Any advice or guidance on how to achieve this with TextKit 2 would be greatly appreciated.
Thanks in advance!
Best regards,
Simon
The issue is as the title describes. When I checked the log, there were hundreds to thousands of lines with the following content. The larger the page index of the PDF page from which I select text, the more log lines are recorded.
.notdef: no mapping.
.notdef: no mapping.
.notdef: no mapping.
.notdef: no mapping.
.notdef: no mapping.
.notdef: no mapping.
.
.
.
can't create CMap `Adobe-KR1-UCS2'. can't create CMap `Adobe-KR1-UCS2'.
can't create CMap `Adobe-KR1-UCS2'. can't create CMap `Adobe-KR1-UCS2'.
can't create CMap `Adobe-KR1-UCS2'. can't create CMap `Adobe-Identity-UCS2'.
can't create CMap `Adobe-KR1-UCS2'. can't create CMap `Adobe-KR1-UCS2'.
can't create CMap `Adobe-Identity-UCS2'. can't create CMap `Adobe-Identity-UCS2'.
can't create CMap `Adobe-Identity-UCS2'.
I don't know why this is happening. Is there a solution?
I'm trying to develop a Live Activity extension. The problem is, I can't get the pushToStartToken. I'm able to get it when I start a Live Activity, but I can't when I don't start one.
This function successfully generates the token:
private func startNewLiveActivity() async {
    guard #available(iOS 16.2, *) else { return }

    let attributes = MyWidgetAttributes(
        homeTeam: "Badger",
        awayTeam: "Lion",
        date: "12/09/2023"
    )

    let initialContentState = ActivityContent(
        state: MyWidgetAttributes.ContentState(
            homeTeamScore: 0,
            awayTeamScore: 0,
            lastEvent: "Match Start"
        ),
        staleDate: nil
    )

    guard let activity = try? Activity.request(
        attributes: attributes,
        content: initialContentState,
        pushType: .token
    ) else { return }

    if #available(iOS 17.2, *) {
        Task {
            for await data in Activity<MyWidgetAttributes>.pushToStartTokenUpdates {
                let token = data.map { String(format: "%02x", $0) }.joined()
                // THE DESIRED pushToStartToken TOKEN IS GENERATED HERE
            }
        }
    }

    for await data in activity.pushTokenUpdates {
        let token = data.map { String(format: "%02x", $0) }.joined()
        // I send the token to the server here.
    }
}
But when I try to get the pushToStartToken separately, without creating a Live Activity, it doesn't return any value:
private func getPushToStartToken() async {
    guard #available(iOS 17.2, *) else { return }
    Task {
        for await data in Activity<MyWidgetAttributes>.pushToStartTokenUpdates {
            let token = data.map { String(format: "%02x", $0) }.joined()
            // THIS DOESN'T GENERATE ANY TOKENS SINCE THE ACTIVITY IS NOT CREATED
        }
    }
}
I'm trying to build a Live Activity extension. I can successfully start my Live Activity via push notification. The problem is that when I start the Live Activity from my app, I can get pushTokenUpdates, since I control everything and run the for loop that reads pushTokenUpdates. But the code that reads pushTokenUpdates isn't called when I start the Live Activity with a push notification, since the system starts it automatically (maybe it is called, but I don't know when and where).
Where am I supposed to get pushTokenUpdates, so I can send them to my server, when the Live Activity is started via push notification?
The relevant code is below:
for await data in activity.pushTokenUpdates {
    let token = data.map { String(format: "%02x", $0) }.joined()
    Logger().log("Activity token: \(token)")
    // send token to the server
}
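Is observing Activity.activityUpdates the intended place for this, i.e. something roughly like the sketch below (untested; MyWidgetAttributes stands in for my attributes type)?

import ActivityKit
import os

// Observe activities as they appear, including ones the system starts from a
// push-to-start notification, and subscribe to each one's token updates.
func observeActivityTokens() {
    Task {
        for await activity in Activity<MyWidgetAttributes>.activityUpdates {
            Task {
                for await data in activity.pushTokenUpdates {
                    let token = data.map { String(format: "%02x", $0) }.joined()
                    Logger().log("Activity token: \(token)")
                    // send token to the server
                }
            }
        }
    }
}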
In my document-based macOS app I am using a storyboard. I created a custom menu and menu items in the storyboard. The menu is there, but the menu item remains inactive (grey).
To activate the menu item, one apparently has to use NSWindowDelegate and walk the menu tree to enable the menu items.
import Foundation
import Cocoa

class DocumentController: NSViewController, NSTextFieldDelegate {
    @IBOutlet var myOutlet: NSView!
    // ... code here ...
}

extension DocumentController: NSWindowDelegate {
    func windowDidBecomeMain(_ notification: Notification) {
        NSLog("windowDidBecomeMain - enable menu")
        if let customMenu = NSApp.mainMenu?.items[2].submenu {
            customMenu.item(at: 1)?.isEnabled = true
        }
    }

    func windowDidResignMain(_ notification: Notification) {
        NSLog("windowDidResignMain - disable menu")
    }
}
I connected the outlet (myOutlet) in the storyboard, but windowDidBecomeMain() does not get called. Why?
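Do I perhaps need to set the window's delegate explicitly, e.g. something like the sketch below inside DocumentController (just a guess)?

// Sketch: a storyboard does not make the view controller the window's delegate,
// so NSWindowDelegate callbacks never fire unless this is wired up somewhere.
override func viewDidAppear() {
    super.viewDidAppear()
    view.window?.delegate = self
}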
Thanks for any help.