Hello guys,
Is there any API developers can use to take advantage of the new window resizing and placement features in macOS Sequoia?
And of course I'm not talking about resizing your own app's windows, but windows belonging to other apps.
Thank you.
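For context, the only route I know of today for moving or resizing another app's windows is the Accessibility API, which requires the user to grant your app Accessibility permission. A rough sketch of what I mean (the PID and target frame are placeholders):

import ApplicationServices
import AppKit

// Requires the user to grant Accessibility permission
// (System Settings > Privacy & Security > Accessibility).
func resizeFirstWindow(ofAppWithPID pid: pid_t, to frame: CGRect) {
    let appElement = AXUIElementCreateApplication(pid)

    // Fetch the target app's windows.
    var windowsValue: CFTypeRef?
    guard AXUIElementCopyAttributeValue(appElement,
                                        kAXWindowsAttribute as CFString,
                                        &windowsValue) == .success,
          let windows = windowsValue as? [AXUIElement],
          let window = windows.first else { return }

    // Position and size are set as separate attributes.
    var origin = frame.origin
    var size = frame.size
    if let positionValue = AXValueCreate(.cgPoint, &origin) {
        AXUIElementSetAttributeValue(window, kAXPositionAttribute as CFString, positionValue)
    }
    if let sizeValue = AXValueCreate(.cgSize, &size) {
        AXUIElementSetAttributeValue(window, kAXSizeAttribute as CFString, sizeValue)
    }
}

What I'm really asking is whether Sequoia adds anything beyond this, e.g. an API that understands the new tiling positions.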
How do I unhide an app after using the new Require Face ID / hide app feature?
For the past couple of years, Apple has been using custom sliders in Music, Podcasts, and on the Lock Screen for scrubbing music and controlling volume.
But those sliders are not available to third-party devs. We can, in theory, recreate them for scrubbing audio, but not for volume, where in UIKit we are supposed to use MPVolumeView.
Anyone from Apple? Are there any plans to make the slider styles used in Apple's audio apps available to the rest of us?
FB12261162
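For volume specifically, the only supported route I'm aware of is still embedding MPVolumeView from MediaPlayer, which can't be restyled to look like the Music/Lock Screen sliders. A bare-bones UIKit sketch of what we're stuck with (the layout constants are just for illustration):

import MediaPlayer
import UIKit

final class PlayerViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // MPVolumeView is the supported way to present a system volume slider;
        // its appearance can't be made to match Apple's custom audio sliders.
        let volumeView = MPVolumeView(frame: .zero)
        volumeView.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(volumeView)

        NSLayoutConstraint.activate([
            volumeView.leadingAnchor.constraint(equalTo: view.leadingAnchor, constant: 20),
            volumeView.trailingAnchor.constraint(equalTo: view.trailingAnchor, constant: -20),
            volumeView.centerYAnchor.constraint(equalTo: view.centerYAnchor),
        ])
    }
}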
I recently upgraded to Xcode 15.4, but after the upgrade my app stopped working. I resolved all the dependency issues and managed to run it locally. However, when I archive and distribute the app, it shows a blank white screen when launched on the phone.
Any troubleshooting suggestions would be greatly appreciated.
Hi,
I am looking at the ContactAccessButton and contactAccessPicker functionality that's new in iOS18. I had a couple of questions about this:
Is this available through UIKit as well as SwiftUI, or is it a SwiftUI-only feature?
Is it going to be available on the Mac as well as on iOS?
Thanks.
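For reference, this is roughly the SwiftUI usage I have in mind, based on the WWDC24 material. I believe the initializer takes a query string plus an approval callback that receives the newly granted contact identifiers, but treat the exact signature as my assumption:

import SwiftUI
import ContactsUI

struct ContactSearchView: View {
    @State private var searchText = ""

    var body: some View {
        List {
            // ... rows for contacts the app already has access to ...

            // Shown under limited contacts access; tapping grants access to the
            // matching contact without a full permission prompt.
            ContactAccessButton(queryString: searchText) { identifiers in
                // Handle the newly granted contact identifiers here.
                print("Granted access to:", identifiers)
            }
        }
        .searchable(text: $searchText)
    }
}

My question is whether there is (or will be) an equivalent for UIKit apps and for the Mac.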
Is there currently an option to make generated asset symbols public?
If not, would it be possible to make the generated asset symbols public? It's quite common to have an app's design system implemented in a separate framework, and currently the generated asset symbols are useless for this because they can't be accessed by the framework's consumers.
It would be great to add this to the new dropdown in Xcode 16, or alongside it (113704993 in the release notes), so the options would be Internal, Public and Off. This should affect the symbols, the extensions and the framework support.
(There's a post on the swift forums about this as well here: https://forums.swift.org/t/generate-images-and-colors-inside-a-swift-package/65674)
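The workaround I'm using in the meantime is wrapping the internal generated symbols in public accessors inside the framework, which is exactly the boilerplate I'd like the build setting to remove. A sketch, assuming the iOS 17+ ColorResource/ImageResource generated symbols (the asset names here are hypothetical):

import SwiftUI
import UIKit

// Inside the design-system framework. The generated symbols (.brandPrimary, .logo)
// are internal, so they have to be re-exposed through public API by hand.
public enum DesignColors {
    public static var brandPrimary: Color { Color(.brandPrimary) }
    public static var brandPrimaryUIKit: UIColor { UIColor(resource: .brandPrimary) }
}

public enum DesignImages {
    public static var logo: Image { Image(.logo) }
    public static var logoUIKit: UIImage { UIImage(resource: .logo) }
}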
I am a VoIP app developer currently working on the Picture-in-Picture (PiP) mode. To ensure the video always appears correctly oriented, I need to obtain rotation information when the device rotates. When the app is in the foreground, I can use [UIDevice currentDevice].orientation to retrieve this value and adjust the video accordingly. However, in PiP mode, my app moves to the background, and this API does not function as expected. Similarly, interfaceOrientation of UIWindowScene and [UIApplication sharedApplication].statusBarOrientation do not work in the background. Is there a way to obtain the orientation information of the system status bar when my app enters the background in PiP mode? Thank you in advance.
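The only workaround I can think of is inferring the device orientation from CoreMotion's gravity vector while the PiP session keeps the app running, rather than asking UIKit. This is an assumption on my part, not a documented approach, and the sign mapping below may need adjusting; the update interval is arbitrary:

import CoreMotion
import UIKit

final class BackgroundOrientationTracker {
    private let motionManager = CMMotionManager()

    func start(handler: @escaping (UIDeviceOrientation) -> Void) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 0.5
        motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let gravity = motion?.gravity else { return }
            // Infer orientation from whichever axis gravity dominates.
            // The exact sign-to-orientation mapping should be verified on device.
            let orientation: UIDeviceOrientation
            if abs(gravity.x) > abs(gravity.y) {
                orientation = gravity.x > 0 ? .landscapeRight : .landscapeLeft
            } else {
                orientation = gravity.y > 0 ? .portraitUpsideDown : .portrait
            }
            handler(orientation)
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}

I'd still prefer a supported way to read the interface orientation while backgrounded, if one exists.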
I have an app that currently supports as low as iOS 16. I'd like to add some app intents to it that allow customization using the WidgetConfigurationIntent API that's only available on iOS 17 and later. Is there a way to build an intent (or other kind of app extension) that requires a higher OS version than my main app's deployment target, and only surface it for those OS versions?
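My current thinking is to keep the app's deployment target at iOS 16 and mark just the intent (and the widget configuration that uses it) with an availability annotation, so it only exists at runtime on iOS 17 and later. A minimal sketch of what I mean (all names are made up):

import AppIntents
import WidgetKit

// Compiled into the app regardless of the iOS 16 deployment target,
// but only available and surfaced on iOS 17 and later.
@available(iOS 17.0, *)
struct CaffeineWidgetConfigurationIntent: WidgetConfigurationIntent {
    static var title: LocalizedStringResource = "Caffeine Widget Configuration"
    static var description = IntentDescription("Choose which drink to track.")

    @Parameter(title: "Drink")
    var drink: String?

    func perform() async throws -> some IntentResult {
        .result()
    }
}

What I'm unsure about is whether this is enough for the system to hide the intent on iOS 16, or whether the extension itself needs a higher deployment target.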
Hi,
I am trying out the new UITabBar look in iOS 18, and I don't think it'll be a good fit for my app. Is there a way to revert to the old iPad UITabBar look and placement we've been using until now? I don't want to take on such a big change just yet, because it would force other changes to the app as well. Is there a way to achieve this?
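The only workaround I've seen discussed is forcing a compact horizontal size class onto the tab bar controller via trait overrides, which appears to bring back the bottom tab bar at the cost of compact layouts everywhere under it. This is an assumption on my part, not a supported opt-out:

import UIKit

final class LegacyTabBarController: UITabBarController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Forcing a compact width makes iOS 18 fall back to the bottom tab bar
        // on iPad, but every child view controller will also see a compact
        // horizontal size class, which may break regular-width layouts.
        traitOverrides.horizontalSizeClass = .compact
    }
}

I'd rather use an official switch if one exists.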
When using PencilKit with the default PKCanvasView, a pop-up menu with options like 'Select All | Insert Space' appears when we long-press on the canvas.
What is the proper way to remove this menu completely?
Thank you for your help.
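The only thing I've found to try so far is removing the edit-menu interaction that the canvas installs on itself (or overriding canPerformAction(_:withSender:) to refuse everything). A sketch of the first idea, assuming the menu is driven by UIEditMenuInteraction on iOS 16+, which I haven't confirmed:

import PencilKit
import UIKit

final class CanvasViewController: UIViewController {
    let canvasView = PKCanvasView()

    override func viewDidLoad() {
        super.viewDidLoad()
        canvasView.frame = view.bounds
        view.addSubview(canvasView)

        // Assumption: the "Select All | Insert Space" menu is presented by a
        // UIEditMenuInteraction that PKCanvasView adds to itself.
        for interaction in canvasView.interactions where interaction is UIEditMenuInteraction {
            canvasView.removeInteraction(interaction)
        }
    }
}

If there's a documented way to disable this menu, I'd prefer that.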
I'm using an LPMetadataProvider to get metadata for URLs.
If I do this
if itemProvider.hasItemConformingToTypeIdentifier(UTType.image.identifier) {
    let item = try? await itemProvider.loadItem(forTypeIdentifier: UTType.image.identifier)
    // continue with code to convert data to UIImage...
}
That seems to fail quite often, even on larger sites like Amazon where users will expect to see an icon.
I've noticed it's because sometimes the type is dyn.agq80w5pbq7ww88brrfv085u
I'm assuming this is because something about the website response or the image data does not let the system determine if it is actually an image.
If I just do this
let type: String = "dyn.agq80w5pbq7ww88brrfv085u"
if itemProvider.hasItemConformingToTypeIdentifier(type) {
    let item = try? await itemProvider.loadItem(forTypeIdentifier: type)
    // continue with code to convert data to UIImage...
}
Then I get the icon, so it is there and it is an image. The problem is, does this dynamic type cover everything? Should I even be doing that?
Does anyone know precisely what causes this and are there recommendations on better ways to handle it?
I know LPLinkView appears to do something to load these 'non-image' icons, so I am assuming there is a way.
My assumption is that it would be safe to look at the results of itemProvider.registeredTypeIdentifiers() and if it has something use that as the type rather than coming up with a hardcoded list of types to check for.
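For what it's worth, this is the fallback I'm leaning towards: prefer UTType.image, otherwise take whatever (possibly dynamic) identifier the provider registered and try to decode the loaded item as an image. A sketch, not something I've validated broadly:

import UniformTypeIdentifiers
import UIKit

func loadIcon(from itemProvider: NSItemProvider) async -> UIImage? {
    // Prefer a well-known image type; otherwise fall back to whatever
    // identifier (e.g. a dyn.* type) the provider actually registered.
    let typeIdentifier = itemProvider.hasItemConformingToTypeIdentifier(UTType.image.identifier)
        ? UTType.image.identifier
        : itemProvider.registeredTypeIdentifiers.first

    guard let typeIdentifier else { return nil }

    let item = try? await itemProvider.loadItem(forTypeIdentifier: typeIdentifier)
    switch item {
    case let image as UIImage:
        return image
    case let data as Data:
        return UIImage(data: data)
    case let url as URL:
        guard let data = try? Data(contentsOf: url) else { return nil }
        return UIImage(data: data)
    default:
        return nil
    }
}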
The guide on how to turn a Quartz Composer (QC) composition into a standalone app assumes an earlier, now retired, version of Xcode. Xcode has of course changed a lot over the past decade, so the guide basically can't be followed anymore.
Hello,
On iOS it is possible to use the keyboard spacebar to move the caret while editing text: the user long-presses the spacebar, and the keyboard turns into a trackpad that moves a floating caret around.
Is there a way to get the position of the caret while the user uses this spacebar navigation? By position I mean the caret's location within the app or within a frame.
Are there any other events that this action can send out?
Thanks a lot!
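To make the question concrete, this is the kind of thing I'm after. As far as I can tell, the text view's selection-change callback fires while the floating caret moves, and the caret rect can be converted to window coordinates; a sketch assuming a UITextView:

import UIKit

final class EditorViewController: UIViewController, UITextViewDelegate {
    @IBOutlet private var textView: UITextView!

    // Fires as the floating caret moves during spacebar (trackpad) navigation,
    // as well as for ordinary selection changes.
    func textViewDidChangeSelection(_ textView: UITextView) {
        guard let selectedRange = textView.selectedTextRange else { return }

        // Caret rectangle in the text view's coordinate space...
        let caretRect = textView.caretRect(for: selectedRange.start)
        // ...converted to window coordinates.
        let caretInWindow = textView.convert(caretRect, to: nil)
        print("Caret:", caretInWindow)
    }
}

What I don't know is whether there are dedicated events for the spacebar trackpad gesture itself.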
Notifying the Live Activity When Countdown is Complete:
Once the countdown timer reaches zero, how do I notify the live activity? What is the best approach to ensure the live activity updates accurately and promptly?
Handling App Suspension in the Background:
When my app is in the background, it gets suspended. Should I set up a job queue on the backend to send an update push notification when the timer ends? Is there a more efficient way to handle this?
Ensuring Timer Accuracy:
Since this feature deals with time, I am concerned about the accuracy of the timer. How can I ensure the countdown timer remains accurate even when the app is not in the foreground?
Any answers, insights, or guidance on these issues would be greatly appreciated. Thank you in advance for your help!
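One thing I've realised while investigating the accuracy question: the countdown display itself doesn't seem to need any updates at all, because the Live Activity can render a self-updating timer from a fixed end date; only the "finished" state needs an update (locally if the app is running, otherwise via push). A sketch of the timer text, with a hypothetical attributes type:

import ActivityKit
import SwiftUI
import WidgetKit

// Hypothetical attributes: the end date is fixed when the activity starts.
struct CountdownAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var isFinished: Bool
    }
    var endDate: Date
}

struct CountdownLiveActivityView: View {
    let context: ActivityViewContext<CountdownAttributes>

    var body: some View {
        if context.state.isFinished {
            Text("Done!")
        } else {
            // The system keeps this text counting down on its own,
            // so no periodic background updates are needed for the display.
            Text(timerInterval: Date.now...max(Date.now, context.attributes.endDate),
                 countsDown: true)
        }
    }
}

The open question for me is still the best way to deliver that single end-of-timer update when the app is suspended.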
This is a follow up to my previous post, this time using the visionOS 1.2 simulator and Xcode 15.4.
I am trying to set the background colour of my app shortcut platter for the visionOS version using NSAppIconComplementingColorNames, but it doesn't take effect when I check the Shortcuts app. I can see the built-in Music Recognition app has a background colour, but I am unable to set one for my app. Please note that this feature is working correctly on iOS; on visionOS it has no effect and shows a plain background.
I noticed the documentation doesn't mention that it works on visionOS. But if that's the case, how was the Music Recognition app able to set a colour?
I'm using CPListItem to display different items in CarPlay, things like addresses that the user can navigate to, for example. Sometimes these addresses are too long and run over the chevron indicator. I was trying to find a way to truncate the title, or to use something like a label's lineBreakMode, but couldn't find a way to do so.
Any help is appreciated here.
See the screenshot below for more details on the problem.
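Since I couldn't find a lineBreakMode equivalent on CPListItem, the fallback I'd try is shortening the string myself before creating the item. A sketch; the 40-character limit is an arbitrary placeholder and the helper name is my own:

import CarPlay

// CPListItem doesn't expose lineBreakMode, so trim long addresses up front.
func makeAddressItem(title: String, address: String) -> CPListItem {
    let maxLength = 40
    let truncatedAddress = address.count > maxLength
        ? address.prefix(maxLength).trimmingCharacters(in: .whitespaces) + "…"
        : address
    return CPListItem(text: title, detailText: truncatedAddress)
}

A proper truncation/line-break API on the template would obviously be preferable.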
Simply put: we set UITextView.inputView and then call becomeFirstResponder, but the custom inputView no longer appears as it did before. We tested this code on iOS 17 and earlier; only iOS 17 fails.
The Xcode console prints these logs:
Failed to retrieve snapshot.
-[RTIInputSystemClient remoteTextInputSessionWithID:performInputOperation:] perform input operation requires a valid sessionID
-[RTIInputSystemClient remoteTextInputSessionWithID:performInputOperation:] perform input operation requires a valid sessionID
-[RTIInputSystemClient remoteTextInputSessionWithID:performInputOperation:] perform input operation requires a valid sessionID
-[RTIInputSystemClient remoteTextInputSessionWithID:performInputOperation:] perform input operation requires a valid sessionID
Unsupported action selector setShiftStatesNeededInDestination:autoShifted:shiftLocked:
Unsupported action selector setShiftStatesNeededInDestination:autoShifted:shiftLocked:
Unsupported action selector setShiftStatesNeededInDestination:autoShifted:shiftLocked:
Unsupported action selector setShiftStatesNeededInDestination:autoShifted:shiftLocked:
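For completeness, this is essentially the pattern that stopped working for us on iOS 17. The only extra step I know of to try is calling reloadInputViews() after swapping the input view; the CustomKeyboardView type here is just a stand-in for our real view:

import UIKit

final class CustomKeyboardView: UIView {
    // custom key layout goes here
}

final class CustomInputViewController: UIViewController {
    @IBOutlet private var textView: UITextView!
    private let customKeyboard = CustomKeyboardView()

    func showCustomKeyboard() {
        textView.inputView = customKeyboard
        textView.reloadInputViews()        // refresh in case the text view is already first responder
        textView.becomeFirstResponder()    // should present the custom input view
    }
}

Has the expected behaviour changed on iOS 17, or is this a bug?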
Hi All,
I'm very new to iOS development, and SwiftUI is my first coding language. I'm trying to link the user's Spotlight search results with the detail view whose data is stored in Core Data. I can search for the data in Spotlight, but when I tap a result it only opens the main view of the app. Is there any way I can use .onContinueUserActivity at app launch, or is there different code I have to use? I've searched many articles but couldn't find a solution. It would be great if anyone could share some links or guidance here. Thank you.
.onContinueUserActivity(DetailView.productUserActivityType) { userActivity in
    if let product = try? userActivity.typedPayload(Product.self) {
        selectedProduct = product.id.uuidString
    }
}
I got this code from Apple's state restoration sample app, but I can't use it with Core Data.
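In case it helps to show what I've pieced together so far: for Spotlight results the incoming activity type is CSSearchableItemActionType rather than a custom one, and the Core Data object can be looked up from the identifier the item was indexed under. A sketch, assuming the searchable item's unique identifier is the objectID URI string; ProductList and Product are placeholders for my own types:

import SwiftUI
import CoreSpotlight
import CoreData

struct ContentView: View {
    @Environment(\.managedObjectContext) private var viewContext
    @State private var selectedProduct: Product?   // Core Data entity

    var body: some View {
        NavigationStack {
            ProductList(selection: $selectedProduct)   // hypothetical list view
        }
        // Spotlight taps arrive as a CoreSpotlight activity, not a custom one.
        .onContinueUserActivity(CSSearchableItemActionType) { userActivity in
            guard
                let identifier = userActivity.userInfo?[CSSearchableItemActivityIdentifier] as? String,
                let uri = URL(string: identifier),
                let objectID = viewContext.persistentStoreCoordinator?
                    .managedObjectID(forURIRepresentation: uri),
                let product = try? viewContext.existingObject(with: objectID) as? Product
            else { return }
            selectedProduct = product
        }
    }
}

Is this roughly the right direction, or is there a better-supported way to hand the tapped result to the detail view?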