Hi everyone,
I have a content view that declares a StateObject
@StateObject var appModel: AppModel = AppModel.instance
I present the content view from a UIKit view controller:
let swiftUIView = ContentView()
let hostingController = UIHostingController(rootView: swiftUIView)
hostingController.modalPresentationStyle = .fullScreen
DispatchQueue.main.async{
self.present(hostingController, animated: true, completion: nil)
}
I dismiss the content view this way:
struct DismissingView: View {
@Environment(\.dismiss) var dismiss
var body: some View {
Button(action: {
dismiss()
}
, label: {
VStack(spacing: 10) {
Image(systemName: "xmark")
.resizable()
.aspectRatio(contentMode: .fit)
.frame(width: 22)
}
.foregroundColor(.white)
})
}
}
When I dismiss the content view I want to deallocate everything, but the state object remains. I have tried changing it to an @ObservedObject with no success.
Any idea how to deallocate everything? Thank you
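A minimal sketch of one direction that might help, assuming AppModel.instance is a shared singleton (a singleton would keep the object alive no matter which property wrapper the view uses): let the presenting UIKit view controller create and own the model, inject it into the SwiftUI view, and drop the reference once the hosting controller has been dismissed. All names below are stand-ins, not the app's real types.
import SwiftUI
import UIKit

final class AppModel: ObservableObject {
    @Published var title = "Hello"
    deinit { print("AppModel deallocated") }
}

struct ContentView: View {
    @ObservedObject var appModel: AppModel      // owned by the presenter, not by the view
    @Environment(\.dismiss) private var dismiss

    var body: some View {
        Button("Close") { dismiss() }
    }
}

final class PresentingViewController: UIViewController {
    private var appModel: AppModel?             // strong reference lives only while presented

    func presentContent() {
        let model = AppModel()
        appModel = model
        let hostingController = UIHostingController(rootView: ContentView(appModel: model))
        hostingController.modalPresentationStyle = .fullScreen
        present(hostingController, animated: true)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // With .fullScreen presentation this runs again after the modal is dismissed,
        // so the last strong reference to the model can be released here.
        appModel = nil
    }
}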
I'm having an issue with my view. Whenever I put it inside a ScrollView, extra padding appears. I tested to find the cause but with no luck. Here is the code for it:
NavigationView{
NavigationLink(destination: HiringAdPage(favorite: true)) {
HiringAds()
}
.buttonStyle(PlainButtonStyle())
}
The scrolling part of it:
ScrollView{
VStack(spacing: 0) {
ForEach(0..<5) {index in
HiringAdsView()
}
}
}
Someone please help; I've been trying for a while now.
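Not a confirmed diagnosis, but a common source of unexpected spacing in this setup is that every row wraps itself in its own NavigationView, and each of those reserves navigation-bar space. A sketch of the usual structure, with a single navigation container outside the ScrollView and only the NavigationLink inside the row (HiringAds and HiringAdPage are stand-ins for the views from the post):
import SwiftUI

struct HiringAds: View {                 // stand-in for the row content
    var body: some View { Text("Hiring ad").padding() }
}

struct HiringAdPage: View {              // stand-in for the destination
    let favorite: Bool
    var body: some View { Text("Detail, favorite: \(favorite.description)") }
}

struct HiringAdsView: View {
    var body: some View {
        // Only the link lives in the row; no NavigationView here.
        NavigationLink(destination: HiringAdPage(favorite: true)) {
            HiringAds()
        }
        .buttonStyle(PlainButtonStyle())
    }
}

struct HiringListView: View {
    var body: some View {
        // One navigation container (NavigationStack, or NavigationView on older targets)
        // wrapping the whole scrolling list.
        NavigationStack {
            ScrollView {
                VStack(spacing: 0) {
                    ForEach(0..<5) { _ in
                        HiringAdsView()
                    }
                }
            }
        }
    }
}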
Hi,
In the visionOS documentation
Positioning and sizing windows - Specify initial window position
In visionOS, the system places new windows directly in front of people, where they happen to be gazing at the moment the window opens.
Positioning and sizing windows - Specify window resizability
In visionOS, the system enforces a standard minimum and maximum size for all windows, regardless of the content they contain.
The first thing I don't understand is why it talks about macOS in the visionOS documentation.
The second thing: what is this page for if it's just to tell us that on visionOS we have no control over the position and size of 2D windows? It is precisely the opposite that would be interesting. I don't understand this limitation; it greatly limits the use of 2D windows on visionOS.
I really hope that this limitation will disappear in future betas.
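For what it's worth, the few knobs that do exist are at the Scene level. A minimal sketch of the two modifiers I'm aware of, defaultSize and windowResizability; whether they are enough for a given layout is exactly the open question above:
import SwiftUI

@main
struct MyVisionApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        // Suggests an initial size; the system still clamps to its own limits.
        .defaultSize(width: 800, height: 600)
        // Asks the system to size the window to the content's ideal size.
        .windowResizability(.contentSize)
    }
}

struct ContentView: View {
    var body: some View {
        Text("Hello, visionOS")
            .frame(minWidth: 400, minHeight: 300)
    }
}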
Follow up to this thread, does adjustsFontSizeToFitWidth also follow the same thinking? Is it assumed that if we use a font style, this doesn't need to be set in a UIButton if we're using UIButtonConfiguration?
I have an app that launches into an immersive space with a mixed immersion style.
It appears that the RealityView has bounds that resemble a window. I would expect the bounds not to exist because it's an ImmersiveSpace.
Why do they exist? And how can I remove them?
This is the entire code:
@main
struct RealityKitDebugViewApp: App {
var body: some Scene {
ImmersiveSpace {
ContentView()
}
}
}
struct ContentView: View {
@State var logMessages = [String]()
var body: some View {
RealityView { content, attachments in
let root = Entity()
content.add(root)
guard let debugView = attachments.entity(for: "entity_debug_view") else { return }
debugView.position = [0, 0, 0]
root.addChild(debugView)
} update: { content, attachments in
} attachments: {
Color.blue
.tag("entity_debug_view")
}
.onAppear(perform: {
self.logMessages.append("Hello World")
})
}
}
SwiftUI Buttons in a List no longer highlight when tapped. They seem to have stopped highlighting after iOS 16.0. I've only tested on an iPhone/simulators, so I'm not sure if iPad has the same issue.
The issue has carried over to iOS 17 Beta 4. My app does not feel Apple-like, as there is no visual feedback for the user when a button in the list is pressed.
Does anyone know why this is occurring? Is this a bug on Apple's end?
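Not an explanation for the regression, but one workaround while waiting on a fix is a custom ButtonStyle that draws its own pressed state instead of relying on the List's default highlight. A minimal sketch:
import SwiftUI

// Draws an explicit highlight when the button is pressed, independent of
// whatever the List does (or no longer does) by default.
struct HighlightingButtonStyle: ButtonStyle {
    func makeBody(configuration: Configuration) -> some View {
        configuration.label
            .frame(maxWidth: .infinity, alignment: .leading)
            .contentShape(Rectangle())
            .background(configuration.isPressed ? Color.gray.opacity(0.3) : Color.clear)
    }
}

struct ListButtonsView: View {
    var body: some View {
        List {
            ForEach(0..<5) { index in
                Button("Row \(index)") {
                    print("Tapped row \(index)")
                }
                .buttonStyle(HighlightingButtonStyle())
            }
        }
    }
}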
Hi,
I'm looking through the Map for SwiftUI documentation (including the iOS 17 beta) for a way to adjust the Map() scale, or zoom level, while simultaneously showing the user's location and heading, for which I'm doing this:
@State var position: MapCameraPosition = .userLocation(followsHeading: true, fallback: .automatic)
Map(position: $position)
It does not appear to be possible, so I'm looking for confirmation. Thanks everyone.
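I don't know of a way to pin the zoom while followsHeading is active either. One partial alternative (a sketch, not a confirmed equivalent) is to drive the camera yourself with a fixed distance and show the user via UserAnnotation(), at the cost of the automatic following:
import SwiftUI
import MapKit

struct FixedZoomMapView: View {
    // A camera you control, with an explicit distance (i.e. zoom level).
    @State private var position: MapCameraPosition = .camera(
        MapCamera(
            centerCoordinate: CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090),
            distance: 1_000
        )
    )

    var body: some View {
        Map(position: $position) {
            // Shows the user's location; heading/following has to be handled manually.
            UserAnnotation()
        }
        .mapControls {
            MapUserLocationButton()   // lets the user snap the camera back to their location
            MapCompass()
        }
    }
}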
Since Xcode 15 beta 5, making a class with the @Observable macro no longer requires all properties to have an initialization value, as seen in the video. Just add an init that sets the properties and everything works correctly.
@Observable
final class Score: Identifiable {
let id: Int
var title: String
var composer: String
var year: Int
var length: Int
var cover: String
var tracks: [String]
init(id: Int, title: String, composer: String, year: Int, length: Int, cover: String, tracks: [String]) {
self.id = id
self.title = title
self.composer = composer
self.year = year
self.length = length
self.cover = cover
self.tracks = tracks
}
}
But there is a problem: the @Observable macro applies the @ObservationTracked macro to each property, and that does not make the type conform to Equatable nor, by extension, Hashable.
Obviously, since this affects each property, it is not practical to force the conformance on the class by hand with the static == function or the hash(into:) function that those protocols require.
Because a class we mark as @Observable does not conform to Hashable, no instance built with the new pattern can be used within a NavigationStack through the data-driven navigation bindings and the navigationDestination(for:) modifier.
I understand that no one has found a solution to this. If you have found one, it would be great if you could share it, but mainly I am making this post to invoke the mighty developers at Apple to fix this bug. Thank you very much.
P.S. I also posted a Feedback (FB12535713), but no one has replied, at least as far as I can see.
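In case it helps in the meantime: the workaround I've seen (which is, admittedly, the forced conformance mentioned above) is an identity-based Equatable/Hashable extension on the class, which is enough for navigationDestination(for:) and path-based navigation. A sketch against the Score class above, hashing on its existing id:
// Identity-based conformance so Score values can be pushed onto a NavigationStack path.
extension Score: Hashable {
    static func == (lhs: Score, rhs: Score) -> Bool {
        lhs.id == rhs.id          // compare by id; use === for pure reference identity
    }

    func hash(into hasher: inout Hasher) {
        hasher.combine(id)
    }
}
With that in place, NavigationLink(value: score) together with .navigationDestination(for: Score.self) { ... } should compile and navigate as expected.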
Hi guys,
I am migrating my widgets to iOS 17, and because I already manage my layout margins, I just want to disable the new built-in widget content margins.
I did it by using ".contentMarginsDisabled()" on the WidgetConfiguration and it works fine at run time.
WIDGET CODE
struct MyWidget: Widget {
let kind: String = "MyWidget"
var body: some WidgetConfiguration {
return IntentConfiguration(kind: kind, intent: MyWidgetIntent.self, provider: WidgetProvider<MyWidgetIntent>()) { entry in
WidgetView<MyWidgetIntent>(entry: entry)
}
.configurationDisplayName("My Widget")
.supportedFamilies([WidgetFamily.systemMedium])
.contentMarginsDisabled() // <-- HERE
}
}
RESULT
Nevertheless, when previewing the WidgetView in a WidgetPreviewContext, I didn't find any way to disable the content margins (because I'm manipulating the view directly and not a WidgetConfiguration).
PREVIEW CODE
struct MyWidget_Previews: PreviewProvider {
static var previews: some View {
WidgetView<MyWidgetIntent>(entry: WidgetEntry<MyWidgetIntent>())
// .padding(-16) Need to add this negative padding to disable margin
.previewContext(
WidgetPreviewContext(family: .systemMedium))
}
}
Do you know how to disable the content margins for widget preview?
As Natascha notes in her helpful article
https://tanaschita.com/20230807-migrating-to-observation/
pre-iOS 17 it was like this:

             State view      Subview
             -----------------------------------
Value Type   @State          @Binding
Ref Type     @StateObject    @ObservedObject

With iOS 17, it's like this:

             State view      Subview
             -----------------------------------
Value Type   @State          @Binding
Ref Type     @State          @Bindable
I like how they simplified @State and @StateObject into just @State for both cases. I'm curious, though, why didn't they simplify @Binding and @ObservedObject into just @Binding? Why did they need to keep the separate property wrapper @Bindable? I'm sure there's a good reason; I'm just wondering if anybody knows why.
Interestingly, you can use both @Binding and @Bindable, and they both seem to work. I know that you're supposed to use @Bindable here, but I'm curious why @Binding also works.
import SwiftUI
@Observable
class TestClass {
var myNum: Int = 0
}
struct ContentView: View {
@State var testClass1 = TestClass()
@State var testClass2 = TestClass()
var body: some View {
VStack {
Text("testClass1: \(testClass1.myNum)")
Text("testClass2: \(testClass2.myNum)")
// Note the passing of testClass2 without $. Xcode complains otherwise.
ButtonView(testClass1: $testClass1, testClass2: testClass2)
}
.padding()
}
}
struct ButtonView: View {
@Binding var testClass1:TestClass
@Bindable var testClass2:TestClass
var body: some View {
Button(action: {
testClass1.myNum += 1
testClass2.myNum += 2
} , label: {
Text("Increment me")
})
}
}
I have a regular SwiftUI view embedded inside a NavigationStack. In this view, I use the .searchable() view modifier to make that view searchable. I have a toolbar button placed in the .confirmationAction section, which is a problem when a user types into the search bar and the button gets replaced by the search bar's cancel button.
Thus, I conditionally place the button, depending on whether the user is searching, either on the navigation bar or on the keyboard. The latter does not work, however: the button does not show, and when trying to debug the view hierarchy, Xcode throws an error saying the view hierarchy could not be displayed. If I put the button on the .bottomBar instead, it shows up perfectly and the view hierarchy also displays with no further issue.
Has anyone come across this issue, and if so, how did you fix it?
Thank you in advance.
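For reference, the pattern I would expect to work (and which seems to trip up the .keyboard placement) looks roughly like the sketch below; the working .bottomBar variant is the same code with the placement swapped. Note that isSearching has to be read from a view inside the searchable hierarchy. All view names are stand-ins:
import SwiftUI

struct SearchableScreen: View {
    @State private var query = ""

    var body: some View {
        NavigationStack {
            SearchableContent()
                .searchable(text: $query)
        }
    }
}

struct SearchableContent: View {
    // Only meaningful inside the hierarchy the .searchable modifier is applied to.
    @Environment(\.isSearching) private var isSearching

    var body: some View {
        List(0..<20, id: \.self) { index in
            Text("Row \(index)")
        }
        .toolbar {
            // While searching, .confirmationAction is covered by the Cancel button, so the
            // button moves to .keyboard, which is the placement that does not show up;
            // .bottomBar is the placement that reportedly works.
            ToolbarItem(placement: isSearching ? .keyboard : .confirmationAction) {
                Button("Done") {
                    print("Done tapped")
                }
            }
        }
    }
}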
I have an app which uses SwiftUI and Mac Catalyst. When running on a Mac, I want to provide a preferences menu entry with the usual keyboard shortcut Command + ,. An implementation via the Settings bundle is out of the question, since my preferences are too complex for this.
Here is a reduced example of my implementation:
import SwiftUI
@main
struct PreferencesMenuTestApp: App {
@UIApplicationDelegateAdaptor private var appDelegate: AppDelegate
var body: some Scene {
WindowGroup {
ContentView()
}
}
}
class AppDelegate: UIResponder, UIApplicationDelegate {
override func buildMenu(with builder: UIMenuBuilder) {
let preferencesCommand = UIKeyCommand(title: "Preferences…",
action: #selector(showPreferences),
input: ",",
modifierFlags: .command)
// let preferencesCommand = UIAction(title: "Preferences…") { action in
// debugPrint("show preferences")
// }
let menu = UIMenu(title: "Preferences…",
options: .displayInline,
children: [preferencesCommand])
builder.insertSibling(menu, afterMenu: .about)
}
@objc
func showPreferences() {
debugPrint("show preferences")
}
}
The problem is that the menu entry is disabled. Obviously the provided selector is not recognised. When I mark the AppDelegate with @main, the menu entry is enabled, but of course then the app's window is empty.
When I switch to the UIAction implementation (the commented-out code) it works fine. But since one cannot provide a keyboard shortcut for a UIAction, this is not a good solution.
What am I missing? How would one implement a preferences menu entry that actually works?
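Since the reduced example already uses the SwiftUI App lifecycle, one alternative worth trying (a sketch, not a confirmed fix for the responder-chain problem) is to skip UIMenuBuilder entirely and declare the item with SwiftUI commands, which also carry the keyboard shortcut. The sheet content and the showPreferences state below are hypothetical stand-ins for the real preferences UI:
import SwiftUI

@main
struct PreferencesMenuTestApp: App {
    @State private var showPreferences = false

    var body: some Scene {
        WindowGroup {
            ContentView()
                .sheet(isPresented: $showPreferences) {
                    Text("Preferences go here")   // hypothetical preferences UI
                }
        }
        .commands {
            // Fills the standard "app settings" slot in the Mac menu bar.
            CommandGroup(replacing: .appSettings) {
                Button("Preferences…") {
                    showPreferences = true
                }
                .keyboardShortcut(",", modifiers: .command)
            }
        }
    }
}

struct ContentView: View {                        // stand-in for the view from the post
    var body: some View { Text("Main content") }
}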
Summary
When trying to display SwiftUI previews, building the previews may fail with the following error:
Linking failed: linker command failed with exit code 1 (use -v to see invocation)
ld: warning: search path '/Applications/Xcode.app/Contents/SharedFrameworks-iphonesimulator' not found
ld: warning: Could not find or use auto-linked framework 'CoreAudioTypes': framework 'CoreAudioTypes' not found
Note that my app does not use CoreAudioTypes.
Observation
This issue seems to occur when two conditions are met:
The SwiftUI view must be located in a Swift Package
Somewhere in either the View or the #Preview a type from another package has to be used.
Say I have two packages, one named Model-Package and one named UI-Package. The UI-Package depends on the Model-Package. If I have a SwiftUI view in the UI-Package that uses a type from the Model-Package, either in the view itself or in the #Preview, then the described error occurs. If I have a view in the UI-Package that does not use a type from the Model-Package anywhere in its view or #Preview, then the SwiftUI preview builds and renders successfully.
I created a bug report: FB13033812
Our app has an architecture based on ViewModels.
Currently, we are working on migrating from the ObservableObject protocol to the Observable macro (iOS 17+).
The official docs about this are available here: https://developer.apple.com/documentation/swiftui/migrating-from-the-observable-object-protocol-to-the-observable-macro
Our ViewModels that were previously annotated with @StateObject now use just @State, as recommended in the official docs.
Some of our screens (a screen is a SwiftUI view with a corresponding ViewModel) are presented modally. We expect that after dismissing a SwiftUI view that was presented modally, its corresponding ViewModel, which is owned by this view (via the @State modifier), will be deinitialized. However, it seems there is a memory leak, as the ViewModel is not deinitialized after a modal view is dismissed.
Here's a simple code where ModalView is presented modally (through the .sheet modifier), and ModalViewModel, which is a @State of ModalView, is never deinitialized.
import SwiftUI
import Observation
@Observable
final class ModalViewModel {
init() {
print("Simple ViewModel Inited")
}
deinit {
print("Simple ViewModel Deinited") // never called
}
}
struct ModalView: View {
@State var viewModel: ModalViewModel = ModalViewModel()
let closeButtonClosure: () -> Void
var body: some View {
ZStack {
Color.yellow
.ignoresSafeArea()
Button("Close") {
closeButtonClosure()
}
}
}
}
struct ContentView: View {
@State var presentSheet: Bool = false
var body: some View {
Button("Present sheet modally") {
self.presentSheet = true
}
.sheet(isPresented: $presentSheet) {
ModalView {
self.presentSheet = false
}
}
}
}
#Preview {
ContentView()
}
Is this a bug in the iOS 17 beta version or intended behavior? Is it possible to build a relationship between the View and ViewModel in a way where the ViewModel will be deinitialized after the View is dismissed?
Thank you in advance for the help.
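A sketch of one possible workaround (under the assumption that the leak is tied to the dismissed sheet's own @State storage, which I haven't confirmed): let the presenting view own the model as optional state and drive the sheet with .sheet(item:), so dismissing sets the only strong reference back to nil. Identifiable gets its default id for class types, so no extra code is needed for that:
import SwiftUI
import Observation

@Observable
final class ModalViewModel: Identifiable {
    init() { print("Simple ViewModel Inited") }
    deinit { print("Simple ViewModel Deinited") }
}

struct ModalView: View {
    let viewModel: ModalViewModel            // passed in, not owned via @State
    let closeButtonClosure: () -> Void

    var body: some View {
        ZStack {
            Color.yellow.ignoresSafeArea()
            Button("Close") { closeButtonClosure() }
        }
    }
}

struct ContentView: View {
    @State private var modalViewModel: ModalViewModel?   // nil while nothing is presented

    var body: some View {
        Button("Present sheet modally") {
            modalViewModel = ModalViewModel()
        }
        .sheet(item: $modalViewModel) { viewModel in
            ModalView(viewModel: viewModel) {
                modalViewModel = nil          // releases the only strong reference
            }
        }
    }
}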
Sonoma beta release notes mention that NSMenu was rewritten from scratch using AppKit; however, it seems like a lot of behavior was removed along the way, which breaks applications. I've filed several reports using Feedback Assistant, but none of them have been fixed in the three following betas.
FB12867496: NSMenu no longer receives keyboard events from GetEventDispatcherTarget (there is a workaround)
FB12867573: NSMenuItem custom view window is nil
FB12887219: NSMenu performSelector highlightItem doesn't highlight menu item
FB12938907: NSMenu not properly updated when adding/removing NSMenuItem
I wonder if anyone else has experienced similar problems and can share workarounds for them.
At the moment, using @Bindable for an object stored in the Environment works in a cumbersome way:
struct ContentView: View {
@Environment(Model.self) var model
var body: some View {
@Bindable var model = model
VStack {
Text(model.someField.uppercased())
TextField("", text: $model.someField)
someSubView
}
.padding()
}
@ViewBuilder
var someSubView: some View {
@Bindable var model = model
TextField("", text: $model.someField)
}
}
A new @Bindable needs to be instantiated for each computed property in the view, which creates boilerplate I would like to avoid.
I made a new property wrapper which functions the same as the EnvironmentObject wrapper, but for Observable:
@propertyWrapper
struct EnvironmentObservable<Value: AnyObject & Observable>: DynamicProperty {
@Environment var wrappedValue: Value
public init(_ objectType: Value.Type) {
_wrappedValue = .init(objectType)
}
public init() {
_wrappedValue = .init(Value.self)
}
private var store: Bindable<Value>!
var projectedValue: Bindable<Value> {
store
}
mutating func update() {
store = Bindable(wrappedValue)
}
}
Example:
struct ContentView: View {
@EnvironmentObservable var model: Model
var body: some View {
VStack {
Text(model.someField.uppercased())
SubView(value: $model.someField)
someSubView
}
.padding()
}
var someSubView: some View {
TextField("", text: $model.someField)
}
}
I was wondering if there would be any downsides to using this method? In my testing it seems to behave the same, but I'm not sure whether using this could have a performance impact.
In UIKit, we can add insets to an MKMapView with setVisibleMapRect(_:edgePadding:animated:) to have additional space around a specified MKMapRect. It's useful for UIs like Apple Maps or FlightyApp (see first screenshot attached). This means we can have a modal sheet above the map but still see all the content added to the map.
I'm trying to do the same for a SwiftUI Map (on iOS 17) but I can't find a way to do it: see screenshot 2 below. Is it possible to obtain the same result or should I file a feedback for an improvement?
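One idea worth trying while waiting for an answer (an untested assumption on my part, not a documented equivalent of edgePadding): reserve space at the bottom of the SwiftUI Map with a safe-area inset matching the sheet height, so that automatic camera framing keeps the annotations above the sheet. The coordinates and the 320-point height are placeholders:
import SwiftUI
import MapKit

struct InsetMapView: View {
    @State private var position: MapCameraPosition = .automatic

    var body: some View {
        Map(position: $position) {
            Marker("Cupertino", coordinate: .init(latitude: 37.3349, longitude: -122.0090))
            Marker("San Francisco", coordinate: .init(latitude: 37.7749, longitude: -122.4194))
        }
        .safeAreaInset(edge: .bottom) {
            // Stand-in for the height of the modal sheet shown over the map.
            Color.clear.frame(height: 320)
        }
    }
}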
Try the following code on macOS, and you'll see the marker is added in the wrong place, as the conversion from screen coordinates to map coordinates doesn't work correctly.
The screenCoord value is correct, but reader.convert(screenCoord, from: .local) offsets the resulting coordinate by the height of the content above the map, despite the .local parameter.
struct TestMapView: View {
@State var placeAPin = false
@State var pinLocation: CLLocationCoordinate2D? = nil
@State private var cameraPosition: MapCameraPosition = .camera(
MapCamera(
centerCoordinate: .denver,
distance: 3729,
heading: 92,
pitch: 70
)
)
var body: some View {
VStack {
Text("This is a bug demo.")
Text("If there are other views above the map, the MapProxy doesn't convert the coordinates correctly.")
MapReader { reader in
Map(
position: $cameraPosition,
interactionModes: .all
)
{
if let pl = pinLocation {
Marker("(\(pl.latitude), \(pl.longitude))", coordinate: pl)
}
}
.onTapGesture(perform: { screenCoord in
pinLocation = reader.convert(screenCoord, from: .local)
placeAPin = false
if let pinLocation {
print("tap: screen \(screenCoord), location \(pinLocation)")
}
})
.mapControls{
MapCompass()
MapScaleView()
MapPitchToggle()
}
.mapStyle(.standard(elevation: .automatic))
}
}
}
}
extension CLLocationCoordinate2D {
static var denver = CLLocationCoordinate2D(latitude: 39.742043, longitude: -104.991531)
}
(FB13135770)
I have a multiplatform app where I support iOS, macOS and tvOS. There is one target which supports it all. In my asset catalog I have the AppIcon entry, which holds the app icon for iOS and macOS. This works as expected. However, the tvOS app icon is ignored. I added a "tvOS App Icon & Top Shelf Image" asset to my asset catalog and filled it with my icons for tvOS.
Then I added it in the target's general settings App Icon entry under App Icons and Launch Screen, as shown in the screenshot.
What am I missing? What needs to be done to make this work?
Hello!
I've tested/implemented TipKit in SwiftUI and UIKit, but it seems that the close (i.e. X) button doesn't work in UIKit while it does in SwiftUI. I'm not sure if this is a bug or whether I have to do something differently in UIKit.
Testing with Xcode 15 Beta 8
Thanks!