I'd like an Image subview of a lock screen widget to render as itself, and not with the multiply-like effect it gets today.
I've tried .widgetAccentable(true) and .widgetAccentable(false), but neither gives the appearance I'm looking for.
Is there maybe a new modifier that lets me "force" the rendering mode? Hoping there is and it's just not jumping out at me.
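For reference, this is roughly the view in question (the view and asset names are mine):

```swift
import SwiftUI
import WidgetKit

// A minimal sketch of the widget view. On the Lock Screen the Image
// still renders with the accent/multiply treatment regardless of the
// widgetAccentable value.
struct LogoWidgetView: View {
    var body: some View {
        Image("Logo")
            .resizable()
            .widgetAccentable(false) // also tried true – no visible difference
    }
}
```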
Thanks for your help.
Is there anything new this year to support reordering OutlineGroup items, or reordering items across sections in a multi-section list?
I really want to build my sidebar in SwiftUI, but user-driven ordering is a must for me.
Hello, I'm trying to make a grid of container-relative shapes, where outside gutters match the gutters in between the items.
The stickiest part of this problem is that calling .inset on a ContainerRelativeShape doubles the gutter in between the items.
I've tried LazyVGrid, and an HStack of VStacks, and they all have this double gutter in between.
I think I could move forward with some gnarly frame math, but I was curious if I'm missing some SwiftUI layout feature that could make this easier and more maintainable.
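A minimal sketch of the layout I'm fighting with (counts and sizes are arbitrary):

```swift
import SwiftUI

// Each cell insets the container-relative shape, so neighboring cells
// end up with a 2× gutter between them while the outer edges only get 1×.
struct InsetGrid: View {
    private let gutter: CGFloat = 8
    private let columns = [GridItem(.flexible()), GridItem(.flexible())]

    var body: some View {
        LazyVGrid(columns: columns) {
            ForEach(0..<4) { _ in
                ContainerRelativeShape()
                    .inset(by: gutter) // doubles up between adjacent items
                    .fill(.blue)
                    .frame(height: 80)
            }
        }
    }
}
```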
When using showAnchorGeometry I see lots of green surface anchors in my scene, and it has been really helpful for debugging the placement of objects when I tap the screen of my device.
But I also get some blue shapes, and I'm not quite sure what those mean... Is there a document that explains what showAnchorGeometry is actually... showing?
Googling "showAnchorGeometry blue" wasn't helpful! (I promise I tried)
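For context, this is all I'm doing to turn the visualization on (a standard ARView setup):

```swift
import RealityKit

// Enabling the anchor visualization on an existing ARView.
// The green geometry clearly maps to detected planes I can tap;
// the blue shapes are the part I can't find documented.
func enableAnchorDebugging(on arView: ARView) {
    arView.debugOptions.insert(.showAnchorGeometry)
    arView.debugOptions.insert(.showAnchorOrigins) // kept on in case it's related
}
```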
Hello,
I'm noticing that during a collaborative session, anchors created on the host device appear in a different location on client devices, which makes it challenging to test other collaboration logic.
For example, when the client places a textured plane mesh at an anchor placed by the host, the placement can sometimes be considered behind a surface, and the mesh gets clipped by RealityKit's mesh occlusion.
I’d prefer if I could see it floating in space when testing so I can see that something is happening.
I'm drawing a blank on whether there are any debugging options to help me out. Nothing in the render or debug options jumped out at me.
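In case it matters, the only occlusion-related knob I know of is the scene-understanding option I'm already using; turning it off hides the symptom but not the underlying anchor drift (a sketch):

```swift
import RealityKit

// Toggling RealityKit's mesh occlusion on an ARView. With occlusion off,
// the misplaced plane at least stays visible instead of being clipped.
func setMeshOcclusion(_ enabled: Bool, on arView: ARView) {
    if enabled {
        arView.environment.sceneUnderstanding.options.insert(.occlusion)
    } else {
        arView.environment.sceneUnderstanding.options.remove(.occlusion)
    }
}
```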
Thoughts?
Hello,
I'm noticing odd behavior when I try to send a package file as a resource to a peer.
A file like this is basically a folder with an extension, and despite receiving a Progress object from the send call, I'm not seeing it rise past 0.0%. Attaching it to a ProgressView also does not show any progress.
The completion handler of the send is never called with an error and the receiver will only get the “finished receiving” callback if the host cancels or disconnects.
I didn’t see anything in the sendResource documentation about not supporting bundle files, but it would not surprise me.
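Roughly what the sending side looks like (identifiers are mine; the URL points at a directory-backed package on disk):

```swift
import MultipeerConnectivity

// session is a connected MCSession, peer a connected MCPeerID.
func sendPackage(at url: URL, over session: MCSession, to peer: MCPeerID) -> Progress? {
    let progress = session.sendResource(at: url,
                                        withName: url.lastPathComponent,
                                        toPeer: peer) { error in
        // Never invoked with an error in my testing.
        if let error = error {
            print("sendResource failed:", error)
        }
    }
    // progress?.fractionCompleted never rises past 0.0 for package files.
    return progress
}
```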
Any thoughts?
Background
So, I've got an anchor that I add to my Session after performing a raycast from a user's tap.
This anchor is named "PictureAnchor".
This anchor is not getting saved in my scene's world map, and I'm not sure why.
Information Gathering
I keep an eye on my session by outputting some information in
func session(_ session: ARSession, didUpdate frame: ARFrame)
As ARFrames are processed, I look at the scene's anchors via
sceneView.scene.anchors.filter({ $0.name == "PictureAnchor" })
and I see that my anchor is present in the scene's anchors.
However, when I call frame.anchors.filter to check the anchors of the ARFrame itself, my PictureAnchor is never present.
Furthermore, if I "save" the worldMap, an Anchor named PictureAnchor is not present.
Note: I could be totally wrong on how to read the data inside a saved world map, but I'm taking the anchors array at face value.
Other Information
I've noticed that the AR Persistence sample project actually checks for the anchor to be present in the ARFrame's anchors before permitting a save, but that condition never becomes true for me.
I also noticed that my scene can have over 100 anchors, and the frame can have over 40, but only around 8 or 16 anchors are saved to the world map.
Main Question Restated
So, my main question is, why is my user-added "PictureAnchor" not present in the ARWorldMap, when I save my scene's map?
I see that it's present in the scene's anchors, but not present in the ARFrame's anchors.
A model entity is visible in the scene after being attached to this anchor as well.
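For completeness, this is how I inspect the map when saving (a sketch, with error handling trimmed):

```swift
import ARKit

// Check whether the user-added anchor made it into the world map.
func checkWorldMap(for session: ARSession) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("No world map:", error?.localizedDescription ?? "unknown")
            return
        }
        let matches = map.anchors.filter { $0.name == "PictureAnchor" }
        // Always empty for me, even though the scene shows the anchor.
        print("PictureAnchor present in world map:", !matches.isEmpty)
        print("Total anchors in map:", map.anchors.count) // ~8–16 in practice
    }
}
```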
Hello, I've noticed that when I set the image of a picture frame asset in Reality Composer, it changes its size and aspect ratio to match the image. That's pretty nice!
I would like to let a user dynamically modify that picture while running the app. Is this possible? Or are the model's properties you set in the composer locked in when you export?
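What I'd hope to be able to do at runtime, roughly (entity and asset names are hypothetical, and I'm unsure whether a Reality Composer picture frame exposes its material like this):

```swift
import RealityKit
import UIKit

// Swap the picture texture on an already-loaded model entity.
func setPicture(named imageName: String, on frame: ModelEntity) {
    guard let texture = try? TextureResource.load(named: imageName) else { return }
    var material = UnlitMaterial()
    material.color = .init(tint: .white, texture: .init(texture))
    frame.model?.materials = [material]
}
```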
I want to create a feature where a user can stick images from my app onto their walls. I want to persist their placements between launches and use pinch and pan gestures to manipulate the images.
I see lots of articles going back a few years that show how to do this in ARKit, but going through WWDC videos I'm seeing a trend toward RealityKit, and am starting to think that's the "right" thing to learn.
Is RealityKit the most up-to-date secret sauce? Is there a sample project like this one, but using RealityKit?
https://developer.apple.com/documentation/arkit/environmental_analysis/placing_objects_and_handling_3d_interaction
For my first build, my Package.resolved was not committed to the repository. I've fixed that, and if I check my main branch on GitHub I can see the Package.resolved file in the xcshareddata directory.
Even so, Xcode Cloud is telling me that the file is missing and is failing to start my builds.
Could there be a caching issue going on?
My .gitignore file is empty.
Asking with the WWDC-10220 tag because Fruta is the sample code for this presentation.
When launching Fruta on a landscape iPad Pro 11", the "State" of the application is completely empty until the user taps the back button. After tapping back everything appears to pop into place.
Is this expected behavior in SwiftUI when using split screens?
NavigationLinks are finicky, and I'd expect the programmatic setting of the primary column's "selection" to be resolved on launch, not when the user taps back.
So, I've got a SwiftUI app with a triple-column split view.
When the app starts on an 11-inch iPad, the "primary" column is offscreen.
The primary column has a List full of NavigationLinks.
Like so:
List {
    ForEach(items, id: \.itemID) { item in
        NavigationLink(tag: item, selection: $selectedItem) {
            ItemDetailsView(item: item)
            ...
Now, the selection of the first Column in the split view cascades through the rest of the app, so populating it is pretty important.
I've tried having the selectedItem be set from an EnvironmentObject. I've also tried having it set in onAppear.
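For reference, the onAppear attempt looks roughly like this (the link label is hypothetical):

```swift
// Setting the selection when the sidebar list first appears; the detail
// column still doesn't populate until the primary column is shown.
List {
    ForEach(items, id: \.itemID) { item in
        NavigationLink(tag: item, selection: $selectedItem) {
            ItemDetailsView(item: item)
        } label: {
            Text(item.title) // hypothetical label
        }
    }
}
.onAppear {
    selectedItem = items.first
}
```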
Everything I try only causes the selection to "pop into place" whenever I expose the primary column of the sidebar.
Am I going about this the wrong way?
Is it because the sidebar is hidden by default?
Hey all,
Been digging around the internet looking for this one, and while Stack Overflow has some relevant solutions, none are working for me.
My view hierarchy is the following:
View
---> UISplitViewController.view (set as a child view controller)
--------> rootViewController.view (set as the main view controller of the split view)
--------> detailViewController.view (set as the detail view controller of the split view)
Via the iPhone 6 simulator (split view is always collapsed) I present a modal view controller with the following code:
UINavigationController *navigationController = [[UINavigationController alloc] initWithRootViewController:viewController];
[navigationController.navigationBar setBarStyle:UIBarStyleBlack];
[navigationController setModalPresentationStyle:UIModalPresentationPopover];
navigationController.popoverPresentationController.sourceView = view;
navigationController.popoverPresentationController.barButtonItem = barButtonItem;
navigationController.popoverPresentationController.delegate = self;
[self presentViewController:navigationController animated:YES completion:nil];
I dismiss the presented controller from that view controller by calling:
[self dismissViewControllerAnimated:YES completion:nil];
If I set animated to NO I don't have any problems, but it looks bad and doesn't make sense.
I see some posts regarding this and custom presentation methods, but I'm not using anything custom here.
Any help is appreciated!
EDIT: On iPhone the modal presentation style should default to UIModalPresentationOverFullScreen, so I tried setting the presentation style directly to that, and it worked! If I set the presentation style to UIModalPresentationFullScreen I get the same behavior: a black screen after dismissing.
I've tried a few things to get dropdown menus working on my Catalyst toolbar.
let barButton = UIBarButtonItem(systemItem: .add,
                                primaryAction: UIAction(title: "Add", handler: { [weak self] _ in
                                    ...
                                }),
                                menu: menu)
let addItem = NSToolbarItem(itemIdentifier: .addItem, barButtonItem: barButton)
Even tried adding an itemMenuFormRepresentation, but no dice.
addItem.itemMenuFormRepresentation = AppDelegate.newMenu
Has anyone got this working on Big Sur?
I'm giving my users a way to configure the dark mode style of an application. I achieve this by setting the overrideUserInterfaceStyle property of the windows on every WindowScene of the application.
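The override itself (a sketch of what I'm doing; `style` comes from the user's setting):

```swift
import UIKit

// Apply the chosen style to every window of every connected scene.
func applyInterfaceStyle(_ style: UIUserInterfaceStyle) {
    for case let windowScene as UIWindowScene in UIApplication.shared.connectedScenes {
        for window in windowScene.windows {
            window.overrideUserInterfaceStyle = style
        }
    }
}
```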
This works on iPad and iPhone, but on Catalyst the toolbar area of the window does not change style. It's always the same as the system.
Is there a different technique required to change the interface style of a Catalyst application?