Hi, I have a SwiftUI view that is attached to a 3D object in a RealityView. It's meant to be a HUD where the user can select a few things. I wanted a submenu for one of the top-level buttons, but none of the reasonable choices (Menu, Sheet, or Popover) seem to work. Is there a known limitation of RealityKit attachment views where full SwiftUI cannot be used, or am I doing something wrong? For example:

    Button {
        SLogger.info("Toggled")
        withAnimation {
            showHudPositionMenu.toggle()
        }
    } label: {
        HStack {
            Image(systemName: "rectangle.3.group")
            Text("My Button")
        }
    }
    .popover(isPresented: $showHudPositionMenu, attachmentAnchor: attachmentAnchor) {
        HudPositionMenuItems(showHudPositionMenu: $showHudPositionMenu,
                             currentHudPosition: $currentHudPosition)
    }

This prints "Toggled" but never displays the HudPositionMenuItems popover. If it makes any difference, the view is attached to a child of a head-tracked entity.
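One presentation-free workaround, sketched below with the names from the post (showHudPositionMenu, currentHudPosition, HudPositionMenuItems), is to render the submenu inline in the attachment's own view hierarchy instead of presenting it. This assumes the attachment can't host a popover and sidesteps the presentation machinery entirely:

    // Sketch of an inline fallback: show the submenu conditionally inside
    // the attachment itself rather than presenting it.
    VStack(alignment: .leading) {
        Button {
            SLogger.info("Toggled")
            withAnimation { showHudPositionMenu.toggle() }
        } label: {
            HStack {
                Image(systemName: "rectangle.3.group")
                Text("My Button")
            }
        }
        if showHudPositionMenu {
            HudPositionMenuItems(showHudPositionMenu: $showHudPositionMenu,
                                 currentHudPosition: $currentHudPosition)
        }
    }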
Search results for "A Summary of the WWDC25 Group Lab"
Hey @Jermi, I recommend that you sign up for the upcoming Object Capture and RoomPlan lab! -- Greg
Topic: Spatial Computing
SubTopic: General
They said Xcode will just read the .icon file, so I assume this is only on macOS 26. I'd love to hear confirmation of that; I'm not going to load beta macOS, and beta Xcode was a big enough step for me. Someone asked in the group lab whether .icon files were back-portable. The answer was yes, but I think what they meant was that you can use the output of Icon Composer in older Xcode, because the outputs are just PNG or SVG files (whatever you imported). I seem to remember never coming back from deleting the AppIcon asset; it's in the back of my mind as a Bad Idea! I think I created a new project and copied all my files back in. Probably I didn't have a recent enough commit.

[Attached screenshot: Xcode 26 Icon Composer showing the Any Appearance, Default, Dark, Dark Tinted, and Mono variants]
Topic: Developer Tools & Services
SubTopic: Xcode
As part of the WWDC25 Keynote, a technology was announced that can present 2D images as 3D spatial scenes. This announcement is supported by a Press Release. ...developers can use the Spatial Scene API to make their app experience even more immersive. Zillow is taking advantage of the API for their Zillow Immersive app, allowing users to see images of homes and apartments with the rich depth and dimension that spatial scenes offer. The feature also appears in the Photos App on iOS 26 Developer Beta 1. Tapping Spatial Scene on any photo opens a view of that photo with a parallax effect. I've searched the WWDC sessions and new documentation and have come up short. Reaching out here for help. Is there any documentation for Spatial Scene API? Or any guidance on how to implement the spatial scene in iOS?
Hello! There's new API in RealityKit in visionOS 26 to generate and present spatial scenes in your own app. For more information, check out the new ImagePresentationComponent and Spatial3DImage APIs. The Presenting images in RealityKit sample code project is a great place to get started with these APIs, and the What's new in RealityKit video from WWDC25 has a section showing how to use ImagePresentationComponent for spatial scenes. Note that the spatial scene APIs mentioned above are for visionOS only. Please do file feedback via Feedback Assistant for what you would like to be able to create on iOS.
Topic: Graphics & Games
SubTopic: RealityKit
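For orientation, here is a minimal sketch of the flow this reply describes. The exact call sequence is an assumption drawn from the WWDC25 "What's new in RealityKit" session and the Presenting images in RealityKit sample, so verify it against the current documentation:

    import RealityKit

    // Hypothetical helper: turn a 2D image into a spatial-scene entity.
    func makeSpatialSceneEntity(from url: URL) async throws -> Entity {
        // Load the image as a spatial 3D image and generate the spatial scene.
        let spatial3DImage = try await ImagePresentationComponent.Spatial3DImage(contentsOf: url)
        try await spatial3DImage.generate()

        // Attach the component to an entity to present the result.
        var component = ImagePresentationComponent(spatial3DImage: spatial3DImage)
        component.desiredViewingMode = .spatial3D
        let entity = Entity()
        entity.components.set(component)
        return entity
    }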
Hi, I am trying to implement something simple: letting people share their spatial photos with others (just like this post). I encountered the same issue he did, but his answer doesn't help me out here. Briefly, I am using CGImageSource to extract the paired leftImage and rightImage from one fetched spatial photo:

    let photos = PHAsset.fetchAssets(with: .image, options: nil)
    // enumerating photos ....
    if asset.mediaSubtypes.contains(PHAssetMediaSubtype.spatialMedia) {
        spatialAsset = asset
    }
    // other code shown below

I can fetch left and right images from a native spatial photo (taken by Apple Vision Pro or iPhone 15+), but it doesn't work on a generated spatial photo (the 2D -> 3D feature in Photos):

    // imageCount is 1 when it comes to a generated spatial photo
    let imageCount = CGImageSourceGetCount(source)

I searched the net and someone says the generated version has a depth image instead of a left/right pair, but I still cannot extract any depth image from the image source. The full code below, the imagePair extra
Hello! In visionOS 26, we have new APIs that make it possible to build a similar gallery interface yourself in RealityKit. For spatial photos, and also the new spatial scenes introduced in visionOS 26, check out the new ImagePresentationComponent and Spatial3DImage APIs. The Presenting images in RealityKit sample code project is a great place to get started, and the What's new in RealityKit video from WWDC25 has a section showing how to use ImagePresentationComponent for spatial photos. For spatial videos, we've added support for spatial styling in VideoPlayerComponent in visionOS 26. Check out the Playing immersive media with RealityKit sample code project to get started. We also have a section on spatial video rendering in RealityKit in the Support immersive video playback in visionOS apps video from WWDC25.
Topic: Spatial Computing
SubTopic: General
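Before moving to the new APIs, it may help to check what a generated spatial photo actually contains. ImageIO can probe for auxiliary depth or disparity data; whether generated spatial scenes expose their depth this way is an assumption, so a nil result is possible:

    import Foundation
    import ImageIO

    // Probe a photo whose CGImageSourceGetCount is 1 for auxiliary
    // depth/disparity data instead of a left/right image pair.
    func probeAuxiliaryDepth(at url: URL) {
        guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return }
        print("image count:", CGImageSourceGetCount(source))
        for auxType in [kCGImageAuxiliaryDataTypeDepth, kCGImageAuxiliaryDataTypeDisparity] {
            if CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, auxType) != nil {
                print("found auxiliary data of type", auxType)
            }
        }
    }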
Hi there! In visionOS 26, you can enable this spatial video styling in both AVPlayerViewController (from AVKit) and VideoPlayerComponent (from RealityKit). For full details on how to do so, check out the Support immersive video playback in visionOS apps video from WWDC25, which has a section dedicated to spatial video playback in VideoPlayerComponent.
Topic: Spatial Computing
SubTopic: General
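The basic VideoPlayerComponent setup that the new styling builds on looks like the sketch below (the helper name is hypothetical). It deliberately stops short of the visionOS 26 spatial-styling properties themselves, since their exact names are best taken from the session and documentation:

    import AVFoundation
    import RealityKit

    // Minimal VideoPlayerComponent setup (long-standing RealityKit API).
    func makeVideoEntity(url: URL) -> Entity {
        let player = AVPlayer(url: url)
        let entity = Entity()
        entity.components.set(VideoPlayerComponent(avPlayer: player))
        player.play()
        return entity
    }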
Hi @Kushagra_Kumar, I'm happy to share an update that this week at WWDC25, we announced the Foundation Models framework, giving developers access to on-device language models and the ability to generate inferences from prompts. -J
Topic: Machine Learning & AI
SubTopic: Apple Intelligence
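A minimal sketch of prompting the on-device model, following the shape shown in the WWDC25 introduction of the Foundation Models framework (the helper name is hypothetical; verify the call shape against the current docs):

    import FoundationModels

    // Ask the on-device model for a one-sentence summary.
    func summarize(_ text: String) async throws -> String {
        let session = LanguageModelSession()
        let response = try await session.respond(to: "Summarize in one sentence: \(text)")
        return response.content
    }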
Hi! I'm building an app that uses Swift Charts to visualize stock market data, and I'm encountering a couple of issues. The stock API I'm using provides data only for the trading days when the market is open. The problem is that I need to skip over the missing dates (non-trading days) in the chart, but still keep the x-axis formatted correctly (e.g., group ticks by month). If I convert the dates to String to handle the missing data, I lose the correct x-axis formatting, and the date labels become inaccurate along with their data. Here's some of the code I'm using for parsing the dates and structuring the data:

    struct StockDataPoint: Identifiable, Decodable {
        var id: String { datetime }
        let datetime: String
        let close: String

        var date: Date { datetime.toDate() ?? Date() }
        var closePrice: Double { Double(close) ?? 0.0 }
    }

    extension String {
        func toDate() -> Date? {
            let formatter = DateFormatter()
            formatter.locale = Locale(identifier: "en_US_POSIX")
            formatter.timeZone = TimeZone(abbreviation: "UTC")
            formatter.da
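One common workaround (an assumption about the goal, not something from the thread): plot by array index so non-trading days simply never exist on the axis, then label only the indices where a new month begins. A sketch building on the StockDataPoint type above, with monthStartIndices as a hypothetical helper:

    import SwiftUI
    import Charts

    struct StockChart: View {
        let data: [StockDataPoint]

        // Indices where a new month starts, used as the axis tick positions.
        var monthStartIndices: [Int] {
            let cal = Calendar.current
            return data.indices.filter { i in
                i == 0 || !cal.isDate(data[i].date, equalTo: data[i - 1].date, toGranularity: .month)
            }
        }

        var body: some View {
            Chart(Array(data.enumerated()), id: \.element.id) { item in
                LineMark(x: .value("Day", item.offset),
                         y: .value("Close", item.element.closePrice))
            }
            .chartXAxis {
                AxisMarks(values: monthStartIndices) { value in
                    AxisGridLine()
                    AxisValueLabel {
                        if let i = value.as(Int.self) {
                            Text(data[i].date, format: .dateTime.month(.abbreviated))
                        }
                    }
                }
            }
        }
    }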
Hi, I have an app that uses Core Data to store user information and display it in various views. I want to know if it's possible to easily integrate this setup with FoundationModels to make it easier for the user to query and manipulate the information, and if so, how would I go about it? Can the model be pointed to the database schema file and the SQLite file sitting in the user's app group container to parse out the information needed? And/or should the NSManagedObjects be made @Generable for better output? Any guidance about this would be useful.
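As far as the framework surface announced at WWDC25 goes, the model works on prompts and structured output rather than reading schema or SQLite files directly, so the pattern would be to fetch with Core Data as usual and give the model a @Generable type to fill in. A hedged sketch with hypothetical names:

    import FoundationModels

    // Hypothetical lightweight mirror of a Core Data entity; @Generable lets
    // the model produce it as structured output instead of free text.
    @Generable
    struct ContactSummary {
        @Guide(description: "The person's full name")
        var name: String
        @Guide(description: "A one-sentence note about the person")
        var note: String
    }

    func extract(from text: String) async throws -> ContactSummary {
        let session = LanguageModelSession()
        let response = try await session.respond(to: text, generating: ContactSummary.self)
        return response.content
    }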
Hi everyone, I'm developing a C++ plugin (.bundle) for a third-party host application (Autodesk Maya) on macOS, and I'm finalizing the design for our licensing system. The plugin is distributed outside the Mac App Store. My goal is to securely store a license key in the user's Keychain. After some research, my proposed implementation is as follows: On activation, store the license data in the user's login keychain as a Generic Password (kSecClassGenericPassword) using the SecItem APIs. To ensure the plugin can access the item when loaded by Maya, I will use a specific Keychain Access Group (e.g., MY_TEAM_ID.com.mywebsite). The final .bundle will be code-signed with our company's Developer ID certificate. The signature will include an entitlements file (.entitlements) that specifies the matching keychain-access-groups permission. My understanding is that this combination of a unique Keychain Access Group and a properly signed/entitled bundle is the key to getting reliable Keychain ac
Topic: Privacy & Security
SubTopic: General
I suspect that the approach you've outlined won't work. My understanding is that you're building an in-process plug-in, that is, your plug-in's code is loaded and executed within the host app. If so, you won't be able to use a keychain access group because access to those is gated by entitlements, and you can't change the entitlements of your host app. Fortunately there's an easy way around this: use the file-based keychain rather than the data protection keychain. See TN3137 On Mac keychain APIs and implementations for more background on that topic.

ps I also recommend that you have a read of:

SecItem: Fundamentals
SecItem: Pitfalls and Best Practices

Share and Enjoy
Quinn "The Eskimo!" @ Developer Technical Support @ Apple
let myEmail = eskimo + 1 + @ + apple.com
Topic: Privacy & Security
SubTopic: General
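Concretely, the file-based route Quinn describes needs no access group or entitlement. A minimal sketch, with a hypothetical service name:

    import Foundation
    import Security

    // Store a license key as a generic password in the file-based login
    // keychain. Omitting kSecAttrAccessGroup and kSecUseDataProtectionKeychain
    // keeps the item out of the data protection keychain (see TN3137).
    func storeLicense(_ key: String) -> OSStatus {
        let query: [String: Any] = [
            kSecClass as String: kSecClassGenericPassword,
            kSecAttrService as String: "com.example.myplugin.license", // hypothetical
            kSecAttrAccount as String: "license",
        ]
        SecItemDelete(query as CFDictionary) // replace any existing item
        var attrs = query
        attrs[kSecValueData as String] = Data(key.utf8)
        return SecItemAdd(attrs as CFDictionary, nil)
    }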
Just out of a WWDC lab with some folks on the team. This seems to just be a UX thing in the Shortcuts app: when using EntityQuery you are presented with a context menu to use a variable, whereas with EntityStringQuery you need to press and hold the parameter to open the context menu. Also, in iOS 26 the system seems to be able to use suggestedEntities to match a string from Siri, FYI.
Topic: App & System Services
SubTopic: Automation & Scripting
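For reference, the difference in question is just which query protocol the entity's defaultQuery conforms to. A minimal sketch with hypothetical types:

    import AppIntents

    // Hypothetical entity for illustration.
    struct BookEntity: AppEntity {
        static var typeDisplayRepresentation: TypeDisplayRepresentation = "Book"
        static var defaultQuery = BookQuery()

        var id: UUID
        var title: String

        var displayRepresentation: DisplayRepresentation {
            DisplayRepresentation(title: "\(title)")
        }
    }

    struct BookQuery: EntityStringQuery {
        func entities(for identifiers: [BookEntity.ID]) async throws -> [BookEntity] {
            allBooks.filter { identifiers.contains($0.id) }
        }

        // Called when the user provides a string for the parameter.
        func entities(matching string: String) async throws -> [BookEntity] {
            allBooks.filter { $0.title.localizedCaseInsensitiveContains(string) }
        }

        // Per the lab note above, iOS 26 appears to use these suggestions
        // when matching a string spoken to Siri.
        func suggestedEntities() async throws -> [BookEntity] {
            allBooks
        }
    }

    private let allBooks = [BookEntity(id: UUID(), title: "Dune")]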