How can I achieve the glass effect on buttons shown in the sample videos at WWDC25? I've tried a lot of approaches and I'm still far from matching the video. I'd like something like the attached pictures. Could you send sample code that produces the same result? Thanks
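A minimal sketch of the Liquid Glass button styling introduced at WWDC25, assuming the iOS 26 SDK. The view name and image asset are placeholders; the `.glass` button style, `glassEffect(_:in:)` modifier, and `GlassEffectContainer` follow the new SwiftUI API:

```swift
import SwiftUI

// Sketch only: requires the iOS 26 SDK. "background" is a placeholder asset.
struct GlassButtonsView: View {
    var body: some View {
        ZStack {
            // Glass is most visible over rich content, e.g. a photo background.
            Image("background")
                .resizable()
                .ignoresSafeArea()

            // Group glass views so nearby shapes can blend with each other.
            GlassEffectContainer {
                HStack(spacing: 16) {
                    // The built-in glass button style.
                    Button("Share", systemImage: "square.and.arrow.up") { }
                        .buttonStyle(.glass)

                    // Or apply the effect to an arbitrary view,
                    // with a tint and interactive (press-responsive) glass.
                    Label("Favorites", systemImage: "heart.fill")
                        .padding()
                        .glassEffect(.regular.tint(.pink).interactive(),
                                     in: .capsule)
                }
            }
        }
    }
}
```

The effect reads best when the glass sits over scrolling or image content rather than a flat background, which may explain why attempts on plain backgrounds look far from the session videos.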
Search results for "A Summary of the WWDC25 Group Lab" (10,109 results found)
I got feedback in a WWDC25 accessibility lab that a link was being read by VoiceOver as both a link and a button. I was surprised to discover that this seems to be the default behavior. I'm also wondering whether this is intentional or a bug.
Topic:
UI Frameworks
SubTopic:
SwiftUI
Tags:
Quoting shopon2024:

"Is it possible to scan for nearby WiFi networks … ?"

No.

"Is it possible to … connect to a device in AP mode on iOS?"

Yes. You can find a summary of iOS's Wi-Fi APIs in TN3111 iOS Wi-Fi API overview. For your specific situation I have a forums post, Extra-ordinary Networking > Working with a Wi-Fi Accessory, that has more in-depth info.

Share and Enjoy — Quinn "The Eskimo!" @ Developer Technical Support @ Apple
let myEmail = eskimo + 1 + @ + apple.com
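As a hedged sketch of the "connect to a device in AP mode" case: iOS cannot scan for networks, but it can programmatically join a network whose SSID you already know, via NEHotspotConfiguration (covered in TN3111). The SSID and passphrase below are placeholder values:

```swift
import NetworkExtension

// Sketch: joining a known accessory access-point network on iOS.
// "MyAccessory-AP" and the passphrase are hypothetical placeholders.
func joinAccessoryNetwork() {
    let config = NEHotspotConfiguration(ssid: "MyAccessory-AP",
                                        passphrase: "accessory-password",
                                        isWEP: false)
    config.joinOnce = true  // don't persist the network after this session

    NEHotspotConfigurationManager.shared.apply(config) { error in
        if let error {
            print("Join failed: \(error)")
        } else {
            // Connected (or already on the network); start talking to the
            // accessory over its local address.
        }
    }
}
```

This requires the Hotspot Configuration entitlement; see the linked technote and forums post for the full set of options.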
Topic:
App & System Services
SubTopic:
Networking
Tags:
Are you sandboxing because you plan to ship on the App Store? Or sandboxing because it’s the right thing to do? Unix domain sockets are a bit of a weird edge case: You can use them for IPC between different components within your app by placing them in an app group container [1]. Otherwise they are blocked by the sandbox as part of its general policy of blocking unmediated IPC between code from different teams. You can’t use a temporary exception entitlement to get around this because of both business and technical limitations: On the business side, App Review generally won’t allow you to use temporary exception entitlements. On the technical side, entitlements like com.apple.security.temporary-exception.files.absolute-path.read-write only work for files and directories; they don’t work for Unix domain sockets. If you’re sandboxing your product because it’s the right thing to do then you can get around this by moving the code to a non-sandboxed XPC service. I talk more about this in The Case for Sand
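The app-group approach above can be sketched as follows. The group identifier is a placeholder; the key point is that the socket file lives in the shared group container, which both sandboxed components can access:

```swift
import Foundation

// Sketch: place the Unix domain socket inside the shared app group
// container so sandboxed components of the same app can reach it.
// "TEAMID.com.example.shared" is a hypothetical app group identifier.
func socketPathInAppGroup() -> String? {
    guard let container = FileManager.default.containerURL(
        forSecurityApplicationGroupIdentifier: "TEAMID.com.example.shared")
    else { return nil }
    // Keep the path short: sun_path is limited to roughly 104 bytes.
    return container.appendingPathComponent("my.socket").path
}
```

The server side then binds a `sockaddr_un` at that path with the usual `bind`/`listen`/`accept` calls; only the location of the socket changes, not the socket code itself.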
Topic:
Privacy & Security
SubTopic:
General
Tags:
I’m talking about the App Sandbox that I have to enable under Signing & Capabilities for the Mac app I’m developing. I need to keep it enabled because I want to publish the app on the App Store, so simply disabling it isn’t an option. I recently attended a one-on-one lab, and the engineer mentioned that it’s not necessarily a problem if the file isn’t inside the sandbox (which surprised me a bit). We tried to create a temporary exception for the socket file, but I’m still seeing a deny(1) network-outbound... error. Unfortunately, we weren’t able to resolve the issue during the session, so he recommended that I post the question here in the forum. When you mention selecting the socket, do you mean creating a bookmark for it?
Topic:
Privacy & Security
SubTopic:
General
Tags:
It's interesting huh!

"One option would be to simply delete the file and let CoreData pull the data from the cloud … you might want to consider just excluding the file from the backup entirely"

I think this is not an option because the user can turn off iCloud and NSPersistentCloudKitContainer still works to store data without syncing to iCloud. So not all users can get this data back from iCloud; it may only exist in their backup.

"What does your app actually do and, most importantly, what if any background modes/work does it use?"

You can think of it as a very simple todo app where you create a todo, it shows up in the widget, and tapping a button marks it complete (via the main app process). Note the widget's access to the database is read-only and cloudKitContainerOptions is not set, so the widget extension process does not sync with iCloud (only the main app process does). The only background modes/work used in the app is the remote notifications capability that allows CloudKit to silently notify the app when
Topic:
App & System Services
SubTopic:
iCloud & Data
Tags:
So, the first thing here: the difference between the initial path (step 1), file:///private/var/mobile/Containers/Shared/AppGroup/FAF64427-9826-4C86-9C2E-D7E5285BA7EC/MyApp.sqlite, and the post-restore path (steps 5 and 7), file:///private/var/mobile/Containers/Shared/AppGroup/FDE4F3AF-E775-4D5F-842D-1C5AA77BE26F/MyApp.sqlite, is standard system behavior, as the UUIDs are generated by the system whenever the app group is created. The real oddity here is this: "and yet FileManager says the file exists, the file is readable and writable, and the contents at that path are non-nil". FileManager is low level enough that whatever it says is inherently true. So some kind of file was absolutely there and was manipulatable by your app. More to the point, the file NOT being there wouldn't have been a bad thing, as CoreData would simply have created the file from scratch using the cloud. In terms of what's going on here, I suspect that the actual issue is a lower level s
Topic:
App & System Services
SubTopic:
iCloud & Data
Tags:
Hello! Happy to answer a few of these, let me know if this helps! I think adjustable works well for carousels in some cases, and not others. It's kind of up to you as a developer to decide if this makes sense for your app. Personally, where I've found combining a carousel into one adjustable element to work well is in the following 2 cases. The first, is when you are implementing some sort of picker, that is a horizontal list. The focused item of the carousel in this case is often the selected item. An example of this would be the Animoji picker in Messages. The other is when the carousel is infinitely scrolling, or looping. In this case, grouping into one element is useful so that VO users don't get stuck swiping over the same elements in a loop without being able to get to the content underneath it. Grouping carousels tends to work less well when you have sub-elements in the items in the carousel. If you have a looping/infinite carousel that also has sub-elements, I think you'll just need
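The picker-style case described above can be sketched with SwiftUI's adjustable-action API. The view and item names are hypothetical; the point is collapsing the carousel into one element whose value VoiceOver users adjust with vertical swipes:

```swift
import SwiftUI

// Sketch: a horizontal picker-style carousel grouped into a single
// adjustable element, assuming items with no interactive sub-elements.
struct CarouselPicker: View {
    let items = ["Red", "Green", "Blue"]  // placeholder content
    @State private var selection = 0

    var body: some View {
        ScrollView(.horizontal) {
            HStack {
                ForEach(items.indices, id: \.self) { i in
                    Text(items[i])
                        .padding()
                        .background(i == selection ? .blue : .gray)
                }
            }
        }
        // Collapse the children into one element with a single label/value.
        .accessibilityElement(children: .ignore)
        .accessibilityLabel("Color picker")
        .accessibilityValue(items[selection])
        .accessibilityAdjustableAction { direction in
            switch direction {
            case .increment: selection = min(selection + 1, items.count - 1)
            case .decrement: selection = max(selection - 1, 0)
            @unknown default: break
            }
        }
    }
}
```

For an infinite/looping carousel the increment and decrement cases would wrap around instead of clamping.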
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
Hello @Ian_Chen thank you for your question!
Much of what you want to do will require code.
In order to trigger any behavior in response to a tap gesture you must first setup an entity with a collision component, an input target component, and if you are triggering a timeline animation, a behaviors component configured to start the timeline animation when a tap is applied. Then, you call applyTapForBehaviors() when a tap occurs in order to propagate the tap event to your entity.
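The steps above can be sketched as follows in RealityKit. The gesture wiring is shown as comments because it lives in the SwiftUI layer; entity names and sizes are placeholders:

```swift
import RealityKit

// Sketch: give an entity the components it needs to receive taps, so a
// Behaviors component configured in Reality Composer Pro can start its
// timeline when the tap is propagated.
func configureTappable(_ entity: Entity) {
    // A collision shape to hit-test against (size is a placeholder)...
    entity.components.set(
        CollisionComponent(shapes: [.generateBox(size: [0.2, 0.2, 0.2])]))
    // ...and a marker that the entity accepts input.
    entity.components.set(InputTargetComponent())
}

// In a RealityView, attach a tap gesture and forward it to the entity:
//
// RealityView { content in /* add entities */ }
//     .gesture(
//         TapGesture()
//             .targetedToAnyEntity()
//             .onEnded { value in
//                 value.entity.applyTapForBehaviors()
//             }
//     )
```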
You can disable and enable an entity in the timeline, and if this entity has a particle emitter component, the particles will also be disabled or enabled. There is not a way to control this precisely with only Reality Composer Pro, you will need to use the particle emitter component API to manage this.
I recommend taking a look at the new sample Petite Asteroids released for WWDC25. The intro sequence contains a particle system that is controlled in code. Additionally, see Compose interactive 3D content in Reality Co
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
Tags:
Issue Summary
I'm encountering a DCError.invalidInput error when calling DCAppAttestService.shared.generateAssertion() in my App Attest implementation. This issue affects only a small subset of users; the majority can successfully complete both attestation and assertion flows without any issues. According to Apple engineer feedback, there might be a small implementation issue in my code.

Key Observations
- Success rate: ~95% of users complete the flow successfully
- Failure pattern: the remaining ~5% consistently fail at assertion generation
- Key length: logs show a key length of 44 characters for both successful and failing cases
- Consistency: users who experience the error tend to experience it consistently
- Platform: issue observed across different iOS versions and device types

Environment
- iOS App Attest implementation
- Using DCAppAttestService for both attestation and assertion
- Custom relying party server communication
- Issue affects ~5% of users consistently

Key Implementation Details
1. Attestat
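For reference, a hedged sketch of the assertion call in question. A common source of invalidInput is hashing the wrong bytes: clientDataHash must be a 32-byte SHA-256 digest of exactly the request payload the server will verify. `keyId` and `requestBody` are placeholders:

```swift
import DeviceCheck
import CryptoKit

// Sketch of the assertion flow; not the poster's actual implementation.
func makeAssertion(keyId: String, requestBody: Data,
                   completion: @escaping (Data?, Error?) -> Void) {
    guard DCAppAttestService.shared.isSupported else {
        completion(nil, DCError(.featureUnsupported))
        return
    }
    // Hash the exact bytes the relying party server will recompute.
    let clientDataHash = Data(SHA256.hash(data: requestBody))
    DCAppAttestService.shared.generateAssertion(keyId,
                                                clientDataHash: clientDataHash,
                                                completionHandler: completion)
}
```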
Hi all, I'm developing a timer app with Live Activity support. On iOS 18.5 (iPhone 14 Pro Max), I cannot get a Live Activity to start. When I call Activity.request(...) in my main app, it throws an unsupportedTarget error, and nothing appears on the Lock Screen or Dynamic Island.

What I've done:
- Widget extension Info.plist: NSExtension > NSExtensionPointIdentifier = com.apple.widgetkit-extension; NSSupportsLiveActivities; NSSupportsLiveActivitiesFrequentUpdates
- Live Activity UI: implemented with ActivityConfiguration(for: xxx_Clock_liveactivitiesAttributes.self) and Dynamic Island support
- App Group: both the main app and the extension use the same App Group, and it's enabled in Apple Developer Center and Xcode
- Tested on: iPhone 14 Pro Max, iOS 18.5 (official release), Xcode [your version] (I have not tested on iOS 17.x, so I am not sure if this issue is specific to iOS 18.5.)

What I've tried:
- Cleaned build folder, deleted Derived Data, uninstalled and reinstalled the app
- Rebooted the device
- Double-checked al
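A hedged sketch of a working request, for comparison. One thing worth checking against the setup above: NSSupportsLiveActivities must be in the main app's Info.plist, not only the widget extension's. `TimerAttributes` is a placeholder matching whatever the ActivityConfiguration uses:

```swift
import ActivityKit

// Hypothetical attributes type; must match the widget extension's
// ActivityConfiguration(for:) type exactly.
struct TimerAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var endDate: Date
    }
    var timerName: String
}

func startTimerActivity() throws -> Activity<TimerAttributes> {
    let attributes = TimerAttributes(timerName: "Focus")
    let state = TimerAttributes.ContentState(endDate: .now + 25 * 60)
    // Throws unsupportedTarget if no extension declares a Live Activity
    // for this attributes type, or the plist keys are in the wrong target.
    return try Activity.request(
        attributes: attributes,
        content: .init(state: state, staleDate: nil)
    )
}
```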
Keychain items stored using SecItem in your app will generally be migrated or synced to a new iPhone when a user transfers data from their old device, but the specifics depend on how the transfer is performed and the Keychain item attributes you've set. Here's a concise explanation:

- iCloud Keychain syncing: If iCloud Keychain is enabled on both devices, Keychain items with the kSecAttrSynchronizable attribute set to true are automatically synced to the new device via iCloud. This includes passwords, certificates, and other secure data stored in the Keychain. Items without this attribute (i.e., non-syncable items) are not synced via iCloud and require a different transfer method.
- Device-to-device transfer (encrypted backup or direct transfer): When using iCloud Backup or an iTunes/Finder encrypted backup, Keychain items (both syncable and non-syncable) are included in the encrypted backup. Restoring this backup to a new iPhone will transfer all Keychain items. During a direct device-to-device transfer (e.g., usin
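The syncable case above can be sketched as follows. Service and account names are placeholders; the relevant parts are kSecAttrSynchronizable and an accessibility class without "ThisDeviceOnly":

```swift
import Foundation
import Security

// Sketch: store a password that participates in iCloud Keychain sync.
// Omitting kSecAttrSynchronizable leaves the item local-only, transferred
// only via encrypted backup or device-to-device migration.
func addSyncablePassword(_ password: Data) -> OSStatus {
    let query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: "com.example.myapp",      // placeholder
        kSecAttrAccount as String: "user@example.com",       // placeholder
        kSecValueData as String: password,
        kSecAttrSynchronizable as String: true,
        // Syncing requires an accessibility class that is not
        // device-bound (no "ThisDeviceOnly" variant).
        kSecAttrAccessible as String: kSecAttrAccessibleAfterFirstUnlock
    ]
    return SecItemAdd(query as CFDictionary, nil)
}
```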
Topic:
Privacy & Security
SubTopic:
General
Tags:
Hello dc48: Thank you for posting your questions. I'll do my best to answer them here.

"Does anyone have a template of an Apple Projected Media Profile Format Description"

I'd encourage you to take a look at Learn about the Apple Projected Media Profile from WWDC25. This session includes links to reference material, as well as a sample project: Converting projected video to Apple Projected Media Profile. This particular sample project includes a stereoscopic 180 asset, but it should be possible to adapt it for Wide FoV content.

"or a File of a Stereo wideFOV video?"

An example Wide FoV HLS stream can be found at Streaming Examples. The Streams for Apple Vision Pro listed there are used in both of the following sample projects: Playing immersive media with AVKit, and Playing immersive media with RealityKit.

"Use case I have 2 compatible cameras that I stereo sync and I want to move the projection information from the compatible video to the Spatial video that combines them."

You may also find it helpful to ref
Topic:
Media Technologies
SubTopic:
Video
Tags:
It is not possible to surface pin in the simulator. I confirmed this with Apple engineers in a WidgetKit lab during WWDC25. However, as far as I can tell, that doesn't actually make any functional difference to the widget itself; you just can't see it in that frame.
Topic:
Spatial Computing
SubTopic:
General
Tags:
We're using RealityKit to create a science education AR app for iOS, iPadOS, and visionOS. In the WWDC25 session video Bring your SceneKit project to RealityKit https://developer.apple.com/videos/play/wwdc2025/288 at 8:15, it's explained that when using RealityKit, RealityView should be used in all cases, whereas in the past, SceneKit required SCNView, SceneView, or ARSCNView, depending on an app's requirements. Because the initial development of our app on iOS predates iOS 18's RealityView, our app currently uses ARView to render RealityKit AR content on iOS and iPadOS. Is it recommended that we migrate to RealityView, or can we safely continue using our existing ARView implementation? We'd prefer to avoid unnecessary development cost. If migrating from ARView to RealityView is recommended, what specific benefits should we expect from this transition? Thank you.
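For a sense of what the migration target looks like, here is a hedged sketch of a RealityView-based screen on iOS 18+, assuming the iOS 18 SDK. View and entity details are placeholders:

```swift
import SwiftUI
import RealityKit

// Sketch: a RealityView equivalent of a simple ARView scene on iOS 18+.
// Existing ARView code keeps working; RealityView mainly adds SwiftUI
// integration and one code path across iOS, iPadOS, and visionOS.
struct ModelScreen: View {
    var body: some View {
        RealityView { content in
            // On iOS, opt into camera passthrough with world tracking,
            // comparable to ARView's AR camera mode.
            content.camera = .spatialTracking

            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .blue, isMetallic: false)])
            sphere.position = [0, 0, -0.5]
            content.add(sphere)
        }
    }
}
```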