Meet SwiftUI for spatial computing

Discuss the WWDC23 Session Meet SwiftUI for spatial computing

Posts under wwdc2023-10109 tag

Hi folks, I have two questions, if you can help me.

First: as far as I've tested, none of the environment trackers work in the simulator. I understand RealityKit needs data from the LiDAR scanner or camera to detect the environment and depth. Is the environment in the Xcode simulator just an HDR image? Does that mean we can't build any world-tracking app for Vision Pro until we get a device? In all of Apple's videos, world tracking is demonstrated on a real device; none of it runs in the Xcode simulator.

Second: can we add lights to a scene, or remove the default lighting from it? Whatever I try has no effect on my scene. I'm also a 3D modeler in Blender, so I understand the shader graphs in Reality Composer Pro, but many of them don't produce any visible effect. Thank you so much!
Posted by Gamno. Last updated.
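On the lighting question, one approach worth trying: on visionOS, RealityKit scenes are lit by the system environment by default, and you can override that per-entity with image-based lighting. Below is a minimal sketch, not a confirmed answer from this thread, assuming a visionOS target and an `EnvironmentResource` asset named "Sunlight" (a placeholder name) bundled with the app:

```swift
import SwiftUI
import RealityKit

// Sketch: lighting a RealityKit entity with a custom environment map.
// Note the simulator has no LiDAR or camera, so real-world scene
// understanding is unavailable there; image-based lighting still applies.
struct LitModelView: View {
    var body: some View {
        RealityView { content in
            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .white, isMetallic: true)]
            )

            // "Sunlight" is a placeholder asset name; replace it with an
            // environment resource that exists in your app bundle.
            if let environment = try? await EnvironmentResource(named: "Sunlight") {
                // Attach the image-based light to the entity...
                sphere.components.set(
                    ImageBasedLightComponent(source: .single(environment))
                )
                // ...and mark the entity as a receiver of that light,
                // otherwise the IBL has no visible effect on it.
                sphere.components.set(
                    ImageBasedLightReceiverComponent(imageBasedLight: sphere)
                )
            }

            content.add(sphere)
        }
    }
}
```

The receiver component is the step that is easy to miss: setting only `ImageBasedLightComponent` changes nothing visibly, which may explain why edits appear to have no effect.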