Updated Object Capture -- needs LiDAR?
I have two apps released -- ReefScan and ReefBuild -- that are based on the WWDC21 sample photogrammetry apps for iOS and macOS. Those run fine without LiDAR and are used mostly for underwater models, where LiDAR does not work at all. It now appears that the updated PhotogrammetrySession requires LiDAR data, and building my app with the current Xcode results in a non-working app. Has the "old" version of PhotogrammetrySession been broken by this update? It worked very well previously, so I would hate to see it regress to requiring LiDAR. Most of my users do not have it.
There also seems to be another major change to PhotogrammetrySession: the OBJ format is no longer supported as an output, only USDZ. What is going on with this code? It has gone from being very useful to not useful at all.
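For reference, this is roughly how the output format is selected in the current API: the `.modelFile` request infers the format from the file extension, and USDZ is the documented model format. A minimal sketch (the input and output paths are placeholders):

```swift
import RealityKit
import Foundation

// Sketch: reconstruct a model from a folder of images and request USDZ output.
// The folder and output paths below are placeholders, not real project paths.
let inputFolder = URL(fileURLWithPath: "/path/to/Inputs", isDirectory: true)
let outputURL = URL(fileURLWithPath: "/path/to/Output/model.usdz")

let session = try PhotogrammetrySession(input: inputFolder)

// Observe progress and completion on the session's async output stream.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fractionComplete):
            print("Progress: \(fractionComplete)")
        case .requestComplete(_, let result):
            if case .modelFile(let url) = result {
                print("Model written to \(url)")
            }
        case .requestError(_, let error):
            print("Request failed: \(error)")
        default:
            break
        }
    }
}

// .modelFile(url:detail:) picks the output format from the URL's extension;
// with a .usdz extension this produces a USDZ model.
try session.process(requests: [
    .modelFile(url: outputURL, detail: .medium)
])
```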
Hello,
Confirming from the current documentation that LiDAR is required for a PhotogrammetrySession on iOS devices:
"RealityKit Object Capture is only available on Mac computers that meet the minimum requirements for performing object reconstruction, including a GPU with at least 4 GB of RAM and ray tracing support. It is also available on select iOS devices with LiDAR capabilities."
This means that captured images must include LiDAR depth data on iOS devices. I'd speculate that this requirement ensures a high-quality result.
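Given that requirement, apps that must run on non-LiDAR devices can at least check for support at runtime rather than failing. A minimal sketch using the API's capability check:

```swift
import RealityKit

// Sketch: guard against unsupported hardware before creating a session.
// PhotogrammetrySession.isSupported reflects the documented requirements
// (a capable GPU on Mac; select LiDAR-equipped devices on iOS).
func startReconstructionIfPossible() {
    guard PhotogrammetrySession.isSupported else {
        // Inform the user or fall back to a different capture path here.
        print("Object Capture reconstruction is not supported on this device.")
        return
    }
    // Safe to construct a PhotogrammetrySession from this point.
    print("Object Capture reconstruction is available.")
}
```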
Are you saying that this is a new requirement for the API, i.e., something has changed since WWDC21?