Posts

Post not yet marked as solved
1 Reply
638 Views
For apps built specifically for the new LiDAR sensor, with little to no use on devices that lack it, is there an appropriate Required Device Capabilities string?
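There may be no documented LiDAR-specific entry for `UIRequiredDeviceCapabilities`; a common fallback (a sketch, assuming the app already requires ARKit in its Info.plist) is to gate LiDAR-only features at runtime, since scene reconstruction is only supported on LiDAR-equipped devices:

```swift
import ARKit

// Returns true only on devices with a LiDAR scanner, because
// ARWorldTrackingConfiguration.supportsSceneReconstruction(_:) requires
// the scene-depth hardware to report support for mesh reconstruction.
func deviceSupportsLiDARFeatures() -> Bool {
    return ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh)
}
```

The app can then hide or disable its LiDAR-only UI when this returns false, rather than excluding non-LiDAR devices at the App Store level.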
Posted by sam598.
Post not yet marked as solved
1 Reply
379 Views
The new Object Capture API is really quite remarkable, and I can already think of several possible use cases and pipelines for it. It is great that even though it can work with just 2D images, the API can use metadata like depth maps, lens data, gravity direction, and GPS location to build a better understanding of the scene and reconstruction.

In situations where the rough camera pose of each captured image is known (for example, an ARKit-tracked frame, or a static array of cameras), I would love the ability to add an initial camera transform matrix to each PhotogrammetrySample. Without assuming too much about how the underlying system works, I assume that the camera extrinsics and intrinsics are constantly refined as the model is reconstructed, so I would not expect the input camera poses to be taken as absolute values. But being able to supply those initial transform matrices would have several benefits:

- Give an even better hint for the camera positions than the gravity vector alone.
- Define object scale from the camera extrinsics, even when no depth data is available.
- Predefine a coordinate space, origin, and orientation for the capture.
- Keep a common, consistent origin and orientation between scans made with an identical (or similar) camera setup.

Without having tried a drone capture yet, perhaps there is a way to do this with GPS data. But that feels like an unnecessary workaround, and one likely prone to conversion errors.
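A minimal sketch of what the requested addition might look like. This is hypothetical: PhotogrammetrySample has no such transform field today, and the `PosedSample` wrapper and `initialCameraTransform` name are invented for illustration; only `ARFrame.camera.transform` (a real camera-to-world matrix from ARKit) is existing API:

```swift
import simd
import RealityKit

// Hypothetical wrapper pairing a capture sample with the rough camera pose
// known at shoot time. The reconstructor would treat the pose as a hint to
// be refined, not as an absolute value.
struct PosedSample {
    var sample: PhotogrammetrySample
    /// Camera-to-world transform, e.g. taken from an ARKit-tracked frame
    /// (frame.camera.transform) or a calibrated static camera rig.
    var initialCameraTransform: simd_float4x4
}
```

Supplying these matrices alongside each sample would cover all four benefits above: better position hints, scale from extrinsics, a predefined origin, and consistency across scans from the same rig.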
Posted by sam598.
Post not yet marked as solved
4 Replies
1.2k Views
After uploading a build with Xcode 11 Beta 2 and submitting it to TestFlight external testing for review, this error appears: "Sorry, something went wrong. This build is using a beta version of Xcode and can’t be submitted. Make sure you’re using the latest version of Xcode or the latest seed release found in the TestFlight release notes." The latest TestFlight release notes state: "You can now submit apps built with Xcode 11 beta 2 using the SDK for iOS 13 beta 2, tvOS 13 beta 2, and watchOS 6 beta 2 for internal and external testing." Is there a discrepancy, or is this feature not live yet? Thanks!
Posted by sam598.
Post not yet marked as solved
1 Reply
472 Views
iPhone 7+, iOS 12 Beta 2. In previous versions of iOS the phone would only vibrate when an unrecognized fingerprint pressed the home button to unlock the screen. Now, in Beta 2, the phone vibrates any time a fingerprint is detected. The vibration is slightly different between success and failure, so I'm not sure if this is a bug or by design. But in either case it feels wrong, as if I have made a mistake as a user.
Posted by sam598.