I’d like to use ARKit world tracking while displaying both the back camera feed and the front camera feed, with the front feed shown as a picture-in-picture overlay. This would work great for an internet streaming use case.
However, this appears to be impossible: as soon as ARKit is configured to use one camera, the feed from the other side freezes or never starts. The documentation also says you have to pick a single camera feed to augment: https://developer.apple.com/documentation/arkit/arkit_in_ios/choosing_which_camera_feed_to_augment?language=objc
A question for the developers: why is this limitation in place? Are there any workarounds for the use case of ARKit world tracking + displaying the back camera feed + displaying the front camera feed as an overlay?
It’s possible to do this with plain camera capture, without ARKit (there’s an official example), but once ARKit is involved it no longer works.
It’s also strange that I cannot access the front feed through another framework while the ARKit session runs, but I guess ARKit blocks that too.
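For reference, the working non-ARKit setup looks roughly like the sketch below, using AVCaptureMultiCamSession (iOS 13+). This is my own minimal reconstruction, not Apple's sample code; the class name and structure are illustrative:

```swift
import AVFoundation

// Sketch of the plain-capture approach that works without ARKit:
// AVCaptureMultiCamSession running the back and front cameras at once.
final class DualCameraCapture {
    let session = AVCaptureMultiCamSession()

    func configure() throws {
        // Older devices cannot run two cameras simultaneously.
        guard AVCaptureMultiCamSession.isMultiCamSupported else { return }

        session.beginConfiguration()
        defer { session.commitConfiguration() }

        // Back camera: the main feed.
        if let back = AVCaptureDevice.default(.builtInWideAngleCamera,
                                              for: .video, position: .back) {
            let input = try AVCaptureDeviceInput(device: back)
            if session.canAddInput(input) { session.addInput(input) }
        }

        // Front camera: the PiP overlay feed.
        if let front = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video, position: .front) {
            let input = try AVCaptureDeviceInput(device: front)
            if session.canAddInput(input) { session.addInput(input) }
        }
    }

    func start() {
        session.startRunning()
    }
}
```

With that session running, each feed can drive its own preview layer or video data output for streaming. The moment an ARSession is started instead, only ARKit's chosen camera delivers frames, which is exactly the limitation I'm asking about.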
Hi everyone,
I’m having trouble with image anchoring when working on a project in Reality Composer and Reality Composer Pro. Here’s the issue:
1. What I’m Trying to Achieve:
I want to create an AR scene where an object anchors to an image I provide. I don't want to build an app for this; I just want to use the USDZ file the scene produces. That USDZ file should then be viewable via the various AR Quick Look integrations across the Apple ecosystem. The image anchoring works perfectly when I preview the scene inside Reality Composer using AR mode.
2. The Problem:
When I export the project (tried both USDZ and Reality formats) and open it on my iPhone via the Files app (which uses AR Quick Look), the image anchoring no longer works. The object doesn’t anchor to the provided image as expected; it just anchors to the first plane it recognizes instead.
3. What I’ve Tried:
Exporting the scene in USDZ format.
Exporting the scene in Reality format.
Both formats result in the same issue: no image anchoring outside of the Reality Composer environment.
Tried different images, but image anchoring fails the same way with all of them.
Tried different iOS versions, with the same result.
4. Current Setup:
Reality Composer Pro version: 2.0
iPhone model: iPhone 13 Pro
iOS version: 18.1
5. What I Need Help With:
Is there a way to ensure image anchoring works in exported files when opened via AR Quick Look?
Do I need to configure something specific during the export process?
Are there limitations in AR Quick Look that prevent image anchoring from functioning correctly?
Do I need to create an app to make this work? (If so, see the sketch after this list for roughly what that might look like.)
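If a custom app turns out to be necessary, I assume the fallback would look something like this minimal ARKit image-detection sketch, where "AnchorImage" is a placeholder for an AR resource group in the asset catalog:

```swift
import UIKit
import SceneKit
import ARKit

// Hypothetical custom-app fallback: anchor content to a reference image
// with ARKit directly instead of relying on AR Quick Look.
final class ImageAnchorViewController: UIViewController, ARSCNViewDelegate {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        // "AnchorImage" is an assumed AR resource group name.
        if let images = ARReferenceImage.referenceImages(inGroupNamed: "AnchorImage",
                                                         bundle: nil) {
            configuration.detectionImages = images
            configuration.maximumNumberOfTrackedImages = 1
        }
        sceneView.session.run(configuration)
    }

    // ARKit calls this when the reference image is detected; the model
    // (e.g. loaded from the exported USDZ) would be attached to this node.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode,
                  for anchor: ARAnchor) {
        guard anchor is ARImageAnchor else { return }
        // node.addChildNode(myModelNode)
    }
}
```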
I’d appreciate any advice or insights from the community. If anyone has experience with similar issues or knows of a workaround, please let me know!
Thanks in advance, Mav
Topic: Spatial Computing · SubTopic: ARKit
Tags: ARKit, AR Quick Look, Reality Composer, Reality Composer Pro
Hello everyone,
Since last night, the Object Capture feature in my app has stopped working: whenever I try to use it, a blank screen is displayed instead of the capture UI.
I’ve also tested several other apps that rely on Object Capture, and they are experiencing the same issue. This makes me think it might not be a problem specific to my device or app.
I’ve already tried restarting my device and ensuring all apps are up to date, but the issue persists.
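For comparison, my capture screen is set up roughly like the sketch below (RealityKit's ObjectCaptureSession on iOS 17+; the view name and output folder are just placeholders), in case others seeing the blank screen want to compare setups:

```swift
import SwiftUI
import RealityKit

// Minimal Object Capture screen: check device support, then present the
// capture UI and start the session writing into a temporary folder.
struct CaptureView: View {
    @State private var session = ObjectCaptureSession()

    var body: some View {
        if ObjectCaptureSession.isSupported {
            ObjectCaptureView(session: session)
                .onAppear {
                    // Placeholder output location for this sketch.
                    let dir = FileManager.default.temporaryDirectory
                        .appendingPathComponent("Captures", isDirectory: true)
                    try? FileManager.default.createDirectory(
                        at: dir, withIntermediateDirectories: true)
                    session.start(imagesDirectory: dir)
                }
        } else {
            Text("Object Capture is not supported on this device.")
        }
    }
}
```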
Does anyone have more information about this issue? If so, is there any update on when it might be resolved?
Thank you in advance for your help!
Best regards