Post marked as unsolved
210
Views
Current Behavior
I've set up two scenes and set an object in the second scene to Start Hidden using the behavior preset. When the first scene transitions to the second scene, the object is briefly visible for a split second, then hides.

Expected Behavior
When setting a Start Hidden behavior on an object, that object should not show at all upon scene change until a Show behavior is set to display it.
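While waiting for a fix, one possible code-side workaround is to disable the entity yourself before appending the scene, and let the Show behavior re-enable it. A sketch with placeholder names (`Experience`, `loadSecondScene`, and `hiddenObject` stand in for your generated loader and entity names):

```swift
import RealityKit

// Hide the entity in code before the scene is added to the view, so it never
// renders, then let the Show behavior reveal it later.
let scene = try Experience.loadSecondScene()
scene.findEntity(named: "hiddenObject")?.isEnabled = false
arView.scene.anchors.append(scene)
```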
Post marked as unsolved
112
Views
Here is my code:
```swift
import SwiftUI
import RealityKit
import AVFoundation

struct ARDisplayView: View {
    var body: some View {
        ARViewContainer().edgesIgnoringSafeArea(.all)
    }
}

struct ARViewContainer: UIViewRepresentable {
    var arView = ARView(frame: .zero)

    func makeCoordinator() -> Coordinator {
        Coordinator(parent: self)
    }

    class Coordinator: NSObject, ARSessionDelegate {
        var parent: ARViewContainer
        var videoPlayer: AVPlayer!

        init(parent: ARViewContainer) {
            self.parent = parent
        }
        // blahblah...
    }
}
```
And this is ContentView.swift:

```swift
import SwiftUI
import RealityKit

struct ContentView: View {
    var body: some View {
        ARDisplayView()
    }
}
```
I want to make a button to return ARViewContainer.
I tried this but it doesn't work:
```swift
Button(action: { return ARViewContainer().edgesIgnoringSafeArea(.all) }) {
    Text("Start AR")
}
```
What should I do?
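A Button's action closure can't return a view; in SwiftUI the view hierarchy is driven by state instead. A minimal sketch (the `showAR` flag is hypothetical):

```swift
import SwiftUI

struct ContentView: View {
    // Track whether the AR experience should be shown.
    @State private var showAR = false

    var body: some View {
        if showAR {
            ARDisplayView()
        } else {
            Button("Start AR") {
                showAR = true   // flipping the state presents the AR view
            }
        }
    }
}
```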
Post marked as solved
100
Views
Given an AnchorEntity from, say, RealityKit's Scene.anchors collection, is it possible to retrieve the ARAnchor that was used when creating the AnchorEntity?
Looking through the AnchorEntity documentation (https://developer.apple.com/documentation/realitykit/anchorentity), it seems that while you can create an AnchorEntity using an ARAnchor, there is no way to retrieve that ARAnchor afterwards.
Alternatively, the ARSession delegate functions receive a list of ARAnchors or an ARFrame that has ARAnchors, but I could not find an approach to retrieve AnchorEntities that might be associated with any of these ARAnchors.
Given an ARAnchor, is there a way to get an AnchorEntity associated with it?
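One workaround is to match identifiers instead of object references: AnchorEntity exposes `anchorIdentifier`, which mirrors the originating ARAnchor's `identifier` once the entity is anchored. A sketch:

```swift
import ARKit
import RealityKit

// Given an ARAnchor, find the AnchorEntity created from it by comparing the
// ARAnchor's `identifier` with each entity's `anchorIdentifier`.
func anchorEntity(for arAnchor: ARAnchor, in scene: RealityKit.Scene) -> AnchorEntity? {
    scene.anchors
        .compactMap { $0 as? AnchorEntity }
        .first { $0.anchorIdentifier == arAnchor.identifier }
}
```

The same match run in the other direction (entity to `ARSession.currentFrame`'s anchors) answers the original question of recovering the ARAnchor.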
Thanks,
Post marked as unsolved
96
Views
Is it possible to turn on and off different occlusion material when using Scene Understanding with LiDAR and RealityKit?
For example, if ARKit identifies a wall, I don't want that mesh to be used during occlusion (but I do want occlusion for other things, like the couch or the floor)
If I could do this, it would essentially make my walls transparent, and I could see the RealityKit objects that extend beyond the room I am in.
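There is no per-classification switch on `arView.environment.sceneUnderstanding.options`, so one possible approach is to turn the built-in `.occlusion` option off and build your own occlusion geometry from the ARMeshAnchors, skipping faces classified as walls. A rough sketch of just the face-filtering part (rebuilding a MeshResource with OcclusionMaterial from the kept faces is elided):

```swift
import ARKit

// Collect the indices of mesh faces that are NOT classified as walls, by
// reading the per-face classification buffer on the mesh anchor's geometry.
func nonWallFaceIndices(of meshAnchor: ARMeshAnchor) -> [Int] {
    let geometry = meshAnchor.geometry
    guard let classifications = geometry.classification else { return [] }
    var kept: [Int] = []
    for face in 0..<geometry.faces.count {
        let offset = classifications.offset + face * classifications.stride
        let raw = classifications.buffer.contents()
            .advanced(by: offset)
            .assumingMemoryBound(to: UInt8.self).pointee
        if ARMeshClassification(rawValue: Int(raw)) != .wall {
            kept.append(face)
        }
    }
    return kept
}
```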
Thanks,
Post marked as solved
173
Views
Is there any way to "manually" update a joint's transform in BodyTrackedEntity? It seems that the current API does this automatically.
More context:
I am trying to replay the robot using an ARBodyAnchor stored from a previous session. I get the joint transforms from the ARBodyAnchor and update the corresponding joints' transforms in BodyTrackedEntity.jointTransforms. However, this doesn't work; the joints are all over the place.
Any suggestions or possible direction I could look for is helpful.
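One possible cause to rule out: ARSkeleton3D's `jointModelTransforms` are expressed relative to the skeleton root, while `BodyTrackedEntity.jointTransforms` are relative to each joint's *parent*, which would scatter the joints exactly as described. A hedged sketch of the conversion (assuming the stored anchor's skeleton is available):

```swift
import ARKit
import RealityKit

// Convert root-relative (model-space) joint transforms into parent-relative
// transforms suitable for BodyTrackedEntity.jointTransforms.
func parentRelativeTransforms(from skeleton: ARSkeleton3D) -> [Transform] {
    let definition = skeleton.definition
    return skeleton.jointModelTransforms.enumerated().map { index, modelTransform in
        let parentIndex = definition.parentIndices[index]
        // The root joint has no parent; use its model transform directly.
        guard parentIndex >= 0 else { return Transform(matrix: modelTransform) }
        let parentModel = skeleton.jointModelTransforms[parentIndex]
        // parent⁻¹ × joint gives the joint's pose relative to its parent.
        return Transform(matrix: simd_mul(parentModel.inverse, modelTransform))
    }
}
```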
Post marked as solved
208
Views
Is there an equivalent to MultipeerConnectivityService that implements SynchronizationService over TCP/IP connections?
I'd like to have two users in separate locations, each with a local ARAnchor but then have a synchronized RealityKit scene graph attached to their separate ARAnchors.
Is this possible?
Thanks,
Post marked as unsolved
105
Views
```swift
import RealityKit
import UIKit

let newModel = ModelEntity(
    mesh: .generatePlane(width: 1, height: 1)
)
var mat = SimpleMaterial(
    color: UIColor.blue.withAlphaComponent(0.5),
    isMetallic: true
)
newModel.model?.materials = [mat]

// next line breaks translucency
newModel.model?.mesh = .generatePlane(width: 1, height: 1)

// next line fixes it
newModel.model = ModelComponent(mesh: .generatePlane(width: 1, height: 1), materials: [mat])
```
In my code I was doing this every time a plane anchor updated:
```swift
self.model?.mesh = MeshResource.generatePlane(
    width: planeAnchor.extent.x,
    depth: planeAnchor.extent.z + 10
)
```
This resulted in a mesh updating its size, but the transparency completely breaking.
When switching to this, transparency worked correctly, but now I have to fully recreate the model whenever I want to update the size:
```swift
var material = SimpleMaterial(color: MaterialColorParameter.texture(resource!), isMetallic: true)
material.tintColor = .init(white: 1, alpha: 0.5)

self.model = ModelComponent(
    mesh: .generatePlane(
        width: planeAnchor.extent.x,
        depth: planeAnchor.extent.z
    ),
    materials: [material]
)
```
Can we get some bug fixes in RealityKit this year, so that we don't have to set isMetallic: true and tintColor just to use a color with opacity, or fully recreate the model just to update the mesh size? Thanks :)
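Until then, the recreate-the-component workaround can at least be wrapped in a small helper (a sketch; it assumes the entity's existing materials should be kept as-is):

```swift
import ARKit
import RealityKit

extension ModelEntity {
    // Rebuild the whole ModelComponent instead of mutating `model?.mesh`,
    // which breaks translucency as described in the post.
    func updatePlane(to planeAnchor: ARPlaneAnchor) {
        guard let materials = model?.materials else { return }
        model = ModelComponent(
            mesh: .generatePlane(
                width: planeAnchor.extent.x,
                depth: planeAnchor.extent.z
            ),
            materials: materials
        )
    }
}
```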
Post marked as unsolved
114
Views
Hi,
Xcode gives me an error when I try to set automaticallyConfigureSession on arView, and it also can't find ARWorldTrackingConfiguration for the configuration. Does anyone have an idea of what I'm doing wrong?
```swift
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    setupARView()
}

// MARK: - Setup Methods
func setupARView() {
    arView.automaticallyConfigureSession = false
    // Note the capital "C": "ARWorldTrackingconfiguration" is not found.
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = [.horizontal, .vertical]
    // The property was misspelled "eviromentTexturing" in the original.
    configuration.environmentTexturing = .automatic
    arView.session.run(configuration)
}
```
Post marked as unsolved
161
Views
Which camera lens does the iPhone use when running RealityKit?
I am asking because someone on a blog stated that the iPhone uses the telephoto lens for depth mapping. Obviously, not all iPhones have a telephoto lens, so either this is not true, or RealityKit uses different lenses depending on the device.
Post marked as solved
637
Views
Hi all, is there any way to record an AR session with Reality Composer 1.4 that includes LiDAR data? I used this recording method to record test captures that I can play back in Xcode for development and testing while making my app. When I go to the Developer section in Reality Composer, the Record AR Session option is greyed out on the new iPad. Are there any workarounds, or any info about when this feature may become possible in Reality Composer?
Post marked as solved
459
Views
I am wondering how I can implement func session(_ session: ARSession, didAdd anchors: [ARAnchor]) via arView.session.delegate. I would like to run a function when the image is recognized.
```swift
struct ARViewWrapper: UIViewRepresentable {
    let title: String
    typealias UIViewType = ARView

    func makeUIView(context: Context) -> ARView {
        print("detected?")
        let arView = ARView(frame: .zero, cameraMode: .ar, automaticallyConfigureSession: true)
        let target = AnchorEntity(.image(group: "AR Resources", name: "qr1"))
        arView.scene.anchors.append(target)
        addARObjs(anchor: target, title: title, planeColor: .blue)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {
        print("update!")
    }
}
```
Any and all guidance is appreciated! Thanks in advance.
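One possible approach (a sketch; it assumes the "AR Resources" group and the addARObjs logic from the post) is to add a Coordinator conforming to ARSessionDelegate and assign it as the session delegate in makeUIView:

```swift
import SwiftUI
import ARKit
import RealityKit

struct ARViewWrapper: UIViewRepresentable {
    let title: String

    func makeCoordinator() -> Coordinator { Coordinator() }

    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero, cameraMode: .ar,
                            automaticallyConfigureSession: true)
        arView.session.delegate = context.coordinator  // receive session callbacks
        let target = AnchorEntity(.image(group: "AR Resources", name: "qr1"))
        arView.scene.anchors.append(target)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}

    class Coordinator: NSObject, ARSessionDelegate {
        func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
            // Fires when ARKit adds anchors, including recognized images.
            for case let imageAnchor as ARImageAnchor in anchors {
                print("Recognized image:", imageAnchor.referenceImage.name ?? "unnamed")
            }
        }
    }
}
```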
Post marked as solved
1.4k
Views
Hi there! I'm trying to add a PNG image as a texture in a material, but the transparency is not carried over; areas that should be transparent have a black color instead. I've tried the different materials UnlitMaterial and SimpleMaterial. How can I fix this? Thanks

```swift
var profileMaterial = UnlitMaterial()
profileMaterial.baseColor = try! .texture(.load(named: "profilePicture.png"))

let profileMesh = MeshResource.generatePlane(width: 0.2, height: 0.2)
let profilePlane = ModelEntity(mesh: profileMesh, materials: [profileMaterial])
```
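One workaround that has been reported for this generation of the material API (hedged, not an official fix) is to give the material a tintColor with alpha just below 1.0, which switches it to alpha blending so the PNG's own transparency is respected:

```swift
import RealityKit
import UIKit

var profileMaterial = UnlitMaterial()
profileMaterial.baseColor = try! .texture(.load(named: "profilePicture.png"))
// Alpha < 1.0 forces the material into alpha blending, so the texture's
// transparent areas render as transparent instead of black.
profileMaterial.tintColor = UIColor.white.withAlphaComponent(0.99)
```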
Post marked as unsolved
513
Views
Hi guys, how can I control the opacity of a ModelEntity or its mesh in RealityKit?
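There is no direct opacity property on ModelEntity in this API generation; one hedged workaround, building on the tintColor alpha-blending behavior of SimpleMaterial, is to rewrite the entity's materials:

```swift
import RealityKit
import UIKit

// Sketch: approximate per-entity opacity by tinting every SimpleMaterial with
// an alpha-carrying color. Only works if the entity uses SimpleMaterials.
func setOpacity(_ alpha: CGFloat, on entity: ModelEntity) {
    guard var materials = entity.model?.materials as? [SimpleMaterial] else { return }
    for index in materials.indices {
        materials[index].tintColor = UIColor.white.withAlphaComponent(alpha)
    }
    entity.model?.materials = materials
}
```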
Post marked as unsolved
228
Views
I have a .stl file on disk that is being correctly loaded as a MDLAsset by MDLAsset.init(url:), and the MDLMesh is extracted from there seamlessly.
The problem is that I cannot create a ModelEntity using said mesh. All of its constructors take only a MeshResource as an argument, and I can't seem to convert an MDLMesh to a MeshResource, which doesn't make much sense.
Unfortunately I cannot use Entity.loadModel because it either panics with a bad memory access or loads my models incorrectly (the Z dimension is lost, so it's like the model is just a plane).
Any pointers on this?
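I'm not aware of a direct MDLMesh-to-MeshResource converter; on RealityKit 2 (iOS 15+) one possible bridge is to copy the vertex and index data into a MeshDescriptor. A rough sketch that assumes a single triangle submesh with float3 positions (a real converter must handle more layouts):

```swift
import ModelIO
import RealityKit

func meshResource(from mdlMesh: MDLMesh) throws -> MeshResource {
    var descriptor = MeshDescriptor()

    // Copy vertex positions out of the ModelIO buffer.
    let positionData = mdlMesh.vertexAttributeData(
        forAttributeNamed: MDLVertexAttributePosition, as: .float3)!
    var positions: [SIMD3<Float>] = []
    positions.reserveCapacity(mdlMesh.vertexCount)
    var pointer = positionData.dataStart
    for _ in 0..<mdlMesh.vertexCount {
        positions.append(pointer.assumingMemoryBound(to: SIMD3<Float>.self).pointee)
        pointer = pointer.advanced(by: positionData.stride)
    }
    descriptor.positions = MeshBuffer(positions)

    // Copy triangle indices from the first submesh.
    let submesh = mdlMesh.submeshes!.firstObject as! MDLSubmesh
    let indexBuffer = submesh.indexBuffer(asIndexType: .uInt32)
    let indexPointer = indexBuffer.map().bytes.assumingMemoryBound(to: UInt32.self)
    let indices = Array(UnsafeBufferPointer(start: indexPointer, count: submesh.indexCount))
    descriptor.primitives = .triangles(indices)

    return try MeshResource.generate(from: [descriptor])
}
```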
Post marked as unsolved
139
Views
Can you assign an image (PNG) as an Entity within RealityKit? I'm trying to use an image anchor and spawn other images that will be animated when the image anchor returns results.
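RealityKit has no dedicated image entity, but a textured plane behaves like one, and it can be parented to an image anchor so it appears on detection. A sketch (the resource names are hypothetical):

```swift
import RealityKit

// Build a flat plane whose material is the PNG, so it renders like a sprite.
func makeImageEntity(named imageName: String, width: Float) throws -> ModelEntity {
    var material = UnlitMaterial()
    material.baseColor = try .texture(.load(named: imageName))
    let plane = MeshResource.generatePlane(width: width, height: width)
    return ModelEntity(mesh: plane, materials: [material])
}

// Usage: attach the plane to a detected reference image.
// let anchor = AnchorEntity(.image(group: "AR Resources", name: "poster"))
// anchor.addChild(try makeImageEntity(named: "overlay.png", width: 0.1))
// arView.scene.anchors.append(anchor)
```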