Post not yet marked as solved
How can I set up a lower-resolution video format in our ARSession? Thanks in advance.
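In case it helps, here is a minimal sketch of selecting the lowest-resolution capture format before running the session. `arView` is a hypothetical, already-created ARView; `supportedVideoFormats` lists every format the current device offers.

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()

// Pick the format with the fewest pixels from the device's supported list.
if let lowest = ARWorldTrackingConfiguration.supportedVideoFormats
    .min(by: { $0.imageResolution.width * $0.imageResolution.height <
               $1.imageResolution.width * $1.imageResolution.height }) {
    configuration.videoFormat = lowest
}

arView.session.run(configuration)
```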
Hi,
Is it possible to use RoomCaptureSession and ARSession together? We need feature points.
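A sketch, assuming iOS 16's RoomPlan: RoomCaptureSession exposes its underlying ARSession, so you may be able to read ARFrame data (such as feature points) from the same session rather than running a second one.

```swift
import RoomPlan
import ARKit

let captureSession = RoomCaptureSession()

// The capture session's arSession property gives access to the shared
// ARSession; rawFeaturePoints comes from its current frame.
if let frame = captureSession.arSession.currentFrame {
    let points = frame.rawFeaturePoints?.points ?? []
    print("Feature points this frame: \(points.count)")
}
```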
Hello, I am trying to get the transform matrix of an Entity after a translation like this:
uiView.scene.findEntity(named: "Model_Name")?.transform.matrix
I move the 3D model around the scene and call this method again, but the matrix is still the same, so I can't get the latest transform matrix of my 3D model.
This is the way I create an Entity
var anchorEntity = AnchorEntity()
anchorEntity = AnchorEntity(.plane(.any,
                                   classification: .any,
                                   minimumBounds: [0.1, 0.1]))
anchorEntity.addChild(MyEntity, preservingWorldTransform: true)
uiView.scene.addAnchor(anchorEntity)
I can get the latest position (X,Y,Z) of the Entity via EntityTranslationGestureRecognizer
guard let translationGesture = recognizer as? EntityTranslationGestureRecognizer else { return }
let position = translationGesture.location(in: translationGesture.entity)
Is there any way to get the latest transform matrix of an Entity? I want to create an ARAnchor at the latest position of my 3D model.
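One possible approach, sketched below: `transform.matrix` is relative to the entity's parent, so it won't reflect movement of the anchor itself. `transformMatrix(relativeTo: nil)` returns the world-space matrix, which can be used directly to create an ARAnchor. `uiView` is assumed to be the ARView from the question.

```swift
import RealityKit
import ARKit

if let entity = uiView.scene.findEntity(named: "Model_Name") {
    // World-space transform, including any parent/anchor movement.
    let worldMatrix = entity.transformMatrix(relativeTo: nil)
    let anchor = ARAnchor(name: "modelAnchor", transform: worldMatrix)
    uiView.session.add(anchor: anchor)
}
```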
I am developing an app in SwiftUI where I can place an object on a detected plane on screen tap. However, I need to supply a screen location to call
arView.raycast(from: CGPoint, allowing: ARRaycastQuery.Target, alignment: ARRaycastQuery.TargetAlignment)
In UIKit I can easily get the location in the handleTap function using the approach below:
arView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(handleTap(recognizer:))))
@objc func handleTap(recognizer: UITapGestureRecognizer) {
    let tapLocation = recognizer.location(in: arView)
}
How can I implement this in a SwiftUI struct to place objects on the plane where the user tapped?
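A common pattern is to wrap ARView in a UIViewRepresentable and attach the tap recognizer to its Coordinator, which plays the role the view controller plays in UIKit. This is a sketch; the placement logic at the end is a stand-in for your own.

```swift
import SwiftUI
import RealityKit
import ARKit

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        let tap = UITapGestureRecognizer(target: context.coordinator,
                                         action: #selector(Coordinator.handleTap(_:)))
        arView.addGestureRecognizer(tap)
        context.coordinator.arView = arView
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator() }

    class Coordinator: NSObject {
        weak var arView: ARView?

        @objc func handleTap(_ recognizer: UITapGestureRecognizer) {
            guard let arView else { return }
            let tapLocation = recognizer.location(in: arView)
            if let result = arView.raycast(from: tapLocation,
                                           allowing: .estimatedPlane,
                                           alignment: .any).first {
                // Anchor content at the hit location.
                let anchor = AnchorEntity(world: result.worldTransform)
                arView.scene.addAnchor(anchor)
            }
        }
    }
}
```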
I’m trying to use AR Quick Look to show a 3D Three.js model in AR on iOS. Exporting my 3D object to USDZ using the Three.js USDZ exporter works fine. When opening the USDZ file in Xcode, everything seems to be fine.
Though, when opening the USDZ in QuickLook AR, the 3D model is flying above the ground, on my camera’s Y level. The camera PoV is positioned exactly in the middle of the X and Z axis of the 3D model and at the bottom of the Y level.
I have another problem with opening the USDZ in QuickLook AR: the model is invisible at first. Then when I scale the model down to < 10%, the model becomes visible, though it does not visibly change size at all.
Also, the “Model” tab in QuickLook does not even show the 3D model. When switching between the “Model” and “AR” tabs, the model flies by really quick.
For reference, I’ve added my USDZ model below.
What I’m trying to accomplish is to position the 3D model in front of me, and for the 3D model to acknowledge the world shown by the camera. The 3D model should stick to walls, or at least the floor to begin with.
Button click code:
newScene.add(sceneRef)
const pivot = new THREE.Object3D()
newScene.add(pivot)
pivot.add(sceneRef)
// position the object on the pivot, so that it appears 5 meters
// in front of the user.
pivot.position.z = -50
const yaxis = new THREE.Vector3(0, 1, 0)
const zaxis = new THREE.Vector3(0, 0, 1)
const direction = zaxis.clone()
// Apply the camera's quaternion onto the unit vector of one of the axes
// of our desired rotation plane (the z axis of the xz plane, in this case).
direction.applyQuaternion(cameraRef.quaternion)
// Project the direction vector onto the y axis to get the y component
// of the direction.
const ycomponent = yaxis
.clone()
.multiplyScalar(direction.dot(yaxis))
// Subtract the y component from the direction vector so that we are
// left with the x and z components.
direction.sub(ycomponent)
// Normalize the direction into a unit vector again.
direction.normalize()
// Set the pivot's quaternion to the rotation required to get from the z axis
// to the xz component of the camera's direction.
pivot.quaternion.setFromUnitVectors(zaxis, direction)
// Finally, set the pivot's position as well, so that it follows the camera.
newScene.getWorldPosition(cameraRef.position)
newScene.updateMatrixWorld(true)
iosExporter.parse(newScene).then((result) => {
saveUSDZString(result, 'scene.usdz')
})
saveUSDZString function:
function saveString(text: any, filename: any) {
save(new Blob([text], { type: 'application/json' }), filename)
}
save function:
function save(blob: any, filename: any) {
link.href = URL.createObjectURL(blob)
link.download = filename
link.rel = 'ar'
let img = document.createElement('img')
img.alt = 'hi'
img.src = 'https://google.com/img'
link.appendChild(img)
link.click()
}
USDZ Model: https://wetransfer.com/downloads/2d2d2e840f9f964e036cd6077094c33220220630095321/a7f94b9f2bc730fead9107bf133e175220220630095338/193b81?utm_campaign=WT_email_tracking&utm_content=general&utm_medium=download_button&utm_source=notify_recipient_email
I want to add a small banner to my AR model when viewed in Quick Look in my app. I haven't been able to find much information on this, but I found a resource explaining how to add one when the model is hosted on a website. My model is stored locally on the device, so how would I add it?
I am thinking of a web application to scan an object using the LiDAR sensor on an iPhone. It'll be built with a JavaScript-based framework, preferably ReactJS.
We want to target the Safari browser only, for performance reasons. Is there any way to do that?
I'm having some issues with Reality Composer (the latest v1.5 with the latest Xcode beta) and I just wanted to check if these are known issues.
I have an image of a printed map which I'd like to turn into an AR-based interactive map. Something simple, where I tap a pin on the map, and details about that location move up from beneath the map, then move back out of view. The map, the pin and the location information are all separate flat images, and I'm using a horizontal anchor. Unfortunately, it seems that simple bugs are stopping this from working at all.
Say I set the pin to respond to a Tap, and use a "Move, Rotate, Scale to" action to move a second object (the location info) up, then add a Wait action, then another "Move, Rotate, Scale to" to move the info object back down to its original position. The result: the info object doesn't initially move up as far as it should, and the second "Move" pushes the info object away and out of sight completely.
If I try another approach, using the Show and Hide actions (using "Move from below" and "Move to below" as the Motion type), and again with a Wait in the middle, it works the first time, but subsequent taps cause the info object to simply appear, with no incoming animation, and then the outgoing animation works correctly.
Is it just something wrong with my system, or is this broken? If I don't try to move objects around (i.e. Show/Hide with "No Motion") then I have more luck, but I'm feeling pretty constrained.
Thanks in advance for all help with this.
I am attempting to build an AR app using Storyboard and SceneKit. When I ran an existing app that previously worked, it launched but nothing happened. I thought this behavior was odd, so I decided to start from scratch on a new project. I started with the default AR project for Storyboard and SceneKit, and on run it immediately fails with a nil-unwrapping error on the scene, even though the scene file is clearly there. I am also given four build-time warnings:
Could not find bundle inside /Library/Developer/CommandLineTools
failed to convert file with failure reason: *** -[__NSPlaceholderDictionary initWithObjects:forKeys:count:]: attempt to insert nil object from objects[0]
Conversion failed, will simply copy input to output.
Copy failed file:///Users/kruegerwilliams/Library/Developer/Xcode/DerivedData/ARtest-bjuwvdjoflchdaagofedfxpravsc/Build/Products/Debug-iphoneos/ARtest.app/art.scnassets/ship.scn -> file:///Users/kruegerwilliams/Library/Developer/Xcode/DerivedData/ARtest-bjuwvdjoflchdaagofedfxpravsc/Build/Products/Debug-iphoneos/ARtest.app/art.scnassets/ship.scn error:Error Domain=NSCocoaErrorDomain Code=516 "“ship.scn” couldn’t be copied to “art.scnassets” because an item with the same name already exists." UserInfo={NSSourceFilePathErrorKey=/Users/kruegerwilliams/Library/Developer/Xcode/DerivedData/ARtest-bjuwvdjoflchdaagofedfxpravsc/Build/Products/Debug-iphoneos/ARtest.app/art.scnassets/ship.scn, NSUserStringVariant=(
I am currently unsure how to fix these errors. They appear to originate in the command-line tools, because the same issue is present after moving the device support files back to a stable version of Xcode. Is anyone else having these issues?
Hello,
I've made an application that requires LiDAR to function. I know that when creating an application build it is possible to restrict an app to iPhone or iPad only, for example; does a similar restriction exist for hardware functionality?
Thank you!
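As far as I know there is no App Store capability key that maps exactly to LiDAR, so a common fallback is a runtime check. A sketch: scene reconstruction is only supported on LiDAR-equipped devices, so its availability can serve as a proxy.

```swift
import ARKit

if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    // LiDAR available: run the full experience.
} else {
    // Explain to the user that this feature needs a LiDAR-equipped device.
}
```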
I updated my iPhone 12 Pro to the iOS 16 beta, and the motion capture feature in ARKit seems to have stopped functioning. I tried my own custom app (MoCáp) and the BodyDetection sample code from the Apple developer site, and neither works. Does anyone have the same issue?
I'm trying to build an augmented reality app for my iPhone. I made it in Unity, and it builds and runs on Android, but crashes immediately on any iPhone. I get this in my console in Xcode.
2022-06-05 19:23:13.222039-0500 PieceoftheSkyWIP[12433:706015] Error loading /var/containers/Bundle/Application/4E214C36-6812-4936-951A-153FB4DA4454/PieceoftheSkyWIP.app/Frameworks/UnityFramework.framework/UnityFramework: dlopen(/var/containers/Bundle/Application/4E214C36-6812-4936-951A-153FB4DA4454/PieceoftheSkyWIP.app/Frameworks/UnityFramework.framework/UnityFramework, 0x0109): Library not loaded: @rpath/AlgoSdk.framework/AlgoSdk
Referenced from: /private/var/containers/Bundle/Application/4E214C36-6812-4936-951A-153FB4DA4454/PieceoftheSkyWIP.app/Frameworks/UnityFramework.framework/UnityFramework
Reason: tried: ‘/private/var/containers/Bundle/Application/4E214C36-6812-4936-951A-153FB4DA4454/PieceoftheSkyWIP.app/Frameworks/AlgoSdk.framework/AlgoSdk’ (no such file), ‘/private/var/containers/Bundle/Application/4E214C36-6812-4936-951A-153FB4DA4454/PieceoftheSkyWIP.app/Frameworks/UnityFramework.framework/Frameworks/AlgoSdk.framework/AlgoSdk’ (no such file), ‘/private/var/containers/Bundle/Application/4E214C36-6812-4936-951A-153FB4DA4454/PieceoftheSkyWIP.app/Frameworks/AlgoSdk.framework/AlgoSdk’ (no such file), ‘/private/var/containers/Bundle/Application/4E214C36-6812-4936-951A-153FB4DA4454/PieceoftheSkyWIP.app/Frameworks/UnityFramework.framework/Frameworks/AlgoSdk.framework/AlgoSdk’ (no such file), ‘/private/var/containers/Bundle/Application/4E214C36-6812-4936-951A-153FB4DA4454/PieceoftheSkyWIP.app/Frameworks/AlgoSdk.framework/AlgoSdk’ (no such file), ‘/System/Library/Frameworks/AlgoSdk.framework/AlgoSdk’ (no such file)
My coworker also tried to build the project on his M1 Mac and got an actual error along the lines of:
building for iOS, but the linked and embedded framework was built for iOS + iOS Simulator
I'm using Unity 2019.4.0
ARFoundation 4.1.1
ARCore 4.1.1
ARKit 4.1.1
Xcode 13.4.1
Hi, I'm working with RealityKit and Reality Composer. When I build a scene in Reality Composer, add the experience to Xcode, and try the app out, the 3D model appears fine. But when I place my hand over it, it doesn't recognise that my hand is in front of it and shows through. When I place a real object in front of it, it also doesn't recognise whether the model is in front of or behind the object. How do I fix this?
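Occlusion of virtual content by real objects is opt-in. A sketch of enabling both kinds, guarded by capability checks (people occlusion needs `personSegmentationWithDepth`; object occlusion needs scene reconstruction, which is LiDAR-only). `arView` is assumed to be your existing ARView.

```swift
import RealityKit
import ARKit

let configuration = ARWorldTrackingConfiguration()

// People occlusion: hands and bodies can hide virtual content.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

// Object occlusion via the reconstructed scene mesh (LiDAR devices).
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    configuration.sceneReconstruction = .mesh
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
}

arView.session.run(configuration)
```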
Hello, I am new to Reality Composer and AR, so I would like some advice.
I've added hide, show, and wait triggers to my image planes via the Reality Composer tool on the Mac, creating a fake 'frame by frame' animation. The animation plays smoothly in Reality Composer. However, when exported as a .usdz file, the animation plays in the Mac's Quick Look but not when the file is imported into Xcode. In Xcode, it just displays all my images and does not hide them. Does anyone have any idea how to fix this? Thanks!
tl;dr: hide and show triggers created in Reality Composer do not play in Xcode. Why?
I want to make a web-based augmented reality app for iOS devices. Can I place multiple 3D models in different locations from a single link, and how do I place them?
Dear all,
In "Explore advanced rendering with RealityKit 2," Courtland presents how one can efficiently leverage dynamic meshes in RealityKit and update them at runtime.
My question is quite practical: Say, I have
a model of fixed topology and
a set of animations (coordinates of each vertex per frame, finite duration)
that I can only generate at runtime.
How do I drive the mesh updates at 60FPS?
Can I define a reusable AnimationResource for every animation once at startup and then schedule their playback like simple transform animations?
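One possible direction, sketched under the assumption of iOS 15's low-level mesh API from that session: keep a single MeshResource and replace its contents for each animation frame, rather than rebuilding the ModelEntity. Driving this from a per-frame scene subscription would give the desired update rate. The function name and parameters here are illustrative, not from the session.

```swift
import RealityKit

// Replace the contents of an existing MeshResource with one animation
// frame's vertex data. Topology is fixed, so indices are reused each call.
func update(mesh: MeshResource,
            with positions: [SIMD3<Float>],
            indices: [UInt32]) throws {
    var descriptor = MeshDescriptor(name: "animatedFrame")
    descriptor.positions = MeshBuffers.Positions(positions)
    descriptor.primitives = .triangles(indices)
    let contents = try MeshResource.generate(from: [descriptor]).contents
    try mesh.replace(with: contents)
}
```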
Any helpful reply pointing me in the right direction is appreciated. Thank you.
~ Alexander
Short description
See ( https://youtu.be/fD6af6MaFRo ) to directly see the issue.
Description
The way a device (iPhone or iPad) is held affects the ARKit result (camera tracking or scene reconstruction, since both are linked). A device held in portrait orientation won’t behave the same way as a device held in landscape orientation. One orientation gives a correct result, while the other doesn't: the camera tracking or the reconstructed scene has a visible error.
Which orientation gives a correct result?
It depends on the movement the device performs:
It behaves correctly when you pan with the device held in portrait, or tilt with the device held in landscape.
The issue is visible when you pan with the device held in landscape, or tilt with the device held in portrait.
In reality the device is not limited to purely horizontal or vertical movements, so both portrait and landscape orientations produce an incorrect result, in proportion to how much of the "wrong" movement was performed for the device's orientation.
Why some people might not have noticed it
This issue might go under radars for many AR applications, since they tend to:
Focus on one area in space
Turn around in a room in portrait position
Do a lot of movement but without really needing to have a precise ARKit result
What could be the root cause of this
Since we have no access to ARKit, we can only try to guess.
What we deduced is that the device should not move along its vertical axis, i.e. in the direction of its front camera or its Lightning/USB-C connector. What could differ between the device's vertical and horizontal axes? The only element we noticed is the camera sensor's orientation, since the sensor is a rectangle.
How to reproduce the issue
Since the issue is inside ARKit, every application on the App Store that we tested has this issue. So you have two choices:
Create a test project
Create a minimal ARKit project.
Use ARView with the session configured with ARWorldTrackingConfiguration (the issue might be present in other modes; we just haven't tested them).
Enable scene reconstruction.
Use the ARView debugOptions value .showSceneUnderstanding.
Aggregate environment data to generate a mesh by moving the device around.
Disable SceneReconstruction.
Hold the device in landscape orientation.
Do some 360° horizontal turns in the previously scanned environment.
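The setup steps above can be sketched as follows (a LiDAR-equipped device is required for scene reconstruction):

```swift
import RealityKit
import ARKit

let arView = ARView(frame: .zero)
let configuration = ARWorldTrackingConfiguration()

// Steps 2-3: enable scene reconstruction and show the mesh overlay.
configuration.sceneReconstruction = .mesh
arView.debugOptions.insert(.showSceneUnderstanding)
arView.session.run(configuration)

// After scanning, stop meshing to freeze the reconstructed mesh:
// configuration.sceneReconstruction = []
// arView.session.run(configuration)
```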
Expected result:
When moving around in an environment, the displayed mesh provided by showSceneUnderstanding option is correctly aligned with the environment.
Current result:
When moving around in an environment, the displayed mesh is positioned incorrectly.
Download an AR app on the App Store:
Example of AR application:
- SiteScape
An example for you to see the result.
(SiteScape v1.3 on iPhone 12 Pro iOS 15.5).
- MetaScan
- CamTrackAr
A little bit harder to see the issue since only an anchor can be used as a reference to see an offset.
Important note
This issue is critical for the application we develop; we can't release it with this bug. We understand that fixing it can take some time, so we are doing our best to be patient. But this issue has been reported on Feedback Assistant since 16 Jun 2021 (FB9184883), so we have been waiting for almost a year now.
We have absolutely no visibility on how this issue is handled. So, we tried to use the Developer Technical Support (DTS) to ask help. We sent a message to the Apple Developer Program Support. We asked on Feedback Assistant. We learned nothing. What is the priority of this issue? Is even someone assigned to this issue right now? Is one year not enough time to fix it?
So can we have at least some information? And hopefully have this bug finally fixed.
Hello, my problem is that I would like to have the first screen (the screen right before entering AR) removed from AR Quick Look. I want it so that when I open the USDZ, it goes straight into AR without showing the preview screen, similar to how Apple does it when you view the iPhone 13 Pro in AR.
I wrote a simple ARKit app that has a hardcoded "virtual" hat into the Experience.rcproject. The hat is a .usdz file.
I want to add the functionality for users of the app to import their own .usdz hats, so that hats are not hard-coded into the app. There would be an "Upload your hat" button; the user would tap it, import a .usdz file, and the hat would end up on their head.
From the code perspective, the Experience model is strongly typed, so not sure how that could work:
func updateUIView(_ uiView: ARView, context: Context) {
    let arConfiguration = ARFaceTrackingConfiguration()
    uiView.session.run(arConfiguration,
                       options: [.resetTracking, .removeExistingAnchors])
    let arAnchor = try! Experience.loadHat() // I want this to happen dynamically depending on the imported file from the UI
    uiView.scene.anchors.append(arAnchor)
}
Is adding a .usdz file dynamically to Experience somehow possible?
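You likely don't need the strongly typed Experience for user-supplied files. A sketch: load an arbitrary .usdz and attach it to a face anchor yourself. `fileURL` is assumed to come from your UI (e.g. a document picker), and the function name is illustrative.

```swift
import RealityKit
import ARKit

func loadUserHat(from fileURL: URL, into arView: ARView) {
    do {
        // Entity.loadModel accepts any .usdz on disk, not just bundled scenes.
        let hat = try Entity.loadModel(contentsOf: fileURL)
        let faceAnchor = AnchorEntity(.face)
        faceAnchor.addChild(hat)
        arView.scene.addAnchor(faceAnchor)
    } catch {
        print("Could not load model: \(error)")
    }
}
```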
I'm trying to make an app for iMessage and FaceTime that allows users to upload custom Animoji based on 3D files. Is this possible?
I'm running into troubles and could really use some help from my fellow devs.
Thanks,
Ryan