Post not yet marked as solved · 1.3k views
Has anyone been able to access multiple animations stored in a USDZ file through the RealityKit APIs?
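For reference, a minimal sketch of the route I'd expect to work: enumerate the entity's availableAnimations after loading and play one (the file name is a placeholder):

import RealityKit

func listAndPlayAnimations() throws {
    // Load the USDZ ("model.usdz" in the main bundle; placeholder name).
    let entity = try Entity.load(named: "model")
    print("Found \(entity.availableAnimations.count) animation(s)")

    // Play the first animation RealityKit surfaced, looping it.
    if let animation = entity.availableAnimations.first {
        entity.playAnimation(animation.repeat(),
                             transitionDuration: 0.3,
                             startsPaused: false)
    }
}

The open question is exactly whether availableAnimations exposes more than the first animation stored in the USDZ.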
Post marked as solved · 744 views
Does Apple have any documentation on using Reality Converter to convert FBX to USDZ on an M1 Max?
I'm trying to convert an .fbx file to USDZ with Apple's Reality Converter on an M1 Mac (macOS 12.3 Beta), but everything I've tried so far has failed.
When I try to convert .fbx files on my Intel-based iMac Pro, it succeeds.
Following some advice on these forums, I tried installing all of the packages from Autodesk:
https://www.autodesk.com/developer-network/platform-technologies/fbx-sdk-2020-0
FBX SDK 2020.0.1 Clang
FBX Python SDK Mac
FBX SDK 2020.0.1 Python Mac
FBX Extensions SDK 2020.0.1 Mac
Still no joy.
I have a workaround: I still have my Intel-based iMac. But I'd like to switch over to my M1 Mac for all my development.
Any pointers?
Note: I couldn't get the usdzconvert command line tool to work on my M1 Mac either. /usr/bin/python isn't there.
Post not yet marked as solved · 28 views
I have a framework which imports RealityKit. Minimum deployment target for the framework is iOS 13 and the app which uses the framework has iOS 12 as minimum deployment.
RealityKit and RealityFoundation is set to optional in the build phases of the framework and the framework is optional in the app as well.
But on launching the app in an iOS 12 simulator, I keep getting the following crash:
DYLD, can't resolve symbol _$s10RealityKit10HasPhysicsMp
If I change the minimum deployment of the framework to iOS 15, the same code works in iOS 12 simulator. No issues.
Also, if I try any other module with a minimum deployment of iOS 13, such as SwiftUI, I don't see the issue and the app launches fine.
Similarly, if I use ARKit instead of RealityKit, it works fine with no issues, even though ARKit's minimum deployment is iOS 11 and it, too, is not available in simulators.
So why does weak linking RealityKit not work when the parent framework targets iOS 13 and iOS 14, but works alright when targeting iOS 15?
Xcode version: 13.3.1
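For what it's worth, the usual weak-linking pattern is to gate every RealityKit use behind an availability check so its symbols are never touched on iOS 12. A minimal sketch (whether this avoids resolving the HasPhysics protocol descriptor at launch is exactly the open question; conforming your own types to RealityKit protocols may still force the dynamic linker to resolve them eagerly):

import Foundation
import RealityKit

func makeAnchorIfAvailable() -> Entity? {
    if #available(iOS 13.0, *) {
        // RealityKit symbols are only referenced on the iOS 13+ path.
        return AnchorEntity()
    }
    return nil
}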
Post not yet marked as solved · 192 views
Hi, when I select or try to edit a text element, Reality Composer crashes. All other functions work fine. Anybody know this problem?
Application Specific Information:
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Invalid parameter not satisfying: index <= [_itemArray count]'
terminating with uncaught exception of type NSException
abort() called
My system:
Reality Composer Version 1.4
macOS 10.15.4
iMac (Retina 5K, 27-inch, 2017)
3.8 GHz Quad-Core Intel Core i5
Radeon Pro 580
Regards,
Michael
Post not yet marked as solved · 43 views
Dear all,
In "Explore advanced rendering with RealityKit 2," Courtland presents how one can efficiently leverage dynamic meshes in RealityKit and update them at runtime.
My question is quite practical. Say I have:
a model of fixed topology and
a set of animations (coordinates of each vertex per frame, finite duration)
that I can only generate at runtime.
How do I drive the mesh updates at 60 FPS?
Can I define a reusable AnimationResource for every animation once at startup and then schedule their playback like simple transform animations?
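For context, a minimal sketch of the per-frame update loop I have in mind (RealityKit 2 on iOS 15; the animation source and all names here are assumptions):

import RealityKit
import Combine

final class VertexAnimator {
    private var subscription: Cancellable?
    private var frameIndex = 0

    // framesOfPositions: one array of vertex coordinates per animation frame,
    // generated at runtime; the topology never changes.
    func start(entity: ModelEntity,
               framesOfPositions: [[SIMD3<Float>]],
               in arView: ARView) {
        subscription = arView.scene.subscribe(to: SceneEvents.Update.self) { [weak self] _ in
            guard let self = self, let mesh = entity.model?.mesh else { return }
            let positions = framesOfPositions[self.frameIndex % framesOfPositions.count]
            self.frameIndex += 1

            var contents = mesh.contents
            contents.models = .init(contents.models.map { model in
                var model = model
                model.parts = .init(model.parts.map { part in
                    var part = part
                    part.positions = .init(positions)   // same topology, new coordinates
                    return part
                })
                return model
            })
            try? mesh.replace(with: contents)   // update the mesh in place each frame
        }
    }
}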
Any helpful reply pointing me in the right direction is appreciated. Thank you.
~ Alexander
Post not yet marked as solved · 48 views
Could anyone help me with face tracking using the rear camera to apply a 3D object?
Post not yet marked as solved · 1k views
Hello,
In our app we are downloading some user-generated content (.reality files and USDZs) and displaying it within the app.
This worked without issues on iOS 14, but with iOS 15 (release version) there have been a lot of issues with certain .reality files. As far as I can see, USDZ files still work.
I've created a little test project, and the error log is not really helpful:
2021-10-01 19:42:30.207645+0100 RealityKitAssetTest-iOS15[3239:827718] [Assets] Failed to load asset of type 'RealityFileAsset', error:Could not find archive entry named assets/Scéna17_9dfa3d0.compiledscene.
2021-10-01 19:42:30.208097+0100 RealityKitAssetTest-iOS15[3239:827598] [Assets] Failed to load asset path '#18094855536753608259'
2021-10-01 19:42:30.208117+0100 RealityKitAssetTest-iOS15[3239:827598] [Assets] AssetLoadRequest failed because asset failed to load '#18094855536753608259'
2021-10-01 19:42:30.307040+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
(the "throwing -10878" line repeats eight more times)
▿ Failed to load loadRequest.
- generic: "Failed to load loadRequest."
The basic code structure used for loading:
cancellable = Entity.loadAsync(named: entityName, in: .main)
    .sink { completion in
        switch completion {
        case .failure(let error):
            dump(error)
            print("Done")
        case .finished:
            print("Finished loading")
        }
    } receiveValue: { entity in
        print("Entity: \(entity)")
    }
Is there any way to force it to load in a mode that enforces compatibility?
As mentioned, this only happens on iOS 15. Even AR Quick Look can't display the files anymore (there are no issues on iOS 14).
Thanks for any help!
Post marked as solved · 696 views
The context is:
ARSessionDelegate
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {}
Upon image detection, place an overlay of a Reality Composer (RC) entity at the image anchor's location.
Using iOS 14, I have been reliably using:
mySceneAnchorEntity = AnchorEntity(anchor: imageAnchor)
to render additive RC scene content. Additive model entities render correctly in the field of view.
When I upgraded to iOS 15, using the same code that had been working for many (~12) months on all versions of iOS 14, my code failed to render any RC scene content. I get a ghosting of all of the rendered content in the proper location, but it is visible only for a moment and then disappears.
So, I finally found the root cause of the issue. It appears that iOS 15 only renders correctly in my application using:
mySceneAnchorEntity = AnchorEntity(world: imageAnchor.transform)
This led to many frustrating days of debugging. As a side note, iOS 14 renders RC scene entities correctly using both variants of AnchorEntity:
mySceneAnchorEntity = AnchorEntity(anchor: imageAnchor)
and
mySceneAnchorEntity = AnchorEntity(world: imageAnchor.transform)
So, this leads me to believe there is an issue with the behavior of iOS 15 with the following variant of AnchorEntity:
mySceneAnchorEntity = AnchorEntity(anchor: imageAnchor)
Post not yet marked as solved · 182 views
I've noticed that sometimes, even on iOS devices with AR Quick Look, the browser displays the file as text instead of opening it as 3D. I'm curious to know how widespread this is. I have tested on several devices of different sizes, but there seems to be no correlation between the phone (iPhone 5, 6, 7, X) and when the player does this.
There is a test page here, but it appears to happen regardless of the USDZ file; the page also includes an image of the unexpected output: https://lifetime2.unicaster.net/ar/kalinka.html
Anyone who has time to test, a solution, a theory, or other experiences would be much appreciated.
I'm wondering if it could be a memory issue, or a MIME type, although that would consistently cause the error, right?
The model is a conversion from GLB in Reality Converter.
Post marked as solved · 850 views
I am trying to do a hit test of sorts between a person in my ARFrame and a RealityKit Entity. So far I have been able to use the position value of my entity and project it to a CGPoint which I can match up with the ARFrame's segmentationBuffer to determine whether a person intersects with that entity. Now I want to find out if that person is at the same depth as that entity. How do I relate the SIMD3 position value for the entity, which is in meters I think, to the estimatedDepthData value?
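For reference, a sketch of one way to get the entity's depth in the same terms as estimatedDepthData (meters in front of the camera); the entity and frame names are assumptions:

import ARKit
import RealityKit

func entityDepthInCameraSpace(entity: Entity, frame: ARFrame) -> Float {
    // Transform the entity's world position into camera space. In ARKit's
    // camera coordinates -Z points forward, so depth is the negated z value.
    let worldPosition = entity.position(relativeTo: nil)
    let cameraFromWorld = frame.camera.transform.inverse
    let positionInCamera = cameraFromWorld * SIMD4<Float>(worldPosition, 1)
    return -positionInCamera.z   // meters, comparable to estimatedDepthData values
}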
Post marked as solved · 146 views
I would like to extract depth data for a given point in ARSession.currentFrame.smoothedSceneDepth.
Optimally this would end up looking something like:
ARView.depth(at point: CGPoint)
With the point being in UIKit coordinates just like the points passed to the raycasting methods.
I ultimately want to use this depth data to convert a 2D normalized landmark from a Vision image request into a 3D world-space coordinate in the scene; I only lack accurate depth data for the given 2D point.
What I have available is:
The normalized landmark from the Vision request.
Ability to convert this^ to AVFoundation coordinates.
Ability to convert this^ to screen-space/display coordinates.
When the depth data is provided correctly I can combine the 2D position in UIKit/screen-space coordinates with the depth (in meters) to produce an accurate 3D world position with the use of ARView.ray(through:)
What I have not been able to figure out is how to get this depth value for this coordinate on screen.
I can index the pixel buffer like this:
extension CVPixelBuffer {
    func value(for point: CGPoint) -> Float32 {
        CVPixelBufferLockBaseAddress(self, .readOnly)
        let width = CVPixelBufferGetWidth(self)
        let height = CVPixelBufferGetHeight(self)
        // Something potentially going wrong here: this treats `point` as
        // normalized [0, 1] coordinates, not UIKit points.
        let pixelX = Int(CGFloat(width) * point.x)
        let pixelY = Int(CGFloat(height) * point.y)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(self)
        let baseAddress = CVPixelBufferGetBaseAddress(self)!
        assert(CVPixelBufferGetPixelFormatType(self) == kCVPixelFormatType_DepthFloat32)
        let rowData = baseAddress + pixelY * bytesPerRow
        let distanceAtXYPoint = rowData.assumingMemoryBound(to: Float32.self)[pixelX]
        CVPixelBufferUnlockBaseAddress(self, .readOnly)
        return distanceAtXYPoint
    }
}
And then try to use this method like so:
guard let depthMap = (currentFrame.smoothedSceneDepth ?? currentFrame.sceneDepth)?.depthMap else { return nil }
//The depth at this coordinate, in meters.
let depthValue = depthMap.value(for: myGivenPoint)
The frame semantics [.smoothedSceneDepth, .sceneDepth] have been set properly on my ARConfiguration. The depth data is available.
If I hard-code the width and height values like so:
let pixelX: Int = width / 2
let pixelY: Int = height / 2
I get the correct depth value for the center of the screen.
I have only been testing in portrait mode.
But I do not know how to index the depth data for any given point.
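For what it's worth, a hedged sketch of the conversion I would try: map the UIKit point back into normalized image coordinates with ARFrame.displayTransform(for:viewportSize:) inverted, on the assumption that the depth map shares the captured image's orientation and aspect:

import ARKit
import UIKit

func depth(at screenPoint: CGPoint,
           in frame: ARFrame,
           viewBounds: CGRect,
           orientation: UIInterfaceOrientation) -> Float32? {
    guard let depthMap = (frame.smoothedSceneDepth ?? frame.sceneDepth)?.depthMap else {
        return nil
    }
    // Normalize the screen point to [0, 1] in view space.
    let normalizedView = CGPoint(x: screenPoint.x / viewBounds.width,
                                 y: screenPoint.y / viewBounds.height)
    // displayTransform maps normalized image coordinates to normalized view
    // coordinates, so invert it to go from the view back into the image.
    let toImage = frame.displayTransform(for: orientation,
                                         viewportSize: viewBounds.size).inverted()
    let normalizedImage = normalizedView.applying(toImage)
    // value(for:) is the extension above, which treats the point as normalized.
    return depthMap.value(for: normalizedImage)
}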
Post not yet marked as solved · 180 views
I want to move the toy_drummer entity, rotating it toward another model entity.
As you can see in the video, it first moves to the left one very well; then I touch the downward box entity, but it moves in a weird way.
The video is here: youtu.be/3rb9W334Uwg
modelentity2 is the entity that I want to move.
modelentityleft and modelentityright are the entities that modelentity2 should go to, so I call them the objective entities.
Code: when I touch the left box entity to move:
if entity?.parent?.name == "left" {
    self.parent.value4 = "leftarea"
    if oktoadd {
        self.counting = self.counting + 1
        if counting == 1 {
            var position = firstResult.position
            let currentMatrix = self.modelentityleft!.transform.matrix
            let rotation = simd_float4x4(SCNMatrix4MakeRotation(-90.0 * .pi / 180, 0, 1, 0))
            let transform = simd_mul(currentMatrix, rotation)
            self.modelentity2!.move(to: transform,
                                    relativeTo: self.me,
                                    duration: 1,
                                    timingFunction: .linear)
            DispatchQueue.main.asyncAfter(deadline: .now() + 1.1) {
                counting = 0
            }
        }
    }
}
When I click the downward box entity:
if entity?.parent?.name == "down" {
    self.parent.value4 = "downarea"
    if oktoadd {
        self.counting = self.counting + 1
        if counting == 1 {
            self.parent.value4 = "animating"
            var position = firstResult.position
            let currentMatrix = self.modelentitydown!.transform.matrix
            let rotation = simd_float4x4(SCNMatrix4MakeRotation(90.0 * .pi / 180, 0, 1, 0))
            let transform = simd_mul(currentMatrix, rotation)
            print("added moving ...")
            self.modelentity2!.move(to: transform,
                                    relativeTo: self.me,
                                    duration: 1,
                                    timingFunction: .linear)
            DispatchQueue.main.asyncAfter(deadline: .now() + 1.1) {
                counting = 0
            }
        }
    }
}
In my code, I first change the rotation toward the objective entity and then move.
Is there any way to do this more simply? (I also wonder about the hard-coded rotation of 90 or -90 degrees; is there a way to compute it from the objective entity's position instead? See the sketch below.)
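For what it's worth, a sketch of how the rotation could be derived from the target's position instead of the hard-coded angles (entity names follow the post; the model's forward axis is assumed to be +Z):

import Foundation
import RealityKit
import simd

func move(_ mover: ModelEntity, toward target: ModelEntity, duration: TimeInterval = 1) {
    let from = mover.position(relativeTo: nil)
    let to = target.position(relativeTo: nil)
    var transform = Transform()
    transform.translation = to
    // Face along the direction of travel instead of using a fixed angle.
    let direction = normalize(to - from)
    transform.rotation = simd_quatf(from: [0, 0, 1], to: direction)
    mover.move(to: transform, relativeTo: nil, duration: duration, timingFunction: .linear)
}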
Post not yet marked as solved · 71 views
I'm looking to use high-quality and large images as animated sprites via SpriteKit.
Right now the default maximum size of a texture atlas is 4096 × 4096, which is at least an order of magnitude below what I need. There's an option to set a custom maximum size, but the tiny default and the age of the SpriteKit framework are giving me second thoughts, even though I'd very much like to stick with an Apple framework and not have to rely on another party like Unity.
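For what it's worth, that 4096 ceiling is SpriteKit's atlas default rather than a hardware limit; a rough Metal query can show what a single texture on the device allows (the 16384/8192 values are an assumption taken from Apple's Metal feature-set tables):

import Metal

if let device = MTLCreateSystemDefaultDevice() {
    // Apple3-family GPUs and later are listed as allowing 16384 x 16384 2D textures.
    let maxSide = device.supportsFamily(.apple3) ? 16_384 : 8_192
    print("Largest single 2D texture on this GPU: \(maxSide) x \(maxSide)")
}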
Before I invest my time and energy into SpriteKit I'd like to know whether the decade-old framework, running on modern hardware (A11 and newer), can support massive texture atlases while maintaining decent performance (~30 FPS).
If not, is it possible to implement something similar in a less antiquated framework, such as RealityKit (non-AR)?
Thanks in advance for your time.
Post not yet marked as solved · 469 views
I've been watching the 2019 developer video "Building Apps with RealityKit" and working along with it. It shows how to create a custom entity and how to load entities from .usd files. How do you either load a custom entity, convert an Entity to a custom entity, or move the model hierarchy from an Entity to a custom entity? I assume there's a way to do this.
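For reference, one pattern consistent with that talk (names here are placeholders) is not to convert the loaded Entity at all, but to declare a custom subclass and re-parent the loaded hierarchy under it:

import RealityKit

final class BoardEntity: Entity, HasModel, HasCollision {
    required init() {
        super.init()
    }

    convenience init(loadingFrom name: String) throws {
        self.init()
        // Entity.load(named:) returns a plain Entity tree from the .usd file;
        // adding it as a child moves the whole model hierarchy under our type.
        let loaded = try Entity.load(named: name)
        addChild(loaded)
    }
}

Usage would then be something like: let board = try BoardEntity(loadingFrom: "board").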
Post not yet marked as solved · 103 views
I'm currently weighing moving my project from SpriteKit to RealityKit as a way to future-proof it. An important part of the project involves limiting the framerate at which sprites are displayed. While RealityKit seems to be set at 60 FPS, my project demands a much lower framerate.
I should note that I'm not using RealityKit for AR, so there's no camera passthrough to worry about.
Is there a way to limit RealityKit's rendering framerate?
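I don't know of a public RealityKit API that caps the render loop itself, but here is a sketch of a workaround: throttle your own content updates inside the per-frame update event, so sprites only advance at the target rate while the renderer keeps running at its native rate (all names here are assumptions):

import Foundation
import RealityKit
import Combine

final class UpdateThrottler {
    private var subscription: Cancellable?
    private var accumulated: TimeInterval = 0

    func start(in arView: ARView, targetFPS: Double, advance: @escaping () -> Void) {
        let interval = 1.0 / targetFPS
        subscription = arView.scene.subscribe(to: SceneEvents.Update.self) { [weak self] event in
            guard let self = self else { return }
            self.accumulated += event.deltaTime
            guard self.accumulated >= interval else { return }
            self.accumulated = 0
            advance()   // e.g. swap the sprite's texture or step the animation
        }
    }
}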