Post not yet marked as solved
Hi all.
Can anyone share sample code for multi-room capture in SwiftUI?
And another question: is it possible to change the parameters (dimensions) of walls, doors, and windows in an already scanned room by entering real measurements from a laser tape measure? The scanned result has dimensional errors of up to 3 cm.
Thanks.
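For reference, a minimal sketch of the iOS 17 multi-room flow: stop each scan without pausing the underlying ARSession (so world tracking carries over between rooms), collect the CapturedRoom results, and merge them with StructureBuilder. The class and property names here are placeholders, not part of the API.

```swift
import RoomPlan

final class MultiRoomScanner {
    let captureSession = RoomCaptureSession()
    var capturedRooms: [CapturedRoom] = []   // one entry per finished scan

    // Call when one room is done; keeping the ARSession alive preserves
    // world tracking so the next room lands in the same coordinate space.
    func finishRoom() {
        captureSession.stop(pauseARSession: false)
    }

    // Merge all scans into a single structure (iOS 17+).
    func buildStructure() async throws -> CapturedStructure {
        let builder = StructureBuilder(options: [.beautifyObjects])
        return try await builder.capturedStructure(from: capturedRooms)
    }
}
```

This mirrors the flow shown in the WWDC23 RoomPlan session; error handling and UI are omitted.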
Post not yet marked as solved
I'm looking for the full sample code from WWDC23 session https://developer.apple.com/videos/play/wwdc2023/10192
Post not yet marked as solved
Hello,
we are using the RoomPlan API, and our users are hitting errors during scanning at a rate of more than 10%.
The errors are of different types but don't really fit the context. For example, the error message "Room size exceeded" (translated from German) pops up even though the room is relatively small; bigger rooms are not a problem.
We would like to know exactly what triggers these errors so that we can reproduce them. We are going to build workarounds, ideally by snapshotting the ARSession so the user can continue later.
Unfortunately, on error the ARSession is ended and all scanned data is lost.
Has anyone else encountered this?
Post not yet marked as solved
We have a RoomPlan-generated USDZ file that we need to display as a mesh object. Please suggest how to convert it.
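If the USDZ came from the default parametric export and the original CapturedRoom is still available, one option (a sketch, assuming iOS 17's export options) is to re-export it with the mesh option instead of converting the file afterwards:

```swift
import RoomPlan

// Re-export a CapturedRoom as mesh geometry rather than parametric primitives.
func exportAsMesh(_ room: CapturedRoom, to url: URL) throws {
    try room.export(to: url, exportOptions: .mesh)
}
```

If only the USDZ file remains, conversion would have to happen in a DCC tool or with the OpenUSD tooling instead.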
Post not yet marked as solved
I need to create a 2D floor plan, with dimensions labeled, from the 3D result generated by RoomPlan. Is there a straightforward way to create it, or will it require elaborate manual coding?
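There is no built-in 2D export, but each wall in a CapturedRoom carries a transform and a dimensions vector, so a top-down plan can be derived by projecting wall endpoints onto the XZ plane. A rough sketch (the helper name is mine; it assumes a CapturedRoom in hand):

```swift
import RoomPlan
import simd

// Project each wall to a 2D line segment (top-down view) plus its length in meters.
func floorPlanSegments(from room: CapturedRoom)
        -> [(start: SIMD2<Float>, end: SIMD2<Float>, length: Float)] {
    room.walls.map { wall in
        let t = wall.transform
        // Column 3 holds the wall's center position; drop Y for a top-down view.
        let center = SIMD2<Float>(t.columns.3.x, t.columns.3.z)
        // The wall's local X axis runs along its width.
        let dir = simd_normalize(SIMD2<Float>(t.columns.0.x, t.columns.0.z))
        let half = wall.dimensions.x / 2
        return (center - dir * half, center + dir * half, wall.dimensions.x)
    }
}
```

Drawing the segments (Core Graphics, SwiftUI Canvas, SVG, etc.) and labeling them with the returned lengths is then straightforward.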
Post not yet marked as solved
Our app is available in the App Store and it's working well on iOS 16 devices.
A few days ago, we noticed weird crashes from iOS 17 in Organizer, with the only hint being "NO_CRASH_STACK".
After installing iOS 17 on an iPhone, we were able to reproduce the crash directly at launch, but only when the app is downloaded from the App Store (no crash when the app is installed via Xcode 15 beta):
"type": "EXC_CRASH",
"signal": "SIGABRT"
},
"termination": {
"code": 4,
"flags": 518,
"namespace": "DYLD",
"indicator": "Symbol missing",
"details": [
"(terminated at launch; ignore backtrace)"
],
"reasons": [
"Symbol not found: _$s8RoomPlan0A14CaptureSessionCACycfc",
"Referenced from: <XXXX----XXXXXXX> /Volumes/VOLUME//.app/",
"Expected in: <XXXX--**-XXXXX-XXXXXXX> /System/Library/Frameworks/RoomPlan.framework/RoomPlan"
]
Does anybody else encounter this issue?
What should we do to solve it?
Thanks!
Post not yet marked as solved
I'm unable to debug on an iPhone 12 running iOS 17 beta 3 via network. I'm running Xcode 15 beta 6.
The device shows up in Devices and Simulators, and I can debug when connected with a cable.
However, the "Connect via network" option is grayed out; oddly, the checkbox is ticked.
I'm developing an app using RoomPlan, so network connectivity is a must for debugging.
Has anyone else encountered this and found a way around it?
Other devices running iOS 16 connect via network just fine.
Post not yet marked as solved
Hi,
could you add support in the Measure app for LiDAR scans made with the RoomPlan API, with export to USDZ? mesh2mesh can now export dotBim; please check whether the USDZ export carries good semantic data so that the conversion is good. It would also be useful to have a comparator in the Measure app to import a USDZ and check a space or part for similarity or difference.
best regards,
ivo
Post not yet marked as solved
I got multiple crashes in the SlamEngine during CV3DSLAMAnchorUpdateCopySessionID. There is no pattern to when this occurs, and I don't know how to reproduce the error, but since it keeps occurring I thought I'd ask in this forum whether anyone has a similar issue or knows a fix.
Post not yet marked as solved
Hi! I'd like to use a RoomCaptureSessionDelegate to provide custom coaching. I don't want to re-do the entire user interface while scanning a room. Is it possible to combine the two (like getting the animations from the RoomCaptureView into a custom ARKit session, or to use the delegate of the RoomCaptureView capture session)?
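One direction does appear possible: keep RoomCaptureView for the scanning UI and also listen to its underlying capture session by installing your own RoomCaptureSessionDelegate, then drive custom coaching from the instruction callbacks. A sketch (worth verifying that replacing the delegate doesn't starve the view of callbacks it depends on):

```swift
import RoomPlan

final class ScanCoach: NSObject, RoomCaptureSessionDelegate {
    func attach(to captureView: RoomCaptureView) {
        // The view keeps rendering the scan; we observe its session's guidance.
        captureView.captureSession.delegate = self
    }

    // Called as RoomPlan decides what guidance the user needs.
    func captureSession(_ session: RoomCaptureSession,
                        didProvide instruction: RoomCaptureSession.Instruction) {
        switch instruction {
        case .moveCloseToWall:  print("Move closer to the wall")
        case .moveAwayFromWall: print("Move away from the wall")
        case .slowDown:         print("Slow down")
        case .turnOnLight:      print("Turn on a light")
        default:                break
        }
    }
}
```

Going the other way (reusing RoomCaptureView's built-in animations inside a bare ARKit session) has no public API.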
Post not yet marked as solved
How can I display the dimensions of a surface or object from the RoomPlan 3D model result? I have a 3D result like this,
the dimensions printed in the terminal like this,
and I want to display the dimensions like in this image.
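The values printed in the terminal come from each object's `dimensions` vector, so the on-screen labels can be formatted from those directly. A minimal sketch (the helper name is mine):

```swift
import RoomPlan

// Format each detected object's size (meters) for an on-screen label.
func dimensionLabels(for room: CapturedRoom) -> [String] {
    room.objects.map { object in
        let d = object.dimensions   // simd_float3: width, height, depth
        return String(format: "%@: %.2f × %.2f × %.2f m",
                      String(describing: object.category), d.x, d.y, d.z)
    }
}
```

Positioning each label in the 3D scene would then use the object's `transform`, e.g. as an SCNText node or a billboarded sprite above the object's center.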
Post not yet marked as solved
After using the RoomPlan API and scanning, I got files named "Room.json" and "Room.usdz".
In the Room.json file there is a key "coreModel" whose value is a string.
How can I decode this?
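The coreModel value is Apple's internal serialized payload and isn't documented for direct decoding; the practical route is to decode the whole Room.json back into a CapturedRoom, which conforms to Codable. A sketch, assuming the JSON was produced by encoding a CapturedRoom:

```swift
import Foundation
import RoomPlan

// Round-trip: Room.json written via JSONEncoder().encode(capturedRoom)
// can be rebuilt with JSONDecoder, coreModel included.
func loadRoom(from url: URL) throws -> CapturedRoom {
    let data = try Data(contentsOf: url)
    return try JSONDecoder().decode(CapturedRoom.self, from: data)
}
```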
Post not yet marked as solved
Let's say a wall is 10 feet long. Would RoomPlan return 11, 10.1, or 10.01 feet? Thanks a lot.
Post not yet marked as solved
I'd like to be able to associate some data with each CapturedRoom scan and maintain those associations when CapturedRooms are combined in a CapturedStructure.
For example, in the delegate method captureView(didPresent:error:), I'd like to associate external data with the CapturedRoom. That's easy enough to do with a Swift dictionary, using the CapturedRoom's identifier as the key to the associated data.
However, when I assemble a list of CapturedRooms into a CapturedStructure using StructureBuilder's capturedStructure(from:), the rooms in the output CapturedStructure have different identifiers, so their associations to the external data are lost.
Is there any way to track or identify CapturedRoom objects that are input into a StructureBuilder to the rooms in the CapturedStructure output? I looked for something like a "userdata" property on a CapturedRoom that might be preserved, but couldn't find one. And since the room identifiers change when they are built into a CapturedStructure, I don't see an obvious way to do this.
Post not yet marked as solved
Is it possible to use SceneKit nodes within RoomCaptureView or RoomCaptureSession? If so, how can I add a 3D object while scanning is in progress? And will it be possible to add that 3D object to the final captured room?
Post not yet marked as solved
Currently, camera access for developers is restricted, and many frameworks such as the RoomPlan API are also not available on Vision Pro. Access to the camera feed is important for AR apps. Will this be made available in the near future?
Post not yet marked as solved
The RoomPlan API currently has no feature that detects ceilings or related objects. Can we expect any development in this direction in the future?
Post not yet marked as solved
I am writing to seek assistance with a challenge I am facing in a 3D model rendering project. I believe your expertise in this area could be immensely helpful in resolving the issue.
The problem involves difficulties displaying textures on both parent and child nodes within the 3D model. Here are the key details:
The model contains wall_grp objects (doors, windows, and walls). We are using RoomPlan data in an SCNView.
The code depends on SceneKit and the RoomPlan APIs.
When we comment out the child-node code it works, but then we don't have windows and doors on the wall.
```swift
func updateWallObjects() {
    guard !arch_grp.isEmpty else { return }
    let color = UIColor(red: 255/255, green: 229/255, blue: 204/255, alpha: 1.0)
    for obj in arch_grp[0].childNodes {
        let parentNode = obj.flattenedClone()
        for childObj in obj.childNodes {
            // Clone the child so it gets its own material.
            let childNode = childObj.flattenedClone()
            childNode.geometry?.materials = [SCNMaterial()]
            if let name = childObj.name {
                childNode.geometry?.firstMaterial?.diffuse.contents =
                    (removeNumbers(from: name) != "Wall") ? UIColor.white : color
            }
            childObj.removeFromParentNode()
            // Bug fix: add the re-materialed clone, not the original child,
            // otherwise the per-child materials are lost.
            parentNode.addChildNode(childNode)
        }
        parentNode.geometry?.materials = [SCNMaterial()]
        parentNode.geometry?.firstMaterial?.diffuse.contents = color
        obj.removeFromParentNode()
        arch_grp[0].addChildNode(parentNode)
    }
}
```
Please advise.
Post not yet marked as solved
We are using the RoomPlan API to capture data, which is stored in a CapturedRoom variable in our code (referred to as finalResult). We then attempt to save a USDZ file via FileManager. Sometimes it works, but other times we encounter issues like the one below:
```
Coding Error: in _IsValidPathForCreatingPrim at line 3338 of usd/stage.cpp -- Path must be an absolute path: <>
cannotCreateNode(path: "/9EE71ED0F8D6415496A7B9F0C3671DB0321")
```
This is the code we are using for saving the CapturedRoom data:
```swift
func saveFileLocal() {
    if let finalResult {
        let fm = FileManager.default
        let documentsURL = fm.urls(for: .documentDirectory, in: .userDomainMask).first!
        // Likely cause of the intermittent failure: the USD prim name is derived
        // from the file name and may not begin with a digit, so a UUID that starts
        // with 0-9 trips cannotCreateNode. Prefixing with a letter avoids this.
        let fileName = "Room_\(UUID().uuidString).usdz"
        let fileURL = documentsURL.appendingPathComponent(fileName)
        do {
            try finalResult.export(to: fileURL)
        } catch {
            print(error)
        }
    }
}
```
Please help us.
Post not yet marked as solved
We are trying to save a USDZ file via FileManager. Sometimes it saves, but sometimes we get an error like this:
```
path.absoluteURL file:///var/mobile/Containers/Data/Application/6D14A430-47B4-45F2-9D0D-6C31588A6A03/Documents/2896837C-C7E0-4FA8-BFE2-21A59B26D801.usdz
Warning: in SdfPath at line 151 of sdf/path.cpp -- Ill-formed SdfPath </2896837CC7E04FA8BFE221A59B26D801>: syntax error
Coding Error: in _IsValidPathForCreatingPrim at line 3338 of usd/stage.cpp -- Path must be an absolute path: <>
cannotCreateNode(path: "/2896837CC7E04FA8BFE221A59B26D801")
```
```swift
func saveFileLocal() {
    if let finalResult {
        let fm = FileManager.default
        var path = fm.urls(for: .documentDirectory, in: .userDomainMask).first!
        // Two fixes: the interpolation was missing its backslash, and the USD
        // prim name derived from the file name may not begin with a digit, so
        // prefix the UUID with a letter to avoid the cannotCreateNode error.
        let fileName = "Room_\(UUID().uuidString).usdz"
        path.appendPathComponent(fileName)
        do {
            try finalResult.export(to: path)
        } catch {
            print(error)
        }
    }
}
```
```swift
func removeFiles() {
    let fm = FileManager.default
    let path = fm.urls(for: .documentDirectory, in: .userDomainMask).first!
    do {
        let content = try fm.contentsOfDirectory(atPath: path.path)
        for c in content {
            // Build the file URL directly instead of round-tripping through a string.
            try fm.removeItem(at: path.appendingPathComponent(c))
        }
    } catch {
        print(error)
    }
}
```