When I start the RoomPlan scanner, it shows coaching prompts such as "Move device to start" and others. I am using RoomPlan in China and would like to show my own custom prompts in my app. How can I do this?
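One possible approach, sketched under the assumption that you drive RoomCaptureSession directly rather than using RoomCaptureView's built-in coaching overlay (which owns the "Move device to start" string): implement RoomCaptureSessionDelegate and map each RoomCaptureSession.Instruction to your own localized text. The localization keys below are hypothetical.

```swift
import UIKit
import RoomPlan

final class ScanViewController: UIViewController, RoomCaptureSessionDelegate {
    private let session = RoomCaptureSession()
    private let promptLabel = UILabel() // your own coaching UI

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self
        session.run(configuration: RoomCaptureSession.Configuration())
    }

    // RoomPlan reports coaching hints here; show any custom/localized text.
    func captureSession(_ session: RoomCaptureSession,
                        didProvide instruction: RoomCaptureSession.Instruction) {
        switch instruction {
        case .moveCloseToWall:  promptLabel.text = NSLocalizedString("prompt.closer", comment: "")
        case .moveAwayFromWall: promptLabel.text = NSLocalizedString("prompt.away", comment: "")
        case .slowDown:         promptLabel.text = NSLocalizedString("prompt.slow", comment: "")
        case .turnOnLight:      promptLabel.text = NSLocalizedString("prompt.light", comment: "")
        case .lowTexture:       promptLabel.text = NSLocalizedString("prompt.texture", comment: "")
        case .normal:           promptLabel.text = nil
        @unknown default:       promptLabel.text = nil
        }
    }
}
```

Note that with a bare RoomCaptureSession you also have to render the camera feed yourself (e.g. via an ARSCNView sharing session.arSession).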
RoomPlan
Create parametric 3D scans of rooms and room-defining objects.
Posts under RoomPlan tag: 89 posts
Hi,
I want to capture the real-world colors and textures of objects in RoomPlan scans and export the model as USDZ. Is there a way to achieve this?
I am currently working on a project that merges multiple scans into a single structure using RoomPlan JSON files. I am facing an issue and would greatly appreciate your assistance in resolving it.
The problem I am encountering is as follows:
"Cannot process multiFloorPlan: Invalid room location in structure"
Captured room json file paths:
https://mimesisfbs.s3.ap-south-1.amazonaws.com/project_3dfile_json/2088a003-5712-4e9f-91eb-601b145ca98e-project_3dfile_json_file.json
https://mimesisfbs.s3.ap-south-1.amazonaws.com/project_3dfile_json/10ccd0d9-7843-41bd-a021-a57781231059-project_3dfile_json_file.json
We are using the same code as in the example:
https://developer.apple.com/documentation/RoomPlan/merging_multiple_scans_into_a_single_structure
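For reference, a minimal merge sketch, assuming the exported JSON files each decode to a CapturedRoom (which conforms to Codable) and the iOS 17 StructureBuilder API:

```swift
import Foundation
import RoomPlan

func mergeRooms(jsonURLs: [URL]) async throws -> CapturedStructure {
    let decoder = JSONDecoder()
    let rooms: [CapturedRoom] = try jsonURLs.map { url in
        try decoder.decode(CapturedRoom.self, from: Data(contentsOf: url))
    }
    let builder = StructureBuilder(options: [.beautifyObjects])
    // An "Invalid room location" error typically means the rooms do not
    // share a coordinate space; rooms should be captured in one ARSession
    // (or with relocalization) so their world transforms line up.
    return try await builder.capturedStructure(from: rooms)
}
```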
Hi all.
Can anyone show code for multi-room capture in SwiftUI?
And another question: is it possible to change the parameters (dimensions) of walls, doors, and windows in an already scanned room by entering real dimensions from a laser tape measure? The scanned model has dimensional errors of up to 3 cm.
Thanks.
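RoomCaptureView is UIKit-only, so in SwiftUI it is typically wrapped in a UIViewRepresentable. A minimal sketch, assuming the iOS 17 initializer that takes an existing ARSession (which is what lets multiple rooms share one coordinate space):

```swift
import SwiftUI
import ARKit
import RoomPlan

struct RoomCaptureViewRepresentable: UIViewRepresentable {
    let arSession: ARSession // reuse the same session across rooms

    func makeUIView(context: Context) -> RoomCaptureView {
        let view = RoomCaptureView(frame: .zero, arSession: arSession)
        view.captureSession.run(configuration: RoomCaptureSession.Configuration())
        return view
    }
    func updateUIView(_ view: RoomCaptureView, context: Context) {}
}
```

To continue with the next room, call `view.captureSession.stop(pauseARSession: false)` and present the view again with the same ARSession, then combine the results with StructureBuilder.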
I'm looking for the full sample code from WWDC23 session https://developer.apple.com/videos/play/wwdc2023/10192
Hello,
we are using the RoomPlan API and our users are facing issues during scanning, with a failure rate of more than 10%.
The errors are of different types but don't really fit the context. For example, the error message "Room size exceeded" (translated from German) pops up even though the room is relatively small; bigger rooms are not a problem.
We would like to know exactly what triggers these errors so that we can reproduce them. We are going to build workarounds, ideally such as snapshotting the ARSession to continue later.
Unfortunately, on errors the ARSession is ended and all scanned data is lost.
Has anyone else encountered this?
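One hedged mitigation, assuming the iOS 16/17 delegate API: the session hands back whatever it captured in `captureSession(_:didEndWith:error:)`, so even on an error such as `exceedSceneSizeLimit` it may be possible to build a partial room instead of losing everything. `ScanCoordinator` and `handle(_:)` are hypothetical names here.

```swift
import RoomPlan

extension ScanCoordinator: RoomCaptureSessionDelegate {
    func captureSession(_ session: RoomCaptureSession,
                        didEndWith data: CapturedRoomData, error: Error?) {
        if let error {
            // e.g. RoomCaptureSession.CaptureError.exceedSceneSizeLimit
            print("Scan ended with error: \(error)")
        }
        Task {
            // Attempt to salvage a partial result even after an error.
            if let room = try? await RoomBuilder(options: [.beautifyObjects])
                .capturedRoom(from: data) {
                await handle(room) // hypothetical handler
            }
        }
    }
}
```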
We have a RoomPlan-generated USDZ file that we need to display as a mesh object. Please suggest how to convert it.
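A USDZ exported by RoomPlan can be loaded straight into SceneKit, where each surface arrives as an SCNNode with mesh geometry. A minimal sketch (the file URL is assumed to point at the exported USDZ):

```swift
import SceneKit

func loadRoomMesh(from usdzURL: URL) throws -> SCNScene {
    let scene = try SCNScene(url: usdzURL, options: nil)
    // Each wall/door/object is a node whose SCNGeometry you can restyle.
    scene.rootNode.enumerateChildNodes { node, _ in
        node.geometry?.firstMaterial?.lightingModel = .physicallyBased
    }
    return scene
}
```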
Hi,
could you add support in the Measure app for LiDAR scans using the RoomPlan API, with export to USDZ? mesh2mesh can now export dotBim; please check whether the USDZ export carries good semantic data so the conversion works well. It would also be useful for the Measure app to have a comparator that imports a USDZ and checks a space or part for similarity or differences.
best regards,
ivo
Hi! I'd like to use a RoomCaptureSessionDelegate to provide custom coaching. I don't want to redo the entire user interface while scanning a room. Is it possible to combine the two (e.g. getting the animations from the RoomCaptureView into a custom ARKit session, or using the delegate of the RoomCaptureView's capture session)?
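RoomCaptureView exposes its `captureSession`, so one approach is to attach your own RoomCaptureSessionDelegate to it. A hedged sketch (whether the view's built-in coaching keeps working after you replace the session delegate should be verified on device):

```swift
import RoomPlan

final class CoachingObserver: NSObject, RoomCaptureSessionDelegate {
    func captureSession(_ session: RoomCaptureSession,
                        didProvide instruction: RoomCaptureSession.Instruction) {
        // Drive your own coaching UI from the same instruction stream.
        print("Instruction: \(instruction)")
    }
}

let roomCaptureView = RoomCaptureView(frame: .zero)
let observer = CoachingObserver()
// The delegate is weak, so keep a strong reference to the observer.
roomCaptureView.captureSession.delegate = observer
```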
How can I display the dimension of a surface or object from the RoomPlan result in the 3D model? I have a 3D result like this,
and the resulting dimensions in the terminal like this,
and I want the dimensions displayed like in this image.
After using the RoomPlan API and scanning, I got files named "Room.json" and "Room.usdz".
In the Room.json file there is a key "coreModel" whose value is a string.
How can I decode this?
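"coreModel" appears to be an opaque, undocumented blob, so rather than decoding that key by hand, one option is to decode the whole file as a CapturedRoom, which conforms to Codable and handles coreModel internally:

```swift
import Foundation
import RoomPlan

let data = try Data(contentsOf: URL(fileURLWithPath: "Room.json"))
let room = try JSONDecoder().decode(CapturedRoom.self, from: data)
print(room.walls.count, room.objects.count)
```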
I'd like to be able to associate some data with each CapturedRoom scan and maintain those associations when CapturedRooms are combined in a CapturedStructure.
For example, in the delegate method captureView(didPresent:error:), I'd like to associate external data with the CapturedRoom. That's easy enough to do with a Swift dictionary, using the CapturedRoom's identifier as the key to the associated data.
However, when I assemble a list of CapturedRooms into a CapturedStructure using StructureBuilder.init(from:), the rooms in the output CapturedStructure have different identifiers so their associations to the external data are lost.
Is there any way to track or identify CapturedRoom objects that are input into a StructureBuilder to the rooms in the CapturedStructure output? I looked for something like a "userdata" property on a CapturedRoom that might be preserved, but couldn't find one. And since the room identifiers change when they are built into a CapturedStructure, I don't see an obvious way to do this.
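Since there is no public user-data hook, one hedged workaround is to match rooms geometrically: compute a fingerprint per input room (e.g. the centroid of its wall positions), then pair each room in `CapturedStructure.rooms` with the nearest input fingerprint. The builder may adjust positions slightly, so match by nearest rather than by equality.

```swift
import RoomPlan
import simd

// Fingerprint a room by the centroid of its wall positions.
func wallCentroid(_ room: CapturedRoom) -> SIMD3<Float> {
    let positions = room.walls.map {
        SIMD3<Float>($0.transform.columns.3.x,
                     $0.transform.columns.3.y,
                     $0.transform.columns.3.z)
    }
    guard !positions.isEmpty else { return .zero }
    return positions.reduce(.zero, +) / Float(positions.count)
}

// Map each built room's identifier back to the closest original room's.
func matchRooms(inputs: [CapturedRoom], outputs: [CapturedRoom]) -> [UUID: UUID] {
    var mapping: [UUID: UUID] = [:]
    for out in outputs {
        let c = wallCentroid(out)
        if let best = inputs.min(by: {
            simd_distance(wallCentroid($0), c) < simd_distance(wallCentroid($1), c)
        }) {
            mapping[out.identifier] = best.identifier
        }
    }
    return mapping
}
```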
Is it possible to use SceneKit nodes within RoomCaptureView or RoomCaptureSession? If so, then how can I add a 3D object while the scanning process is going on? Will it be possible to add that 3D object to the final 3D captured room?
For the RoomPlan API, there is currently no feature that detects ceilings or related objects. Can we expect any development in this direction in the future?
Currently, camera access for developers is restricted, and many frameworks such as the RoomPlan API are also not available on Vision Pro. Access to the camera feed is important for AR apps. Will this be made available in the near future?
I got multiple crashes where the app crashed in the SlamEngine during CV3DSLAMAnchorUpdateCopySessionID. There is no pattern to when this occurs and I do not know how to reproduce it, but since it is occurring multiple times I thought I would ask in this forum if anyone has a similar issue or knows a fix.
I am writing to seek assistance with a challenge I am facing while working on a 3D model rendering project. I believe your expertise in this area could be immensely helpful in resolving the issue.
The problem I am encountering involves difficulties in displaying textures on both parent and child nodes within the 3D model. Here are the key details of the problem:
The model contains wall_grp objects (doors, windows, and walls). We are using RoomPlan data in an SCNView.
This code depends on the SceneKit and RoomPlan APIs.
When we comment out the child-node code it works, but then we don't have windows and doors on the walls.
```
func updateWallObjects() {
    guard !arch_grp.isEmpty else { return }
    let color = UIColor(red: 255/255, green: 229/255, blue: 204/255, alpha: 1.0)
    for obj in arch_grp[0].childNodes {
        let parentNode = obj.flattenedClone()
        for childObj in obj.childNodes {
            let childNode = childObj.flattenedClone()
            let childMaterial = SCNMaterial()
            childNode.geometry?.materials = [childMaterial]
            if let name = childObj.name {
                // Doors/windows get white; walls get the wall color.
                if removeNumbers(from: name) != "Wall" {
                    childNode.geometry?.firstMaterial?.diffuse.contents = UIColor.white
                } else {
                    childNode.geometry?.firstMaterial?.diffuse.contents = color
                }
            }
            childObj.removeFromParentNode()
            // Add the clone that was just textured, not the original child.
            parentNode.addChildNode(childNode)
        }
        let material = SCNMaterial()
        parentNode.geometry?.materials = [material]
        parentNode.geometry?.firstMaterial?.diffuse.contents = color
        obj.removeFromParentNode()
        arch_grp[0].addChildNode(parentNode)
    }
}
```
Please advise.
How can I draw a CapturedRoom.Surface curve in SceneKit? Is there a way, e.g. by using a UIBezierPath or by splitting it up into segments?
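One sketch is to approximate the curved section by sampling the arc into straight segments and building a UIBezierPath, which can then be extruded into SceneKit geometry with `SCNShape(path:extrusionDepth:)`. The radius/start/end values are assumed to come from the surface's curve data (property names vary by SDK version), so this is a hedged approximation, not the framework's own method.

```swift
import UIKit

// Sample an arc (assumed radius and start/end angles in radians) into
// straight segments; more segments give a smoother approximation.
func arcPath(radius: CGFloat, start: CGFloat, end: CGFloat,
             segments: Int = 32) -> UIBezierPath {
    let path = UIBezierPath()
    for i in 0...segments {
        let t = start + (end - start) * CGFloat(i) / CGFloat(segments)
        let p = CGPoint(x: radius * cos(t), y: radius * sin(t))
        if i == 0 {
            path.move(to: p)
        } else {
            path.addLine(to: p)
        }
    }
    return path
}
```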
Hi! I'm currently doing a simple RoomCaptureView-based room capture (iOS 17, iPhone 15 super mega ultra pro max), and I'm unable to export anything I capture, even though I have a dollhouse I can play with after the export.
Attached is a screen grab of the doll-house expand/collapse view. (it always fails) My export code is in the RoomCaptureView delegate. The passed in error is nil.
```
func captureView(didPresent processedResult: CapturedRoom, error: (Error)?) {
    let fm = FileManager()
    do {
        let documentDirectoryURL = try fm.url(for: .documentDirectory,
                                              in: .userDomainMask,
                                              appropriateFor: nil,
                                              create: true)
        let now = generateCurrentTimeStamp() // of the form 2023_10_02_09_45_58
        let destinationURL = documentDirectoryURL.appendingPathComponent("\(now).usdz")
        try processedResult.export(to: destinationURL)
    } catch {
        print("oops no processed result? \(error)")
    }
}
```
The constructed URL is: file:///var/mobile/Containers/Data/Application/7DD98157-909A-40A1-9271-1AFCD5336E8B/Documents/2023_10_02_09_45_58.usdz
The error in the catch is (not sure where the rest of my path went...):
▿ Error
▿ cannotCreateNode : 1 element
- path : "/2023_10_02_09_45_58"
And this is printed to the console:
Warning: in SdfPath at line 151 of sdf/path.cpp -- Ill-formed SdfPath </2023_10_02_09_38_17>: syntax error
Coding Error: in _IsValidPathForCreatingPrim at line 3338 of usd/stage.cpp -- Path must be an absolute path: <>
The same sequence happens if I use any of the different export types (mesh-default, parametric, model).
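The SdfPath warning suggests the file's base name is the culprit: USD prim identifiers must start with a letter or underscore, and a name beginning with a digit (`2023_…`) is an ill-formed SdfPath. A hedged fix, assuming that diagnosis, is simply to prefix the timestamp:

```swift
// USD prim names can't start with a digit, so prefix the
// timestamp-based filename with a letter before exporting.
let destinationURL = documentDirectoryURL
    .appendingPathComponent("Room_\(now).usdz")
try processedResult.export(to: destinationURL)
```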
I'd attach my project but looks like zip files and dropbox links are forbidden here. If you want a small sample project, feel free to email me at markd at borkware dot com (since looks like emails are forbidden here too)
Thanks in advance! I'd love to present this to my cocoaheads in a couple of weeks.
We are attempting to update the texture on a node. The code below works correctly when we use a color, but it encounters issues when we attempt to use an image. The image is available in the bundle and displays correctly in other parts of our application. This texture is being applied to both the floor and the wall. Please assist us with this issue.
```
for obj in Floor_grp[0].childNodes {
    let node = obj.flattenedClone()
    node.transform = obj.transform
    let imageMaterial = SCNMaterial()
    node.geometry?.materials = [imageMaterial]
    node.geometry?.firstMaterial?.diffuse.contents = UIColor.brown
    obj.removeFromParentNode()
    Floor_grp[0].addChildNode(node)
}
```
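If a UIImage silently fails where a UIColor works, a common cause is texture-coordinate handling on the generated geometry. A hedged sketch (the asset name is hypothetical, and `node` is the cloned node from the loop above) that sets the image along with repeat wrap modes and a tiling transform:

```swift
import SceneKit
import UIKit

let imageMaterial = SCNMaterial()
imageMaterial.diffuse.contents = UIImage(named: "floorTexture") // hypothetical bundle asset
imageMaterial.diffuse.wrapS = .repeat
imageMaterial.diffuse.wrapT = .repeat
// Tile the image if the geometry's UVs stretch it across a large area.
imageMaterial.diffuse.contentsTransform = SCNMatrix4MakeScale(4, 4, 1)
node.geometry?.materials = [imageMaterial]
```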