
Swift Package Manager expose internal dependencies
Hello, this might be a really basic question, as I don't have a lot of experience with Swift Package Manager yet. I would like to know what I need to do to specify that the frameworks my package depends on are also imported and exposed in any file where I import my package. For example: I have a framework called »UIComponents« that internally imports UIKit to build custom components. In my app I import UIComponents and have its contents available, but I can't reference anything from UIKit there – for that I would need to import UIKit separately. So how can I import my framework and also get access to UIKit, similar to how importing UIKit also gives me access to Foundation? Thanks!
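For reference, the closest thing I've found is the underscored `@_exported` attribute – a minimal sketch (note that `@_exported` is not an officially supported Swift attribute, so I'm assuming it's acceptable to rely on it):

```swift
// Inside any source file of the UIComponents package:
// re-exports UIKit so that `import UIComponents` also brings
// UIKit's symbols into scope in the client.
// Note: @_exported is underscored and officially unsupported.
@_exported import UIKit
```

With that in place, a file in the app that only does `import UIComponents` can reference UIKit types directly. I'd still like to know if there's a supported way to do this.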
2 replies · 0 boosts · 3.9k views · Jun ’20
Create Alpha Matte from SceneDepth API in SceneKit (Occlusion)
Hello, I would love to use the new occlusion feature that has been added to RealityKit. But as my app still requires a lot of features that RealityKit doesn't deliver yet, I have to keep using SceneKit for now. I was wondering if I could convert the depth map provided by the SceneDepth API into an alpha matte that I could then feed into a SCNTechnique to achieve a 'poor man's' occlusion. Is there some kind of CIFilter or workflow that could help me with this? Maybe some kind of edge detection? Thankful for any hints!
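To make the direction I'm thinking of concrete, here's a rough sketch (the CIColorMatrix/CIColorClamp combination and the `maxDepth` cutoff are my own assumptions, not a verified solution):

```swift
import ARKit
import CoreImage

// Rough sketch: wrap the Float32 depth map in a CIImage and scale/clamp
// the depth values (meters) into a 0...1 grayscale matte that a
// SCNTechnique pass could sample. maxDepth is an illustrative cutoff.
func alphaMatte(from frame: ARFrame, maxDepth: CGFloat = 2.0) -> CIImage? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }
    let depthImage = CIImage(cvPixelBuffer: depthMap)
    // Map the single depth channel into all RGB channels, scaled by maxDepth.
    let scale = CIVector(x: 1.0 / maxDepth, y: 0, z: 0, w: 0)
    return depthImage
        .applyingFilter("CIColorMatrix", parameters: [
            "inputRVector": scale,
            "inputGVector": scale,
            "inputBVector": scale
        ])
        .applyingFilter("CIColorClamp") // clamp everything to 0...1
}
```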
3 replies · 0 boosts · 1.2k views · Jul ’20
Set camera feed as texture input for CustomMaterial
Hello, in this project https://developer.apple.com/documentation/arkit/content_anchors/tracking_and_visualizing_faces there is some sample code that shows how to map the camera feed onto an object with SceneKit and a shader modifier. I would like to know if there is an easy way to achieve the same thing with a CustomMaterial and RealityKit 2. Specifically, I'm interested in the best way to pass the background of the RealityKit environment as a texture into the custom shader. In SceneKit this was really easy, as one could just do the following:

```swift
material.diffuse.contents = sceneView.scene.background.contents
```

As the texture input for CustomMaterial requires a TextureResource, I would probably need a way to create a CGImage from the background or camera feed on the fly. What I've tried so far is accessing the captured image from the camera feed and creating a CGImage from the pixel buffer like so:

```swift
guard
    let frame = arView.session.currentFrame,
    let cameraFeedTexture = CGImage.create(pixelBuffer: frame.capturedImage),
    let textureResource = try? TextureResource.generate(
        from: cameraFeedTexture,
        withName: "cameraFeedTexture",
        options: .init(semantic: .color)
    )
else {
    return
}

// assign texture
customMaterial.custom.texture = .init(textureResource)
```

```swift
import VideoToolbox

extension CGImage {
    /// Creates a CGImage from a pixel buffer via VideoToolbox.
    public static func create(pixelBuffer: CVPixelBuffer) -> CGImage? {
        var cgImage: CGImage?
        VTCreateCGImageFromCVPixelBuffer(pixelBuffer, options: nil, imageOut: &cgImage)
        return cgImage
    }
}
```

This seems wasteful though, and is also quite slow. Is there any other way to accomplish this efficiently, or would I need to go the post-processing route? In the sample code the displayTransform for the view is also passed as an SCNMatrix4, but CustomMaterial's custom.value only accepts a SIMD4. Is there another way to pass in the matrix? Another idea I had was to create a CustomMaterial from an OcclusionMaterial, which already seems to contain information about the camera feed, but so far I've had no luck with that. Thanks for the support!
8 replies · 0 boosts · 3.7k views · Jul ’21
Animate transparency of blending property in RealityKit 2
Hello, PhysicallyBasedMaterial in RealityKit 2 has a blending property to adjust the transparency of a material. Is there a way to animate this over time to fade entities in and out? I've tried the new FromToByAnimation API but could not figure out whether there is a supported BindTarget for the transparency. Ideally I would like to achieve something similar to SceneKit's SCNAction.fadeIn(duration: …), which also worked on a whole node. I figured I could also go the route of a custom fragment shader, though that seems like overkill. As Reality Composer also supports fade actions, I would assume this is at least supported behind the scenes. Thanks for any help!
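For completeness, this is the manual fallback I'm currently considering – a sketch that just reassigns the blending value from the scene's update event (`fadeDuration` and the surrounding setup are illustrative):

```swift
import Combine
import RealityKit

// Sketch: fade a ModelEntity in by reassigning its material's blending
// value on every frame. The returned Cancellable must be stored for
// the subscription to stay alive.
func fadeIn(_ modelEntity: ModelEntity, in arView: ARView,
            fadeDuration: Float = 1.0) -> Cancellable {
    var elapsed: Float = 0
    return arView.scene.subscribe(to: SceneEvents.Update.self) { event in
        elapsed += Float(event.deltaTime)
        let progress = min(elapsed / fadeDuration, 1)
        guard var material = modelEntity.model?.materials.first as? PhysicallyBasedMaterial else { return }
        material.blending = .transparent(opacity: .init(scale: progress))
        modelEntity.model?.materials = [material]
    }
}
```

This works, but a proper animation API for it would obviously be much nicer.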
2 replies · 0 boosts · 1.8k views · Jul ’21
Transparent blending with gradient texture leads to color banding (RealityKit 2)
Hello, I have a cylinder with an UnlitMaterial and a base color set. Now I want to apply a gradient as the alpha mask, so I get this kind of halo, GTA-like checkpoint look. The code:

```swift
var baseMaterial = UnlitMaterial(color: UIColor.red)
// maskTextureResource is the gradient mask
baseMaterial.blending = .transparent(opacity: .init(scale: 100, texture: .init(maskTextureResource)))
baseMaterial.opacityThreshold = 0
```

This works but unfortunately leads to some ugly, visible gradient banding. I've also tried to play with the scale of the blending texture, but that did not help. As an alternative approach I tried to solve this via a custom surface shader. Code below:

```metal
[[visible]]
void gradientShader(realitykit::surface_parameters params)
{
    auto surface = params.surface();
    float2 uv = params.geometry().uv0();

    float h = 0.5; // adjust position of middleColor
    half startAlpha = 0.001;
    half middleAlpha = 1;
    half endAlpha = 0.001;

    half alpha = mix(mix(startAlpha, middleAlpha, half(uv.y / h)),
                     mix(middleAlpha, endAlpha, half((uv.y - h) / (1.0 - h))),
                     half(step(h, uv.y)));

    surface.set_emissive_color(half3(params.material_constants().emissive_color()));
    surface.set_base_color(half3(params.material_constants().base_color_tint()));
    surface.set_opacity(alpha);
}
```

The result looks really nice and smooth, but unfortunately this now also culls the inner part of the cylinder, even on the semitransparent parts. What I want is to have the effect applied to both the outer and the inner part of the cylinder, so the transparent part of the outside lets you see through to the inside. I got this working by using a PhysicallyBasedMaterial instead of an UnlitMaterial (which does not support blending out of the box), but again ran into the banding issue. On my CustomMaterial, faceCulling is set to .none. Here is how it currently looks – as you can see, in the left one the alpha mask is not smooth and has banding artefacts. Thank you for any help!
5 replies · 0 boosts · 1.8k views · Sep ’21
dyld: Library not loaded: RealityFoundation – Error on iOS 14 with RealityKit 2
I've just implemented some necessary fixes in our app to ensure compatibility with iOS 15 and RealityKit 2. Now when I deploy the app via Xcode 13 RC (Version 13.0 (13A233)) I get the following crash right at launch:

```
dyld: Library not loaded: /System/Library/Frameworks/RealityFoundation.framework/RealityFoundation
  Referenced from: /private/var/containers/Bundle/Application/…
  Reason: image not found
dyld: launch, loading dependent libraries
DYLD_LIBRARY_PATH=/usr/lib/system/introspection
DYLD_INSERT_LIBRARIES=/Developer/usr/lib/libBacktraceRecording.dylib:/Developer/usr/lib/libMainThreadChecker.dylib:/Developer/Library/PrivateFrameworks/DTDDISupport.framework/libViewDebuggerSupport.dylib
```

Is this a known issue? I've already tried deleting derived data and cleaning the project, but the problem persists. The minimum deployment target is iOS 13 for the main app and iOS 14 for the App Clip. All iOS 15-related fixes are wrapped in `if #available(iOS 15.0, *) { … }`. This is a pretty major problem for us, as we now can't send out TestFlight builds or upload to App Store Connect for Monday. Thanks!
1 reply · 0 boosts · 3.9k views · Sep ’21
Loading of older .reality files is broken on iOS 15 (works on iOS 14)
Hello, in our app we download some user-generated content (.reality files and USDZs) and display it within the app. This worked without issues on iOS 14, but with iOS 15 (release version) there have been a lot of issues with certain .reality files. As far as I can tell, USDZ files still work. I've created a little test project, and the error message log is not really helpful:

```
2021-10-01 19:42:30.207645+0100 RealityKitAssetTest-iOS15[3239:827718] [Assets] Failed to load asset of type 'RealityFileAsset', error:Could not find archive entry named assets/Scéna17_9dfa3d0.compiledscene.
2021-10-01 19:42:30.208097+0100 RealityKitAssetTest-iOS15[3239:827598] [Assets] Failed to load asset path '#18094855536753608259'
2021-10-01 19:42:30.208117+0100 RealityKitAssetTest-iOS15[3239:827598] [Assets] AssetLoadRequest failed because asset failed to load '#18094855536753608259'
2021-10-01 19:42:30.307040+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
[the 'throwing -10878' line repeats eight more times]
▿ Failed to load loadRequest.
  - generic: "Failed to load loadRequest."
```

Basic code structure that is used for loading:

```swift
cancellable = Entity.loadAsync(named: entityName, in: .main)
    .sink { completion in
        switch completion {
        case .failure(let error):
            dump(error)
            print("Done")
        case .finished:
            print("Finished loading")
        }
    } receiveValue: { entity in
        print("Entity: \(entity)")
    }
```

Is there any way to force it to load in a mode that enforces compatibility? As mentioned, this only happens on iOS 15. Even AR Quick Look can't display the files anymore (no issues on iOS 14). Thanks for any help!
8 replies · 0 boosts · 3.5k views · Oct ’21
Create MTLLibrary from raw String for use within RealityKit
Hello, I have a use case where I need to download and compile Metal shaders on demand, as strings or .metal files. These should then be used for CustomMaterials and/or post-processing within RealityKit. Essentially this boils down to having raw source code that needs to be compiled at runtime. My plan was to use the method makeLibrary(source:options:completionHandler:) to accomplish this. The problem is that I get the following error during compilation:

```
RealityKitARExperienceAssetProvider: An error occured while trying to compile shader library »testShaderLibrary« - Error Domain=MTLLibraryErrorDomain Code=3 "program_source:2:10: fatal error: 'RealityKit/RealityKit.h' file not found
#include <RealityKit/RealityKit.h>
```

My code for creating the library looks like this (simplified example):

```swift
let librarySourceString: String = """
#include <metal_stdlib>
#include <RealityKit/RealityKit.h>

using namespace metal;

[[visible]]
void mySurfaceShader(realitykit::surface_parameters params)
{
    params.surface().set_base_color(half3(1, 1, 1));
}
"""

mtlDevice.makeLibrary(source: librarySourceString, options: nil) { library, error in
    if let error = error {
        dump(error)
        return
    }
    // do something with library
}
```

So I'm wondering if there's a way to tell the Metal compiler how to resolve this reference to the RealityKit header file. Would I need to replace that part of the source string with an absolute path to the RealityKit framework (if so, how would I get that at runtime)? Appreciate any hints – thanks!
1 reply · 0 boosts · 1.7k views · Nov ’21
What causes »ARSessionDelegate is retaining X ARFrames« console warning?
Hi, since iOS 15 I've repeatedly noticed the console warning »ARSessionDelegate is retaining X ARFrames. This can lead to future camera frames being dropped«, even in rather simple projects using RealityKit and ARKit. Could someone from the ARKit team please elaborate on what causes this warning and what can be done to avoid it? If I remember correctly, I didn't even assign an ARSessionDelegate. Thank you!
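For reference, my understanding is that the warning normally points at a delegate that keeps strong references to ARFrames, along these lines (class and property names are made up for illustration) – what confuses me is that I see it without doing anything like this:

```swift
import ARKit

// Illustrative anti-pattern: each retained ARFrame keeps its camera
// pixel buffer alive, and the capture pool is small, so buffering
// frames like this eventually forces ARKit to drop new camera frames.
final class BufferingSessionDelegate: NSObject, ARSessionDelegate {
    var bufferedFrames: [ARFrame] = []

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        bufferedFrames.append(frame) // strong reference outlives the callback
    }
}
```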
4 replies · 1 boost · 3.0k views · Nov ’21
Precompile/prewarm shaders to avoid jank
Hi, is there a way to force RealityKit to compile/prewarm and cache all shaders that will be used within a scene in advance – ideally in the background? This would be useful when adding complex models to the scene, which can sometimes cause quite a few dropped frames even on the newest devices (at least I assume the initial delay when displaying them is caused by shader compilation), but also for CustomMaterials. Note that this also happens with models that are loaded asynchronously. Thanks!
1 reply · 0 boosts · 1.2k views · Nov ’21
ARView Frame Timestamp/Elapsed Time
Hello, I've been looking all over the place, but so far I haven't found a trivial way to grab ARView's current timestamp – basically the elapsed time since the scene started rendering. I can access that in the surface and geometry shaders, but I would like to pass a timestamp as a parameter in order to drive shader animations. I feel that's more efficient than injecting animation progress manually on every frame, especially if there are lots of objects using that shader. So far I've subscribed to the scene's update event and used the delta time to calculate the elapsed time myself (see the sketch below), but this is quite error-prone and tends to break when I present the scene a second time (e.g. closing and reopening the AR experience). The only other option I found was using a render callback and grabbing the time property from the PostProcessContext. That works well, but do I really have to go that route? It would be great if there were an easy way to achieve this – pretty much an equivalent to this: https://developer.apple.com/documentation/scenekit/scnscenerenderer/1522680-scenetime NOTE: I'm not looking for the timestamp of the ARSession's current frame. Thank you!
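This is roughly what my current workaround looks like – a sketch, with the surrounding setup simplified:

```swift
import Combine
import RealityKit

// Sketch of the workaround: accumulate delta time from the scene's
// update event and write it into the CustomMaterial every frame.
// This is exactly the per-frame injection described above, and the
// accumulator breaks when the subscription is recreated.
var elapsedTime: Float = 0
var updateSubscription: Cancellable?

func driveShaderTime(for modelEntity: ModelEntity, in arView: ARView) {
    updateSubscription = arView.scene.subscribe(to: SceneEvents.Update.self) { event in
        elapsedTime += Float(event.deltaTime)
        guard var material = modelEntity.model?.materials.first as? CustomMaterial else { return }
        material.custom.value = SIMD4<Float>(elapsedTime, 0, 0, 0)
        modelEntity.model?.materials = [material]
    }
}
```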
0 replies · 0 boosts · 925 views · Jan ’22
Updating blending property to .transparent(opacity: …) removes material color texture (iOS 16 Beta)
Hello, on iOS 16, when I retrieve an existing material from a model entity and update its blending property to .transparent(opacity: …), the color or baseColor texture gets removed after reassigning the updated material. My use case is that I want to fade in a ModelEntity through a custom System and therefore need to repeatedly reassign the opacity value. I've tested this with UnlitMaterial and PhysicallyBasedMaterial – both suffer from this issue. On iOS 15 this works as expected. Please let me know if there is any workaround, as this seems like a major regression to me, and ideally I need this to work once iOS 16 is released to the public. The radar number, including a sample project, is FB11420976. Thank you!
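For reference, this is a condensed version of what the custom System does each update – roughly what's in the sample project attached to the radar, with `newOpacity` standing in for the current fade progress:

```swift
import RealityKit

// Retrieve, update, and reassign the material. On iOS 16 the baseColor
// texture is gone after the reassignment; on iOS 15 it survives.
func setOpacity(_ newOpacity: Float, on modelEntity: ModelEntity) {
    guard var material = modelEntity.model?.materials.first as? PhysicallyBasedMaterial else { return }
    material.blending = .transparent(opacity: .init(scale: newOpacity))
    modelEntity.model?.materials = [material]
}
```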
4 replies · 0 boosts · 1.3k views · Aug ’22
RealityKit Sample Project Issues; Performance and Crashes
Hi, please let me know if I should rather file feedback for this, but I figured it's worth flagging one way or another. Test Xcode version: 14.0 beta 6 (14A5294g)

1. Project »Altering RealityKit Rendering with Shader Functions«
This project crashes right away when running it on a device (iOS 15 and 16). (Screenshot attached.)

2. Project »Using object capture assets in RealityKit«
Suffers from pretty bad performance when run on a device – barely scratching 20–25 fps on an iPhone 12 Pro, and even less on an iPhone XS. (Screenshot attached.)

As these are official sample projects, I feel they should work flawlessly out of the box.

Best
Arthur
3 replies · 0 boosts · 1k views · Sep ’22
Codable Conformance results in significant binary size increase
Hello, I recently converted from manual dictionary-based JSON serialisation to Codable and noticed that this resulted in a pretty significant growth in binary size (I looked this up via the App Thinning Size Report). The difference between the Codable and non-Codable builds is ~800 KB. As our app also supports App Clips, I now can't meet the 10 MB universal bundle size limit anymore. Is there any way I can make this a little leaner? Would it help to manually implement all Codable methods instead of relying on compiler synthesis? Thanks for any hints!
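To illustrate the second question, this is the kind of manual conformance I mean – hand-written coding methods instead of the compiler-synthesized ones (`Item` is just a made-up example type):

```swift
struct Item: Codable {
    let id: Int
    let name: String

    private enum CodingKeys: String, CodingKey { case id, name }

    // Hand-written instead of compiler-synthesized:
    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        id = try container.decode(Int.self, forKey: .id)
        name = try container.decode(String.self, forKey: .name)
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(id, forKey: .id)
        try container.encode(name, forKey: .name)
    }
}
```

I'm unsure whether writing these out by hand across all model types would actually shrink the binary, or whether the synthesized code is already close to optimal.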
0 replies · 0 boosts · 838 views · Nov ’22