Delve into the world of graphics and game development. Discuss creating stunning visuals, optimizing game mechanics, and sharing resources for game developers.

Posts under Graphics & Games topic

Post

Replies

Boosts

Views

Activity

ARSCNView ignores output of SCNTechnique (sometimes)
I am using SCNTechnique in combination with ARSCNView. The technique does some minor post-processing. I have written several filter variants for this post-processing, but with one of the filters/fragment shaders, SCNTechnique discards my output and just presents the plain camera feed on screen instead. This is clearly visible in the Metal pipeline, using the GPU frame debugger. Let me stress that my setup works for 90% of my filters, but not this one, and I want to know why.

iOS 18.1, iPhone 13 Mini. Xcode 16.1.

Encoders 0 & 1 are injected by the system. Render encoders 2 & 3 correspond to my SCNTechnique's render passes: one to manipulate pixel data (darken it in this case) and another to blit it back to the main texture. I know the separate buffer is not strictly necessary for this particular operation, but it shouldn't matter. Note that the issue occurs in encoder 4 (not mine but ARKit's). In render encoder 4, scn_postprocess_AR_fragment handles my texture (#0, ending in f980) and another from the camera feed (Texture 2). I know this pass is typically used for grain, because that's what it used to do before I disabled grain on ARSCNView (+ the buffer still contains grain parameters). I have other post-processing filters that work just fine.

By what magic is ARKit determining to use Texture 2 instead of my Texture 0? Sure, I could keep digging into the minute differences between my shaders to find out which line of code affects how some ARKit shader down the line operates, but it's awfully opaque so far.
0
0
604
Nov ’24
Why does the planeNode in SceneKit flicker when using class property instead of a local variable?
I am working on a SceneKit project where I use a CAShapeLayer as the content for SCNMaterial's diffuse.contents to display a progress bar. Here's my initial code:

func setupProgressWithCAShapeLayer() {
    let progressLayer = createProgressLayer()
    progressBarPlane?.firstMaterial?.diffuse.contents = progressLayer
    DispatchQueue.main.async {
        var progress: CGFloat = 0.0
        Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { timer in
            progress += 0.01
            if progress > 1.0 {
                progress = 0.0
            }
            progressLayer.strokeEnd = progress // Update progress
        }
    }
}

// MARK: - ARSCNViewDelegate
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    progressBarPlane = SCNPlane(width: 0.2, height: 0.2)
    setupProgressWithCAShapeLayer()
    let planeNode = SCNNode(geometry: progressBarPlane)
    planeNode.position = SCNVector3(x: 0, y: 0.2, z: 0)
    node.addChildNode(planeNode)
}

This works fine, and the progress bar updates smoothly. However, when I change the code to use a class property (self.progressLayer) instead of a local variable, the rendering starts flickering on the screen:

func setupProgressWithCAShapeLayer() {
    self.progressLayer = createProgressLayer()
    progressBarPlane?.firstMaterial?.diffuse.contents = progressLayer
    DispatchQueue.main.async { [weak self] in
        var progress: CGFloat = 0.0
        Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [weak self] timer in
            progress += 0.01
            if progress > 1.0 {
                progress = 0.0
            }
            self?.progressLayer?.strokeEnd = progress // Update progress
        }
    }
}

After this change, the progressBarPlane in SceneKit starts flickering while being rendered on the screen.

My question: why does switching from a local variable (progressLayer) to a class property (self.progressLayer) cause the flickering issue in SceneKit rendering?
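A speculative sketch of one low-cost thing to try inside the existing timer closure: implicit CAAnimations on a layer that backs diffuse.contents are a common source of visual glitches, so disabling implicit actions around the strokeEnd update (via CATransaction from QuartzCore) may be worth testing. This assumes the rest of the poster's code stays as-is and is not a confirmed explanation of the flicker.

Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [weak self] _ in
    progress += 0.01
    if progress > 1.0 { progress = 0.0 }
    CATransaction.begin()
    CATransaction.setDisableActions(true)   // suppress the implicit 0.25 s strokeEnd animation
    self?.progressLayer?.strokeEnd = progress
    CATransaction.commit()
}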
0
0
570
Nov ’24
GKGameCenterViewController won't turn off.
GKGameCenterViewController won't turn off. With Core and GameKit from GitHub apple/unityplugins, I succeeded in logging in and displaying GKGameCenterViewController. Leaderboards and everything else work fine, but when I press X on Game Center to return to the game, nothing happens. I tried debugging by printing logs here and there in the plugin, but I didn't get any results: when I press X, I couldn't get any logs or responses. It was like a button with no listener attached; no, it was more like an image. Based on the community posts that said it worked fine before, it seems that a recent Game Center update was not applied to the plugin, was omitted, or changed something, causing a mismatch.
1
0
663
Nov ’24
Unable to test Game Center achievements in ad hoc builds to registered device
Hi, I have a test app with a single Game Center achievement, and when I run it on my iPad I am unable to list that achievement with GameCenterManager.shared.loadAchievements. The app is still at version 1.0 in "Prepare for Submission" status (it is not ready for review submission). The Game Center entitlement is added to the app and the achievement is added to the Game Center section of the app; however, it is marked as Not Live. I am using a sandbox account to log in to Game Center on the iPad and I can't fetch this achievement. Is it because it is "Not Live"? How do I test my Game Center achievement on the device without releasing the app yet?
1
0
636
Nov ’24
Rotating ModelEntities (without Gestures) Help
I'm building a proof-of-concept application leveraging the PlaneDetectionProvider to generate UI and interactive elements on a horizontal plane the user is looking at. I'm able to create a cube at the centroid of the plane and change its location via position. However, I can't seem to rotate the cube programmatically, and from this forum post in September I'm not sure if the modelEntity.move functionality is still bugged or the documentation is not up to date.

if let planeCentroid = planeEntity.centroid {
    // Create a cube at the centroid
    let cubeMesh = MeshResource.generateBox(size: 0.1) // Create a cube with side length of 0.1 meters
    let cubeMaterial = SimpleMaterial(color: .blue, isMetallic: false)
    let cubeEntity = ModelEntity(mesh: cubeMesh, materials: [cubeMaterial])
    cubeEntity.position = planeCentroid
    cubeEntity.position.y += 0.3048
    planeEntity.addChild(cubeEntity)

    let rotationY = simd_quatf(angle: Float(45.0 * .pi/180.0), axis: SIMD3(x: 0, y: 1, z: 0))
    let cubeTransform = Transform(rotation: rotationY)
    cubeEntity.move(to: cubeTransform, relativeTo: planeEntity, duration: 5, timingFunction: .linear)
}

Ideally, I'd like to have the cube start/stop rotation when the user pinches on the plane mesh, but I'd be happy just to see it rotate!
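One hedged observation worth testing: Transform(rotation:) also carries a zero translation and unit scale, so move(to: cubeTransform, relativeTo: planeEntity) will pull the cube back to the plane's origin (losing the 0.3048 m offset) in addition to rotating it, which might be part of what makes the rotation appear broken. A minimal sketch that derives the target from the entity's current transform and only changes the rotation:

var target = cubeEntity.transform
target.rotation = rotationY * target.rotation   // compose onto the current orientation
cubeEntity.move(to: target, relativeTo: cubeEntity.parent,
                duration: 5, timingFunction: .linear)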
2
0
544
Nov ’24
how to get a null acceleration structure w/o triggering an API validation error
I want to turn off my ray tracing conditionally. There is is_null_acceleration_structure, but when I don't bind an acceleration structure (or pass nil to setFragmentAccelerationStructure), I get the following API validation error:

-[MTLDebugRenderCommandEncoder validateCommonDrawErrors:]:5782: failed assertion `Draw Errors Validation
Fragment Function(vol_deferred_lighting): missing instanceAccelerationStructure binding at index 6 for accelerationStructure[0].

I can turn off API validation and it works, but it seems like I should be able to use nil for the acceleration structure without triggering a validation error. Seems like a bug, right?

I suppose I can work around this by creating a separate pipeline with the ray tracing disabled via a function constant instead of using is_null_acceleration_structure.

(Can we get a ray-tracing tag for questions?)
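A minimal sketch of the function-constant workaround mentioned above, assuming the shader declares something like "constant bool ray_tracing_enabled [[function_constant(0)]];" and branches on it; the constant index and that declaration are placeholders for your own pipeline, not confirmed details of the poster's setup.

import Metal

func makeNonRayTracedFragment(library: MTLLibrary) throws -> MTLFunction {
    let constants = MTLFunctionConstantValues()
    var rayTracingEnabled = false
    constants.setConstantValue(&rayTracingEnabled, type: .bool, index: 0)
    // Specialize the fragment function; build a second MTLRenderPipelineState
    // from it and select that pipeline per draw instead of binding nil.
    return try library.makeFunction(name: "vol_deferred_lighting",
                                    constantValues: constants)
}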
1
0
513
Nov ’24
SCNNode into SKScene is deformed when hit an object
Into an SKScene, I add an SCNSphere as a child of an SKShapeNode, as shown below. When the sphere hits another node (the fence in the example), the sphere is deformed as if it were elastic. I couldn't find any information about elastic properties. Does anyone know a way to avoid the deformation?

import SwiftUI
import SpriteKit
import SceneKit

@main
struct MyApp: App {
    var body: some Scene {
        WindowGroup { SpriteView(scene: GameSceneSK(size: UIScreen.main.bounds.size)) }
    }
}

class GameSceneSK: SKScene {
    override func sceneDidLoad() {
        var fencePoints = [
            CGPoint(x: 300, y: 0),
            CGPoint(x: 300, y: 400),
            CGPoint(x: 0, y: 400)
        ]
        let fence = SKShapeNode(points: &fencePoints, count: fencePoints.count)
        fence.physicsBody = SKPhysicsBody(edgeChainFrom: fence.path!)
        addChild(fence)

        let sphereGeometry = SCNSphere(radius: 20)
        let sphereNode = SCNNode(geometry: sphereGeometry)
        let sphereScnScene = SCNScene()
        sphereScnScene.rootNode.addChildNode(sphereNode)

        let ball3D = SK3DNode(viewportSize: CGSize(width: 40, height: 40))
        ball3D.scnScene = sphereScnScene

        let ball = SKShapeNode(circleOfRadius: 20)
        ball.physicsBody = SKPhysicsBody(circleOfRadius: 20)
        ball.addChild(ball3D)

        physicsWorld.gravity = CGVector(dx: 0.2, dy: 0.2)
        addChild(ball)
    }
}
2
0
746
Nov ’24
Cannot use Metal graphics overview HUD with multiple CAMetalLayers
I have multiple CAMetalLayers that I render content to, and I noticed that the graphics overview HUD does not function properly when I have more than one CAMetalLayer. The values reported are very strange; for example, FPS may report 999 or some large negative value. Is the HUD simply not designed to work with multiple CAMetalLayers or MTKViews? When I disable all but one of my CAMetalLayers, the HUD works as expected.
1
0
689
Nov ’24
D3DMetal unsupported CheckFeatureSupport query 53 while running simple vulkaninfo using the Mesa 24.3 Dozen (Vulkan on D3D12) driver
Hi, I wanted to test whether it is possible to use the Mesa3D Dozen driver (Vulkan on D3D12) + D3DMetal 2b3 to get a possibly better Vulkan driver on Wine than the default MoltenVK. This would support Vulkan Windows apps via D3D12Metal.

I'm using vulkan_dzn.dll, dzn_icd.x86_64.json and dxil.dll from the x64 folder of: https://github.com/pal1000/mesa-dist-win/releases/download/24.3.0-rc1/mesa3d-24.3.0-rc1-release-msvc.7z

Using the simple vulkaninfo app and running: wine64 vulkaninfo

I get the error:
[D3DMetal:LOG:2A825] Unsupported API: CheckFeatureSupport, unhandled support query 53

Also, the D3DMetal Wine integration on Whisky doesn't seem to expose d3d12core.dll and d3d12.dll like the new Agility D3D12 DLLs or VKD3D do, so I'm getting:
MESA: error: Failed to retrieve D3D12GetInterface
MESA: error: Failed to load DXCore

but anyway it seems to try to load the driver:
WARNING: dzn is not a conformant Vulkan implementation, testing use only.

Full log:
MESA: error: Failed to retrieve D3D12GetInterface
MESA: error: Failed to load DXCore
WARNING: dzn is not a conformant Vulkan implementation, testing use only.
[D3DMetal:LOG:2A825] Unsupported API: CheckFeatureSupport, unhandled support query 53
00bc:fixme:dcomp:DCompositionCreateDevice 0000000000000000, {c37ea93a-e7aa-450d-b16f-9746cb0407f3}, 000000000011E328.
MESA: error: Failed to load DXCore
WARNING: dzn is not a conformant Vulkan implementation, testing use only.
[D3DMetal:LOG:2A825] Unsupported API: CheckFeatureSupport, unhandled support query 53
00bc:fixme:dcomp:DCompositionCreateDevice 0000000000000000, {c37ea93a-e7aa-450d-b16f-9746cb0407f3}, 000000000011E578.
ERROR: [Loader Message] Code 0 : setup_loader_term_phys_devs: Call to 'vkEnumeratePhysicalDevices' in ICD c:\windows\system32\.\vulkan_dzn.dll failed with error code -3
ERROR: [Loader Message] Code 0 : setup_loader_term_phys_devs: Failed to detect any valid GPUs in the current config
ERROR at C:\j\msdk0\build\Khronos-Tools\repo\vulkaninfo\vulkaninfo.h:241:vkEnumeratePhysicalDevices failed with ERROR_INITIALIZATION_FAILED
0
0
776
Nov ’24
How to Disable Object Occlusion in a RealityView?
Hi everyone, I’m working on a project using RealityKit and encountering an issue with object occlusion. Specifically, I need to disable the occlusion of real-world objects (e.g., tables, walls) in my RealityView. I want virtual entities to render fully, even if real-world objects would normally block their view. I’ve explored options in ARSession and ARWorldTrackingConfiguration but haven’t found anything that affects occlusion in RealityView. I suspect there might be a setting or approach I’ve missed. Has anyone dealt with a similar scenario or knows how to achieve this? Any insights or pointers would be greatly appreciated! Thanks in advance, Nicolas
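A sketch assuming an iOS ARView-based setup rather than RealityView (I'm not aware of an equivalent switch exposed for RealityView on visionOS): on ARView, occlusion by real-world geometry is an explicit scene-understanding option and can simply be removed.

import RealityKit

func disableRealWorldOcclusion(in arView: ARView) {
    // Occlusion is opt-in scene understanding; removing it lets virtual
    // entities render fully even when real-world geometry is in front.
    arView.environment.sceneUnderstanding.options.remove(.occlusion)
    // Other options (.physics, .collision, .receivesLighting) are independent.
}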
1
1
492
Nov ’24
Game Porting Toolkit formula install error
So recently I have been trying to install game-porting-toolkit, and when I try to run

brew -v install apple/apple/game-porting-toolkit

I'm getting an error:

Error: apple/apple/game-porting-toolkit 1.1 did not build
Logs:
/Users/mateusz/Library/Logs/Homebrew/game-porting-toolkit/00.options.out
/Users/mateusz/Library/Logs/Homebrew/game-porting-toolkit/01.configure
/Users/mateusz/Library/Logs/Homebrew/game-porting-toolkit/01.configure.cc
/Users/mateusz/Library/Logs/Homebrew/game-porting-toolkit/wine64-build
If reporting this issue please do so to (not Homebrew/brew or Homebrew/homebrew-core):
apple/apple

I searched some threads on this forum, but I couldn't take any advice from them. I would really appreciate any help.
0
0
821
Nov ’24
Can SceneKit be used with Swift 6 Concurrency ?
I am trying to port SceneKit projects to Swift 6, and I just can't figure out how that's possible. I'm even starting to think SceneKit and Swift 6 concurrency just don't go together, and SceneKit projects should, hopefully for the time being only, stick to Swift 5.

The SCNSceneRendererDelegate methods are called on the SceneKit thread. If the delegate is a view controller:

class GameViewController: UIViewController {
    let aNode = SCNNode()

    func renderer(_ renderer: any SCNSceneRenderer, updateAtTime time: TimeInterval) {
        aNode.position.x = 10
    }
}

then the compiler generates the error "Main actor-isolated instance method 'renderer(_:updateAtTime:)' cannot be used to satisfy nonisolated protocol requirement", which is fully understandable. The compiler even tells you those methods can't be used for protocol conformance, unless:

Conformance is declared as @preconcurrency SCNSceneRendererDelegate, like this:

class GameViewController: UIViewController, @preconcurrency SCNSceneRendererDelegate {

But that just delays the check to runtime, and therefore a crash on the SceneKit thread happens at runtime... Again, fully understandable.

Or the delegate method is declared nonisolated, like this:

nonisolated func renderer(_ renderer: any SCNSceneRenderer, updateAtTime time: TimeInterval) {
    aNode.position.x = 10
}

which generates the compiler error "Main actor-isolated property 'position' can not be mutated from a nonisolated context". Again, fully understandable.

If the delegate is not a view controller but a nonisolated class, we also have the problem that SCNNode can't be used. Nearly 100% of the SCNSceneRendererDelegate implementations I've seen do use SCNNode or similar MainActor-bound types, because they are meant for that.

So, where am I wrong? What is the solution to use SceneKit's SCNSceneRendererDelegate methods with full Swift 6 compilation? Is that even possible for now?
5
0
1.1k
Nov ’24
SKNode.zPosition causes nodes to flicker by reordering them for 1 frame
When running the sample code below, every 3 seconds the middle sprite is replaced by a new one. When this happens, most of the time a flicker is noticeable. When recording the screen and stepping through the recording frame by frame, I noticed that the flicker is caused by a temporary reordering of the nodes. Below you find two screenshots of two consecutive frames where the reordering is clearly visible. This only happens for a SpriteKit scene used as an overlay for a SceneKit scene. Commenting out buttons.zPosition = 1 or avoiding the fade in/out animations solves the issue. I have created FB15945016.

import SceneKit
import SpriteKit

class GameViewController: NSViewController {
    let overlay = SKScene()
    var buttons: SKNode!
    var previousButton: SKSpriteNode!
    var nextButton: SKSpriteNode!
    var pageContainer: SKNode!
    var pageViews = [SKNode]()
    var page = 0

    override func viewDidLoad() {
        super.viewDidLoad()
        let scene = SCNScene(named: "art.scnassets/ship.scn")!
        let scnView = self.view as! SCNView
        scnView.scene = scene

        overlay.anchorPoint = CGPoint(x: 0.5, y: 0.5)
        scnView.overlaySKScene = overlay

        buttons = SKNode()
        buttons.zPosition = 1
        overlay.addChild(buttons)

        previousButton = SKSpriteNode(systemImage: "arrow.uturn.backward.circle")
        previousButton.position = CGPoint(x: -100, y: 0)
        buttons.addChild(previousButton)

        nextButton = SKSpriteNode(systemImage: "arrow.uturn.forward.circle")
        nextButton.position = CGPoint(x: 100, y: 0)
        buttons.addChild(nextButton)

        pageContainer = SKNode()
        pageViews = [SKSpriteNode(systemImage: "square.and.arrow.up"), SKSpriteNode(systemImage: "eraser")]
        overlay.addChild(pageContainer)
        setPage(0)

        Timer.scheduledTimer(withTimeInterval: 3, repeats: true) { [self] _ in
            setPage((page + 1) % 2)
        }
    }

    func setPage(_ page: Int) {
        pageViews[self.page].run(.sequence([
            .fadeOut(withDuration: 0.2),
            .removeFromParent()
        ]), withKey: "fade")
        self.page = page
        let pageView = pageViews[page]
        pageView.alpha = 0
        pageView.run(.fadeIn(withDuration: 0.2), withKey: "fade")
        pageContainer.addChild(pageView)
    }

    override func viewDidLayout() {
        overlay.size = view.frame.size
    }
}

extension SKSpriteNode {
    public convenience init(systemImage: String) {
        self.init()
        let width = 100.0
        let image = NSImage(systemSymbolName: systemImage, accessibilityDescription: nil)!.withSymbolConfiguration(.init(hierarchicalColor: NSColor.black))!
        let scale = NSScreen.main!.backingScaleFactor
        image.size = CGSize(width: width * scale, height: width / image.size.width * image.size.height * scale)
        texture = SKTexture(image: image)
        size = CGSize(width: width, height: width / image.size.width * image.size.height)
    }
}
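A hedged mitigation sketch, not a fix for the reported bug: since the flicker comes from a one-frame reorder of siblings, pinning an explicit zPosition on every sibling (using the poster's own variables from viewDidLoad) might keep the draw order independent of child order during the transition. This is an assumption to test, not confirmed behavior.

// Pin draw order explicitly so a transient reorder cannot change it.
buttons.zPosition = 1
pageContainer.zPosition = 0
for (index, pageView) in pageViews.enumerated() {
    pageView.zPosition = CGFloat(index) * 0.001   // stable ordering among pages
}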
1
0
649
Nov ’24
Dismissing a Window that contains MTKView no longer updates
I'm writing a Swift app that uses Metal to render textures to the main view. I currently use an NSViewRepresentable to place an MTKView into the window and an MTKViewDelegate to perform the Metal operations. It's running well and I see my Metal view being updated.

However, when I close the window (either through the user clicking the close button or programmatically using the appropriate @Environment(\.dismissWindow) private var dismissWindow) and then reopen the window, I no longer receive calls to MTKViewDelegate draw(in mtkView: MTKView). If I manually call the MTKView's draw() function, my view updates its content as expected, so it still seems to be correctly set up / alive. As best as I can tell, the CVDisplayLink created by MTKView is no longer active (or at least that's my understanding of how the MTKView draw() function gets called).

I've set up the MTKView like this:

let mtkView = MTKView()
mtkView.delegate = context.coordinator // My custom delegate
mtkView.device = context.coordinator.device // The default metal device
mtkView.preferredFramesPerSecond = 60
mtkView.enableSetNeedsDisplay = false
mtkView.isPaused = false

which I was hoping would call the draw function at 60 fps while the view is visible. I've also verified the values don't change while running.

Does anyone have any ideas on how I could restart the CVDisplayLink, or any other method to avoid this problem?

Cheers
Jack
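A speculative workaround sketch: when the representable's view is re-attached to a (new) window, nudge the internal draw loop by toggling isPaused. The class name is hypothetical, and the assumption that toggling isPaused restarts the display link is not documented behavior, just something worth trying.

import AppKit
import MetalKit

final class ResumingMTKView: MTKView {
    override func viewDidMoveToWindow() {
        super.viewDidMoveToWindow()
        guard window != nil else { return }
        // Assumption: flipping isPaused re-arms MTKView's internal timing source.
        isPaused = true
        isPaused = false
        needsDisplay = true
    }
}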
2
0
591
Nov ’24
RealityComposer text items (arkit/ios18) not displaying
I have created a simple scene in Reality Composer (Composer, not Composer Pro). It contains just a cube and a text item. I convert this to a usdz file and load it into an ARKit Swift app. Since iOS 18 / Xcode 16, the "text" element is not displayed at all. The cube is displayed, anchors correctly and can be moved, etc.

The output from usdchecker:

➜ Desktop usdchecker GKTUHR1.6.3.usdz -v --arkit
Opening GKTUHR1.6.3.usdz
Checking layer <GKTUHR1.6.3.usdz>.
Checking package <GKTUHR1.6.3.usdz>
Checking prim </Root>.
Checking prim </Root/Scenes>.
Checking prim </Root/Scenes/Scene>.
Checking prim </Root/Scenes/Scene/Gravity>.
Checking prim </Root/Scenes/Scene/sceneGroundPlane>.
Checking prim </Root/Scenes/Scene/sceneGroundPlane/physicsMaterial>.
Checking prim </Root/Scenes/Scene/Children>.
Checking prim </Root/Scenes/Scene/Children/hello>.
Checking prim </Root/Scenes/Scene/Children/hello/Generated>.
Checking prim </Root/Scenes/Scene/Children/hello/Generated/Text>.
Checking prim </Root/Scenes/Scene/Children/hello/Generated/Text/Material>.
Checking prim </Root/Scenes/Scene/Children/hello/Generated/Text/Material/PBRShader>.
Checking shader </Root/Scenes/Scene/Children/hello/Generated/Text/Material/PBRShader>.
Checking prim </Root/Scenes/Scene/Children/hello/Children>.
Checking prim </Root/Scenes/Scene/Children/Box>.
Checking prim </Root/Scenes/Scene/Children/Box/Generated>.
Checking prim </Root/Scenes/Scene/Children/Box/Generated/Mesh0>.
Checking prim </Root/Scenes/Scene/Children/Box/Generated/Mesh0/Mesh0>.
Checking prim </Root/Scenes/Scene/Children/Box/Generated/Mesh0/Material>.
Checking prim </Root/Scenes/Scene/Children/Box/Generated/Mesh0/Material/PBRShader>.
Checking shader </Root/Scenes/Scene/Children/Box/Generated/Mesh0/Material/PBRShader>.
Checking prim </Root/Scenes/Scene/Children/Box/Children>.
Checking prim </Root/Scenes/Scene/Children/Box/PhysicsMaterial_Box>.
Found material bindings but no MaterialBindingAPI applied on the prim </Root/Scenes/Scene/sceneGroundPlane>. (fails 'MaterialBindingAPIAppliedChecker')
Found material bindings but no MaterialBindingAPI applied on the prim </Root/Scenes/Scene/Children/hello/Generated/Text>. (fails 'MaterialBindingAPIAppliedChecker')
Found material bindings but no MaterialBindingAPI applied on the prim </Root/Scenes/Scene/Children/Box>. (fails 'MaterialBindingAPIAppliedChecker')
Found material bindings but no MaterialBindingAPI applied on the prim </Root/Scenes/Scene/Children/Box/Generated/Mesh0>. (fails 'MaterialBindingAPIAppliedChecker')
Failed!
0
0
613
Nov ’24
SpriteKit: SKTileMap leaks with `SKTexture(rect: CGRect)` usage
Hello reader, I am facing an issue that I am not able to resolve. I have been able to create a demo project that demonstrates the issue, which I hope enables you to have a look as well and hopefully find a way to resolve it.

What is the issue: I am using SKTileMapNode in order to draw tile maps. Instead of using the tile sets as you can from within the Xcode editor, I prefer to do it all programmatically using tile sheets (for a plethora of reasons that I will leave out of this equation). This is the code of the game scene:

import SpriteKit
import GameplayKit

class GameScene: SKScene {
    private let tileSize = CGSize(width: 32, height: 32)

    override func didMove(to view: SKView) {
        super.didMove(to: view)

        let tileSet = createTileSet()
        let tileMap = SKTileMapNode(tileSet: tileSet, columns: 100, rows: 100, tileSize: tileSize)
        for column in 0..<tileMap.numberOfColumns {
            for row in 0..<tileMap.numberOfRows {
                guard let tileGroup = tileSet.tileGroups.randomElement() else {
                    fatalError()
                }
                tileMap.setTileGroup(tileGroup, forColumn: column, row: row)
            }
        }
        addChild(tileMap)
    }

    private func createTileSet() -> SKTileSet {
        let tileSheetTexture = SKTexture(imageNamed: "terrain")
        var tileGroups = [SKTileGroup]()
        let relativeTileSize = CGSize(width: tileSize.width/tileSheetTexture.size().width,
                                      height: tileSize.height/tileSheetTexture.size().height)
        for idx in 0...2 {
            for jdx in 0...2 {
                let tileTexture = SKTexture(rect: .init(x: CGFloat(idx) * relativeTileSize.width,
                                                        y: CGFloat(jdx) * relativeTileSize.height,
                                                        width: relativeTileSize.width,
                                                        height: relativeTileSize.height),
                                            in: tileSheetTexture)
                let tileDefinition = SKTileDefinition(texture: tileTexture, size: tileSize)
                let tileGroup = SKTileGroup(tileDefinition: tileDefinition)
                tileGroups.append(tileGroup)
            }
        }
        let tileSet = SKTileSet(tileGroups: tileGroups)
        return tileSet
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        presentSceneAgain()
    }

    func presentSceneAgain() {
        if let frame = view?.frame {
            view?.presentScene(GameScene(size: frame.size), transition: .doorsCloseHorizontal(withDuration: 1.0))
        }
    }
}

This demo project creates an SKTileMapNode of 100 x 100 tiles. Then it fills these 10,000 tiles with a random tile from the tile sheet named "terrain.png". This tile sheet contains many tiles, but I only take the 9 tiles (3 x 3) from the lower left corner as a random tile option. Thus, the 10,000 tiles get filled with one of these 9 tiles. It doesn't look pretty or anything, but that isn't the purpose.

Now, to create these 9 tile textures, I use the SKTexture(rect:in:) method on the source texture, "terrain.png". I think the code is quite clear in itself, but so far the explanation. When you run it, you should see the map being rendered. When you tap the scene, the scene will present a new instance of the scene. Not more than that.

Now, when you do this, have a look at the RAM usage of the app. You will see it steadily increases over time, each time you tap the scene and a new scene is presented. I looked deeper into what is happening, and what I see in the memory graph is that for every present of the scene, there are 3 SKTexture instances created that are never released. The first time the scene is rendered, there are 11 SKTexture instances allocated (I don't know why there are 11, though; I would expect 10: the source texture and the 9 tile textures). But then, as mentioned, after a tap and a new present, I get 14 SKTexture instances, of which 3 are zombies, see image leak_1.

Moreover, Xcode reports multiple additional leaks from Jet and Metal allocations, see image leak_all. As far as I know, the code presented is not retaining any references that it should not, and I suspect these leaks are happening somewhere inside SpriteKit. But I am not able to find exactly where, or how to resolve it. I hope someone can help with this issue.
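A possible mitigation sketch, not a fix for the underlying leak: build the tile set once and reuse it across GameScene instances, so the SKTexture(rect:in:) slices are only allocated a single time. TileSetCache is a hypothetical helper name; createTileSet() in the scene would then just return TileSetCache.shared.

import SpriteKit

enum TileSetCache {
    static let tileSize = CGSize(width: 32, height: 32)

    // Built lazily on first access, then shared by every scene presentation.
    static let shared: SKTileSet = {
        let sheet = SKTexture(imageNamed: "terrain")
        let relative = CGSize(width: tileSize.width / sheet.size().width,
                              height: tileSize.height / sheet.size().height)
        var groups = [SKTileGroup]()
        for idx in 0...2 {
            for jdx in 0...2 {
                let rect = CGRect(x: CGFloat(idx) * relative.width,
                                  y: CGFloat(jdx) * relative.height,
                                  width: relative.width,
                                  height: relative.height)
                let definition = SKTileDefinition(texture: SKTexture(rect: rect, in: sheet),
                                                  size: tileSize)
                groups.append(SKTileGroup(tileDefinition: definition))
            }
        }
        return SKTileSet(tileGroups: groups)
    }()
}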
2
1
676
Nov ’24
Pack high bit of every byte in ARM NEON, for 64 bytes like AVX512 vpmovb2m?
__builtin_ia32_cvtb2mask512() is the GNU C builtin for vpmovb2m k, zmm. The Intel intrinsic for it is _mm512_movepi8_mask. It extracts the most-significant bit from each byte, producing an integer mask. The SSE2 and AVX2 instructions pmovmskb and vpmovmskb do the same thing for 16- or 32-byte vectors, producing the mask in a GPR instead of an AVX-512 mask register (_mm_movemask_epi8 and _mm256_movemask_epi8).

I would like an implementation for ARM that is faster than the one below.
I would like an implementation for ARM NEON.
I would like an implementation for ARM SVE.

I have attached a basic scalar implementation in C. For those trying to implement this in ARM: we care about the high bit, but each byte's high bit (in a 128-bit vector) can easily be shifted to the low bit using the ARM NEON intrinsic vshrq_n_u8(). Note that I would prefer not to store the bitmap to memory; it should just be the return value of the function, similar to the following function.

#define _(n) __attribute((vector_size(1<<n),aligned(1)))
typedef char V _(6); // 64 bytes, 512 bits
typedef unsigned long U;
#undef _

U generic_cvtb2mask512(V v) {
    U mask = 0;
    int i = 0;
    while (i < 64) {
        // shift mask by 1 and OR with MSB of v[i] byte
        mask = (mask << 1) | ((v[i] & 0x80) >> 7);
        i++;
    }
    return mask;
}

This is also a duplicate of: https://stackoverflow.com/questions/79225312
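A NEON sketch based on the widely used pairwise shift-and-accumulate movemask pattern: 16 bytes at a time, then the four 16-bit masks are combined into one 64-bit result. Note two assumptions: the input is taken as a pointer to 64 bytes rather than the vector type above, and the bit order follows the x86 pmovmskb convention (byte i maps to bit i), which is the reverse of the scalar reference's shift-left loop.

#include <arm_neon.h>
#include <stdint.h>

static inline uint16_t movemask_u8x16(uint8x16_t v)
{
    /* MSB of each byte -> bit 0 of that byte */
    uint16x8_t hi  = vreinterpretq_u16_u8(vshrq_n_u8(v, 7));
    /* accumulate pairs: 2 mask bits per 16-bit lane */
    uint32x4_t p16 = vreinterpretq_u32_u16(vsraq_n_u16(hi, hi, 7));
    /* 4 mask bits per 32-bit lane */
    uint64x2_t p32 = vreinterpretq_u64_u32(vsraq_n_u32(p16, p16, 14));
    /* 8 mask bits in the low byte of each 64-bit lane */
    uint8x16_t p64 = vreinterpretq_u8_u64(vsraq_n_u64(p32, p32, 28));
    return (uint16_t)(vgetq_lane_u8(p64, 0) | (vgetq_lane_u8(p64, 8) << 8));
}

uint64_t neon_cvtb2mask512(const uint8_t *p) /* p points to 64 bytes */
{
    uint64_t m = 0;
    for (int i = 0; i < 4; i++)
        m |= (uint64_t)movemask_u8x16(vld1q_u8(p + 16 * i)) << (16 * i);
    return m;
}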
0
0
474
Nov ’24
Render to multiple offscreen images with SCNRenderer
I am trying to extract some built-in and custom render passes from SceneKit, so that I can pass them into a metal pipeline and do some additional work with them. I have a metal viewport, and have instantiated a SCNRenderer so that I can render a SCNScene using SceneKit to a texture as part of my metal draw pass. This works as expected. Now I want to output multiple textures from the SceneKit render, not just the final color. I want to extract Depth, Normal, Lighting, Colour and a custom SCNTechnique for world position. I can easily use a SCNTechnique to render one of these to the color output, but it's not clear how I would render multiple passes in one render call. Is there some way to pass a writeable buffer/texture to a SCNTechnique, so that I can populate it in my SCNTechnique shader at render time with the output from the pass? Similar to how one would bind a buffer for a metal shader. SCNTechnique obfuscates things, so it's not clear how to proceed. Does anyone have any ideas?
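A brute-force sketch of one alternative, under the assumption that SCNRenderer adopts SCNTechniqueSupport: render the same scene once per desired output, each time with a different technique and pass descriptor. It costs extra scene traversals compared to a true multi-target setup, but it avoids fighting SCNTechnique's opaque buffer bindings.

import SceneKit
import Metal

func renderOutputs(renderer: SCNRenderer,
                   commandBuffer: MTLCommandBuffer,
                   viewport: CGRect,
                   time: CFTimeInterval,
                   outputs: [(technique: SCNTechnique?, pass: MTLRenderPassDescriptor)]) {
    for output in outputs {
        renderer.technique = output.technique   // nil = plain color render
        renderer.render(atTime: time,
                        viewport: viewport,
                        commandBuffer: commandBuffer,
                        passDescriptor: output.pass)
    }
}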
0
0
603
Dec ’24
Which Apple technologies to use for simple 2d motion graphics software?
I plan to create a simple motion graphics application for macOS that animates text and basic shapes, and handles audio. I'll use SwiftUI for the UI. What are the commonly used technologies for rendering animated graphics? Core Animation is suitable for UI animations but not for exporting and precisely controlling animations.

Basic requirements:
Timeline user interface
Animation of text and basic shapes
Viewer in SwiftUI GUI with transport controls (play, pause, scrub, …)
Export to video file

Is Metal or Core Graphics typically used directly? I want to keep it as simple as possible.
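A minimal sketch of the Core Graphics route, assuming frames are rendered offscreen per timeline timestamp and later handed to an exporter such as AVAssetWriter; the animated rectangle stands in for text and shape layers and is purely illustrative.

import CoreGraphics

func renderFrame(size: CGSize, time: Double) -> CGImage? {
    guard let ctx = CGContext(data: nil,
                              width: Int(size.width), height: Int(size.height),
                              bitsPerComponent: 8, bytesPerRow: 0,
                              space: CGColorSpaceCreateDeviceRGB(),
                              bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue) else {
        return nil
    }
    // Background
    ctx.setFillColor(CGColor(red: 0, green: 0, blue: 0, alpha: 1))
    ctx.fill(CGRect(origin: .zero, size: size))
    // A shape whose x position is driven by the timeline time
    let x = CGFloat(time.truncatingRemainder(dividingBy: 1)) * size.width
    ctx.setFillColor(CGColor(red: 1, green: 0.5, blue: 0, alpha: 1))
    ctx.fill(CGRect(x: x, y: size.height / 2, width: 100, height: 100))
    return ctx.makeImage()
}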
1
0
677
Dec ’24