Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Post · Replies · Boosts · Views · Activity

Game Can't Open (Cookie Run Kingdom)
I recently started playing this game. After I restarted my computer it glitched out, and when I tried to open it again it said, "Last time you opened this app, it was force quit. Do you want to reopen it?" Then the app closes itself immediately. How can I fix this? How can I remove the caches?
1 reply · 0 boosts · 63 views · 1d
Xcode memory meter goes too high while programmatically converting a USDZ file to an SCN file
We have a requirement to add a USDZ file to a UIView, show its content, archive the data, and save it to a file. When the user opens the file, we need to unarchive that USDZ content, bind it to the UIView, and show it to the user. Initially, we created the SCNScene object by passing the USDZ file URL, like below.

do {
    usdzScene = try SCNScene(url: usdzUrl)
} catch let error as NSError {
    print(error)
}

Since SCNScene supports the NSSecureCoding protocol, we directly archive that object, save it to a file, load it back from the file, and recreate the SCNScene object using the NSKeyedUnarchiver process. But for some files, we noticed high memory consumption while archiving the SCNScene object in the line below.

func encode(with coder: NSCoder) {
    coder.encode(self.scnScene, forKey: "scnScene")
}

File reference link: toy_drummer_idle.usdz

When we analysed the Apple documentation (check the Discussion section), it said the scn file extension is the fastest format for processing, faster than usdz. So we used SCNScene's write(to:) feature to create an scn file from the given usdz file. After that, when we archive the SCNScene object created from the scn file URL, the archive process is much faster and does not take as much memory. It is really faster than the previous case now. But unfortunately, the SCNScene write method takes a lot of time for this conversion, the memory meter also goes high, and it can even cause the app to crash. I checked the output file size as well: the given usdz file is 18 MB and the generated scn file is 483 MB, yet the SCNScene archive process is fast. Please analyse this case and provide some guidance on how we can optimise this behaviour. I really appreciate your feedback.

Full code:

import UIKit
import SceneKit

class ViewController: UIViewController {

    var scnView: SCNView?
    var usdzScene: SCNScene?
    var scnScene: SCNScene?

    lazy var exportButton: UIButton = {
        let btn = UIButton(type: UIButton.ButtonType.system)
        btn.tag = 1
        btn.backgroundColor = UIColor.blue
        btn.addTarget(self, action: #selector(buttonPressed(_:)), for: .touchUpInside)
        btn.setTitle("USDZ to SCN", for: .normal)
        btn.setTitleColor(.white, for: .normal)
        btn.layer.borderColor = UIColor.gray.cgColor
        btn.titleLabel?.font = .systemFont(ofSize: 20)
        btn.translatesAutoresizingMaskIntoConstraints = false
        return btn
    }()

    func deleteTempDirectory(directoryName: String) {
        let tempDirectoryUrl = URL(fileURLWithPath: NSTemporaryDirectory())
        let tempDirectory = tempDirectoryUrl.appendingPathComponent(directoryName, isDirectory: true)
        if FileManager.default.fileExists(atPath: URL(string: tempDirectory.absoluteString)!.path) {
            do {
                try FileManager.default.removeItem(at: tempDirectory)
            } catch let error as NSError {
                print(error)
            }
        }
    }

    func createTempDirectory(directoryName: String) -> URL? {
        let tempDirectoryUrl = URL(fileURLWithPath: NSTemporaryDirectory())
        let toBeCreatedDirectoryUrl = tempDirectoryUrl.appendingPathComponent(directoryName, isDirectory: true)
        if !FileManager.default.fileExists(atPath: URL(string: toBeCreatedDirectoryUrl.absoluteString)!.path) {
            do {
                try FileManager.default.createDirectory(at: toBeCreatedDirectoryUrl, withIntermediateDirectories: true, attributes: nil)
            } catch let error as NSError {
                print(error)
                return nil
            }
        }
        return toBeCreatedDirectoryUrl
    }

    @IBAction func buttonPressed(_ sender: UIButton) {
        let scnFolderName = "SCN"
        let scnFileName = "3D"
        deleteTempDirectory(directoryName: scnFolderName)
        guard let scnDirectoryUrl = createTempDirectory(directoryName: scnFolderName) else { return }
        let scnFileUrl = scnDirectoryUrl.appendingPathComponent(scnFileName).appendingPathExtension("scn")
        guard let usdzScene else { return }
        let result = usdzScene.write(to: scnFileUrl, options: nil, delegate: nil, progressHandler: nil)
        if result {
            print("exporting process is success.")
        } else {
            print("exporting process is failed.")
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()

        let usdzUrl: URL? = Bundle.main.url(forResource: "toy_drummer_idle", withExtension: "usdz")
        guard let usdzUrl else { return }
        do {
            usdzScene = try SCNScene(url: usdzUrl)
        } catch let error as NSError {
            print(error)
        }
        guard let usdzScene else { return }

        scnView = SCNView(frame: .zero)
        guard let scnView else { return }
        scnView.translatesAutoresizingMaskIntoConstraints = false
        self.view.addSubview(scnView)
        self.view.addSubview(exportButton)

        NSLayoutConstraint.activate([
            scnView.leadingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.leadingAnchor),
            scnView.trailingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.trailingAnchor),
            scnView.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor),
            scnView.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor, constant: -30),
            exportButton.widthAnchor.constraint(equalToConstant: 200),
            exportButton.heightAnchor.constraint(equalToConstant: 40),
            exportButton.centerXAnchor.constraint(equalTo: view.safeAreaLayoutGuide.centerXAnchor),
            exportButton.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor),
        ])

        DispatchQueue.main.asyncAfter(deadline: .now() + 0.01) { [weak self] in
            guard let self else { return }
            loadModel(scene: usdzScene)
        }
    }

    func loadModel(scene: SCNScene) {
        guard let scnView else { return }
        scnView.autoenablesDefaultLighting = true
        scnView.scene = scene
        scnView.allowsCameraControl = true
    }
}

[Attachment: Screenshot 2024-06-20 at 12.23.40.png, 1712x1013: https://developer.apple.com/forums/content/attachment/5ff0197b-7e0b-4efe-b295-cc205c73dc02]
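For reference, here is a minimal sketch of the archive/unarchive round trip described in the post above, assuming the SCNScene has already been loaded (from the usdz or the converted scn). The helper name and archiveUrl parameter are illustrative, not part of the poster's project.

import SceneKit

// Sketch only: `archiveUrl` is a hypothetical destination file for the archive.
func archiveAndRestore(scene: SCNScene, archiveUrl: URL) throws -> SCNScene? {
    // SCNScene conforms to NSSecureCoding, so secure coding can stay enabled.
    let data = try NSKeyedArchiver.archivedData(withRootObject: scene, requiringSecureCoding: true)
    try data.write(to: archiveUrl, options: .atomic)

    // Read the archive back and rebuild the SCNScene object.
    let loaded = try Data(contentsOf: archiveUrl)
    return try NSKeyedUnarchiver.unarchivedObject(ofClass: SCNScene.self, from: loaded)
}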
0 replies · 0 boosts · 62 views · 1d
Game porting toolkit 1.0.3 to 1.1 update error
It shows the following:

==> Upgrading apple/apple/game-porting-toolkit 1.0.3 -> 1.1
==> Staging /Users/mtahin07/Library/Caches/Homebrew/downloads/7baed2a6fd34b4a641
==> Patching
==> /private/tmp/game-porting-toolkit-20240619-16666-si8pyg/wine/configure --pre
==> make
Last 15 lines from /Users/mtahin07/Library/Logs/Homebrew/game-porting-toolkit/02.make:
/private/tmp/game-porting-toolkit-20240619-16666-si8pyg/wine/dlls/crypt32/unixlib.c:641:23: error: use of undeclared identifier 'noErr'
    if (status == noErr)
                  ^
/private/tmp/game-porting-toolkit-20240619-16666-si8pyg/wine/dlls/crypt32/unixlib.c:647:31: warning: this function declaration is not a prototype [-Wstrict-prototypes]
    if ((status = SecItemExport(cert, kSecFormatX509Cert, 0, NULL, &certData)) == noErr)
                  ^
/private/tmp/game-porting-toolkit-20240619-16666-si8pyg/wine/dlls/crypt32/unixlib.c:647:51: error: use of undeclared identifier 'kSecFormatX509Cert'
    if ((status = SecItemExport(cert, kSecFormatX509Cert, 0, NULL, &certData)) == noErr)
                                      ^
/private/tmp/game-porting-toolkit-20240619-16666-si8pyg/wine/dlls/crypt32/unixlib.c:647:95: error: use of undeclared identifier 'noErr'
    if ((status = SecItemExport(cert, kSecFormatX509Cert, 0, NULL, &certData)) == noErr)
                                                                                  ^
2 warnings and 7 errors generated.
make: *** [dlls/crypt32/unixlib.o] Error 1
make: *** Waiting for unfinished jobs....
If reporting this issue please do so to (not Homebrew/brew or Homebrew/homebrew-core):
  apple/apple

I can't work out what error is occurring.
1 reply · 0 boosts · 116 views · 2d
Elapsed scene time used in custom shader uniform
If you create a custom shader you get access to a collection of uniform values; one is the uniforms::time() parameter, which is defined as "the number of seconds that have elapsed since RealityKit began rendering the current scene" in this doc: https://developer.apple.com/metal/Metal-RealityKit-APIs.pdf Is there some way to get this value from Swift code? I want to animate a value in my shader based on the time, so I need the starting time value in order to interpolate the animation offset from that point. If I create a System, its update() function gives me a SceneUpdateContext instance, and that has a deltaTime property but not an elapsedTime property, which I would assume would map to the shader's time() value.
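One possible workaround, sketched below under the assumption that an accumulated sum of deltaTime values is close enough to the shader's time() for interpolation purposes. ElapsedTimeSystem and elapsedTime are illustrative names, not a RealityKit API.

import Foundation
import RealityKit

// Accumulates per-frame deltas to approximate elapsed scene time on the Swift side.
final class ElapsedTimeSystem: System {
    // Hypothetical shared accumulator; a real app might store this on a component
    // or push it into a material parameter each frame instead.
    static var elapsedTime: TimeInterval = 0

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        // deltaTime is the time since the previous update of this System.
        Self.elapsedTime += context.deltaTime
    }
}

// Register once at startup, for example in the App initializer:
// ElapsedTimeSystem.registerSystem()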
0 replies · 0 boosts · 106 views · 3d
App using MetalKit creates many IOSurfaces in rapid succession, causing MTKView to freeze and app to hang
I've got an iOS app that is using MetalKit to display raw video frames coming in from a network source. I read the pixel data in the packets into a single MTLTexture, rows at a time, which is drawn into an MTKView each time a frame has been completely sent over the network. The app works, but only for several seconds (a seemingly random duration), before the MTKView seemingly freezes (while packets are still being received). Watching the debugger while my app was running revealed that the freezing of the display happened when there was a large spike in memory. Seeing the memory profile in Instruments revealed that the spike was related to a rapid creation of many IOSurfaces and IOAccelerators. Profiling CPU usage shows that CAMetalLayerPrivateNextDrawableLocked is what happens during this rapid creation of surfaces. What does this function do? Being a complete newbie to iOS programming as a whole, I wonder if this issue comes from a misuse of the MetalKit library. Below is the code that I'm using to render the video frames themselves:

class MTKViewController: UIViewController, MTKViewDelegate {

    /// Metal texture to be drawn whenever the view controller is asked to render its view.
    private var metalView: MTKView!
    private var device = MTLCreateSystemDefaultDevice()
    private var commandQueue: MTLCommandQueue?
    private var renderPipelineState: MTLRenderPipelineState?
    private var texture: MTLTexture?
    private var networkListener: NetworkListener!
    private var textureGenerator: TextureGenerator!

    override public func loadView() {
        super.loadView()

        assert(device != nil, "Failed creating a default system Metal device. Please, make sure Metal is available on your hardware.")

        initializeMetalView()
        initializeRenderPipelineState()

        networkListener = NetworkListener()
        textureGenerator = TextureGenerator(width: streamWidth, height: streamHeight, bytesPerPixel: 4, rowsPerPacket: 8, device: device!)

        networkListener.start(port: NWEndpoint.Port(8080))
        networkListener.dataRecievedCallback = { data in
            self.textureGenerator.process(data: data)
        }
        textureGenerator.onTextureBuiltCallback = { texture in
            self.texture = texture
            self.draw(in: self.metalView)
        }

        commandQueue = device?.makeCommandQueue()
    }

    public func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {
        /// need implement?
    }

    public func draw(in view: MTKView) {
        guard let texture = texture,
              let _ = device else {
            return
        }

        let commandBuffer = commandQueue!.makeCommandBuffer()!

        guard let currentRenderPassDescriptor = metalView.currentRenderPassDescriptor,
              let currentDrawable = metalView.currentDrawable,
              let renderPipelineState = renderPipelineState else {
            return
        }

        currentRenderPassDescriptor.renderTargetWidth = streamWidth
        currentRenderPassDescriptor.renderTargetHeight = streamHeight

        let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: currentRenderPassDescriptor)!

        encoder.pushDebugGroup("RenderFrame")
        encoder.setRenderPipelineState(renderPipelineState)
        encoder.setFragmentTexture(texture, index: 0)
        encoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4, instanceCount: 1)
        encoder.popDebugGroup()
        encoder.endEncoding()

        commandBuffer.present(currentDrawable)
        commandBuffer.commit()
    }

    private func initializeMetalView() {
        metalView = MTKView(frame: CGRect(x: 0, y: 0, width: streamWidth, height: streamWidth), device: device)
        metalView.delegate = self
        metalView.framebufferOnly = true
        metalView.colorPixelFormat = .bgra8Unorm
        metalView.contentScaleFactor = UIScreen.main.scale
        metalView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.insertSubview(metalView, at: 0)
    }

    /// Initializes render pipeline state with a default vertex function mapping texture to the view's frame
    /// and a simple fragment function returning texture pixel's value.
    private func initializeRenderPipelineState() {
        guard let device = device, let library = device.makeDefaultLibrary() else {
            return
        }

        let pipelineDescriptor = MTLRenderPipelineDescriptor()
        pipelineDescriptor.rasterSampleCount = 1
        pipelineDescriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
        pipelineDescriptor.depthAttachmentPixelFormat = .invalid

        /// Vertex function to map the texture to the view controller's view
        pipelineDescriptor.vertexFunction = library.makeFunction(name: "mapTexture")
        /// Fragment function to display texture's pixels in the area bounded by vertices of `mapTexture` shader
        pipelineDescriptor.fragmentFunction = library.makeFunction(name: "displayTexture")

        do {
            renderPipelineState = try device.makeRenderPipelineState(descriptor: pipelineDescriptor)
        } catch {
            assertionFailure("Failed creating a render state pipeline. Can't render the texture without one.")
            return
        }
    }
}

My question is simply: what gives?
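One hedged observation on the code above: draw(in:) is called directly from the network callback, so drawables can be requested faster than the display returns them, which would match the IOSurface pile-up. The symbol name CAMetalLayerPrivateNextDrawableLocked suggests it is the private path behind the layer's nextDrawable call, which allocates or waits for an IOSurface-backed drawable. A minimal sketch of the usual mitigation, capping frames in flight with a semaphore, follows; inFlightSemaphore and drawFrame are illustrative names, not part of the original project.

import Foundation
import Metal
import MetalKit

// Sketch: allow at most three frames in flight so callbacks can't outrun the display.
let inFlightSemaphore = DispatchSemaphore(value: 3)

func drawFrame(in view: MTKView, commandQueue: MTLCommandQueue,
               texture: MTLTexture, pipelineState: MTLRenderPipelineState) {
    inFlightSemaphore.wait()
    guard let commandBuffer = commandQueue.makeCommandBuffer(),
          let descriptor = view.currentRenderPassDescriptor,
          let drawable = view.currentDrawable,
          let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: descriptor) else {
        inFlightSemaphore.signal()   // give the slot back if this frame cannot be encoded
        return
    }
    encoder.setRenderPipelineState(pipelineState)
    encoder.setFragmentTexture(texture, index: 0)
    encoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4, instanceCount: 1)
    encoder.endEncoding()

    commandBuffer.addCompletedHandler { _ in
        // Release the slot only after the GPU has finished with this frame.
        inFlightSemaphore.signal()
    }
    commandBuffer.present(drawable)
    commandBuffer.commit()
}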
0 replies · 0 boosts · 98 views · 3d
Can I get a point on a texture in RealityKit on visionOS?
Hi, I am currently considering porting my AR game from SceneKit to RealityKit so it appears in 3D on visionOS, but one crucial question in deciding whether it can even be ported is this: I need a tap on a 3D model that is then translated into a tap on the texture of that model. On visionOS this would be the person's gaze, so the question is whether I can get the point on a texture the user is looking at (ideally continuously, so I can have a hover effect, but if not, at least when they tap their fingers together). If that is not possible, is it possible to touch a RealityKit object and get the location of that touch on the texture? All the best, Christoph
4 replies · 0 boosts · 141 views · 4d
Error: apple/apple/game-porting-toolkit 1.1 did not build
I was trying to update GPTK, so I removed the old one (installed with Xcode 15.0) and reinstalled it, but ran into an issue:

Error: apple/apple/game-porting-toolkit 1.1 did not build
Logs:
  /Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/00.options.out
  /Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/01.configure
  /Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/01.configure.cc
  /Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/02.make
  /Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/wine64-build
If reporting this issue please do so to (not Homebrew/brew or Homebrew/homebrew-core):
  apple/apple
Error: Your Xcode (15.0) is outdated.
Please update to Xcode 15.4 (or delete it).
Xcode can be updated from the App Store.

So I upgraded Xcode to 15.4 via the App Store and still encountered:

Error: apple/apple/game-porting-toolkit 1.1 did not build
Logs:
  /Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/00.options.out
  /Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/01.configure
  /Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/01.configure.cc
  /Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/02.make
  /Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/wine64-build
If reporting this issue please do so to (not Homebrew/brew or Homebrew/homebrew-core):
  apple/apple

I've been reading some posts that say to use 15.1, but I still get the same error report. What should I do next? Any help would be appreciated.
0 replies · 0 boosts · 159 views · 4d
How to reduce draw call count in RealityKit
I'm trying to render a large number of entities. It looks like each ModelEntity causes a draw call, even if you share the ModelComponent so each Entity shares the mesh and materials. I tried to use the MeshInstanceCollection inside MeshResource to generate a large number of objects in the scene; the code works and draws many objects, but the draw count is still one call per instance. This seems strange: I would assume it should be only one draw call for the single entity, since I have specified instancing in the resource. Has anybody else successfully used instancing in RealityKit to draw a large number of Entities (maybe around 10,000), or drawn this many items at 60 fps any other way? Here is some sample code that draws 100 cubes using instancing but still causes 100 draw calls.

func instanceTest(scene: RealityKit.Scene) {
    var resource = MeshResource.generateBox(size: 0.2)
    var contents = MeshResource.Contents()
    contents.models = resource.contents.models

    var arr: [MeshResource.Instance] = []
    var matrix = matrix_identity_float4x4
    matrix[3, 0] = 0.5

    for i in 0..<100 {
        let inst = MeshResource.Instance(id: "\(i)", model: "MeshModel", at: matrix)
        arr.append(inst)
    }
    contents.instances = MeshInstanceCollection(arr)

    let updatedResource = try? MeshResource.generate(from: contents)
    let unlitMaterial = UnlitMaterial(color: .red)
    let modelEntity = ModelEntity(
        mesh: updatedResource!,
        materials: [unlitMaterial]
    )

    let anchor = AnchorEntity()
    anchor.addChild(modelEntity)
    scene.addAnchor(anchor)
}
0 replies · 0 boosts · 100 views · 4d
Couldn't link games to Game Center
My Game Center account is linked on both my iPad and my iPhone, but the activity on Game Center doesn't update when I play games on my iPad. It is only linked to games on my phone, so none of the games I've downloaded on my iPad are linked to Game Center. Is there any solution so that the games I've downloaded on my iPad can link to Game Center? I have tried turning it on and off on both my iPad and my iPhone, but it is still only linked to the phone.
0 replies · 0 boosts · 61 views · 4d
How to create screen-space meshes selectively in RealityKit AR mode using the new OrthographicCameraComponent?
I'd like to create meshes in RealityKit (AR mode on iPad) in screen space, i.e. for UI. I noticed a lot of useful new functionality in RealityKit for the next OS versions, including the OrthographicCameraComponent here: https://developer.apple.com/documentation/realitykit/orthographiccameracomponent?changes=_3 I think this would help, but I need AR world tracking as well as a regular perspective camera to work with the 3D elements. Firstly, can I have a camera attached selectively to a few entities, just for those entities? This could be the orthographic camera. Secondly, can I make it so those entities are always rendered in front, in screen space? (They'd need to follow the camera.) If I can't have multiple cameras, what can be done in that case? Is it actually better to use a completely different view / API for layering on top of RealityKit? I would much rather keep everything in RealityKit, however, for simplicity.
0 replies · 0 boosts · 154 views · 5d
Using a scene from Reality Composer Pro in an iOS app?
I am trying to establish a workflow using Reality Composer Pro to make scenes; I am grey-boxing a scene with primitives at the moment. I have set up a cube with a texture material and a simple animation to spin. I am confused as to what I should be loading. I have created what I think is a scene asset in the package for the Reality Composer Pro project. Here is a code snippet:

struct ContentView: View {
    var body: some View {
        RealityView { content in
            do {
                let scene = try await ModelEntity(named: "HOF")
                content.add(scene)
            } catch {
                print("Error loading scene: \(error.localizedDescription)")
            }
        }
    }
}

Here is the project layout in Reality Composer Pro:
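A hedged note on the snippet above: a scene authored in a Reality Composer Pro package is usually loaded as a plain Entity from the package's bundle, rather than as a ModelEntity by name alone. In the sketch below, realityKitContentBundle is the bundle constant generated by the default "RealityKitContent" package template and "HOF" is the scene name from the post; both are assumptions about this particular project.

import SwiftUI
import RealityKit
import RealityKitContent  // assumption: the Reality Composer Pro package uses the default template name

struct ContentView: View {
    var body: some View {
        RealityView { content in
            do {
                // Load the authored scene (hierarchy, materials, animations) from the package bundle.
                let scene = try await Entity(named: "HOF", in: realityKitContentBundle)
                content.add(scene)
            } catch {
                print("Error loading scene: \(error.localizedDescription)")
            }
        }
    }
}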
1 reply · 0 boosts · 143 views · 5d
SceneKit or RealityKit for Non-AR Game Development
Hi everyone, I'm choosing a framework for developing a game that doesn't involve augmented reality (AR) and I'm unsure whether to use SceneKit or RealityKit. I would like to hear from Apple engineers on this matter. Which of these frameworks is better suited for creating non-AR games? Additionally, I'd like to know if it's possible to disable AR in RealityKit using the updated RealityView? Thanks in advance for your insights and recommendations!
2 replies · 0 boosts · 246 views · 5d
Reality Composer Pro inline documentation comments
The WWDC24 video "Build a spatial drawing app with RealityKit" (https://developer.apple.com/wwdc24/10104), at 12:04, includes a slide showing a Reality Composer Pro shader graph that features wonderful inline documentation comment boxes. Are shader graph inline comments a new feature that Reality Composer Pro supports? This would be extraordinarily useful, as complex shader graphs can be challenging to decipher. If so, how are inline shader graph comments created in Reality Composer Pro?
1 reply · 0 boosts · 139 views · 6d
copyFromBuffer offset and size working even when not multiple of 4
Hi, the copyFromBuffer documentation states that on macOS, sourceOffset, destinationOffset, and size each "needs to be a multiple of 4, but can be any value in iOS and tvOS". However, I have noticed that, at least on my M2 Max, this limitation does not seem to exist: there are no warnings and the copy works correctly regardless of the offset value. I'm curious to know if this is something that should still be avoided. Is the multiple-of-4 limitation reserved for non-Apple-Silicon devices, so that note can be ignored on Apple Silicon? I ask because I am a contributor to Metal.jl and recently noticed that our tests pass even when copying using copyWithBuffer with offsets and sizes that are not multiples of 4. If that could cause issues or correctness problems, we would need to fix that. Thank you. Christian
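For context, a minimal sketch of a blit copy that stays within the documented macOS constraint; the precondition simply makes a violation visible during testing. The helper name and parameters are illustrative, not from Metal.jl.

import Metal

// Copies `size` bytes between buffers while asserting the documented 4-byte alignment.
func alignedBlitCopy(queue: MTLCommandQueue, from src: MTLBuffer, sourceOffset: Int,
                     to dst: MTLBuffer, destinationOffset: Int, size: Int) {
    precondition(sourceOffset % 4 == 0 && destinationOffset % 4 == 0 && size % 4 == 0,
                 "macOS documents offsets and size as multiples of 4 for this copy")
    guard let commandBuffer = queue.makeCommandBuffer(),
          let blit = commandBuffer.makeBlitCommandEncoder() else { return }
    blit.copy(from: src, sourceOffset: sourceOffset,
              to: dst, destinationOffset: destinationOffset, size: size)
    blit.endEncoding()
    commandBuffer.commit()
}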
0 replies · 0 boosts · 85 views · 6d
Unable to create PhysicsJoint using Entity's Geometric Pin
Hello, I'm trying to attach one entity to another entity via the new PhysicsFixedJoint. I have a usdz that contains a skeletal pose, which exposes the joints as pins, as desired. However, when I access a pin, it returns a GeometricPin instead of the EntityGeometricPin you would expect, and I can't use the returned GeometricPin to create the joint. Am I missing something? Shouldn't accessing the Entity's pins object return EntityGeometricPins instead of GeometricPins? Here is the code sample:

var body: some View {
    RealityView { content in
        if let scene = try? await Entity(named: "Scene", in: untitledBundle) {
            content.add(scene)

            let attack = try! Entity.load(named: "Attack01_SingleSword")
            let anchor = scene.findEntity(named: "Root")
            anchor?.addChild(attack)

            let sword = try! Entity.load(named: "OHS08_Sword")
            anchor?.addChild(sword)

            if let swordEntity = findModelComponentEntity(entity: sword) {
                let swordPin = swordEntity.pins.set(
                    named: "test",
                    position: SIMD3<Float>.zero
                )
                if let attackEntity = findModelComponentEntity(entity: attack) {
                    // This returns a GeometricPin instead of the EntityGeometricPin that the "pins" object contains
                    let attackPin = attackEntity.pins["root/pelvis/spine_01/spine_02/spine_03/clavicle_r/upperarm_r/lowerarm_r/hand_r/weapon_r"]!
                    let joint = PhysicsFixedJoint(
                        pin0: swordPin,
                        pin1: attackPin // This is a compile error since it is not an EntityGeometricPin type
                    )
                    try! joint.addToSimulation()
                }
            }
        }
    }
}
0 replies · 0 boosts · 104 views · 6d